diff -Nru python-urllib3-1.8.3/CHANGES.rst python-urllib3-1.7.1/CHANGES.rst
--- python-urllib3-1.8.3/CHANGES.rst	2014-06-24 18:19:21.000000000 +0000
+++ python-urllib3-1.7.1/CHANGES.rst	2013-09-25 16:05:54.000000000 +0000
@@ -1,101 +1,10 @@
 Changes
 =======
 
-1.8.3 (2014-06-23)
-++++++++++++++++++
-
-* Fix TLS verification when using a proxy in Python 3.4.1. (Issue #385)
-
-* Add ``disable_cache`` option to ``urllib3.util.make_headers``. (Issue #393)
-
-* Wrap ``socket.timeout`` exception with
-  ``urllib3.exceptions.ReadTimeoutError``. (Issue #399)
-
-* Fixed proxy-related bug where connections were being reused incorrectly.
-  (Issues #366, #369)
-
-* Added ``socket_options`` keyword parameter which allows to define
-  ``setsockopt`` configuration of new sockets. (Issue #397)
-
-* Removed ``HTTPConnection.tcp_nodelay`` in favor of
-  ``HTTPConnection.default_socket_options``. (Issue #397)
-
-* Fixed ``TypeError`` bug in Python 2.6.4. (Issue #411)
-
-
-1.8.2 (2014-04-17)
-++++++++++++++++++
-
-* Fix ``urllib3.util`` not being included in the package.
-
-
-1.8.1 (2014-04-17)
-++++++++++++++++++
-
-* Fix AppEngine bug of HTTPS requests going out as HTTP. (Issue #356)
-
-* Don't install ``dummyserver`` into ``site-packages`` as it's only needed
-  for the test suite. (Issue #362)
-
-* Added support for specifying ``source_address``. (Issue #352)
-
-
-1.8 (2014-03-04)
-++++++++++++++++
-
-* Improved url parsing in ``urllib3.util.parse_url`` (properly parse '@' in
-  username, and blank ports like 'hostname:').
-
-* New ``urllib3.connection`` module which contains all the HTTPConnection
-  objects.
-
-* Several ``urllib3.util.Timeout``-related fixes. Also changed constructor
-  signature to a more sensible order. [Backwards incompatible]
-  (Issues #252, #262, #263)
-
-* Use ``backports.ssl_match_hostname`` if it's installed. (Issue #274)
-
-* Added ``.tell()`` method to ``urllib3.response.HTTPResponse`` which
-  returns the number of bytes read so far. (Issue #277)
-
-* Support for platforms without threading. (Issue #289)
-
-* Expand default-port comparison in ``HTTPConnectionPool.is_same_host``
-  to allow a pool with no specified port to be considered equal to to an
-  HTTP/HTTPS url with port 80/443 explicitly provided. (Issue #305)
-
-* Improved default SSL/TLS settings to avoid vulnerabilities.
-  (Issue #309)
-
-* Fixed ``urllib3.poolmanager.ProxyManager`` not retrying on connect errors.
-  (Issue #310)
-
-* Disable Nagle's Algorithm on the socket for non-proxies. A subset of requests
-  will send the entire HTTP request ~200 milliseconds faster; however, some of
-  the resulting TCP packets will be smaller. (Issue #254)
-
-* Increased maximum number of SubjectAltNames in ``urllib3.contrib.pyopenssl``
-  from the default 64 to 1024 in a single certificate. (Issue #318)
-
-* Headers are now passed and stored as a custom
-  ``urllib3.collections_.HTTPHeaderDict`` object rather than a plain ``dict``.
-  (Issue #329, #333)
-
-* Headers no longer lose their case on Python 3. (Issue #236)
-
-* ``urllib3.contrib.pyopenssl`` now uses the operating system's default CA
-  certificates on inject. (Issue #332)
-
-* Requests with ``retries=False`` will immediately raise any exceptions without
-  wrapping them in ``MaxRetryError``. (Issue #348)
-
-* Fixed open socket leak with SSL-related failures. (Issue #344, #348)
-
-
 1.7.1 (2013-09-25)
 ++++++++++++++++++
 
-* Added granular timeout support with new ``urllib3.util.Timeout`` class.
+* Added granular timeout support with new `urllib3.util.Timeout` class.
   (Issue #231)
 
 * Fixed Python 3.4 support. (Issue #238)
diff -Nru python-urllib3-1.8.3/CONTRIBUTORS.txt python-urllib3-1.7.1/CONTRIBUTORS.txt
--- python-urllib3-1.8.3/CONTRIBUTORS.txt	2014-04-18 05:35:23.000000000 +0000
+++ python-urllib3-1.7.1/CONTRIBUTORS.txt	2013-09-25 16:01:09.000000000 +0000
@@ -90,32 +90,5 @@
 * Kevin Burke and Pavel Kirichenko
   * Support for separate connect and request timeouts
 
-* Peter Waller
-  * HTTPResponse.tell() for determining amount received over the wire
-
-* Nipunn Koorapati
-  * Ignore default ports when comparing hosts for equality
-
-* Danilo @dbrgn
-  * Disabled TLS compression by default on Python 3.2+
-  * Disabled TLS compression in pyopenssl contrib module
-  * Configurable cipher suites in pyopenssl contrib module
-
-* Roman Bogorodskiy
-  * Account retries on proxy errors
-
-* Nicolas Delaby
-  * Use the platform-specific CA certificate locations
-
-* Josh Schneier
-  * HTTPHeaderDict and associated tests and docs
-  * Bugfixes, docs, test coverage
-
-* Tahia Khan
-  * Added Timeout examples in docs
-
-* Arthur Grunseid
-  * source_address support and tests (with https://github.com/bui)
-
 * [Your name or handle] <[email or website]>
   * [Brief summary of your changes]
diff -Nru python-urllib3-1.8.3/debian/changelog python-urllib3-1.7.1/debian/changelog
--- python-urllib3-1.8.3/debian/changelog	2014-07-07 19:26:38.000000000 +0000
+++ python-urllib3-1.7.1/debian/changelog	2014-07-18 16:17:27.000000000 +0000
@@ -1,81 +1,14 @@
-python-urllib3 (1.8.3-1) unstable; urgency=medium
+python-urllib3 (1.7.1-1build2) precise; urgency=low
 
-  * New upstream release (Closes: #754090)
-  * debian/patches/01_do-not-use-embedded-python-six.patch
-    - Refresh
-  * debian/patches/04_relax_nosetests_options.patch
-    - Refresh
-
- -- Daniele Tricoli  Mon, 07 Jul 2014 16:09:06 +0200
-
-python-urllib3 (1.8.2-1) unstable; urgency=medium
-
-  * New upstream release
-  * debian/clean
-    - Removed .coverage entry
-  * debian/control
-    - Added python3-coverage, python3-mock, python3-nose to Build-Depends
-    - Bumped python(3)-coverage to (>=3.6)
-    - Removed python-tornado from Build-Depends since it was used only for
-      dummyserver
-  * debian/copyright
-    - Updated copyright years
-  * debian/patches/01_do-not-use-embedded-python-six.patch
-    - Refreshed
-  * debian/patches/02_require-cert-verification.patch
-    - Refreshed
-  * debian/patches/03_no-setuptools.patch
-    - Superseded by debian/patches/setuptools.patch
-  * debian/patches/03_force-setuptools.patch
-    - Renamed from setuptools.patch
-    - Added description
-  * debian/patches/05_do-not-use-embedded-ssl-match-hostname.patch
-    - Do not use embedded copy of ssl.match_hostname
-  * debian/patches/06_relax-test-requirements.patch
-    - Relax version of packages needed for testing
-  * debian/rules
-    - Enabled tests at build time also for Python 3 using the custom build
-      plugin of pybuild
-    - Cleaned .coverage file generated by nose using coverage plugin
-    - No need to remove dummyserver since it is not installed anymore
-
- -- Daniele Tricoli  Wed, 28 May 2014 19:41:18 +0200
-
-python-urllib3 (1.8-2) unstable; urgency=medium
-
-  * Team upload.
-  * d/control:
-    - Fix python-urllib3-whl Depends.
-    - Fix typo in python-urllib3-whl description.
-
- -- Barry Warsaw  Thu, 22 May 2014 18:19:16 -0400
-
-python-urllib3 (1.8-1) unstable; urgency=medium
-
-  * Team upload.
-
-  [ Daniele Tricoli ]
-  * New upstream release
-  * debian/control
-    - Bumped Standards-Version to 3.9.5 (no changes needed)
-  * debian/patches/01_do-not-use-embedded-python-six.patch
-    - Refreshed
-  * debian/patches/02_require-cert-verification.patch
-    - Refreshed
-
-  [ Barry Warsaw ]
-  * d/control:
-    - Added python-setuptools, python3-setuptools, and python3-wheel to
-      Build-Depends.
-    - Added python-urllib3-whl binary package.
-  * d/rules:
-    - Build the universal wheels.
-    - Simplify through use of PYBUILD_NAME.
-  * d/python-urllib3-whl.install: Added.
-  * d/patches/setuptools.patch: Use setuptools.setup() so that the
-    bdist_wheel command will work.
+  * No-change backport to precise
 
- -- Barry Warsaw  Thu, 15 May 2014 17:21:50 -0400
+ -- Lars Butler (larsbutler)  Fri, 18 Jul 2014 16:17:07 +0000
+
+python-urllib3 (1.7.1-1build1) trusty; urgency=medium
+
+  * Rebuild to drop files installed into /usr/share/pyshared.
+
+ -- Matthias Klose  Sun, 23 Feb 2014 13:53:23 +0000
 
 python-urllib3 (1.7.1-1) unstable; urgency=low
diff -Nru python-urllib3-1.8.3/debian/clean python-urllib3-1.7.1/debian/clean
--- python-urllib3-1.8.3/debian/clean	2014-05-28 19:46:12.000000000 +0000
+++ python-urllib3-1.7.1/debian/clean	2013-10-17 21:40:59.000000000 +0000
@@ -1 +1,2 @@
 urllib3.egg-info/*
+.coverage
diff -Nru python-urllib3-1.8.3/debian/control python-urllib3-1.7.1/debian/control
--- python-urllib3-1.8.3/debian/control	2014-05-28 19:46:12.000000000 +0000
+++ python-urllib3-1.7.1/debian/control	2013-10-17 21:40:59.000000000 +0000
@@ -7,19 +7,14 @@
  debhelper (>= 9),
  dh-python,
  python-all (>= 2.6.6-3),
- python-coverage (>= 3.6),
+ python-coverage (>= 3.4),
  python-mock,
  python-nose (>=1.1.2),
- python-setuptools,
  python-six,
+ python-tornado,
  python3-all,
- python3-coverage (>= 3.6),
- python3-mock,
- python3-nose (>=1.1.2),
- python3-setuptools,
  python3-six,
- python3-wheel,
-Standards-Version: 3.9.5
+Standards-Version: 3.9.4
 X-Python-Version: >= 2.6
 X-Python3-Version: >= 3.0
 Homepage: http://urllib3.readthedocs.org
@@ -69,26 +64,3 @@
  building upon.
  .
  This package contains the Python 3 version of the library.
-
-Package: python-urllib3-whl
-Architecture: all
-Depends:
- ${misc:Depends},
- ${python3:Depends},
- python-six-whl
-Recommends:
- ca-certificates
-Description: HTTP library with thread-safe connection pooling
- urllib3 supports features left out of urllib and urllib2 libraries.
- .
-  - Re-use the same socket connection for multiple requests (HTTPConnectionPool
-    and HTTPSConnectionPool) (with optional client-side certificate
-    verification).
-  - File posting (encode_multipart_formdata).
-  - Built-in redirection and retries (optional).
-  - Supports gzip and deflate decoding.
-  - Thread-safe and sanity-safe.
-  - Small and easy to understand codebase perfect for extending and
-    building upon.
- .
- This package contains the universal wheel.
diff -Nru python-urllib3-1.8.3/debian/copyright python-urllib3-1.7.1/debian/copyright
--- python-urllib3-1.8.3/debian/copyright	2014-05-28 19:46:12.000000000 +0000
+++ python-urllib3-1.7.1/debian/copyright	2013-10-17 21:40:59.000000000 +0000
@@ -20,7 +20,7 @@
 License: PSF-2
 
 Files: debian/*
-Copyright: 2012-2014, Daniele Tricoli
+Copyright: 2012-2013, Daniele Tricoli
 License: Expat
 
 License: Expat
diff -Nru python-urllib3-1.8.3/debian/patches/01_do-not-use-embedded-python-six.patch python-urllib3-1.7.1/debian/patches/01_do-not-use-embedded-python-six.patch
--- python-urllib3-1.8.3/debian/patches/01_do-not-use-embedded-python-six.patch	2014-07-07 19:26:26.000000000 +0000
+++ python-urllib3-1.7.1/debian/patches/01_do-not-use-embedded-python-six.patch	2013-10-17 21:40:59.000000000 +0000
@@ -1,14 +1,14 @@
 Description: Do not use embedded copy of python-six.
Author: Daniele Tricoli Forwarded: not-needed -Last-Update: 2014-07-7 +Last-Update: 2013-10-16 --- a/test/test_collections.py +++ b/test/test_collections.py -@@ -4,7 +4,7 @@ - HTTPHeaderDict, - RecentlyUsedContainer as Container - ) +@@ -1,7 +1,7 @@ + import unittest + + from urllib3._collections import RecentlyUsedContainer as Container -from urllib3.packages import six +import six xrange = six.moves.xrange @@ -16,18 +16,18 @@ --- a/urllib3/connectionpool.py +++ b/urllib3/connectionpool.py -@@ -31,7 +31,7 @@ +@@ -55,7 +55,7 @@ ProxyError, ) - from .packages.ssl_match_hostname import CertificateError + from .packages.ssl_match_hostname import CertificateError, match_hostname -from .packages import six +import six - from .connection import ( - port_by_scheme, - DummyConnection, + from .request import RequestMethods + from .response import HTTPResponse + from .util import ( --- a/urllib3/filepost.py +++ b/urllib3/filepost.py -@@ -9,8 +9,8 @@ +@@ -10,8 +10,8 @@ from uuid import uuid4 from io import BytesIO @@ -40,15 +40,26 @@ writer = codecs.lookup('utf-8')[3] --- a/urllib3/response.py +++ b/urllib3/response.py -@@ -11,7 +11,7 @@ +@@ -10,7 +10,7 @@ + import io - from ._collections import HTTPHeaderDict - from .exceptions import DecodeError, ReadTimeoutError + from .exceptions import DecodeError -from .packages.six import string_types as basestring, binary_type +from six import string_types as basestring, binary_type from .util import is_fp_closed +--- a/urllib3/util.py ++++ b/urllib3/util.py +@@ -32,7 +32,7 @@ + except ImportError: + pass + +-from .packages import six ++import six + from .exceptions import LocationParseError, SSLError, TimeoutStateError + + --- a/test/test_filepost.py +++ b/test/test_filepost.py @@ -2,7 +2,7 @@ @@ -62,7 +73,7 @@ BOUNDARY = '!! test boundary !!' 
--- a/dummyserver/handlers.py +++ b/dummyserver/handlers.py -@@ -190,7 +190,7 @@ +@@ -186,7 +186,7 @@ """ import tornado.httputil import email.utils @@ -88,40 +99,8 @@ import unittest from urllib3.fields import guess_content_type, RequestField --from urllib3.packages.six import u -+from six import u +-from urllib3.packages.six import b, u ++from six import b, u class TestRequestField(unittest.TestCase): ---- a/urllib3/_collections.py -+++ b/urllib3/_collections.py -@@ -20,7 +20,7 @@ - from collections import OrderedDict - except ImportError: - from .packages.ordered_dict import OrderedDict --from .packages.six import itervalues -+from six import itervalues - - - __all__ = ['RecentlyUsedContainer', 'HTTPHeaderDict'] ---- a/urllib3/connection.py -+++ b/urllib3/connection.py -@@ -34,7 +34,7 @@ - ConnectTimeoutError, - ) - from .packages.ssl_match_hostname import match_hostname --from .packages import six -+import six - from .util import ( - assert_fingerprint, - resolve_cert_reqs, ---- a/urllib3/util/request.py -+++ b/urllib3/util/request.py -@@ -1,6 +1,6 @@ - from base64 import b64encode - --from ..packages import six -+import six - - - ACCEPT_ENCODING = 'gzip,deflate' diff -Nru python-urllib3-1.8.3/debian/patches/02_require-cert-verification.patch python-urllib3-1.7.1/debian/patches/02_require-cert-verification.patch --- python-urllib3-1.8.3/debian/patches/02_require-cert-verification.patch 2014-05-26 16:32:14.000000000 +0000 +++ python-urllib3-1.7.1/debian/patches/02_require-cert-verification.patch 2013-10-17 21:40:59.000000000 +0000 @@ -3,20 +3,28 @@ CERT_REQUIRED and using the system /etc/ssl/certs/ca-certificates.crt Bug-Ubuntu: https://launchpad.net/bugs/1047054 Bug-Debian: http://bugs.debian.org/686872 -Last-Update: 2014-05-24 +Last-Update: 2013-10-16 --- a/urllib3/connectionpool.py +++ b/urllib3/connectionpool.py -@@ -591,6 +591,8 @@ - ``ssl_version`` are only used if :mod:`ssl` is available and are fed into - :meth:`urllib3.util.ssl_wrap_socket` to upgrade 
the connection socket - into an SSL socket. -+ -+ On Debian, SSL certificate validation is required by default +@@ -87,12 +87,13 @@ + Based on httplib.HTTPSConnection but wraps the socket with + SSL certification. """ +- cert_reqs = None +- ca_certs = None ++ # On Debian, SSL certificate validation is required by default ++ cert_reqs = 'CERT_REQUIRED' ++ ca_certs = '/etc/ssl/certs/ca-certificates.crt' + ssl_version = None - scheme = 'https' -@@ -600,8 +602,8 @@ + def set_cert(self, key_file=None, cert_file=None, +- cert_reqs=None, ca_certs=None, ++ cert_reqs='CERT_REQUIRED', ca_certs='/etc/ssl/certs/ca-certificates.crt', + assert_hostname=None, assert_fingerprint=None): + + self.key_file = key_file +@@ -644,8 +645,8 @@ strict=False, timeout=None, maxsize=1, block=False, headers=None, _proxy=None, _proxy_headers=None, @@ -24,6 +32,6 @@ - ca_certs=None, ssl_version=None, + key_file=None, cert_file=None, cert_reqs='CERT_REQUIRED', + ca_certs='/etc/ssl/certs/ca-certificates.crt', ssl_version=None, - assert_hostname=None, assert_fingerprint=None, - **conn_kw): + assert_hostname=None, assert_fingerprint=None): + HTTPConnectionPool.__init__(self, host, port, strict, timeout, maxsize, diff -Nru python-urllib3-1.8.3/debian/patches/03_force_setuptools.patch python-urllib3-1.7.1/debian/patches/03_force_setuptools.patch --- python-urllib3-1.8.3/debian/patches/03_force_setuptools.patch 2014-05-26 16:32:14.000000000 +0000 +++ python-urllib3-1.7.1/debian/patches/03_force_setuptools.patch 1970-01-01 00:00:00.000000000 +0000 @@ -1,15 +0,0 @@ -Author: Barry Warsaw -Description: Use setuptools.setup() so that the bdist_wheel - command will work. 
-Last-Update: 2014-05-15 - ---- a/setup.py -+++ b/setup.py -@@ -1,6 +1,6 @@ - #!/usr/bin/env python - --from distutils.core import setup -+from setuptools import setup - - import os - import re diff -Nru python-urllib3-1.8.3/debian/patches/03_no-setuptools.patch python-urllib3-1.7.1/debian/patches/03_no-setuptools.patch --- python-urllib3-1.8.3/debian/patches/03_no-setuptools.patch 1970-01-01 00:00:00.000000000 +0000 +++ python-urllib3-1.7.1/debian/patches/03_no-setuptools.patch 2013-05-10 17:22:09.000000000 +0000 @@ -0,0 +1,26 @@ +Description: Do not use setuptools. +Author: Daniele Tricoli +Forwarded: not-needed +Last-Update: 2013-05-10 + +--- a/setup.py ++++ b/setup.py +@@ -5,11 +5,6 @@ + import os + import re + +-try: +- import setuptools +-except ImportError: +- pass # No 'develop' command, oh well. +- + base_path = os.path.dirname(__file__) + + # Get the version (borrowed from SQLAlchemy) +@@ -49,6 +44,4 @@ + 'urllib3.contrib', + ], + requires=requirements, +- tests_require=tests_requirements, +- test_suite='test', + ) diff -Nru python-urllib3-1.8.3/debian/patches/04_relax_nosetests_options.patch python-urllib3-1.7.1/debian/patches/04_relax_nosetests_options.patch --- python-urllib3-1.8.3/debian/patches/04_relax_nosetests_options.patch 2014-07-07 19:26:26.000000000 +0000 +++ python-urllib3-1.7.1/debian/patches/04_relax_nosetests_options.patch 2013-10-17 21:40:59.000000000 +0000 @@ -3,7 +3,7 @@ it will be easier to backport python-urllib3 to Wheezy. 
Author: Daniele Tricoli Forwarded: not-needed -Last-Update: 2014-7-7 +Last-Update: 2013-10-16 --- a/setup.cfg +++ b/setup.cfg @@ -17,4 +17,4 @@ +# cover-min-percentage = 100 cover-erase = true - [flake8] + [egg_info] diff -Nru python-urllib3-1.8.3/debian/patches/05_do-not-use-embedded-ssl-match-hostname.patch python-urllib3-1.7.1/debian/patches/05_do-not-use-embedded-ssl-match-hostname.patch --- python-urllib3-1.8.3/debian/patches/05_do-not-use-embedded-ssl-match-hostname.patch 2014-05-26 16:32:14.000000000 +0000 +++ python-urllib3-1.7.1/debian/patches/05_do-not-use-embedded-ssl-match-hostname.patch 1970-01-01 00:00:00.000000000 +0000 @@ -1,56 +0,0 @@ -Description: Do not use embedded copy of ssl.match_hostname. -Author: Daniele Tricoli -Forwarded: not-needed -Last-Update: 2014-05-25 - ---- a/test/test_connectionpool.py -+++ b/test/test_connectionpool.py -@@ -6,7 +6,7 @@ - HTTPConnectionPool, - ) - from urllib3.util import Timeout --from urllib3.packages.ssl_match_hostname import CertificateError -+from ssl import CertificateError - from urllib3.exceptions import ( - ClosedPoolError, - EmptyPoolError, ---- a/urllib3/connection.py -+++ b/urllib3/connection.py -@@ -38,7 +38,7 @@ - from .exceptions import ( - ConnectTimeoutError, - ) --from .packages.ssl_match_hostname import match_hostname -+from ssl import match_hostname - import six - from .util import ( - assert_fingerprint, ---- a/urllib3/connectionpool.py -+++ b/urllib3/connectionpool.py -@@ -31,7 +31,7 @@ - ReadTimeoutError, - ProxyError, - ) --from .packages.ssl_match_hostname import CertificateError -+from ssl import CertificateError - import six - from .connection import ( - port_by_scheme, ---- a/urllib3/packages/__init__.py -+++ b/urllib3/packages/__init__.py -@@ -1,4 +1,3 @@ - from __future__ import absolute_import - --from . 
import ssl_match_hostname - ---- a/setup.py -+++ b/setup.py -@@ -45,7 +45,7 @@ - url='http://urllib3.readthedocs.org/', - license='MIT', - packages=['urllib3', -- 'urllib3.packages', 'urllib3.packages.ssl_match_hostname', -+ 'urllib3.packages', - 'urllib3.contrib', 'urllib3.util', - ], - requires=requirements, diff -Nru python-urllib3-1.8.3/debian/patches/06_relax-test-requirements.patch python-urllib3-1.7.1/debian/patches/06_relax-test-requirements.patch --- python-urllib3-1.8.3/debian/patches/06_relax-test-requirements.patch 2014-05-26 16:32:14.000000000 +0000 +++ python-urllib3-1.7.1/debian/patches/06_relax-test-requirements.patch 1970-01-01 00:00:00.000000000 +0000 @@ -1,16 +0,0 @@ -Description: Relax version of packages needed for testing. -Author: Daniele Tricoli -Forwarded: not-needed -Last-Update: 2014-05-25 - ---- a/test-requirements.txt -+++ b/test-requirements.txt -@@ -1,4 +1,4 @@ --nose==1.3 --mock==1.0.1 --tornado==3.1.1 --coverage==3.6 -+nose>=1.3 -+mock>=1.0.1 -+tornado>=3.1.1 -+coverage>=3.6 diff -Nru python-urllib3-1.8.3/debian/patches/series python-urllib3-1.7.1/debian/patches/series --- python-urllib3-1.8.3/debian/patches/series 2014-05-26 16:32:14.000000000 +0000 +++ python-urllib3-1.7.1/debian/patches/series 2013-10-17 21:40:59.000000000 +0000 @@ -1,6 +1,4 @@ 01_do-not-use-embedded-python-six.patch 02_require-cert-verification.patch -03_force_setuptools.patch +03_no-setuptools.patch 04_relax_nosetests_options.patch -05_do-not-use-embedded-ssl-match-hostname.patch -06_relax-test-requirements.patch diff -Nru python-urllib3-1.8.3/debian/python-urllib3-whl.install python-urllib3-1.7.1/debian/python-urllib3-whl.install --- python-urllib3-1.8.3/debian/python-urllib3-whl.install 2014-05-26 16:32:14.000000000 +0000 +++ python-urllib3-1.7.1/debian/python-urllib3-whl.install 1970-01-01 00:00:00.000000000 +0000 @@ -1 +0,0 @@ -usr/share/python-wheels diff -Nru python-urllib3-1.8.3/debian/rules python-urllib3-1.7.1/debian/rules --- 
python-urllib3-1.8.3/debian/rules	2014-05-28 19:46:17.000000000 +0000
+++ python-urllib3-1.7.1/debian/rules	2013-10-17 21:40:59.000000000 +0000
@@ -1,8 +1,10 @@
 #!/usr/bin/make -f
 
-export PYBUILD_NAME=urllib3
+export PYBUILD_DESTDIR_python2=debian/python-urllib3/
+export PYBUILD_DESTDIR_python3=debian/python3-urllib3/
 export PYTHONWARNINGS=d
+
 %:
 	dh $@ --with python2,python3 --buildsystem=pybuild
 
@@ -11,16 +13,17 @@
 override_dh_auto_install:
 	dh_auto_install
-	python3 setup.py bdist_wheel \
-		--universal \
-		-d $(CURDIR)/debian/tmp/usr/share/python-wheels
+	# Remove dummyserver/ tests to not pollute namespace.
+	rm -rf debian/python*-urllib3/usr/lib/python*/dist-packages/dummyserver
 
 override_dh_auto_test:
-	PYBUILD_SYSTEM=custom \
-	PYBUILD_TEST_ARGS="cd {build_dir}; {interpreter} -m nose {dir}/test --with-coverage" dh_auto_test
-	# Clean here .coverage because it is created by nose using the coverage
-	# plugin
-	find . -name .coverage -delete
+ifeq (,$(filter nocheck,$(DEB_BUILD_OPTIONS)))
+	# Python3 testing is not possible at the moment because missing
+	# dependencies: python3-coverage.
+	# Upstream is using a python2.7 features: assertRaises() as a context
+	# manager
+	set -ex; python2.7 /usr/bin/nosetests
+endif
 
 override_dh_installchangelogs:
 	dh_installchangelogs CHANGES.rst
diff -Nru python-urllib3-1.8.3/dummyserver/certs/cacert.key python-urllib3-1.7.1/dummyserver/certs/cacert.key
--- python-urllib3-1.8.3/dummyserver/certs/cacert.key	2013-12-16 00:27:37.000000000 +0000
+++ python-urllib3-1.7.1/dummyserver/certs/cacert.key	1970-01-01 00:00:00.000000000 +0000
@@ -1,15 +0,0 @@
------BEGIN RSA PRIVATE KEY-----
-MIICXgIBAAKBgQDKz8a9X2SfNms9TffyNaFO/K42fAjUI1dAM1G8TVoj0a81ay7W
-z4R7V1zfjXFT/WoRW04Y6xek0bff0OtsW+AriooUy7+pPYnrchpAW0p7hPjH1DIB
-Vab01CJMhQ24er92Q1dF4WBv4yKqEaV1IYz1cvqvCCJgAbsWn1I8Cna1lwIDAQAB
-AoGAPpkK+oBrCkk9qFpcYUH0W/DZxK9b+j4+O+6bF8e4Pr4FmjNO7bZ3aap5W/bI
-N+hLyLepzz8guRqR6l8NixCAi+JiVW/agh5o4Jrek8UJWQamwSL4nJ36U3Iw/l7w
-vcN1txfkpsA2SB9QFPGfDKcP3+IZMOZ7uFLzk/gzgLYiCEECQQD+M5Lj+e/sNBkb
-XeIBxWIrPfEeIkk4SDkqImzDjq1FcfxZkvfskqyJgUvcLe5hb+ibY8jqWvtpvFTI
-5v/tzHvPAkEAzD8fNrGz8KiAVTo7+0vrb4AebAdSLZUvbp0AGs5pXUAuQx6VEgz8
-opNKpZjBwAFsZKlwhgDqaChiAt9aKUkzuQJBALlai9I2Dg7SkjgVRdX6wjE7slRB
-tdgXOa+SeHJD1+5aRiJeeu8CqFJ/d/wtdbOQsTCVGwxfmREpZT00ywrvXpsCQQCU
-gs1Kcrn5Ijx2PCrDFbfyUkFMoaIiXNipYGVkGHRKhtFcoo8YGfNUry7W7BTtbNuI
-8h9MgLvw0nQ5zHf9jymZAkEA7o4uA6XSS1zUqEQ55bZRFHcz/99pLH35G906iwVb
-d5rd1Z4Cf5s/91o5gwL6ZP2Ig34CCn+NSL4avgz6K0VUaA==
------END RSA PRIVATE KEY-----
diff -Nru python-urllib3-1.8.3/dummyserver/certs/cacert.pem python-urllib3-1.7.1/dummyserver/certs/cacert.pem
--- python-urllib3-1.8.3/dummyserver/certs/cacert.pem	2013-12-16 00:27:37.000000000 +0000
+++ python-urllib3-1.7.1/dummyserver/certs/cacert.pem	1970-01-01 00:00:00.000000000 +0000
@@ -1,23 +0,0 @@
------BEGIN CERTIFICATE-----
-MIIDzDCCAzWgAwIBAgIJALPrscov4b/jMA0GCSqGSIb3DQEBBQUAMIGBMQswCQYD
-VQQGEwJGSTEOMAwGA1UECBMFZHVtbXkxDjAMBgNVBAcTBWR1bW15MQ4wDAYDVQQK
-EwVkdW1teTEOMAwGA1UECxMFZHVtbXkxETAPBgNVBAMTCFNuYWtlT2lsMR8wHQYJ
-KoZIhvcNAQkBFhBkdW1teUB0ZXN0LmxvY2FsMB4XDTExMTIyMjA3NTYxNVoXDTIx -MTIxOTA3NTYxNVowgYExCzAJBgNVBAYTAkZJMQ4wDAYDVQQIEwVkdW1teTEOMAwG -A1UEBxMFZHVtbXkxDjAMBgNVBAoTBWR1bW15MQ4wDAYDVQQLEwVkdW1teTERMA8G -A1UEAxMIU25ha2VPaWwxHzAdBgkqhkiG9w0BCQEWEGR1bW15QHRlc3QubG9jYWww -gZ8wDQYJKoZIhvcNAQEBBQADgY0AMIGJAoGBAMrPxr1fZJ82az1N9/I1oU78rjZ8 -CNQjV0AzUbxNWiPRrzVrLtbPhHtXXN+NcVP9ahFbThjrF6TRt9/Q62xb4CuKihTL -v6k9ietyGkBbSnuE+MfUMgFVpvTUIkyFDbh6v3ZDV0XhYG/jIqoRpXUhjPVy+q8I -ImABuxafUjwKdrWXAgMBAAGjggFIMIIBRDAdBgNVHQ4EFgQUGXd/I2JiQllF+3Wd -x3NyBLszCi0wgbYGA1UdIwSBrjCBq4AUGXd/I2JiQllF+3Wdx3NyBLszCi2hgYek -gYQwgYExCzAJBgNVBAYTAkZJMQ4wDAYDVQQIEwVkdW1teTEOMAwGA1UEBxMFZHVt -bXkxDjAMBgNVBAoTBWR1bW15MQ4wDAYDVQQLEwVkdW1teTERMA8GA1UEAxMIU25h -a2VPaWwxHzAdBgkqhkiG9w0BCQEWEGR1bW15QHRlc3QubG9jYWyCCQCz67HKL+G/ -4zAPBgNVHRMBAf8EBTADAQH/MBEGCWCGSAGG+EIBAQQEAwIBBjAJBgNVHRIEAjAA -MCsGCWCGSAGG+EIBDQQeFhxUaW55Q0EgR2VuZXJhdGVkIENlcnRpZmljYXRlMA4G -A1UdDwEB/wQEAwICBDANBgkqhkiG9w0BAQUFAAOBgQBnnwtO8onsyhGOvS6cS8af -IRZyAXgouuPeP3Zrf5W80iZcV23u94969sPEIsD8Ujv5u0hUSrToGl4ahOMEOFNL -R5ndQOkh3VsepJnoE+RklZzbHWxU8onWlVzsNBFbclxidzaU3UHmdgXJAJL5nVSd -Zpn44QSS0UXsaC0mBimVNw== ------END CERTIFICATE----- diff -Nru python-urllib3-1.8.3/dummyserver/certs/client_bad.pem python-urllib3-1.7.1/dummyserver/certs/client_bad.pem --- python-urllib3-1.8.3/dummyserver/certs/client_bad.pem 2013-12-16 00:27:37.000000000 +0000 +++ python-urllib3-1.7.1/dummyserver/certs/client_bad.pem 1970-01-01 00:00:00.000000000 +0000 @@ -1,17 +0,0 @@ ------BEGIN CERTIFICATE----- -MIICsDCCAhmgAwIBAgIJAL63Nc6KY94BMA0GCSqGSIb3DQEBBQUAMEUxCzAJBgNV -BAYTAkFVMRMwEQYDVQQIEwpTb21lLVN0YXRlMSEwHwYDVQQKExhJbnRlcm5ldCBX -aWRnaXRzIFB0eSBMdGQwHhcNMTExMDExMjMxMjAzWhcNMjExMDA4MjMxMjAzWjBF -MQswCQYDVQQGEwJBVTETMBEGA1UECBMKU29tZS1TdGF0ZTEhMB8GA1UEChMYSW50 -ZXJuZXQgV2lkZ2l0cyBQdHkgTHRkMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKB -gQC8HGxvblJ4Z0i/lIlG8jrNsFrCqYRAXtj3xdnnjfUpd/kNhU/KahMsG6urAe/4 -Yj+Zqf1sVnt0Cye8FZE3cN9RAcwJrlTCRiicJiXEbA7cPfMphqNGqjVHtmxQ1OsU 
-NHK7cxKa9OX3xmg4h55vxSZYgibAEPO2g3ueGk7RWIAQ8wIDAQABo4GnMIGkMB0G -A1UdDgQWBBSeeo/YRpdn5DK6bUI7ZDJ57pzGdDB1BgNVHSMEbjBsgBSeeo/YRpdn -5DK6bUI7ZDJ57pzGdKFJpEcwRTELMAkGA1UEBhMCQVUxEzARBgNVBAgTClNvbWUt -U3RhdGUxITAfBgNVBAoTGEludGVybmV0IFdpZGdpdHMgUHR5IEx0ZIIJAL63Nc6K -Y94BMAwGA1UdEwQFMAMBAf8wDQYJKoZIhvcNAQEFBQADgYEAOntoloMGt1325UR0 -GGEKQJbiRhLXY4otdgFjEvCG2RPZVLxWYhLMu0LkB6HBYULEuoy12ushtRWlhS1k -6PNRkaZ+LQTSREj6Do4c4zzLxCDmxYmejOz63cIWX2x5IY6qEx2BNOfmM4xEdF8W -LSGGbQfuAghiEh0giAi4AQloDlY= ------END CERTIFICATE----- diff -Nru python-urllib3-1.8.3/dummyserver/certs/client.csr python-urllib3-1.7.1/dummyserver/certs/client.csr --- python-urllib3-1.8.3/dummyserver/certs/client.csr 2013-12-16 00:27:37.000000000 +0000 +++ python-urllib3-1.7.1/dummyserver/certs/client.csr 1970-01-01 00:00:00.000000000 +0000 @@ -1,23 +0,0 @@ ------BEGIN CERTIFICATE----- -MIID1TCCAz6gAwIBAgIBAjANBgkqhkiG9w0BAQUFADCBgTELMAkGA1UEBhMCRkkx -DjAMBgNVBAgTBWR1bW15MQ4wDAYDVQQHEwVkdW1teTEOMAwGA1UEChMFZHVtbXkx -DjAMBgNVBAsTBWR1bW15MREwDwYDVQQDEwhTbmFrZU9pbDEfMB0GCSqGSIb3DQEJ -ARYQZHVtbXlAdGVzdC5sb2NhbDAeFw0xMTEyMjIwNzU5NTlaFw0yMTEyMTgwNzU5 -NTlaMH8xCzAJBgNVBAYTAkZJMQ4wDAYDVQQIEwVkdW1teTEOMAwGA1UEBxMFZHVt -bXkxDjAMBgNVBAoTBWR1bW15MQ4wDAYDVQQLEwVkdW1teTEPMA0GA1UEAxMGY2xp -ZW50MR8wHQYJKoZIhvcNAQkBFhBjbGllbnRAbG9jYWxob3N0MIGfMA0GCSqGSIb3 -DQEBAQUAA4GNADCBiQKBgQDaITA/XCzviqjex+lJJP+pgmQQ+ncUf+PDaFw86kWh -cWuI2eSBVaIaP6SsxYgIODQTjqYGjRogsd1Nvx3gRdIMEagTfVQyVwfDfNp8aT8v -SY/wDYFjsD07asmjGvwiu0sLp4t/tMz+x5ELlU4+hGnmPInH6hLK150DqgbNmJus -3wIDAQABo4IBXDCCAVgwCQYDVR0TBAIwADARBglghkgBhvhCAQEEBAMCBLAwKwYJ -YIZIAYb4QgENBB4WHFRpbnlDQSBHZW5lcmF0ZWQgQ2VydGlmaWNhdGUwHQYDVR0O -BBYEFG71FCU2yisH1GyrcqYaPKVeTWxBMIG2BgNVHSMEga4wgauAFBl3fyNiYkJZ -Rft1ncdzcgS7MwotoYGHpIGEMIGBMQswCQYDVQQGEwJGSTEOMAwGA1UECBMFZHVt -bXkxDjAMBgNVBAcTBWR1bW15MQ4wDAYDVQQKEwVkdW1teTEOMAwGA1UECxMFZHVt -bXkxETAPBgNVBAMTCFNuYWtlT2lsMR8wHQYJKoZIhvcNAQkBFhBkdW1teUB0ZXN0 -LmxvY2FsggkAs+uxyi/hv+MwCQYDVR0SBAIwADAbBgNVHREEFDASgRBjbGllbnRA 
-bG9jYWxob3N0MAsGA1UdDwQEAwIFoDANBgkqhkiG9w0BAQUFAAOBgQDEwZmp3yE8 -R4U9Ob/IeEo6O3p0T4o7GNvufGksM/mELmzyC+Qh/Ul6fNn+IhdKWpo61sMZou+n -eOufXVouc8dGhQ1Qi5s0i51d/ouhfYNs+AGRcpwEieVjZhgE1XfrNwvvjIx3yPtK -m9LSmCtVKcTWqOHQywKn+G83a+7bsh835Q== ------END CERTIFICATE----- diff -Nru python-urllib3-1.8.3/dummyserver/certs/client.key python-urllib3-1.7.1/dummyserver/certs/client.key --- python-urllib3-1.8.3/dummyserver/certs/client.key 2013-12-16 00:27:37.000000000 +0000 +++ python-urllib3-1.7.1/dummyserver/certs/client.key 1970-01-01 00:00:00.000000000 +0000 @@ -1,15 +0,0 @@ ------BEGIN RSA PRIVATE KEY----- -MIICWwIBAAKBgQDaITA/XCzviqjex+lJJP+pgmQQ+ncUf+PDaFw86kWhcWuI2eSB -VaIaP6SsxYgIODQTjqYGjRogsd1Nvx3gRdIMEagTfVQyVwfDfNp8aT8vSY/wDYFj -sD07asmjGvwiu0sLp4t/tMz+x5ELlU4+hGnmPInH6hLK150DqgbNmJus3wIDAQAB -AoGAKMMg+AYqo4z+57rl/nQ6jpu+RWn4zMzlbEPZUMzavEOsu8M0L3MoOs1/4YV8 -WUTffnQe1ISTyF5Uo82+MIX7rUtfJITFSQrIWe7AGdm6Nir8TQQ7fD97modXyAUx -69I9SQjQlseg5PCRCp/DfcBncvHeYuf8gAJK5FfC1VW1cQECQQDvzFNoGrwnsrtm -4gj1Kt0c20jkIYFN6iQ6Sjs/1fk1cXDeWzjPaa92zF+i+02Ma/eWJ0ZVrhisw6sv -zxGp+ByBAkEA6N4SpuGWytJqCRfwenQZ4Oa8mNcVo5ulGf/eUHVXvHewWxQ7xWRi -iWUj/z1byR9+yno8Yfd04kaNCPYN/ICZXwJAAf5//xCh2e6pkkx06J0Ho7LLI2KH -8b7tuDJf1cMQxHoCB0dY7JijZeiDLxbJ6U4IjA4djp7ZA67I4KfnLLOsgQJARLZS -dp+WKR7RXwGLWfasNCqhd8/veKlSnEtdxAv76Ya/qQBdaq9mS/hmGMh4Lu52MTTE -YHvuJ159+yjvk5Q2rQJABjlU1+GZqwv/7QM7GxfJO+GPI4PHv5Yji5s7LLu2c6dL -XY2XiTHQL9PnPrKp3+qDDzxjyej30lfz4he6E5pI+g== ------END RSA PRIVATE KEY----- diff -Nru python-urllib3-1.8.3/dummyserver/certs/client.pem python-urllib3-1.7.1/dummyserver/certs/client.pem --- python-urllib3-1.8.3/dummyserver/certs/client.pem 2013-12-16 00:27:37.000000000 +0000 +++ python-urllib3-1.7.1/dummyserver/certs/client.pem 1970-01-01 00:00:00.000000000 +0000 @@ -1,22 +0,0 @@ ------BEGIN CERTIFICATE----- -MIIDqDCCAxGgAwIBAgIBATANBgkqhkiG9w0BAQUFADCBgTELMAkGA1UEBhMCRkkx -DjAMBgNVBAgTBWR1bW15MQ4wDAYDVQQHEwVkdW1teTEOMAwGA1UEChMFZHVtbXkx 
-DjAMBgNVBAsTBWR1bW15MREwDwYDVQQDEwhTbmFrZU9pbDEfMB0GCSqGSIb3DQEJ -ARYQZHVtbXlAdGVzdC5sb2NhbDAeFw0xMTEyMjIwNzU4NDBaFw0yMTEyMTgwNzU4 -NDBaMGExCzAJBgNVBAYTAkZJMQ4wDAYDVQQIEwVkdW1teTEOMAwGA1UEBxMFZHVt -bXkxDjAMBgNVBAoTBWR1bW15MQ4wDAYDVQQLEwVkdW1teTESMBAGA1UEAxMJbG9j -YWxob3N0MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDXe3FqmCWvP8XPxqtT -+0bfL1Tvzvebi46k0WIcUV8bP3vyYiSRXG9ALmyzZH4GHY9UVs4OEDkCMDOBSezB -0y9ai/9doTNcaictdEBu8nfdXKoTtzrn+VX4UPrkH5hm7NQ1fTQuj1MR7yBCmYqN -3Q2Q+Efuujyx0FwBzAuy1aKYuwIDAQABo4IBTTCCAUkwCQYDVR0TBAIwADARBglg -hkgBhvhCAQEEBAMCBkAwKwYJYIZIAYb4QgENBB4WHFRpbnlDQSBHZW5lcmF0ZWQg -Q2VydGlmaWNhdGUwHQYDVR0OBBYEFBvnSuVKLNPEFMAFqHw292vGHGJSMIG2BgNV -HSMEga4wgauAFBl3fyNiYkJZRft1ncdzcgS7MwotoYGHpIGEMIGBMQswCQYDVQQG -EwJGSTEOMAwGA1UECBMFZHVtbXkxDjAMBgNVBAcTBWR1bW15MQ4wDAYDVQQKEwVk -dW1teTEOMAwGA1UECxMFZHVtbXkxETAPBgNVBAMTCFNuYWtlT2lsMR8wHQYJKoZI -hvcNAQkBFhBkdW1teUB0ZXN0LmxvY2FsggkAs+uxyi/hv+MwCQYDVR0SBAIwADAZ -BgNVHREEEjAQgQ5yb290QGxvY2FsaG9zdDANBgkqhkiG9w0BAQUFAAOBgQBXdedG -XHLPmOVBeKWjTmaekcaQi44snhYqE1uXRoIQXQsyw+Ya5+n/uRxPKZO/C78EESL0 -8rnLTdZXm4GBYyHYmMy0AdWR7y030viOzAkWWRRRbuecsaUzFCI+F9jTV5LHuRzz -V8fUKwiEE9swzkWgMpfVTPFuPgzxwG9gMbrBfg== ------END CERTIFICATE----- diff -Nru python-urllib3-1.8.3/dummyserver/certs/server.crt python-urllib3-1.7.1/dummyserver/certs/server.crt --- python-urllib3-1.8.3/dummyserver/certs/server.crt 2013-12-16 00:27:37.000000000 +0000 +++ python-urllib3-1.7.1/dummyserver/certs/server.crt 1970-01-01 00:00:00.000000000 +0000 @@ -1,22 +0,0 @@ ------BEGIN CERTIFICATE----- -MIIDqDCCAxGgAwIBAgIBATANBgkqhkiG9w0BAQUFADCBgTELMAkGA1UEBhMCRkkx -DjAMBgNVBAgTBWR1bW15MQ4wDAYDVQQHEwVkdW1teTEOMAwGA1UEChMFZHVtbXkx -DjAMBgNVBAsTBWR1bW15MREwDwYDVQQDEwhTbmFrZU9pbDEfMB0GCSqGSIb3DQEJ -ARYQZHVtbXlAdGVzdC5sb2NhbDAeFw0xMTEyMjIwNzU4NDBaFw0yMTEyMTgwNzU4 -NDBaMGExCzAJBgNVBAYTAkZJMQ4wDAYDVQQIEwVkdW1teTEOMAwGA1UEBxMFZHVt -bXkxDjAMBgNVBAoTBWR1bW15MQ4wDAYDVQQLEwVkdW1teTESMBAGA1UEAxMJbG9j -YWxob3N0MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDXe3FqmCWvP8XPxqtT 
-+0bfL1Tvzvebi46k0WIcUV8bP3vyYiSRXG9ALmyzZH4GHY9UVs4OEDkCMDOBSezB -0y9ai/9doTNcaictdEBu8nfdXKoTtzrn+VX4UPrkH5hm7NQ1fTQuj1MR7yBCmYqN -3Q2Q+Efuujyx0FwBzAuy1aKYuwIDAQABo4IBTTCCAUkwCQYDVR0TBAIwADARBglg -hkgBhvhCAQEEBAMCBkAwKwYJYIZIAYb4QgENBB4WHFRpbnlDQSBHZW5lcmF0ZWQg -Q2VydGlmaWNhdGUwHQYDVR0OBBYEFBvnSuVKLNPEFMAFqHw292vGHGJSMIG2BgNV -HSMEga4wgauAFBl3fyNiYkJZRft1ncdzcgS7MwotoYGHpIGEMIGBMQswCQYDVQQG -EwJGSTEOMAwGA1UECBMFZHVtbXkxDjAMBgNVBAcTBWR1bW15MQ4wDAYDVQQKEwVk -dW1teTEOMAwGA1UECxMFZHVtbXkxETAPBgNVBAMTCFNuYWtlT2lsMR8wHQYJKoZI -hvcNAQkBFhBkdW1teUB0ZXN0LmxvY2FsggkAs+uxyi/hv+MwCQYDVR0SBAIwADAZ -BgNVHREEEjAQgQ5yb290QGxvY2FsaG9zdDANBgkqhkiG9w0BAQUFAAOBgQBXdedG -XHLPmOVBeKWjTmaekcaQi44snhYqE1uXRoIQXQsyw+Ya5+n/uRxPKZO/C78EESL0 -8rnLTdZXm4GBYyHYmMy0AdWR7y030viOzAkWWRRRbuecsaUzFCI+F9jTV5LHuRzz -V8fUKwiEE9swzkWgMpfVTPFuPgzxwG9gMbrBfg== ------END CERTIFICATE----- diff -Nru python-urllib3-1.8.3/dummyserver/certs/server.csr python-urllib3-1.7.1/dummyserver/certs/server.csr --- python-urllib3-1.8.3/dummyserver/certs/server.csr 2013-12-16 00:27:37.000000000 +0000 +++ python-urllib3-1.7.1/dummyserver/certs/server.csr 1970-01-01 00:00:00.000000000 +0000 @@ -1,22 +0,0 @@ ------BEGIN CERTIFICATE----- -MIIDqDCCAxGgAwIBAgIBATANBgkqhkiG9w0BAQUFADCBgTELMAkGA1UEBhMCRkkx -DjAMBgNVBAgTBWR1bW15MQ4wDAYDVQQHEwVkdW1teTEOMAwGA1UEChMFZHVtbXkx -DjAMBgNVBAsTBWR1bW15MREwDwYDVQQDEwhTbmFrZU9pbDEfMB0GCSqGSIb3DQEJ -ARYQZHVtbXlAdGVzdC5sb2NhbDAeFw0xMTEyMjIwNzU4NDBaFw0yMTEyMTgwNzU4 -NDBaMGExCzAJBgNVBAYTAkZJMQ4wDAYDVQQIEwVkdW1teTEOMAwGA1UEBxMFZHVt -bXkxDjAMBgNVBAoTBWR1bW15MQ4wDAYDVQQLEwVkdW1teTESMBAGA1UEAxMJbG9j -YWxob3N0MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDXe3FqmCWvP8XPxqtT -+0bfL1Tvzvebi46k0WIcUV8bP3vyYiSRXG9ALmyzZH4GHY9UVs4OEDkCMDOBSezB -0y9ai/9doTNcaictdEBu8nfdXKoTtzrn+VX4UPrkH5hm7NQ1fTQuj1MR7yBCmYqN -3Q2Q+Efuujyx0FwBzAuy1aKYuwIDAQABo4IBTTCCAUkwCQYDVR0TBAIwADARBglg -hkgBhvhCAQEEBAMCBkAwKwYJYIZIAYb4QgENBB4WHFRpbnlDQSBHZW5lcmF0ZWQg -Q2VydGlmaWNhdGUwHQYDVR0OBBYEFBvnSuVKLNPEFMAFqHw292vGHGJSMIG2BgNV 
-HSMEga4wgauAFBl3fyNiYkJZRft1ncdzcgS7MwotoYGHpIGEMIGBMQswCQYDVQQG -EwJGSTEOMAwGA1UECBMFZHVtbXkxDjAMBgNVBAcTBWR1bW15MQ4wDAYDVQQKEwVk -dW1teTEOMAwGA1UECxMFZHVtbXkxETAPBgNVBAMTCFNuYWtlT2lsMR8wHQYJKoZI -hvcNAQkBFhBkdW1teUB0ZXN0LmxvY2FsggkAs+uxyi/hv+MwCQYDVR0SBAIwADAZ -BgNVHREEEjAQgQ5yb290QGxvY2FsaG9zdDANBgkqhkiG9w0BAQUFAAOBgQBXdedG -XHLPmOVBeKWjTmaekcaQi44snhYqE1uXRoIQXQsyw+Ya5+n/uRxPKZO/C78EESL0 -8rnLTdZXm4GBYyHYmMy0AdWR7y030viOzAkWWRRRbuecsaUzFCI+F9jTV5LHuRzz -V8fUKwiEE9swzkWgMpfVTPFuPgzxwG9gMbrBfg== ------END CERTIFICATE----- diff -Nru python-urllib3-1.8.3/dummyserver/certs/server.key python-urllib3-1.7.1/dummyserver/certs/server.key --- python-urllib3-1.8.3/dummyserver/certs/server.key 2013-12-16 00:27:37.000000000 +0000 +++ python-urllib3-1.7.1/dummyserver/certs/server.key 1970-01-01 00:00:00.000000000 +0000 @@ -1,15 +0,0 @@ ------BEGIN RSA PRIVATE KEY----- -MIICXgIBAAKBgQDXe3FqmCWvP8XPxqtT+0bfL1Tvzvebi46k0WIcUV8bP3vyYiSR -XG9ALmyzZH4GHY9UVs4OEDkCMDOBSezB0y9ai/9doTNcaictdEBu8nfdXKoTtzrn -+VX4UPrkH5hm7NQ1fTQuj1MR7yBCmYqN3Q2Q+Efuujyx0FwBzAuy1aKYuwIDAQAB -AoGBANOGBM6bbhq7ImYU4qf8+RQrdVg2tc9Fzo+yTnn30sF/rx8/AiCDOV4qdGAh -HKjKKaGj2H/rotqoEFcxBy05LrgJXxydBP72e9PYhNgKOcSmCQu4yALIPEXfKuIM -zgAErHVJ2l79fif3D4hzNyz+u5E1A9n3FG9cgaJSiYP8IG2RAkEA82GZ8rBkSGQQ -ZQ3oFuzPAAL21lbj8D0p76fsCpvS7427DtZDOjhOIKZmaeykpv+qSzRraqEqjDRi -S4kjQvwh6QJBAOKniZ+NDo2lSpbOFk+XlmABK1DormVpj8KebHEZYok1lRI+WiX9 -Nnoe9YLgix7++6H5SBBCcTB4HvM+5A4BuwMCQQChcX/eZbXP81iQwB3Rfzp8xnqY -icDf7qKvz9Ma4myU7Y5E9EpaB1mD/P14jDpYcMW050vNyqTfpiwB8TFL0NZpAkEA -02jkFH9UyMgZV6qo4tqI98l/ZrtyF8OrxSNSEPhVkZf6EQc5vN9/lc8Uv1vESEgb -3AwRrKDcxRH2BHtv6qSwkwJAGjqnkIcEkA75r1e55/EF2chcZW1+tpwKupE8CtAH -VXGd5DVwt4cYWkLUj2gF2fJbV97uu2MAg5CFDb+vQ6p5eA== ------END RSA PRIVATE KEY----- diff -Nru python-urllib3-1.8.3/dummyserver/certs/server.key.org python-urllib3-1.7.1/dummyserver/certs/server.key.org --- python-urllib3-1.8.3/dummyserver/certs/server.key.org 2013-12-16 00:27:37.000000000 +0000 +++ 
python-urllib3-1.7.1/dummyserver/certs/server.key.org 1970-01-01 00:00:00.000000000 +0000 @@ -1,12 +0,0 @@ ------BEGIN RSA PRIVATE KEY----- -Proc-Type: 4,ENCRYPTED -DEK-Info: DES-EDE3-CBC,8B3708EAD53963D4 - -uyLo4sFmSo7+K1uVgSENI+85JsG5o1JmovvxD/ucUl9CDhDj4KgFzs95r7gjjlhS -kA/hIY8Ec9i6T3zMXpAswWI5Mv2LE+UdYR5h60dYtIinLC7KF0QIztSecNWy20Bi -/NkobZhN7VZUuCEoSRWj4Ia3EuATF8Y9ZRGFPNsqMbSAhsGZ1P5xbDMEpE+5PbJP -LvdF9yWDT77rHeI4CKV4aP/yxtm1heEhKw5o6hdpPBQajPpjSQbh7/V6Qd0QsKcV -n27kPnSabsTbbc2IR40il4mZfHvXAlp4KoHL3RUgaons7q0hAUpUi+vJXbEukGGt -3dlyWwKwEFS7xBQ1pQvzcePI4/fRQxhZNxeFZW6n12Y3X61vg1IsG7usPhRe3iDP -3g1MXQMAhxaECnDN9b006IeoYdaktd4wrs/fn8x6Yz4= ------END RSA PRIVATE KEY----- diff -Nru python-urllib3-1.8.3/dummyserver/handlers.py python-urllib3-1.7.1/dummyserver/handlers.py --- python-urllib3-1.8.3/dummyserver/handlers.py 2014-04-18 05:35:23.000000000 +0000 +++ python-urllib3-1.7.1/dummyserver/handlers.py 2013-08-14 21:43:05.000000000 +0000 @@ -70,10 +70,6 @@ "Render simple message" return Response("Dummy server!") - def source_address(self, request): - """Return the requester's IP address.""" - return Response(request.remote_ip) - def set_up(self, request): test_type = request.params.get('test_type') test_id = request.params.get('test_id') diff -Nru python-urllib3-1.8.3/dummyserver/server.py python-urllib3-1.7.1/dummyserver/server.py --- python-urllib3-1.8.3/dummyserver/server.py 2014-03-04 19:08:03.000000000 +0000 +++ python-urllib3-1.7.1/dummyserver/server.py 2013-08-14 21:43:05.000000000 +0000 @@ -5,21 +5,21 @@ """ from __future__ import print_function -import errno import logging import os -import random -import string import sys import threading import socket -from tornado.platform.auto import set_close_exec +from tornado import netutil import tornado.wsgi import tornado.httpserver import tornado.ioloop import tornado.web +from dummyserver.handlers import TestingApp +from dummyserver.proxy import ProxyHandler + log = logging.getLogger(__name__) @@ -51,7 +51,7 @@ 
self.ready_event = ready_event def _start_server(self): - sock = socket.socket(socket.AF_INET6) + sock = socket.socket() if sys.platform != 'win32': sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1) sock.bind((self.host, 0)) @@ -70,112 +70,59 @@ self.server = self._start_server() -# FIXME: there is a pull request patching bind_sockets in Tornado directly. -# If it gets merged and released we can drop this and use -# `tornado.netutil.bind_sockets` again. -# https://github.com/facebook/tornado/pull/977 - -def bind_sockets(port, address=None, family=socket.AF_UNSPEC, backlog=128, - flags=None): - """Creates listening sockets bound to the given port and address. - - Returns a list of socket objects (multiple sockets are returned if - the given address maps to multiple IP addresses, which is most common - for mixed IPv4 and IPv6 use). - - Address may be either an IP address or hostname. If it's a hostname, - the server will listen on all IP addresses associated with the - name. Address may be an empty string or None to listen on all - available interfaces. Family may be set to either `socket.AF_INET` - or `socket.AF_INET6` to restrict to IPv4 or IPv6 addresses, otherwise - both will be used if available. +class TornadoServerThread(threading.Thread): + app = tornado.wsgi.WSGIContainer(TestingApp()) - The ``backlog`` argument has the same meaning as for - `socket.listen() `. + def __init__(self, host='localhost', scheme='http', certs=None, + ready_event=None): + threading.Thread.__init__(self) - ``flags`` is a bitmask of AI_* flags to `~socket.getaddrinfo`, like - ``socket.AI_PASSIVE | socket.AI_NUMERICHOST``. - """ - sockets = [] - if address == "": - address = None - if not socket.has_ipv6 and family == socket.AF_UNSPEC: - # Python can be compiled with --disable-ipv6, which causes - # operations on AF_INET6 sockets to fail, but does not - # automatically exclude those results from getaddrinfo - # results. 
- # http://bugs.python.org/issue16208 - family = socket.AF_INET - if flags is None: - flags = socket.AI_PASSIVE - binded_port = None - for res in set(socket.getaddrinfo(address, port, family, - socket.SOCK_STREAM, 0, flags)): - af, socktype, proto, canonname, sockaddr = res - try: - sock = socket.socket(af, socktype, proto) - except socket.error as e: - if e.args[0] == errno.EAFNOSUPPORT: - continue - raise - set_close_exec(sock.fileno()) - if os.name != 'nt': - sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1) - if af == socket.AF_INET6: - # On linux, ipv6 sockets accept ipv4 too by default, - # but this makes it impossible to bind to both - # 0.0.0.0 in ipv4 and :: in ipv6. On other systems, - # separate sockets *must* be used to listen for both ipv4 - # and ipv6. For consistency, always disable ipv4 on our - # ipv6 sockets and use a separate ipv4 socket when needed. - # - # Python 2.x on windows doesn't have IPPROTO_IPV6. - if hasattr(socket, "IPPROTO_IPV6"): - sock.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 1) - - # automatic port allocation with port=None - # should bind on the same port on IPv4 and IPv6 - host, requested_port = sockaddr[:2] - if requested_port == 0 and binded_port is not None: - sockaddr = tuple([host, binded_port] + list(sockaddr[2:])) - - sock.setblocking(0) - sock.bind(sockaddr) - binded_port = sock.getsockname()[1] - sock.listen(backlog) - sockets.append(sock) - return sockets - - -def run_tornado_app(app, io_loop, certs, scheme, host): - if scheme == 'https': - http_server = tornado.httpserver.HTTPServer(app, ssl_options=certs, - io_loop=io_loop) - else: - http_server = tornado.httpserver.HTTPServer(app, io_loop=io_loop) - - sockets = bind_sockets(None, address=host) - port = sockets[0].getsockname()[1] - http_server.add_sockets(sockets) - return http_server, port + self.host = host + self.scheme = scheme + self.certs = certs + self.ready_event = ready_event + def _start_server(self): + if self.scheme == 'https': + 
http_server = tornado.httpserver.HTTPServer(self.app, + ssl_options=self.certs) + else: + http_server = tornado.httpserver.HTTPServer(self.app) -def run_loop_in_thread(io_loop): - t = threading.Thread(target=io_loop.start) - t.start() - return t + family = socket.AF_INET6 if ':' in self.host else socket.AF_INET + sock, = netutil.bind_sockets(None, address=self.host, family=family) + self.port = sock.getsockname()[1] + http_server.add_sockets([sock]) + return http_server + def run(self): + self.ioloop = tornado.ioloop.IOLoop.instance() + self.server = self._start_server() + if self.ready_event: + self.ready_event.set() + self.ioloop.start() -def get_unreachable_address(): - while True: - host = ''.join(random.choice(string.ascii_lowercase) - for _ in range(60)) - sockaddr = (host, 54321) - - # check if we are really "lucky" and hit an actual server - try: - s = socket.create_connection(sockaddr) - except socket.error: - return sockaddr - else: - s.close() + def stop(self): + self.ioloop.add_callback(self.server.stop) + self.ioloop.add_callback(self.ioloop.stop) + + +class ProxyServerThread(TornadoServerThread): + app = tornado.web.Application([(r'.*', ProxyHandler)]) + + +if __name__ == '__main__': + log.setLevel(logging.DEBUG) + log.addHandler(logging.StreamHandler(sys.stderr)) + + from urllib3 import get_host + + url = "http://localhost:8081" + if len(sys.argv) > 1: + url = sys.argv[1] + + print("Starting WSGI server at: %s" % url) + + scheme, host, port = get_host(url) + t = TornadoServerThread(scheme=scheme, host=host, port=port) + t.start() diff -Nru python-urllib3-1.8.3/dummyserver/testcase.py python-urllib3-1.7.1/dummyserver/testcase.py --- python-urllib3-1.8.3/dummyserver/testcase.py 2014-03-04 19:08:03.000000000 +0000 +++ python-urllib3-1.7.1/dummyserver/testcase.py 2013-08-14 21:43:05.000000000 +0000 @@ -2,17 +2,14 @@ import socket import threading from nose.plugins.skip import SkipTest -from tornado import ioloop, web, wsgi from dummyserver.server import 
( - SocketServerThread, - run_tornado_app, - run_loop_in_thread, + TornadoServerThread, SocketServerThread, DEFAULT_CERTS, + ProxyServerThread, ) -from dummyserver.handlers import TestingApp -from dummyserver.proxy import ProxyHandler +has_ipv6 = hasattr(socket, 'has_ipv6') class SocketDummyServerTestCase(unittest.TestCase): @@ -36,7 +33,7 @@ @classmethod def tearDownClass(cls): if hasattr(cls, 'server_thread'): - cls.server_thread.join(0.1) + cls.server_thread.join() class HTTPDummyServerTestCase(unittest.TestCase): @@ -47,16 +44,18 @@ @classmethod def _start_server(cls): - cls.io_loop = ioloop.IOLoop() - app = wsgi.WSGIContainer(TestingApp()) - cls.server, cls.port = run_tornado_app(app, cls.io_loop, cls.certs, - cls.scheme, cls.host) - cls.server_thread = run_loop_in_thread(cls.io_loop) + ready_event = threading.Event() + cls.server_thread = TornadoServerThread(host=cls.host, + scheme=cls.scheme, + certs=cls.certs, + ready_event=ready_event) + cls.server_thread.start() + ready_event.wait() + cls.port = cls.server_thread.port @classmethod def _stop_server(cls): - cls.io_loop.add_callback(cls.server.stop) - cls.io_loop.add_callback(cls.io_loop.stop) + cls.server_thread.stop() cls.server_thread.join() @classmethod @@ -88,29 +87,27 @@ @classmethod def setUpClass(cls): - cls.io_loop = ioloop.IOLoop() - - app = wsgi.WSGIContainer(TestingApp()) - cls.http_server, cls.http_port = run_tornado_app( - app, cls.io_loop, None, 'http', cls.http_host) + cls.http_thread = TornadoServerThread(host=cls.http_host, + scheme='http') + cls.http_thread._start_server() + cls.http_port = cls.http_thread.port + + cls.https_thread = TornadoServerThread( + host=cls.https_host, scheme='https', certs=cls.https_certs) + cls.https_thread._start_server() + cls.https_port = cls.https_thread.port - app = wsgi.WSGIContainer(TestingApp()) - cls.https_server, cls.https_port = run_tornado_app( - app, cls.io_loop, cls.https_certs, 'https', cls.http_host) - - app = web.Application([(r'.*', 
ProxyHandler)]) - cls.proxy_server, cls.proxy_port = run_tornado_app( - app, cls.io_loop, None, 'http', cls.proxy_host) - - cls.server_thread = run_loop_in_thread(cls.io_loop) + ready_event = threading.Event() + cls.proxy_thread = ProxyServerThread( + host=cls.proxy_host, ready_event=ready_event) + cls.proxy_thread.start() + ready_event.wait() + cls.proxy_port = cls.proxy_thread.port @classmethod def tearDownClass(cls): - cls.io_loop.add_callback(cls.http_server.stop) - cls.io_loop.add_callback(cls.https_server.stop) - cls.io_loop.add_callback(cls.proxy_server.stop) - cls.io_loop.add_callback(cls.io_loop.stop) - cls.server_thread.join() + cls.proxy_thread.stop() + cls.proxy_thread.join() class IPv6HTTPDummyServerTestCase(HTTPDummyServerTestCase): @@ -118,7 +115,7 @@ @classmethod def setUpClass(cls): - if not socket.has_ipv6: + if not has_ipv6: raise SkipTest('IPv6 not available') else: super(IPv6HTTPDummyServerTestCase, cls).setUpClass() diff -Nru python-urllib3-1.8.3/MANIFEST.in python-urllib3-1.7.1/MANIFEST.in --- python-urllib3-1.8.3/MANIFEST.in 2014-06-24 18:19:21.000000000 +0000 +++ python-urllib3-1.7.1/MANIFEST.in 2012-02-06 16:01:56.000000000 +0000 @@ -1,3 +1 @@ include README.rst CHANGES.rst LICENSE.txt CONTRIBUTORS.txt test-requirements.txt -recursive-include dummyserver *.* -prune *.pyc diff -Nru python-urllib3-1.8.3/PKG-INFO python-urllib3-1.7.1/PKG-INFO --- python-urllib3-1.8.3/PKG-INFO 2014-06-24 18:22:53.000000000 +0000 +++ python-urllib3-1.7.1/PKG-INFO 2013-09-25 16:06:36.000000000 +0000 @@ -1,6 +1,6 @@ -Metadata-Version: 1.1 +Metadata-Version: 1.0 Name: urllib3 -Version: 1.8.3 +Version: 1.7.1 Summary: HTTP library with thread-safe connection pooling, file post, and more. Home-page: http://urllib3.readthedocs.org/ Author: Andrey Petrov @@ -28,14 +28,7 @@ - Tested on Python 2.6+ and Python 3.2+, 100% unit test coverage. - Small and easy to understand codebase perfect for extending and building upon. 
For a more comprehensive solution, have a look at - `Requests `_ which is also powered by ``urllib3``. - - You might already be using urllib3! - =================================== - - ``urllib3`` powers `many great Python libraries `_, - including ``pip`` and ``requests``. - + `Requests `_ which is also powered by urllib3. What's wrong with urllib and urllib2? ===================================== @@ -106,7 +99,6 @@ py27: commands succeeded py32: commands succeeded py33: commands succeeded - py34: commands succeeded Note that code coverage less than 100% is regarded as a failing run. @@ -129,101 +121,10 @@ Changes ======= - 1.8.3 (2014-06-23) - ++++++++++++++++++ - - * Fix TLS verification when using a proxy in Python 3.4.1. (Issue #385) - - * Add ``disable_cache`` option to ``urllib3.util.make_headers``. (Issue #393) - - * Wrap ``socket.timeout`` exception with - ``urllib3.exceptions.ReadTimeoutError``. (Issue #399) - - * Fixed proxy-related bug where connections were being reused incorrectly. - (Issues #366, #369) - - * Added ``socket_options`` keyword parameter which allows to define - ``setsockopt`` configuration of new sockets. (Issue #397) - - * Removed ``HTTPConnection.tcp_nodelay`` in favor of - ``HTTPConnection.default_socket_options``. (Issue #397) - - * Fixed ``TypeError`` bug in Python 2.6.4. (Issue #411) - - - 1.8.2 (2014-04-17) - ++++++++++++++++++ - - * Fix ``urllib3.util`` not being included in the package. - - - 1.8.1 (2014-04-17) - ++++++++++++++++++ - - * Fix AppEngine bug of HTTPS requests going out as HTTP. (Issue #356) - - * Don't install ``dummyserver`` into ``site-packages`` as it's only needed - for the test suite. (Issue #362) - - * Added support for specifying ``source_address``. (Issue #352) - - - 1.8 (2014-03-04) - ++++++++++++++++ - - * Improved url parsing in ``urllib3.util.parse_url`` (properly parse '@' in - username, and blank ports like 'hostname:'). 
- - * New ``urllib3.connection`` module which contains all the HTTPConnection - objects. - - * Several ``urllib3.util.Timeout``-related fixes. Also changed constructor - signature to a more sensible order. [Backwards incompatible] - (Issues #252, #262, #263) - - * Use ``backports.ssl_match_hostname`` if it's installed. (Issue #274) - - * Added ``.tell()`` method to ``urllib3.response.HTTPResponse`` which - returns the number of bytes read so far. (Issue #277) - - * Support for platforms without threading. (Issue #289) - - * Expand default-port comparison in ``HTTPConnectionPool.is_same_host`` - to allow a pool with no specified port to be considered equal to to an - HTTP/HTTPS url with port 80/443 explicitly provided. (Issue #305) - - * Improved default SSL/TLS settings to avoid vulnerabilities. - (Issue #309) - - * Fixed ``urllib3.poolmanager.ProxyManager`` not retrying on connect errors. - (Issue #310) - - * Disable Nagle's Algorithm on the socket for non-proxies. A subset of requests - will send the entire HTTP request ~200 milliseconds faster; however, some of - the resulting TCP packets will be smaller. (Issue #254) - - * Increased maximum number of SubjectAltNames in ``urllib3.contrib.pyopenssl`` - from the default 64 to 1024 in a single certificate. (Issue #318) - - * Headers are now passed and stored as a custom - ``urllib3.collections_.HTTPHeaderDict`` object rather than a plain ``dict``. - (Issue #329, #333) - - * Headers no longer lose their case on Python 3. (Issue #236) - - * ``urllib3.contrib.pyopenssl`` now uses the operating system's default CA - certificates on inject. (Issue #332) - - * Requests with ``retries=False`` will immediately raise any exceptions without - wrapping them in ``MaxRetryError``. (Issue #348) - - * Fixed open socket leak with SSL-related failures. (Issue #344, #348) - - 1.7.1 (2013-09-25) ++++++++++++++++++ - * Added granular timeout support with new ``urllib3.util.Timeout`` class. 
+ * Added granular timeout support with new `urllib3.util.Timeout` class. (Issue #231) * Fixed Python 3.4 support. (Issue #238) diff -Nru python-urllib3-1.8.3/README.rst python-urllib3-1.7.1/README.rst --- python-urllib3-1.8.3/README.rst 2014-06-23 23:44:37.000000000 +0000 +++ python-urllib3-1.7.1/README.rst 2013-09-25 16:01:09.000000000 +0000 @@ -20,14 +20,7 @@ - Tested on Python 2.6+ and Python 3.2+, 100% unit test coverage. - Small and easy to understand codebase perfect for extending and building upon. For a more comprehensive solution, have a look at - `Requests `_ which is also powered by ``urllib3``. - -You might already be using urllib3! -=================================== - -``urllib3`` powers `many great Python libraries `_, -including ``pip`` and ``requests``. - + `Requests `_ which is also powered by urllib3. What's wrong with urllib and urllib2? ===================================== @@ -98,7 +91,6 @@ py27: commands succeeded py32: commands succeeded py33: commands succeeded - py34: commands succeeded Note that code coverage less than 100% is regarded as a failing run. 
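The `dummyserver/server.py` and `dummyserver/testcase.py` hunks above replace the `run_loop_in_thread` helper with a `TornadoServerThread` that signals startup through a `ready_event`, so the test process can block until the server thread has bound its socket. A minimal stdlib sketch of that handshake pattern (class and port values here are illustrative, not taken from the diff):

```python
import threading

class ServerThread(threading.Thread):
    """Illustrative stand-in for TornadoServerThread's ready_event handshake."""
    def __init__(self, ready_event=None):
        threading.Thread.__init__(self)
        self.ready_event = ready_event
        self.port = None

    def run(self):
        # Stand-in for binding a socket and recording the ephemeral port.
        self.port = 12345
        if self.ready_event:
            # Signal the waiting test process that setup is complete.
            self.ready_event.set()

ready = threading.Event()
t = ServerThread(ready_event=ready)
t.start()
ready.wait()   # block until the server thread reports readiness
assert t.port is not None
t.join()
```

Without the `wait()`, the test process could read `port` before the thread has set it; the event removes that race, which is why `_start_server` in the diff only publishes `cls.port` after `ready_event.wait()` returns.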
diff -Nru python-urllib3-1.8.3/setup.cfg python-urllib3-1.7.1/setup.cfg --- python-urllib3-1.8.3/setup.cfg 2014-06-24 18:22:53.000000000 +0000 +++ python-urllib3-1.7.1/setup.cfg 2013-09-25 16:06:36.000000000 +0000 @@ -5,9 +5,6 @@ cover-min-percentage = 100 cover-erase = true -[flake8] -max-line-length = 99 - [egg_info] tag_build = tag_date = 0 diff -Nru python-urllib3-1.8.3/setup.py python-urllib3-1.7.1/setup.py --- python-urllib3-1.8.3/setup.py 2014-06-24 18:19:21.000000000 +0000 +++ python-urllib3-1.7.1/setup.py 2013-06-26 18:39:49.000000000 +0000 @@ -44,9 +44,9 @@ author_email='andrey.petrov@shazow.net', url='http://urllib3.readthedocs.org/', license='MIT', - packages=['urllib3', + packages=['urllib3', 'dummyserver', 'urllib3.packages', 'urllib3.packages.ssl_match_hostname', - 'urllib3.contrib', 'urllib3.util', + 'urllib3.contrib', ], requires=requirements, tests_require=tests_requirements, diff -Nru python-urllib3-1.8.3/test/benchmark.py python-urllib3-1.7.1/test/benchmark.py --- python-urllib3-1.8.3/test/benchmark.py 1970-01-01 00:00:00.000000000 +0000 +++ python-urllib3-1.7.1/test/benchmark.py 2012-01-29 21:29:05.000000000 +0000 @@ -0,0 +1,77 @@ +#!/usr/bin/env python + +""" +Really simple rudimentary benchmark to compare ConnectionPool versus standard +urllib to demonstrate the usefulness of connection re-using. +""" +from __future__ import print_function + +import sys +import time +import urllib + +sys.path.append('../') +import urllib3 + + +# URLs to download. Doesn't matter as long as they're from the same host, so we +# can take advantage of connection re-using. 
+TO_DOWNLOAD = [ + 'http://code.google.com/apis/apps/', + 'http://code.google.com/apis/base/', + 'http://code.google.com/apis/blogger/', + 'http://code.google.com/apis/calendar/', + 'http://code.google.com/apis/codesearch/', + 'http://code.google.com/apis/contact/', + 'http://code.google.com/apis/books/', + 'http://code.google.com/apis/documents/', + 'http://code.google.com/apis/finance/', + 'http://code.google.com/apis/health/', + 'http://code.google.com/apis/notebook/', + 'http://code.google.com/apis/picasaweb/', + 'http://code.google.com/apis/spreadsheets/', + 'http://code.google.com/apis/webmastertools/', + 'http://code.google.com/apis/youtube/', +] + + +def urllib_get(url_list): + assert url_list + for url in url_list: + now = time.time() + r = urllib.urlopen(url) + elapsed = time.time() - now + print("Got in %0.3f: %s" % (elapsed, url)) + + +def pool_get(url_list): + assert url_list + pool = urllib3.connection_from_url(url_list[0]) + for url in url_list: + now = time.time() + r = pool.get_url(url) + elapsed = time.time() - now + print("Got in %0.3fs: %s" % (elapsed, url)) + + +if __name__ == '__main__': + print("Running pool_get ...") + now = time.time() + pool_get(TO_DOWNLOAD) + pool_elapsed = time.time() - now + + print("Running urllib_get ...") + now = time.time() + urllib_get(TO_DOWNLOAD) + urllib_elapsed = time.time() - now + + print("Completed pool_get in %0.3fs" % pool_elapsed) + print("Completed urllib_get in %0.3fs" % urllib_elapsed) + + +""" +Example results: + +Completed pool_get in 1.163s +Completed urllib_get in 2.318s +""" diff -Nru python-urllib3-1.8.3/test/test_collections.py python-urllib3-1.7.1/test/test_collections.py --- python-urllib3-1.8.3/test/test_collections.py 2014-03-15 00:05:07.000000000 +0000 +++ python-urllib3-1.7.1/test/test_collections.py 2013-06-26 18:39:49.000000000 +0000 @@ -1,9 +1,6 @@ import unittest -from urllib3._collections import ( - HTTPHeaderDict, - RecentlyUsedContainer as Container -) +from urllib3._collections 
import RecentlyUsedContainer as Container from urllib3.packages import six xrange = six.moves.xrange @@ -124,57 +121,5 @@ self.assertRaises(NotImplementedError, d.__iter__) - -class TestHTTPHeaderDict(unittest.TestCase): - def setUp(self): - self.d = HTTPHeaderDict(A='foo') - self.d.add('a', 'bar') - - def test_overwriting_with_setitem_replaces(self): - d = HTTPHeaderDict() - - d['A'] = 'foo' - self.assertEqual(d['a'], 'foo') - - d['a'] = 'bar' - self.assertEqual(d['A'], 'bar') - - def test_copy(self): - h = self.d.copy() - self.assertTrue(self.d is not h) - self.assertEqual(self.d, h) - - def test_add(self): - d = HTTPHeaderDict() - - d['A'] = 'foo' - d.add('a', 'bar') - - self.assertEqual(d['a'], 'foo, bar') - self.assertEqual(d['A'], 'foo, bar') - - def test_getlist(self): - self.assertEqual(self.d.getlist('a'), ['foo', 'bar']) - self.assertEqual(self.d.getlist('A'), ['foo', 'bar']) - self.assertEqual(self.d.getlist('b'), []) - - def test_delitem(self): - del self.d['a'] - self.assertFalse('a' in self.d) - self.assertFalse('A' in self.d) - - def test_equal(self): - b = HTTPHeaderDict({'a': 'foo, bar'}) - self.assertEqual(self.d, b) - c = [('a', 'foo, bar')] - self.assertNotEqual(self.d, c) - - def test_len(self): - self.assertEqual(len(self.d), 1) - - def test_repr(self): - rep = "HTTPHeaderDict({'A': 'foo, bar'})" - self.assertEqual(repr(self.d), rep) - if __name__ == '__main__': unittest.main() diff -Nru python-urllib3-1.8.3/test/test_compatibility.py python-urllib3-1.7.1/test/test_compatibility.py --- python-urllib3-1.8.3/test/test_compatibility.py 2014-03-15 00:05:07.000000000 +0000 +++ python-urllib3-1.7.1/test/test_compatibility.py 1970-01-01 00:00:00.000000000 +0000 @@ -1,23 +0,0 @@ -import unittest -import warnings - -from urllib3.connection import HTTPConnection - - -class TestVersionCompatibility(unittest.TestCase): - def test_connection_strict(self): - with warnings.catch_warnings(record=True) as w: - warnings.simplefilter("always") - - # strict=True 
is deprecated in Py33+ - conn = HTTPConnection('localhost', 12345, strict=True) - - if w: - self.fail('HTTPConnection raised warning on strict=True: %r' % w[0].message) - - def test_connection_source_address(self): - try: - # source_address does not exist in Py26- - conn = HTTPConnection('localhost', 12345, source_address='127.0.0.1') - except TypeError as e: - self.fail('HTTPConnection raised TypeError on source_adddress: %r' % e) diff -Nru python-urllib3-1.8.3/test/test_connectionpool.py python-urllib3-1.7.1/test/test_connectionpool.py --- python-urllib3-1.8.3/test/test_connectionpool.py 2014-03-04 19:08:03.000000000 +0000 +++ python-urllib3-1.7.1/test/test_connectionpool.py 2013-09-25 16:01:09.000000000 +0000 @@ -13,9 +13,10 @@ HostChangedError, MaxRetryError, SSLError, + ReadTimeoutError, ) -from socket import error as SocketError +from socket import error as SocketError, timeout as SocketTimeout from ssl import SSLError as BaseSSLError try: # Python 3 @@ -38,11 +39,6 @@ ('http://google.com/', 'http://google.com'), ('http://google.com/', 'http://google.com/abra/cadabra'), ('http://google.com:42/', 'http://google.com:42/abracadabra'), - # Test comparison using default ports - ('http://google.com:80/', 'http://google.com/abracadabra'), - ('http://google.com/', 'http://google.com:80/abracadabra'), - ('https://google.com:443/', 'https://google.com/abracadabra'), - ('https://google.com/', 'https://google.com:443/abracadabra'), ] for a, b in same_host: @@ -55,22 +51,11 @@ ('http://yahoo.com/', 'http://google.com/'), ('http://google.com:42', 'https://google.com/abracadabra'), ('http://google.com', 'https://google.net/'), - # Test comparison with default ports - ('http://google.com:42', 'http://google.com'), - ('https://google.com:42', 'https://google.com'), - ('http://google.com:443', 'http://google.com'), - ('https://google.com:80', 'https://google.com'), - ('http://google.com:443', 'https://google.com'), - ('https://google.com:80', 'http://google.com'), - 
('https://google.com:443', 'http://google.com'),
-            ('http://google.com:80', 'https://google.com'),
         ]
 
         for a, b in not_same_host:
             c = connection_from_url(a)
             self.assertFalse(c.is_same_host(b), "%s =? %s" % (a, b))
-            c = connection_from_url(b)
-            self.assertFalse(c.is_same_host(a), "%s =? %s" % (b, a))
 
     def test_max_connections(self):
@@ -143,8 +128,9 @@
         self.assertEqual(pool.pool.qsize(), POOL_SIZE)
 
-        # Make sure that all of the exceptions return the connection to the pool
-        _test(Empty, EmptyPoolError)
+        #make sure that all of the exceptions return the connection to the pool
+        _test(Empty, ReadTimeoutError)
+        _test(SocketTimeout, ReadTimeoutError)
         _test(BaseSSLError, SSLError)
         _test(CertificateError, SSLError)
diff -Nru python-urllib3-1.8.3/test/test_exceptions.py python-urllib3-1.7.1/test/test_exceptions.py
--- python-urllib3-1.8.3/test/test_exceptions.py 2014-03-15 00:05:07.000000000 +0000
+++ python-urllib3-1.7.1/test/test_exceptions.py 2013-09-25 16:01:09.000000000 +0000
@@ -11,36 +11,25 @@
 class TestPickle(unittest.TestCase):
 
-    def verify_pickling(self, item):
+    def cycle(self, item):
         return pickle.loads(pickle.dumps(item))
 
     def test_exceptions(self):
-        assert self.verify_pickling(HTTPError(None))
-        assert self.verify_pickling(MaxRetryError(None, None, None))
-        assert self.verify_pickling(LocationParseError(None))
-        assert self.verify_pickling(ConnectTimeoutError(None))
+        assert self.cycle(HTTPError(None))
+        assert self.cycle(MaxRetryError(None, None, None))
+        assert self.cycle(LocationParseError(None))
+        assert self.cycle(ConnectTimeoutError(None))
 
     def test_exceptions_with_objects(self):
-        assert self.verify_pickling(
-            HTTPError('foo'))
-
-        assert self.verify_pickling(
-            HTTPError('foo', IOError('foo')))
-
-        assert self.verify_pickling(
-            MaxRetryError(HTTPConnectionPool('localhost'), '/', None))
-
-        assert self.verify_pickling(
-            LocationParseError('fake location'))
-
-        assert self.verify_pickling(
-            ClosedPoolError(HTTPConnectionPool('localhost'), None))
-
-        assert self.verify_pickling(
-            EmptyPoolError(HTTPConnectionPool('localhost'), None))
-
-        assert self.verify_pickling(
-            HostChangedError(HTTPConnectionPool('localhost'), '/', None))
-
-        assert self.verify_pickling(
-            ReadTimeoutError(HTTPConnectionPool('localhost'), '/', None))
+        assert self.cycle(HTTPError('foo'))
+        assert self.cycle(MaxRetryError(HTTPConnectionPool('localhost'),
+                                        '/', None))
+        assert self.cycle(LocationParseError('fake location'))
+        assert self.cycle(ClosedPoolError(HTTPConnectionPool('localhost'),
+                                          None))
+        assert self.cycle(EmptyPoolError(HTTPConnectionPool('localhost'),
+                                         None))
+        assert self.cycle(HostChangedError(HTTPConnectionPool('localhost'),
+                                           '/', None))
+        assert self.cycle(ReadTimeoutError(HTTPConnectionPool('localhost'),
+                                           '/', None))
diff -Nru python-urllib3-1.8.3/test/test_fields.py python-urllib3-1.7.1/test/test_fields.py
--- python-urllib3-1.8.3/test/test_fields.py 2014-03-04 19:08:03.000000000 +0000
+++ python-urllib3-1.7.1/test/test_fields.py 2013-08-14 21:43:05.000000000 +0000
@@ -1,39 +1,34 @@
 import unittest
 
 from urllib3.fields import guess_content_type, RequestField
-from urllib3.packages.six import u
+from urllib3.packages.six import b, u
 
 
 class TestRequestField(unittest.TestCase):
 
     def test_guess_content_type(self):
-        self.assertTrue(guess_content_type('image.jpg') in
-                        ['image/jpeg', 'image/pjpeg'])
-        self.assertEqual(guess_content_type('notsure'),
-                         'application/octet-stream')
-        self.assertEqual(guess_content_type(None), 'application/octet-stream')
+        self.assertEqual(guess_content_type('image.jpg'), 'image/jpeg')
+        self.assertEqual(guess_content_type('notsure'), 'application/octet-stream')
+        self.assertEqual(guess_content_type(None), 'application/octet-stream')
 
     def test_create(self):
-        simple_field = RequestField('somename', 'data')
-        self.assertEqual(simple_field.render_headers(), '\r\n')
-        filename_field = RequestField('somename', 'data',
-                                      filename='somefile.txt')
-        self.assertEqual(filename_field.render_headers(), '\r\n')
-        headers_field = RequestField('somename', 'data',
-                                     headers={'Content-Length': 4})
-        self.assertEqual(
-            headers_field.render_headers(), 'Content-Length: 4\r\n\r\n')
+        simple_field = RequestField('somename', 'data')
+        self.assertEqual(simple_field.render_headers(), '\r\n')
+        filename_field = RequestField('somename', 'data', filename='somefile.txt')
+        self.assertEqual(filename_field.render_headers(), '\r\n')
+        headers_field = RequestField('somename', 'data', headers={'Content-Length': 4})
+        self.assertEqual(headers_field.render_headers(),
+                         'Content-Length: 4\r\n'
+                         '\r\n')
 
     def test_make_multipart(self):
-        field = RequestField('somename', 'data')
-        field.make_multipart(content_type='image/jpg',
-                             content_location='/test')
-        self.assertEqual(
-            field.render_headers(),
-            'Content-Disposition: form-data; name="somename"\r\n'
-            'Content-Type: image/jpg\r\n'
-            'Content-Location: /test\r\n'
-            '\r\n')
+        field = RequestField('somename', 'data')
+        field.make_multipart(content_type='image/jpg', content_location='/test')
+        self.assertEqual(field.render_headers(),
+            'Content-Disposition: form-data; name="somename"\r\n'
+            'Content-Type: image/jpg\r\n'
+            'Content-Location: /test\r\n'
+            '\r\n')
 
     def test_render_parts(self):
         field = RequestField('somename', 'data')
diff -Nru python-urllib3-1.8.3/test/test_filepost.py python-urllib3-1.7.1/test/test_filepost.py
--- python-urllib3-1.8.3/test/test_filepost.py 2014-03-15 00:05:07.000000000 +0000
+++ python-urllib3-1.7.1/test/test_filepost.py 2013-08-14 21:43:05.000000000 +0000
@@ -124,7 +124,7 @@
 
         encoded, content_type = encode_multipart_formdata(fields, boundary=BOUNDARY)
 
-        self.assertEqual(encoded,
+        self.assertEquals(encoded,
             b'--' + b(BOUNDARY) + b'\r\n'
             b'Content-Type: image/jpeg\r\n'
             b'\r\n'
diff -Nru python-urllib3-1.8.3/test/test_poolmanager.py python-urllib3-1.7.1/test/test_poolmanager.py
--- python-urllib3-1.8.3/test/test_poolmanager.py 2014-04-18 05:35:23.000000000 +0000
+++ python-urllib3-1.7.1/test/test_poolmanager.py 2013-06-26 18:39:49.000000000 +0000
@@ -2,10 +2,7 @@
 
 from urllib3.poolmanager import PoolManager
 from urllib3 import connection_from_url
-from urllib3.exceptions import (
-    ClosedPoolError,
-    LocationParseError,
-)
+from urllib3.exceptions import ClosedPoolError
 
 
 class TestPoolManager(unittest.TestCase):
@@ -66,9 +63,6 @@
 
         self.assertEqual(len(p.pools), 0)
 
-    def test_nohost(self):
-        p = PoolManager(5)
-        self.assertRaises(LocationParseError, p.connection_from_url, 'http://@')
 
 if __name__ == '__main__':
diff -Nru python-urllib3-1.8.3/test/test_response.py python-urllib3-1.7.1/test/test_response.py
--- python-urllib3-1.8.3/test/test_response.py 2014-03-15 00:05:07.000000000 +0000
+++ python-urllib3-1.7.1/test/test_response.py 2013-08-14 21:43:05.000000000 +0000
@@ -5,25 +5,6 @@
 from urllib3.response import HTTPResponse
 from urllib3.exceptions import DecodeError
 
-
-from base64 import b64decode
-
-# A known random (i.e, not-too-compressible) payload generated with:
-#    "".join(random.choice(string.printable) for i in xrange(512))
-#    .encode("zlib").encode("base64")
-# Randomness in tests == bad, and fixing a seed may not be sufficient.
-ZLIB_PAYLOAD = b64decode(b"""\
-eJwFweuaoQAAANDfineQhiKLUiaiCzvuTEmNNlJGiL5QhnGpZ99z8luQfe1AHoMioB+QSWHQu/L+
-lzd7W5CipqYmeVTBjdgSATdg4l4Z2zhikbuF+EKn69Q0DTpdmNJz8S33odfJoVEexw/l2SS9nFdi
-pis7KOwXzfSqarSo9uJYgbDGrs1VNnQpT9f8zAorhYCEZronZQF9DuDFfNK3Hecc+WHLnZLQptwk
-nufw8S9I43sEwxsT71BiqedHo0QeIrFE01F/4atVFXuJs2yxIOak3bvtXjUKAA6OKnQJ/nNvDGKZ
-Khe5TF36JbnKVjdcL1EUNpwrWVfQpFYJ/WWm2b74qNeSZeQv5/xBhRdOmKTJFYgO96PwrHBlsnLn
-a3l0LwJsloWpMbzByU5WLbRE6X5INFqjQOtIwYz5BAlhkn+kVqJvWM5vBlfrwP42ifonM5yF4ciJ
-auHVks62997mNGOsM7WXNG3P98dBHPo2NhbTvHleL0BI5dus2JY81MUOnK3SGWLH8HeWPa1t5KcW
-S5moAj5HexY/g/F8TctpxwsvyZp38dXeLDjSQvEQIkF7XR3YXbeZgKk3V34KGCPOAeeuQDIgyVhV
-nP4HF2uWHA==""")
-
-
 class TestLegacyResponse(unittest.TestCase):
     def test_getheaders(self):
         headers = {'host': 'example.com'}
@@ -186,23 +167,6 @@
         self.assertEqual(next(stream), b'o')
         self.assertRaises(StopIteration, next, stream)
 
-    def test_streaming_tell(self):
-        fp = BytesIO(b'foo')
-        resp = HTTPResponse(fp, preload_content=False)
-        stream = resp.stream(2, decode_content=False)
-
-        position = 0
-
-        position += len(next(stream))
-        self.assertEqual(2, position)
-        self.assertEqual(position, resp.tell())
-
-        position += len(next(stream))
-        self.assertEqual(3, position)
-        self.assertEqual(position, resp.tell())
-
-        self.assertRaises(StopIteration, next, stream)
-
     def test_gzipped_streaming(self):
         import zlib
         compress = zlib.compressobj(6, zlib.DEFLATED, 16 + zlib.MAX_WBITS)
@@ -218,78 +182,6 @@
         self.assertEqual(next(stream), b'oo')
         self.assertRaises(StopIteration, next, stream)
 
-    def test_gzipped_streaming_tell(self):
-        import zlib
-        compress = zlib.compressobj(6, zlib.DEFLATED, 16 + zlib.MAX_WBITS)
-        uncompressed_data = b'foo'
-        data = compress.compress(uncompressed_data)
-        data += compress.flush()
-
-        fp = BytesIO(data)
-        resp = HTTPResponse(fp, headers={'content-encoding': 'gzip'},
-                            preload_content=False)
-        stream = resp.stream()
-
-        # Read everything
-        payload = next(stream)
-        self.assertEqual(payload, uncompressed_data)
-
-        self.assertEqual(len(data), resp.tell())
-
-        self.assertRaises(StopIteration, next, stream)
-
-    def test_deflate_streaming_tell_intermediate_point(self):
-        # Ensure that ``tell()`` returns the correct number of bytes when
-        # part-way through streaming compressed content.
-        import zlib
-
-        NUMBER_OF_READS = 10
-
-        class MockCompressedDataReading(BytesIO):
-            """
-            A ByteIO-like reader returning ``payload`` in ``NUMBER_OF_READS``
-            calls to ``read``.
-            """
-
-            def __init__(self, payload, payload_part_size):
-                self.payloads = [
-                    payload[i*payload_part_size:(i+1)*payload_part_size]
-                    for i in range(NUMBER_OF_READS+1)]
-
-                assert b"".join(self.payloads) == payload
-
-            def read(self, _):
-                # Amount is unused.
-                if len(self.payloads) > 0:
-                    return self.payloads.pop(0)
-                return b""
-
-        uncompressed_data = zlib.decompress(ZLIB_PAYLOAD)
-
-        payload_part_size = len(ZLIB_PAYLOAD) // NUMBER_OF_READS
-        fp = MockCompressedDataReading(ZLIB_PAYLOAD, payload_part_size)
-        resp = HTTPResponse(fp, headers={'content-encoding': 'deflate'},
-                            preload_content=False)
-        stream = resp.stream()
-
-        parts_positions = [(part, resp.tell()) for part in stream]
-        end_of_stream = resp.tell()
-
-        self.assertRaises(StopIteration, next, stream)
-
-        parts, positions = zip(*parts_positions)
-
-        # Check that the payload is equal to the uncompressed data
-        payload = b"".join(parts)
-        self.assertEqual(uncompressed_data, payload)
-
-        # Check that the positions in the stream are correct
-        expected = [(i+1)*payload_part_size for i in range(NUMBER_OF_READS)]
-        self.assertEqual(expected, list(positions))
-
-        # Check that the end of the stream is in the correct place
-        self.assertEqual(len(ZLIB_PAYLOAD), end_of_stream)
-
     def test_deflate_streaming(self):
         import zlib
         data = zlib.compress(b'foo')
@@ -352,11 +244,6 @@
         self.assertEqual(next(stream), b'o')
         self.assertRaises(StopIteration, next, stream)
 
-    def test_get_case_insensitive_headers(self):
-        headers = {'host': 'example.com'}
-        r = HTTPResponse(headers=headers)
-        self.assertEqual(r.headers.get('host'), 'example.com')
-        self.assertEqual(r.headers.get('Host'), 'example.com')
 
 if __name__ == '__main__':
     unittest.main()
diff -Nru python-urllib3-1.8.3/test/test_util.py python-urllib3-1.7.1/test/test_util.py
--- python-urllib3-1.8.3/test/test_util.py 2014-06-23 23:44:37.000000000 +0000
+++ python-urllib3-1.7.1/test/test_util.py 2013-09-25 16:01:09.000000000 +0000
@@ -1,6 +1,5 @@
 import logging
 import unittest
-import ssl
 
 from mock import patch
 
@@ -12,7 +11,6 @@
     parse_url,
     Timeout,
     Url,
-    resolve_cert_reqs,
 )
 from urllib3.exceptions import LocationParseError, TimeoutStateError
 
@@ -66,7 +64,7 @@
         }
         for url, expected_host in url_host_map.items():
             returned_host = get_host(url)
-            self.assertEqual(returned_host, expected_host)
+            self.assertEquals(returned_host, expected_host)
 
     def test_invalid_host(self):
         # TODO: Add more tests
@@ -79,7 +77,6 @@
         for location in invalid_host:
             self.assertRaises(LocationParseError, get_host, location)
 
-
     def test_parse_url(self):
         url_host_map = {
             'http://google.com/mail': Url('http', host='google.com', path='/mail'),
@@ -88,8 +85,6 @@
             'http://google.com/': Url('http', host='google.com', path='/'),
             'http://google.com': Url('http', host='google.com'),
             'http://google.com?foo': Url('http', host='google.com', path='', query='foo'),
-
-            # Path/query/fragment
             '': Url(),
             '/': Url(path='/'),
             '?': Url(path='', query=''),
@@ -98,23 +93,10 @@
             '/foo': Url(path='/foo'),
             '/foo?bar=baz': Url(path='/foo', query='bar=baz'),
             '/foo?bar=baz#banana?apple/orange': Url(path='/foo', query='bar=baz', fragment='banana?apple/orange'),
-
-            # Port
-            'http://google.com/': Url('http', host='google.com', path='/'),
-            'http://google.com:80/': Url('http', host='google.com', port=80, path='/'),
-            'http://google.com:/': Url('http', host='google.com', path='/'),
-            'http://google.com:80': Url('http', host='google.com', port=80),
-            'http://google.com:': Url('http', host='google.com'),
-
-            # Auth
-            'http://foo:bar@localhost/': Url('http', auth='foo:bar', host='localhost', path='/'),
-            'http://foo@localhost/': Url('http', auth='foo', host='localhost', path='/'),
-            'http://foo:bar@baz@localhost/': Url('http', auth='foo:bar@baz', host='localhost', path='/'),
-            'http://@': Url('http', host=None, auth='')
         }
         for url, expected_url in url_host_map.items():
             returned_url = parse_url(url)
-            self.assertEqual(returned_url, expected_url)
+            self.assertEquals(returned_url, expected_url)
 
     def test_parse_url_invalid_IPv6(self):
         self.assertRaises(ValueError, parse_url, '[::1')
@@ -133,7 +115,7 @@
         }
         for url, expected_request_uri in url_host_map.items():
             returned_url = parse_url(url)
-            self.assertEqual(returned_url.request_uri, expected_request_uri)
+            self.assertEquals(returned_url.request_uri, expected_request_uri)
 
     def test_netloc(self):
         url_netloc_map = {
@@ -144,7 +126,7 @@
         }
 
         for url, expected_netloc in url_netloc_map.items():
-            self.assertEqual(parse_url(url).netloc, expected_netloc)
+            self.assertEquals(parse_url(url).netloc, expected_netloc)
 
     def test_make_headers(self):
         self.assertEqual(
@@ -175,13 +157,6 @@
             make_headers(basic_auth='foo:bar'),
             {'authorization': 'Basic Zm9vOmJhcg=='})
 
-        self.assertEqual(
-            make_headers(proxy_basic_auth='foo:bar'),
-            {'proxy-authorization': 'Basic Zm9vOmJhcg=='})
-
-        self.assertEqual(
-            make_headers(disable_cache=True),
-            {'cache-control': 'no-cache'})
 
     def test_split_first(self):
         test_cases = {
@@ -239,7 +214,7 @@
 
         self.assertTrue('int or float' in str(e))
 
-    @patch('urllib3.util.timeout.current_time')
+    @patch('urllib3.util.current_time')
     def test_timeout(self, current_time):
         timeout = Timeout(total=3)
 
@@ -275,9 +250,6 @@
         self.assertEqual(timeout.read_timeout, None)
         self.assertEqual(timeout.total, None)
 
-        timeout = Timeout(5)
-        self.assertEqual(timeout.total, 5)
-
     def test_timeout_str(self):
         timeout = Timeout(connect=1, read=2, total=3)
@@ -286,7 +258,7 @@
         self.assertEqual(str(timeout),
                          "Timeout(connect=1, read=None, total=3)")
 
-    @patch('urllib3.util.timeout.current_time')
+    @patch('urllib3.util.current_time')
    def test_timeout_elapsed(self, current_time):
         current_time.return_value = TIMEOUT_EPOCH
         timeout = Timeout(total=3)
@@ -300,11 +272,4 @@
         current_time.return_value = TIMEOUT_EPOCH + 37
         self.assertEqual(timeout.get_connect_duration(), 37)
 
-    def test_resolve_cert_reqs(self):
-        self.assertEqual(resolve_cert_reqs(None), ssl.CERT_NONE)
-        self.assertEqual(resolve_cert_reqs(ssl.CERT_NONE), ssl.CERT_NONE)
-
-        self.assertEqual(resolve_cert_reqs(ssl.CERT_REQUIRED), ssl.CERT_REQUIRED)
-        self.assertEqual(resolve_cert_reqs('REQUIRED'), ssl.CERT_REQUIRED)
-        self.assertEqual(resolve_cert_reqs('CERT_REQUIRED'), ssl.CERT_REQUIRED)
diff -Nru python-urllib3-1.8.3/test-requirements.txt python-urllib3-1.7.1/test-requirements.txt
--- python-urllib3-1.8.3/test-requirements.txt 2014-06-24 18:19:21.000000000 +0000
+++ python-urllib3-1.7.1/test-requirements.txt 2013-09-25 16:01:09.000000000 +0000
@@ -1,4 +1,4 @@
 nose==1.3
 mock==1.0.1
-tornado==3.1.1
+tornado==2.4.1
 coverage==3.6
diff -Nru python-urllib3-1.8.3/urllib3/_collections.py python-urllib3-1.7.1/urllib3/_collections.py
--- python-urllib3-1.8.3/urllib3/_collections.py 2014-06-23 23:44:37.000000000 +0000
+++ python-urllib3-1.7.1/urllib3/_collections.py 2013-08-14 21:43:05.000000000 +0000
@@ -4,26 +4,16 @@
 # This module is part of urllib3 and is released under
 # the MIT License: http://www.opensource.org/licenses/mit-license.php
 
-from collections import Mapping, MutableMapping
-try:
-    from threading import RLock
-except ImportError:  # Platform-specific: No threads available
-    class RLock:
-        def __enter__(self):
-            pass
-
-        def __exit__(self, exc_type, exc_value, traceback):
-            pass
-
+from collections import MutableMapping
+from threading import RLock
 
 try:  # Python 2.7+
     from collections import OrderedDict
 except ImportError:
     from .packages.ordered_dict import OrderedDict
-from .packages.six import itervalues
 
 
-__all__ = ['RecentlyUsedContainer', 'HTTPHeaderDict']
+__all__ = ['RecentlyUsedContainer']
 
 
 _Null = object()
@@ -102,104 +92,3 @@
     def keys(self):
         with self.lock:
             return self._container.keys()
-
-
-class HTTPHeaderDict(MutableMapping):
-    """
-    :param headers:
-        An iterable of field-value pairs. Must not contain multiple field names
-        when compared case-insensitively.
-
-    :param kwargs:
-        Additional field-value pairs to pass in to ``dict.update``.
-
-    A ``dict`` like container for storing HTTP Headers.
-
-    Field names are stored and compared case-insensitively in compliance with
-    RFC 7230. Iteration provides the first case-sensitive key seen for each
-    case-insensitive pair.
-
-    Using ``__setitem__`` syntax overwrites fields that compare equal
-    case-insensitively in order to maintain ``dict``'s api. For fields that
-    compare equal, instead create a new ``HTTPHeaderDict`` and use ``.add``
-    in a loop.
-
-    If multiple fields that are equal case-insensitively are passed to the
-    constructor or ``.update``, the behavior is undefined and some will be
-    lost.
-
-    >>> headers = HTTPHeaderDict()
-    >>> headers.add('Set-Cookie', 'foo=bar')
-    >>> headers.add('set-cookie', 'baz=quxx')
-    >>> headers['content-length'] = '7'
-    >>> headers['SET-cookie']
-    'foo=bar, baz=quxx'
-    >>> headers['Content-Length']
-    '7'
-
-    If you want to access the raw headers with their original casing
-    for debugging purposes you can access the private ``._data`` attribute
-    which is a normal python ``dict`` that maps the case-insensitive key to a
-    list of tuples stored as (case-sensitive-original-name, value). Using the
-    structure from above as our example:
-
-    >>> headers._data
-    {'set-cookie': [('Set-Cookie', 'foo=bar'), ('set-cookie', 'baz=quxx')],
-    'content-length': [('content-length', '7')]}
-    """
-
-    def __init__(self, headers=None, **kwargs):
-        self._data = {}
-        if headers is None:
-            headers = {}
-        self.update(headers, **kwargs)
-
-    def add(self, key, value):
-        """Adds a (name, value) pair, doesn't overwrite the value if it already
-        exists.
-
-        >>> headers = HTTPHeaderDict(foo='bar')
-        >>> headers.add('Foo', 'baz')
-        >>> headers['foo']
-        'bar, baz'
-        """
-        self._data.setdefault(key.lower(), []).append((key, value))
-
-    def getlist(self, key):
-        """Returns a list of all the values for the named field. Returns an
-        empty list if the key doesn't exist."""
-        return self[key].split(', ') if key in self else []
-
-    def copy(self):
-        h = HTTPHeaderDict()
-        for key in self._data:
-            for rawkey, value in self._data[key]:
-                h.add(rawkey, value)
-        return h
-
-    def __eq__(self, other):
-        if not isinstance(other, Mapping):
-            return False
-        other = HTTPHeaderDict(other)
-        return dict((k1, self[k1]) for k1 in self._data) == \
-                dict((k2, other[k2]) for k2 in other._data)
-
-    def __getitem__(self, key):
-        values = self._data[key.lower()]
-        return ', '.join(value[1] for value in values)
-
-    def __setitem__(self, key, value):
-        self._data[key.lower()] = [(key, value)]
-
-    def __delitem__(self, key):
-        del self._data[key.lower()]
-
-    def __len__(self):
-        return len(self._data)
-
-    def __iter__(self):
-        for headers in itervalues(self._data):
-            yield headers[0][0]
-
-    def __repr__(self):
-        return '%s(%r)' % (self.__class__.__name__, dict(self.items()))
diff -Nru python-urllib3-1.8.3/urllib3/connectionpool.py python-urllib3-1.7.1/urllib3/connectionpool.py
--- python-urllib3-1.8.3/urllib3/connectionpool.py 2014-06-23 23:44:37.000000000 +0000
+++ python-urllib3-1.7.1/urllib3/connectionpool.py 2013-09-25 16:01:09.000000000 +0000
@@ -4,57 +4,143 @@
 # This module is part of urllib3 and is released under
 # the MIT License: http://www.opensource.org/licenses/mit-license.php
 
-import sys
 import errno
 import logging
 
 from socket import error as SocketError, timeout as SocketTimeout
 import socket
 
-try:  # Python 3
+try:   # Python 3
+    from http.client import HTTPConnection, HTTPException
+    from http.client import HTTP_PORT, HTTPS_PORT
+except ImportError:
+    from httplib import HTTPConnection, HTTPException
+    from httplib import HTTP_PORT, HTTPS_PORT
+
+try:   # Python 3
     from queue import LifoQueue, Empty, Full
 except ImportError:
     from Queue import LifoQueue, Empty, Full
     import Queue as _  # Platform-specific: Windows
 
+try:  # Compiled with SSL?
+    HTTPSConnection = object
+
+    class BaseSSLError(BaseException):
+        pass
+
+    ssl = None
+
+    try:  # Python 3
+        from http.client import HTTPSConnection
+    except ImportError:
+        from httplib import HTTPSConnection
+
+    import ssl
+    BaseSSLError = ssl.SSLError
+
+except (ImportError, AttributeError):  # Platform-specific: No SSL.
+    pass
+
+
 from .exceptions import (
     ClosedPoolError,
-    ConnectionError,
+    ConnectTimeoutError,
     EmptyPoolError,
     HostChangedError,
-    LocationParseError,
     MaxRetryError,
     SSLError,
-    TimeoutError,
     ReadTimeoutError,
     ProxyError,
 )
-from .packages.ssl_match_hostname import CertificateError
+from .packages.ssl_match_hostname import CertificateError, match_hostname
 from .packages import six
-from .connection import (
-    port_by_scheme,
-    DummyConnection,
-    HTTPConnection, HTTPSConnection, VerifiedHTTPSConnection,
-    HTTPException, BaseSSLError,
-)
 from .request import RequestMethods
 from .response import HTTPResponse
 from .util import (
+    assert_fingerprint,
     get_host,
     is_connection_dropped,
+    resolve_cert_reqs,
+    resolve_ssl_version,
+    ssl_wrap_socket,
     Timeout,
 )
 
-
 xrange = six.moves.xrange
 
 log = logging.getLogger(__name__)
 
 _Default = object()
 
+port_by_scheme = {
+    'http': HTTP_PORT,
+    'https': HTTPS_PORT,
+}
+
+
+## Connection objects (extension of httplib)
+
+class VerifiedHTTPSConnection(HTTPSConnection):
+    """
+    Based on httplib.HTTPSConnection but wraps the socket with
+    SSL certification.
+    """
+    cert_reqs = None
+    ca_certs = None
+    ssl_version = None
+
+    def set_cert(self, key_file=None, cert_file=None,
+                 cert_reqs=None, ca_certs=None,
+                 assert_hostname=None, assert_fingerprint=None):
+
+        self.key_file = key_file
+        self.cert_file = cert_file
+        self.cert_reqs = cert_reqs
+        self.ca_certs = ca_certs
+        self.assert_hostname = assert_hostname
+        self.assert_fingerprint = assert_fingerprint
+
+    def connect(self):
+        # Add certificate verification
+        try:
+            sock = socket.create_connection(
+                address=(self.host, self.port),
+                timeout=self.timeout)
+        except SocketTimeout:
+            raise ConnectTimeoutError(
+                self, "Connection to %s timed out. (connect timeout=%s)" %
+                (self.host, self.timeout))
+
+        resolved_cert_reqs = resolve_cert_reqs(self.cert_reqs)
+        resolved_ssl_version = resolve_ssl_version(self.ssl_version)
+
+        if self._tunnel_host:
+            self.sock = sock
+            # Calls self._set_hostport(), so self.host is
+            # self._tunnel_host below.
+            self._tunnel()
+
+        # Wrap socket using verification with the root certs in
+        # trusted_root_certs
+        self.sock = ssl_wrap_socket(sock, self.key_file, self.cert_file,
+                                    cert_reqs=resolved_cert_reqs,
+                                    ca_certs=self.ca_certs,
+                                    server_hostname=self.host,
+                                    ssl_version=resolved_ssl_version)
+
+        if resolved_cert_reqs != ssl.CERT_NONE:
+            if self.assert_fingerprint:
+                assert_fingerprint(self.sock.getpeercert(binary_form=True),
+                                   self.assert_fingerprint)
+            elif self.assert_hostname is not False:
+                match_hostname(self.sock.getpeercert(),
+                               self.assert_hostname or self.host)
+
 
 ## Pool objects
 
 class ConnectionPool(object):
     """
     Base class for all connection pools, such as
@@ -65,9 +151,6 @@
     QueueCls = LifoQueue
 
     def __init__(self, host, port=None):
-        if host is None:
-            raise LocationParseError(host)
-
         # httplib doesn't like it when we include brackets in ipv6 addresses
         host = host.strip('[]')
 
@@ -81,7 +164,6 @@
 # This is taken from http://hg.python.org/cpython/file/7aaba721ebc0/Lib/socket.py#l252
 _blocking_errnos = set([errno.EAGAIN, errno.EWOULDBLOCK])
 
-
 class HTTPConnectionPool(ConnectionPool, RequestMethods):
     """
     Thread-safe connection pool for one host.
@@ -133,18 +215,13 @@
     :param _proxy_headers:
         A dictionary with proxy headers, should not be used directly,
         instead, see :class:`urllib3.connectionpool.ProxyManager`"
-
-    :param \**conn_kw:
-        Additional parameters are used to create fresh :class:`urllib3.connection.HTTPConnection`,
-        :class:`urllib3.connection.HTTPSConnection` instances.
     """
 
     scheme = 'http'
-    ConnectionCls = HTTPConnection
 
     def __init__(self, host, port=None, strict=False,
                  timeout=Timeout.DEFAULT_TIMEOUT, maxsize=1, block=False,
-                 headers=None, _proxy=None, _proxy_headers=None, **conn_kw):
+                 headers=None, _proxy=None, _proxy_headers=None):
         ConnectionPool.__init__(self, host, port)
         RequestMethods.__init__(self, headers)
 
@@ -170,26 +247,22 @@
         # These are mostly for testing and debugging purposes.
         self.num_connections = 0
         self.num_requests = 0
-        self.conn_kw = conn_kw
-
-        if self.proxy:
-            # Enable Nagle's algorithm for proxies, to avoid packet fragmentation.
-            # We cannot know if the user has added default socket options, so we cannot replace the
-            # list.
-            self.conn_kw.setdefault('socket_options', [])
 
     def _new_conn(self):
         """
-        Return a fresh :class:`HTTPConnection`.
+        Return a fresh :class:`httplib.HTTPConnection`.
         """
         self.num_connections += 1
         log.info("Starting new HTTP connection (%d): %s" %
                  (self.num_connections, self.host))
+        extra_params = {}
+        if not six.PY3:  # Python 2
+            extra_params['strict'] = self.strict
+
+        return HTTPConnection(host=self.host, port=self.port,
+                              timeout=self.timeout.connect_timeout,
+                              **extra_params)
 
-        conn = self.ConnectionCls(host=self.host, port=self.port,
-                                  timeout=self.timeout.connect_timeout,
-                                  strict=self.strict, **self.conn_kw)
-        return conn
 
     def _get_conn(self, timeout=None):
         """
@@ -207,7 +280,7 @@
 
         try:
             conn = self.pool.get(block=self.block, timeout=timeout)
-        except AttributeError:  # self.pool is None
+        except AttributeError: # self.pool is None
             raise ClosedPoolError(self, "Pool is closed.")
 
         except Empty:
@@ -221,11 +294,6 @@
         if conn and is_connection_dropped(conn):
             log.info("Resetting dropped connection: %s" % self.host)
             conn.close()
-            if getattr(conn, 'auto_open', 1) == 0:
-                # This is a proxied connection that has been mutated by
-                # httplib._tunnel() and cannot be reused (since it would
-                # attempt to bypass the proxy)
-                conn = None
 
         return conn or self._new_conn()
 
@@ -245,15 +313,14 @@
         """
         try:
             self.pool.put(conn, block=False)
-            return  # Everything is dandy, done.
+            return # Everything is dandy, done.
         except AttributeError:
             # self.pool is None.
             pass
         except Full:
             # This should never happen if self.block == True
-            log.warning(
-                "Connection pool is full, discarding connection: %s" %
-                self.host)
+            log.warning("HttpConnectionPool is full, discarding connection: %s"
+                        % self.host)
 
         # Connection never got put back into the pool, close it.
         if conn:
@@ -274,7 +341,7 @@
     def _make_request(self, conn, method, url, timeout=_Default,
                       **httplib_request_kw):
         """
-        Perform a request on a given urllib connection object taken from our
+        Perform a request on a given httplib connection object taken from our
         pool.
 
         :param conn:
@@ -291,17 +358,24 @@
 
         timeout_obj = self._get_timeout(timeout)
 
-        timeout_obj.start_connect()
-        conn.timeout = timeout_obj.connect_timeout
-        # conn.request() calls httplib.*.request, not the method in
-        # urllib3.request. It also calls makefile (recv) on the socket.
-        conn.request(method, url, **httplib_request_kw)
+        try:
+            timeout_obj.start_connect()
+            conn.timeout = timeout_obj.connect_timeout
+            # conn.request() calls httplib.*.request, not the method in
+            # request.py. It also calls makefile (recv) on the socket
+            conn.request(method, url, **httplib_request_kw)
+        except SocketTimeout:
+            raise ConnectTimeoutError(
+                self, "Connection to %s timed out. (connect timeout=%s)" %
+                (self.host, timeout_obj.connect_timeout))
 
         # Reset the timeout for the recv() on the socket
         read_timeout = timeout_obj.read_timeout
-
+        log.debug("Setting read timeout to %s" % read_timeout)
         # App Engine doesn't have a sock attr
-        if hasattr(conn, 'sock'):
+        if hasattr(conn, 'sock') and \
+            read_timeout is not None and \
+            read_timeout is not Timeout.DEFAULT_TIMEOUT:
             # In Python 3 socket.py will catch EAGAIN and return None when you
             # try and read into the file pointer created by http.client, which
             # instead raises a BadStatusLine exception. Instead of catching
@@ -311,41 +385,28 @@
                 raise ReadTimeoutError(
                     self, url,
                     "Read timed out. (read timeout=%s)" % read_timeout)
-            if read_timeout is Timeout.DEFAULT_TIMEOUT:
-                conn.sock.settimeout(socket.getdefaulttimeout())
-            else:  # None or a value
-                conn.sock.settimeout(read_timeout)
+            conn.sock.settimeout(read_timeout)
 
         # Receive the response from the server
         try:
-            try:  # Python 2.7+, use buffering of HTTP responses
+            try: # Python 2.7+, use buffering of HTTP responses
                 httplib_response = conn.getresponse(buffering=True)
-            except TypeError:  # Python 2.6 and older
+            except TypeError: # Python 2.6 and older
                 httplib_response = conn.getresponse()
         except SocketTimeout:
             raise ReadTimeoutError(
                 self, url, "Read timed out. (read timeout=%s)" % read_timeout)
 
-        except BaseSSLError as e:
-            # Catch possible read timeouts thrown as SSL errors. If not the
-            # case, rethrow the original. We need to do this because of:
-            # http://bugs.python.org/issue10272
-            if 'timed out' in str(e) or \
-               'did not complete (read)' in str(e):  # Python 2.6
-                raise ReadTimeoutError(self, url, "Read timed out.")
-
-            raise
-
-        except SocketError as e:  # Platform-specific: Python 2
+        except SocketError as e: # Platform-specific: Python 2
             # See the above comment about EAGAIN in Python 3. In Python 2 we
             # have to specifically catch it and throw the timeout error
             if e.errno in _blocking_errnos:
                 raise ReadTimeoutError(
                     self, url,
                     "Read timed out. (read timeout=%s)" % read_timeout)
-            raise
+
 
         # AppEngine doesn't have a version attr.
         http_version = getattr(conn, '_http_vsn_str', 'HTTP/?')
         log.debug("\"%s %s %s\" %s %s" % (method, url, http_version,
@@ -367,7 +428,7 @@
                 conn.close()
 
         except Empty:
-            pass  # Done.
+            pass # Done.
 
     def is_same_host(self, url):
         """
@@ -380,11 +441,9 @@
         # TODO: Add optional support for socket.gethostbyname checking.
         scheme, host, port = get_host(url)
 
-        # Use explicit default port for comparison when none is given
         if self.port and not port:
+            # Use explicit default port for comparison when none is given.
             port = port_by_scheme.get(scheme)
-        elif not self.port and port == port_by_scheme.get(scheme):
-            port = None
 
         return (scheme, host, port) == (self.scheme, self.host, self.port)
 
@@ -423,13 +482,10 @@
 
         :param retries:
             Number of retries to allow before raising a MaxRetryError exception.
-            If `False`, then retries are disabled and any exception is raised
-            immediately.
 
         :param redirect:
             If True, automatically handle redirects (status codes 301, 302,
-            303, 307, 308). Each redirect counts as a retry. Disabling retries
-            will disable redirect, too.
+            303, 307, 308). Each redirect counts as a retry.
 
         :param assert_same_host:
             If ``True``, will make sure that the host of the pool requests is
@@ -463,7 +519,7 @@
         if headers is None:
             headers = self.headers
 
-        if retries < 0 and retries is not False:
+        if retries < 0:
             raise MaxRetryError(self, url)
 
         if release_conn is None:
@@ -475,17 +531,6 @@
 
         conn = None
 
-        # Merge the proxy headers. Only do this in HTTP. We have to copy the
-        # headers dict so we can safely change it without those changes being
-        # reflected in anyone else's copy.
-        if self.scheme == 'http':
-            headers = headers.copy()
-            headers.update(self.proxy_headers)
-
-        # Must keep the exception bound to a separate variable or else Python 3
-        # complains about UnboundLocalError.
-        err = None
-
         try:
             # Request a connection from the queue
             conn = self._get_conn(timeout=pool_timeout)
@@ -513,41 +558,38 @@
             # ``response.read()``)
 
         except Empty:
-            # Timed out by queue.
-            raise EmptyPoolError(self, "No pool connections are available.")
+            # Timed out by queue
+            raise ReadTimeoutError(
+                self, url, "Read timed out, no pool connections are available.")
+
+        except SocketTimeout:
+            # Timed out by socket
+            raise ReadTimeoutError(self, url, "Read timed out.")
 
-        except (BaseSSLError, CertificateError) as e:
-            # Release connection unconditionally because there is no way to
-            # close it externally in case of exception.
-            release_conn = True
+        except BaseSSLError as e:
+            # SSL certificate error
+            if 'timed out' in str(e) or \
+               'did not complete (read)' in str(e):  # Platform-specific: Python 2.6
+                raise ReadTimeoutError(self, url, "Read timed out.")
             raise SSLError(e)
 
-        except (TimeoutError, HTTPException, SocketError) as e:
-            if conn:
-                # Discard the connection for these exceptions. It will be
-                # be replaced during the next _get_conn() call.
-                conn.close()
-                conn = None
-
-            if not retries:
-                if isinstance(e, TimeoutError):
-                    # TimeoutError is exempt from MaxRetryError-wrapping.
-                    # FIXME: ... Not sure why. Add a reason here.
-                    raise
-
-                # Wrap unexpected exceptions with the most appropriate
-                # module-level exception and re-raise.
-                if isinstance(e, SocketError) and self.proxy:
-                    raise ProxyError('Cannot connect to proxy.', e)
+        except CertificateError as e:
+            # Name mismatch
+            raise SSLError(e)
 
-            if retries is False:
-                raise ConnectionError('Connection failed.', e)
+        except (HTTPException, SocketError) as e:
+            if isinstance(e, SocketError) and self.proxy is not None:
+                raise ProxyError('Cannot connect to proxy. '
+                                 'Socket error: %s.' % e)
+
+            # Connection broken, discard. It will be replaced next _get_conn().
+            conn = None
+            # This is necessary so we can access e below
+            err = e
 
+            if retries == 0:
                 raise MaxRetryError(self, url, e)
 
-            # Keep track of the error for the retry warning.
-            err = e
-
         finally:
             if release_conn:
                 # Put the connection back to be reused. If the connection is
@@ -557,8 +599,8 @@
 
         if not conn:
             # Try again
-            log.warning("Retrying (%d attempts remain) after connection "
-                        "broken by '%r': %s" % (retries, err, url))
+            log.warn("Retrying (%d attempts remain) after connection "
+                     "broken by '%r': %s" % (retries, err, url))
            return self.urlopen(method, url, body, headers, retries - 1,
                                 redirect, assert_same_host,
                                 timeout=timeout, pool_timeout=pool_timeout,
@@ -566,7 +608,7 @@
 
         # Handle redirect?
         redirect_location = redirect and response.get_redirect_location()
-        if redirect_location and retries is not False:
+        if redirect_location:
             if response.status == 303:
                 method = 'GET'
             log.info("Redirecting %s -> %s" % (url, redirect_location))
@@ -584,7 +626,7 @@
 
     When Python is compiled with the :mod:`ssl` module, then
     :class:`.VerifiedHTTPSConnection` is used, which *can* verify certificates,
-    instead of :class:`.HTTPSConnection`.
+    instead of :class:`httplib.HTTPSConnection`.
 
     :class:`.VerifiedHTTPSConnection` uses one of ``assert_fingerprint``,
     ``assert_hostname`` and ``host`` in this order to verify connections.
@@ -597,7 +639,6 @@
     """
 
     scheme = 'https'
-    ConnectionCls = HTTPSConnection
 
     def __init__(self, host, port=None,
                  strict=False, timeout=None, maxsize=1,
@@ -605,12 +646,10 @@
                  _proxy=None, _proxy_headers=None,
                  key_file=None, cert_file=None,
                  cert_reqs=None, ca_certs=None, ssl_version=None,
-                 assert_hostname=None, assert_fingerprint=None,
-                 **conn_kw):
+                 assert_hostname=None, assert_fingerprint=None):
 
         HTTPConnectionPool.__init__(self, host, port, strict, timeout, maxsize,
-                                    block, headers, _proxy, _proxy_headers,
-                                    **conn_kw)
+                                    block, headers, _proxy, _proxy_headers)
         self.key_file = key_file
         self.cert_file = cert_file
         self.cert_reqs = cert_reqs
@@ -619,38 +658,33 @@
         self.assert_hostname = assert_hostname
         self.assert_fingerprint = assert_fingerprint
 
-    def _prepare_conn(self, conn):
+    def _prepare_conn(self, connection):
         """
         Prepare the ``connection`` for :meth:`urllib3.util.ssl_wrap_socket`
         and establish the tunnel if proxy is used.
         """
 
-        if isinstance(conn, VerifiedHTTPSConnection):
-            conn.set_cert(key_file=self.key_file,
-                          cert_file=self.cert_file,
-                          cert_reqs=self.cert_reqs,
-                          ca_certs=self.ca_certs,
-                          assert_hostname=self.assert_hostname,
-                          assert_fingerprint=self.assert_fingerprint)
-            conn.ssl_version = self.ssl_version
+        if isinstance(connection, VerifiedHTTPSConnection):
+            connection.set_cert(key_file=self.key_file,
+                                cert_file=self.cert_file,
+                                cert_reqs=self.cert_reqs,
+                                ca_certs=self.ca_certs,
+                                assert_hostname=self.assert_hostname,
+                                assert_fingerprint=self.assert_fingerprint)
+            connection.ssl_version = self.ssl_version
 
         if self.proxy is not None:
             # Python 2.7+
             try:
-                set_tunnel = conn.set_tunnel
+                set_tunnel = connection.set_tunnel
             except AttributeError:  # Platform-specific: Python 2.6
-                set_tunnel = conn._set_tunnel
-
-            if sys.version_info <= (2, 6, 4) and not self.proxy_headers:  # Python 2.6.4 and older
-                set_tunnel(self.host, self.port)
-            else:
-                set_tunnel(self.host, self.port, self.proxy_headers)
-
+                set_tunnel = connection._set_tunnel
+            set_tunnel(self.host, self.port, self.proxy_headers)
             # Establish tunnel connection early, because otherwise httplib
             # would improperly set Host: header to proxy's IP:port.
-            conn.connect()
+            connection.connect()
 
-        return conn
+        return connection
 
     def _new_conn(self):
         """
@@ -660,22 +694,28 @@
         log.info("Starting new HTTPS connection (%d): %s"
                  % (self.num_connections, self.host))
 
-        if not self.ConnectionCls or self.ConnectionCls is DummyConnection:
-            # Platform-specific: Python without ssl
-            raise SSLError("Can't connect to HTTPS URL because the SSL "
-                           "module is not available.")
-
         actual_host = self.host
         actual_port = self.port
         if self.proxy is not None:
            actual_host = self.proxy.host
            actual_port = self.proxy.port
 
-        conn = self.ConnectionCls(host=actual_host, port=actual_port,
-                                  timeout=self.timeout.connect_timeout,
-                                  strict=self.strict, **self.conn_kw)
+        if not ssl:  # Platform-specific: Python compiled without +ssl
+            if not HTTPSConnection or HTTPSConnection is object:
+                raise SSLError("Can't connect to HTTPS URL because the SSL "
+                               "module is not available.")
+            connection_class = HTTPSConnection
+        else:
+            connection_class = VerifiedHTTPSConnection
+
+        extra_params = {}
+        if not six.PY3:  # Python 2
+            extra_params['strict'] = self.strict
+        connection = connection_class(host=actual_host, port=actual_port,
+                                      timeout=self.timeout.connect_timeout,
+                                      **extra_params)
 
-        return self._prepare_conn(conn)
+        return self._prepare_conn(connection)
 
 
 def connection_from_url(url, **kw):
diff -Nru python-urllib3-1.8.3/urllib3/connection.py python-urllib3-1.7.1/urllib3/connection.py
--- python-urllib3-1.8.3/urllib3/connection.py 2014-06-23 23:44:37.000000000 +0000
+++ python-urllib3-1.7.1/urllib3/connection.py 1970-01-01 00:00:00.000000000 +0000
@@ -1,232 +0,0 @@
-# urllib3/connection.py
-# Copyright 2008-2013 Andrey Petrov and contributors (see CONTRIBUTORS.txt)
-#
-# This module is part of urllib3 and is released under
-# the MIT License: http://www.opensource.org/licenses/mit-license.php
-
-import sys
-import socket
-from socket import timeout as SocketTimeout
-
-try:  # Python 3
-    from http.client import HTTPConnection as _HTTPConnection, HTTPException
-except ImportError: - from httplib import HTTPConnection as _HTTPConnection, HTTPException - - -class DummyConnection(object): - "Used to detect a failed ConnectionCls import." - pass - - -try: # Compiled with SSL? - HTTPSConnection = DummyConnection - import ssl - BaseSSLError = ssl.SSLError -except (ImportError, AttributeError): # Platform-specific: No SSL. - ssl = None - - class BaseSSLError(BaseException): - pass - - -from .exceptions import ( - ConnectTimeoutError, -) -from .packages.ssl_match_hostname import match_hostname -from .packages import six -from .util import ( - assert_fingerprint, - resolve_cert_reqs, - resolve_ssl_version, - ssl_wrap_socket, -) - - -port_by_scheme = { - 'http': 80, - 'https': 443, -} - - -class HTTPConnection(_HTTPConnection, object): - """ - Based on httplib.HTTPConnection but provides an extra constructor - backwards-compatibility layer between older and newer Pythons. - - Additional keyword parameters are used to configure attributes of the connection. - Accepted parameters include: - - - ``strict``: See the documentation on :class:`urllib3.connectionpool.HTTPConnectionPool` - - ``source_address``: Set the source address for the current connection. - - .. note:: This is ignored for Python 2.6. It is only applied for 2.7 and 3.x - - - ``socket_options``: Set specific options on the underlying socket. If not specified, then - defaults are loaded from ``HTTPConnection.default_socket_options`` which includes disabling - Nagle's algorithm (sets TCP_NODELAY to 1) unless the connection is behind a proxy. - - For example, if you wish to enable TCP Keep Alive in addition to the defaults, - you might pass:: - - HTTPConnection.default_socket_options + [ - (socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1), - ] - - Or you may want to disable the defaults by passing an empty list (e.g., ``[]``). - """ - - default_port = port_by_scheme['http'] - - #: Disable Nagle's algorithm by default. 
- #: ``[(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)]`` - default_socket_options = [(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)] - - def __init__(self, *args, **kw): - if six.PY3: # Python 3 - kw.pop('strict', None) - if sys.version_info < (2, 7): # Python 2.6 and older - kw.pop('source_address', None) - - # Pre-set source_address in case we have an older Python like 2.6. - self.source_address = kw.get('source_address') - - #: The socket options provided by the user. If no options are - #: provided, we use the default options. - self.socket_options = kw.pop('socket_options', self.default_socket_options) - - # Superclass also sets self.source_address in Python 2.7+. - _HTTPConnection.__init__(self, *args, **kw) - - def _new_conn(self): - """ Establish a socket connection and set nodelay settings on it. - - :return: New socket connection. - """ - extra_args = [] - if self.source_address: # Python 2.7+ - extra_args.append(self.source_address) - - try: - conn = socket.create_connection( - (self.host, self.port), self.timeout, *extra_args) - - except SocketTimeout: - raise ConnectTimeoutError( - self, "Connection to %s timed out. (connect timeout=%s)" % - (self.host, self.timeout)) - - # Set options on the socket. - self._set_options_on(conn) - - return conn - - def _prepare_conn(self, conn): - self.sock = conn - # the _tunnel_host attribute was added in python 2.6.3 (via - # http://hg.python.org/cpython/rev/0f57b30a152f) so pythons 2.6(0-2) do - # not have them. - if getattr(self, '_tunnel_host', None): - # TODO: Fix tunnel so it doesn't depend on self.sock state. 
- self._tunnel() - # Mark this connection as not reusable - self.auto_open = 0 - - def _set_options_on(self, conn): - # Disable all socket options if the user passes ``socket_options=None`` - if self.socket_options is None: - return - - for opt in self.socket_options: - conn.setsockopt(*opt) - - def connect(self): - conn = self._new_conn() - self._prepare_conn(conn) - - -class HTTPSConnection(HTTPConnection): - default_port = port_by_scheme['https'] - - def __init__(self, host, port=None, key_file=None, cert_file=None, - strict=None, timeout=socket._GLOBAL_DEFAULT_TIMEOUT, **kw): - - HTTPConnection.__init__(self, host, port, strict=strict, - timeout=timeout, **kw) - - self.key_file = key_file - self.cert_file = cert_file - - # Required property for Google AppEngine 1.9.0 which otherwise causes - # HTTPS requests to go out as HTTP. (See Issue #356) - self._protocol = 'https' - - def connect(self): - conn = self._new_conn() - self._prepare_conn(conn) - self.sock = ssl.wrap_socket(conn, self.key_file, self.cert_file) - - -class VerifiedHTTPSConnection(HTTPSConnection): - """ - Based on httplib.HTTPSConnection but wraps the socket with - SSL certification. 
- """ - cert_reqs = None - ca_certs = None - ssl_version = None - - def set_cert(self, key_file=None, cert_file=None, - cert_reqs=None, ca_certs=None, - assert_hostname=None, assert_fingerprint=None): - - self.key_file = key_file - self.cert_file = cert_file - self.cert_reqs = cert_reqs - self.ca_certs = ca_certs - self.assert_hostname = assert_hostname - self.assert_fingerprint = assert_fingerprint - - def connect(self): - # Add certificate verification - conn = self._new_conn() - - resolved_cert_reqs = resolve_cert_reqs(self.cert_reqs) - resolved_ssl_version = resolve_ssl_version(self.ssl_version) - - hostname = self.host - if getattr(self, '_tunnel_host', None): - # _tunnel_host was added in Python 2.6.3 - # (See: http://hg.python.org/cpython/rev/0f57b30a152f) - - self.sock = conn - # Calls self._set_hostport(), so self.host is - # self._tunnel_host below. - self._tunnel() - # Mark this connection as not reusable - self.auto_open = 0 - - # Override the host with the one we're requesting data from. - hostname = self._tunnel_host - - # Wrap socket using verification with the root certs in - # trusted_root_certs - self.sock = ssl_wrap_socket(conn, self.key_file, self.cert_file, - cert_reqs=resolved_cert_reqs, - ca_certs=self.ca_certs, - server_hostname=hostname, - ssl_version=resolved_ssl_version) - - if resolved_cert_reqs != ssl.CERT_NONE: - if self.assert_fingerprint: - assert_fingerprint(self.sock.getpeercert(binary_form=True), - self.assert_fingerprint) - elif self.assert_hostname is not False: - match_hostname(self.sock.getpeercert(), - self.assert_hostname or hostname) - - -if ssl: - # Make a copy for testing. 
- UnverifiedHTTPSConnection = HTTPSConnection - HTTPSConnection = VerifiedHTTPSConnection diff -Nru python-urllib3-1.8.3/urllib3/contrib/pyopenssl.py python-urllib3-1.7.1/urllib3/contrib/pyopenssl.py --- python-urllib3-1.8.3/urllib3/contrib/pyopenssl.py 2014-04-18 05:35:23.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/contrib/pyopenssl.py 2013-09-25 16:01:09.000000000 +0000 @@ -1,7 +1,4 @@ -'''SSL with SNI_-support for Python 2. Follow these instructions if you would -like to verify SSL certificates in Python 2. Note, the default libraries do -*not* do certificate checking; you need to do additional work to validate -certificates yourself. +'''SSL with SNI-support for Python 2. This needs the following packages installed: @@ -9,15 +6,9 @@ * ndg-httpsclient (tested with 0.3.2) * pyasn1 (tested with 0.1.6) -You can install them with the following command: - - pip install pyopenssl ndg-httpsclient pyasn1 - -To activate certificate checking, call -:func:`~urllib3.contrib.pyopenssl.inject_into_urllib3` from your Python code -before you begin making HTTP requests. This can be done in a ``sitecustomize`` -module, or at any other time before your application begins using ``urllib3``, -like this:: +To activate it call :func:`~urllib3.contrib.pyopenssl.inject_into_urllib3`. +This can be done in a ``sitecustomize`` module, or at any other time before +your application begins using ``urllib3``, like this:: try: import urllib3.contrib.pyopenssl @@ -27,36 +18,17 @@ Now you can use :mod:`urllib3` as you normally would, and it will support SNI when the required modules are installed. - -Activating this module also has the positive side effect of disabling SSL/TLS -encryption in Python 2 (see `CRIME attack`_). - -If you want to configure the default list of supported cipher suites, you can -set the ``urllib3.contrib.pyopenssl.DEFAULT_SSL_CIPHER_LIST`` variable. - -Module Variables ----------------- - -:var DEFAULT_SSL_CIPHER_LIST: The list of supported SSL/TLS cipher suites. 
- Default: ``ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES: - ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+3DES:!aNULL:!MD5:!DSS`` - -.. _sni: https://en.wikipedia.org/wiki/Server_Name_Indication -.. _crime attack: https://en.wikipedia.org/wiki/CRIME_(security_exploit) - ''' from ndg.httpsclient.ssl_peer_verification import SUBJ_ALT_NAME_SUPPORT -from ndg.httpsclient.subj_alt_name import SubjectAltName as BaseSubjectAltName +from ndg.httpsclient.subj_alt_name import SubjectAltName import OpenSSL.SSL from pyasn1.codec.der import decoder as der_decoder -from pyasn1.type import univ, constraint -from socket import _fileobject, timeout +from socket import _fileobject import ssl -import select from cStringIO import StringIO -from .. import connection +from .. import connectionpool from .. import util __all__ = ['inject_into_urllib3', 'extract_from_urllib3'] @@ -77,54 +49,26 @@ + OpenSSL.SSL.VERIFY_FAIL_IF_NO_PEER_CERT, } -# A secure default. -# Sources for more information on TLS ciphers: -# -# - https://wiki.mozilla.org/Security/Server_Side_TLS -# - https://www.ssllabs.com/projects/best-practices/index.html -# - https://hynek.me/articles/hardening-your-web-servers-ssl-ciphers/ -# -# The general intent is: -# - Prefer cipher suites that offer perfect forward secrecy (DHE/ECDHE), -# - prefer ECDHE over DHE for better performance, -# - prefer any AES-GCM over any AES-CBC for better performance and security, -# - use 3DES as fallback which is secure but slow, -# - disable NULL authentication, MD5 MACs and DSS for security reasons. -DEFAULT_SSL_CIPHER_LIST = "ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:" + \ - "ECDH+AES128:DH+AES:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+3DES:" + \ - "!aNULL:!MD5:!DSS" - orig_util_HAS_SNI = util.HAS_SNI -orig_connection_ssl_wrap_socket = connection.ssl_wrap_socket +orig_connectionpool_ssl_wrap_socket = connectionpool.ssl_wrap_socket def inject_into_urllib3(): 'Monkey-patch urllib3 with PyOpenSSL-backed SSL-support.' 
- connection.ssl_wrap_socket = ssl_wrap_socket + connectionpool.ssl_wrap_socket = ssl_wrap_socket util.HAS_SNI = HAS_SNI def extract_from_urllib3(): 'Undo monkey-patching by :func:`inject_into_urllib3`.' - connection.ssl_wrap_socket = orig_connection_ssl_wrap_socket + connectionpool.ssl_wrap_socket = orig_connectionpool_ssl_wrap_socket util.HAS_SNI = orig_util_HAS_SNI ### Note: This is a slightly bug-fixed version of same from ndg-httpsclient. -class SubjectAltName(BaseSubjectAltName): - '''ASN.1 implementation for subjectAltNames support''' - - # There is no limit to how many SAN certificates a certificate may have, - # however this needs to have some limit so we'll set an arbitrarily high - # limit. - sizeSpec = univ.SequenceOf.sizeSpec + \ - constraint.ValueSizeConstraint(1, 1024) - - -### Note: This is a slightly bug-fixed version of same from ndg-httpsclient. def get_subj_alt_name(peer_cert): # Search through extensions dns_name = [] @@ -157,13 +101,6 @@ class fileobject(_fileobject): - def _wait_for_sock(self): - rd, wd, ed = select.select([self._sock], [], [], - self._sock.gettimeout()) - if not rd: - raise timeout() - - def read(self, size=-1): # Use max, disallow tiny reads in a loop as they are very inefficient. 
# We never leave read() with any leftover data from a new recv() call @@ -181,7 +118,6 @@ try: data = self._sock.recv(rbufsize) except OpenSSL.SSL.WantReadError: - self._wait_for_sock() continue if not data: break @@ -209,7 +145,6 @@ try: data = self._sock.recv(left) except OpenSSL.SSL.WantReadError: - self._wait_for_sock() continue if not data: break @@ -261,7 +196,6 @@ break buffers.append(data) except OpenSSL.SSL.WantReadError: - self._wait_for_sock() continue break return "".join(buffers) @@ -272,7 +206,6 @@ try: data = self._sock.recv(self._rbufsize) except OpenSSL.SSL.WantReadError: - self._wait_for_sock() continue if not data: break @@ -300,8 +233,7 @@ try: data = self._sock.recv(self._rbufsize) except OpenSSL.SSL.WantReadError: - self._wait_for_sock() - continue + continue if not data: break left = size - buf_len @@ -396,15 +328,6 @@ ctx.load_verify_locations(ca_certs, None) except OpenSSL.SSL.Error as e: raise ssl.SSLError('bad ca_certs: %r' % ca_certs, e) - else: - ctx.set_default_verify_paths() - - # Disable TLS compression to migitate CRIME attack (issue #309) - OP_NO_COMPRESSION = 0x20000 - ctx.set_options(OP_NO_COMPRESSION) - - # Set list of supported ciphersuites. - ctx.set_cipher_list(DEFAULT_SSL_CIPHER_LIST) cnx = OpenSSL.SSL.Connection(ctx, sock) cnx.set_tlsext_host_name(server_hostname) @@ -413,7 +336,6 @@ try: cnx.do_handshake() except OpenSSL.SSL.WantReadError: - select.select([sock], [], []) continue except OpenSSL.SSL.Error as e: raise ssl.SSLError('bad handshake', e) diff -Nru python-urllib3-1.8.3/urllib3/exceptions.py python-urllib3-1.7.1/urllib3/exceptions.py --- python-urllib3-1.8.3/urllib3/exceptions.py 2014-03-15 00:05:07.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/exceptions.py 2013-09-25 16:01:09.000000000 +0000 @@ -44,11 +44,6 @@ pass -class ConnectionError(HTTPError): - "Raised when a normal connection fails." - pass - - class DecodeError(HTTPError): "Raised when automatic decoding based on Content-Type fails." 
pass diff -Nru python-urllib3-1.8.3/urllib3/fields.py python-urllib3-1.7.1/urllib3/fields.py --- python-urllib3-1.8.3/urllib3/fields.py 2014-06-23 23:44:37.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/fields.py 2013-08-14 21:43:05.000000000 +0000 @@ -15,7 +15,7 @@ Guess the "Content-Type" of a file. :param filename: - The filename to guess the "Content-Type" of using :mod:`mimetypes`. + The filename to guess the "Content-Type" of using :mod:`mimetimes`. :param default: If no "Content-Type" can be guessed, default to `default`. """ @@ -78,10 +78,9 @@ """ A :class:`~urllib3.fields.RequestField` factory from old-style tuple parameters. - Supports constructing :class:`~urllib3.fields.RequestField` from - parameter of key/value strings AND key/filetuple. A filetuple is a - (filename, data, MIME type) tuple where the MIME type is optional. - For example: :: + Supports constructing :class:`~urllib3.fields.RequestField` from parameter + of key/value strings AND key/filetuple. A filetuple is a (filename, data, MIME type) + tuple where the MIME type is optional. For example: :: 'foo': 'bar', 'fakefile': ('foofile.txt', 'contents of foofile'), @@ -126,8 +125,8 @@ 'Content-Disposition' fields. :param header_parts: - A sequence of (k, v) typles or a :class:`dict` of (k, v) to format - as `k1="v1"; k2="v2"; ...`. + A sequence of (k, v) typles or a :class:`dict` of (k, v) to format as + `k1="v1"; k2="v2"; ...`. """ parts = [] iterable = header_parts @@ -159,8 +158,7 @@ lines.append('\r\n') return '\r\n'.join(lines) - def make_multipart(self, content_disposition=None, content_type=None, - content_location=None): + def make_multipart(self, content_disposition=None, content_type=None, content_location=None): """ Makes this request field into a multipart request field. 
@@ -174,10 +172,6 @@ """ self.headers['Content-Disposition'] = content_disposition or 'form-data' - self.headers['Content-Disposition'] += '; '.join([ - '', self._render_parts( - (('name', self._name), ('filename', self._filename)) - ) - ]) + self.headers['Content-Disposition'] += '; '.join(['', self._render_parts((('name', self._name), ('filename', self._filename)))]) self.headers['Content-Type'] = content_type self.headers['Content-Location'] = content_location diff -Nru python-urllib3-1.8.3/urllib3/filepost.py python-urllib3-1.7.1/urllib3/filepost.py --- python-urllib3-1.8.3/urllib3/filepost.py 2014-06-23 23:44:37.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/filepost.py 2013-08-14 21:43:05.000000000 +0000 @@ -5,6 +5,7 @@ # the MIT License: http://www.opensource.org/licenses/mit-license.php import codecs +import mimetypes from uuid import uuid4 from io import BytesIO @@ -37,23 +38,24 @@ i = iter(fields) for field in i: - if isinstance(field, RequestField): - yield field - else: - yield RequestField.from_tuples(*field) + if isinstance(field, RequestField): + yield field + else: + yield RequestField.from_tuples(*field) def iter_fields(fields): """ - .. deprecated:: 1.6 - Iterate over fields. - The addition of :class:`~urllib3.fields.RequestField` makes this function - obsolete. Instead, use :func:`iter_field_objects`, which returns - :class:`~urllib3.fields.RequestField` objects. + .. deprecated :: + + The addition of `~urllib3.fields.RequestField` makes this function + obsolete. Instead, use :func:`iter_field_objects`, which returns + `~urllib3.fields.RequestField` objects, instead. Supports list of (k, v) tuples and dicts. 
+ """ if isinstance(fields, dict): return ((k, v) for k, v in six.iteritems(fields)) diff -Nru python-urllib3-1.8.3/urllib3/__init__.py python-urllib3-1.7.1/urllib3/__init__.py --- python-urllib3-1.8.3/urllib3/__init__.py 2014-06-24 18:19:21.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/__init__.py 2013-09-25 16:05:54.000000000 +0000 @@ -10,7 +10,7 @@ __author__ = 'Andrey Petrov (andrey.petrov@shazow.net)' __license__ = 'MIT' -__version__ = '1.8.3' +__version__ = '1.7.1' from .connectionpool import ( diff -Nru python-urllib3-1.8.3/urllib3/packages/ssl_match_hostname/_implementation.py python-urllib3-1.7.1/urllib3/packages/ssl_match_hostname/_implementation.py --- python-urllib3-1.8.3/urllib3/packages/ssl_match_hostname/_implementation.py 2014-03-04 19:08:03.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/packages/ssl_match_hostname/_implementation.py 1970-01-01 00:00:00.000000000 +0000 @@ -1,105 +0,0 @@ -"""The match_hostname() function from Python 3.3.3, essential when using SSL.""" - -# Note: This file is under the PSF license as the code comes from the python -# stdlib. http://docs.python.org/3/license.html - -import re - -__version__ = '3.4.0.2' - -class CertificateError(ValueError): - pass - - -def _dnsname_match(dn, hostname, max_wildcards=1): - """Matching according to RFC 6125, section 6.4.3 - - http://tools.ietf.org/html/rfc6125#section-6.4.3 - """ - pats = [] - if not dn: - return False - - # Ported from python3-syntax: - # leftmost, *remainder = dn.split(r'.') - parts = dn.split(r'.') - leftmost = parts[0] - remainder = parts[1:] - - wildcards = leftmost.count('*') - if wildcards > max_wildcards: - # Issue #17980: avoid denials of service by refusing more - # than one wildcard per fragment. A survey of established - # policy among SSL implementations showed it to be a - # reasonable choice. 
- raise CertificateError( - "too many wildcards in certificate DNS name: " + repr(dn)) - - # speed up common case w/o wildcards - if not wildcards: - return dn.lower() == hostname.lower() - - # RFC 6125, section 6.4.3, subitem 1. - # The client SHOULD NOT attempt to match a presented identifier in which - # the wildcard character comprises a label other than the left-most label. - if leftmost == '*': - # When '*' is a fragment by itself, it matches a non-empty dotless - # fragment. - pats.append('[^.]+') - elif leftmost.startswith('xn--') or hostname.startswith('xn--'): - # RFC 6125, section 6.4.3, subitem 3. - # The client SHOULD NOT attempt to match a presented identifier - # where the wildcard character is embedded within an A-label or - # U-label of an internationalized domain name. - pats.append(re.escape(leftmost)) - else: - # Otherwise, '*' matches any dotless string, e.g. www* - pats.append(re.escape(leftmost).replace(r'\*', '[^.]*')) - - # add the remaining fragments, ignore any wildcards - for frag in remainder: - pats.append(re.escape(frag)) - - pat = re.compile(r'\A' + r'\.'.join(pats) + r'\Z', re.IGNORECASE) - return pat.match(hostname) - - -def match_hostname(cert, hostname): - """Verify that *cert* (in decoded format as returned by - SSLSocket.getpeercert()) matches the *hostname*. RFC 2818 and RFC 6125 - rules are followed, but IP addresses are not accepted for *hostname*. - - CertificateError is raised on failure. On success, the function - returns nothing. - """ - if not cert: - raise ValueError("empty or no certificate") - dnsnames = [] - san = cert.get('subjectAltName', ()) - for key, value in san: - if key == 'DNS': - if _dnsname_match(value, hostname): - return - dnsnames.append(value) - if not dnsnames: - # The subject is only checked when there is no dNSName entry - # in subjectAltName - for sub in cert.get('subject', ()): - for key, value in sub: - # XXX according to RFC 2818, the most specific Common Name - # must be used. 
- if key == 'commonName': - if _dnsname_match(value, hostname): - return - dnsnames.append(value) - if len(dnsnames) > 1: - raise CertificateError("hostname %r " - "doesn't match either of %s" - % (hostname, ', '.join(map(repr, dnsnames)))) - elif len(dnsnames) == 1: - raise CertificateError("hostname %r " - "doesn't match %r" - % (hostname, dnsnames[0])) - else: - raise CertificateError("no appropriate commonName or " - "subjectAltName fields were found") diff -Nru python-urllib3-1.8.3/urllib3/packages/ssl_match_hostname/__init__.py python-urllib3-1.7.1/urllib3/packages/ssl_match_hostname/__init__.py --- python-urllib3-1.8.3/urllib3/packages/ssl_match_hostname/__init__.py 2014-03-15 00:05:07.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/packages/ssl_match_hostname/__init__.py 2013-08-14 21:43:05.000000000 +0000 @@ -1,13 +1,98 @@ -try: - # Python 3.2+ - from ssl import CertificateError, match_hostname -except ImportError: - try: - # Backport of the function from a pypi module - from backports.ssl_match_hostname import CertificateError, match_hostname - except ImportError: - # Our vendored copy - from ._implementation import CertificateError, match_hostname +"""The match_hostname() function from Python 3.2, essential when using SSL.""" -# Not needed, but documenting what we provide. -__all__ = ('CertificateError', 'match_hostname') +import re + +__version__ = '3.2.2' + +class CertificateError(ValueError): + pass + +def _dnsname_match(dn, hostname, max_wildcards=1): + """Matching according to RFC 6125, section 6.4.3 + + http://tools.ietf.org/html/rfc6125#section-6.4.3 + """ + pats = [] + if not dn: + return False + + parts = dn.split(r'.') + leftmost = parts[0] + + wildcards = leftmost.count('*') + if wildcards > max_wildcards: + # Issue #17980: avoid denials of service by refusing more + # than one wildcard per fragment. A survery of established + # policy among SSL implementations showed it to be a + # reasonable choice. 
+ raise CertificateError( + "too many wildcards in certificate DNS name: " + repr(dn)) + + # speed up common case w/o wildcards + if not wildcards: + return dn.lower() == hostname.lower() + + # RFC 6125, section 6.4.3, subitem 1. + # The client SHOULD NOT attempt to match a presented identifier in which + # the wildcard character comprises a label other than the left-most label. + if leftmost == '*': + # When '*' is a fragment by itself, it matches a non-empty dotless + # fragment. + pats.append('[^.]+') + elif leftmost.startswith('xn--') or hostname.startswith('xn--'): + # RFC 6125, section 6.4.3, subitem 3. + # The client SHOULD NOT attempt to match a presented identifier + # where the wildcard character is embedded within an A-label or + # U-label of an internationalized domain name. + pats.append(re.escape(leftmost)) + else: + # Otherwise, '*' matches any dotless string, e.g. www* + pats.append(re.escape(leftmost).replace(r'\*', '[^.]*')) + + # add the remaining fragments, ignore any wildcards + for frag in parts[1:]: + pats.append(re.escape(frag)) + + pat = re.compile(r'\A' + r'\.'.join(pats) + r'\Z', re.IGNORECASE) + return pat.match(hostname) + + +def match_hostname(cert, hostname): + """Verify that *cert* (in decoded format as returned by + SSLSocket.getpeercert()) matches the *hostname*. RFC 2818 and RFC 6125 + rules are followed, but IP addresses are not accepted for *hostname*. + + CertificateError is raised on failure. On success, the function + returns nothing. + """ + if not cert: + raise ValueError("empty or no certificate") + dnsnames = [] + san = cert.get('subjectAltName', ()) + for key, value in san: + if key == 'DNS': + if _dnsname_match(value, hostname): + return + dnsnames.append(value) + if not dnsnames: + # The subject is only checked when there is no dNSName entry + # in subjectAltName + for sub in cert.get('subject', ()): + for key, value in sub: + # XXX according to RFC 2818, the most specific Common Name + # must be used. 
+ if key == 'commonName': + if _dnsname_match(value, hostname): + return + dnsnames.append(value) + if len(dnsnames) > 1: + raise CertificateError("hostname %r " + "doesn't match either of %s" + % (hostname, ', '.join(map(repr, dnsnames)))) + elif len(dnsnames) == 1: + raise CertificateError("hostname %r " + "doesn't match %r" + % (hostname, dnsnames[0])) + else: + raise CertificateError("no appropriate commonName or " + "subjectAltName fields were found") diff -Nru python-urllib3-1.8.3/urllib3/poolmanager.py python-urllib3-1.7.1/urllib3/poolmanager.py --- python-urllib3-1.8.3/urllib3/poolmanager.py 2014-06-23 23:44:37.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/poolmanager.py 2013-08-14 21:43:05.000000000 +0000 @@ -1,5 +1,5 @@ # urllib3/poolmanager.py -# Copyright 2008-2014 Andrey Petrov and contributors (see CONTRIBUTORS.txt) +# Copyright 2008-2013 Andrey Petrov and contributors (see CONTRIBUTORS.txt) # # This module is part of urllib3 and is released under # the MIT License: http://www.opensource.org/licenses/mit-license.php @@ -161,7 +161,7 @@ # Support relative URLs for redirecting. redirect_location = urljoin(url, redirect_location) - # RFC 7231, Section 6.4.4 + # RFC 2616, Section 10.3.4 if response.status == 303: method = 'GET' @@ -176,7 +176,7 @@ Behaves just like :class:`PoolManager`, but sends all requests through the defined proxy, using the CONNECT method for HTTPS URLs. - :param proxy_url: + :param poxy_url: The URL of the proxy to be used. :param proxy_headers: @@ -245,11 +245,12 @@ u = parse_url(url) if u.scheme == "http": - # For proxied HTTPS requests, httplib sets the necessary headers - # on the CONNECT to the proxy. For HTTP, we'll definitely - # need to set 'Host' at the very least. + # It's too late to set proxy headers on per-request basis for + # tunnelled HTTPS connections, should use + # constructor's proxy_headers instead. 
kw['headers'] = self._set_proxy_headers(url, kw.get('headers', self.headers)) + kw['headers'].update(self.proxy_headers) return super(ProxyManager, self).urlopen(method, url, redirect, **kw) diff -Nru python-urllib3-1.8.3/urllib3/request.py python-urllib3-1.7.1/urllib3/request.py --- python-urllib3-1.8.3/urllib3/request.py 2014-06-23 23:44:37.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/request.py 2013-08-14 21:43:05.000000000 +0000 @@ -26,8 +26,8 @@ Specifically, - :meth:`.request_encode_url` is for sending requests whose fields are - encoded in the URL (such as GET, HEAD, DELETE). + :meth:`.request_encode_url` is for sending requests whose fields are encoded + in the URL (such as GET, HEAD, DELETE). :meth:`.request_encode_body` is for sending requests whose fields are encoded in the *body* of the request using multipart or www-form-urlencoded @@ -45,13 +45,14 @@ """ _encode_url_methods = set(['DELETE', 'GET', 'HEAD', 'OPTIONS']) + _encode_body_methods = set(['PATCH', 'POST', 'PUT', 'TRACE']) def __init__(self, headers=None): self.headers = headers or {} def urlopen(self, method, url, body=None, headers=None, encode_multipart=True, multipart_boundary=None, - **kw): # Abstract + **kw): # Abstract raise NotImplemented("Classes extending RequestMethods must implement " "their own ``urlopen`` method.") @@ -61,8 +62,8 @@ ``fields`` based on the ``method`` used. This is a convenience method that requires the least amount of manual - effort. It can be used in most situations, while still having the - option to drop down to more specific methods when necessary, such as + effort. It can be used in most situations, while still having the option + to drop down to more specific methods when necessary, such as :meth:`request_encode_url`, :meth:`request_encode_body`, or even the lowest level :meth:`urlopen`. 
""" @@ -70,12 +71,12 @@ if method in self._encode_url_methods: return self.request_encode_url(method, url, fields=fields, - headers=headers, - **urlopen_kw) - else: - return self.request_encode_body(method, url, fields=fields, headers=headers, **urlopen_kw) + else: + return self.request_encode_body(method, url, fields=fields, + headers=headers, + **urlopen_kw) def request_encode_url(self, method, url, fields=None, **urlopen_kw): """ @@ -94,14 +95,14 @@ the body. This is useful for request methods like POST, PUT, PATCH, etc. When ``encode_multipart=True`` (default), then - :meth:`urllib3.filepost.encode_multipart_formdata` is used to encode - the payload with the appropriate content type. Otherwise + :meth:`urllib3.filepost.encode_multipart_formdata` is used to encode the + payload with the appropriate content type. Otherwise :meth:`urllib.urlencode` is used with the 'application/x-www-form-urlencoded' content type. Multipart encoding must be used when posting files, and it's reasonably - safe to use it in other times too. However, it may break request - signing, such as with OAuth. + safe to use it in other times too. However, it may break request signing, + such as with OAuth. Supports an optional ``fields`` parameter of key/value strings AND key/filetuple. A filetuple is a (filename, data, MIME type) tuple where @@ -119,17 +120,17 @@ When uploading a file, providing a filename (the first parameter of the tuple) is optional but recommended to best mimick behavior of browsers. - Note that if ``headers`` are supplied, the 'Content-Type' header will - be overwritten because it depends on the dynamic random boundary string + Note that if ``headers`` are supplied, the 'Content-Type' header will be + overwritten because it depends on the dynamic random boundary string which is used to compose the body of the request. The random boundary string can be explicitly set with the ``multipart_boundary`` parameter. 
""" if encode_multipart: - body, content_type = encode_multipart_formdata( - fields or {}, boundary=multipart_boundary) + body, content_type = encode_multipart_formdata(fields or {}, + boundary=multipart_boundary) else: body, content_type = (urlencode(fields or {}), - 'application/x-www-form-urlencoded') + 'application/x-www-form-urlencoded') if headers is None: headers = self.headers diff -Nru python-urllib3-1.8.3/urllib3/response.py python-urllib3-1.7.1/urllib3/response.py --- python-urllib3-1.8.3/urllib3/response.py 2014-06-23 23:44:37.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/response.py 2013-09-25 16:01:09.000000000 +0000 @@ -5,16 +5,18 @@ # the MIT License: http://www.opensource.org/licenses/mit-license.php +import logging import zlib import io -from socket import timeout as SocketTimeout -from ._collections import HTTPHeaderDict -from .exceptions import DecodeError, ReadTimeoutError +from .exceptions import DecodeError from .packages.six import string_types as basestring, binary_type from .util import is_fp_closed +log = logging.getLogger(__name__) + + class DeflateDecoder(object): def __init__(self): @@ -77,10 +79,7 @@ def __init__(self, body='', headers=None, status=0, version=0, reason=None, strict=0, preload_content=True, decode_content=True, original_response=None, pool=None, connection=None): - - self.headers = HTTPHeaderDict() - if headers: - self.headers.update(headers) + self.headers = headers or {} self.status = status self.version = version self.reason = reason @@ -91,7 +90,6 @@ self._body = body if body and isinstance(body, basestring) else None self._fp = None self._original_response = original_response - self._fp_bytes_read = 0 self._pool = pool self._connection = connection @@ -131,14 +129,6 @@ if self._fp: return self.read(cache_content=True) - def tell(self): - """ - Obtain the number of bytes pulled over the wire so far. 
May differ from - the amount of content returned by :meth:``HTTPResponse.read`` if bytes - are encoded on the wire (e.g, compressed). - """ - return self._fp_bytes_read - def read(self, amt=None, decode_content=None, cache_content=False): """ Similar to :meth:`httplib.HTTPResponse.read`, but with two additional @@ -160,8 +150,8 @@ after having ``.read()`` the file object. (Overridden if ``amt`` is set.) """ - # Note: content-encoding value should be case-insensitive, per RFC 7230 - # Section 3.2 + # Note: content-encoding value should be case-insensitive, per RFC 2616 + # Section 3.5 content_encoding = self.headers.get('content-encoding', '').lower() if self._decoder is None: if content_encoding in self.CONTENT_DECODERS: @@ -175,31 +165,23 @@ flush_decoder = False try: - try: - if amt is None: - # cStringIO doesn't like amt=None - data = self._fp.read() + if amt is None: + # cStringIO doesn't like amt=None + data = self._fp.read() + flush_decoder = True + else: + cache_content = False + data = self._fp.read(amt) + if amt != 0 and not data: # Platform-specific: Buggy versions of Python. + # Close the connection when no data is returned + # + # This is redundant to what httplib/http.client _should_ + # already do. However, versions of python released before + # December 15, 2012 (http://bugs.python.org/issue16298) do not + # properly close the connection in all cases. There is no harm + # in redundantly calling close. + self._fp.close() flush_decoder = True - else: - cache_content = False - data = self._fp.read(amt) - if amt != 0 and not data: # Platform-specific: Buggy versions of Python. - # Close the connection when no data is returned - # - # This is redundant to what httplib/http.client _should_ - # already do. However, versions of python released before - # December 15, 2012 (http://bugs.python.org/issue16298) do - # not properly close the connection in all cases. There is - # no harm in redundantly calling close. 
- self._fp.close() - flush_decoder = True - - except SocketTimeout: - # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but - # there is yet no clean way to get at it from this context. - raise ReadTimeoutError(self._pool, None, 'Read timed out.') - - self._fp_bytes_read += len(data) try: if decode_content and self._decoder: @@ -207,7 +189,8 @@ except (IOError, zlib.error) as e: raise DecodeError( "Received response with content-encoding: %s, but " - "failed to decode it." % content_encoding, e) + "failed to decode it." % content_encoding, + e) if flush_decoder and decode_content and self._decoder: buf = self._decoder.decompress(binary_type()) @@ -244,6 +227,7 @@ if data: yield data + @classmethod def from_httplib(ResponseCls, r, **response_kw): """ @@ -254,9 +238,17 @@ with ``original_response=r``. """ - headers = HTTPHeaderDict() + # Normalize headers between different versions of Python + headers = {} for k, v in r.getheaders(): - headers.add(k, v) + # Python 3: Header keys are returned capitalised + k = k.lower() + + has_value = headers.get(k) + if has_value: # Python 3: Repeating header keys are unmerged. 
+ v = ', '.join([has_value, v]) + + headers[k] = v # HTTPResponse objects in Python 3 don't have a .strict attribute strict = getattr(r, 'strict', 0) @@ -298,7 +290,7 @@ elif hasattr(self._fp, "fileno"): return self._fp.fileno() else: - raise IOError("The file-like object this HTTPResponse is wrapped " + raise IOError("The file-like object this HTTPResponse is wrapped " "around has no file descriptor") def flush(self): diff -Nru python-urllib3-1.8.3/urllib3/util/connection.py python-urllib3-1.7.1/urllib3/util/connection.py --- python-urllib3-1.8.3/urllib3/util/connection.py 2014-06-23 23:44:37.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/util/connection.py 1970-01-01 00:00:00.000000000 +0000 @@ -1,43 +0,0 @@ -from socket import error as SocketError -try: - from select import poll, POLLIN -except ImportError: # `poll` doesn't exist on OSX and other platforms - poll = False - try: - from select import select - except ImportError: # `select` doesn't exist on AppEngine. - select = False - - -def is_connection_dropped(conn): # Platform-specific - """ - Returns True if the connection is dropped and should be closed. - - :param conn: - :class:`httplib.HTTPConnection` object. - - Note: For platforms like AppEngine, this will always return ``False`` to - let the platform handle connection recycling transparently for us. - """ - sock = getattr(conn, 'sock', False) - if sock is False: # Platform-specific: AppEngine - return False - if sock is None: # Connection already closed (such as by httplib). - return True - - if not poll: - if not select: # Platform-specific: AppEngine - return False - - try: - return select([sock], [], [], 0.0)[0] - except SocketError: - return True - - # This version is better on platforms that support it. - p = poll() - p.register(sock, POLLIN) - for (fno, ev) in p.poll(0.0): - if fno == sock.fileno(): - # Either data is buffered (bad), or the connection is dropped. 
- return True diff -Nru python-urllib3-1.8.3/urllib3/util/__init__.py python-urllib3-1.7.1/urllib3/util/__init__.py --- python-urllib3-1.8.3/urllib3/util/__init__.py 2014-04-18 05:35:23.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/util/__init__.py 1970-01-01 00:00:00.000000000 +0000 @@ -1,27 +0,0 @@ -# urllib3/util/__init__.py -# Copyright 2008-2014 Andrey Petrov and contributors (see CONTRIBUTORS.txt) -# -# This module is part of urllib3 and is released under -# the MIT License: http://www.opensource.org/licenses/mit-license.php - -from .connection import is_connection_dropped -from .request import make_headers -from .response import is_fp_closed -from .ssl_ import ( - SSLContext, - HAS_SNI, - assert_fingerprint, - resolve_cert_reqs, - resolve_ssl_version, - ssl_wrap_socket, -) -from .timeout import ( - current_time, - Timeout, -) -from .url import ( - get_host, - parse_url, - split_first, - Url, -) diff -Nru python-urllib3-1.8.3/urllib3/util/request.py python-urllib3-1.7.1/urllib3/util/request.py --- python-urllib3-1.8.3/urllib3/util/request.py 2014-06-23 23:44:37.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/util/request.py 1970-01-01 00:00:00.000000000 +0000 @@ -1,72 +0,0 @@ -from base64 import b64encode - -from ..packages import six - - -ACCEPT_ENCODING = 'gzip,deflate' - - -def make_headers(keep_alive=None, accept_encoding=None, user_agent=None, - basic_auth=None, proxy_basic_auth=None, disable_cache=None): - """ - Shortcuts for generating request headers. - - :param keep_alive: - If ``True``, adds 'connection: keep-alive' header. - - :param accept_encoding: - Can be a boolean, list, or string. - ``True`` translates to 'gzip,deflate'. - List will get joined by comma. - String will be used as provided. - - :param user_agent: - String representing the user-agent you want, such as - "python-urllib3/0.6" - - :param basic_auth: - Colon-separated username:password string for 'authorization: basic ...' - auth header. 
- - :param proxy_basic_auth: - Colon-separated username:password string for - 'proxy-authorization: basic ...' auth header. - - :param disable_cache: - If ``True``, adds 'cache-control: no-cache' header. - - Example: :: - - >>> make_headers(keep_alive=True, user_agent="Batman/1.0") - {'connection': 'keep-alive', 'user-agent': 'Batman/1.0'} - >>> make_headers(accept_encoding=True) - {'accept-encoding': 'gzip,deflate'} - """ - headers = {} - if accept_encoding: - if isinstance(accept_encoding, str): - pass - elif isinstance(accept_encoding, list): - accept_encoding = ','.join(accept_encoding) - else: - accept_encoding = ACCEPT_ENCODING - headers['accept-encoding'] = accept_encoding - - if user_agent: - headers['user-agent'] = user_agent - - if keep_alive: - headers['connection'] = 'keep-alive' - - if basic_auth: - headers['authorization'] = 'Basic ' + \ - b64encode(six.b(basic_auth)).decode('utf-8') - - if proxy_basic_auth: - headers['proxy-authorization'] = 'Basic ' + \ - b64encode(six.b(proxy_basic_auth)).decode('utf-8') - - if disable_cache: - headers['cache-control'] = 'no-cache' - - return headers diff -Nru python-urllib3-1.8.3/urllib3/util/response.py python-urllib3-1.7.1/urllib3/util/response.py --- python-urllib3-1.8.3/urllib3/util/response.py 2014-04-18 05:35:23.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/util/response.py 1970-01-01 00:00:00.000000000 +0000 @@ -1,13 +0,0 @@ -def is_fp_closed(obj): - """ - Checks whether a given file-like object is closed. - - :param obj: - The file-like object to check. - """ - if hasattr(obj, 'fp'): - # Object is a container for another file-like object that gets released - # on exhaustion (e.g. 
HTTPResponse) - return obj.fp is None - - return obj.closed diff -Nru python-urllib3-1.8.3/urllib3/util/ssl_.py python-urllib3-1.7.1/urllib3/util/ssl_.py --- python-urllib3-1.8.3/urllib3/util/ssl_.py 2014-04-18 05:35:23.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/util/ssl_.py 1970-01-01 00:00:00.000000000 +0000 @@ -1,133 +0,0 @@ -from binascii import hexlify, unhexlify -from hashlib import md5, sha1 - -from ..exceptions import SSLError - - -try: # Test for SSL features - SSLContext = None - HAS_SNI = False - - import ssl - from ssl import wrap_socket, CERT_NONE, PROTOCOL_SSLv23 - from ssl import SSLContext # Modern SSL? - from ssl import HAS_SNI # Has SNI? -except ImportError: - pass - - -def assert_fingerprint(cert, fingerprint): - """ - Checks if given fingerprint matches the supplied certificate. - - :param cert: - Certificate as bytes object. - :param fingerprint: - Fingerprint as string of hexdigits, can be interspersed by colons. - """ - - # Maps the length of a digest to a possible hash function producing - # this digest. - hashfunc_map = { - 16: md5, - 20: sha1 - } - - fingerprint = fingerprint.replace(':', '').lower() - - digest_length, rest = divmod(len(fingerprint), 2) - - if rest or digest_length not in hashfunc_map: - raise SSLError('Fingerprint is of invalid length.') - - # We need encode() here for py32; works on py2 and p33. - fingerprint_bytes = unhexlify(fingerprint.encode()) - - hashfunc = hashfunc_map[digest_length] - - cert_digest = hashfunc(cert).digest() - - if not cert_digest == fingerprint_bytes: - raise SSLError('Fingerprints did not match. Expected "{0}", got "{1}".' - .format(hexlify(fingerprint_bytes), - hexlify(cert_digest))) - - -def resolve_cert_reqs(candidate): - """ - Resolves the argument to a numeric constant, which can be passed to - the wrap_socket function/method from the ssl module. - Defaults to :data:`ssl.CERT_NONE`. 
- If given a string it is assumed to be the name of the constant in the - :mod:`ssl` module or its abbrevation. - (So you can specify `REQUIRED` instead of `CERT_REQUIRED`. - If it's neither `None` nor a string we assume it is already the numeric - constant which can directly be passed to wrap_socket. - """ - if candidate is None: - return CERT_NONE - - if isinstance(candidate, str): - res = getattr(ssl, candidate, None) - if res is None: - res = getattr(ssl, 'CERT_' + candidate) - return res - - return candidate - - -def resolve_ssl_version(candidate): - """ - like resolve_cert_reqs - """ - if candidate is None: - return PROTOCOL_SSLv23 - - if isinstance(candidate, str): - res = getattr(ssl, candidate, None) - if res is None: - res = getattr(ssl, 'PROTOCOL_' + candidate) - return res - - return candidate - - -if SSLContext is not None: # Python 3.2+ - def ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None, - ca_certs=None, server_hostname=None, - ssl_version=None): - """ - All arguments except `server_hostname` have the same meaning as for - :func:`ssl.wrap_socket` - - :param server_hostname: - Hostname of the expected certificate - """ - context = SSLContext(ssl_version) - context.verify_mode = cert_reqs - - # Disable TLS compression to migitate CRIME attack (issue #309) - OP_NO_COMPRESSION = 0x20000 - context.options |= OP_NO_COMPRESSION - - if ca_certs: - try: - context.load_verify_locations(ca_certs) - # Py32 raises IOError - # Py33 raises FileNotFoundError - except Exception as e: # Reraise as SSLError - raise SSLError(e) - if certfile: - # FIXME: This block needs a test. 
- context.load_cert_chain(certfile, keyfile) - if HAS_SNI: # Platform-specific: OpenSSL with enabled SNI - return context.wrap_socket(sock, server_hostname=server_hostname) - return context.wrap_socket(sock) - -else: # Python 3.1 and earlier - def ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None, - ca_certs=None, server_hostname=None, - ssl_version=None): - return wrap_socket(sock, keyfile=keyfile, certfile=certfile, - ca_certs=ca_certs, cert_reqs=cert_reqs, - ssl_version=ssl_version) diff -Nru python-urllib3-1.8.3/urllib3/util/timeout.py python-urllib3-1.7.1/urllib3/util/timeout.py --- python-urllib3-1.8.3/urllib3/util/timeout.py 2014-06-23 23:44:37.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/util/timeout.py 1970-01-01 00:00:00.000000000 +0000 @@ -1,233 +0,0 @@ -from socket import _GLOBAL_DEFAULT_TIMEOUT -import time - -from ..exceptions import TimeoutStateError - - -def current_time(): - """ - Retrieve the current time, this function is mocked out in unit testing. - """ - return time.time() - - -_Default = object() -# The default timeout to use for socket connections. This is the attribute used -# by httplib to define the default timeout - - -class Timeout(object): - """ - Utility object for storing timeout values. - - Example usage: - - .. code-block:: python - - timeout = urllib3.util.Timeout(connect=2.0, read=7.0) - pool = HTTPConnectionPool('www.google.com', 80, timeout=timeout) - pool.request(...) # Etc, etc - - :param connect: - The maximum amount of time to wait for a connection attempt to a server - to succeed. Omitting the parameter will default the connect timeout to - the system default, probably `the global default timeout in socket.py - `_. - None will set an infinite timeout for connection attempts. - - :type connect: integer, float, or None - - :param read: - The maximum amount of time to wait between consecutive - read operations for a response from the server. 
Omitting - the parameter will default the read timeout to the system - default, probably `the global default timeout in socket.py - `_. - None will set an infinite timeout. - - :type read: integer, float, or None - - :param total: - This combines the connect and read timeouts into one; the read timeout - will be set to the time leftover from the connect attempt. In the - event that both a connect timeout and a total are specified, or a read - timeout and a total are specified, the shorter timeout will be applied. - - Defaults to None. - - :type total: integer, float, or None - - .. note:: - - Many factors can affect the total amount of time for urllib3 to return - an HTTP response. Specifically, Python's DNS resolver does not obey the - timeout specified on the socket. Other factors that can affect total - request time include high CPU load, high swap, the program running at a - low priority level, or other behaviors. The observed running time for - urllib3 to return a response may be greater than the value passed to - `total`. - - In addition, the read and total timeouts only measure the time between - read operations on the socket connecting the client and the server, - not the total amount of time for the request to return a complete - response. For most requests, the timeout is raised because the server - has not sent the first byte in the specified time. This is not always - the case; if a server streams one byte every fifteen seconds, a timeout - of 20 seconds will not ever trigger, even though the request will - take several minutes to complete. - - If your goal is to cut off any request after a set amount of wall clock - time, consider having a second "watcher" thread to cut off a slow - request. 
- """ - - #: A sentinel object representing the default timeout value - DEFAULT_TIMEOUT = _GLOBAL_DEFAULT_TIMEOUT - - def __init__(self, total=None, connect=_Default, read=_Default): - self._connect = self._validate_timeout(connect, 'connect') - self._read = self._validate_timeout(read, 'read') - self.total = self._validate_timeout(total, 'total') - self._start_connect = None - - def __str__(self): - return '%s(connect=%r, read=%r, total=%r)' % ( - type(self).__name__, self._connect, self._read, self.total) - - @classmethod - def _validate_timeout(cls, value, name): - """ Check that a timeout attribute is valid. - - :param value: The timeout value to validate - :param name: The name of the timeout attribute to validate. This is - used to specify in error messages. - :return: The validated and casted version of the given value. - :raises ValueError: If the type is not an integer or a float, or if it - is a numeric value less than zero. - """ - if value is _Default: - return cls.DEFAULT_TIMEOUT - - if value is None or value is cls.DEFAULT_TIMEOUT: - return value - - try: - float(value) - except (TypeError, ValueError): - raise ValueError("Timeout value %s was %s, but it must be an " - "int or float." % (name, value)) - - try: - if value < 0: - raise ValueError("Attempted to set %s timeout to %s, but the " - "timeout cannot be set to a value less " - "than 0." % (name, value)) - except TypeError: # Python 3 - raise ValueError("Timeout value %s was %s, but it must be an " - "int or float." % (name, value)) - - return value - - @classmethod - def from_float(cls, timeout): - """ Create a new Timeout from a legacy timeout value. - - The timeout value used by httplib.py sets the same timeout on the - connect(), and recv() socket requests. This creates a :class:`Timeout` - object that sets the individual timeouts to the ``timeout`` value - passed to this function. - - :param timeout: The legacy timeout value. 
- :type timeout: integer, float, sentinel default object, or None - :return: Timeout object - :rtype: :class:`Timeout` - """ - return Timeout(read=timeout, connect=timeout) - - def clone(self): - """ Create a copy of the timeout object - - Timeout properties are stored per-pool but each request needs a fresh - Timeout object to ensure each one has its own start/stop configured. - - :return: a copy of the timeout object - :rtype: :class:`Timeout` - """ - # We can't use copy.deepcopy because that will also create a new object - # for _GLOBAL_DEFAULT_TIMEOUT, which socket.py uses as a sentinel to - # detect the user default. - return Timeout(connect=self._connect, read=self._read, - total=self.total) - - def start_connect(self): - """ Start the timeout clock, used during a connect() attempt - - :raises urllib3.exceptions.TimeoutStateError: if you attempt - to start a timer that has been started already. - """ - if self._start_connect is not None: - raise TimeoutStateError("Timeout timer has already been started.") - self._start_connect = current_time() - return self._start_connect - - def get_connect_duration(self): - """ Gets the time elapsed since the call to :meth:`start_connect`. - - :return: Elapsed time. - :rtype: float - :raises urllib3.exceptions.TimeoutStateError: if you attempt - to get duration for a timer that hasn't been started. - """ - if self._start_connect is None: - raise TimeoutStateError("Can't get connect duration for timer " - "that has not started.") - return current_time() - self._start_connect - - @property - def connect_timeout(self): - """ Get the value to use when setting a connection timeout. - - This will be a positive float or integer, the value None - (never timeout), or the default system timeout. - - :return: Connect timeout. 
- :rtype: int, float, :attr:`Timeout.DEFAULT_TIMEOUT` or None - """ - if self.total is None: - return self._connect - - if self._connect is None or self._connect is self.DEFAULT_TIMEOUT: - return self.total - - return min(self._connect, self.total) - - @property - def read_timeout(self): - """ Get the value for the read timeout. - - This assumes some time has elapsed in the connection timeout and - computes the read timeout appropriately. - - If self.total is set, the read timeout is dependent on the amount of - time taken by the connect timeout. If the connection time has not been - established, a :exc:`~urllib3.exceptions.TimeoutStateError` will be - raised. - - :return: Value to use for the read timeout. - :rtype: int, float, :attr:`Timeout.DEFAULT_TIMEOUT` or None - :raises urllib3.exceptions.TimeoutStateError: If :meth:`start_connect` - has not yet been called on this object. - """ - if (self.total is not None and - self.total is not self.DEFAULT_TIMEOUT and - self._read is not None and - self._read is not self.DEFAULT_TIMEOUT): - # In case the connect timeout has not yet been established. - if self._start_connect is None: - return self._read - return max(0, min(self.total - self.get_connect_duration(), - self._read)) - elif self.total is not None and self.total is not self.DEFAULT_TIMEOUT: - return max(0, self.total - self.get_connect_duration()) - else: - return self._read diff -Nru python-urllib3-1.8.3/urllib3/util/url.py python-urllib3-1.7.1/urllib3/util/url.py --- python-urllib3-1.8.3/urllib3/util/url.py 2014-06-23 23:44:37.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/util/url.py 1970-01-01 00:00:00.000000000 +0000 @@ -1,166 +0,0 @@ -from collections import namedtuple - -from ..exceptions import LocationParseError - -url_attrs = ['scheme', 'auth', 'host', 'port', 'path', 'query', 'fragment'] - - -class Url(namedtuple('Url', url_attrs)): - """ - Datastructure for representing an HTTP URL. Used as a return value for - :func:`parse_url`. 
- """ - slots = () - - def __new__(cls, scheme=None, auth=None, host=None, port=None, path=None, - query=None, fragment=None): - return super(Url, cls).__new__(cls, scheme, auth, host, port, path, - query, fragment) - - @property - def hostname(self): - """For backwards-compatibility with urlparse. We're nice like that.""" - return self.host - - @property - def request_uri(self): - """Absolute path including the query string.""" - uri = self.path or '/' - - if self.query is not None: - uri += '?' + self.query - - return uri - - @property - def netloc(self): - """Network location including host and port""" - if self.port: - return '%s:%d' % (self.host, self.port) - return self.host - - -def split_first(s, delims): - """ - Given a string and an iterable of delimiters, split on the first found - delimiter. Return two split parts and the matched delimiter. - - If not found, then the first part is the full input string. - - Example: :: - - >>> split_first('foo/bar?baz', '?/=') - ('foo', 'bar?baz', '/') - >>> split_first('foo/bar?baz', '123') - ('foo/bar?baz', '', None) - - Scales linearly with number of delims. Not ideal for large number of delims. - """ - min_idx = None - min_delim = None - for d in delims: - idx = s.find(d) - if idx < 0: - continue - - if min_idx is None or idx < min_idx: - min_idx = idx - min_delim = d - - if min_idx is None or min_idx < 0: - return s, '', None - - return s[:min_idx], s[min_idx+1:], min_delim - - -def parse_url(url): - """ - Given a url, return a parsed :class:`.Url` namedtuple. Best-effort is - performed to parse incomplete urls. Fields not provided will be None. - - Partly backwards-compatible with :mod:`urlparse`. - - Example: :: - - >>> parse_url('http://google.com/mail/') - Url(scheme='http', host='google.com', port=None, path='/', ...) - >>> parse_url('google.com:80') - Url(scheme=None, host='google.com', port=80, path=None, ...) - >>> parse_url('/foo?bar') - Url(scheme=None, host=None, port=None, path='/foo', query='bar', ...) 
- """ - - # While this code has overlap with stdlib's urlparse, it is much - # simplified for our needs and less annoying. - # Additionally, this implementations does silly things to be optimal - # on CPython. - - scheme = None - auth = None - host = None - port = None - path = None - fragment = None - query = None - - # Scheme - if '://' in url: - scheme, url = url.split('://', 1) - - # Find the earliest Authority Terminator - # (http://tools.ietf.org/html/rfc3986#section-3.2) - url, path_, delim = split_first(url, ['/', '?', '#']) - - if delim: - # Reassemble the path - path = delim + path_ - - # Auth - if '@' in url: - # Last '@' denotes end of auth part - auth, url = url.rsplit('@', 1) - - # IPv6 - if url and url[0] == '[': - host, url = url.split(']', 1) - host += ']' - - # Port - if ':' in url: - _host, port = url.split(':', 1) - - if not host: - host = _host - - if port: - # If given, ports must be integers. - if not port.isdigit(): - raise LocationParseError(url) - port = int(port) - else: - # Blank ports are cool, too. (rfc3986#section-3.2.3) - port = None - - elif not host and url: - host = url - - if not path: - return Url(scheme, auth, host, port, path, query, fragment) - - # Fragment - if '#' in path: - path, fragment = path.split('#', 1) - - # Query - if '?' in path: - path, query = path.split('?', 1) - - return Url(scheme, auth, host, port, path, query, fragment) - - -def get_host(url): - """ - Deprecated. Use :func:`.parse_url` instead. 
- """ - p = parse_url(url) - return p.scheme or 'http', p.hostname, p.port diff -Nru python-urllib3-1.8.3/urllib3/util.py python-urllib3-1.7.1/urllib3/util.py --- python-urllib3-1.8.3/urllib3/util.py 1970-01-01 00:00:00.000000000 +0000 +++ python-urllib3-1.7.1/urllib3/util.py 2013-09-25 16:01:09.000000000 +0000 @@ -0,0 +1,626 @@ +# urllib3/util.py +# Copyright 2008-2013 Andrey Petrov and contributors (see CONTRIBUTORS.txt) +# +# This module is part of urllib3 and is released under +# the MIT License: http://www.opensource.org/licenses/mit-license.php + + +from base64 import b64encode +from binascii import hexlify, unhexlify +from collections import namedtuple +from hashlib import md5, sha1 +from socket import error as SocketError, _GLOBAL_DEFAULT_TIMEOUT +import time + +try: + from select import poll, POLLIN +except ImportError: # `poll` doesn't exist on OSX and other platforms + poll = False + try: + from select import select + except ImportError: # `select` doesn't exist on AppEngine. + select = False + +try: # Test for SSL features + SSLContext = None + HAS_SNI = False + + import ssl + from ssl import wrap_socket, CERT_NONE, PROTOCOL_SSLv23 + from ssl import SSLContext # Modern SSL? + from ssl import HAS_SNI # Has SNI? +except ImportError: + pass + +from .packages import six +from .exceptions import LocationParseError, SSLError, TimeoutStateError + + +_Default = object() +# The default timeout to use for socket connections. This is the attribute used +# by httplib to define the default timeout + + +def current_time(): + """ + Retrieve the current time, this function is mocked out in unit testing. + """ + return time.time() + + +class Timeout(object): + """ + Utility object for storing timeout values. + + Example usage: + + .. code-block:: python + + timeout = urllib3.util.Timeout(connect=2.0, read=7.0) + pool = HTTPConnectionPool('www.google.com', 80, timeout=timeout) + pool.request(...) 
# Etc, etc + + :param connect: + The maximum amount of time to wait for a connection attempt to a server + to succeed. Omitting the parameter will default the connect timeout to + the system default, probably `the global default timeout in socket.py + `_. + None will set an infinite timeout for connection attempts. + + :type connect: integer, float, or None + + :param read: + The maximum amount of time to wait between consecutive + read operations for a response from the server. Omitting + the parameter will default the read timeout to the system + default, probably `the global default timeout in socket.py + `_. + None will set an infinite timeout. + + :type read: integer, float, or None + + :param total: + The maximum amount of time to wait for an HTTP request to connect and + return. This combines the connect and read timeouts into one. In the + event that both a connect timeout and a total are specified, or a read + timeout and a total are specified, the shorter timeout will be applied. + + Defaults to None. + + + :type total: integer, float, or None + + .. note:: + + Many factors can affect the total amount of time for urllib3 to return + an HTTP response. Specifically, Python's DNS resolver does not obey the + timeout specified on the socket. Other factors that can affect total + request time include high CPU load, high swap, the program running at a + low priority level, or other behaviors. The observed running time for + urllib3 to return a response may be greater than the value passed to + `total`. + + In addition, the read and total timeouts only measure the time between + read operations on the socket connecting the client and the server, not + the total amount of time for the request to return a complete response. + As an example, you may want a request to return within 7 seconds or + fail, so you set the ``total`` timeout to 7 seconds. If the server + sends one byte to you every 5 seconds, the request will **not** trigger + time out. 
This case is admittedly rare. + """ + + #: A sentinel object representing the default timeout value + DEFAULT_TIMEOUT = _GLOBAL_DEFAULT_TIMEOUT + + def __init__(self, connect=_Default, read=_Default, total=None): + self._connect = self._validate_timeout(connect, 'connect') + self._read = self._validate_timeout(read, 'read') + self.total = self._validate_timeout(total, 'total') + self._start_connect = None + + def __str__(self): + return '%s(connect=%r, read=%r, total=%r)' % ( + type(self).__name__, self._connect, self._read, self.total) + + + @classmethod + def _validate_timeout(cls, value, name): + """ Check that a timeout attribute is valid + + :param value: The timeout value to validate + :param name: The name of the timeout attribute to validate. This is used + for clear error messages + :return: the value + :raises ValueError: if the type is not an integer or a float, or if it + is a numeric value less than zero + """ + if value is _Default: + return cls.DEFAULT_TIMEOUT + + if value is None or value is cls.DEFAULT_TIMEOUT: + return value + + try: + float(value) + except (TypeError, ValueError): + raise ValueError("Timeout value %s was %s, but it must be an " + "int or float." % (name, value)) + + try: + if value < 0: + raise ValueError("Attempted to set %s timeout to %s, but the " + "timeout cannot be set to a value less " + "than 0." % (name, value)) + except TypeError: # Python 3 + raise ValueError("Timeout value %s was %s, but it must be an " + "int or float." % (name, value)) + + return value + + @classmethod + def from_float(cls, timeout): + """ Create a new Timeout from a legacy timeout value. + + The timeout value used by httplib.py sets the same timeout on the + connect(), and recv() socket requests. This creates a :class:`Timeout` + object that sets the individual timeouts to the ``timeout`` value passed + to this function. 
+
+        :param timeout: The legacy timeout value
+        :type timeout: integer, float, sentinel default object, or None
+        :return: a Timeout object
+        :rtype: :class:`Timeout`
+        """
+        return Timeout(read=timeout, connect=timeout)
+
+    def clone(self):
+        """ Create a copy of the timeout object.
+
+        Timeout properties are stored per-pool but each request needs a fresh
+        Timeout object to ensure each one has its own start/stop configured.
+
+        :return: a copy of the timeout object
+        :rtype: :class:`Timeout`
+        """
+        # We can't use copy.deepcopy because that will also create a new object
+        # for _GLOBAL_DEFAULT_TIMEOUT, which socket.py uses as a sentinel to
+        # detect the user default.
+        return Timeout(connect=self._connect, read=self._read,
+                       total=self.total)
+
+    def start_connect(self):
+        """ Start the timeout clock, used during a connect() attempt.
+
+        :raises urllib3.exceptions.TimeoutStateError: if you attempt
+            to start a timer that has been started already.
+        """
+        if self._start_connect is not None:
+            raise TimeoutStateError("Timeout timer has already been started.")
+        self._start_connect = current_time()
+        return self._start_connect
+
+    def get_connect_duration(self):
+        """ Gets the time elapsed since the call to :meth:`start_connect`.
+
+        :return: the elapsed time
+        :rtype: float
+        :raises urllib3.exceptions.TimeoutStateError: if you attempt
+            to get duration for a timer that hasn't been started.
+        """
+        if self._start_connect is None:
+            raise TimeoutStateError("Can't get connect duration for timer "
+                                    "that has not started.")
+        return current_time() - self._start_connect
+
+    @property
+    def connect_timeout(self):
+        """ Get the value to use when setting a connection timeout.
+
+        This will be a positive float or integer, the value None
+        (never timeout), or the default system timeout.
+
+        :return: the connect timeout
+        :rtype: int, float, :attr:`Timeout.DEFAULT_TIMEOUT` or None
+        """
+        if self.total is None:
+            return self._connect
+
+        if self._connect is None or self._connect is self.DEFAULT_TIMEOUT:
+            return self.total
+
+        return min(self._connect, self.total)
+
+    @property
+    def read_timeout(self):
+        """ Get the value for the read timeout.
+
+        This assumes some time has elapsed in the connection timeout and
+        computes the read timeout appropriately.
+
+        If self.total is set, the read timeout is dependent on the amount of
+        time taken by the connect timeout. If the connection time has not been
+        established, a :exc:`~urllib3.exceptions.TimeoutStateError` will be
+        raised.
+
+        :return: the value to use for the read timeout
+        :rtype: int, float, :attr:`Timeout.DEFAULT_TIMEOUT` or None
+        :raises urllib3.exceptions.TimeoutStateError: If :meth:`start_connect`
+            has not yet been called on this object.
+        """
+        if (self.total is not None and
+                self.total is not self.DEFAULT_TIMEOUT and
+                self._read is not None and
+                self._read is not self.DEFAULT_TIMEOUT):
+            # In case the connect timeout has not yet been established.
+            if self._start_connect is None:
+                return self._read
+            return max(0, min(self.total - self.get_connect_duration(),
+                              self._read))
+        elif self.total is not None and self.total is not self.DEFAULT_TIMEOUT:
+            return max(0, self.total - self.get_connect_duration())
+        else:
+            return self._read
+
+
+class Url(namedtuple('Url', ['scheme', 'auth', 'host', 'port', 'path',
+                             'query', 'fragment'])):
+    """
+    Data structure for representing an HTTP URL. Used as a return value for
+    :func:`parse_url`.
+    """
+    __slots__ = ()
+
+    def __new__(cls, scheme=None, auth=None, host=None, port=None, path=None,
+                query=None, fragment=None):
+        return super(Url, cls).__new__(cls, scheme, auth, host, port, path,
+                                       query, fragment)
+
+    @property
+    def hostname(self):
+        """For backwards-compatibility with urlparse. We're nice like that."""
+        return self.host
+
+    @property
+    def request_uri(self):
+        """Absolute path including the query string."""
+        uri = self.path or '/'
+
+        if self.query is not None:
+            uri += '?' + self.query
+
+        return uri
+
+    @property
+    def netloc(self):
+        """Network location including host and port."""
+        if self.port:
+            return '%s:%d' % (self.host, self.port)
+        return self.host
+
+
+def split_first(s, delims):
+    """
+    Given a string and an iterable of delimiters, split on the first found
+    delimiter. Return the two split parts and the matched delimiter.
+
+    If not found, then the first part is the full input string.
+
+    Example: ::
+
+        >>> split_first('foo/bar?baz', '?/=')
+        ('foo', 'bar?baz', '/')
+        >>> split_first('foo/bar?baz', '123')
+        ('foo/bar?baz', '', None)
+
+    Scales linearly with the number of delims. Not ideal for a large number
+    of delims.
+    """
+    min_idx = None
+    min_delim = None
+    for d in delims:
+        idx = s.find(d)
+        if idx < 0:
+            continue
+
+        if min_idx is None or idx < min_idx:
+            min_idx = idx
+            min_delim = d
+
+    if min_idx is None or min_idx < 0:
+        return s, '', None
+
+    return s[:min_idx], s[min_idx + 1:], min_delim
+
+
+def parse_url(url):
+    """
+    Given a url, return a parsed :class:`.Url` namedtuple. Best-effort is
+    performed to parse incomplete urls. Fields not provided will be None.
+
+    Partly backwards-compatible with :mod:`urlparse`.
+
+    Example: ::
+
+        >>> parse_url('http://google.com/mail/')
+        Url(scheme='http', host='google.com', port=None, path='/mail/', ...)
+        >>> parse_url('google.com:80')
+        Url(scheme=None, host='google.com', port=80, path=None, ...)
+        >>> parse_url('/foo?bar')
+        Url(scheme=None, host=None, port=None, path='/foo', query='bar', ...)
+    """
+
+    # While this code has overlap with stdlib's urlparse, it is much
+    # simplified for our needs and less annoying.
+    # Additionally, this implementation does silly things to be optimal
+    # on CPython.
+
+    scheme = None
+    auth = None
+    host = None
+    port = None
+    path = None
+    fragment = None
+    query = None
+
+    # Scheme
+    if '://' in url:
+        scheme, url = url.split('://', 1)
+
+    # Find the earliest Authority Terminator
+    # (http://tools.ietf.org/html/rfc3986#section-3.2)
+    url, path_, delim = split_first(url, ['/', '?', '#'])
+
+    if delim:
+        # Reassemble the path
+        path = delim + path_
+
+    # Auth
+    if '@' in url:
+        auth, url = url.split('@', 1)
+
+    # IPv6
+    if url and url[0] == '[':
+        host, url = url.split(']', 1)
+        host += ']'
+
+    # Port
+    if ':' in url:
+        _host, port = url.split(':', 1)
+
+        if not host:
+            host = _host
+
+        if not port.isdigit():
+            raise LocationParseError("Failed to parse: %s" % url)
+
+        port = int(port)
+
+    elif not host and url:
+        host = url
+
+    if not path:
+        return Url(scheme, auth, host, port, path, query, fragment)
+
+    # Fragment
+    if '#' in path:
+        path, fragment = path.split('#', 1)
+
+    # Query
+    if '?' in path:
+        path, query = path.split('?', 1)
+
+    return Url(scheme, auth, host, port, path, query, fragment)
+
+
+def get_host(url):
+    """
+    Deprecated. Use :func:`.parse_url` instead.
+    """
+    p = parse_url(url)
+    return p.scheme or 'http', p.hostname, p.port
+
+
+def make_headers(keep_alive=None, accept_encoding=None, user_agent=None,
+                 basic_auth=None):
+    """
+    Shortcuts for generating request headers.
+
+    :param keep_alive:
+        If ``True``, adds 'connection: keep-alive' header.
+
+    :param accept_encoding:
+        Can be a boolean, list, or string.
+        ``True`` translates to 'gzip,deflate'.
+        List will get joined by comma.
+        String will be used as provided.
+
+    :param user_agent:
+        String representing the user-agent you want, such as
+        "python-urllib3/0.6"
+
+    :param basic_auth:
+        Colon-separated username:password string for 'authorization: basic ...'
+        auth header.
+
+    Example: ::
+
+        >>> make_headers(keep_alive=True, user_agent="Batman/1.0")
+        {'connection': 'keep-alive', 'user-agent': 'Batman/1.0'}
+        >>> make_headers(accept_encoding=True)
+        {'accept-encoding': 'gzip,deflate'}
+    """
+    headers = {}
+    if accept_encoding:
+        if isinstance(accept_encoding, str):
+            pass
+        elif isinstance(accept_encoding, list):
+            accept_encoding = ','.join(accept_encoding)
+        else:
+            accept_encoding = 'gzip,deflate'
+        headers['accept-encoding'] = accept_encoding
+
+    if user_agent:
+        headers['user-agent'] = user_agent
+
+    if keep_alive:
+        headers['connection'] = 'keep-alive'
+
+    if basic_auth:
+        headers['authorization'] = 'Basic ' + \
+            b64encode(six.b(basic_auth)).decode('utf-8')
+
+    return headers
+
+
+def is_connection_dropped(conn):  # Platform-specific
+    """
+    Returns True if the connection is dropped and should be closed.
+
+    :param conn:
+        :class:`httplib.HTTPConnection` object.
+
+    Note: For platforms like AppEngine, this will always return ``False`` to
+    let the platform handle connection recycling transparently for us.
+    """
+    sock = getattr(conn, 'sock', False)
+    if not sock:  # Platform-specific: AppEngine
+        return False
+
+    if not poll:
+        if not select:  # Platform-specific: AppEngine
+            return False
+
+        try:
+            return select([sock], [], [], 0.0)[0]
+        except SocketError:
+            return True
+
+    # This version is better on platforms that support it.
+    p = poll()
+    p.register(sock, POLLIN)
+    for (fno, ev) in p.poll(0.0):
+        if fno == sock.fileno():
+            # Either data is buffered (bad), or the connection is dropped.
+            return True
+
+
+def resolve_cert_reqs(candidate):
+    """
+    Resolves the argument to a numeric constant, which can be passed to
+    the wrap_socket function/method from the ssl module.
+    Defaults to :data:`ssl.CERT_NONE`.
+    If given a string it is assumed to be the name of the constant in the
+    :mod:`ssl` module or its abbreviation.
+    (So you can specify `REQUIRED` instead of `CERT_REQUIRED`.)
+    If it's neither `None` nor a string, we assume it is already the numeric
+    constant which can directly be passed to wrap_socket.
+    """
+    if candidate is None:
+        return CERT_NONE
+
+    if isinstance(candidate, str):
+        res = getattr(ssl, candidate, None)
+        if res is None:
+            res = getattr(ssl, 'CERT_' + candidate)
+        return res
+
+    return candidate
+
+
+def resolve_ssl_version(candidate):
+    """
+    Like resolve_cert_reqs.
+    """
+    if candidate is None:
+        return PROTOCOL_SSLv23
+
+    if isinstance(candidate, str):
+        res = getattr(ssl, candidate, None)
+        if res is None:
+            res = getattr(ssl, 'PROTOCOL_' + candidate)
+        return res
+
+    return candidate
+
+
+def assert_fingerprint(cert, fingerprint):
+    """
+    Checks if the given fingerprint matches the supplied certificate.
+
+    :param cert:
+        Certificate as bytes object.
+    :param fingerprint:
+        Fingerprint as string of hexdigits, can be interspersed by colons.
+    """
+
+    # Maps the length of a digest to a possible hash function producing
+    # this digest.
+    hashfunc_map = {
+        16: md5,
+        20: sha1
+    }
+
+    fingerprint = fingerprint.replace(':', '').lower()
+
+    digest_length, rest = divmod(len(fingerprint), 2)
+
+    if rest or digest_length not in hashfunc_map:
+        raise SSLError('Fingerprint is of invalid length.')
+
+    # We need encode() here for py32; works on py2 and py33.
+    fingerprint_bytes = unhexlify(fingerprint.encode())
+
+    hashfunc = hashfunc_map[digest_length]
+
+    cert_digest = hashfunc(cert).digest()
+
+    if not cert_digest == fingerprint_bytes:
+        raise SSLError('Fingerprints did not match. Expected "{0}", got "{1}".'
+                       .format(hexlify(fingerprint_bytes),
+                               hexlify(cert_digest)))
+
+
+def is_fp_closed(obj):
+    """
+    Checks whether a given file-like object is closed.
+
+    :param obj:
+        The file-like object to check.
+    """
+    if hasattr(obj, 'fp'):
+        # Object is a container for another file-like object that gets
+        # released on exhaustion (e.g. HTTPResponse).
+        return obj.fp is None
+
+    return obj.closed
+
+
+if SSLContext is not None:  # Python 3.2+
+    def ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None,
+                        ca_certs=None, server_hostname=None,
+                        ssl_version=None):
+        """
+        All arguments except `server_hostname` have the same meaning as for
+        :func:`ssl.wrap_socket`.
+
+        :param server_hostname:
+            Hostname of the expected certificate
+        """
+        context = SSLContext(ssl_version)
+        context.verify_mode = cert_reqs
+        if ca_certs:
+            try:
+                context.load_verify_locations(ca_certs)
+            # Py32 raises IOError
+            # Py33 raises FileNotFoundError
+            except Exception as e:  # Reraise as SSLError
+                raise SSLError(e)
+        if certfile:
+            # FIXME: This block needs a test.
+            context.load_cert_chain(certfile, keyfile)
+        if HAS_SNI:  # Platform-specific: OpenSSL with enabled SNI
+            return context.wrap_socket(sock, server_hostname=server_hostname)
+        return context.wrap_socket(sock)
+
+else:  # Python 3.1 and earlier
+    def ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None,
+                        ca_certs=None, server_hostname=None,
+                        ssl_version=None):
+        return wrap_socket(sock, keyfile=keyfile, certfile=certfile,
+                           ca_certs=ca_certs, cert_reqs=cert_reqs,
+                           ssl_version=ssl_version)
diff -Nru python-urllib3-1.8.3/urllib3.egg-info/PKG-INFO python-urllib3-1.7.1/urllib3.egg-info/PKG-INFO
--- python-urllib3-1.8.3/urllib3.egg-info/PKG-INFO	2014-06-24 18:22:52.000000000 +0000
+++ python-urllib3-1.7.1/urllib3.egg-info/PKG-INFO	2013-09-25 16:06:36.000000000 +0000
@@ -1,6 +1,6 @@
-Metadata-Version: 1.1
+Metadata-Version: 1.0
 Name: urllib3
-Version: 1.8.3
+Version: 1.7.1
 Summary: HTTP library with thread-safe connection pooling, file post, and more.
 Home-page: http://urllib3.readthedocs.org/
 Author: Andrey Petrov
@@ -28,14 +28,7 @@
         - Tested on Python 2.6+ and Python 3.2+, 100% unit test coverage.
         - Small and easy to understand codebase perfect for extending and building upon.
          For a more comprehensive solution, have a look at
-          `Requests `_ which is also powered by ``urllib3``.
-
-        You might already be using urllib3!
-        ===================================
-
-        ``urllib3`` powers `many great Python libraries `_,
-        including ``pip`` and ``requests``.
-
+          `Requests `_ which is also powered by urllib3.
 
         What's wrong with urllib and urllib2?
         =====================================
@@ -106,7 +99,6 @@
            py27: commands succeeded
            py32: commands succeeded
            py33: commands succeeded
-           py34: commands succeeded
 
         Note that code coverage less than 100% is regarded as a failing run.
 
@@ -129,101 +121,10 @@
         Changes
         =======
 
-        1.8.3 (2014-06-23)
-        ++++++++++++++++++
-
-        * Fix TLS verification when using a proxy in Python 3.4.1. (Issue #385)
-
-        * Add ``disable_cache`` option to ``urllib3.util.make_headers``. (Issue #393)
-
-        * Wrap ``socket.timeout`` exception with
-          ``urllib3.exceptions.ReadTimeoutError``. (Issue #399)
-
-        * Fixed proxy-related bug where connections were being reused incorrectly.
-          (Issues #366, #369)
-
-        * Added ``socket_options`` keyword parameter which allows to define
-          ``setsockopt`` configuration of new sockets. (Issue #397)
-
-        * Removed ``HTTPConnection.tcp_nodelay`` in favor of
-          ``HTTPConnection.default_socket_options``. (Issue #397)
-
-        * Fixed ``TypeError`` bug in Python 2.6.4. (Issue #411)
-
-
-        1.8.2 (2014-04-17)
-        ++++++++++++++++++
-
-        * Fix ``urllib3.util`` not being included in the package.
-
-
-        1.8.1 (2014-04-17)
-        ++++++++++++++++++
-
-        * Fix AppEngine bug of HTTPS requests going out as HTTP. (Issue #356)
-
-        * Don't install ``dummyserver`` into ``site-packages`` as it's only needed
-          for the test suite. (Issue #362)
-
-        * Added support for specifying ``source_address``. (Issue #352)
-
-
-        1.8 (2014-03-04)
-        ++++++++++++++++
-
-        * Improved url parsing in ``urllib3.util.parse_url`` (properly parse '@' in
-          username, and blank ports like 'hostname:').
-
-        * New ``urllib3.connection`` module which contains all the HTTPConnection
-          objects.
-
-        * Several ``urllib3.util.Timeout``-related fixes. Also changed constructor
-          signature to a more sensible order. [Backwards incompatible]
-          (Issues #252, #262, #263)
-
-        * Use ``backports.ssl_match_hostname`` if it's installed. (Issue #274)
-
-        * Added ``.tell()`` method to ``urllib3.response.HTTPResponse`` which
-          returns the number of bytes read so far. (Issue #277)
-
-        * Support for platforms without threading. (Issue #289)
-
-        * Expand default-port comparison in ``HTTPConnectionPool.is_same_host``
-          to allow a pool with no specified port to be considered equal to an
-          HTTP/HTTPS url with port 80/443 explicitly provided. (Issue #305)
-
-        * Improved default SSL/TLS settings to avoid vulnerabilities.
-          (Issue #309)
-
-        * Fixed ``urllib3.poolmanager.ProxyManager`` not retrying on connect errors.
-          (Issue #310)
-
-        * Disable Nagle's Algorithm on the socket for non-proxies. A subset of requests
-          will send the entire HTTP request ~200 milliseconds faster; however, some of
-          the resulting TCP packets will be smaller. (Issue #254)
-
-        * Increased maximum number of SubjectAltNames in ``urllib3.contrib.pyopenssl``
-          from the default 64 to 1024 in a single certificate. (Issue #318)
-
-        * Headers are now passed and stored as a custom
-          ``urllib3.collections_.HTTPHeaderDict`` object rather than a plain ``dict``.
-          (Issue #329, #333)
-
-        * Headers no longer lose their case on Python 3. (Issue #236)
-
-        * ``urllib3.contrib.pyopenssl`` now uses the operating system's default CA
-          certificates on inject. (Issue #332)
-
-        * Requests with ``retries=False`` will immediately raise any exceptions without
-          wrapping them in ``MaxRetryError``. (Issue #348)
-
-        * Fixed open socket leak with SSL-related failures. (Issue #344, #348)
-
-
         1.7.1 (2013-09-25)
         ++++++++++++++++++
 
-        * Added granular timeout support with new ``urllib3.util.Timeout`` class.
+        * Added granular timeout support with new `urllib3.util.Timeout` class.
           (Issue #231)
 
         * Fixed Python 3.4 support. (Issue #238)
diff -Nru python-urllib3-1.8.3/urllib3.egg-info/SOURCES.txt python-urllib3-1.7.1/urllib3.egg-info/SOURCES.txt
--- python-urllib3-1.8.3/urllib3.egg-info/SOURCES.txt	2014-06-24 18:22:53.000000000 +0000
+++ python-urllib3-1.7.1/urllib3.egg-info/SOURCES.txt	2013-09-25 16:06:36.000000000 +0000
@@ -11,18 +11,9 @@
 dummyserver/proxy.py
 dummyserver/server.py
 dummyserver/testcase.py
-dummyserver/certs/cacert.key
-dummyserver/certs/cacert.pem
-dummyserver/certs/client.csr
-dummyserver/certs/client.key
-dummyserver/certs/client.pem
-dummyserver/certs/client_bad.pem
-dummyserver/certs/server.crt
-dummyserver/certs/server.csr
-dummyserver/certs/server.key
-dummyserver/certs/server.key.org
+test/__init__.py
+test/benchmark.py
 test/test_collections.py
-test/test_compatibility.py
 test/test_connectionpool.py
 test/test_exceptions.py
 test/test_fields.py
@@ -33,7 +24,6 @@
 test/test_util.py
 urllib3/__init__.py
 urllib3/_collections.py
-urllib3/connection.py
 urllib3/connectionpool.py
 urllib3/exceptions.py
 urllib3/fields.py
@@ -41,6 +31,7 @@
 urllib3/poolmanager.py
 urllib3/request.py
 urllib3/response.py
+urllib3/util.py
 urllib3.egg-info/PKG-INFO
 urllib3.egg-info/SOURCES.txt
 urllib3.egg-info/dependency_links.txt
@@ -51,12 +42,4 @@
 urllib3/packages/__init__.py
 urllib3/packages/ordered_dict.py
 urllib3/packages/six.py
-urllib3/packages/ssl_match_hostname/__init__.py
-urllib3/packages/ssl_match_hostname/_implementation.py
-urllib3/util/__init__.py
-urllib3/util/connection.py
-urllib3/util/request.py
-urllib3/util/response.py
-urllib3/util/ssl_.py
-urllib3/util/timeout.py
-urllib3/util/url.py
\ No newline at end of file
+urllib3/packages/ssl_match_hostname/__init__.py
\ No newline at end of file
diff -Nru python-urllib3-1.8.3/urllib3.egg-info/top_level.txt python-urllib3-1.7.1/urllib3.egg-info/top_level.txt
--- python-urllib3-1.8.3/urllib3.egg-info/top_level.txt	2014-06-24 18:22:52.000000000 +0000
+++ python-urllib3-1.7.1/urllib3.egg-info/top_level.txt	2013-09-25 16:06:36.000000000 +0000
@@ -1 +1,2 @@
 urllib3
+dummyserver
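
The `read_timeout` property in the patched `urllib3/util.py` above budgets the remaining `total` time against the per-read timeout: once some seconds have been spent connecting, the tighter of the two limits wins, floored at zero. A minimal sketch of just that arithmetic (the helper name here is invented for illustration; the real property additionally handles the `DEFAULT_TIMEOUT` sentinel and the case where the connect timer was never started):

```python
# Sketch of the budget arithmetic in urllib3.util.Timeout.read_timeout.
# Assumes plain float/None values only; not urllib3's actual API.

def remaining_read_timeout(total, read, connect_duration):
    """Seconds left for reads after `connect_duration` seconds of connecting."""
    if total is None:
        # No overall budget: the per-read timeout applies unchanged.
        return read
    if read is None:
        # Only the overall budget constrains reads.
        return max(0, total - connect_duration)
    # Both set: whichever remaining budget is tighter wins, floored at zero.
    return max(0, min(total - connect_duration, read))

print(remaining_read_timeout(7.0, 10.0, 3.0))  # 4.0: total budget is tighter
print(remaining_read_timeout(7.0, 2.0, 3.0))   # 2.0: per-read timeout is tighter
print(remaining_read_timeout(None, 5.0, 3.0))  # 5.0: no total, read unchanged
```

The `max(0, ...)` floor matters: if connecting already consumed the whole `total` budget, the socket read timeout becomes zero rather than a negative value, which `socket.settimeout` would reject.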