diff -Nru python-urllib3-1.13.1/CHANGES.rst python-urllib3-1.15.1/CHANGES.rst --- python-urllib3-1.13.1/CHANGES.rst 2015-12-18 22:47:24.000000000 +0000 +++ python-urllib3-1.15.1/CHANGES.rst 2016-04-11 17:22:45.000000000 +0000 @@ -1,14 +1,53 @@ Changes ======= +1.15.1 (2016-04-11) +------------------- + +* Fix packaging to include backports module. (Issue #841) + + +1.15 (2016-04-06) +----------------- + +* Added Retry(raise_on_status=False). (Issue #720) + +* Always use setuptools, no more distutils fallback. (Issue #785) + +* Dropped support for Python 3.2. (Issue #786) + +* Chunked transfer encoding when requesting with ``chunked=True``. + (Issue #790) + +* Fixed regression with IPv6 port parsing. (Issue #801) + +* Append SNIMissingWarning messages to allow users to specify it in + the PYTHONWARNINGS environment variable. (Issue #816) + +* Handle unicode headers in Py2. (Issue #818) + +* Log certificate when there is a hostname mismatch. (Issue #820) + +* Preserve order of request/response headers. (Issue #821) + + +1.14 (2015-12-29) +----------------- + +* contrib: SOCKS proxy support! (Issue #762) + +* Fixed AppEngine handling of transfer-encoding header and bug + in Timeout defaults checking. (Issue #763) + + 1.13.1 (2015-12-18) -+++++++++++++++++++ +------------------- * Fixed regression in IPv6 + SSL for match_hostname. (Issue #761) 1.13 (2015-12-14) -+++++++++++++++++ +----------------- * Fixed ``pip install urllib3[secure]`` on modern pip. (Issue #706) @@ -25,7 +64,7 @@ 1.12 (2015-09-03) -+++++++++++++++++ +----------------- * Rely on ``six`` for importing ``httplib`` to work around conflicts with other Python 3 shims. (Issue #688) @@ -38,7 +77,7 @@ 1.11 (2015-07-21) -+++++++++++++++++ +----------------- * When ``ca_certs`` is given, ``cert_reqs`` defaults to ``'CERT_REQUIRED'``. (Issue #650) @@ -83,7 +122,7 @@ (Issue #674) 1.10.4 (2015-05-03) -+++++++++++++++++++ +------------------- * Migrate tests to Tornado 4. (Issue #594) @@ -99,7 +138,7 @@ 1.10.3 (2015-04-21) -+++++++++++++++++++ +------------------- * Emit ``InsecurePlatformWarning`` when SSLContext object is missing. (Issue #558) @@ -120,7 +159,7 @@ 1.10.2 (2015-02-25) -+++++++++++++++++++ +------------------- * Fix file descriptor leakage on retries. (Issue #548) @@ -132,7 +171,7 @@ 1.10.1 (2015-02-10) -+++++++++++++++++++ +------------------- * Pools can be used as context managers. (Issue #545) @@ -146,7 +185,7 @@ 1.10 (2014-12-14) -+++++++++++++++++ +----------------- * Disabled SSLv3. (Issue #473) @@ -178,7 +217,7 @@ 1.9.1 (2014-09-13) -++++++++++++++++++ +------------------ * Apply socket arguments before binding. (Issue #427) @@ -199,7 +238,7 @@ 1.9 (2014-07-04) -++++++++++++++++ +---------------- * Shuffled around development-related files. If you're maintaining a distro package of urllib3, you may need to tweak things. (Issue #415) @@ -236,7 +275,7 @@ 1.8.3 (2014-06-23) -++++++++++++++++++ +------------------ * Fix TLS verification when using a proxy in Python 3.4.1. (Issue #385) @@ -258,13 +297,13 @@ 1.8.2 (2014-04-17) -++++++++++++++++++ +------------------ * Fix ``urllib3.util`` not being included in the package. 1.8.1 (2014-04-17) -++++++++++++++++++ +------------------ * Fix AppEngine bug of HTTPS requests going out as HTTP. (Issue #356) @@ -275,7 +314,7 @@ 1.8 (2014-03-04) -++++++++++++++++ +---------------- * Improved url parsing in ``urllib3.util.parse_url`` (properly parse '@' in username, and blank ports like 'hostname:'). 
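The 1.15 entries above introduce two user-visible knobs, ``Retry(raise_on_status=False)`` and ``chunked=True``. A minimal sketch of how they are typically used (the host and paths below are placeholders, not part of the changelog):

.. code-block:: python

    from urllib3 import HTTPConnectionPool
    from urllib3.util.retry import Retry

    pool = HTTPConnectionPool('httpbin.org')

    # raise_on_status=False: once status-based retries are exhausted,
    # the last response is returned instead of raising MaxRetryError.
    retries = Retry(total=3, status_forcelist=[502, 503], raise_on_status=False)
    r = pool.request('GET', '/status/503', retries=retries)

    # chunked=True: send the request body with chunked transfer encoding.
    def body():
        yield b'first chunk'
        yield b'second chunk'

    r = pool.urlopen('POST', '/post', body=body(), chunked=True)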
@@ -327,7 +366,7 @@ 1.7.1 (2013-09-25) -++++++++++++++++++ +------------------ * Added granular timeout support with new ``urllib3.util.Timeout`` class. (Issue #231) @@ -336,7 +375,7 @@ 1.7 (2013-08-14) -++++++++++++++++ +---------------- * More exceptions are now pickle-able, with tests. (Issue #174) @@ -375,7 +414,7 @@ 1.6 (2013-04-25) -++++++++++++++++ +---------------- * Contrib: Optional SNI support for Py2 using PyOpenSSL. (Issue #156) @@ -435,7 +474,7 @@ 1.5 (2012-08-02) -++++++++++++++++ +---------------- * Added ``urllib3.add_stderr_logger()`` for quickly enabling STDERR debug logging in urllib3. @@ -460,7 +499,7 @@ 1.4 (2012-06-16) -++++++++++++++++ +---------------- * Minor AppEngine-related fixes. @@ -472,7 +511,7 @@ 1.3 (2012-03-25) -++++++++++++++++ +---------------- * Removed pre-1.0 deprecated API. @@ -491,13 +530,13 @@ 1.2.2 (2012-02-06) -++++++++++++++++++ +------------------ * Fixed packaging bug of not shipping ``test-requirements.txt``. (Issue #47) 1.2.1 (2012-02-05) -++++++++++++++++++ +------------------ * Fixed another bug related to when ``ssl`` module is not available. (Issue #41) @@ -506,7 +545,7 @@ 1.2 (2012-01-29) -++++++++++++++++ +---------------- * Added Python 3 support (tested on 3.2.2) @@ -532,7 +571,7 @@ 1.1 (2012-01-07) -++++++++++++++++ +---------------- * Refactored ``dummyserver`` to its own root namespace module (used for testing). @@ -549,7 +588,7 @@ 1.0.2 (2011-11-04) -++++++++++++++++++ +------------------ * Fixed typo in ``VerifiedHTTPSConnection`` which would only present as a bug if you're using the object manually. (Thanks pyos) @@ -562,14 +601,14 @@ 1.0.1 (2011-10-10) -++++++++++++++++++ +------------------ * Fixed a bug where the same connection would get returned into the pool twice, causing extraneous "HttpConnectionPool is full" log warnings. 1.0 (2011-10-08) -++++++++++++++++ +---------------- * Added ``PoolManager`` with LRU expiration of connections (tested and documented). @@ -592,13 +631,13 @@ 0.4.1 (2011-07-17) -++++++++++++++++++ +------------------ * Minor bug fixes, code cleanup. 0.4 (2011-03-01) -++++++++++++++++ +---------------- * Better unicode support. * Added ``VerifiedHTTPSConnection``. @@ -607,13 +646,13 @@ 0.3.1 (2010-07-13) -++++++++++++++++++ +------------------ * Added ``assert_host_name`` optional parameter. Now compatible with proxies. 0.3 (2009-12-10) -++++++++++++++++ +---------------- * Added HTTPS support. * Minor bug fixes. @@ -622,13 +661,13 @@ 0.2 (2008-11-17) -++++++++++++++++ +---------------- * Added unit tests. * Bug fixes. 0.1 (2008-11-16) -++++++++++++++++ +---------------- * First release. diff -Nru python-urllib3-1.13.1/CONTRIBUTORS.txt python-urllib3-1.15.1/CONTRIBUTORS.txt --- python-urllib3-1.13.1/CONTRIBUTORS.txt 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/CONTRIBUTORS.txt 2016-04-06 19:16:56.000000000 +0000 @@ -184,5 +184,18 @@ * Andy Caldwell * Bugfix related to reusing connections in indeterminate states. +* Ville Skyttรค + * Logging efficiency improvements, spelling fixes, Travis config. + +* Shige Takeda + * Started Recipes documentation and added a recipe about handling concatenated gzip data in HTTP response + +* Jesse Shapiro + * Working on encoding unicode header parameter names + * Making setup.py resilient to ASCII locales + +* David Foster + * Ensure order of request and response headers are preserved. 
+ * [Your name or handle] <[email or website]> * [Brief summary of your changes] diff -Nru python-urllib3-1.13.1/debian/changelog python-urllib3-1.15.1/debian/changelog --- python-urllib3-1.13.1/debian/changelog 2016-05-10 08:42:02.000000000 +0000 +++ python-urllib3-1.15.1/debian/changelog 2016-05-26 21:42:16.000000000 +0000 @@ -1,10 +1,25 @@ -python-urllib3 (1.13.1-2ubuntu1) yakkety; urgency=medium +python-urllib3 (1.15.1-2) unstable; urgency=medium - * d/p/06_revert_square_brackets_httplib.patch: Cherry pick revert of - change to behaviour with httplib which breaks IPv6 square bracket - handling (LP: #1578351). + * debian/patches/01_do-not-use-embedded-python-six.patch + - Patch urllib3.contrib.appengine and dummyserver tests. (Closes: #825310) - -- James Page Tue, 10 May 2016 09:41:48 +0100 + -- Daniele Tricoli Thu, 26 May 2016 05:11:02 +0200 + +python-urllib3 (1.15.1-1) unstable; urgency=medium + + * New upstream release. + * debian/control + - Bump Standards-Version to 3.9.8 (no changes needed). + - Add python{,3}-socks to Suggests. + * debian/copyright + - Update copyright years. + * debian/patches/01_do-not-use-embedded-python-six.patch + - Refresh. + * debian/rules + - Don't run contrib socks tests at build time. + - Exclude GAE tests. (Closes: #825168) + + -- Daniele Tricoli Tue, 24 May 2016 16:18:22 +0200 python-urllib3 (1.13.1-2) unstable; urgency=medium diff -Nru python-urllib3-1.13.1/debian/control python-urllib3-1.15.1/debian/control --- python-urllib3-1.13.1/debian/control 2016-05-10 08:42:05.000000000 +0000 +++ python-urllib3-1.15.1/debian/control 2016-05-24 21:00:40.000000000 +0000 @@ -1,6 +1,5 @@ Source: python-urllib3 -Maintainer: Ubuntu Developers -XSBC-Original-Maintainer: Debian Python Modules Team +Maintainer: Debian Python Modules Team Uploaders: Daniele Tricoli Section: python Priority: optional @@ -20,8 +19,8 @@ python3-nose (>=1.3.3), python3-setuptools, python3-six, - python3-tornado, -Standards-Version: 3.9.7 + python3-tornado +Standards-Version: 3.9.8 X-Python-Version: >= 2.6 X-Python3-Version: >= 3.2 Homepage: http://urllib3.readthedocs.org @@ -40,7 +39,8 @@ python-openssl, python-pyasn1 Suggests: - python-ntlm + python-ntlm, + python-socks Description: HTTP library with thread-safe connection pooling for Python urllib3 supports features left out of urllib and urllib2 libraries. . @@ -65,7 +65,8 @@ Suggests: python3-ndg-httpsclient, python3-openssl, - python3-pyasn1 + python3-pyasn1, + python3-socks Description: HTTP library with thread-safe connection pooling for Python3 urllib3 supports features left out of urllib and urllib2 libraries. . 
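The new Suggests entries above make PySocks (``python-socks``/``python3-socks``) an optional runtime dependency rather than a hard one: ``urllib3.contrib.socks`` simply fails to import when it is missing. A small sketch of how an application might guard for that (purely illustrative, not part of the packaging):

.. code-block:: python

    # SOCKS support is optional; urllib3.contrib.socks raises ImportError
    # when the PySocks module is not installed.
    try:
        from urllib3.contrib.socks import SOCKSProxyManager
    except ImportError:
        SOCKSProxyManager = None  # fall back to plain HTTP proxying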
diff -Nru python-urllib3-1.13.1/debian/copyright python-urllib3-1.15.1/debian/copyright --- python-urllib3-1.13.1/debian/copyright 2016-02-12 09:04:04.000000000 +0000 +++ python-urllib3-1.15.1/debian/copyright 2016-05-18 22:06:02.000000000 +0000 @@ -4,7 +4,7 @@ Source: http://pypi.python.org/pypi/urllib3 Files: * -Copyright: 2008-2013, Andrey Petrov +Copyright: 2008-2016, Andrey Petrov License: Expat Files: urllib3/packages/six.py diff -Nru python-urllib3-1.13.1/debian/.git-dpm python-urllib3-1.15.1/debian/.git-dpm --- python-urllib3-1.13.1/debian/.git-dpm 2015-12-25 15:31:02.000000000 +0000 +++ python-urllib3-1.15.1/debian/.git-dpm 2016-05-26 21:39:25.000000000 +0000 @@ -1,11 +1,11 @@ # see git-dpm(1) from git-dpm package -468650ebc00a3c8458882bc83a1616aece7e2705 -468650ebc00a3c8458882bc83a1616aece7e2705 -49444bdd74c2a37b97d86fc9cb5b85a28ffd659d -49444bdd74c2a37b97d86fc9cb5b85a28ffd659d -python-urllib3_1.13.1.orig.tar.gz -1309e9536c74cdad6d5ab089c83235a687b6d7e6 -156259 +21501b42c55182861ad7afe82d5267f442722e03 +21501b42c55182861ad7afe82d5267f442722e03 +9105037e3d018d407e6b6faad26ae5a8a17a732a +9105037e3d018d407e6b6faad26ae5a8a17a732a +python-urllib3_1.15.1.orig.tar.gz +a66d3db6d53d96c851316087406185d35555dcc4 +169242 debianTag="debian/%e%v" patchedTag="patched/%e%v" upstreamTag="upstream/%e%u" diff -Nru python-urllib3-1.13.1/debian/patches/01_do-not-use-embedded-python-six.patch python-urllib3-1.15.1/debian/patches/01_do-not-use-embedded-python-six.patch --- python-urllib3-1.13.1/debian/patches/01_do-not-use-embedded-python-six.patch 2015-12-25 15:31:02.000000000 +0000 +++ python-urllib3-1.15.1/debian/patches/01_do-not-use-embedded-python-six.patch 2016-05-26 21:39:25.000000000 +0000 @@ -1,4 +1,4 @@ -From 00470dbca1765992a48cd2b324e7a3c4a6d40a5c Mon Sep 17 00:00:00 2001 +From 8e3232792cac904df40bed08f36b5a2c66468612 Mon Sep 17 00:00:00 2001 From: Daniele Tricoli Date: Thu, 8 Oct 2015 13:19:46 -0700 Subject: Do not use embedded copy of python-six. 
@@ -7,29 +7,33 @@ Patch-Name: 01_do-not-use-embedded-python-six.patch --- - dummyserver/handlers.py | 2 +- - test/__init__.py | 2 +- - test/contrib/test_pyopenssl.py | 2 +- - test/test_collections.py | 2 +- - test/test_fields.py | 2 +- - test/test_filepost.py | 2 +- - test/test_retry.py | 2 +- - urllib3/_collections.py | 2 +- - urllib3/connection.py | 2 +- - urllib3/connectionpool.py | 2 +- - urllib3/fields.py | 2 +- - urllib3/filepost.py | 4 ++-- - urllib3/response.py | 4 ++-- - urllib3/util/request.py | 2 +- - urllib3/util/response.py | 2 +- - urllib3/util/retry.py | 2 +- - 16 files changed, 18 insertions(+), 18 deletions(-) + dummyserver/handlers.py | 2 +- + test/__init__.py | 2 +- + test/contrib/test_pyopenssl.py | 2 +- + test/test_collections.py | 2 +- + test/test_fields.py | 2 +- + test/test_filepost.py | 2 +- + test/test_retry.py | 2 +- + test/with_dummyserver/test_chunked_transfer.py | 2 +- + test/with_dummyserver/test_connectionpool.py | 2 +- + test/with_dummyserver/test_https.py | 2 +- + urllib3/_collections.py | 2 +- + urllib3/connection.py | 2 +- + urllib3/connectionpool.py | 2 +- + urllib3/contrib/appengine.py | 2 +- + urllib3/fields.py | 2 +- + urllib3/filepost.py | 4 ++-- + urllib3/response.py | 4 ++-- + urllib3/util/request.py | 2 +- + urllib3/util/response.py | 2 +- + urllib3/util/retry.py | 2 +- + 20 files changed, 22 insertions(+), 22 deletions(-) diff --git a/dummyserver/handlers.py b/dummyserver/handlers.py -index fb6f44f..c5ac9b4 100644 +index 7598329..504b998 100644 --- a/dummyserver/handlers.py +++ b/dummyserver/handlers.py -@@ -264,7 +264,7 @@ def _parse_header(line): +@@ -269,7 +269,7 @@ def _parse_header(line): """ import tornado.httputil import email.utils @@ -39,7 +43,7 @@ line = line.encode('utf-8') parts = tornado.httputil._parseparam(';' + line) diff --git a/test/__init__.py b/test/__init__.py -index f7c4a7a..22d3616 100644 +index bab39ed..076cdf0 100644 --- a/test/__init__.py +++ b/test/__init__.py @@ -8,7 +8,7 @@ import socket @@ -52,7 +56,7 @@ # We need a host that will not immediately close the connection with a TCP # Reset. SO suggests this hostname diff --git a/test/contrib/test_pyopenssl.py b/test/contrib/test_pyopenssl.py -index 5d57527..f23ff19 100644 +index ab304f8..b9e6572 100644 --- a/test/contrib/test_pyopenssl.py +++ b/test/contrib/test_pyopenssl.py @@ -1,5 +1,5 @@ @@ -60,8 +64,8 @@ -from urllib3.packages import six +import six - if six.PY3: - raise SkipTest('Testing of PyOpenSSL disabled on PY3') + try: + from urllib3.contrib.pyopenssl import (inject_into_urllib3, diff --git a/test/test_collections.py b/test/test_collections.py index 9d72939..78ef634 100644 --- a/test/test_collections.py @@ -76,18 +80,18 @@ from nose.plugins.skip import SkipTest diff --git a/test/test_fields.py b/test/test_fields.py -index cdec68b..66da148 100644 +index 21b4481..b547aa8 100644 --- a/test/test_fields.py +++ b/test/test_fields.py @@ -1,7 +1,7 @@ import unittest from urllib3.fields import guess_content_type, RequestField --from urllib3.packages.six import u -+from six import u +-from urllib3.packages.six import u, PY3 ++from six import u, PY3 + from . 
import onlyPy2 - class TestRequestField(unittest.TestCase): diff --git a/test/test_filepost.py b/test/test_filepost.py index 390dbb3..ecc6710 100644 --- a/test/test_filepost.py @@ -114,8 +118,47 @@ from urllib3.util.retry import Retry from urllib3.exceptions import ( ConnectTimeoutError, +diff --git a/test/with_dummyserver/test_chunked_transfer.py b/test/with_dummyserver/test_chunked_transfer.py +index 1d58e23..abfc777 100644 +--- a/test/with_dummyserver/test_chunked_transfer.py ++++ b/test/with_dummyserver/test_chunked_transfer.py +@@ -1,7 +1,7 @@ + # -*- coding: utf-8 -*- + + from urllib3 import HTTPConnectionPool +-from urllib3.packages import six ++import six + from dummyserver.testcase import SocketDummyServerTestCase + + +diff --git a/test/with_dummyserver/test_connectionpool.py b/test/with_dummyserver/test_connectionpool.py +index 0f31fa0..b3b72ca 100644 +--- a/test/with_dummyserver/test_connectionpool.py ++++ b/test/with_dummyserver/test_connectionpool.py +@@ -31,7 +31,7 @@ from urllib3.exceptions import ( + ProtocolError, + NewConnectionError, + ) +-from urllib3.packages.six import b, u ++from six import b, u + from urllib3.util.retry import Retry + from urllib3.util.timeout import Timeout + +diff --git a/test/with_dummyserver/test_https.py b/test/with_dummyserver/test_https.py +index 2c5f035..be54e3c 100644 +--- a/test/with_dummyserver/test_https.py ++++ b/test/with_dummyserver/test_https.py +@@ -36,7 +36,7 @@ from urllib3.exceptions import ( + SystemTimeWarning, + InsecurePlatformWarning, + ) +-from urllib3.packages import six ++import six + from urllib3.util.timeout import Timeout + import urllib3.util as util + diff --git a/urllib3/_collections.py b/urllib3/_collections.py -index 67f3ce9..b69ce20 100644 +index 77cee01..114116c 100644 --- a/urllib3/_collections.py +++ b/urllib3/_collections.py @@ -15,7 +15,7 @@ try: # Python 2.7+ @@ -128,10 +171,10 @@ __all__ = ['RecentlyUsedContainer', 'HTTPHeaderDict'] diff --git a/urllib3/connection.py b/urllib3/connection.py -index 1e4cd41..0075541 100644 +index 5ce0080..f4b86fa 100644 --- a/urllib3/connection.py +++ b/urllib3/connection.py -@@ -5,7 +5,7 @@ import sys +@@ -6,7 +6,7 @@ import sys import socket from socket import error as SocketError, timeout as SocketTimeout import warnings @@ -141,7 +184,7 @@ try: # Python 3 from http.client import HTTPConnection as _HTTPConnection diff --git a/urllib3/connectionpool.py b/urllib3/connectionpool.py -index 995b416..2204b30 100644 +index 3fcfb12..989f8ca 100644 --- a/urllib3/connectionpool.py +++ b/urllib3/connectionpool.py @@ -31,7 +31,7 @@ from .exceptions import ( @@ -153,8 +196,21 @@ from .connection import ( port_by_scheme, DummyConnection, +diff --git a/urllib3/contrib/appengine.py b/urllib3/contrib/appengine.py +index f4289c0..5355244 100644 +--- a/urllib3/contrib/appengine.py ++++ b/urllib3/contrib/appengine.py +@@ -12,7 +12,7 @@ from ..exceptions import ( + SSLError + ) + +-from ..packages.six import BytesIO ++from six import BytesIO + from ..request import RequestMethods + from ..response import HTTPResponse + from ..util.timeout import Timeout diff --git a/urllib3/fields.py b/urllib3/fields.py -index c7d4811..2152829 100644 +index 8fa2a12..14e5604 100644 --- a/urllib3/fields.py +++ b/urllib3/fields.py @@ -2,7 +2,7 @@ from __future__ import absolute_import @@ -182,7 +238,7 @@ writer = codecs.lookup('utf-8')[3] diff --git a/urllib3/response.py b/urllib3/response.py -index 8f2a1b5..e034068 100644 +index ac1b2f1..a2b0cb3 100644 --- a/urllib3/response.py +++ b/urllib3/response.py @@ -9,8 
+9,8 @@ from ._collections import HTTPHeaderDict @@ -210,7 +266,7 @@ ACCEPT_ENCODING = 'gzip,deflate' diff --git a/urllib3/util/response.py b/urllib3/util/response.py -index bc72327..6756b99 100644 +index 0b5c75c..d7759ae 100644 --- a/urllib3/util/response.py +++ b/urllib3/util/response.py @@ -1,5 +1,5 @@ @@ -221,7 +277,7 @@ from ..exceptions import HeaderParsingError diff --git a/urllib3/util/retry.py b/urllib3/util/retry.py -index 03a0124..fd1f5dd 100644 +index 2d3aa20..867d616 100644 --- a/urllib3/util/retry.py +++ b/urllib3/util/retry.py @@ -9,7 +9,7 @@ from ..exceptions import ( diff -Nru python-urllib3-1.13.1/debian/patches/02_require-cert-verification.patch python-urllib3-1.15.1/debian/patches/02_require-cert-verification.patch --- python-urllib3-1.13.1/debian/patches/02_require-cert-verification.patch 2015-12-25 15:31:02.000000000 +0000 +++ python-urllib3-1.15.1/debian/patches/02_require-cert-verification.patch 2016-05-26 21:39:25.000000000 +0000 @@ -1,4 +1,4 @@ -From f52be2bcb24472e3b746c449f58fed0eb3775094 Mon Sep 17 00:00:00 2001 +From e2f0f9789b9e5992b33c7ede2f70c9b0f356a56a Mon Sep 17 00:00:00 2001 From: Jamie Strandboge Date: Thu, 8 Oct 2015 13:19:47 -0700 Subject: require SSL certificate validation by default by using @@ -14,10 +14,10 @@ 1 file changed, 4 insertions(+), 2 deletions(-) diff --git a/urllib3/connectionpool.py b/urllib3/connectionpool.py -index 2204b30..ef60000 100644 +index 989f8ca..e8e47f1 100644 --- a/urllib3/connectionpool.py +++ b/urllib3/connectionpool.py -@@ -683,6 +683,8 @@ class HTTPSConnectionPool(HTTPConnectionPool): +@@ -713,6 +713,8 @@ class HTTPSConnectionPool(HTTPConnectionPool): ``ca_cert_dir``, and ``ssl_version`` are only used if :mod:`ssl` is available and are fed into :meth:`urllib3.util.ssl_wrap_socket` to upgrade the connection socket into an SSL socket. @@ -26,7 +26,7 @@ """ scheme = 'https' -@@ -692,8 +694,8 @@ class HTTPSConnectionPool(HTTPConnectionPool): +@@ -722,8 +724,8 @@ class HTTPSConnectionPool(HTTPConnectionPool): strict=False, timeout=Timeout.DEFAULT_TIMEOUT, maxsize=1, block=False, headers=None, retries=None, _proxy=None, _proxy_headers=None, diff -Nru python-urllib3-1.13.1/debian/patches/03_force_setuptools.patch python-urllib3-1.15.1/debian/patches/03_force_setuptools.patch --- python-urllib3-1.13.1/debian/patches/03_force_setuptools.patch 2015-12-25 15:31:02.000000000 +0000 +++ python-urllib3-1.15.1/debian/patches/03_force_setuptools.patch 1970-01-01 00:00:00.000000000 +0000 @@ -1,25 +0,0 @@ -From a3f9df1c55db089852db9dd233148e32beabaf32 Mon Sep 17 00:00:00 2001 -From: Barry Warsaw -Date: Thu, 8 Oct 2015 13:19:49 -0700 -Subject: Use setuptools.setup() so that the bdist_wheel - - command will work. 
-Last-Update: 2014-05-15 - -Patch-Name: 03_force_setuptools.patch ---- - setup.py | 2 +- - 1 file changed, 1 insertion(+), 1 deletion(-) - -diff --git a/setup.py b/setup.py -index 0a2dac3..02d5ec6 100644 ---- a/setup.py -+++ b/setup.py -@@ -1,6 +1,6 @@ - #!/usr/bin/env python - --from distutils.core import setup -+from setuptools import setup - - import os - import re diff -Nru python-urllib3-1.13.1/debian/patches/04_relax_nosetests_options.patch python-urllib3-1.15.1/debian/patches/04_relax_nosetests_options.patch --- python-urllib3-1.13.1/debian/patches/04_relax_nosetests_options.patch 2015-12-25 15:31:02.000000000 +0000 +++ python-urllib3-1.15.1/debian/patches/04_relax_nosetests_options.patch 2016-05-26 21:39:25.000000000 +0000 @@ -1,4 +1,4 @@ -From 2e8803cd1dd80c4d7b03582cbe807becc4373f2c Mon Sep 17 00:00:00 2001 +From f7d1a13d72330cfad9077c62c93ae2407acc20bd Mon Sep 17 00:00:00 2001 From: Daniele Tricoli Date: Thu, 8 Oct 2015 13:19:50 -0700 Subject: Do not use logging-clear-handlers to see all logging output and @@ -14,7 +14,7 @@ 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/setup.cfg b/setup.cfg -index b5fe992..ca24a71 100644 +index ec26776..a240096 100644 --- a/setup.cfg +++ b/setup.cfg @@ -1,8 +1,8 @@ diff -Nru python-urllib3-1.13.1/debian/patches/05_avoid-embedded-ssl-match-hostname.patch python-urllib3-1.15.1/debian/patches/05_avoid-embedded-ssl-match-hostname.patch --- python-urllib3-1.13.1/debian/patches/05_avoid-embedded-ssl-match-hostname.patch 2015-12-25 15:31:02.000000000 +0000 +++ python-urllib3-1.15.1/debian/patches/05_avoid-embedded-ssl-match-hostname.patch 2016-05-26 21:39:25.000000000 +0000 @@ -1,4 +1,4 @@ -From 468650ebc00a3c8458882bc83a1616aece7e2705 Mon Sep 17 00:00:00 2001 +From 21501b42c55182861ad7afe82d5267f442722e03 Mon Sep 17 00:00:00 2001 From: Stefano Rivera Date: Thu, 8 Oct 2015 13:19:51 -0700 Subject: Do not use embedded copy of ssl.match_hostname, when possible diff -Nru python-urllib3-1.13.1/debian/patches/06_revert_square_brackets_httplib.patch python-urllib3-1.15.1/debian/patches/06_revert_square_brackets_httplib.patch --- python-urllib3-1.13.1/debian/patches/06_revert_square_brackets_httplib.patch 2016-05-10 08:40:32.000000000 +0000 +++ python-urllib3-1.15.1/debian/patches/06_revert_square_brackets_httplib.patch 1970-01-01 00:00:00.000000000 +0000 @@ -1,24 +0,0 @@ -From a3b9a34be9dccfc42df65b2403bc0c6666fd3392 Mon Sep 17 00:00:00 2001 -From: Cory Benfield -Date: Wed, 10 Feb 2016 09:49:24 +0000 -Subject: [PATCH] Revert "We do not require square brackets for httplib." - -This reverts commit 019dbaeadb6318d98c78f2c874e2d49c06ebda15. 
---- - urllib3/connectionpool.py | 3 ++- - 1 file changed, 2 insertions(+), 1 deletion(-) - -diff --git a/urllib3/connectionpool.py b/urllib3/connectionpool.py -index 5a9d766..6f6e905 100644 ---- a/urllib3/connectionpool.py -+++ b/urllib3/connectionpool.py -@@ -69,7 +69,8 @@ def __init__(self, host, port=None): - if not host: - raise LocationValueError("No host specified.") - -- self.host = host -+ # httplib doesn't like it when we include brackets in ipv6 addresses -+ self.host = host.strip('[]') - self.port = port - - def __str__(self): diff -Nru python-urllib3-1.13.1/debian/patches/series python-urllib3-1.15.1/debian/patches/series --- python-urllib3-1.13.1/debian/patches/series 2016-05-10 08:40:39.000000000 +0000 +++ python-urllib3-1.15.1/debian/patches/series 2016-05-18 22:06:02.000000000 +0000 @@ -1,6 +1,4 @@ 01_do-not-use-embedded-python-six.patch 02_require-cert-verification.patch -03_force_setuptools.patch 04_relax_nosetests_options.patch 05_avoid-embedded-ssl-match-hostname.patch -06_revert_square_brackets_httplib.patch diff -Nru python-urllib3-1.13.1/debian/rules python-urllib3-1.15.1/debian/rules --- python-urllib3-1.13.1/debian/rules 2016-02-12 09:04:04.000000000 +0000 +++ python-urllib3-1.15.1/debian/rules 2016-05-24 21:00:47.000000000 +0000 @@ -10,9 +10,8 @@ rm -f urllib3/packages/six.py override_dh_auto_test: - # Exclude dummyserver tests since they are also failing upstream. PYBUILD_SYSTEM=custom \ - PYBUILD_TEST_ARGS="cd {build_dir}; {interpreter} -m nose {dir}/test --with-coverage -e with_dummyserver" dh_auto_test + PYBUILD_TEST_ARGS="cd {build_dir}; {interpreter} -m nose {dir}/test --with-coverage -e with_dummyserver -e socks --ignore-files=\"test_gae_manager\.py\"" dh_auto_test # Clean here .coverage because it is created by nose using the coverage # plugin find . -name .coverage -delete diff -Nru python-urllib3-1.13.1/dev-requirements.txt python-urllib3-1.15.1/dev-requirements.txt --- python-urllib3-1.13.1/dev-requirements.txt 2015-12-03 22:57:02.000000000 +0000 +++ python-urllib3-1.15.1/dev-requirements.txt 2016-04-06 19:16:56.000000000 +0000 @@ -6,3 +6,4 @@ twine==1.5.0 wheel==0.24.0 tornado==4.2.1 +PySocks==1.5.6 diff -Nru python-urllib3-1.13.1/docs/contrib.rst python-urllib3-1.15.1/docs/contrib.rst --- python-urllib3-1.13.1/docs/contrib.rst 2015-09-04 00:16:43.000000000 +0000 +++ python-urllib3-1.15.1/docs/contrib.rst 2016-04-06 19:16:56.000000000 +0000 @@ -4,7 +4,7 @@ =============== These modules implement various extra features, that may not be ready for -prime time. +prime time or that require optional third-party dependencies. .. _contrib-pyopenssl: @@ -16,7 +16,7 @@ .. _gae: -Google App Engine +Google App Engine ----------------- The :mod:`urllib3.contrib.appengine` module provides a pool manager that @@ -45,8 +45,58 @@ 1. You can use :class:`AppEngineManager` with URLFetch. URLFetch is cost-effective in many circumstances as long as your usage is within the limitations. 2. You can use a normal :class:`PoolManager` by enabling sockets. Sockets also have `limitations and restrictions `_ and have a lower free quota than URLFetch. To use sockets, be sure to specify the following in your ``app.yaml``:: - + env_variables: GAE_USE_SOCKETS_HTTPLIB : 'true' 3. If you are using `Managed VMs `_, you can use the standard :class:`PoolManager` without any configuration or special environment variables. + + +.. _socks: + +SOCKS Proxies +------------- + +.. 
versionadded:: 1.14 + +The :mod:`urllib3.contrib.socks` module enables urllib3 to work with proxies +that use either the SOCKS4 or SOCKS5 protocols. These proxies are common in +environments that want to allow generic TCP/UDP traffic through their borders, +but don't want unrestricted traffic flows. + +To use it, either install ``PySocks`` or install urllib3 with the ``socks`` +extra, like so: + +.. code-block:: bash + + $ pip install -U urllib3[socks] + +The SOCKS module provides a +:class:`SOCKSProxyManager ` that can +be used when SOCKS support is required. This class behaves very much like a +standard :class:`ProxyManager `, but allows +the use of a SOCKS proxy instead. + +Using it is simple. For example, with a SOCKS5 proxy running on the local +machine, listening on port 8889: + +.. code-block:: python + + from urllib3.contrib.socks import SOCKSProxyManager + + http = SOCKSProxyManager('socks5://localhost:8889/') + r = http.request('GET', 'https://www.google.com/') + +The SOCKS implementation supports the full range of urllib3 features. It also +supports the following SOCKS features: + +- SOCKS4 +- SOCKS4a +- SOCKS5 +- Usernames and passwords for the SOCKS proxy + +The SOCKS module does have the following limitations: + +- No support for contacting a SOCKS proxy via IPv6. +- No support for reaching websites via a literal IPv6 address: domain names + must be used. diff -Nru python-urllib3-1.13.1/docs/index.rst python-urllib3-1.15.1/docs/index.rst --- python-urllib3-1.13.1/docs/index.rst 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/docs/index.rst 2016-04-11 17:22:45.000000000 +0000 @@ -12,6 +12,7 @@ exceptions collections contrib + recipes Highlights @@ -22,7 +23,7 @@ :class:`~urllib3.connectionpool.HTTPConnectionPool` and :class:`~urllib3.connectionpool.HTTPSConnectionPool` -- File posting. See: +- File posting with multipart encoding. See: :func:`~urllib3.filepost.encode_multipart_formdata` - Built-in redirection and retries (optional). @@ -33,12 +34,14 @@ - Thread-safe and sanity-safe. +- Proxy over :ref:`HTTP or SOCKS `. + - Tested on Python 2.6+ and Python 3.2+, 100% unit test coverage. - Works with AppEngine, gevent, eventlib, and the standard library :mod:`io` module. - Small and easy to understand codebase perfect for extending and building upon. - For a more comprehensive solution, have a look at + For a simplified abstraction, have a look at `Requests `_ which is also powered by urllib3. @@ -98,6 +101,18 @@ >>> secondpart = b.read() +The response can be treated as a file-like object. +A file can be downloaded directly to a local file in a context without +being saved in memory. + +.. doctest :: + + >>> url = 'http://example.com/file' + >>> http = urllib3.PoolManager() + >>> with http.request('GET', url, preload_content=False) as r, open('filename', 'wb') as fp: + >>> .... shutil.copyfileobj(r, fp) + + Upgrading & Versioning ---------------------- @@ -153,13 +168,18 @@ similar, so that instances of either can be passed around interchangeably. +.. _proxymanager: + ProxyManager ------------ +HTTP Proxy +~~~~~~~~~~ + The :class:`~urllib3.poolmanagers.ProxyManager` is an HTTP proxy-aware subclass of :class:`~urllib3.poolmanagers.PoolManager`. 
It produces a single :class:`~urllib3.connectionpool.HTTPConnectionPool` instance for all HTTP -connections and individual per-server:port +connections and individual per-``server:port`` :class:`~urllib3.connectionpool.HTTPSConnectionPool` instances for tunnelled HTTPS connections: @@ -176,6 +196,12 @@ 3 +SOCKS Proxy +~~~~~~~~~~~ + +The :ref:`contrib module ` includes support for a :class:`SOCKSProxyManager `. + + ConnectionPool -------------- @@ -327,7 +353,7 @@ --------------- These modules implement various extra features, that may not be ready for -prime time. +prime time or that require optional third-party dependencies. * :ref:`contrib-modules` @@ -353,31 +379,25 @@ Please consider sponsoring urllib3 development, especially if your company benefits from this library. -* **Project Grant**: A grant for contiguous full-time development has the - biggest impact for progress. Periods of 3 to 10 days allow a contributor to - tackle substantial complex issues which are otherwise left to linger until - somebody can't afford to not fix them. +We welcome your patronage on `Bountysource `_: - Contact `@shazow `_ to arrange a grant for a core - contributor. +* `Contribute a recurring amount to the team `_ +* `Place a bounty on a specific feature `_ -* **One-off**: Development will continue regardless of funding, but donations help move - things further along quicker as the maintainer can allocate more time off to - work on urllib3 specifically. +Your contribution will go towards adding new features to urllib3 and making +sure all functionality continues to meet our high quality standards. - .. raw:: html - Sponsor with Credit Card +Project Grant +------------- - Sponsor with Bitcoin - +A grant for contiguous full-time development has the biggest impact for +progress. Periods of 3 to 10 days allow a contributor to tackle substantial +complex issues which are otherwise left to linger until somebody can't afford +to not fix them. -* **Recurring**: You're welcome to `support the maintainer on Gittip - `_. - - -Recent Sponsors ---------------- +Contact `@shazow `_ to arrange a grant for a core +contributor. Huge thanks to all the companies and individuals who financially contributed to the development of urllib3. Please send a PR if you've donated and would like @@ -385,4 +405,4 @@ * `Stripe `_ (June 23, 2014) -.. * [Company] ([optional tagline]), [optional description of grant] ([date]) +.. * [Company] ([date]) diff -Nru python-urllib3-1.13.1/docs/recipes.rst python-urllib3-1.15.1/docs/recipes.rst --- python-urllib3-1.13.1/docs/recipes.rst 1970-01-01 00:00:00.000000000 +0000 +++ python-urllib3-1.15.1/docs/recipes.rst 2016-04-06 19:16:56.000000000 +0000 @@ -0,0 +1,41 @@ +Recipes +======= + +This page includes a collection of recipes in the urlib3 cookbook. + +Decode HTTP Response Data in Concatenated Gzip Format +----------------------------------------------------- + +By default, urllib3 checks ``Content-Encoding`` header in HTTP response and decodes the data in ``gzip`` or ``deflate`` transparently. If ``Content-Encoding`` is not either of them, however, you will have to decode data in your application. + +This recipe shows how to decode data in the concatenated gzip format where multiple gzipped data chunks are concatenated in HTTP response. + +.. 
doctest :: + + import zlib + import urllib3 + + CHUNK_SIZE = 1024 + + def decode_gzip_raw_content(raw_data_fd): + obj = zlib.decompressobj(16 + zlib.MAX_WBITS) + output = [] + d = raw_data_fd.read(CHUNK_SIZE) + while d: + output.append(obj.decompress(d)) + while obj.unused_data != b'': + unused_data = obj.unused_data + obj = zlib.decompressobj(16 + zlib.MAX_WBITS) + output.append(obj.decompress(unused_data)) + d = raw_data_fd.read(CHUNK_SIZE) + return b''.join(output) + + + def test_urllib3_concatenated_gzip_in_http_response(): + # example for urllib3 + http = urllib3.PoolManager() + r = http.request('GET', 'http://example.com/abc.txt', + decode_content=False, preload_content=False) + content = decode_gzip_raw_content(r).decode('utf-8') + +``obj.unused_data`` includes the left over data in the previous ``obj.decompress`` method call. A new ``zlib.decompressobj`` is used to start decoding the next gzipped data chunk until no further data is given. diff -Nru python-urllib3-1.13.1/docs/security.rst python-urllib3-1.15.1/docs/security.rst --- python-urllib3-1.13.1/docs/security.rst 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/docs/security.rst 2016-04-06 19:16:56.000000000 +0000 @@ -17,7 +17,7 @@ ----------------------- First we need to make sure your Python installation has SSL enabled. Easiest -way to check is to simply open a Python shell and type `import ssl`:: +way to check is to simply open a Python shell and type ``import ssl``:: >>> import ssl Traceback (most recent call last): @@ -199,9 +199,11 @@ succeed on more featureful platforms to fail, and can cause certain security features to be unavailable. -If you encounter this warning, it is strongly recommended you upgrade to a -newer Python version, or that you use pyOpenSSL as described in the -:ref:`pyopenssl` section. +If you encounter this warning, it is strongly recommended you: + +- upgrade to a newer Python version +- upgrade ``ndg-httpsclient`` with ``pip install --upgrade ndg-httpsclient`` +- use pyOpenSSL as described in the :ref:`pyopenssl` section For info about disabling warnings, see `Disabling Warnings`_. 
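The security notes above list pyOpenSSL as one remedy for ``InsecurePlatformWarning`` and point to the section on disabling warnings. A minimal sketch of those two options, assuming the ``secure`` extra (pyOpenSSL, ndg-httpsclient, pyasn1) is installed:

.. code-block:: python

    import urllib3
    import urllib3.contrib.pyopenssl

    # Route urllib3's TLS handling through pyOpenSSL, which provides SNI and
    # a modern SSLContext on older Python 2 builds.
    urllib3.contrib.pyopenssl.inject_into_urllib3()

    # Last resort: silence one specific warning category rather than all of them.
    urllib3.disable_warnings(urllib3.exceptions.InsecurePlatformWarning)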
diff -Nru python-urllib3-1.13.1/dummyserver/certs/client.pem python-urllib3-1.15.1/dummyserver/certs/client.pem --- python-urllib3-1.13.1/dummyserver/certs/client.pem 2014-08-06 18:02:49.000000000 +0000 +++ python-urllib3-1.15.1/dummyserver/certs/client.pem 2016-04-06 19:16:56.000000000 +0000 @@ -1,22 +1,21 @@ -----BEGIN CERTIFICATE----- -MIIDqDCCAxGgAwIBAgIBATANBgkqhkiG9w0BAQUFADCBgTELMAkGA1UEBhMCRkkx -DjAMBgNVBAgTBWR1bW15MQ4wDAYDVQQHEwVkdW1teTEOMAwGA1UEChMFZHVtbXkx -DjAMBgNVBAsTBWR1bW15MREwDwYDVQQDEwhTbmFrZU9pbDEfMB0GCSqGSIb3DQEJ +MIIDczCCAtygAwIBAgIBATANBgkqhkiG9w0BAQUFADCBgTELMAkGA1UEBhMCRkkx +DjAMBgNVBAgMBWR1bW15MQ4wDAYDVQQHDAVkdW1teTEOMAwGA1UECgwFZHVtbXkx +DjAMBgNVBAsMBWR1bW15MREwDwYDVQQDDAhTbmFrZU9pbDEfMB0GCSqGSIb3DQEJ ARYQZHVtbXlAdGVzdC5sb2NhbDAeFw0xMTEyMjIwNzU4NDBaFw0yMTEyMTgwNzU4 -NDBaMGExCzAJBgNVBAYTAkZJMQ4wDAYDVQQIEwVkdW1teTEOMAwGA1UEBxMFZHVt -bXkxDjAMBgNVBAoTBWR1bW15MQ4wDAYDVQQLEwVkdW1teTESMBAGA1UEAxMJbG9j +NDBaMGExCzAJBgNVBAYTAkZJMQ4wDAYDVQQIDAVkdW1teTEOMAwGA1UEBwwFZHVt +bXkxDjAMBgNVBAoMBWR1bW15MQ4wDAYDVQQLDAVkdW1teTESMBAGA1UEAwwJbG9j YWxob3N0MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDXe3FqmCWvP8XPxqtT +0bfL1Tvzvebi46k0WIcUV8bP3vyYiSRXG9ALmyzZH4GHY9UVs4OEDkCMDOBSezB 0y9ai/9doTNcaictdEBu8nfdXKoTtzrn+VX4UPrkH5hm7NQ1fTQuj1MR7yBCmYqN -3Q2Q+Efuujyx0FwBzAuy1aKYuwIDAQABo4IBTTCCAUkwCQYDVR0TBAIwADARBglg -hkgBhvhCAQEEBAMCBkAwKwYJYIZIAYb4QgENBB4WHFRpbnlDQSBHZW5lcmF0ZWQg -Q2VydGlmaWNhdGUwHQYDVR0OBBYEFBvnSuVKLNPEFMAFqHw292vGHGJSMIG2BgNV -HSMEga4wgauAFBl3fyNiYkJZRft1ncdzcgS7MwotoYGHpIGEMIGBMQswCQYDVQQG -EwJGSTEOMAwGA1UECBMFZHVtbXkxDjAMBgNVBAcTBWR1bW15MQ4wDAYDVQQKEwVk -dW1teTEOMAwGA1UECxMFZHVtbXkxETAPBgNVBAMTCFNuYWtlT2lsMR8wHQYJKoZI -hvcNAQkBFhBkdW1teUB0ZXN0LmxvY2FsggkAs+uxyi/hv+MwCQYDVR0SBAIwADAZ -BgNVHREEEjAQgQ5yb290QGxvY2FsaG9zdDANBgkqhkiG9w0BAQUFAAOBgQBXdedG -XHLPmOVBeKWjTmaekcaQi44snhYqE1uXRoIQXQsyw+Ya5+n/uRxPKZO/C78EESL0 -8rnLTdZXm4GBYyHYmMy0AdWR7y030viOzAkWWRRRbuecsaUzFCI+F9jTV5LHuRzz -V8fUKwiEE9swzkWgMpfVTPFuPgzxwG9gMbrBfg== +3Q2Q+Efuujyx0FwBzAuy1aKYuwIDAQABo4IBGDCCARQwCQYDVR0TBAIwADAdBgNV +HQ4EFgQUG+dK5Uos08QUwAWofDb3a8YcYlIwgbYGA1UdIwSBrjCBq4AUGXd/I2Ji +QllF+3Wdx3NyBLszCi2hgYekgYQwgYExCzAJBgNVBAYTAkZJMQ4wDAYDVQQIDAVk +dW1teTEOMAwGA1UEBwwFZHVtbXkxDjAMBgNVBAoMBWR1bW15MQ4wDAYDVQQLDAVk +dW1teTERMA8GA1UEAwwIU25ha2VPaWwxHzAdBgkqhkiG9w0BCQEWEGR1bW15QHRl +c3QubG9jYWyCCQCz67HKL+G/4zAJBgNVHRIEAjAAMCQGA1UdEQQdMBuBDnJvb3RA +bG9jYWxob3N0gglsb2NhbGhvc3QwDQYJKoZIhvcNAQEFBQADgYEAgcW6X1ZUyufm +TFEqEAdpKXdL0rxDwcsM/qqqsXbkz17otH6ujPhBEagzdKtgeNKfy0aXz6rWZugk +lF0IqyC4mcI+vvfgGR5Iy4KdXMrIX98MbrvGJBfbdKhGW2b84wDV42DIDiD2ZGGe +6YZQQIo9LxjuOTf9jsvf+PIkbI4H0To= -----END CERTIFICATE----- diff -Nru python-urllib3-1.13.1/dummyserver/certs/server.crt python-urllib3-1.15.1/dummyserver/certs/server.crt --- python-urllib3-1.13.1/dummyserver/certs/server.crt 2014-08-06 18:02:49.000000000 +0000 +++ python-urllib3-1.15.1/dummyserver/certs/server.crt 2016-04-06 19:16:56.000000000 +0000 @@ -1,22 +1,21 @@ -----BEGIN CERTIFICATE----- -MIIDqDCCAxGgAwIBAgIBATANBgkqhkiG9w0BAQUFADCBgTELMAkGA1UEBhMCRkkx -DjAMBgNVBAgTBWR1bW15MQ4wDAYDVQQHEwVkdW1teTEOMAwGA1UEChMFZHVtbXkx -DjAMBgNVBAsTBWR1bW15MREwDwYDVQQDEwhTbmFrZU9pbDEfMB0GCSqGSIb3DQEJ +MIIDczCCAtygAwIBAgIBATANBgkqhkiG9w0BAQUFADCBgTELMAkGA1UEBhMCRkkx +DjAMBgNVBAgMBWR1bW15MQ4wDAYDVQQHDAVkdW1teTEOMAwGA1UECgwFZHVtbXkx +DjAMBgNVBAsMBWR1bW15MREwDwYDVQQDDAhTbmFrZU9pbDEfMB0GCSqGSIb3DQEJ ARYQZHVtbXlAdGVzdC5sb2NhbDAeFw0xMTEyMjIwNzU4NDBaFw0yMTEyMTgwNzU4 -NDBaMGExCzAJBgNVBAYTAkZJMQ4wDAYDVQQIEwVkdW1teTEOMAwGA1UEBxMFZHVt -bXkxDjAMBgNVBAoTBWR1bW15MQ4wDAYDVQQLEwVkdW1teTESMBAGA1UEAxMJbG9j 
+NDBaMGExCzAJBgNVBAYTAkZJMQ4wDAYDVQQIDAVkdW1teTEOMAwGA1UEBwwFZHVt +bXkxDjAMBgNVBAoMBWR1bW15MQ4wDAYDVQQLDAVkdW1teTESMBAGA1UEAwwJbG9j YWxob3N0MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDXe3FqmCWvP8XPxqtT +0bfL1Tvzvebi46k0WIcUV8bP3vyYiSRXG9ALmyzZH4GHY9UVs4OEDkCMDOBSezB 0y9ai/9doTNcaictdEBu8nfdXKoTtzrn+VX4UPrkH5hm7NQ1fTQuj1MR7yBCmYqN -3Q2Q+Efuujyx0FwBzAuy1aKYuwIDAQABo4IBTTCCAUkwCQYDVR0TBAIwADARBglg -hkgBhvhCAQEEBAMCBkAwKwYJYIZIAYb4QgENBB4WHFRpbnlDQSBHZW5lcmF0ZWQg -Q2VydGlmaWNhdGUwHQYDVR0OBBYEFBvnSuVKLNPEFMAFqHw292vGHGJSMIG2BgNV -HSMEga4wgauAFBl3fyNiYkJZRft1ncdzcgS7MwotoYGHpIGEMIGBMQswCQYDVQQG -EwJGSTEOMAwGA1UECBMFZHVtbXkxDjAMBgNVBAcTBWR1bW15MQ4wDAYDVQQKEwVk -dW1teTEOMAwGA1UECxMFZHVtbXkxETAPBgNVBAMTCFNuYWtlT2lsMR8wHQYJKoZI -hvcNAQkBFhBkdW1teUB0ZXN0LmxvY2FsggkAs+uxyi/hv+MwCQYDVR0SBAIwADAZ -BgNVHREEEjAQgQ5yb290QGxvY2FsaG9zdDANBgkqhkiG9w0BAQUFAAOBgQBXdedG -XHLPmOVBeKWjTmaekcaQi44snhYqE1uXRoIQXQsyw+Ya5+n/uRxPKZO/C78EESL0 -8rnLTdZXm4GBYyHYmMy0AdWR7y030viOzAkWWRRRbuecsaUzFCI+F9jTV5LHuRzz -V8fUKwiEE9swzkWgMpfVTPFuPgzxwG9gMbrBfg== +3Q2Q+Efuujyx0FwBzAuy1aKYuwIDAQABo4IBGDCCARQwCQYDVR0TBAIwADAdBgNV +HQ4EFgQUG+dK5Uos08QUwAWofDb3a8YcYlIwgbYGA1UdIwSBrjCBq4AUGXd/I2Ji +QllF+3Wdx3NyBLszCi2hgYekgYQwgYExCzAJBgNVBAYTAkZJMQ4wDAYDVQQIDAVk +dW1teTEOMAwGA1UEBwwFZHVtbXkxDjAMBgNVBAoMBWR1bW15MQ4wDAYDVQQLDAVk +dW1teTERMA8GA1UEAwwIU25ha2VPaWwxHzAdBgkqhkiG9w0BCQEWEGR1bW15QHRl +c3QubG9jYWyCCQCz67HKL+G/4zAJBgNVHRIEAjAAMCQGA1UdEQQdMBuBDnJvb3RA +bG9jYWxob3N0gglsb2NhbGhvc3QwDQYJKoZIhvcNAQEFBQADgYEAgcW6X1ZUyufm +TFEqEAdpKXdL0rxDwcsM/qqqsXbkz17otH6ujPhBEagzdKtgeNKfy0aXz6rWZugk +lF0IqyC4mcI+vvfgGR5Iy4KdXMrIX98MbrvGJBfbdKhGW2b84wDV42DIDiD2ZGGe +6YZQQIo9LxjuOTf9jsvf+PIkbI4H0To= -----END CERTIFICATE----- diff -Nru python-urllib3-1.13.1/dummyserver/handlers.py python-urllib3-1.15.1/dummyserver/handlers.py --- python-urllib3-1.13.1/dummyserver/handlers.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/dummyserver/handlers.py 2016-04-06 19:16:56.000000000 +0000 @@ -246,6 +246,11 @@ data, headers=[('Content-Type', 'application/octet-stream')]) + def status(self, request): + status = request.params.get("status", "200 OK") + + return Response(status=status) + def shutdown(self, request): sys.exit() diff -Nru python-urllib3-1.13.1/dummyserver/server.py python-urllib3-1.15.1/dummyserver/server.py --- python-urllib3-1.13.1/dummyserver/server.py 2015-12-18 22:47:24.000000000 +0000 +++ python-urllib3-1.15.1/dummyserver/server.py 2015-12-29 20:28:18.000000000 +0000 @@ -90,6 +90,8 @@ :param ready_event: Event which gets set when the socket handler is ready to receive requests. """ + USE_IPV6 = HAS_IPV6_AND_DNS + def __init__(self, socket_handler, host='localhost', port=8081, ready_event=None): threading.Thread.__init__(self) @@ -100,7 +102,7 @@ self.ready_event = ready_event def _start_server(self): - if HAS_IPV6_AND_DNS: + if self.USE_IPV6: sock = socket.socket(socket.AF_INET6) else: warnings.warn("No IPv6 support. 
Falling back to IPv4.", diff -Nru python-urllib3-1.13.1/dummyserver/testcase.py python-urllib3-1.15.1/dummyserver/testcase.py --- python-urllib3-1.13.1/dummyserver/testcase.py 2015-12-18 22:47:24.000000000 +0000 +++ python-urllib3-1.15.1/dummyserver/testcase.py 2015-12-29 20:28:18.000000000 +0000 @@ -71,6 +71,21 @@ cls.server_thread.join(0.1) +class IPV4SocketDummyServerTestCase(SocketDummyServerTestCase): + @classmethod + def _start_server(cls, socket_handler): + ready_event = threading.Event() + cls.server_thread = SocketServerThread(socket_handler=socket_handler, + ready_event=ready_event, + host=cls.host) + cls.server_thread.USE_IPV6 = False + cls.server_thread.start() + ready_event.wait(5) + if not ready_event.is_set(): + raise Exception("most likely failed to start server") + cls.port = cls.server_thread.port + + class HTTPDummyServerTestCase(unittest.TestCase): """ A simple HTTP server that runs when your test class runs diff -Nru python-urllib3-1.13.1/LICENSE.txt python-urllib3-1.15.1/LICENSE.txt --- python-urllib3-1.13.1/LICENSE.txt 2014-08-06 18:02:49.000000000 +0000 +++ python-urllib3-1.15.1/LICENSE.txt 2016-04-06 19:16:56.000000000 +0000 @@ -1,6 +1,6 @@ This is the MIT license: http://www.opensource.org/licenses/mit-license.php -Copyright 2008-2014 Andrey Petrov and contributors (see CONTRIBUTORS.txt) +Copyright 2008-2016 Andrey Petrov and contributors (see CONTRIBUTORS.txt) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software diff -Nru python-urllib3-1.13.1/PKG-INFO python-urllib3-1.15.1/PKG-INFO --- python-urllib3-1.13.1/PKG-INFO 2015-12-18 22:47:28.000000000 +0000 +++ python-urllib3-1.15.1/PKG-INFO 2016-04-11 17:26:04.000000000 +0000 @@ -1,6 +1,6 @@ Metadata-Version: 1.1 Name: urllib3 -Version: 1.13.1 +Version: 1.15.1 Summary: HTTP library with thread-safe connection pooling, file post, and more. Home-page: http://urllib3.readthedocs.org/ Author: Andrey Petrov @@ -26,9 +26,10 @@ - File posting (``encode_multipart_formdata``). - Built-in redirection and retries (optional). - Supports gzip and deflate decoding. + - Proxy over HTTP or SOCKS. - Thread-safe and sanity-safe. - Works with AppEngine, gevent, and eventlib. - - Tested on Python 2.6+, Python 3.2+, and PyPy, with 100% unit test coverage. + - Tested on Python 2.6+, Python 3.3+, and PyPy, with 100% unit test coverage. - Small and easy to understand codebase perfect for extending and building upon. For a more comprehensive solution, have a look at `Requests `_ which is also powered by ``urllib3``. @@ -134,6 +135,21 @@ Contributing ============ + Thank you for giving back to urllib3. Please meet our jolly team + of code-sherpas: + + Maintainers + ----------- + + - `@lukasa `_ (Cory Benfield) + - `@sigmavirus24 `_ (Ian Cordasco) + - `@shazow `_ (Andrey Petrov) + + ๐Ÿ‘‹ + + Getting Started + --------------- + #. `Check for open issues `_ or open a fresh issue to start a discussion around a feature idea or a bug. There is a *Contributor Friendly* tag for issues that should be ideal for people who @@ -156,14 +172,53 @@ Changes ======= + 1.15.1 (2016-04-11) + ------------------- + + * Fix packaging to include backports module. (Issue #841) + + + 1.15 (2016-04-06) + ----------------- + + * Added Retry(raise_on_status=False). (Issue #720) + + * Always use setuptools, no more distutils fallback. (Issue #785) + + * Dropped support for Python 3.2. 
(Issue #786) + + * Chunked transfer encoding when requesting with ``chunked=True``. + (Issue #790) + + * Fixed regression with IPv6 port parsing. (Issue #801) + + * Append SNIMissingWarning messages to allow users to specify it in + the PYTHONWARNINGS environment variable. (Issue #816) + + * Handle unicode headers in Py2. (Issue #818) + + * Log certificate when there is a hostname mismatch. (Issue #820) + + * Preserve order of request/response headers. (Issue #821) + + + 1.14 (2015-12-29) + ----------------- + + * contrib: SOCKS proxy support! (Issue #762) + + * Fixed AppEngine handling of transfer-encoding header and bug + in Timeout defaults checking. (Issue #763) + + 1.13.1 (2015-12-18) - +++++++++++++++++++ + ------------------- * Fixed regression in IPv6 + SSL for match_hostname. (Issue #761) 1.13 (2015-12-14) - +++++++++++++++++ + ----------------- * Fixed ``pip install urllib3[secure]`` on modern pip. (Issue #706) @@ -180,7 +235,7 @@ 1.12 (2015-09-03) - +++++++++++++++++ + ----------------- * Rely on ``six`` for importing ``httplib`` to work around conflicts with other Python 3 shims. (Issue #688) @@ -193,7 +248,7 @@ 1.11 (2015-07-21) - +++++++++++++++++ + ----------------- * When ``ca_certs`` is given, ``cert_reqs`` defaults to ``'CERT_REQUIRED'``. (Issue #650) @@ -238,7 +293,7 @@ (Issue #674) 1.10.4 (2015-05-03) - +++++++++++++++++++ + ------------------- * Migrate tests to Tornado 4. (Issue #594) @@ -254,7 +309,7 @@ 1.10.3 (2015-04-21) - +++++++++++++++++++ + ------------------- * Emit ``InsecurePlatformWarning`` when SSLContext object is missing. (Issue #558) @@ -275,7 +330,7 @@ 1.10.2 (2015-02-25) - +++++++++++++++++++ + ------------------- * Fix file descriptor leakage on retries. (Issue #548) @@ -287,7 +342,7 @@ 1.10.1 (2015-02-10) - +++++++++++++++++++ + ------------------- * Pools can be used as context managers. (Issue #545) @@ -301,7 +356,7 @@ 1.10 (2014-12-14) - +++++++++++++++++ + ----------------- * Disabled SSLv3. (Issue #473) @@ -333,7 +388,7 @@ 1.9.1 (2014-09-13) - ++++++++++++++++++ + ------------------ * Apply socket arguments before binding. (Issue #427) @@ -354,7 +409,7 @@ 1.9 (2014-07-04) - ++++++++++++++++ + ---------------- * Shuffled around development-related files. If you're maintaining a distro package of urllib3, you may need to tweak things. (Issue #415) @@ -391,7 +446,7 @@ 1.8.3 (2014-06-23) - ++++++++++++++++++ + ------------------ * Fix TLS verification when using a proxy in Python 3.4.1. (Issue #385) @@ -413,13 +468,13 @@ 1.8.2 (2014-04-17) - ++++++++++++++++++ + ------------------ * Fix ``urllib3.util`` not being included in the package. 1.8.1 (2014-04-17) - ++++++++++++++++++ + ------------------ * Fix AppEngine bug of HTTPS requests going out as HTTP. (Issue #356) @@ -430,7 +485,7 @@ 1.8 (2014-03-04) - ++++++++++++++++ + ---------------- * Improved url parsing in ``urllib3.util.parse_url`` (properly parse '@' in username, and blank ports like 'hostname:'). @@ -482,7 +537,7 @@ 1.7.1 (2013-09-25) - ++++++++++++++++++ + ------------------ * Added granular timeout support with new ``urllib3.util.Timeout`` class. (Issue #231) @@ -491,7 +546,7 @@ 1.7 (2013-08-14) - ++++++++++++++++ + ---------------- * More exceptions are now pickle-able, with tests. (Issue #174) @@ -530,7 +585,7 @@ 1.6 (2013-04-25) - ++++++++++++++++ + ---------------- * Contrib: Optional SNI support for Py2 using PyOpenSSL. 
(Issue #156) @@ -590,7 +645,7 @@ 1.5 (2012-08-02) - ++++++++++++++++ + ---------------- * Added ``urllib3.add_stderr_logger()`` for quickly enabling STDERR debug logging in urllib3. @@ -615,7 +670,7 @@ 1.4 (2012-06-16) - ++++++++++++++++ + ---------------- * Minor AppEngine-related fixes. @@ -627,7 +682,7 @@ 1.3 (2012-03-25) - ++++++++++++++++ + ---------------- * Removed pre-1.0 deprecated API. @@ -646,13 +701,13 @@ 1.2.2 (2012-02-06) - ++++++++++++++++++ + ------------------ * Fixed packaging bug of not shipping ``test-requirements.txt``. (Issue #47) 1.2.1 (2012-02-05) - ++++++++++++++++++ + ------------------ * Fixed another bug related to when ``ssl`` module is not available. (Issue #41) @@ -661,7 +716,7 @@ 1.2 (2012-01-29) - ++++++++++++++++ + ---------------- * Added Python 3 support (tested on 3.2.2) @@ -687,7 +742,7 @@ 1.1 (2012-01-07) - ++++++++++++++++ + ---------------- * Refactored ``dummyserver`` to its own root namespace module (used for testing). @@ -704,7 +759,7 @@ 1.0.2 (2011-11-04) - ++++++++++++++++++ + ------------------ * Fixed typo in ``VerifiedHTTPSConnection`` which would only present as a bug if you're using the object manually. (Thanks pyos) @@ -717,14 +772,14 @@ 1.0.1 (2011-10-10) - ++++++++++++++++++ + ------------------ * Fixed a bug where the same connection would get returned into the pool twice, causing extraneous "HttpConnectionPool is full" log warnings. 1.0 (2011-10-08) - ++++++++++++++++ + ---------------- * Added ``PoolManager`` with LRU expiration of connections (tested and documented). @@ -747,13 +802,13 @@ 0.4.1 (2011-07-17) - ++++++++++++++++++ + ------------------ * Minor bug fixes, code cleanup. 0.4 (2011-03-01) - ++++++++++++++++ + ---------------- * Better unicode support. * Added ``VerifiedHTTPSConnection``. @@ -762,13 +817,13 @@ 0.3.1 (2010-07-13) - ++++++++++++++++++ + ------------------ * Added ``assert_host_name`` optional parameter. Now compatible with proxies. 0.3 (2009-12-10) - ++++++++++++++++ + ---------------- * Added HTTPS support. * Minor bug fixes. @@ -777,14 +832,14 @@ 0.2 (2008-11-17) - ++++++++++++++++ + ---------------- * Added unit tests. * Bug fixes. 0.1 (2008-11-16) - ++++++++++++++++ + ---------------- * First release. diff -Nru python-urllib3-1.13.1/README.rst python-urllib3-1.15.1/README.rst --- python-urllib3-1.13.1/README.rst 2015-04-01 22:24:02.000000000 +0000 +++ python-urllib3-1.15.1/README.rst 2016-04-11 17:22:45.000000000 +0000 @@ -18,9 +18,10 @@ - File posting (``encode_multipart_formdata``). - Built-in redirection and retries (optional). - Supports gzip and deflate decoding. +- Proxy over HTTP or SOCKS. - Thread-safe and sanity-safe. - Works with AppEngine, gevent, and eventlib. -- Tested on Python 2.6+, Python 3.2+, and PyPy, with 100% unit test coverage. +- Tested on Python 2.6+, Python 3.3+, and PyPy, with 100% unit test coverage. - Small and easy to understand codebase perfect for extending and building upon. For a more comprehensive solution, have a look at `Requests `_ which is also powered by ``urllib3``. @@ -126,6 +127,21 @@ Contributing ============ +Thank you for giving back to urllib3. Please meet our jolly team +of code-sherpas: + +Maintainers +----------- + +- `@lukasa `_ (Cory Benfield) +- `@sigmavirus24 `_ (Ian Cordasco) +- `@shazow `_ (Andrey Petrov) + +๐Ÿ‘‹ + +Getting Started +--------------- + #. `Check for open issues `_ or open a fresh issue to start a discussion around a feature idea or a bug. 
There is a *Contributor Friendly* tag for issues that should be ideal for people who diff -Nru python-urllib3-1.13.1/setup.cfg python-urllib3-1.15.1/setup.cfg --- python-urllib3-1.13.1/setup.cfg 2015-12-18 22:47:28.000000000 +0000 +++ python-urllib3-1.15.1/setup.cfg 2016-04-11 17:26:04.000000000 +0000 @@ -21,7 +21,7 @@ certifi; extra == 'secure' [egg_info] +tag_svn_revision = 0 tag_build = tag_date = 0 -tag_svn_revision = 0 diff -Nru python-urllib3-1.13.1/setup.py python-urllib3-1.15.1/setup.py --- python-urllib3-1.13.1/setup.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/setup.py 2016-04-11 17:22:45.000000000 +0000 @@ -1,14 +1,10 @@ #!/usr/bin/env python -from distutils.core import setup +from setuptools import setup import os import re - -try: - import setuptools # noqa: unused -except ImportError: - pass # No 'develop' command, oh well. +import codecs base_path = os.path.dirname(__file__) @@ -18,13 +14,14 @@ re.S).match(fp.read()).group(1) fp.close() - +readme = codecs.open('README.rst', encoding='utf-8').read() +changes = codecs.open('CHANGES.rst', encoding='utf-8').read() version = VERSION setup(name='urllib3', version=version, description="HTTP library with thread-safe connection pooling, file post, and more.", - long_description=open('README.rst').read() + '\n\n' + open('CHANGES.rst').read(), + long_description=u'\n\n'.join([readme, changes]), classifiers=[ 'Environment :: Web Environment', 'Intended Audience :: Developers', @@ -43,7 +40,8 @@ license='MIT', packages=['urllib3', 'urllib3.packages', 'urllib3.packages.ssl_match_hostname', - 'urllib3.contrib', 'urllib3.util', + 'urllib3.packages.backports', 'urllib3.contrib', + 'urllib3.util', ], requires=[], tests_require=[ @@ -61,5 +59,8 @@ 'pyasn1', 'certifi', ], + 'socks': [ + 'PySocks>=1.5.6,<2.0', + ] }, ) diff -Nru python-urllib3-1.13.1/test/contrib/test_pyopenssl.py python-urllib3-1.15.1/test/contrib/test_pyopenssl.py --- python-urllib3-1.13.1/test/contrib/test_pyopenssl.py 2015-04-01 22:24:02.000000000 +0000 +++ python-urllib3-1.15.1/test/contrib/test_pyopenssl.py 2016-04-06 19:16:56.000000000 +0000 @@ -1,9 +1,6 @@ from nose.plugins.skip import SkipTest from urllib3.packages import six -if six.PY3: - raise SkipTest('Testing of PyOpenSSL disabled on PY3') - try: from urllib3.contrib.pyopenssl import (inject_into_urllib3, extract_from_urllib3) diff -Nru python-urllib3-1.13.1/test/contrib/test_socks.py python-urllib3-1.15.1/test/contrib/test_socks.py --- python-urllib3-1.13.1/test/contrib/test_socks.py 1970-01-01 00:00:00.000000000 +0000 +++ python-urllib3-1.15.1/test/contrib/test_socks.py 2015-12-29 20:28:18.000000000 +0000 @@ -0,0 +1,599 @@ +import threading +import socket + +from urllib3.contrib import socks +from urllib3.exceptions import ConnectTimeoutError, NewConnectionError + +from dummyserver.server import DEFAULT_CERTS +from dummyserver.testcase import IPV4SocketDummyServerTestCase + +from nose.plugins.skip import SkipTest + +try: + import ssl + from urllib3.util import ssl_ as better_ssl + HAS_SSL = True +except ImportError: + ssl = None + better_ssl = None + HAS_SSL = False + + +SOCKS_NEGOTIATION_NONE = b'\x00' +SOCKS_NEGOTIATION_PASSWORD = b'\x02' + +SOCKS_VERSION_SOCKS4 = b'\x04' +SOCKS_VERSION_SOCKS5 = b'\x05' + + +def _get_free_port(host): + """ + Gets a free port by opening a socket, binding it, checking the assigned + port, and then closing it. 
+ """ + s = socket.socket() + s.bind((host, 0)) + port = s.getsockname()[1] + s.close() + return port + + +def _read_exactly(sock, amt): + """ + Read *exactly* ``amt`` bytes from the socket ``sock``. + """ + data = b'' + + while amt > 0: + chunk = sock.recv(amt) + data += chunk + amt -= len(chunk) + + return data + + +def _read_until(sock, char): + """ + Read from the socket until the character is received. + """ + chunks = [] + while True: + chunk = sock.recv(1) + chunks.append(chunk) + if chunk == char: + break + + return b''.join(chunks) + + +def _address_from_socket(sock): + """ + Returns the address from the SOCKS socket + """ + addr_type = sock.recv(1) + + if addr_type == b'\x01': + ipv4_addr = _read_exactly(sock, 4) + return socket.inet_ntoa(ipv4_addr) + elif addr_type == b'\x04': + ipv6_addr = _read_exactly(sock, 16) + return socket.inet_ntop(socket.AF_INET6, ipv6_addr) + elif addr_type == b'\x03': + addr_len = ord(sock.recv(1)) + return _read_exactly(sock, addr_len) + else: + raise RuntimeError("Unexpected addr type: %r" % addr_type) + + +def handle_socks5_negotiation(sock, negotiate, username=None, + password=None): + """ + Handle the SOCKS5 handshake. + + Returns a generator object that allows us to break the handshake into + steps so that the test code can intervene at certain useful points. + """ + received_version = sock.recv(1) + assert received_version == SOCKS_VERSION_SOCKS5 + nmethods = ord(sock.recv(1)) + methods = _read_exactly(sock, nmethods) + + if negotiate: + assert SOCKS_NEGOTIATION_PASSWORD in methods + send_data = SOCKS_VERSION_SOCKS5 + SOCKS_NEGOTIATION_PASSWORD + sock.sendall(send_data) + + # This is the password negotiation. + negotiation_version = sock.recv(1) + assert negotiation_version == b'\x01' + ulen = ord(sock.recv(1)) + provided_username = _read_exactly(sock, ulen) + plen = ord(sock.recv(1)) + provided_password = _read_exactly(sock, plen) + + if username == provided_username and password == provided_password: + sock.sendall(b'\x01\x00') + else: + sock.sendall(b'\x01\x01') + sock.close() + yield False + return + else: + assert SOCKS_NEGOTIATION_NONE in methods + send_data = SOCKS_VERSION_SOCKS5 + SOCKS_NEGOTIATION_NONE + sock.sendall(send_data) + + # Client sends where they want to go. + received_version = sock.recv(1) + command = sock.recv(1) + reserved = sock.recv(1) + addr = _address_from_socket(sock) + port = _read_exactly(sock, 2) + port = (ord(port[0:1]) << 8) + (ord(port[1:2])) + + # Check some basic stuff. + assert received_version == SOCKS_VERSION_SOCKS5 + assert command == b'\x01' # Only support connect, not bind. + assert reserved == b'\x00' + + # Yield the address port tuple. + succeed = yield addr, port + + if succeed: + # Hard-coded response for now. + response = ( + SOCKS_VERSION_SOCKS5 + b'\x00\x00\x01\x7f\x00\x00\x01\xea\x60' + ) + else: + # Hard-coded response for now. + response = SOCKS_VERSION_SOCKS5 + b'\x01\00' + + sock.sendall(response) + yield True # Avoid StopIteration exceptions getting fired. + + +def handle_socks4_negotiation(sock, username=None): + """ + Handle the SOCKS4 handshake. + + Returns a generator object that allows us to break the handshake into + steps so that the test code can intervene at certain useful points. + """ + received_version = sock.recv(1) + command = sock.recv(1) + port = _read_exactly(sock, 2) + port = (ord(port[0:1]) << 8) + (ord(port[1:2])) + addr = _read_exactly(sock, 4) + provided_username = _read_until(sock, b'\x00')[:-1] # Strip trailing null. 
+ + if addr == b'\x00\x00\x00\x01': + # Magic string: means DNS name. + addr = _read_until(sock, b'\x00')[:-1] # Strip trailing null. + else: + addr = socket.inet_ntoa(addr) + + # Check some basic stuff. + assert received_version == SOCKS_VERSION_SOCKS4 + assert command == b'\x01' # Only support connect, not bind. + + if username is not None and username != provided_username: + sock.sendall(b'\x00\x5d\x00\x00\x00\x00\x00\x00') + sock.close() + yield False + return + + # Yield the address port tuple. + succeed = yield addr, port + + if succeed: + response = b'\x00\x5a\xea\x60\x7f\x00\x00\x01' + else: + response = b'\x00\x5b\x00\x00\x00\x00\x00\x00' + + sock.sendall(response) + yield True # Avoid StopIteration exceptions getting fired. + + +class TestSocks5Proxy(IPV4SocketDummyServerTestCase): + """ + Test the SOCKS proxy in SOCKS5 mode. + """ + def test_basic_request(self): + def request_handler(listener): + sock = listener.accept()[0] + + handler = handle_socks5_negotiation(sock, negotiate=False) + addr, port = next(handler) + + self.assertEqual(addr, '16.17.18.19') + self.assertTrue(port, 80) + handler.send(True) + + while True: + buf = sock.recv(65535) + if buf.endswith(b'\r\n\r\n'): + break + + sock.sendall(b'HTTP/1.1 200 OK\r\n' + b'Server: SocksTestServer\r\n' + b'Content-Length: 0\r\n' + b'\r\n') + sock.close() + + self._start_server(request_handler) + proxy_url = "socks5://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager(proxy_url) + response = pm.request('GET', 'http://16.17.18.19') + + self.assertEqual(response.status, 200) + self.assertEqual(response.data, b'') + self.assertEqual(response.headers['Server'], 'SocksTestServer') + + def test_correct_header_line(self): + def request_handler(listener): + sock = listener.accept()[0] + + handler = handle_socks5_negotiation(sock, negotiate=False) + addr, port = next(handler) + + self.assertEqual(addr, b'example.com') + self.assertTrue(port, 80) + handler.send(True) + + buf = b'' + while True: + buf += sock.recv(65535) + if buf.endswith(b'\r\n\r\n'): + break + + self.assertTrue(buf.startswith(b'GET / HTTP/1.1')) + self.assertTrue(b'Host: example.com' in buf) + + sock.sendall(b'HTTP/1.1 200 OK\r\n' + b'Server: SocksTestServer\r\n' + b'Content-Length: 0\r\n' + b'\r\n') + sock.close() + + self._start_server(request_handler) + proxy_url = "socks5://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager(proxy_url) + response = pm.request('GET', 'http://example.com') + self.assertEqual(response.status, 200) + + def test_connection_timeouts(self): + event = threading.Event() + + def request_handler(listener): + event.wait() + + self._start_server(request_handler) + proxy_url = "socks5://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager(proxy_url) + + self.assertRaises( + ConnectTimeoutError, pm.request, 'GET', 'http://example.com', + timeout=0.001, retries=False + ) + event.set() + + def test_connection_failure(self): + event = threading.Event() + + def request_handler(listener): + listener.close() + event.set() + + self._start_server(request_handler) + proxy_url = "socks5://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager(proxy_url) + + event.wait() + self.assertRaises( + NewConnectionError, pm.request, 'GET', 'http://example.com', + retries=False + ) + + def test_proxy_rejection(self): + evt = threading.Event() + + def request_handler(listener): + sock = listener.accept()[0] + + handler = handle_socks5_negotiation(sock, negotiate=False) + addr, port = next(handler) + handler.send(False) + + 
evt.wait() + sock.close() + + self._start_server(request_handler) + proxy_url = "socks5://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager(proxy_url) + + self.assertRaises( + NewConnectionError, pm.request, 'GET', 'http://example.com', + retries=False + ) + evt.set() + + def test_socks_with_password(self): + def request_handler(listener): + sock = listener.accept()[0] + + handler = handle_socks5_negotiation( + sock, negotiate=True, username=b'user', password=b'pass' + ) + addr, port = next(handler) + + self.assertEqual(addr, '16.17.18.19') + self.assertTrue(port, 80) + handler.send(True) + + while True: + buf = sock.recv(65535) + if buf.endswith(b'\r\n\r\n'): + break + + sock.sendall(b'HTTP/1.1 200 OK\r\n' + b'Server: SocksTestServer\r\n' + b'Content-Length: 0\r\n' + b'\r\n') + sock.close() + + self._start_server(request_handler) + proxy_url = "socks5://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager(proxy_url, username='user', + password='pass') + response = pm.request('GET', 'http://16.17.18.19') + + self.assertEqual(response.status, 200) + self.assertEqual(response.data, b'') + self.assertEqual(response.headers['Server'], 'SocksTestServer') + + def test_socks_with_invalid_password(self): + def request_handler(listener): + sock = listener.accept()[0] + + handler = handle_socks5_negotiation( + sock, negotiate=True, username=b'user', password=b'pass' + ) + next(handler) + + self._start_server(request_handler) + proxy_url = "socks5://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager(proxy_url, username='user', + password='badpass') + + try: + pm.request('GET', 'http://example.com', retries=False) + except NewConnectionError as e: + self.assertTrue("SOCKS5 authentication failed" in str(e)) + else: + self.fail("Did not raise") + + def test_source_address_works(self): + expected_port = _get_free_port(self.host) + + def request_handler(listener): + sock = listener.accept()[0] + self.assertEqual(sock.getpeername()[0], '127.0.0.1') + self.assertEqual(sock.getpeername()[1], expected_port) + + handler = handle_socks5_negotiation(sock, negotiate=False) + addr, port = next(handler) + + self.assertEqual(addr, '16.17.18.19') + self.assertTrue(port, 80) + handler.send(True) + + while True: + buf = sock.recv(65535) + if buf.endswith(b'\r\n\r\n'): + break + + sock.sendall(b'HTTP/1.1 200 OK\r\n' + b'Server: SocksTestServer\r\n' + b'Content-Length: 0\r\n' + b'\r\n') + sock.close() + + self._start_server(request_handler) + proxy_url = "socks5://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager( + proxy_url, source_address=('127.0.0.1', expected_port) + ) + response = pm.request('GET', 'http://16.17.18.19') + self.assertEqual(response.status, 200) + + +class TestSOCKS4Proxy(IPV4SocketDummyServerTestCase): + """ + Test the SOCKS proxy in SOCKS4 mode. + + Has relatively fewer tests than the SOCKS5 case, mostly because once the + negotiation is done the two cases behave identically. 
+ """ + def test_basic_request(self): + def request_handler(listener): + sock = listener.accept()[0] + + handler = handle_socks4_negotiation(sock) + addr, port = next(handler) + + self.assertEqual(addr, '16.17.18.19') + self.assertTrue(port, 80) + handler.send(True) + + while True: + buf = sock.recv(65535) + if buf.endswith(b'\r\n\r\n'): + break + + sock.sendall(b'HTTP/1.1 200 OK\r\n' + b'Server: SocksTestServer\r\n' + b'Content-Length: 0\r\n' + b'\r\n') + sock.close() + + self._start_server(request_handler) + proxy_url = "socks4://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager(proxy_url) + response = pm.request('GET', 'http://16.17.18.19') + + self.assertEqual(response.status, 200) + self.assertEqual(response.headers['Server'], 'SocksTestServer') + self.assertEqual(response.data, b'') + + def test_correct_header_line(self): + def request_handler(listener): + sock = listener.accept()[0] + + handler = handle_socks4_negotiation(sock) + addr, port = next(handler) + + self.assertEqual(addr, b'example.com') + self.assertTrue(port, 80) + handler.send(True) + + buf = b'' + while True: + buf += sock.recv(65535) + if buf.endswith(b'\r\n\r\n'): + break + + self.assertTrue(buf.startswith(b'GET / HTTP/1.1')) + self.assertTrue(b'Host: example.com' in buf) + + sock.sendall(b'HTTP/1.1 200 OK\r\n' + b'Server: SocksTestServer\r\n' + b'Content-Length: 0\r\n' + b'\r\n') + sock.close() + + self._start_server(request_handler) + proxy_url = "socks4://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager(proxy_url) + response = pm.request('GET', 'http://example.com') + self.assertEqual(response.status, 200) + + def test_proxy_rejection(self): + evt = threading.Event() + + def request_handler(listener): + sock = listener.accept()[0] + + handler = handle_socks4_negotiation(sock) + addr, port = next(handler) + handler.send(False) + + evt.wait() + sock.close() + + self._start_server(request_handler) + proxy_url = "socks4://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager(proxy_url) + + self.assertRaises( + NewConnectionError, pm.request, 'GET', 'http://example.com', + retries=False + ) + evt.set() + + def test_socks4_with_username(self): + def request_handler(listener): + sock = listener.accept()[0] + + handler = handle_socks4_negotiation(sock, username=b'user') + addr, port = next(handler) + + self.assertEqual(addr, '16.17.18.19') + self.assertTrue(port, 80) + handler.send(True) + + while True: + buf = sock.recv(65535) + if buf.endswith(b'\r\n\r\n'): + break + + sock.sendall(b'HTTP/1.1 200 OK\r\n' + b'Server: SocksTestServer\r\n' + b'Content-Length: 0\r\n' + b'\r\n') + sock.close() + + self._start_server(request_handler) + proxy_url = "socks4://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager(proxy_url, username='user') + response = pm.request('GET', 'http://16.17.18.19') + + self.assertEqual(response.status, 200) + self.assertEqual(response.data, b'') + self.assertEqual(response.headers['Server'], 'SocksTestServer') + + def test_socks_with_invalid_username(self): + def request_handler(listener): + sock = listener.accept()[0] + + handler = handle_socks4_negotiation(sock, username=b'user') + next(handler) + + self._start_server(request_handler) + proxy_url = "socks4://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager(proxy_url, username='baduser') + + try: + pm.request('GET', 'http://example.com', retries=False) + except NewConnectionError as e: + self.assertTrue("different user-ids" in str(e)) + else: + self.fail("Did not raise") + + +class 
TestSOCKSWithTLS(IPV4SocketDummyServerTestCase): + """ + Test that TLS behaves properly for SOCKS proxies. + """ + def test_basic_request(self): + if not HAS_SSL: + raise SkipTest("No TLS available") + + def request_handler(listener): + sock = listener.accept()[0] + + handler = handle_socks5_negotiation(sock, negotiate=False) + addr, port = next(handler) + + self.assertEqual(addr, b'localhost') + self.assertTrue(port, 443) + handler.send(True) + + # Wrap in TLS + context = better_ssl.SSLContext(ssl.PROTOCOL_SSLv23) + context.load_cert_chain( + DEFAULT_CERTS['certfile'], DEFAULT_CERTS['keyfile'] + ) + tls = context.wrap_socket(sock, server_side=True) + buf = b'' + + while True: + buf += tls.recv(65535) + if buf.endswith(b'\r\n\r\n'): + break + + self.assertTrue(buf.startswith(b'GET / HTTP/1.1\r\n')) + + tls.sendall(b'HTTP/1.1 200 OK\r\n' + b'Server: SocksTestServer\r\n' + b'Content-Length: 0\r\n' + b'\r\n') + tls.close() + + self._start_server(request_handler) + proxy_url = "socks5://%s:%s" % (self.host, self.port) + pm = socks.SOCKSProxyManager(proxy_url) + response = pm.request('GET', 'https://localhost') + + self.assertEqual(response.status, 200) + self.assertEqual(response.data, b'') + self.assertEqual(response.headers['Server'], 'SocksTestServer') diff -Nru python-urllib3-1.13.1/test/__init__.py python-urllib3-1.15.1/test/__init__.py --- python-urllib3-1.13.1/test/__init__.py 2015-09-06 18:40:19.000000000 +0000 +++ python-urllib3-1.15.1/test/__init__.py 2016-04-06 19:16:56.000000000 +0000 @@ -33,7 +33,6 @@ clear_warnings() warnings.simplefilter('ignore', HTTPWarning) - def onlyPy26OrOlder(test): """Skips this test unless you are on Python2.6.x or earlier.""" @@ -57,7 +56,7 @@ return wrapper def onlyPy279OrNewer(test): - """Skips this test unless you are onl Python 2.7.9 or later.""" + """Skips this test unless you are on Python 2.7.9 or later.""" @functools.wraps(test) def wrapper(*args, **kwargs): @@ -66,6 +65,17 @@ raise SkipTest(msg) return test(*args, **kwargs) return wrapper + +def onlyPy2(test): + """Skips this test unless you are on Python 2.x""" + + @functools.wraps(test) + def wrapper(*args, **kwargs): + msg = "{name} requires Python 2.x to run".format(name=test.__name__) + if six.PY3: + raise SkipTest(msg) + return test(*args, **kwargs) + return wrapper def onlyPy3(test): """Skips this test unless you are on Python3.x""" diff -Nru python-urllib3-1.13.1/test/test_connectionpool.py python-urllib3-1.15.1/test/test_connectionpool.py --- python-urllib3-1.13.1/test/test_connectionpool.py 2015-09-04 00:16:43.000000000 +0000 +++ python-urllib3-1.15.1/test/test_connectionpool.py 2016-04-06 19:16:56.000000000 +0000 @@ -4,6 +4,7 @@ connection_from_url, HTTPConnection, HTTPConnectionPool, + HTTPSConnectionPool, ) from urllib3.util.timeout import Timeout from urllib3.packages.ssl_match_hostname import CertificateError @@ -74,6 +75,54 @@ c = connection_from_url(b) self.assertFalse(c.is_same_host(a), "%s =? %s" % (b, a)) + def test_same_host_no_port(self): + # This test was introduced in #801 to deal with the fact that urllib3 + # never initializes ConnectionPool objects with port=None. 
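The lists that follow pin down default-port normalisation: a pool created without an explicit port has to treat a URL that spells out the scheme's default port (``:80`` or ``:443``) as the same host, which is what the #801 fix restores. A minimal sketch of that comparison, using the public ``parse_url`` helper and the real ``port_by_scheme`` map; ``looks_like_same_host`` is an invented name for illustration, not the pool's actual ``is_same_host`` method::

    from urllib3.connection import port_by_scheme
    from urllib3.util.url import parse_url

    def looks_like_same_host(pool_scheme, pool_host, pool_port, url):
        # Hypothetical helper mirroring the normalisation these tests assert.
        parsed = parse_url(url)
        if parsed.host is None:
            return True                      # relative URL such as '/'
        scheme = parsed.scheme or pool_scheme
        url_port = parsed.port or port_by_scheme.get(scheme)
        pool_port = pool_port or port_by_scheme.get(pool_scheme)
        return (scheme, parsed.host, url_port) == (pool_scheme, pool_host, pool_port)

    # looks_like_same_host('http', 'google.com', None, 'http://google.com:80/abracadabra')  -> True
    # looks_like_same_host('http', 'google.com', None, 'https://google.com/')               -> False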
+ same_host_http = [ + ('google.com', '/'), + ('google.com', 'http://google.com/'), + ('google.com', 'http://google.com'), + ('google.com', 'http://google.com/abra/cadabra'), + # Test comparison using default ports + ('google.com', 'http://google.com:80/abracadabra'), + ] + same_host_https = [ + ('google.com', '/'), + ('google.com', 'https://google.com/'), + ('google.com', 'https://google.com'), + ('google.com', 'https://google.com/abra/cadabra'), + # Test comparison using default ports + ('google.com', 'https://google.com:443/abracadabra'), + ] + + for a, b in same_host_http: + c = HTTPConnectionPool(a) + self.assertTrue(c.is_same_host(b), "%s =? %s" % (a, b)) + for a, b in same_host_https: + c = HTTPSConnectionPool(a) + self.assertTrue(c.is_same_host(b), "%s =? %s" % (a, b)) + + not_same_host_http = [ + ('google.com', 'https://google.com/'), + ('yahoo.com', 'http://google.com/'), + ('google.com', 'https://google.net/'), + ] + not_same_host_https = [ + ('google.com', 'http://google.com/'), + ('yahoo.com', 'https://google.com/'), + ('google.com', 'https://google.net/'), + ] + + for a, b in not_same_host_http: + c = HTTPConnectionPool(a) + self.assertFalse(c.is_same_host(b), "%s =? %s" % (a, b)) + c = HTTPConnectionPool(b) + self.assertFalse(c.is_same_host(a), "%s =? %s" % (b, a)) + for a, b in not_same_host_https: + c = HTTPSConnectionPool(a) + self.assertFalse(c.is_same_host(b), "%s =? %s" % (a, b)) + c = HTTPSConnectionPool(b) + self.assertFalse(c.is_same_host(a), "%s =? %s" % (b, a)) def test_max_connections(self): pool = HTTPConnectionPool(host='localhost', maxsize=1, block=True) @@ -234,6 +283,33 @@ conn = pool._get_conn() self.assertEqual(conn.cert_reqs, 'CERT_REQUIRED') + def test_cleanup_on_extreme_connection_error(self): + """ + This test validates that we clean up properly even on exceptions that + we'd not otherwise catch, i.e. those that inherit from BaseException + like KeyboardInterrupt or gevent.Timeout. See #805 for more details. + """ + class RealBad(BaseException): + pass + + def kaboom(*args, **kwargs): + raise RealBad() + + c = connection_from_url('http://localhost:80') + c._make_request = kaboom + + initial_pool_size = c.pool.qsize() + + try: + # We need to release_conn this way or we'd put it away regardless. + c.urlopen('GET', '/', release_conn=False) + except RealBad: + pass + + new_pool_size = c.pool.qsize() + self.assertEqual(initial_pool_size, new_pool_size) + + if __name__ == '__main__': unittest.main() diff -Nru python-urllib3-1.13.1/test/test_connection.py python-urllib3-1.15.1/test/test_connection.py --- python-urllib3-1.13.1/test/test_connection.py 1970-01-01 00:00:00.000000000 +0000 +++ python-urllib3-1.15.1/test/test_connection.py 2016-04-06 19:16:56.000000000 +0000 @@ -0,0 +1,48 @@ +import unittest + +import mock + +from urllib3.connection import ( + CertificateError, + VerifiedHTTPSConnection, + _match_hostname, +) + + +class TestConnection(unittest.TestCase): + """ + Tests in this suite should not make any network requests or connections. 
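The new ``_match_hostname`` wrapper exercised just below attaches the peer certificate to the ``CertificateError`` it re-raises, so calling code can inspect what the server actually presented. A hedged sketch using the same private helper the tests call; the certificate and hostname values are made up::

    from urllib3.connection import CertificateError, _match_hostname

    cert = {'subjectAltName': [('DNS', 'example.com')]}
    try:
        _match_hostname(cert, 'evil.invalid')
    except CertificateError as e:
        # The failing certificate is attached for post-mortem inspection.
        print(e._peer_cert is cert)   # True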
+ """ + def test_match_hostname_no_cert(self): + cert = None + asserted_hostname = 'foo' + self.assertRaises(ValueError, _match_hostname, cert, asserted_hostname) + + def test_match_hostname_empty_cert(self): + cert = {} + asserted_hostname = 'foo' + self.assertRaises(ValueError, _match_hostname, cert, asserted_hostname) + + def test_match_hostname_match(self): + cert = {'subjectAltName': [('DNS', 'foo')]} + asserted_hostname = 'foo' + _match_hostname(cert, asserted_hostname) + + def test_match_hostname_mismatch(self): + cert = {'subjectAltName': [('DNS', 'foo')]} + asserted_hostname = 'bar' + try: + with mock.patch('urllib3.connection.log.error') as mock_log: + _match_hostname(cert, asserted_hostname) + except CertificateError as e: + self.assertEqual(str(e), "hostname 'bar' doesn't match 'foo'") + mock_log.assert_called_once_with( + 'Certificate did not match expected hostname: %s. ' + 'Certificate: %s', + 'bar', {'subjectAltName': [('DNS', 'foo')]} + ) + self.assertEqual(e._peer_cert, cert) + + +if __name__ == '__main__': + unittest.main() diff -Nru python-urllib3-1.13.1/test/test_fields.py python-urllib3-1.15.1/test/test_fields.py --- python-urllib3-1.13.1/test/test_fields.py 2015-04-01 22:24:02.000000000 +0000 +++ python-urllib3-1.15.1/test/test_fields.py 2016-04-06 19:16:56.000000000 +0000 @@ -1,7 +1,8 @@ import unittest from urllib3.fields import guess_content_type, RequestField -from urllib3.packages.six import u +from urllib3.packages.six import u, PY3 +from . import onlyPy2 class TestRequestField(unittest.TestCase): @@ -47,3 +48,9 @@ field = RequestField('somename', 'data') param = field._render_part('filename', u('n\u00e4me')) self.assertEqual(param, "filename*=utf-8''n%C3%A4me") + + @onlyPy2 + def test_render_unicode_bytes_py2(self): + field = RequestField('somename', 'data') + param = field._render_part('filename', 'n\xc3\xa4me') + self.assertEqual(param, "filename*=utf-8''n%C3%A4me") diff -Nru python-urllib3-1.13.1/test/with_dummyserver/test_chunked_transfer.py python-urllib3-1.15.1/test/with_dummyserver/test_chunked_transfer.py --- python-urllib3-1.13.1/test/with_dummyserver/test_chunked_transfer.py 1970-01-01 00:00:00.000000000 +0000 +++ python-urllib3-1.15.1/test/with_dummyserver/test_chunked_transfer.py 2016-04-06 19:16:56.000000000 +0000 @@ -0,0 +1,76 @@ +# -*- coding: utf-8 -*- + +from urllib3 import HTTPConnectionPool +from urllib3.packages import six +from dummyserver.testcase import SocketDummyServerTestCase + + +class TestChunkedTransfer(SocketDummyServerTestCase): + def start_chunked_handler(self): + self.buffer = b'' + + def socket_handler(listener): + sock = listener.accept()[0] + + while not self.buffer.endswith(b'\r\n0\r\n\r\n'): + self.buffer += sock.recv(65536) + + sock.send( + b'HTTP/1.1 200 OK\r\n' + b'Content-type: text/plain\r\n' + b'Content-Length: 0\r\n' + b'\r\n') + sock.close() + + self._start_server(socket_handler) + + def test_chunks(self): + self.start_chunked_handler() + chunks = ['foo', 'bar', '', 'bazzzzzzzzzzzzzzzzzzzzzz'] + pool = HTTPConnectionPool(self.host, self.port, retries=False) + r = pool.urlopen('GET', '/', chunks, headers=dict(DNT='1'), chunked=True) + + self.assertTrue(b'Transfer-Encoding' in self.buffer) + body = self.buffer.split(b'\r\n\r\n', 1)[1] + lines = body.split(b'\r\n') + # Empty chunks should have been skipped, as this could not be distinguished + # from terminating the transmission + for i, chunk in enumerate([c for c in chunks if c]): + self.assertEqual(lines[i * 2], hex(len(chunk))[2:].encode('utf-8')) + 
self.assertEqual(lines[i * 2 + 1], chunk.encode('utf-8')) + + def _test_body(self, data): + self.start_chunked_handler() + pool = HTTPConnectionPool(self.host, self.port, retries=False) + r = pool.urlopen('GET', '/', data, chunked=True) + header, body = self.buffer.split(b'\r\n\r\n', 1) + + self.assertTrue(b'Transfer-Encoding: chunked' in header.split(b'\r\n')) + if data: + bdata = data if isinstance(data, six.binary_type) else data.encode('utf-8') + self.assertTrue(b'\r\n' + bdata + b'\r\n' in body) + self.assertTrue(body.endswith(b'\r\n0\r\n\r\n')) + + len_str = body.split(b'\r\n', 1)[0] + stated_len = int(len_str, 16) + self.assertEqual(stated_len, len(bdata)) + else: + self.assertEqual(body, b'0\r\n\r\n') + + def test_bytestring_body(self): + self._test_body(b'thisshouldbeonechunk\r\nasdf') + + def test_unicode_body(self): + # Define u'thisshouldbeonechunk\r\nรครถรผรŸ' in a way, so that python3.1 + # does not suffer a syntax error + chunk = b'thisshouldbeonechunk\r\n\xc3\xa4\xc3\xb6\xc3\xbc\xc3\x9f'.decode('utf-8') + self._test_body(chunk) + + def test_empty_body(self): + self._test_body(None) + + def test_empty_string_body(self): + self._test_body('') + + def test_empty_iterable_body(self): + self._test_body([]) diff -Nru python-urllib3-1.13.1/test/with_dummyserver/test_https.py python-urllib3-1.15.1/test/with_dummyserver/test_https.py --- python-urllib3-1.13.1/test/with_dummyserver/test_https.py 2015-12-18 22:47:24.000000000 +0000 +++ python-urllib3-1.15.1/test/with_dummyserver/test_https.py 2016-04-06 19:16:56.000000000 +0000 @@ -38,7 +38,7 @@ ) from urllib3.packages import six from urllib3.util.timeout import Timeout -from urllib3.util.ssl_ import HAS_SNI +import urllib3.util as util ResourceWarning = getattr( @@ -77,11 +77,11 @@ r = https_pool.request('GET', '/') self.assertEqual(r.status, 200) - if sys.version_info >= (2, 7, 9): + if sys.version_info >= (2, 7, 9) or util.IS_PYOPENSSL: self.assertFalse(warn.called, warn.call_args_list) else: self.assertTrue(warn.called) - if HAS_SNI: + if util.HAS_SNI: call = warn.call_args_list[0] else: call = warn.call_args_list[1] @@ -181,9 +181,9 @@ self.assertTrue(warn.called) calls = warn.call_args_list - if sys.version_info >= (2, 7, 9): + if sys.version_info >= (2, 7, 9) or util.IS_PYOPENSSL: category = calls[0][0][1] - elif HAS_SNI: + elif util.HAS_SNI: category = calls[1][0][1] else: category = calls[2][0][1] @@ -237,8 +237,9 @@ cert_reqs='CERT_REQUIRED', ca_certs=DEFAULT_CA) - https_pool.assert_fingerprint = 'CA:84:E1:AD0E5a:ef:2f:C3:09' \ - ':E7:30:F8:CD:C8:5B' + https_pool.assert_fingerprint = 'F2:06:5A:42:10:3F:45:1C:17:FE:E6:' \ + '07:1E:8A:86:E5' + https_pool.request('GET', '/') def test_assert_fingerprint_sha1(self): @@ -246,8 +247,8 @@ cert_reqs='CERT_REQUIRED', ca_certs=DEFAULT_CA) - https_pool.assert_fingerprint = 'CC:45:6A:90:82:F7FF:C0:8218:8e:' \ - '7A:F2:8A:D7:1E:07:33:67:DE' + https_pool.assert_fingerprint = '92:81:FE:85:F7:0C:26:60:EC:D6:B3:' \ + 'BF:93:CF:F9:71:CC:07:7D:0A' https_pool.request('GET', '/') def test_assert_fingerprint_sha256(self): @@ -255,9 +256,9 @@ cert_reqs='CERT_REQUIRED', ca_certs=DEFAULT_CA) - https_pool.assert_fingerprint = ('9A:29:9D:4F:47:85:1C:51:23:F5:9A:A3:' - '0F:5A:EF:96:F9:2E:3C:22:2E:FC:E8:BC:' - '0E:73:90:37:ED:3B:AA:AB') + https_pool.assert_fingerprint = ('C5:4D:0B:83:84:89:2E:AE:B4:58:BB:12:' + 'F7:A6:C4:76:05:03:88:D8:57:65:51:F3:' + '1E:60:B0:8B:70:18:64:E6') https_pool.request('GET', '/') def test_assert_invalid_fingerprint(self): @@ -294,8 +295,8 @@ cert_reqs='CERT_NONE', 
ca_certs=DEFAULT_CA_BAD) - https_pool.assert_fingerprint = 'CC:45:6A:90:82:F7FF:C0:8218:8e:' \ - '7A:F2:8A:D7:1E:07:33:67:DE' + https_pool.assert_fingerprint = '92:81:FE:85:F7:0C:26:60:EC:D6:B3:' \ + 'BF:93:CF:F9:71:CC:07:7D:0A' https_pool.request('GET', '/') def test_good_fingerprint_and_hostname_mismatch(self): @@ -303,8 +304,8 @@ cert_reqs='CERT_REQUIRED', ca_certs=DEFAULT_CA) - https_pool.assert_fingerprint = 'CC:45:6A:90:82:F7FF:C0:8218:8e:' \ - '7A:F2:8A:D7:1E:07:33:67:DE' + https_pool.assert_fingerprint = '92:81:FE:85:F7:0C:26:60:EC:D6:B3:' \ + 'BF:93:CF:F9:71:CC:07:7D:0A' https_pool.request('GET', '/') @requires_network @@ -325,8 +326,8 @@ timeout=timeout, retries=False, cert_reqs='CERT_REQUIRED') https_pool.ca_certs = DEFAULT_CA - https_pool.assert_fingerprint = 'CC:45:6A:90:82:F7FF:C0:8218:8e:' \ - '7A:F2:8A:D7:1E:07:33:67:DE' + https_pool.assert_fingerprint = '92:81:FE:85:F7:0C:26:60:EC:D6:B3:' \ + 'BF:93:CF:F9:71:CC:07:7D:0A' timeout = Timeout(total=None) https_pool = HTTPSConnectionPool(self.host, self.port, timeout=timeout, @@ -385,7 +386,7 @@ timeout=Timeout(total=None, connect=0.001)) def test_enhanced_ssl_connection(self): - fingerprint = 'CC:45:6A:90:82:F7FF:C0:8218:8e:7A:F2:8A:D7:1E:07:33:67:DE' + fingerprint = '92:81:FE:85:F7:0C:26:60:EC:D6:B3:BF:93:CF:F9:71:CC:07:7D:0A' conn = VerifiedHTTPSConnection(self.host, self.port) https_pool = HTTPSConnectionPool(self.host, self.port, diff -Nru python-urllib3-1.13.1/test/with_dummyserver/test_poolmanager.py python-urllib3-1.15.1/test/with_dummyserver/test_poolmanager.py --- python-urllib3-1.13.1/test/with_dummyserver/test_poolmanager.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/test/with_dummyserver/test_poolmanager.py 2016-04-06 19:16:56.000000000 +0000 @@ -109,6 +109,34 @@ self.assertEqual(r.status, 303) + def test_raise_on_status(self): + http = PoolManager() + + try: + # the default is to raise + r = http.request('GET', '%s/status' % self.base_url, + fields={'status': '500 Internal Server Error'}, + retries=Retry(total=1, status_forcelist=range(500, 600))) + self.fail("Failed to raise MaxRetryError exception, returned %r" % r.status) + except MaxRetryError: + pass + + try: + # raise explicitly + r = http.request('GET', '%s/status' % self.base_url, + fields={'status': '500 Internal Server Error'}, + retries=Retry(total=1, status_forcelist=range(500, 600), raise_on_status=True)) + self.fail("Failed to raise MaxRetryError exception, returned %r" % r.status) + except MaxRetryError: + pass + + # don't raise + r = http.request('GET', '%s/status' % self.base_url, + fields={'status': '500 Internal Server Error'}, + retries=Retry(total=1, status_forcelist=range(500, 600), raise_on_status=False)) + + self.assertEqual(r.status, 500) + def test_missing_port(self): # Can a URL that lacks an explicit port like ':80' succeed, or # will all such URLs fail with an error? 
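The three branches of ``test_raise_on_status`` above reduce to one knob on ``Retry``. A short usage sketch, assuming some endpoint that answers 500 (the URL is only a placeholder): with the default the exhausted retry raises ``MaxRetryError``, with ``raise_on_status=False`` the final 500 response is handed back::

    import urllib3
    from urllib3.exceptions import MaxRetryError
    from urllib3.util.retry import Retry

    http = urllib3.PoolManager()
    url = 'http://flaky.example/status/500'   # placeholder 500-returning endpoint

    try:
        http.request('GET', url,
                     retries=Retry(total=1, status_forcelist=range(500, 600)))
    except MaxRetryError:
        pass   # default behaviour: exhausting retries on a 5xx raises

    r = http.request('GET', url,
                     retries=Retry(total=1, status_forcelist=range(500, 600),
                                   raise_on_status=False))
    print(r.status)   # 500, returned to the caller instead of raising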
diff -Nru python-urllib3-1.13.1/test/with_dummyserver/test_socketlevel.py python-urllib3-1.15.1/test/with_dummyserver/test_socketlevel.py --- python-urllib3-1.13.1/test/with_dummyserver/test_socketlevel.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/test/with_dummyserver/test_socketlevel.py 2016-04-06 19:16:56.000000000 +0000 @@ -15,7 +15,7 @@ from urllib3.util.ssl_ import HAS_SNI from urllib3.util.timeout import Timeout from urllib3.util.retry import Retry -from urllib3._collections import HTTPHeaderDict +from urllib3._collections import HTTPHeaderDict, OrderedDict from dummyserver.testcase import SocketDummyServerTestCase from dummyserver.server import ( @@ -81,7 +81,7 @@ except SSLError: # We are violating the protocol pass done_receiving.wait() - self.assertTrue(self.host.encode() in self.buf, + self.assertTrue(self.host.encode('ascii') in self.buf, "missing hostname in SSL handshake") @@ -239,6 +239,33 @@ finally: timed_out.set() + def test_delayed_body_read_timeout_with_preload(self): + timed_out = Event() + + def socket_handler(listener): + sock = listener.accept()[0] + buf = b'' + body = 'Hi' + while not buf.endswith(b'\r\n\r\n'): + buf += sock.recv(65536) + sock.send(('HTTP/1.1 200 OK\r\n' + 'Content-Type: text/plain\r\n' + 'Content-Length: %d\r\n' + '\r\n' % len(body)).encode('utf-8')) + + timed_out.wait(5) + sock.close() + + self._start_server(socket_handler) + pool = HTTPConnectionPool(self.host, self.port) + + try: + self.assertRaises(ReadTimeoutError, pool.urlopen, + 'GET', '/', retries=False, + timeout=Timeout(connect=1, read=0.001)) + finally: + timed_out.set() + def test_incomplete_response(self): body = 'Response' partial_body = body[:2] @@ -433,6 +460,53 @@ timeout=Timeout(connect=1, read=0.1)) self.assertEqual(len(response.read()), 8) + def test_closing_response_actually_closes_connection(self): + done_closing = Event() + complete = Event() + # The insane use of this variable here is to get around the fact that + # Python 2.6 does not support returning a value from Event.wait(). This + # means we can't tell if an event timed out, so we can't use the timing + # out of the 'complete' event to determine the success or failure of + # the test. Python 2 also doesn't have the nonlocal statement, so we + # can't write directly to this variable, only mutate it. Hence: list. + successful = [] + + def socket_handler(listener): + sock = listener.accept()[0] + + buf = b'' + while not buf.endswith(b'\r\n\r\n'): + buf = sock.recv(65536) + + sock.send(('HTTP/1.1 200 OK\r\n' + 'Content-Type: text/plain\r\n' + 'Content-Length: 0\r\n' + '\r\n').encode('utf-8')) + + # Wait for the socket to close. + done_closing.wait(timeout=1) + + # Look for the empty string to show that the connection got closed. + # Don't get stuck in a timeout. 
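The delayed-body test above hinges on the connect/read split in ``Timeout``: the server accepts the connection promptly but never sends a body, so only the read budget is exceeded. A hedged usage sketch; host and port are placeholders for such a stalling server::

    from urllib3 import HTTPConnectionPool
    from urllib3.exceptions import ReadTimeoutError
    from urllib3.util.timeout import Timeout

    pool = HTTPConnectionPool('localhost', 8080, retries=False)
    try:
        # Generous connect budget, tiny read budget: a stalled body trips
        # only the read timeout, which preload_content must also honour.
        pool.urlopen('GET', '/', timeout=Timeout(connect=1.0, read=0.001))
    except ReadTimeoutError:
        print('read phase timed out; the connect phase was fine')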
+ sock.settimeout(1) + new_data = sock.recv(65536) + self.assertFalse(new_data) + successful.append(True) + sock.close() + complete.set() + + self._start_server(socket_handler) + pool = HTTPConnectionPool(self.host, self.port) + + response = pool.request('GET', '/', retries=0, preload_content=False) + self.assertEqual(response.status, 200) + response.close() + + done_closing.set() # wait until the socket in our pool gets closed + complete.wait(timeout=1) + if not successful: + self.fail("Timed out waiting for connection close") + class TestProxyManager(SocketDummyServerTestCase): @@ -751,9 +825,8 @@ for header in headers_list: (key, value) = header.split(b': ') - parsed_headers[key.decode()] = value.decode() + parsed_headers[key.decode('ascii')] = value.decode('ascii') - # Send incomplete message (note Content-Length) sock.send(( 'HTTP/1.1 204 No Content\r\n' 'Content-Length: 0\r\n' @@ -769,6 +842,76 @@ pool = HTTPConnectionPool(self.host, self.port, retries=False) pool.request('GET', '/', headers=HTTPHeaderDict(headers)) self.assertEqual(expected_headers, parsed_headers) + + def test_request_headers_are_sent_in_the_original_order(self): + # NOTE: Probability this test gives a false negative is 1/(K!) + K = 16 + # NOTE: Provide headers in non-sorted order (i.e. reversed) + # so that if the internal implementation tries to sort them, + # a change will be detected. + expected_request_headers = [(u'X-Header-%d' % i, str(i)) for i in reversed(range(K))] + + actual_request_headers = [] + + def socket_handler(listener): + sock = listener.accept()[0] + + buf = b'' + while not buf.endswith(b'\r\n\r\n'): + buf += sock.recv(65536) + + headers_list = [header for header in buf.split(b'\r\n')[1:] if header] + + for header in headers_list: + (key, value) = header.split(b': ') + if not key.decode('ascii').startswith(u'X-Header-'): + continue + actual_request_headers.append((key.decode('ascii'), value.decode('ascii'))) + + sock.send(( + u'HTTP/1.1 204 No Content\r\n' + u'Content-Length: 0\r\n' + u'\r\n').encode('ascii')) + + sock.close() + + self._start_server(socket_handler) + + pool = HTTPConnectionPool(self.host, self.port, retries=False) + pool.request('GET', '/', headers=OrderedDict(expected_request_headers)) + self.assertEqual(expected_request_headers, actual_request_headers) + + def test_response_headers_are_returned_in_the_original_order(self): + # NOTE: Probability this test gives a false negative is 1/(K!) + K = 16 + # NOTE: Provide headers in non-sorted order (i.e. reversed) + # so that if the internal implementation tries to sort them, + # a change will be detected. 
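Both ordering tests lean on ``HTTPHeaderDict`` now keeping insertion order (its backing store becomes an ``OrderedDict`` in the ``_collections.py`` hunk further down). A small sketch of the observable behaviour; the header names are arbitrary::

    from urllib3._collections import HTTPHeaderDict

    h = HTTPHeaderDict()
    for i in reversed(range(5)):       # deliberately non-sorted insertion order
        h['X-Header-%d' % i] = str(i)

    print(list(h.keys()))
    # ['X-Header-4', 'X-Header-3', 'X-Header-2', 'X-Header-1', 'X-Header-0']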
+ expected_response_headers = [('X-Header-%d' % i, str(i)) for i in reversed(range(K))] + + def socket_handler(listener): + sock = listener.accept()[0] + + buf = b'' + while not buf.endswith(b'\r\n\r\n'): + buf += sock.recv(65536) + + sock.send(b'HTTP/1.1 200 OK\r\n' + + b'\r\n'.join([ + (k.encode('utf8') + b': ' + v.encode('utf8')) + for (k, v) in expected_response_headers + ]) + + b'\r\n') + sock.close() + + self._start_server(socket_handler) + pool = HTTPConnectionPool(self.host, self.port) + r = pool.request('GET', '/', retries=0) + actual_response_headers = [ + (k, v) for (k, v) in r.headers.items() + if k.startswith('X-Header-') + ] + self.assertEqual(expected_response_headers, actual_response_headers) class TestBrokenHeaders(SocketDummyServerTestCase): diff -Nru python-urllib3-1.13.1/urllib3/_collections.py python-urllib3-1.15.1/urllib3/_collections.py --- python-urllib3-1.13.1/urllib3/_collections.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/_collections.py 2016-04-06 19:16:56.000000000 +0000 @@ -134,7 +134,7 @@ def __init__(self, headers=None, **kwargs): super(HTTPHeaderDict, self).__init__() - self._container = {} + self._container = OrderedDict() if headers is not None: if isinstance(headers, HTTPHeaderDict): self._copy_from(headers) diff -Nru python-urllib3-1.13.1/urllib3/connectionpool.py python-urllib3-1.15.1/urllib3/connectionpool.py --- python-urllib3-1.13.1/urllib3/connectionpool.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/connectionpool.py 2016-04-06 19:16:56.000000000 +0000 @@ -69,7 +69,13 @@ if not host: raise LocationValueError("No host specified.") - self.host = host + # httplib doesn't like it when we include brackets in ipv6 addresses + # Specifically, if we include brackets but also pass the port then + # httplib crazily doubles up the square brackets on the Host header. + # Instead, we need to make sure we never pass ``None`` as the port. + # However, for backward compatibility reasons we can't actually + # *assert* that. + self.host = host.strip('[]') self.port = port def __str__(self): @@ -203,8 +209,8 @@ Return a fresh :class:`HTTPConnection`. """ self.num_connections += 1 - log.info("Starting new HTTP connection (%d): %s" % - (self.num_connections, self.host)) + log.info("Starting new HTTP connection (%d): %s", + self.num_connections, self.host) conn = self.ConnectionCls(host=self.host, port=self.port, timeout=self.timeout.connect_timeout, @@ -239,7 +245,7 @@ # If this is a persistent connection, check if it got disconnected if conn and is_connection_dropped(conn): - log.info("Resetting dropped connection: %s" % self.host) + log.info("Resetting dropped connection: %s", self.host) conn.close() if getattr(conn, 'auto_open', 1) == 0: # This is a proxied connection that has been mutated by @@ -272,7 +278,7 @@ except Full: # This should never happen if self.block == True log.warning( - "Connection pool is full, discarding connection: %s" % + "Connection pool is full, discarding connection: %s", self.host) # Connection never got put back into the pool, close it. @@ -318,7 +324,7 @@ if 'timed out' in str(err) or 'did not complete (read)' in str(err): # Python 2.6 raise ReadTimeoutError(self, url, "Read timed out. 
(read timeout=%s)" % timeout_value) - def _make_request(self, conn, method, url, timeout=_Default, + def _make_request(self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw): """ Perform a request on a given urllib connection object taken from our @@ -350,7 +356,10 @@ # conn.request() calls httplib.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. - conn.request(method, url, **httplib_request_kw) + if chunked: + conn.request_chunked(method, url, **httplib_request_kw) + else: + conn.request(method, url, **httplib_request_kw) # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout @@ -382,9 +391,8 @@ # AppEngine doesn't have a version attr. http_version = getattr(conn, '_http_vsn_str', 'HTTP/?') - log.debug("\"%s %s %s\" %s %s" % (method, url, http_version, - httplib_response.status, - httplib_response.length)) + log.debug("\"%s %s %s\" %s %s", method, url, http_version, + httplib_response.status, httplib_response.length) try: assert_header_parsing(httplib_response.msg) @@ -435,7 +443,8 @@ def urlopen(self, method, url, body=None, headers=None, retries=None, redirect=True, assert_same_host=True, timeout=_Default, - pool_timeout=None, release_conn=None, **response_kw): + pool_timeout=None, release_conn=None, chunked=False, + **response_kw): """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all @@ -512,6 +521,11 @@ back into the pool. If None, it takes the value of ``response_kw.get('preload_content', True)``. + :param chunked: + If True, urllib3 will send the body using chunked transfer + encoding. Otherwise, urllib3 will send the body using the standard + content-length form. Defaults to False. + :param \**response_kw: Additional parameters are passed to :meth:`urllib3.response.HTTPResponse.from_httplib` @@ -542,6 +556,10 @@ # complains about UnboundLocalError. err = None + # Keep track of whether we cleanly exited the except block. This + # ensures we do proper cleanup in finally. + clean_exit = False + try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) @@ -556,13 +574,14 @@ # Make the request on the httplib connection object. httplib_response = self._make_request(conn, method, url, timeout=timeout_obj, - body=body, headers=headers) + body=body, headers=headers, + chunked=chunked) # If we're going to release the connection in ``finally:``, then - # the request doesn't need to know about the connection. Otherwise + # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. - response_conn = not release_conn and conn + response_conn = conn if not release_conn else None # Import httplib's response into our own wrapper object response = HTTPResponse.from_httplib(httplib_response, @@ -570,10 +589,8 @@ connection=response_conn, **response_kw) - # else: - # The connection will be put back into the pool when - # ``response.release_conn()`` is called (implicitly by - # ``response.read()``) + # Everything went great! + clean_exit = True except Empty: # Timed out by queue. @@ -583,22 +600,19 @@ # Close the connection. If a connection is reused on which there # was a Certificate error, the next request will certainly raise # another Certificate error. 
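A usage sketch of the ``chunked=True`` parameter documented above, matching the framing the dummyserver tests check (a hex length line per chunk, a ``0\r\n\r\n`` terminator, empty chunks skipped); the pool address and path are placeholders::

    from urllib3 import HTTPConnectionPool

    pool = HTTPConnectionPool('localhost', 8080, retries=False)   # placeholder

    def generate_chunks():
        yield b'foo'
        yield b'bar'
        yield b''          # empty chunks are skipped, never sent as a terminator
        yield b'bazzzzzz'

    # Each non-empty chunk is framed as its hex length, CRLF, the data, CRLF,
    # and the body ends with 0\r\n\r\n; Transfer-Encoding: chunked is added
    # automatically unless the caller already set it.
    r = pool.urlopen('POST', '/upload', body=generate_chunks(), chunked=True)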
- conn = conn and conn.close() - release_conn = True + clean_exit = False raise SSLError(e) except SSLError: # Treat SSLError separately from BaseSSLError to preserve # traceback. - conn = conn and conn.close() - release_conn = True + clean_exit = False raise except (TimeoutError, HTTPException, SocketError, ProtocolError) as e: # Discard the connection for these exceptions. It will be # be replaced during the next _get_conn() call. - conn = conn and conn.close() - release_conn = True + clean_exit = False if isinstance(e, (SocketError, NewConnectionError)) and self.proxy: e = ProxyError('Cannot connect to proxy.', e) @@ -613,6 +627,14 @@ err = e finally: + if not clean_exit: + # We hit some kind of exception, handled or otherwise. We need + # to throw the connection away unless explicitly told not to. + # Close the connection, set the variable to None, and make sure + # we put the None back in the pool to avoid leaking it. + conn = conn and conn.close() + release_conn = True + if release_conn: # Put the connection back to be reused. If the connection is # expired then it will be None, which will get replaced with a @@ -622,7 +644,7 @@ if not conn: # Try again log.warning("Retrying (%r) after connection " - "broken by '%r': %s" % (retries, err, url)) + "broken by '%r': %s", retries, err, url) return self.urlopen(method, url, body, headers, retries, redirect, assert_same_host, timeout=timeout, pool_timeout=pool_timeout, @@ -644,7 +666,7 @@ raise return response - log.info("Redirecting %s -> %s" % (url, redirect_location)) + log.info("Redirecting %s -> %s", url, redirect_location) return self.urlopen( method, redirect_location, body, headers, retries=retries, redirect=redirect, @@ -654,9 +676,17 @@ # Check if we should retry the HTTP response. if retries.is_forced_retry(method, status_code=response.status): - retries = retries.increment(method, url, response=response, _pool=self) + try: + retries = retries.increment(method, url, response=response, _pool=self) + except MaxRetryError: + if retries.raise_on_status: + # Release the connection for this response, since we're not + # returning it to be released manually. + response.release_conn() + raise + return response retries.sleep() - log.info("Forced retry: %s" % url) + log.info("Forced retry: %s", url) return self.urlopen( method, url, body, headers, retries=retries, redirect=redirect, @@ -742,7 +772,7 @@ except AttributeError: # Platform-specific: Python 2.6 set_tunnel = conn._set_tunnel - if sys.version_info <= (2, 6, 4) and not self.proxy_headers: # Python 2.6.4 and older + if sys.version_info <= (2, 6, 4) and not self.proxy_headers: # Python 2.6.4 and older set_tunnel(self.host, self.port) else: set_tunnel(self.host, self.port, self.proxy_headers) @@ -754,8 +784,8 @@ Return a fresh :class:`httplib.HTTPSConnection`. 
""" self.num_connections += 1 - log.info("Starting new HTTPS connection (%d): %s" - % (self.num_connections, self.host)) + log.info("Starting new HTTPS connection (%d): %s", + self.num_connections, self.host) if not self.ConnectionCls or self.ConnectionCls is DummyConnection: raise SSLError("Can't connect to HTTPS URL because the SSL " @@ -812,6 +842,7 @@ >>> r = conn.request('GET', '/') """ scheme, host, port = get_host(url) + port = port or port_by_scheme.get(scheme, 80) if scheme == 'https': return HTTPSConnectionPool(host, port=port, **kw) else: diff -Nru python-urllib3-1.13.1/urllib3/connection.py python-urllib3-1.15.1/urllib3/connection.py --- python-urllib3-1.13.1/urllib3/connection.py 2015-12-18 22:47:24.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/connection.py 2016-04-06 19:16:56.000000000 +0000 @@ -1,5 +1,6 @@ from __future__ import absolute_import import datetime +import logging import os import sys import socket @@ -38,7 +39,7 @@ SubjectAltNameWarning, SystemTimeWarning, ) -from .packages.ssl_match_hostname import match_hostname +from .packages.ssl_match_hostname import match_hostname, CertificateError from .util.ssl_ import ( resolve_cert_reqs, @@ -50,6 +51,10 @@ from .util import connection +from ._collections import HTTPHeaderDict + +log = logging.getLogger(__name__) + port_by_scheme = { 'http': 80, 'https': 443, @@ -162,6 +167,38 @@ conn = self._new_conn() self._prepare_conn(conn) + def request_chunked(self, method, url, body=None, headers=None): + """ + Alternative to the common request method, which sends the + body with chunked encoding and not as one block + """ + headers = HTTPHeaderDict(headers if headers is not None else {}) + skip_accept_encoding = 'accept-encoding' in headers + self.putrequest(method, url, skip_accept_encoding=skip_accept_encoding) + for header, value in headers.items(): + self.putheader(header, value) + if 'transfer-encoding' not in headers: + self.putheader('Transfer-Encoding', 'chunked') + self.endheaders() + + if body is not None: + stringish_types = six.string_types + (six.binary_type,) + if isinstance(body, stringish_types): + body = (body,) + for chunk in body: + if not chunk: + continue + if not isinstance(chunk, six.binary_type): + chunk = chunk.encode('utf8') + len_str = hex(len(chunk))[2:] + self.send(len_str.encode('utf-8')) + self.send(b'\r\n') + self.send(chunk) + self.send(b'\r\n') + + # After the if clause, to always have a closed body + self.send(b'0\r\n\r\n') + class HTTPSConnection(HTTPConnection): default_port = port_by_scheme['https'] @@ -265,21 +302,26 @@ 'for details.)'.format(hostname)), SubjectAltNameWarning ) - - # In case the hostname is an IPv6 address, strip the square - # brackets from it before using it to validate. This is because - # a certificate with an IPv6 address in it won't have square - # brackets around that address. Sadly, match_hostname won't do this - # for us: it expects the plain host part without any extra work - # that might have been done to make it palatable to httplib. - asserted_hostname = self.assert_hostname or hostname - asserted_hostname = asserted_hostname.strip('[]') - match_hostname(cert, asserted_hostname) + _match_hostname(cert, self.assert_hostname or hostname) self.is_verified = (resolved_cert_reqs == ssl.CERT_REQUIRED or self.assert_fingerprint is not None) +def _match_hostname(cert, asserted_hostname): + try: + match_hostname(cert, asserted_hostname) + except CertificateError as e: + log.error( + 'Certificate did not match expected hostname: %s. 
' + 'Certificate: %s', asserted_hostname, cert + ) + # Add cert to exception and reraise so client code can inspect + # the cert when catching the exception, if they want to + e._peer_cert = cert + raise + + if ssl: # Make a copy for testing. UnverifiedHTTPSConnection = HTTPSConnection diff -Nru python-urllib3-1.13.1/urllib3/contrib/appengine.py python-urllib3-1.15.1/urllib3/contrib/appengine.py --- python-urllib3-1.13.1/urllib3/contrib/appengine.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/contrib/appengine.py 2015-12-29 20:28:18.000000000 +0000 @@ -144,7 +144,7 @@ if retries.is_forced_retry(method, status_code=http_response.status): retries = retries.increment( method, url, response=http_response, _pool=self) - log.info("Forced retry: %s" % url) + log.info("Forced retry: %s", url) retries.sleep() return self.urlopen( method, url, @@ -164,6 +164,14 @@ if content_encoding == 'deflate': del urlfetch_resp.headers['content-encoding'] + transfer_encoding = urlfetch_resp.headers.get('transfer-encoding') + # We have a full response's content, + # so let's make sure we don't report ourselves as chunked data. + if transfer_encoding == 'chunked': + encodings = transfer_encoding.split(",") + encodings.remove('chunked') + urlfetch_resp.headers['transfer-encoding'] = ','.join(encodings) + return HTTPResponse( # In order for decoding to work, we must present the content as # a file-like object. @@ -177,7 +185,7 @@ if timeout is Timeout.DEFAULT_TIMEOUT: return 5 # 5s is the default timeout for URLFetch. if isinstance(timeout, Timeout): - if timeout.read is not timeout.connect: + if timeout._read is not timeout._connect: warnings.warn( "URLFetch does not support granular timeout settings, " "reverting to total timeout.", AppEnginePlatformWarning) diff -Nru python-urllib3-1.13.1/urllib3/contrib/ntlmpool.py python-urllib3-1.15.1/urllib3/contrib/ntlmpool.py --- python-urllib3-1.13.1/urllib3/contrib/ntlmpool.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/contrib/ntlmpool.py 2015-12-29 20:28:18.000000000 +0000 @@ -43,8 +43,8 @@ # Performs the NTLM handshake that secures the connection. The socket # must be kept open while requests are performed. self.num_connections += 1 - log.debug('Starting NTLM HTTPS connection no. %d: https://%s%s' % - (self.num_connections, self.host, self.authurl)) + log.debug('Starting NTLM HTTPS connection no. 
%d: https://%s%s', + self.num_connections, self.host, self.authurl) headers = {} headers['Connection'] = 'Keep-Alive' @@ -56,13 +56,13 @@ # Send negotiation message headers[req_header] = ( 'NTLM %s' % ntlm.create_NTLM_NEGOTIATE_MESSAGE(self.rawuser)) - log.debug('Request headers: %s' % headers) + log.debug('Request headers: %s', headers) conn.request('GET', self.authurl, None, headers) res = conn.getresponse() reshdr = dict(res.getheaders()) - log.debug('Response status: %s %s' % (res.status, res.reason)) - log.debug('Response headers: %s' % reshdr) - log.debug('Response data: %s [...]' % res.read(100)) + log.debug('Response status: %s %s', res.status, res.reason) + log.debug('Response headers: %s', reshdr) + log.debug('Response data: %s [...]', res.read(100)) # Remove the reference to the socket, so that it can not be closed by # the response object (we want to keep the socket open) @@ -87,12 +87,12 @@ self.pw, NegotiateFlags) headers[req_header] = 'NTLM %s' % auth_msg - log.debug('Request headers: %s' % headers) + log.debug('Request headers: %s', headers) conn.request('GET', self.authurl, None, headers) res = conn.getresponse() - log.debug('Response status: %s %s' % (res.status, res.reason)) - log.debug('Response headers: %s' % dict(res.getheaders())) - log.debug('Response data: %s [...]' % res.read()[:100]) + log.debug('Response status: %s %s', res.status, res.reason) + log.debug('Response headers: %s', dict(res.getheaders())) + log.debug('Response data: %s [...]', res.read()[:100]) if res.status != 200: if res.status == 401: raise Exception('Server rejected request: wrong ' diff -Nru python-urllib3-1.13.1/urllib3/contrib/pyopenssl.py python-urllib3-1.15.1/urllib3/contrib/pyopenssl.py --- python-urllib3-1.13.1/urllib3/contrib/pyopenssl.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/contrib/pyopenssl.py 2016-04-06 19:16:56.000000000 +0000 @@ -54,9 +54,17 @@ import OpenSSL.SSL from pyasn1.codec.der import decoder as der_decoder from pyasn1.type import univ, constraint -from socket import _fileobject, timeout, error as SocketError +from socket import timeout, error as SocketError + +try: # Platform-specific: Python 2 + from socket import _fileobject +except ImportError: # Platform-specific: Python 3 + _fileobject = None + from urllib3.packages.backports.makefile import backport_makefile + import ssl import select +import six from .. import connection from .. import util @@ -90,7 +98,7 @@ OpenSSL.SSL.VERIFY_PEER + OpenSSL.SSL.VERIFY_FAIL_IF_NO_PEER_CERT, } -DEFAULT_SSL_CIPHER_LIST = util.ssl_.DEFAULT_CIPHERS +DEFAULT_SSL_CIPHER_LIST = util.ssl_.DEFAULT_CIPHERS.encode('ascii') # OpenSSL will only write 16K at a time SSL_WRITE_BLOCKSIZE = 16384 @@ -104,6 +112,7 @@ connection.ssl_wrap_socket = ssl_wrap_socket util.HAS_SNI = HAS_SNI + util.IS_PYOPENSSL = True def extract_from_urllib3(): @@ -111,6 +120,7 @@ connection.ssl_wrap_socket = orig_connection_ssl_wrap_socket util.HAS_SNI = orig_util_HAS_SNI + util.IS_PYOPENSSL = False # Note: This is a slightly bug-fixed version of same from ndg-httpsclient. 
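The ``IS_PYOPENSSL`` flag toggled above is how the rest of the library (and the HTTPS tests) can tell that the PyOpenSSL backend is active. A hedged sketch of switching the backend on and off; it assumes the optional dependencies from the ``secure`` extra (pyOpenSSL, ndg-httpsclient, pyasn1) are installed and uses a placeholder URL::

    import urllib3
    import urllib3.contrib.pyopenssl

    urllib3.contrib.pyopenssl.inject_into_urllib3()
    print(urllib3.util.IS_PYOPENSSL)    # True while the backend is injected

    http = urllib3.PoolManager()
    r = http.request('GET', 'https://example.com/')   # placeholder HTTPS URL
    print(r.status)

    urllib3.contrib.pyopenssl.extract_from_urllib3()   # back to stdlib ssl
    print(urllib3.util.IS_PYOPENSSL)    # False again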
@@ -135,7 +145,7 @@ for i in range(peer_cert.get_extension_count()): ext = peer_cert.get_extension(i) ext_name = ext.get_short_name() - if ext_name != 'subjectAltName': + if ext_name != b'subjectAltName': continue # PyOpenSSL returns extension data in ASN.1 encoded form @@ -167,13 +177,17 @@ self.socket = socket self.suppress_ragged_eofs = suppress_ragged_eofs self._makefile_refs = 0 + self._closed = False def fileno(self): return self.socket.fileno() - def makefile(self, mode, bufsize=-1): - self._makefile_refs += 1 - return _fileobject(self, mode, bufsize, close=True) + # Copy-pasted from Python 3.5 source code + def _decref_socketios(self): + if self._makefile_refs > 0: + self._makefile_refs -= 1 + if self._closed: + self.close() def recv(self, *args, **kwargs): try: @@ -182,7 +196,7 @@ if self.suppress_ragged_eofs and e.args == (-1, 'Unexpected EOF'): return b'' else: - raise SocketError(e) + raise SocketError(str(e)) except OpenSSL.SSL.ZeroReturnError as e: if self.connection.get_shutdown() == OpenSSL.SSL.RECEIVED_SHUTDOWN: return b'' @@ -198,6 +212,27 @@ else: return data + def recv_into(self, *args, **kwargs): + try: + return self.connection.recv_into(*args, **kwargs) + except OpenSSL.SSL.SysCallError as e: + if self.suppress_ragged_eofs and e.args == (-1, 'Unexpected EOF'): + return 0 + else: + raise SocketError(str(e)) + except OpenSSL.SSL.ZeroReturnError as e: + if self.connection.get_shutdown() == OpenSSL.SSL.RECEIVED_SHUTDOWN: + return 0 + else: + raise + except OpenSSL.SSL.WantReadError: + rd, wd, ed = select.select( + [self.socket], [], [], self.socket.gettimeout()) + if not rd: + raise timeout('The read operation timed out') + else: + return self.recv_into(*args, **kwargs) + def settimeout(self, timeout): return self.socket.settimeout(timeout) @@ -225,6 +260,7 @@ def close(self): if self._makefile_refs < 1: try: + self._closed = True return self.connection.close() except OpenSSL.SSL.Error: return @@ -262,6 +298,16 @@ self._makefile_refs -= 1 +if _fileobject: # Platform-specific: Python 2 + def makefile(self, mode, bufsize=-1): + self._makefile_refs += 1 + return _fileobject(self, mode, bufsize, close=True) +else: # Platform-specific: Python 3 + makefile = backport_makefile + +WrappedSocket.makefile = makefile + + def _verify_callback(cnx, x509, err_no, err_depth, return_code): return err_no == 0 @@ -285,7 +331,7 @@ else: ctx.set_default_verify_paths() - # Disable TLS compression to migitate CRIME attack (issue #309) + # Disable TLS compression to mitigate CRIME attack (issue #309) OP_NO_COMPRESSION = 0x20000 ctx.set_options(OP_NO_COMPRESSION) @@ -293,6 +339,8 @@ ctx.set_cipher_list(DEFAULT_SSL_CIPHER_LIST) cnx = OpenSSL.SSL.Connection(ctx, sock) + if isinstance(server_hostname, six.text_type): # Platform-specific: Python 3 + server_hostname = server_hostname.encode('utf-8') cnx.set_tlsext_host_name(server_hostname) cnx.set_connect_state() while True: diff -Nru python-urllib3-1.13.1/urllib3/contrib/socks.py python-urllib3-1.15.1/urllib3/contrib/socks.py --- python-urllib3-1.13.1/urllib3/contrib/socks.py 1970-01-01 00:00:00.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/contrib/socks.py 2016-04-06 19:16:56.000000000 +0000 @@ -0,0 +1,172 @@ +# -*- coding: utf-8 -*- +""" +SOCKS support for urllib3 +~~~~~~~~~~~~~~~~~~~~~~~~~ + +This contrib module contains provisional support for SOCKS proxies from within +urllib3. This module supports SOCKS4 (specifically the SOCKS4A variant) and +SOCKS5. 
To enable its functionality, either install PySocks or install this +module with the ``socks`` extra. + +Known Limitations: + +- Currently PySocks does not support contacting remote websites via literal + IPv6 addresses. Any such connection attempt will fail. +- Currently PySocks does not support IPv6 connections to the SOCKS proxy. Any + such connection attempt will fail. +""" +from __future__ import absolute_import + +try: + import socks +except ImportError: + import warnings + from ..exceptions import DependencyWarning + + warnings.warn(( + 'SOCKS support in urllib3 requires the installation of optional ' + 'dependencies: specifically, PySocks. For more information, see ' + 'https://urllib3.readthedocs.org/en/latest/contrib.html#socks-proxies' + ), + DependencyWarning + ) + raise + +from socket import error as SocketError, timeout as SocketTimeout + +from ..connection import ( + HTTPConnection, HTTPSConnection +) +from ..connectionpool import ( + HTTPConnectionPool, HTTPSConnectionPool +) +from ..exceptions import ConnectTimeoutError, NewConnectionError +from ..poolmanager import PoolManager +from ..util.url import parse_url + +try: + import ssl +except ImportError: + ssl = None + + +class SOCKSConnection(HTTPConnection): + """ + A plain-text HTTP connection that connects via a SOCKS proxy. + """ + def __init__(self, *args, **kwargs): + self._socks_options = kwargs.pop('_socks_options') + super(SOCKSConnection, self).__init__(*args, **kwargs) + + def _new_conn(self): + """ + Establish a new connection via the SOCKS proxy. + """ + extra_kw = {} + if self.source_address: + extra_kw['source_address'] = self.source_address + + if self.socket_options: + extra_kw['socket_options'] = self.socket_options + + try: + conn = socks.create_connection( + (self.host, self.port), + proxy_type=self._socks_options['socks_version'], + proxy_addr=self._socks_options['proxy_host'], + proxy_port=self._socks_options['proxy_port'], + proxy_username=self._socks_options['username'], + proxy_password=self._socks_options['password'], + timeout=self.timeout, + **extra_kw + ) + + except SocketTimeout as e: + raise ConnectTimeoutError( + self, "Connection to %s timed out. (connect timeout=%s)" % + (self.host, self.timeout)) + + except socks.ProxyError as e: + # This is fragile as hell, but it seems to be the only way to raise + # useful errors here. + if e.socket_err: + error = e.socket_err + if isinstance(error, SocketTimeout): + raise ConnectTimeoutError( + self, + "Connection to %s timed out. (connect timeout=%s)" % + (self.host, self.timeout) + ) + else: + raise NewConnectionError( + self, + "Failed to establish a new connection: %s" % error + ) + else: + raise NewConnectionError( + self, + "Failed to establish a new connection: %s" % e + ) + + except SocketError as e: # Defensive: PySocks should catch all these. + raise NewConnectionError( + self, "Failed to establish a new connection: %s" % e) + + return conn + + +# We don't need to duplicate the Verified/Unverified distinction from +# urllib3/connection.py here because the HTTPSConnection will already have been +# correctly set to either the Verified or Unverified form by that module. This +# means the SOCKSHTTPSConnection will automatically be the correct type. 
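Before the remaining plumbing classes, a short usage sketch of this contrib module (installable via the new ``socks`` extra in ``setup.py``); the proxy address and target URL are placeholders::

    from urllib3.contrib.socks import SOCKSProxyManager

    proxy = SOCKSProxyManager('socks5://localhost:1080',   # placeholder proxy
                              username='user', password='pass')
    r = proxy.request('GET', 'http://example.com/')
    print(r.status)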
+class SOCKSHTTPSConnection(SOCKSConnection, HTTPSConnection): + pass + + +class SOCKSHTTPConnectionPool(HTTPConnectionPool): + ConnectionCls = SOCKSConnection + + +class SOCKSHTTPSConnectionPool(HTTPSConnectionPool): + ConnectionCls = SOCKSHTTPSConnection + + +class SOCKSProxyManager(PoolManager): + """ + A version of the urllib3 ProxyManager that routes connections via the + defined SOCKS proxy. + """ + pool_classes_by_scheme = { + 'http': SOCKSHTTPConnectionPool, + 'https': SOCKSHTTPSConnectionPool, + } + + def __init__(self, proxy_url, username=None, password=None, + num_pools=10, headers=None, **connection_pool_kw): + parsed = parse_url(proxy_url) + + if parsed.scheme == 'socks5': + socks_version = socks.PROXY_TYPE_SOCKS5 + elif parsed.scheme == 'socks4': + socks_version = socks.PROXY_TYPE_SOCKS4 + else: + raise ValueError( + "Unable to determine SOCKS version from %s" % proxy_url + ) + + self.proxy_url = proxy_url + + socks_options = { + 'socks_version': socks_version, + 'proxy_host': parsed.host, + 'proxy_port': parsed.port, + 'username': username, + 'password': password, + } + connection_pool_kw['_socks_options'] = socks_options + + super(SOCKSProxyManager, self).__init__( + num_pools, headers, **connection_pool_kw + ) + + self.pool_classes_by_scheme = SOCKSProxyManager.pool_classes_by_scheme diff -Nru python-urllib3-1.13.1/urllib3/exceptions.py python-urllib3-1.15.1/urllib3/exceptions.py --- python-urllib3-1.13.1/urllib3/exceptions.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/exceptions.py 2015-12-29 20:28:18.000000000 +0000 @@ -180,6 +180,14 @@ pass +class DependencyWarning(HTTPWarning): + """ + Warned when an attempt is made to import a module with missing optional + dependencies. + """ + pass + + class ResponseNotChunked(ProtocolError, ValueError): "Response needs to be chunked in order to read it as chunks." pass diff -Nru python-urllib3-1.13.1/urllib3/fields.py python-urllib3-1.15.1/urllib3/fields.py --- python-urllib3-1.13.1/urllib3/fields.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/fields.py 2016-04-06 19:16:56.000000000 +0000 @@ -36,11 +36,11 @@ result = '%s="%s"' % (name, value) try: result.encode('ascii') - except UnicodeEncodeError: + except (UnicodeEncodeError, UnicodeDecodeError): pass else: return result - if not six.PY3: # Python 2: + if not six.PY3 and isinstance(value, six.text_type): # Python 2: value = value.encode('utf-8') value = email.utils.encode_rfc2231(value, 'utf-8') value = '%s*=%s' % (name, value) diff -Nru python-urllib3-1.13.1/urllib3/__init__.py python-urllib3-1.15.1/urllib3/__init__.py --- python-urllib3-1.13.1/urllib3/__init__.py 2015-12-18 22:47:24.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/__init__.py 2016-04-11 17:23:23.000000000 +0000 @@ -32,7 +32,7 @@ __author__ = 'Andrey Petrov (andrey.petrov@shazow.net)' __license__ = 'MIT' -__version__ = '1.13.1' +__version__ = '1.15.1' __all__ = ( 'HTTPConnectionPool', @@ -68,22 +68,25 @@ handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s')) logger.addHandler(handler) logger.setLevel(level) - logger.debug('Added a stderr logging handler to logger: %s' % __name__) + logger.debug('Added a stderr logging handler to logger: %s', __name__) return handler # ... Clean up. del NullHandler +# All warning filters *must* be appended unless you're really certain that they +# shouldn't be: otherwise, it's very hard for users to use most Python +# mechanisms to silence them. # SecurityWarning's always go off by default. 
warnings.simplefilter('always', exceptions.SecurityWarning, append=True) # SubjectAltNameWarning's should go off once per host -warnings.simplefilter('default', exceptions.SubjectAltNameWarning) +warnings.simplefilter('default', exceptions.SubjectAltNameWarning, append=True) # InsecurePlatformWarning's don't vary between requests, so we keep it default. warnings.simplefilter('default', exceptions.InsecurePlatformWarning, append=True) # SNIMissingWarnings should go off only once. -warnings.simplefilter('default', exceptions.SNIMissingWarning) +warnings.simplefilter('default', exceptions.SNIMissingWarning, append=True) def disable_warnings(category=exceptions.HTTPWarning): diff -Nru python-urllib3-1.13.1/urllib3/packages/backports/makefile.py python-urllib3-1.15.1/urllib3/packages/backports/makefile.py --- python-urllib3-1.13.1/urllib3/packages/backports/makefile.py 1970-01-01 00:00:00.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/packages/backports/makefile.py 2016-04-06 19:16:56.000000000 +0000 @@ -0,0 +1,53 @@ +# -*- coding: utf-8 -*- +""" +backports.makefile +~~~~~~~~~~~~~~~~~~ + +Backports the Python 3 ``socket.makefile`` method for use with anything that +wants to create a "fake" socket object. +""" +import io + +from socket import SocketIO + + +def backport_makefile(self, mode="r", buffering=None, encoding=None, + errors=None, newline=None): + """ + Backport of ``socket.makefile`` from Python 3.5. + """ + if not set(mode) <= set(["r", "w", "b"]): + raise ValueError( + "invalid mode %r (only r, w, b allowed)" % (mode,) + ) + writing = "w" in mode + reading = "r" in mode or not writing + assert reading or writing + binary = "b" in mode + rawmode = "" + if reading: + rawmode += "r" + if writing: + rawmode += "w" + raw = SocketIO(self, rawmode) + self._makefile_refs += 1 + if buffering is None: + buffering = -1 + if buffering < 0: + buffering = io.DEFAULT_BUFFER_SIZE + if buffering == 0: + if not binary: + raise ValueError("unbuffered streams must be binary") + return raw + if reading and writing: + buffer = io.BufferedRWPair(raw, raw, buffering) + elif reading: + buffer = io.BufferedReader(raw, buffering) + else: + assert writing + buffer = io.BufferedWriter(raw, buffering) + if binary: + return buffer + text = io.TextIOWrapper(buffer, encoding, errors, newline) + text.mode = mode + return text diff -Nru python-urllib3-1.13.1/urllib3/poolmanager.py python-urllib3-1.15.1/urllib3/poolmanager.py --- python-urllib3-1.13.1/urllib3/poolmanager.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/poolmanager.py 2015-12-29 20:28:18.000000000 +0000 @@ -18,16 +18,16 @@ __all__ = ['PoolManager', 'ProxyManager', 'proxy_from_url'] -pool_classes_by_scheme = { - 'http': HTTPConnectionPool, - 'https': HTTPSConnectionPool, -} - log = logging.getLogger(__name__) SSL_KEYWORDS = ('key_file', 'cert_file', 'cert_reqs', 'ca_certs', 'ssl_version', 'ca_cert_dir') +pool_classes_by_scheme = { + 'http': HTTPConnectionPool, + 'https': HTTPSConnectionPool, +} + class PoolManager(RequestMethods): """ @@ -65,6 +65,9 @@ self.pools = RecentlyUsedContainer(num_pools, dispose_func=lambda p: p.close()) + # Locally set the pool classes so other PoolManagers can override them. + self.pool_classes_by_scheme = pool_classes_by_scheme + def __enter__(self): return self @@ -81,7 +84,7 @@ by :meth:`connection_from_url` and companion methods. It is intended to be overridden for customization. 
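Editor's note: ``backport_makefile`` above gives socket-like wrapper objects (such as the PyOpenSSL contrib's wrapped socket) a Python 3 style ``makefile``. A rough, Python-3-only sketch of bolting it onto a hypothetical wrapper class; ``SocketWrapper`` and the ``socketpair`` plumbing are illustrative only:

    import socket

    from urllib3.packages.backports.makefile import backport_makefile

    class SocketWrapper(object):
        """Hypothetical fake-socket wrapper that wants a real makefile()."""
        def __init__(self, sock):
            self._sock = sock
            self._makefile_refs = 0      # the backport increments this counter

        def __getattr__(self, name):     # delegate recv_into(), send(), etc.
            return getattr(self._sock, name)

        makefile = backport_makefile

    a, b = socket.socketpair()
    fileobj = SocketWrapper(a).makefile('rwb')
    fileobj.write(b'ping')
    fileobj.flush()
    print(b.recv(4))                     # b'ping'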
""" - pool_cls = pool_classes_by_scheme[scheme] + pool_cls = self.pool_classes_by_scheme[scheme] kwargs = self.connection_pool_kw if scheme == 'http': kwargs = self.connection_pool_kw.copy() @@ -186,7 +189,7 @@ kw['retries'] = retries kw['redirect'] = redirect - log.info("Redirecting %s -> %s" % (url, redirect_location)) + log.info("Redirecting %s -> %s", url, redirect_location) return self.urlopen(method, redirect_location, **kw) diff -Nru python-urllib3-1.13.1/urllib3/response.py python-urllib3-1.15.1/urllib3/response.py --- python-urllib3-1.13.1/urllib3/response.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/response.py 2016-04-06 19:16:56.000000000 +0000 @@ -221,6 +221,8 @@ On exit, release the connection back to the pool. """ + clean_exit = False + try: try: yield @@ -243,20 +245,27 @@ # This includes IncompleteRead. raise ProtocolError('Connection broken: %r' % e, e) - except Exception: - # The response may not be closed but we're not going to use it anymore - # so close it now to ensure that the connection is released back to the pool. - if self._original_response and not self._original_response.isclosed(): - self._original_response.close() - - # Closing the response may not actually be sufficient to close - # everything, so if we have a hold of the connection close that - # too. - if self._connection is not None: - self._connection.close() - - raise + # If no exception is thrown, we should avoid cleaning up + # unnecessarily. + clean_exit = True finally: + # If we didn't terminate cleanly, we need to throw away our + # connection. + if not clean_exit: + # The response may not be closed but we're not going to use it + # anymore so close it now to ensure that the connection is + # released back to the pool. + if self._original_response: + self._original_response.close() + + # Closing the response may not actually be sufficient to close + # everything, so if we have a hold of the connection close that + # too. + if self._connection: + self._connection.close() + + # If we hold the original response but it's closed now, we should + # return the connection back to the pool. if self._original_response and self._original_response.isclosed(): self.release_conn() @@ -387,6 +396,9 @@ if not self.closed: self._fp.close() + if self._connection: + self._connection.close() + @property def closed(self): if self._fp is None: diff -Nru python-urllib3-1.13.1/urllib3/util/__init__.py python-urllib3-1.15.1/urllib3/util/__init__.py --- python-urllib3-1.13.1/urllib3/util/__init__.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/util/__init__.py 2016-04-06 19:16:56.000000000 +0000 @@ -6,6 +6,7 @@ from .ssl_ import ( SSLContext, HAS_SNI, + IS_PYOPENSSL, assert_fingerprint, resolve_cert_reqs, resolve_ssl_version, @@ -26,6 +27,7 @@ __all__ = ( 'HAS_SNI', + 'IS_PYOPENSSL', 'SSLContext', 'Retry', 'Timeout', diff -Nru python-urllib3-1.13.1/urllib3/util/response.py python-urllib3-1.15.1/urllib3/util/response.py --- python-urllib3-1.13.1/urllib3/util/response.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/util/response.py 2015-12-29 20:28:18.000000000 +0000 @@ -61,7 +61,7 @@ def is_response_to_head(response): """ - Checks, wether a the request of a response has been a HEAD-request. + Checks whether the request of a response has been a HEAD-request. Handles the quirks of AppEngine. 
:param conn: diff -Nru python-urllib3-1.13.1/urllib3/util/retry.py python-urllib3-1.15.1/urllib3/util/retry.py --- python-urllib3-1.13.1/urllib3/util/retry.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/util/retry.py 2016-04-06 19:16:56.000000000 +0000 @@ -102,6 +102,11 @@ :param bool raise_on_redirect: Whether, if the number of redirects is exhausted, to raise a MaxRetryError, or to return a response with a response code in the 3xx range. + + :param bool raise_on_status: Similar meaning to ``raise_on_redirect``: + whether we should raise an exception, or return a response, + if status falls in ``status_forcelist`` range and retries have + been exhausted. """ DEFAULT_METHOD_WHITELIST = frozenset([ @@ -112,7 +117,8 @@ def __init__(self, total=10, connect=None, read=None, redirect=None, method_whitelist=DEFAULT_METHOD_WHITELIST, status_forcelist=None, - backoff_factor=0, raise_on_redirect=True, _observed_errors=0): + backoff_factor=0, raise_on_redirect=True, raise_on_status=True, + _observed_errors=0): self.total = total self.connect = connect @@ -127,6 +133,7 @@ self.method_whitelist = method_whitelist self.backoff_factor = backoff_factor self.raise_on_redirect = raise_on_redirect + self.raise_on_status = raise_on_status self._observed_errors = _observed_errors # TODO: use .history instead? def new(self, **kw): @@ -137,6 +144,7 @@ status_forcelist=self.status_forcelist, backoff_factor=self.backoff_factor, raise_on_redirect=self.raise_on_redirect, + raise_on_status=self.raise_on_status, _observed_errors=self._observed_errors, ) params.update(kw) @@ -153,7 +161,7 @@ redirect = bool(redirect) and None new_retries = cls(retries, redirect=redirect) - log.debug("Converted retries value: %r -> %r" % (retries, new_retries)) + log.debug("Converted retries value: %r -> %r", retries, new_retries) return new_retries def get_backoff_time(self): @@ -272,7 +280,7 @@ if new_retry.is_exhausted(): raise MaxRetryError(_pool, url, error or ResponseError(cause)) - log.debug("Incremented Retry for (url='%s'): %r" % (url, new_retry)) + log.debug("Incremented Retry for (url='%s'): %r", url, new_retry) return new_retry diff -Nru python-urllib3-1.13.1/urllib3/util/ssl_.py python-urllib3-1.15.1/urllib3/util/ssl_.py --- python-urllib3-1.13.1/urllib3/util/ssl_.py 2015-12-14 21:06:26.000000000 +0000 +++ python-urllib3-1.15.1/urllib3/util/ssl_.py 2016-04-06 19:16:56.000000000 +0000 @@ -12,6 +12,7 @@ SSLContext = None HAS_SNI = False create_default_context = None +IS_PYOPENSSL = False # Maps the length of a digest to a possible hash function producing this digest HASHFUNC_MAP = { @@ -110,11 +111,12 @@ ) self.ciphers = cipher_suite - def wrap_socket(self, socket, server_hostname=None): + def wrap_socket(self, socket, server_hostname=None, server_side=False): warnings.warn( 'A true SSLContext object is not available. This prevents ' 'urllib3 from configuring SSL appropriately and may cause ' - 'certain SSL connections to fail. For more information, see ' + 'certain SSL connections to fail. You can upgrade to a newer ' + 'version of Python to solve this. 
For more information, see ' 'https://urllib3.readthedocs.org/en/latest/security.html' '#insecureplatformwarning.', InsecurePlatformWarning @@ -125,6 +127,7 @@ 'ca_certs': self.ca_certs, 'cert_reqs': self.verify_mode, 'ssl_version': self.protocol, + 'server_side': server_side, } if self.supports_set_ciphers: # Platform-specific: Python 2.7+ return wrap_socket(socket, ciphers=self.ciphers, **kwargs) @@ -308,8 +311,8 @@ 'An HTTPS request has been made, but the SNI (Subject Name ' 'Indication) extension to TLS is not available on this platform. ' 'This may cause the server to present an incorrect TLS ' - 'certificate, which can cause validation failures. For more ' - 'information, see ' + 'certificate, which can cause validation failures. You can upgrade to ' + 'a newer version of Python to solve this. For more information, see ' 'https://urllib3.readthedocs.org/en/latest/security.html' '#snimissingwarning.', SNIMissingWarning diff -Nru python-urllib3-1.13.1/urllib3.egg-info/pbr.json python-urllib3-1.15.1/urllib3.egg-info/pbr.json --- python-urllib3-1.13.1/urllib3.egg-info/pbr.json 2015-12-18 22:47:28.000000000 +0000 +++ python-urllib3-1.15.1/urllib3.egg-info/pbr.json 2015-12-29 20:28:24.000000000 +0000 @@ -1 +1 @@ -{"is_release": false, "git_version": "12d04b7"} \ No newline at end of file +{"is_release": false, "git_version": "27df29b"} \ No newline at end of file diff -Nru python-urllib3-1.13.1/urllib3.egg-info/PKG-INFO python-urllib3-1.15.1/urllib3.egg-info/PKG-INFO --- python-urllib3-1.13.1/urllib3.egg-info/PKG-INFO 2015-12-18 22:47:28.000000000 +0000 +++ python-urllib3-1.15.1/urllib3.egg-info/PKG-INFO 2016-04-11 17:26:03.000000000 +0000 @@ -1,6 +1,6 @@ Metadata-Version: 1.1 Name: urllib3 -Version: 1.13.1 +Version: 1.15.1 Summary: HTTP library with thread-safe connection pooling, file post, and more. Home-page: http://urllib3.readthedocs.org/ Author: Andrey Petrov @@ -26,9 +26,10 @@ - File posting (``encode_multipart_formdata``). - Built-in redirection and retries (optional). - Supports gzip and deflate decoding. + - Proxy over HTTP or SOCKS. - Thread-safe and sanity-safe. - Works with AppEngine, gevent, and eventlib. - - Tested on Python 2.6+, Python 3.2+, and PyPy, with 100% unit test coverage. + - Tested on Python 2.6+, Python 3.3+, and PyPy, with 100% unit test coverage. - Small and easy to understand codebase perfect for extending and building upon. For a more comprehensive solution, have a look at `Requests `_ which is also powered by ``urllib3``. @@ -134,6 +135,21 @@ Contributing ============ + Thank you for giving back to urllib3. Please meet our jolly team + of code-sherpas: + + Maintainers + ----------- + + - `@lukasa `_ (Cory Benfield) + - `@sigmavirus24 `_ (Ian Cordasco) + - `@shazow `_ (Andrey Petrov) + + ๐Ÿ‘‹ + + Getting Started + --------------- + #. `Check for open issues `_ or open a fresh issue to start a discussion around a feature idea or a bug. There is a *Contributor Friendly* tag for issues that should be ideal for people who @@ -156,14 +172,53 @@ Changes ======= + 1.15.1 (2016-04-11) + ------------------- + + * Fix packaging to include backports module. (Issue #841) + + + 1.15 (2016-04-06) + ----------------- + + * Added Retry(raise_on_status=False). (Issue #720) + + * Always use setuptools, no more distutils fallback. (Issue #785) + + * Dropped support for Python 3.2. (Issue #786) + + * Chunked transfer encoding when requesting with ``chunked=True``. + (Issue #790) + + * Fixed regression with IPv6 port parsing. 
(Issue #801) + + * Append SNIMissingWarning messages to allow users to specify it in + the PYTHONWARNINGS environment variable. (Issue #816) + + * Handle unicode headers in Py2. (Issue #818) + + * Log certificate when there is a hostname mismatch. (Issue #820) + + * Preserve order of request/response headers. (Issue #821) + + + 1.14 (2015-12-29) + ----------------- + + * contrib: SOCKS proxy support! (Issue #762) + + * Fixed AppEngine handling of transfer-encoding header and bug + in Timeout defaults checking. (Issue #763) + + 1.13.1 (2015-12-18) - +++++++++++++++++++ + ------------------- * Fixed regression in IPv6 + SSL for match_hostname. (Issue #761) 1.13 (2015-12-14) - +++++++++++++++++ + ----------------- * Fixed ``pip install urllib3[secure]`` on modern pip. (Issue #706) @@ -180,7 +235,7 @@ 1.12 (2015-09-03) - +++++++++++++++++ + ----------------- * Rely on ``six`` for importing ``httplib`` to work around conflicts with other Python 3 shims. (Issue #688) @@ -193,7 +248,7 @@ 1.11 (2015-07-21) - +++++++++++++++++ + ----------------- * When ``ca_certs`` is given, ``cert_reqs`` defaults to ``'CERT_REQUIRED'``. (Issue #650) @@ -238,7 +293,7 @@ (Issue #674) 1.10.4 (2015-05-03) - +++++++++++++++++++ + ------------------- * Migrate tests to Tornado 4. (Issue #594) @@ -254,7 +309,7 @@ 1.10.3 (2015-04-21) - +++++++++++++++++++ + ------------------- * Emit ``InsecurePlatformWarning`` when SSLContext object is missing. (Issue #558) @@ -275,7 +330,7 @@ 1.10.2 (2015-02-25) - +++++++++++++++++++ + ------------------- * Fix file descriptor leakage on retries. (Issue #548) @@ -287,7 +342,7 @@ 1.10.1 (2015-02-10) - +++++++++++++++++++ + ------------------- * Pools can be used as context managers. (Issue #545) @@ -301,7 +356,7 @@ 1.10 (2014-12-14) - +++++++++++++++++ + ----------------- * Disabled SSLv3. (Issue #473) @@ -333,7 +388,7 @@ 1.9.1 (2014-09-13) - ++++++++++++++++++ + ------------------ * Apply socket arguments before binding. (Issue #427) @@ -354,7 +409,7 @@ 1.9 (2014-07-04) - ++++++++++++++++ + ---------------- * Shuffled around development-related files. If you're maintaining a distro package of urllib3, you may need to tweak things. (Issue #415) @@ -391,7 +446,7 @@ 1.8.3 (2014-06-23) - ++++++++++++++++++ + ------------------ * Fix TLS verification when using a proxy in Python 3.4.1. (Issue #385) @@ -413,13 +468,13 @@ 1.8.2 (2014-04-17) - ++++++++++++++++++ + ------------------ * Fix ``urllib3.util`` not being included in the package. 1.8.1 (2014-04-17) - ++++++++++++++++++ + ------------------ * Fix AppEngine bug of HTTPS requests going out as HTTP. (Issue #356) @@ -430,7 +485,7 @@ 1.8 (2014-03-04) - ++++++++++++++++ + ---------------- * Improved url parsing in ``urllib3.util.parse_url`` (properly parse '@' in username, and blank ports like 'hostname:'). @@ -482,7 +537,7 @@ 1.7.1 (2013-09-25) - ++++++++++++++++++ + ------------------ * Added granular timeout support with new ``urllib3.util.Timeout`` class. (Issue #231) @@ -491,7 +546,7 @@ 1.7 (2013-08-14) - ++++++++++++++++ + ---------------- * More exceptions are now pickle-able, with tests. (Issue #174) @@ -530,7 +585,7 @@ 1.6 (2013-04-25) - ++++++++++++++++ + ---------------- * Contrib: Optional SNI support for Py2 using PyOpenSSL. (Issue #156) @@ -590,7 +645,7 @@ 1.5 (2012-08-02) - ++++++++++++++++ + ---------------- * Added ``urllib3.add_stderr_logger()`` for quickly enabling STDERR debug logging in urllib3. @@ -615,7 +670,7 @@ 1.4 (2012-06-16) - ++++++++++++++++ + ---------------- * Minor AppEngine-related fixes. 
@@ -627,7 +682,7 @@ 1.3 (2012-03-25) - ++++++++++++++++ + ---------------- * Removed pre-1.0 deprecated API. @@ -646,13 +701,13 @@ 1.2.2 (2012-02-06) - ++++++++++++++++++ + ------------------ * Fixed packaging bug of not shipping ``test-requirements.txt``. (Issue #47) 1.2.1 (2012-02-05) - ++++++++++++++++++ + ------------------ * Fixed another bug related to when ``ssl`` module is not available. (Issue #41) @@ -661,7 +716,7 @@ 1.2 (2012-01-29) - ++++++++++++++++ + ---------------- * Added Python 3 support (tested on 3.2.2) @@ -687,7 +742,7 @@ 1.1 (2012-01-07) - ++++++++++++++++ + ---------------- * Refactored ``dummyserver`` to its own root namespace module (used for testing). @@ -704,7 +759,7 @@ 1.0.2 (2011-11-04) - ++++++++++++++++++ + ------------------ * Fixed typo in ``VerifiedHTTPSConnection`` which would only present as a bug if you're using the object manually. (Thanks pyos) @@ -717,14 +772,14 @@ 1.0.1 (2011-10-10) - ++++++++++++++++++ + ------------------ * Fixed a bug where the same connection would get returned into the pool twice, causing extraneous "HttpConnectionPool is full" log warnings. 1.0 (2011-10-08) - ++++++++++++++++ + ---------------- * Added ``PoolManager`` with LRU expiration of connections (tested and documented). @@ -747,13 +802,13 @@ 0.4.1 (2011-07-17) - ++++++++++++++++++ + ------------------ * Minor bug fixes, code cleanup. 0.4 (2011-03-01) - ++++++++++++++++ + ---------------- * Better unicode support. * Added ``VerifiedHTTPSConnection``. @@ -762,13 +817,13 @@ 0.3.1 (2010-07-13) - ++++++++++++++++++ + ------------------ * Added ``assert_host_name`` optional parameter. Now compatible with proxies. 0.3 (2009-12-10) - ++++++++++++++++ + ---------------- * Added HTTPS support. * Minor bug fixes. @@ -777,14 +832,14 @@ 0.2 (2008-11-17) - ++++++++++++++++ + ---------------- * Added unit tests. * Bug fixes. 0.1 (2008-11-16) - ++++++++++++++++ + ---------------- * First release. 
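Editor's note: for the ``raise_on_status`` flag added to ``urllib3/util/retry.py`` earlier in this patch, a brief sketch (the URL and status codes are placeholders): with ``raise_on_status=False``, exhausting status-based retries returns the final response instead of raising ``MaxRetryError``.

    import urllib3
    from urllib3.util.retry import Retry

    http = urllib3.PoolManager()
    retries = Retry(total=3, status_forcelist=[500, 502, 503],
                    raise_on_status=False)

    # If the server keeps answering 5xx, this returns the last response
    # rather than raising MaxRetryError once the retry budget is spent.
    resp = http.request('GET', 'http://example.org/flaky', retries=retries)
    print(resp.status)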
diff -Nru python-urllib3-1.13.1/urllib3.egg-info/requires.txt python-urllib3-1.15.1/urllib3.egg-info/requires.txt --- python-urllib3-1.13.1/urllib3.egg-info/requires.txt 2015-12-18 22:47:28.000000000 +0000 +++ python-urllib3-1.15.1/urllib3.egg-info/requires.txt 2016-04-11 17:26:03.000000000 +0000 @@ -4,3 +4,6 @@ ndg-httpsclient pyasn1 certifi + +[socks] +PySocks>=1.5.6,<2.0 diff -Nru python-urllib3-1.13.1/urllib3.egg-info/SOURCES.txt python-urllib3-1.15.1/urllib3.egg-info/SOURCES.txt --- python-urllib3-1.13.1/urllib3.egg-info/SOURCES.txt 2015-12-18 22:47:28.000000000 +0000 +++ python-urllib3-1.15.1/urllib3.egg-info/SOURCES.txt 2016-04-11 17:26:04.000000000 +0000 @@ -19,6 +19,7 @@ docs/make.bat docs/managers.rst docs/pools.rst +docs/recipes.rst docs/security.rst dummyserver/__init__.py dummyserver/handlers.py @@ -49,6 +50,7 @@ test/port_helpers.py test/test_collections.py test/test_compatibility.py +test/test_connection.py test/test_connectionpool.py test/test_exceptions.py test/test_fields.py @@ -67,7 +69,9 @@ test/contrib/__init__.py test/contrib/test_gae_manager.py test/contrib/test_pyopenssl.py +test/contrib/test_socks.py test/with_dummyserver/__init__.py +test/with_dummyserver/test_chunked_transfer.py test/with_dummyserver/test_connectionpool.py test/with_dummyserver/test_https.py test/with_dummyserver/test_no_ssl.py @@ -94,9 +98,12 @@ urllib3/contrib/appengine.py urllib3/contrib/ntlmpool.py urllib3/contrib/pyopenssl.py +urllib3/contrib/socks.py urllib3/packages/__init__.py urllib3/packages/ordered_dict.py urllib3/packages/six.py +urllib3/packages/backports/__init__.py +urllib3/packages/backports/makefile.py urllib3/packages/ssl_match_hostname/__init__.py urllib3/packages/ssl_match_hostname/_implementation.py urllib3/util/__init__.py
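Editor's note: the new test files listed here (``test/contrib/test_socks.py``, ``test/with_dummyserver/test_chunked_transfer.py``) track two features arriving in this upgrade: the ``[socks]`` extra declared in ``requires.txt`` (installable via ``pip install urllib3[socks]``) and chunked request bodies. A sketch of the latter, assuming a server at a placeholder URL that accepts chunked POSTs; ``body_chunks`` is an illustrative name:

    import urllib3

    http = urllib3.PoolManager()

    def body_chunks():
        yield b'first chunk, '
        yield b'second chunk'

    # chunked=True sends the body with Transfer-Encoding: chunked, so its
    # total length does not need to be known up front.
    resp = http.urlopen('POST', 'http://example.org/upload',
                        body=body_chunks(), chunked=True)
    print(resp.status)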