python-proxy-headers 0.1.0.tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- python_proxy_headers-0.1.0/LICENSE +21 -0
- python_proxy_headers-0.1.0/PKG-INFO +151 -0
- python_proxy_headers-0.1.0/README.md +133 -0
- python_proxy_headers-0.1.0/pyproject.toml +26 -0
- python_proxy_headers-0.1.0/python_proxy_headers/__init__.py +0 -0
- python_proxy_headers-0.1.0/python_proxy_headers/aiohttp_proxy.py +159 -0
- python_proxy_headers-0.1.0/python_proxy_headers/httpx_proxy.py +376 -0
- python_proxy_headers-0.1.0/python_proxy_headers/requests_adapter.py +73 -0
- python_proxy_headers-0.1.0/python_proxy_headers/urllib3_proxy_manager.py +135 -0
- python_proxy_headers-0.1.0/python_proxy_headers.egg-info/PKG-INFO +151 -0
- python_proxy_headers-0.1.0/python_proxy_headers.egg-info/SOURCES.txt +12 -0
- python_proxy_headers-0.1.0/python_proxy_headers.egg-info/dependency_links.txt +1 -0
- python_proxy_headers-0.1.0/python_proxy_headers.egg-info/top_level.txt +1 -0
- python_proxy_headers-0.1.0/setup.cfg +4 -0
python_proxy_headers-0.1.0/LICENSE
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2025 ProxyMesh

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
python_proxy_headers-0.1.0/PKG-INFO
@@ -0,0 +1,151 @@
Metadata-Version: 2.2
Name: python-proxy-headers
Version: 0.1.0
Summary: Handle custom proxy headers for http requests in various python libraries
Author-email: ProxyMesh <support@proxymesh.com>
Project-URL: Homepage, https://github.com/proxymesh/python-proxy-headers
Project-URL: Changelog, https://github.com/proxymesh/python-proxy-headers/commits/main/
Project-URL: Issues, https://github.com/proxymesh/python-proxy-headers/issues
Classifier: Programming Language :: Python :: 3
Classifier: Operating System :: OS Independent
Classifier: License :: OSI Approved :: BSD License
Classifier: Intended Audience :: Developers
Classifier: Topic :: Internet :: WWW/HTTP
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.8
Description-Content-Type: text/markdown
License-File: LICENSE

# Python Proxy Headers

The `python-proxy-headers` package provides support for handling custom proxy headers when making HTTPS requests in various Python modules.

We currently provide extensions to the following packages:

* [urllib3](https://urllib3.readthedocs.io/en/stable/)
* [requests](https://docs.python-requests.org/en/latest/index.html)
* [aiohttp](https://docs.aiohttp.org/en/stable/index.html)
* [httpx](https://www.python-httpx.org/)

None of these modules provides good support for parsing custom response headers from proxy servers, and some of them make it hard to send custom headers to proxy servers. So we at [ProxyMesh](https://proxymesh.com) made these extension modules to support our customers that use Python and want to use custom headers to control our proxy behavior. But these modules can be used to handle custom headers with any proxy.

*If you are looking for [Scrapy](https://scrapy.org/) support, please see our [scrapy-proxy-headers](https://github.com/proxymesh/scrapy-proxy-headers) project.*

## Installation

Examples for how to use these extension modules are described below. You must first do the following:

1. `pip install python-proxy-headers`
2. Install the underlying HTTP library you want to use (e.g. `requests`, `aiohttp`, or `httpx`).

This package does not declare any dependencies, because we don't know which module you want to use.

You can also find more example code in our [proxy-examples for python](https://github.com/proxymesh/proxy-examples/tree/main/python).

## urllib3

If you just want to send custom proxy headers, but don't need to receive proxy response headers, then you can use [urllib3.ProxyManager](https://urllib3.readthedocs.io/en/stable/reference/urllib3.poolmanager.html#urllib3.ProxyManager), like so:

``` python
import urllib3
proxy = urllib3.ProxyManager('http://PROXYHOST:PORT', proxy_headers={'X-ProxyMesh-Country': 'US'})
r = proxy.request('GET', 'https://api.ipify.org?format=json')
```

Note that if you keep reusing the same `ProxyManager` instance, you may be reusing the proxy connection, which can behave differently than creating a new proxy connection for each request. For example, with ProxyMesh you may keep getting the same IP address if you reuse the proxy connection.

To get proxy response headers, use our extension module like this:

``` python
from python_proxy_headers import urllib3_proxy_manager
proxy = urllib3_proxy_manager.ProxyHeaderManager('http://PROXYHOST:PORT')
r = proxy.request('GET', 'https://api.ipify.org?format=json')
r.headers['X-ProxyMesh-IP']
```

You can also pass `proxy_headers` into our `ProxyHeaderManager`. For example, you can pass back the same `X-ProxyMesh-IP` header to ensure you get the same IP address on subsequent requests, as sketched below.
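A minimal sketch of that sticky-IP pattern (the `X-ProxyMesh-*` header names are ProxyMesh-specific; substitute your own proxy's headers):

``` python
from python_proxy_headers import urllib3_proxy_manager

# First request: let the proxy pick an IP, then read it from the response.
proxy = urllib3_proxy_manager.ProxyHeaderManager('http://PROXYHOST:PORT')
r1 = proxy.request('GET', 'https://api.ipify.org?format=json')
ip = r1.headers['X-ProxyMesh-IP']

# Subsequent requests: send the IP back so the proxy reuses it.
sticky = urllib3_proxy_manager.ProxyHeaderManager(
    'http://PROXYHOST:PORT', proxy_headers={'X-ProxyMesh-IP': ip})
r2 = sticky.request('GET', 'https://api.ipify.org?format=json')
```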
## requests

The requests adapter builds on our `urllib3_proxy_manager` module to make it easy to pass in proxy headers and receive proxy response headers.

``` python
from python_proxy_headers import requests_adapter
r = requests_adapter.get('https://api.ipify.org?format=json', proxies={'http': 'http://PROXYHOST:PORT', 'https': 'http://PROXYHOST:PORT'}, proxy_headers={'X-ProxyMesh-Country': 'US'})
r.headers['X-ProxyMesh-IP']
```

The `requests_adapter` module supports all the standard requests methods: `get`, `post`, `put`, `delete`, etc.
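For making multiple requests over one connection pool, you can use the `ProxySession` class from the same module. A short sketch (proxy URL and header values are placeholders):

``` python
from python_proxy_headers.requests_adapter import ProxySession

with ProxySession(proxy_headers={'X-ProxyMesh-Country': 'US'}) as session:
    session.proxies = {'http': 'http://PROXYHOST:PORT', 'https': 'http://PROXYHOST:PORT'}
    r = session.get('https://api.ipify.org?format=json')
    r.headers['X-ProxyMesh-IP']
```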
## aiohttp

While it's not documented, aiohttp does support passing in custom proxy headers by default.

``` python
import aiohttp
async with aiohttp.ClientSession() as session:
    async with session.get('https://api.ipify.org?format=json', proxy="http://PROXYHOST:PORT", proxy_headers={'X-ProxyMesh-Country': 'US'}) as r:
        await r.text()
```

However, if you want to get proxy response headers, you should use our extension module:

``` python
from python_proxy_headers import aiohttp_proxy
async with aiohttp_proxy.ProxyClientSession() as session:
    async with session.get('https://api.ipify.org?format=json', proxy="http://PROXYHOST:PORT", proxy_headers={'X-ProxyMesh-Country': 'US'}) as r:
        await r.text()

r.headers['X-ProxyMesh-IP']
```
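If your proxy requires authentication, aiohttp's standard `proxy_auth` argument works with `ProxyClientSession` as well; a sketch with placeholder credentials:

``` python
import aiohttp
from python_proxy_headers import aiohttp_proxy

auth = aiohttp.BasicAuth('USERNAME', 'PASSWORD')  # placeholder credentials
async with aiohttp_proxy.ProxyClientSession() as session:
    async with session.get('https://api.ipify.org?format=json',
                           proxy='http://PROXYHOST:PORT', proxy_auth=auth,
                           proxy_headers={'X-ProxyMesh-Country': 'US'}) as r:
        await r.text()
        r.headers.get('X-ProxyMesh-IP')
```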
## httpx

httpx also supports sending proxy headers by default, though it's not documented:

``` python
import httpx
proxy = httpx.Proxy('http://PROXYHOST:PORT', headers={'X-ProxyMesh-Country': 'US'})
transport = httpx.HTTPTransport(proxy=proxy)
with httpx.Client(mounts={'http://': transport, 'https://': transport}) as client:
    r = client.get('https://api.ipify.org?format=json')
```

But to get the proxy response headers, you need to use our extension module:

``` python
import httpx
from python_proxy_headers.httpx_proxy import HTTPProxyTransport
proxy = httpx.Proxy('http://PROXYHOST:PORT', headers={'X-ProxyMesh-Country': 'US'})
transport = HTTPProxyTransport(proxy=proxy)
with httpx.Client(mounts={'http://': transport, 'https://': transport}) as client:
    r = client.get('https://api.ipify.org?format=json')

r.headers['X-ProxyMesh-IP']
```

This module also provides helper methods similar to requests:

``` python
import httpx
from python_proxy_headers import httpx_proxy
proxy = httpx.Proxy('http://PROXYHOST:PORT', headers={'X-ProxyMesh-Country': 'US'})
r = httpx_proxy.get('https://api.ipify.org?format=json', proxy=proxy)
r.headers['X-ProxyMesh-IP']
```

And finally, httpx supports async requests, so we provide an async extension too:

``` python
import httpx
from python_proxy_headers.httpx_proxy import AsyncHTTPProxyTransport
proxy = httpx.Proxy('http://PROXYHOST:PORT', headers={'X-ProxyMesh-Country': 'US'})
transport = AsyncHTTPProxyTransport(proxy=proxy)
async with httpx.AsyncClient(mounts={'http://': transport, 'https://': transport}) as client:
    r = await client.get('https://api.ipify.org?format=json')

r.headers['X-ProxyMesh-IP']
```

Our httpx helper module internally provides extension classes for [httpcore](https://www.encode.io/httpcore/), for handling proxy headers over tunnel connections.
You can use those classes if you're building on top of httpcore.
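For example, a minimal sketch using `HTTPProxyHeaders` (this package's subclass of `httpcore.HTTPProxy`, so it takes the same constructor arguments) directly as a connection pool; the proxy URL and header values are placeholders:

``` python
from python_proxy_headers.httpx_proxy import HTTPProxyHeaders

# Drop-in replacement for httpcore.HTTPProxy that merges the proxy's
# CONNECT response headers into each tunneled response.
with HTTPProxyHeaders(
    proxy_url="http://PROXYHOST:PORT",
    proxy_headers={"X-ProxyMesh-Country": "US"},
) as pool:
    r = pool.request("GET", "https://api.ipify.org?format=json")
    print(r.headers)  # includes the X-ProxyMesh-* headers from the tunnel
```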
python_proxy_headers-0.1.0/README.md
@@ -0,0 +1,133 @@
(Identical to the markdown body of python_proxy_headers-0.1.0/PKG-INFO above.)
python_proxy_headers-0.1.0/pyproject.toml
@@ -0,0 +1,26 @@
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "python-proxy-headers"
version = "0.1.0"
authors = [
    { name="ProxyMesh", email="support@proxymesh.com" },
]
description = "Handle custom proxy headers for http requests in various python libraries"
readme = "README.md"
requires-python = ">=3.8"
classifiers = [
    "Programming Language :: Python :: 3",
    "Operating System :: OS Independent",
    "License :: OSI Approved :: BSD License",
    "Intended Audience :: Developers",
    "Topic :: Internet :: WWW/HTTP",
    "Topic :: Software Development :: Libraries :: Python Modules",
]

[project.urls]
Homepage = "https://github.com/proxymesh/python-proxy-headers"
Changelog = "https://github.com/proxymesh/python-proxy-headers/commits/main/"
Issues = "https://github.com/proxymesh/python-proxy-headers/issues"

python_proxy_headers-0.1.0/python_proxy_headers/__init__.py
File without changes
python_proxy_headers-0.1.0/python_proxy_headers/aiohttp_proxy.py
@@ -0,0 +1,159 @@
from http import HTTPStatus
from aiohttp.client_reqrep import ClientRequest, ClientResponse
from aiohttp.connector import TCPConnector, Connection
from aiohttp.client_exceptions import ClientHttpProxyError, ClientProxyConnectionError
from aiohttp.client import ClientSession
from aiohttp.helpers import reify
from aiohttp import hdrs
from multidict import CIMultiDict, CIMultiDictProxy

class ProxyTCPConnector(TCPConnector):
    async def _create_proxy_connection(self, req: ClientRequest, traces, timeout):
        self._fail_on_no_start_tls(req)
        runtime_has_start_tls = self._loop_supports_start_tls()

        headers = {}
        if req.proxy_headers is not None:
            headers = req.proxy_headers  # type: ignore[assignment]
        headers[hdrs.HOST] = req.headers[hdrs.HOST]

        url = req.proxy
        assert url is not None
        proxy_req = ClientRequest(
            hdrs.METH_GET,
            url,
            headers=headers,
            auth=req.proxy_auth,
            loop=self._loop,
            ssl=req.ssl,
        )

        # create connection to proxy server
        transport, proto = await self._create_direct_connection(
            proxy_req, [], timeout, client_error=ClientProxyConnectionError
        )

        auth = proxy_req.headers.pop(hdrs.AUTHORIZATION, None)
        if auth is not None:
            if not req.is_ssl():
                req.headers[hdrs.PROXY_AUTHORIZATION] = auth
            else:
                proxy_req.headers[hdrs.PROXY_AUTHORIZATION] = auth

        if req.is_ssl():
            if runtime_has_start_tls:
                self._warn_about_tls_in_tls(transport, req)

            # For HTTPS requests over HTTP proxy
            # we must notify proxy to tunnel connection
            # so we send CONNECT command:
            #   CONNECT www.python.org:443 HTTP/1.1
            #   Host: www.python.org
            #
            # next we must do TLS handshake and so on
            # to do this we must wrap raw socket into secure one
            # asyncio handles this perfectly
            proxy_req.method = hdrs.METH_CONNECT
            proxy_req.url = req.url
            key = req.connection_key._replace(
                proxy=None, proxy_auth=None, proxy_headers_hash=None
            )
            conn = Connection(self, key, proto, self._loop)
            proxy_resp = await proxy_req.send(conn)
            try:
                protocol = conn._protocol
                assert protocol is not None

                # read_until_eof=True will ensure the connection isn't closed
                # once the response is received and processed allowing
                # START_TLS to work on the connection below.
                protocol.set_response_params(
                    read_until_eof=runtime_has_start_tls,
                    timeout_ceil_threshold=self._timeout_ceil_threshold,
                )
                resp = await proxy_resp.start(conn)
            except BaseException:
                proxy_resp.close()
                conn.close()
                raise
            else:
                conn._protocol = None
                try:
                    if resp.status != 200:
                        message = resp.reason
                        if message is None:
                            message = HTTPStatus(resp.status).phrase
                        raise ClientHttpProxyError(
                            proxy_resp.request_info,
                            resp.history,
                            status=resp.status,
                            message=message,
                            headers=resp.headers,
                        )
                    if not runtime_has_start_tls:
                        rawsock = transport.get_extra_info("socket", default=None)
                        if rawsock is None:
                            raise RuntimeError(
                                "Transport does not expose socket instance"
                            )
                        # Duplicate the socket, so now we can close proxy transport
                        rawsock = rawsock.dup()
                except BaseException:
                    # It shouldn't be closed in `finally` because it's fed to
                    # `loop.start_tls()` and the docs say not to touch it after
                    # passing there.
                    transport.close()
                    raise
                finally:
                    if not runtime_has_start_tls:
                        transport.close()

                # TODO: try adding resp.headers to the proto returned as 2nd tuple element below
                if not runtime_has_start_tls:
                    # HTTP proxy with support for upgrade to HTTPS
                    sslcontext = self._get_ssl_context(req)
                    transport, proto = await self._wrap_existing_connection(
                        self._factory,
                        timeout=timeout,
                        ssl=sslcontext,
                        sock=rawsock,
                        server_hostname=req.host,
                        req=req,
                    )
                else:
                    transport, proto = await self._start_tls_connection(
                        # Access the old transport for the last time before it's
                        # closed and forgotten forever:
                        transport,
                        req=req,
                        timeout=timeout,
                    )
            finally:
                proxy_resp.close()

            # expose the proxy's CONNECT response headers on the protocol,
            # where ProxyClientRequest.send() picks them up
            proto._proxy_headers = resp.headers

        return transport, proto


class ProxyClientRequest(ClientRequest):
    async def send(self, conn):
        resp = await super().send(conn)
        if hasattr(conn.protocol, '_proxy_headers'):
            resp._proxy_headers = conn.protocol._proxy_headers
        return resp

class ProxyClientResponse(ClientResponse):
    @reify
    def headers(self):
        proxy_headers = getattr(self, '_proxy_headers', None)

        if proxy_headers:
            # merge the proxy's CONNECT response headers into the
            # target server's response headers
            headers = CIMultiDict(self._headers)
            headers.extend(proxy_headers)
            return CIMultiDictProxy(headers)
        else:
            return self._headers

class ProxyClientSession(ClientSession):
    def __init__(self, *args, **kwargs):
        super().__init__(
            *args,
            connector=ProxyTCPConnector(),
            response_class=ProxyClientResponse,
            request_class=ProxyClientRequest,
            **kwargs,
        )
python_proxy_headers-0.1.0/python_proxy_headers/httpx_proxy.py
@@ -0,0 +1,376 @@
from __future__ import annotations  # allow `str | None` annotations on Python 3.8+

from contextlib import contextmanager
from httpcore._sync.http_proxy import HTTPProxy, TunnelHTTPConnection, merge_headers, logger
from httpcore._sync.http11 import HTTP11Connection
from httpcore._async.http_proxy import AsyncHTTPProxy, AsyncTunnelHTTPConnection
from httpcore._async.http11 import AsyncHTTP11Connection
from httpcore._models import URL, Request
from httpcore._exceptions import ProxyError
from httpcore._ssl import default_ssl_context
from httpcore._trace import Trace
from httpx import AsyncHTTPTransport, HTTPTransport, Client
from httpx._config import DEFAULT_LIMITS, DEFAULT_TIMEOUT_CONFIG, Proxy, create_ssl_context

class ProxyTunnelHTTPConnection(TunnelHTTPConnection):
    # Unfortunately the only way to get connect_response.headers into the Response
    # is to override this whole method
    def handle_request(self, request):
        timeouts = request.extensions.get("timeout", {})
        timeout = timeouts.get("connect", None)

        with self._connect_lock:
            if not self._connected:
                target = b"%b:%d" % (self._remote_origin.host, self._remote_origin.port)

                connect_url = URL(
                    scheme=self._proxy_origin.scheme,
                    host=self._proxy_origin.host,
                    port=self._proxy_origin.port,
                    target=target,
                )
                connect_headers = merge_headers(
                    [(b"Host", target), (b"Accept", b"*/*")], self._proxy_headers
                )
                connect_request = Request(
                    method=b"CONNECT",
                    url=connect_url,
                    headers=connect_headers,
                    extensions=request.extensions,
                )
                connect_response = self._connection.handle_request(
                    connect_request
                )

                if connect_response.status < 200 or connect_response.status > 299:
                    reason_bytes = connect_response.extensions.get("reason_phrase", b"")
                    reason_str = reason_bytes.decode("ascii", errors="ignore")
                    msg = "%d %s" % (connect_response.status, reason_str)
                    self._connection.close()
                    raise ProxyError(msg)

                stream = connect_response.extensions["network_stream"]

                # Upgrade the stream to SSL
                ssl_context = (
                    default_ssl_context()
                    if self._ssl_context is None
                    else self._ssl_context
                )
                alpn_protocols = ["http/1.1", "h2"] if self._http2 else ["http/1.1"]
                ssl_context.set_alpn_protocols(alpn_protocols)

                kwargs = {
                    "ssl_context": ssl_context,
                    "server_hostname": self._remote_origin.host.decode("ascii"),
                    "timeout": timeout,
                }
                with Trace("start_tls", logger, request, kwargs) as trace:
                    stream = stream.start_tls(**kwargs)
                    trace.return_value = stream

                # Determine if we should be using HTTP/1.1 or HTTP/2
                ssl_object = stream.get_extra_info("ssl_object")
                http2_negotiated = (
                    ssl_object is not None
                    and ssl_object.selected_alpn_protocol() == "h2"
                )

                # Create the HTTP/1.1 or HTTP/2 connection
                if http2_negotiated or (self._http2 and not self._http1):
                    from httpcore._sync.http2 import HTTP2Connection

                    self._connection = HTTP2Connection(
                        origin=self._remote_origin,
                        stream=stream,
                        keepalive_expiry=self._keepalive_expiry,
                    )
                else:
                    self._connection = HTTP11Connection(
                        origin=self._remote_origin,
                        stream=stream,
                        keepalive_expiry=self._keepalive_expiry,
                    )

                self._connected = True
                # keep the CONNECT response headers so they can also be merged
                # into later responses on this reused tunnel connection
                self._connect_headers = connect_response.headers

        # this is the only modification relative to httpcore's TunnelHTTPConnection
        response = self._connection.handle_request(request)
        response.headers = merge_headers(response.headers, self._connect_headers)
        return response

class AsyncProxyTunnelHTTPConnection(AsyncTunnelHTTPConnection):
    async def handle_async_request(self, request):
        timeouts = request.extensions.get("timeout", {})
        timeout = timeouts.get("connect", None)

        async with self._connect_lock:
            if not self._connected:
                target = b"%b:%d" % (self._remote_origin.host, self._remote_origin.port)

                connect_url = URL(
                    scheme=self._proxy_origin.scheme,
                    host=self._proxy_origin.host,
                    port=self._proxy_origin.port,
                    target=target,
                )
                connect_headers = merge_headers(
                    [(b"Host", target), (b"Accept", b"*/*")], self._proxy_headers
                )
                connect_request = Request(
                    method=b"CONNECT",
                    url=connect_url,
                    headers=connect_headers,
                    extensions=request.extensions,
                )
                connect_response = await self._connection.handle_async_request(
                    connect_request
                )

                if connect_response.status < 200 or connect_response.status > 299:
                    reason_bytes = connect_response.extensions.get("reason_phrase", b"")
                    reason_str = reason_bytes.decode("ascii", errors="ignore")
                    msg = "%d %s" % (connect_response.status, reason_str)
                    await self._connection.aclose()
                    raise ProxyError(msg)

                stream = connect_response.extensions["network_stream"]

                # Upgrade the stream to SSL
                ssl_context = (
                    default_ssl_context()
                    if self._ssl_context is None
                    else self._ssl_context
                )
                alpn_protocols = ["http/1.1", "h2"] if self._http2 else ["http/1.1"]
                ssl_context.set_alpn_protocols(alpn_protocols)

                kwargs = {
                    "ssl_context": ssl_context,
                    "server_hostname": self._remote_origin.host.decode("ascii"),
                    "timeout": timeout,
                }
                async with Trace("start_tls", logger, request, kwargs) as trace:
                    stream = await stream.start_tls(**kwargs)
                    trace.return_value = stream

                # Determine if we should be using HTTP/1.1 or HTTP/2
                ssl_object = stream.get_extra_info("ssl_object")
                http2_negotiated = (
                    ssl_object is not None
                    and ssl_object.selected_alpn_protocol() == "h2"
                )

                # Create the HTTP/1.1 or HTTP/2 connection
                if http2_negotiated or (self._http2 and not self._http1):
                    from httpcore._async.http2 import AsyncHTTP2Connection

                    self._connection = AsyncHTTP2Connection(
                        origin=self._remote_origin,
                        stream=stream,
                        keepalive_expiry=self._keepalive_expiry,
                    )
                else:
                    self._connection = AsyncHTTP11Connection(
                        origin=self._remote_origin,
                        stream=stream,
                        keepalive_expiry=self._keepalive_expiry,
                    )

                self._connected = True
                # keep the CONNECT response headers so they can also be merged
                # into later responses on this reused tunnel connection
                self._connect_headers = connect_response.headers

        # this is the only modification relative to httpcore's AsyncTunnelHTTPConnection
        response = await self._connection.handle_async_request(request)
        response.headers = merge_headers(response.headers, self._connect_headers)
        return response

class HTTPProxyHeaders(HTTPProxy):
    def create_connection(self, origin):
        if origin.scheme == b"http":
            return super().create_connection(origin)
        return ProxyTunnelHTTPConnection(
            proxy_origin=self._proxy_url.origin,
            proxy_headers=self._proxy_headers,
            remote_origin=origin,
            ssl_context=self._ssl_context,
            proxy_ssl_context=self._proxy_ssl_context,
            keepalive_expiry=self._keepalive_expiry,
            http1=self._http1,
            http2=self._http2,
            network_backend=self._network_backend,
        )

class AsyncHTTPProxyHeaders(AsyncHTTPProxy):
    def create_connection(self, origin):
        if origin.scheme == b"http":
            return super().create_connection(origin)
        return AsyncProxyTunnelHTTPConnection(
            proxy_origin=self._proxy_url.origin,
            proxy_headers=self._proxy_headers,
            remote_origin=origin,
            ssl_context=self._ssl_context,
            proxy_ssl_context=self._proxy_ssl_context,
            keepalive_expiry=self._keepalive_expiry,
            http1=self._http1,
            http2=self._http2,
            network_backend=self._network_backend,
        )

# class ProxyConnectionPool(ConnectionPool):
#     def create_connection(self, origin):
#         if self._proxy is not None:
#             if self._proxy.url.scheme in (b"socks5", b"socks5h"):
#                 return super().create_connection(origin)
#             elif origin.scheme == b"http":
#                 return super().create_connection(origin)
#
#             return ProxyTunnelHTTPConnection(
#                 proxy_origin=self._proxy.url.origin,
#                 proxy_headers=self._proxy.headers,
#                 proxy_ssl_context=self._proxy.ssl_context,
#                 remote_origin=origin,
#                 ssl_context=self._ssl_context,
#                 keepalive_expiry=self._keepalive_expiry,
#                 http1=self._http1,
#                 http2=self._http2,
#                 network_backend=self._network_backend,
#             )
#
#         return super().create_connection(origin)

class HTTPProxyTransport(HTTPTransport):
    def __init__(
        self,
        verify = True,
        cert = None,
        trust_env: bool = True,
        http1: bool = True,
        http2: bool = False,
        limits = DEFAULT_LIMITS,
        proxy = None,
        uds: str | None = None,
        local_address: str | None = None,
        retries: int = 0,
        socket_options = None,
    ) -> None:
        proxy = Proxy(url=proxy) if isinstance(proxy, (str, URL)) else proxy
        ssl_context = create_ssl_context(verify=verify, cert=cert, trust_env=trust_env)

        if proxy and proxy.url.scheme in ("http", "https"):
            self._pool = HTTPProxyHeaders(
                proxy_url=URL(
                    scheme=proxy.url.raw_scheme,
                    host=proxy.url.raw_host,
                    port=proxy.url.port,
                    target=proxy.url.raw_path,
                ),
                proxy_auth=proxy.raw_auth,
                proxy_headers=proxy.headers.raw,
                ssl_context=ssl_context,
                proxy_ssl_context=proxy.ssl_context,
                max_connections=limits.max_connections,
                max_keepalive_connections=limits.max_keepalive_connections,
                keepalive_expiry=limits.keepalive_expiry,
                http1=http1,
                http2=http2,
                socket_options=socket_options,
            )
        else:
            # pass by keyword, since httpx's positional parameter order
            # differs from this signature
            super().__init__(verify=verify, cert=cert, trust_env=trust_env,
                             http1=http1, http2=http2, limits=limits, proxy=proxy,
                             uds=uds, local_address=local_address, retries=retries,
                             socket_options=socket_options)

class AsyncHTTPProxyTransport(AsyncHTTPTransport):
    def __init__(
        self,
        verify = True,
        cert = None,
        trust_env: bool = True,
        http1: bool = True,
        http2: bool = False,
        limits = DEFAULT_LIMITS,
        proxy = None,
        uds: str | None = None,
        local_address: str | None = None,
        retries: int = 0,
        socket_options = None,
    ) -> None:
        proxy = Proxy(url=proxy) if isinstance(proxy, (str, URL)) else proxy
        ssl_context = create_ssl_context(verify=verify, cert=cert, trust_env=trust_env)

        if proxy and proxy.url.scheme in ("http", "https"):
            self._pool = AsyncHTTPProxyHeaders(
                proxy_url=URL(
                    scheme=proxy.url.raw_scheme,
                    host=proxy.url.raw_host,
                    port=proxy.url.port,
                    target=proxy.url.raw_path,
                ),
                proxy_auth=proxy.raw_auth,
                proxy_headers=proxy.headers.raw,
                proxy_ssl_context=proxy.ssl_context,
                ssl_context=ssl_context,
                max_connections=limits.max_connections,
                max_keepalive_connections=limits.max_keepalive_connections,
                keepalive_expiry=limits.keepalive_expiry,
                http1=http1,
                http2=http2,
                socket_options=socket_options,
            )
        else:
            # pass by keyword, since httpx's positional parameter order
            # differs from this signature
            super().__init__(verify=verify, cert=cert, trust_env=trust_env,
                             http1=http1, http2=http2, limits=limits, proxy=proxy,
                             uds=uds, local_address=local_address, retries=retries,
                             socket_options=socket_options)

def request(method: str,
            url: URL | str,
            *,
            cookies = None,
            proxy = None,
            timeout = DEFAULT_TIMEOUT_CONFIG,
            verify = True,
            trust_env: bool = True,
            **kwargs):
    transport = HTTPProxyTransport(proxy=proxy)
    with Client(
        cookies=cookies,
        verify=verify,
        timeout=timeout,
        trust_env=trust_env,
        mounts={'http://': transport, 'https://': transport}
    ) as client:
        return client.request(method=method, url=url, **kwargs)

def get(*args, **kwargs):
    return request('GET', *args, **kwargs)

def options(*args, **kwargs):
    return request('OPTIONS', *args, **kwargs)

def head(*args, **kwargs):
    return request('HEAD', *args, **kwargs)

def post(*args, **kwargs):
    return request('POST', *args, **kwargs)

def put(*args, **kwargs):
    return request('PUT', *args, **kwargs)

def patch(*args, **kwargs):
    return request('PATCH', *args, **kwargs)

def delete(*args, **kwargs):
    return request('DELETE', *args, **kwargs)

@contextmanager
def stream(method: str,
           url: URL | str,
           *,
           cookies = None,
           proxy = None,
           timeout = DEFAULT_TIMEOUT_CONFIG,
           verify = True,
           trust_env: bool = True,
           **kwargs):
    transport = HTTPProxyTransport(proxy=proxy)
    with Client(
        cookies=cookies,
        verify=verify,
        timeout=timeout,
        trust_env=trust_env,
        mounts={'http://': transport, 'https://': transport}
    ) as client:
        with client.stream(method=method, url=url, **kwargs) as response:
            yield response
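A short usage sketch for the `stream` helper defined above, which mirrors `httpx.stream` but routes through the proxy-header transport (proxy URL and header values are placeholders):

``` python
import httpx
from python_proxy_headers import httpx_proxy

proxy = httpx.Proxy('http://PROXYHOST:PORT', headers={'X-ProxyMesh-Country': 'US'})

# Response headers include the CONNECT response headers merged in
# by HTTPProxyTransport.
with httpx_proxy.stream('GET', 'https://api.ipify.org?format=json', proxy=proxy) as r:
    print(r.headers.get('X-ProxyMesh-IP'))
    for chunk in r.iter_bytes():
        pass  # consume the body incrementally
```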
python_proxy_headers-0.1.0/python_proxy_headers/requests_adapter.py
@@ -0,0 +1,73 @@
from requests.adapters import HTTPAdapter
from requests.sessions import Session
from .urllib3_proxy_manager import proxy_from_url

class HTTPProxyHeaderAdapter(HTTPAdapter):
    def __init__(self, proxy_headers=None):
        super().__init__()
        self._proxy_headers = proxy_headers or {}

    def proxy_manager_for(self, proxy, **proxy_kwargs):
        """Return urllib3 ProxyManager for the given proxy.

        This method should not be called from user code, and is only
        exposed for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param proxy: The proxy to return a urllib3 ProxyManager for.
        :param proxy_kwargs: Extra keyword arguments used to configure the Proxy Manager.
        :returns: ProxyManager
        :rtype: urllib3.ProxyManager
        """
        if proxy in self.proxy_manager:
            manager = self.proxy_manager[proxy]
        elif proxy.lower().startswith("socks"):
            return super().proxy_manager_for(proxy, **proxy_kwargs)
        else:
            # HTTPAdapter.proxy_headers only gets Proxy-Authorization
            _proxy_headers = self.proxy_headers(proxy)
            if self._proxy_headers:
                _proxy_headers.update(self._proxy_headers)

            manager = self.proxy_manager[proxy] = proxy_from_url(
                proxy,
                proxy_headers=_proxy_headers,
                num_pools=self._pool_connections,
                maxsize=self._pool_maxsize,
                block=self._pool_block,
                **proxy_kwargs,
            )

        return manager

class ProxySession(Session):
    def __init__(self, proxy_headers=None):
        super().__init__()
        self.mount('https://', HTTPProxyHeaderAdapter(proxy_headers=proxy_headers))
        self.mount('http://', HTTPProxyHeaderAdapter(proxy_headers=proxy_headers))

def request(method, url, proxy_headers=None, **kwargs):
    with ProxySession(proxy_headers) as session:
        return session.request(method=method, url=url, **kwargs)

def get(*args, **kwargs):
    return request('get', *args, **kwargs)

def options(*args, **kwargs):
    return request('options', *args, **kwargs)

def head(*args, **kwargs):
    kwargs.setdefault("allow_redirects", False)
    return request('head', *args, **kwargs)

def post(*args, **kwargs):
    return request('post', *args, **kwargs)

def put(*args, **kwargs):
    return request('put', *args, **kwargs)

def patch(*args, **kwargs):
    return request('patch', *args, **kwargs)

def delete(*args, **kwargs):
    return request('delete', *args, **kwargs)
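A sketch of mounting `HTTPProxyHeaderAdapter` on an existing `requests.Session`, for code that already manages its own session (proxy URL and header values are placeholders):

``` python
import requests
from python_proxy_headers.requests_adapter import HTTPProxyHeaderAdapter

session = requests.Session()
# Route HTTPS traffic through the adapter so the proxy's CONNECT
# response headers get merged into each response.
session.mount('https://', HTTPProxyHeaderAdapter(proxy_headers={'X-ProxyMesh-Country': 'US'}))
session.proxies = {'https': 'http://PROXYHOST:PORT'}

r = session.get('https://api.ipify.org?format=json')
print(r.headers.get('X-ProxyMesh-IP'))
```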
python_proxy_headers-0.1.0/python_proxy_headers/urllib3_proxy_manager.py
@@ -0,0 +1,135 @@
import http, sys
from http.client import _read_headers
from urllib3.connection import HTTPSConnection
from urllib3.connectionpool import HTTPConnectionPool, HTTPSConnectionPool
from urllib3.poolmanager import ProxyManager

if sys.version_info < (3, 12, 0):
    #####################################
    ### copied from python3.12 source ###
    #####################################
    import email.parser
    import email.message

    class HTTPMessage(email.message.Message):
        # XXX The only usage of this method is in
        # http.server.CGIHTTPRequestHandler. Maybe move the code there so
        # that it doesn't need to be part of the public API. The API has
        # never been defined so this could cause backwards compatibility
        # issues.

        def getallmatchingheaders(self, name):
            """Find all header lines matching a given header name.

            Look through the list of headers and find all lines matching a given
            header name (and their continuation lines). A list of the lines is
            returned, without interpretation. If the header does not occur, an
            empty list is returned. If the header occurs multiple times, all
            occurrences are returned. Case is not important in the header name.
            """
            name = name.lower() + ':'
            n = len(name)
            lst = []
            hit = 0
            for line in self.keys():
                if line[:n].lower() == name:
                    hit = 1
                elif not line[:1].isspace():
                    hit = 0
                if hit:
                    lst.append(line)
            return lst

    def _parse_header_lines(header_lines, _class=HTTPMessage):
        """
        Parses only RFC2822 headers from header lines.

        email Parser wants to see strings rather than bytes.
        But a TextIOWrapper around self.rfile would buffer too many bytes
        from the stream, bytes which we later need to read as bytes.
        So we read the correct bytes here, as bytes, for email Parser
        to parse.
        """
        hstring = b''.join(header_lines).decode('iso-8859-1')
        return email.parser.Parser(_class=_class).parsestr(hstring)
else:
    from http.client import _parse_header_lines

class HTTPSProxyConnection(HTTPSConnection):
    if sys.version_info < (3, 12, 0):
        #####################################
        ### copied from python3.12 source ###
        #####################################

        def _wrap_ipv6(self, ip):
            if b':' in ip and ip[0] != b'['[0]:
                return b"[" + ip + b"]"
            return ip

        def _tunnel(self):
            connect = b"CONNECT %s:%d %s\r\n" % (
                self._wrap_ipv6(self._tunnel_host.encode("idna")),
                self._tunnel_port,
                self._http_vsn_str.encode("ascii"))
            headers = [connect]
            for header, value in self._tunnel_headers.items():
                headers.append(f"{header}: {value}\r\n".encode("latin-1"))
            headers.append(b"\r\n")
            # Making a single send() call instead of one per line encourages
            # the host OS to use a more optimal packet size instead of
            # potentially emitting a series of small packets.
            self.send(b"".join(headers))
            del headers

            response = self.response_class(self.sock, method=self._method)
            try:
                (version, code, message) = response._read_status()

                self._raw_proxy_headers = _read_headers(response.fp)

                if self.debuglevel > 0:
                    for header in self._raw_proxy_headers:
                        print('header:', header.decode())

                if code != http.HTTPStatus.OK:
                    self.close()
                    raise OSError(f"Tunnel connection failed: {code} {message.strip()}")

            finally:
                response.close()

    def get_proxy_response_headers(self):
        """
        Returns a dictionary with the headers of the response
        received from the proxy server to the CONNECT request
        sent to set the tunnel.

        If the CONNECT request was not sent, the method returns None.
        """
        raw_proxy_headers = getattr(self, '_raw_proxy_headers', None)
        return (
            _parse_header_lines(raw_proxy_headers)
            if raw_proxy_headers is not None
            else None
        )

class HTTPSProxyConnectionPool(HTTPSConnectionPool):
    ConnectionCls = HTTPSProxyConnection

    def _prepare_proxy(self, conn):
        super()._prepare_proxy(conn)
        self._proxy_response_headers = conn.get_proxy_response_headers()

    def urlopen(self, *args, **kwargs):
        response = super().urlopen(*args, **kwargs)
        # _prepare_proxy may not have run on this pool yet, so guard
        # before merging the tunnel response headers into the response
        proxy_response_headers = getattr(self, '_proxy_response_headers', None)
        if proxy_response_headers:
            response.headers.update(proxy_response_headers)
        return response

class ProxyHeaderManager(ProxyManager):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.pool_classes_by_scheme = {"http": HTTPConnectionPool, "https": HTTPSProxyConnectionPool}

def proxy_from_url(url, **kwargs):
    return ProxyHeaderManager(proxy_url=url, **kwargs)
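A quick sketch of the `proxy_from_url` convenience function defined above, equivalent to constructing `ProxyHeaderManager` directly (proxy URL and header values are placeholders):

``` python
from python_proxy_headers.urllib3_proxy_manager import proxy_from_url

# Same as ProxyHeaderManager(proxy_url=...); proxy headers are sent on
# the CONNECT request for HTTPS targets, and the tunnel's response
# headers are merged into each response.
proxy = proxy_from_url('http://PROXYHOST:PORT',
                       proxy_headers={'X-ProxyMesh-Country': 'US'})
r = proxy.request('GET', 'https://api.ipify.org?format=json')
print(r.headers.get('X-ProxyMesh-IP'))
```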
python_proxy_headers-0.1.0/python_proxy_headers.egg-info/PKG-INFO
@@ -0,0 +1,151 @@
(Identical to python_proxy_headers-0.1.0/PKG-INFO above.)
python_proxy_headers-0.1.0/python_proxy_headers.egg-info/SOURCES.txt
@@ -0,0 +1,12 @@
LICENSE
README.md
pyproject.toml
python_proxy_headers/__init__.py
python_proxy_headers/aiohttp_proxy.py
python_proxy_headers/httpx_proxy.py
python_proxy_headers/requests_adapter.py
python_proxy_headers/urllib3_proxy_manager.py
python_proxy_headers.egg-info/PKG-INFO
python_proxy_headers.egg-info/SOURCES.txt
python_proxy_headers.egg-info/dependency_links.txt
python_proxy_headers.egg-info/top_level.txt
python_proxy_headers-0.1.0/python_proxy_headers.egg-info/dependency_links.txt
@@ -0,0 +1 @@
(a single blank line)
python_proxy_headers-0.1.0/python_proxy_headers.egg-info/top_level.txt
@@ -0,0 +1 @@
python_proxy_headers