fastapi-reverse-proxy 0.1.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2026 Tomás
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
@@ -0,0 +1,124 @@
+ Metadata-Version: 2.4
+ Name: fastapi-reverse-proxy
+ Version: 0.1.0
+ Summary: A robust, streaming-capable reverse proxy for FastAPI including WebSocket support.
+ Author-email: Tomás <tomas@suricatingss.xyz>
+ Classifier: Programming Language :: Python :: 3
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Operating System :: OS Independent
+ Classifier: Framework :: FastAPI
+ Classifier: Topic :: Internet :: Proxy Servers
+ Requires-Python: >=3.10
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ Requires-Dist: fastapi>=0.129.0
+ Requires-Dist: httpx>=0.28.1
+ Requires-Dist: starlette>=0.52.1
+ Requires-Dist: uvicorn>=0.41.0
+ Requires-Dist: anyio>=4.12.1
+ Requires-Dist: certifi>=2026.1.4
+ Requires-Dist: pydantic>=2.12.5
+ Requires-Dist: url-normalize>=2.2.1
+ Requires-Dist: typing_extensions>=4.15.0
+ Requires-Dist: websockets>=16.0
+ Dynamic: license-file
+
+ # FastAPI Reverse Proxy
+
+ A robust, streaming-capable reverse proxy for FastAPI/Starlette with built-in **Latency-Based Load Balancing** and **Active Health Monitoring**.
+
+ ## Features
+
+ - **Streaming Ready**: Efficiently handles SSE (Server-Sent Events) and large file uploads/downloads.
+ - **WebSocket Support**: Seamless bidirectional tunneling with automated subprotocol negotiation.
+ - **Unified Load Balancing**: Standard round-robin or smart routing through a single utility.
+ - **Latency-Based Routing**: Automatically routes traffic to the fastest healthy server (HEAD probe).
+ - **Advanced Overrides**: Granular control over headers, body, and HTTP methods.
+ - **Robust Cancellation**: Specialized handling of `asyncio.CancelledError` to prevent resource leaks.
+ - **Version Agnostic**: Automatically handles `websockets` library version differences (12.0+ vs legacy).
+
+ ## Quick Start (Best Practice)
+
+ The recommended way to use the library is within a FastAPI **lifespan** handler. This ensures all background monitoring tasks and HTTP clients start and stop cleanly.
+
+ ```python
+ from fastapi import FastAPI, Request, WebSocket
+ from contextlib import asynccontextmanager
+
+ from fastapi_reverse_proxy import (
+     HealthChecker, LoadBalancer,
+     create_httpx_client, close_httpx_client
+ )
+
+ # 1. Set up health monitoring and load balancing
+ checker = HealthChecker(["http://localhost:8080", "http://localhost:8081"])
+ lb = LoadBalancer(checker)
+
+ @asynccontextmanager
+ async def lifespan(app: FastAPI):
+     # Initialize global resources
+     await create_httpx_client(app)
+     async with checker:  # Starts the background health loop
+         yield
+     await close_httpx_client(app)
+
+ app = FastAPI(lifespan=lifespan)
+
+ @app.api_route("/{path:path}", methods=["GET", "POST", "PUT", "DELETE"])
+ async def gateway(request: Request, path: str):
+     # Route to the fastest healthy backend
+     return await lb.proxy_pass(request, path=f"/{path}")
+
+ @app.websocket("/ws/{path:path}")
+ async def ws_tunnel(websocket: WebSocket, path: str):
+     # Automatic subprotocol negotiation + tunneling
+     await lb.proxy_pass_websocket(websocket, path=f"/{path}")
+ ```
+
+ ## Advanced Proxying
+
+ The `proxy_pass` function and `LoadBalancer.proxy_pass` provide deep customization for upstream requests:
+
+ | Parameter | Type | Description |
+ | :--- | :--- | :--- |
+ | `method` | `str` | Force a specific HTTP method (e.g., `"POST"`). |
+ | `override_body` | `bytes` | Send custom data instead of the incoming request body. |
+ | `additional_headers` | `dict` | Append custom headers to the proxied request. |
+ | `override_headers` | `dict` | Use these headers *instead* of the original request headers. |
+ | `forward_query` | `bool` | Whether to append the incoming query string (default: `True`). |
+
+ ## Monitoring & Configuration
+
+ ### HealthChecker (The Loop Owner)
+ The **proactive** component. It owns an internal `asyncio` background task that monitors backends.
+ - **Immediate Start**: When you enter the `async with` block (or call `start()`), the checker performs an **immediate** check of all backends. This eliminates the "cold-start" window where backend health is unknown.
+ - **Configuration Modes**:
+   - **Standard**: `HealthChecker(["http://a", "http://b"])`, then set `checker.ping_path = "/health"` if you need a custom probe path (the constructor itself takes only `targets`, `interval`, `timeout`, and an optional `httpx_client`).
+   - **Personalized**: Pass a list of dictionaries for per-host settings:
+     ```python
+     checker = HealthChecker([
+         {"host": "http://api-1", "pingpath": "/v1/status", "maxrequests": 100},
+         {"host": "http://api-2", "pingpath": "/health"}
+     ])
+     ```
+ - **Properties**:
+   - `ping_path`: Get or set the global health check path (default: `"/"`).
+
+ ### LoadBalancer (The Decision Utility)
+ A **plain Python object** that makes routing decisions based on its data source.
+ - **Stateful**: While it has no background loop, it **does track state** (request counts for rate limiting, and the last time it pulled data from the health checker).
+ - **No Lifecycle Needed**: It relies on the `HealthChecker` (or a static list) for data and doesn't need explicit `start`/`stop` calls.
+
+ ## WebSocket Refinements
+
+ The library implements "deferred negotiation" for WebSockets:
+ 1. The proxy reads the client's supported subprotocols from the ASGI `scope`.
+ 2. It establishes the upstream connection first.
+ 3. Once the upstream accepts a protocol, the proxy calls `websocket.accept(subprotocol=...)` back to the client.
+ 4. This ensures the entire tunnel (Client <-> Proxy <-> Upstream) uses the same negotiated protocol.
+
+ ## Robustness & Safety
+
+ - **Termination Safety**: Resource cleanup (closing `httpx` clients and sockets) is triggered even on task cancellation (`BaseException`).
+ - **Introspection-Based Compatibility**: Uses `inspect.signature` to automatically detect version-specific parameters in the `websockets` library.
@@ -0,0 +1,99 @@
+ # FastAPI Reverse Proxy
+
+ A robust, streaming-capable reverse proxy for FastAPI/Starlette with built-in **Latency-Based Load Balancing** and **Active Health Monitoring**.
+
+ ## Features
+
+ - **Streaming Ready**: Efficiently handles SSE (Server-Sent Events) and large file uploads/downloads.
+ - **WebSocket Support**: Seamless bidirectional tunneling with automated subprotocol negotiation.
+ - **Unified Load Balancing**: Standard round-robin or smart routing through a single utility.
+ - **Latency-Based Routing**: Automatically routes traffic to the fastest healthy server (HEAD probe).
+ - **Advanced Overrides**: Granular control over headers, body, and HTTP methods.
+ - **Robust Cancellation**: Specialized handling of `asyncio.CancelledError` to prevent resource leaks.
+ - **Version Agnostic**: Automatically handles `websockets` library version differences (12.0+ vs legacy).
+
+ ## Quick Start (Best Practice)
+
+ The recommended way to use the library is within a FastAPI **lifespan** handler. This ensures all background monitoring tasks and HTTP clients start and stop cleanly.
+
+ ```python
+ from fastapi import FastAPI, Request, WebSocket
+ from contextlib import asynccontextmanager
+
+ from fastapi_reverse_proxy import (
+     HealthChecker, LoadBalancer,
+     create_httpx_client, close_httpx_client
+ )
+
+ # 1. Set up health monitoring and load balancing
+ checker = HealthChecker(["http://localhost:8080", "http://localhost:8081"])
+ lb = LoadBalancer(checker)
+
+ @asynccontextmanager
+ async def lifespan(app: FastAPI):
+     # Initialize global resources
+     await create_httpx_client(app)
+     async with checker:  # Starts the background health loop
+         yield
+     await close_httpx_client(app)
+
+ app = FastAPI(lifespan=lifespan)
+
+ @app.api_route("/{path:path}", methods=["GET", "POST", "PUT", "DELETE"])
+ async def gateway(request: Request, path: str):
+     # Route to the fastest healthy backend
+     return await lb.proxy_pass(request, path=f"/{path}")
+
+ @app.websocket("/ws/{path:path}")
+ async def ws_tunnel(websocket: WebSocket, path: str):
+     # Automatic subprotocol negotiation + tunneling
+     await lb.proxy_pass_websocket(websocket, path=f"/{path}")
+ ```
+
+ ## Advanced Proxying
+
+ The `proxy_pass` function and `LoadBalancer.proxy_pass` provide deep customization for upstream requests:
+
+ | Parameter | Type | Description |
+ | :--- | :--- | :--- |
+ | `method` | `str` | Force a specific HTTP method (e.g., `"POST"`). |
+ | `override_body` | `bytes` | Send custom data instead of the incoming request body. |
+ | `additional_headers` | `dict` | Append custom headers to the proxied request. |
+ | `override_headers` | `dict` | Use these headers *instead* of the original request headers. |
+ | `forward_query` | `bool` | Whether to append the incoming query string (default: `True`). |
+
+ ## Monitoring & Configuration
+
+ ### HealthChecker (The Loop Owner)
+ The **proactive** component. It owns an internal `asyncio` background task that monitors backends.
+ - **Immediate Start**: When you enter the `async with` block (or call `start()`), the checker performs an **immediate** check of all backends. This eliminates the "cold-start" window where backend health is unknown.
+ - **Configuration Modes**:
+   - **Standard**: `HealthChecker(["http://a", "http://b"])`, then set `checker.ping_path = "/health"` if you need a custom probe path (the constructor itself takes only `targets`, `interval`, `timeout`, and an optional `httpx_client`).
+   - **Personalized**: Pass a list of dictionaries for per-host settings:
+     ```python
+     checker = HealthChecker([
+         {"host": "http://api-1", "pingpath": "/v1/status", "maxrequests": 100},
+         {"host": "http://api-2", "pingpath": "/health"}
+     ])
+     ```
+ - **Properties**:
+   - `ping_path`: Get or set the global health check path (default: `"/"`).
+
+ ### LoadBalancer (The Decision Utility)
+ A **plain Python object** that makes routing decisions based on its data source.
+ - **Stateful**: While it has no background loop, it **does track state** (request counts for rate limiting, and the last time it pulled data from the health checker).
+ - **No Lifecycle Needed**: It relies on the `HealthChecker` (or a static list) for data and doesn't need explicit `start`/`stop` calls.
+
+ ## WebSocket Refinements
+
+ The library implements "deferred negotiation" for WebSockets:
+ 1. The proxy reads the client's supported subprotocols from the ASGI `scope`.
+ 2. It establishes the upstream connection first.
+ 3. Once the upstream accepts a protocol, the proxy calls `websocket.accept(subprotocol=...)` back to the client.
+ 4. This ensures the entire tunnel (Client <-> Proxy <-> Upstream) uses the same negotiated protocol.
+
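Before dialing upstream, `http(s)` target URLs are rewritten to their `ws(s)` counterparts, as the proxy source does with a prefix replacement. A small standalone sketch of that rewrite (the helper name `to_ws_url` is illustrative, not part of the library's API):

```python
# Illustrative helper: convert an http(s) upstream URL to ws(s),
# mirroring the scheme rewrite the proxy performs before opening
# the upstream WebSocket connection.
def to_ws_url(url: str) -> str:
    if url.startswith("http://"):
        return url.replace("http://", "ws://", 1)
    if url.startswith("https://"):
        return url.replace("https://", "wss://", 1)
    return url  # already ws:// or wss://

print(to_ws_url("https://backend:9000/ws/chat"))  # wss://backend:9000/ws/chat
```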
+ ## Robustness & Safety
+
+ - **Termination Safety**: Resource cleanup (closing `httpx` clients and sockets) is triggered even on task cancellation (`BaseException`).
+ - **Introspection-Based Compatibility**: Uses `inspect.signature` to automatically detect version-specific parameters in the `websockets` library.
@@ -0,0 +1,35 @@
+ [build-system]
+ requires = ["setuptools>=61.0"]
+ build-backend = "setuptools.build_meta"
+
+ [project]
+ name = "fastapi-reverse-proxy"
+ version = "0.1.0"
+ authors = [
+     { name="Tomás", email="tomas@suricatingss.xyz" },
+ ]
+ description = "A robust, streaming-capable reverse proxy for FastAPI including WebSocket support."
+ readme = "README.md"
+ requires-python = ">=3.10"
+ classifiers = [
+     "Programming Language :: Python :: 3",
+     "License :: OSI Approved :: MIT License",
+     "Operating System :: OS Independent",
+     "Framework :: FastAPI",
+     "Topic :: Internet :: Proxy Servers",
+ ]
+ dependencies = [
+     "fastapi>=0.129.0",
+     "httpx>=0.28.1",
+     "starlette>=0.52.1",
+     "uvicorn>=0.41.0",
+     "anyio>=4.12.1",
+     "certifi>=2026.1.4",
+     "pydantic>=2.12.5",
+     "url-normalize>=2.2.1",
+     "typing_extensions>=4.15.0",
+     "websockets>=16.0"
+ ]
+
+ [tool.setuptools.packages.find]
+ where = ["src"]
@@ -0,0 +1,4 @@
+ [egg_info]
+ tag_build =
+ tag_date = 0
+
@@ -0,0 +1,14 @@
+ from .proxy_pass import proxy_pass, proxy_pass_websocket
+ from .proxy_httpx import create_httpx_client, close_httpx_client, get_httpx_client
+ from .load_balance import LoadBalancer
+ from .health_check import HealthChecker
+
+ __all__ = [
+     "proxy_pass",
+     "proxy_pass_websocket",
+     "create_httpx_client",
+     "close_httpx_client",
+     "get_httpx_client",
+     "LoadBalancer",
+     "HealthChecker",
+ ]
@@ -0,0 +1,166 @@
+ import asyncio
+ import httpx
+ import logging
+ import time
+ from typing import Optional, Union, Dict
+ from urllib.parse import urlparse
+
+ logger = logging.getLogger("fastapi_reverse_proxy")
+
+
+ class HealthChecker:
+     def __init__(self, targets: list[str] | list[dict], interval: int = 10, timeout: int = 10, httpx_client: httpx.AsyncClient | None = None):
+         if not targets:
+             raise ValueError("Targets list cannot be empty")
+         if interval < 1:
+             raise ValueError("Interval must be at least 1")
+         if timeout < 1:
+             raise ValueError("Timeout must be at least 1")
+
+         self.interval = interval
+         self.timeout = timeout
+
+         # Determine and enforce a strict element type (no mixing)
+         first_type = type(targets[0])
+         if first_type not in (str, dict):
+             raise TypeError("Targets must be a list of strings or a list of dictionaries")
+
+         self.is_personalized = (first_type is dict)
+         self._global_ping_path = "/"
+
+         # Internal storage
+         self._targets_map: Dict[str, str] = {}
+         # Stores extra config like 'maxrequests' from the dictionary
+         self.target_configs: Dict[str, Dict] = {}
+
+         for idx, t in enumerate(targets):
+             if not isinstance(t, first_type):
+                 raise TypeError(f"Mixed types in targets at index {idx}; the whole list must share one type.")
+
+             if self.is_personalized:
+                 if "host" not in t:
+                     raise KeyError(f"Target dictionary at index {idx} missing required 'host' key")
+                 u = urlparse(t["host"])
+                 host = f"{u.scheme}://{u.netloc}"
+                 self._targets_map[host] = t.get("pingpath", "/")
+                 self.target_configs[host] = t
+             else:
+                 u = urlparse(t)
+                 host = f"{u.scheme}://{u.netloc}"
+                 self._targets_map[host] = self._global_ping_path
+                 self.target_configs[host] = {}
+
+         self.targets = list(self._targets_map.keys())
+         # Stores latency in ms, or False if the target is down
+         self.status: Dict[str, Union[float, bool]] = {host: 0.0 for host in self.targets}
+         self.last_update: float = time.perf_counter()
+
+         self._task: Optional[asyncio.Task] = None
+
+         # Track whether WE created the client, so we know whether to close it
+         self._owns_client = httpx_client is None
+         self._client = httpx_client or httpx.AsyncClient(timeout=self.timeout)
+
+     @property
+     def ping_path(self) -> str:
+         if self.is_personalized:
+             raise RuntimeError("Global ping_path is not supported when using personalized hosts")
+         return self._global_ping_path
+
+     @ping_path.setter
+     def ping_path(self, value: str):
+         if self.is_personalized:
+             raise RuntimeError("Global ping_path is not supported when using personalized hosts")
+         self._global_ping_path = value
+         # Update all targets in the map, since we are in global mode
+         for host in self._targets_map:
+             self._targets_map[host] = value
+
+     def __del__(self):
+         """
+         Destructor fallback.
+
+         WARNING: This is a last resort! Call .destroy() before exiting instead.
+         Complete cleanup from here is NOT guaranteed.
+         """
+         if hasattr(self, '_client') and self._client is not None:
+             try:
+                 logger.warning("HealthChecker was garbage-collected without .destroy(); attempting cleanup via __del__. Call .destroy() at the end of your program.")
+                 loop = asyncio.get_running_loop()
+                 if loop.is_running():
+                     loop.create_task(self.destroy())
+             except RuntimeError:
+                 pass
+
+     async def _check_target(self, host: str):
+         path = self._targets_map[host]
+         url = f"{host.rstrip('/')}/{path.lstrip('/')}"
+
+         start_time = time.perf_counter()
+         try:
+             response = await self._client.head(url)
+             if response.status_code < 400:
+                 latency = (time.perf_counter() - start_time) * 1000
+                 self.status[host] = latency
+             else:
+                 self.status[host] = False
+         except Exception:
+             self.status[host] = False
+             logger.warning(f"Target {host} is DOWN")
+
+     async def _loop(self):
+         """The background loop."""
+         while True:
+             await self.check_all()
+             await asyncio.sleep(self.interval)
+
+     async def start(self):
+         """Start the background health check loop."""
+         if not self._task:
+             await self.check_all()  # Perform a check right on start
+             self._task = asyncio.create_task(self._loop())
+
+     async def __aenter__(self):
+         await self.start()
+         return self
+
+     async def __aexit__(self, *args):
+         await self.destroy()
+
+     async def stop(self):
+         """Stop the background loop."""
+         if self._task:
+             self._task.cancel()
+             try:
+                 await self._task
+             except asyncio.CancelledError:
+                 pass
+             self._task = None
+
+     async def destroy(self):
+         """Stop the loop AND close the httpx client."""
+         await self.stop()
+         if self._owns_client and self._client:
+             await self._client.aclose()
+             self._client = None
+
+     async def check_all(self):
+         """Pings all targets and updates their status."""
+         tasks = [self._check_target(host) for host in self.targets]
+         await asyncio.gather(*tasks)
+         self.last_update = time.perf_counter()
+
+     def get_healthy_targets(self):
+         return [h for h in self.targets if self.status.get(h) is not False]
+
+     def get_response_times(self) -> Dict[str, float]:
+         return {t: float(v) for t, v in self.status.items() if v is not False}
+
+     def get_fastest(self) -> str | None:
+         r = self.get_response_times()
+         return min(r, key=lambda t: r[t]) if r else None
+
+     def is_healthy(self, target: str) -> bool:
+         u = urlparse(target)
+         host = f"{u.scheme}://{u.netloc}"
+         status = self.status.get(host)
+         return status is not False and status is not None
@@ -0,0 +1,172 @@
+ from .health_check import HealthChecker
+ from fastapi import Request, Response, WebSocket
+ from urllib.parse import urlparse
+ from typing import Dict, Optional
+ import logging
+ from url_normalize import url_normalize
+
+ logger = logging.getLogger("fastapi_reverse_proxy")
+
+ class LoadBalancer:
+     """
+     Dual-mode Load Balancer.
+     - Round-robin: if passed a list of strings.
+     - Health-based: if passed a HealthChecker object.
+
+     Includes request limits that reset every health-check interval.
+     """
+
+     def __init__(self, servers: list | HealthChecker):
+         self.servers = servers
+         self.__index = 0
+         self.__healthMode: bool = isinstance(servers, HealthChecker)
+
+         # Local request tracking
+         self._request_counts: Dict[str, int] = {}
+         self._last_health_update: float = 0.0
+
+         # Limits: host -> max_requests
+         self._limits_map: Dict[str, Optional[int]] = {}
+         self._global_max_requests: Optional[int] = None
+
+         if self.__healthMode:
+             # Populate initial limits if targets were dictionaries (manual mode)
+             for host in self.servers.targets:
+                 # Check 'maxrequests' in the target_configs from the health checker
+                 config = self.servers.target_configs.get(host, {})
+                 self._limits_map[host] = config.get("maxrequests")
+
+     @property
+     def max_requests(self) -> Optional[int]:
+         if self.__healthMode and self.servers.is_personalized:
+             raise RuntimeError("Global max_requests is not supported when using personalized hosts")
+         return self._global_max_requests
+
+     @max_requests.setter
+     def max_requests(self, value: Optional[int]):
+         if self.__healthMode and self.servers.is_personalized:
+             raise RuntimeError("Global max_requests is not supported when using personalized hosts")
+         self._global_max_requests = value
+         # Update the current limits map, since we are in global/auto mode
+         if self.__healthMode:
+             for host in self._limits_map:
+                 self._limits_map[host] = value
+
+     def _get_best_healthy(self) -> str | None:
+         """Returns the fastest healthy host that hasn't hit its request limit."""
+         # Detect whether the HealthChecker performed a new check, and reset counts
+         if self.servers.last_update > self._last_health_update:
+             self._request_counts = {h: 0 for h in self.servers.targets}
+             self._last_health_update = self.servers.last_update
+
+         # Get healthy hosts with their latencies
+         r = self.servers.get_response_times()
+
+         # Filter out those over their limit
+         available = {}
+         for host, latency in r.items():
+             # Use the specific limit from _limits_map OR the global one when not personalized
+             limit = self._limits_map.get(host) if self.servers.is_personalized else self._global_max_requests
+
+             if limit is None or self._request_counts.get(host, 0) < limit:
+                 available[host] = latency
+
+         if not available:
+             return None
+
+         return min(available, key=lambda t: available[t])
+
+     def peek(self) -> str | None:
+         if self.__healthMode:
+             return self._get_best_healthy()
+         else:
+             if not self.servers:
+                 return None
+             return self.servers[self.__index]
+
+     def get(self) -> str | None:
+         if self.__healthMode:
+             best = self._get_best_healthy()
+             if best:
+                 self._request_counts[best] = self._request_counts.get(best, 0) + 1
+             return best
+         else:
+             if not self.servers:
+                 return None
+             try:
+                 return self.servers[self.__index]
+             finally:
+                 self.__index = (self.__index + 1) % len(self.servers)
+
+     def get_all(self) -> list:
+         if self.__healthMode:
+             return self.servers.targets
+         else:
+             return self.servers
+
+     def set_index(self, index: int) -> None:
+         if self.__healthMode:
+             raise RuntimeError("set_index() is not supported in health mode")
+         if index < 0 or (self.servers and index >= len(self.servers)):
+             raise IndexError("Index out of range")
+         self.__index = index
+
+     async def proxy_pass(
+         self,
+         req: Request,
+         path: str,
+         timeout: float = 60.0,
+         forward_query: bool = True,
+         additional_headers: Optional[dict] = None,
+         override_headers: Optional[dict] = None,
+         override_body: Optional[bytes] = None,
+         method: Optional[str] = None
+     ):
+         from .proxy_pass import proxy_pass as _proxy_pass  # Late import
+
+         target = self.get()
+         if not target:
+             return Response("Service Unavailable: No healthy backends available or all over limit", status_code=503)
+
+         return await _proxy_pass(
+             request=req,
+             host=url_normalize(target, default_scheme="http"),
+             path=path,
+             timeout=timeout,
+             forward_query=forward_query,
+             additional_headers=additional_headers,
+             override_headers=override_headers,
+             override_body=override_body,
+             method=method
+         )
+
+     async def proxy_pass_websocket(
+         self,
+         websocket: WebSocket,
+         path: Optional[str] = None,
+         subprotocols: list[str] | None = None,
+         forward_query: bool = True,
+         additional_headers: Optional[dict] = None,
+         override_headers: Optional[dict] = None
+     ):
+         from .proxy_pass import proxy_pass_websocket as _proxy_pass_ws  # Late import
+
+         target = self.get()
+         if not target:
+             await websocket.close(code=1011)
+             return
+
+         u = urlparse(target)
+         origin = f"{u.scheme}://{u.netloc}"
+         # Smart pathing: combine origin and user-provided path (or the incoming path)
+         dest_url = f"{origin.rstrip('/')}/{path.lstrip('/') if path is not None else websocket.url.path}"
+
+         return await _proxy_pass_ws(
+             websocket,
+             dest_url,
+             subprotocols=subprotocols,
+             forward_query=forward_query,
+             additional_headers=additional_headers,
+             override_headers=override_headers
+         )
+
@@ -0,0 +1,15 @@
+ import httpx
+ from fastapi import FastAPI, Request
+
+ async def create_httpx_client(app: FastAPI):
+     """Initializes the HTTP client and stores it in the app state."""
+     app.state.http_proxy_client = httpx.AsyncClient()
+
+ async def close_httpx_client(app: FastAPI):
+     """Closes the HTTP client stored in the app state."""
+     if hasattr(app.state, "http_proxy_client"):
+         await app.state.http_proxy_client.aclose()
+
+ async def get_httpx_client(req: Request) -> httpx.AsyncClient:
+     """Retrieves the HTTP client from the app state."""
+     return req.app.state.http_proxy_client
@@ -0,0 +1,267 @@
1
+ from fastapi import Request, WebSocket, Response
2
+ from fastapi.responses import StreamingResponse
3
+ from starlette.background import BackgroundTask
4
+ from url_normalize import url_normalize
5
+ import httpx
6
+ import websockets
7
+ import asyncio
8
+ import logging
9
+ import inspect
10
+ from typing import Optional
11
+ from .proxy_httpx import get_httpx_client
12
+
13
+ logger = logging.getLogger("fastapi_reverse_proxy")
14
+
15
+ # Hop-by-hop headers that should typically not be forwarded by a proxy
16
+ # https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/TE
17
+ EXCLUDED_HEADERS = {
18
+ "connection", "keep-alive", "proxy-authenticate",
19
+ "proxy-authorization", "te", "trailers", "transfer-encoding", "upgrade"
20
+ }
21
+
22
+ async def proxy_pass(
23
+ request: Request,
24
+ host: str,
25
+ path: Optional[str] = None,
26
+ timeout: float = 60.0,
27
+ forward_query: bool = True,
28
+ additional_headers: Optional[dict] = None,
29
+ override_headers: Optional[dict] = None,
30
+ override_body: Optional[bytes] = None,
31
+ method: Optional[str] = None
32
+ ):
33
+ """
34
+ Forwards incoming HTTP requests to the target service using streaming.
35
+ - host: The host itself (without ending slash)
36
+ - path: The path with beggining slash
37
+ - forward_query: If True, automatically appends the request's query string.
38
+ - additional_headers: Headers to add to the upstream request.
39
+ - override_headers: Use these headers instead of original request headers.
40
+ - override_body: Use this body instead of streaming the request body.
41
+ - method: HTTP method to use (e.g., 'POST'). Defaults to original request method.
42
+ """
43
+
44
+ if path is None: path = request.url.path # set to path
45
+
46
+ # normalize host + path
47
+ url = url_normalize(host + path, default_scheme="http")
48
+
49
+ if forward_query and request.url.query:
50
+ url = f"{url}?{request.url.query}" if "?" not in url else f"{url}&{request.url.query}"
51
+
52
+ # normalize again
53
+ url = url_normalize(url ,default_scheme="http")
54
+
55
+ # Determine method
56
+ final_method = method or request.method
57
+
58
+ # Resolve headers
59
+ if override_headers is not None:
60
+ headers = dict(override_headers)
61
+ else:
62
+ headers = dict(request.headers)
63
+ # Identify the client's real IP and forward it
64
+ client_host = request.client.host if request.client else "unknown"
65
+ headers["X-Real-IP"] = client_host
66
+ if "X-Forwarded-For" in headers:
67
+ headers["X-Forwarded-For"] = f"{headers['X-Forwarded-For']}, {client_host}"
68
+ else:
69
+ headers["X-Forwarded-For"] = client_host
70
+
71
+ headers["X-Forwarded-Proto"] = request.url.scheme
72
+ headers["X-Forwarded-Host"] = headers.get("host", request.url.netloc)
73
+
74
+ # Apply additional headers
75
+ if additional_headers:
76
+ headers.update(additional_headers)
77
+
78
+ # Let httpx handle the host header and connection management
79
+ headers.pop("host", None)
80
+     headers.pop("connection", None)
+
+     client = None
+     try:
+         client = await get_httpx_client(request)
+         is_global_client = True
+     except Exception:
+         # Fallback for testing or standalone usage
+         client = httpx.AsyncClient()
+         is_global_client = False
+
+     try:
+         # Prepare content
+         if override_body is not None:
+             content = override_body
+         else:
+             # Stream the request body to the target (efficient for large uploads)
+             async def request_generator():
+                 async for chunk in request.stream():
+                     yield chunk
+             content = request_generator() if final_method in ("POST", "PUT", "PATCH", "DELETE") else None
+
+         # Create the upstream request
+         rp_req = client.build_request(
+             method=final_method,
+             url=url,
+             headers=headers,
+             content=content,
+             timeout=timeout
+         )
+
+         # Send the request and stream the response
+         rp_resp = await client.send(rp_req, stream=True)
+
+         try:
+             # Filter out excluded and stale entity headers from the response
+             resp_headers = {}
+             for k, v in rp_resp.headers.items():
+                 if k.lower() in EXCLUDED_HEADERS:
+                     continue
+                 if k.lower() in ("content-encoding", "content-length"):
+                     continue
+                 resp_headers[k] = v
+
+             resp_headers["X-Accel-Buffering"] = "no"
+             resp_headers["Cache-Control"] = "no-cache"
+
+             async def cleanup():
+                 await rp_resp.aclose()
+                 if not is_global_client:
+                     await client.aclose()
+
+             return StreamingResponse(
+                 rp_resp.aiter_bytes(),
+                 status_code=rp_resp.status_code,
+                 headers=resp_headers,
+                 background=BackgroundTask(cleanup)
+             )
+         except BaseException:
+             # If we fail here, ownership hasn't passed to StreamingResponse yet
+             await rp_resp.aclose()
+             raise
+
+     except BaseException:
+         # Catch EVERY exception (including CancelledError) for local client cleanup
+         if not is_global_client and client:
+             await client.aclose()
+         # Re-raise so the server can handle the cancellation/error
+         raise
+
+
+ async def proxy_pass_websocket(
+     websocket: WebSocket,
+     target_url: str,
+     subprotocols: Optional[list[str]] = None,
+     forward_query: bool = True,
+     additional_headers: Optional[dict] = None,
+     override_headers: Optional[dict] = None
+ ):
+     """
+     Forwards incoming WebSocket connections to the target service.
+     - target_url: The full destination WS(S) URL.
+     - forward_query: If True, automatically appends the request's query string.
+     """
+     url = target_url
+     if forward_query and websocket.url.query:
+         url = f"{url}?{websocket.url.query}" if "?" not in url else f"{url}&{websocket.url.query}"
+
+     # Ensure we use ws:// or wss://
+     if url.startswith("http://"):
+         url = url.replace("http://", "ws://", 1)
+     elif url.startswith("https://"):
+         url = url.replace("https://", "wss://", 1)
+
+     # Resolve headers: override replaces the defaults entirely
+     if override_headers is not None:
+         headers = dict(override_headers)
+     else:
+         client_host = websocket.client.host if websocket.client else "unknown"
+         headers = {
+             "X-Real-IP": client_host,
+             "X-Forwarded-For": client_host,
+             "X-Forwarded-Proto": websocket.url.scheme,
+             "X-Forwarded-Host": websocket.headers.get("host", websocket.url.netloc)
+         }
+
+     if additional_headers:
+         headers.update(additional_headers)
+
+     # Use subprotocols from scope if not provided explicitly
+     supported_subprotocols = subprotocols or websocket.scope.get("subprotocols")
+
+     try:
+         # Determine the correct header parameter name for this version of websockets
+         # Modern (12.0+): additional_headers, Legacy: extra_headers
+         _params = inspect.signature(websockets.connect).parameters
+         header_param = "additional_headers" if "additional_headers" in _params else "extra_headers"
+
+         connect_kwargs = {
+             header_param: headers,
+             "subprotocols": supported_subprotocols
+         }
+
+         async with websockets.connect(url, **connect_kwargs) as target_ws:
+             # Accept once we know the negotiated subprotocol
+             await websocket.accept(subprotocol=target_ws.subprotocol)
+             await _handle_ws_bidirectional(websocket, target_ws)
+
+     except BaseException as e:
+         if not isinstance(e, asyncio.CancelledError):
+             logger.error(f"WebSocket Proxy Error: {e}")
+         raise
+     finally:
+         try:
+             await websocket.close()
+         except Exception:
+             pass
+
+
+ async def _handle_ws_bidirectional(websocket: WebSocket, target_ws):
+     """Internal helper to manage bidirectional WS traffic with clean cancellation."""
+     async def client_to_target():
+         try:
+             while True:
+                 message = await websocket.receive()
+                 if message["type"] == "websocket.receive":
+                     if "text" in message:
+                         await target_ws.send(message["text"])
+                     elif "bytes" in message:
+                         await target_ws.send(message["bytes"])
+                 elif message["type"] == "websocket.disconnect":
+                     break
+         except (Exception, asyncio.CancelledError):
+             # Exit loop on error or cancellation
+             pass
+
+     async def target_to_client():
+         try:
+             while True:
+                 message = await target_ws.recv()
+                 if isinstance(message, str):
+                     await websocket.send_text(message)
+                 else:
+                     await websocket.send_bytes(message)
+         except (Exception, asyncio.CancelledError):
+             # Exit loop on error, closure, or cancellation
+             # (websockets.ConnectionClosed is already an Exception subclass)
+             pass
+
+     # Wrap in tasks for cancellation
+     tasks = [
+         asyncio.create_task(client_to_target()),
+         asyncio.create_task(target_to_client())
+     ]
+
+     try:
+         await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
+     finally:
+         for task in tasks:
+             if not task.done():
+                 task.cancel()
+         # Ensure all tasks are gathered and exceptions (including CancelledError) are handled
+         await asyncio.gather(*tasks, return_exceptions=True)
@@ -0,0 +1,124 @@
+ Metadata-Version: 2.4
+ Name: fastapi-reverse-proxy
+ Version: 0.1.0
+ Summary: A robust, streaming-capable reverse proxy for FastAPI including WebSocket support.
+ Author-email: Tomás <tomas@suricatingss.xyz>
+ Classifier: Programming Language :: Python :: 3
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Operating System :: OS Independent
+ Classifier: Framework :: FastAPI
+ Classifier: Topic :: Internet :: Proxy Servers
+ Requires-Python: >=3.8
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ Requires-Dist: fastapi>=0.129.0
+ Requires-Dist: httpx>=0.28.1
+ Requires-Dist: starlette>=0.52.1
+ Requires-Dist: uvicorn>=0.41.0
+ Requires-Dist: anyio>=4.12.1
+ Requires-Dist: certifi>=2026.1.4
+ Requires-Dist: pydantic>=2.12.5
+ Requires-Dist: url-normalize>=2.2.1
+ Requires-Dist: typing_extensions>=4.15.0
+ Requires-Dist: websockets>=16.0
+ Dynamic: license-file
+
+ # FastAPI Reverse Proxy
+
+ A robust, streaming-capable reverse proxy for FastAPI/Starlette with built-in **Latency-Based Load Balancing** and **Active Health Monitoring**.
+
+ ## Features
+
+ - **Streaming Ready**: Efficiently handles SSE (Server-Sent Events) and large file uploads/downloads.
+ - **WebSocket Support**: Seamless bidirectional tunneling with automatic subprotocol negotiation.
+ - **Unified Load Balancing**: Standard round-robin or latency-aware routing through a single utility.
+ - **Latency-Based Routing**: Automatically routes traffic to the fastest healthy server (HEAD probe).
+ - **Advanced Overrides**: Granular control over headers, body, and HTTP methods.
+ - **Robust Cancellation**: Specialized handling for `asyncio.CancelledError` to prevent resource leaks.
+ - **Version Agnostic**: Automatically handles `websockets` library version differences (12.0+ vs legacy).
+
+ ## Quick Start (Best Practice)
+
+ The recommended way to use the library is within a FastAPI **lifespan** handler. This ensures all background monitoring tasks and HTTP clients start and stop cleanly.
+
+ ```python
+ from contextlib import asynccontextmanager
+
+ from fastapi import FastAPI, Request, WebSocket
+
+ from fastapi_reverse_proxy import (
+     HealthChecker, LoadBalancer,
+     create_httpx_client, close_httpx_client
+ )
+
+ # 1. Set up health monitoring and load balancing
+ checker = HealthChecker(["http://localhost:8080", "http://localhost:8081"])
+ lb = LoadBalancer(checker)
+
+ @asynccontextmanager
+ async def lifespan(app: FastAPI):
+     # Initialize global resources
+     await create_httpx_client(app)
+     async with checker:  # Starts the background health loop
+         yield
+     await close_httpx_client(app)
+
+ app = FastAPI(lifespan=lifespan)
+
+ @app.api_route("/{path:path}", methods=["GET", "POST", "PUT", "DELETE"])
+ async def gateway(request: Request, path: str):
+     # Route to the fastest healthy backend
+     return await lb.proxy_pass(request, path=f"/{path}")
+
+ @app.websocket("/ws/{path:path}")
+ async def ws_tunnel(websocket: WebSocket, path: str):
+     # Automatic subprotocol negotiation + tunneling
+     await lb.proxy_pass_websocket(websocket, path=f"/{path}")
+ ```
+
+ ## Advanced Proxying
+
+ The `proxy_pass` function and `LoadBalancer.proxy_pass` provide deep customization of upstream requests:
+
+ | Parameter | Type | Description |
+ | :--- | :--- | :--- |
+ | `method` | `str` | Force a specific HTTP method (e.g., `"POST"`). |
+ | `override_body` | `bytes` | Send custom data instead of the incoming request body. |
+ | `additional_headers` | `dict` | Append custom headers to the proxied request. |
+ | `override_headers` | `dict` | Use these headers *instead of* the original request headers. |
+ | `forward_query` | `bool` | Whether to append the incoming query string (default: `True`). |
+
+
91
+ ## Monitoring & Configuration
92
+
93
+ ### HealthChecker (The Loop Owner)
94
+ The **proactive** component. It owns an internal `asyncio` background task that monitors backends.
95
+ - **Immediate Start**: When you enter the `async with` block (or call `start()`), the checker performs an **immediate** check of all backends. This eliminates the "cold-start" window where backends are unknown.
96
+ - **Configuration Modes**:
97
+ - **Standard**: `HealthChecker(["http://a", "http://b"], ping_path="/health")`
98
+ - **Personalized**: Pass a list of dictionaries for per-host settings:
99
+ ```python
100
+ checker = HealthChecker([
101
+ {"host": "http://api-1", "pingpath": "/v1/status", "maxrequests": 100},
102
+ {"host": "http://api-2", "pingpath": "/health"}
103
+ ])
104
+ ```
105
+ - **Properties**:
106
+ - `ping_path`: Get or set the global health check path (default: `"/"`).
107
+
108
+ ### LoadBalancer (The Decision Utility)
109
+ A **normal Python object** that makes routing decisions based on its source.
110
+ - **Stateful**: While it has no background loop, it **does track state** (request counts for rate-limiting and the last time it pulled data from the health checker).
111
+ - **No Lifecycle Needed**: It relies on the `HealthChecker` (or a static list) for data and doesn't need explicit `start`/`stop` calls.
112
+
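The latency-based decision described above reduces to "fastest healthy host wins". A minimal sketch, assuming hypothetical `(host, healthy, latency_ms)` records (the field names are illustrative, not the library's actual data model; the real `LoadBalancer` pulls this data from the `HealthChecker`):

```python
# Hypothetical sketch of latency-based routing; field names are assumptions.
def pick_fastest(backends):
    """Return the healthy host with the lowest measured latency, or None."""
    healthy = [b for b in backends if b["healthy"]]
    if not healthy:
        return None
    return min(healthy, key=lambda b: b["latency_ms"])["host"]

backends = [
    {"host": "http://api-1", "healthy": True, "latency_ms": 42.0},
    {"host": "http://api-2", "healthy": True, "latency_ms": 8.5},
    {"host": "http://api-3", "healthy": False, "latency_ms": 1.0},
]
chosen = pick_fastest(backends)
```

Note that `api-3` is never chosen despite the lowest latency, because unhealthy hosts are filtered out before the comparison.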
+ ## WebSocket Refinements
+
+ The library implements "deferred negotiation" for WebSockets:
+ 1. The proxy receives the client's supported subprotocols from the ASGI `scope`.
+ 2. It establishes the upstream connection first.
+ 3. Once the upstream accepts a protocol, the proxy calls `websocket.accept(subprotocol=...)` back to the client.
+ 4. This ensures the entire tunnel (Client <-> Proxy <-> Upstream) uses the same negotiated protocol.
+
+
121
+ ## Robustness & Safety
122
+
123
+ - **Termination Safety**: Resource cleanup (closing `httpx` clients and sockets) is triggered even on task cancellation (`BaseException`).
124
+ - **Introspection-Based Compatibility**: Uses `inspect.signature` to automatically detect version-specific parameters in the `websockets` library.
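The version detection boils down to one `inspect.signature` call. A self-contained sketch against a stand-in `connect` function (the library inspects the real `websockets.connect` instead):

```python
import inspect

# Stand-in for websockets.connect: modern versions (12.0+) accept
# additional_headers, while legacy versions used extra_headers.
async def connect(url, *, additional_headers=None, subprotocols=None):
    ...

# Pick whichever keyword argument this version of the function actually has.
_params = inspect.signature(connect).parameters
header_param = "additional_headers" if "additional_headers" in _params else "extra_headers"
connect_kwargs = {header_param: {"X-Real-IP": "10.0.0.1"}, "subprotocols": ["json"]}
```

Against a legacy signature the same check would fall through to `extra_headers`, which is why no version pinning is needed.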
@@ -0,0 +1,13 @@
+ LICENSE
+ README.md
+ pyproject.toml
+ src/fastapi_reverse_proxy/__init__.py
+ src/fastapi_reverse_proxy/health_check.py
+ src/fastapi_reverse_proxy/load_balance.py
+ src/fastapi_reverse_proxy/proxy_httpx.py
+ src/fastapi_reverse_proxy/proxy_pass.py
+ src/fastapi_reverse_proxy.egg-info/PKG-INFO
+ src/fastapi_reverse_proxy.egg-info/SOURCES.txt
+ src/fastapi_reverse_proxy.egg-info/dependency_links.txt
+ src/fastapi_reverse_proxy.egg-info/requires.txt
+ src/fastapi_reverse_proxy.egg-info/top_level.txt
@@ -0,0 +1,10 @@
+ fastapi>=0.129.0
+ httpx>=0.28.1
+ starlette>=0.52.1
+ uvicorn>=0.41.0
+ anyio>=4.12.1
+ certifi>=2026.1.4
+ pydantic>=2.12.5
+ url-normalize>=2.2.1
+ typing_extensions>=4.15.0
+ websockets>=16.0