pydantic-rpc 0.8.0__tar.gz → 0.10.0__tar.gz

@@ -1,3 +1,20 @@
1
+ Metadata-Version: 2.3
2
+ Name: pydantic-rpc
3
+ Version: 0.10.0
4
+ Summary: A Python library for building gRPC/ConnectRPC services with Pydantic models.
5
+ Author: Yasushi Itoh
6
+ Requires-Dist: annotated-types==0.7.0
7
+ Requires-Dist: pydantic>=2.1.1
8
+ Requires-Dist: grpcio>=1.56.2
9
+ Requires-Dist: grpcio-tools>=1.56.2
10
+ Requires-Dist: grpcio-reflection>=1.56.2
11
+ Requires-Dist: grpcio-health-checking>=1.56.2
12
+ Requires-Dist: connecpy>=2.2.0
13
+ Requires-Dist: mcp>=1.9.4
14
+ Requires-Dist: starlette>=0.27.0
15
+ Requires-Python: >=3.11
16
+ Description-Content-Type: text/markdown
17
+
1
18
  # 🚀 PydanticRPC
2
19
 
3
20
  **PydanticRPC** is a Python library that enables you to rapidly expose [Pydantic](https://docs.pydantic.dev/) models via [gRPC](https://grpc.io/)/[Connect RPC](https://connectrpc.com/docs/protocol/) services without writing any protobuf files. Instead, it automatically generates protobuf files on the fly from the method signatures of your Python objects and the type signatures of your Pydantic models.
@@ -58,7 +75,7 @@ import asyncio
58
75
  from openai import AsyncOpenAI
59
76
  from pydantic_ai import Agent
60
77
  from pydantic_ai.models.openai import OpenAIModel
61
- from pydantic_rpc import ConnecpyASGIApp, Message
78
+ from pydantic_rpc import ASGIApp, Message
62
79
 
63
80
 
64
81
  class CityLocation(Message):
@@ -89,7 +106,7 @@ class OlympicsLocationAgent:
89
106
  result = await self._agent.run(req.prompt())
90
107
  return result.data
91
108
 
92
- app = ConnecpyASGIApp()
109
+ app = ASGIApp()
93
110
  app.mount(OlympicsLocationAgent())
94
111
 
95
112
  ```
@@ -105,10 +122,10 @@ app.mount(OlympicsLocationAgent())
105
122
  - 💚 **Health Checking:** Built-in support for gRPC health checks using `grpc_health.v1`.
106
123
  - 🔎 **Server Reflection:** Built-in support for gRPC server reflection.
107
124
  - ⚡ **Asynchronous Support:** Easily create asynchronous gRPC services with `AsyncIOServer`.
108
- - **For gRPC-Web:**
109
- - 🌐 **WSGI/ASGI Support:** Create gRPC-Web services that can run as WSGI or ASGI applications powered by `Sonora`.
110
125
  - **For Connect-RPC:**
111
- - 🌐 **Connecpy Support:** Partially supports Connect-RPC via `Connecpy`.
126
+ - 🌐 **Full Protocol Support:** Native Connect-RPC support via `Connecpy` v2.2.0+.
127
+ - 🔄 **All Streaming Patterns:** Unary, server streaming, client streaming, and bidirectional streaming.
128
+ - 🌐 **WSGI/ASGI Applications:** Run as standard WSGI or ASGI applications for easy deployment.
112
129
  - 🛠️ **Pre-generated Protobuf Files and Code:** Pre-generate proto files and corresponding code via the CLI. By setting the environment variable (PYDANTIC_RPC_SKIP_GENERATION), you can skip runtime generation.
113
130
  - 🤖 **MCP (Model Context Protocol) Support:** Expose your services as tools for AI assistants using the official MCP SDK, supporting both stdio and HTTP/SSE transports.
114
131
 
@@ -122,7 +139,11 @@ pip install pydantic-rpc
122
139
 
123
140
  ## 🚀 Getting Started
124
141
 
125
- ### 🔧 Synchronous Service Example
142
+ PydanticRPC supports two main protocols:
143
+ - **gRPC**: Traditional gRPC services with `Server` and `AsyncIOServer`
144
+ - **Connect-RPC**: Modern HTTP-based RPC with `ASGIApp` and `WSGIApp`
145
+
146
+ ### 🔧 Synchronous gRPC Service Example
126
147
 
127
148
  ```python
128
149
  from pydantic_rpc import Server, Message
@@ -143,7 +164,7 @@ if __name__ == "__main__":
143
164
  server.run(Greeter())
144
165
  ```
145
166
 
146
- ### ⚙️ Asynchronous Service Example
167
+ ### ⚙️ Asynchronous gRPC Service Example
147
168
 
148
169
  ```python
149
170
  import asyncio
@@ -176,7 +197,7 @@ if __name__ == "__main__":
176
197
 
177
198
  The AsyncIOServer automatically handles graceful shutdown on SIGTERM and SIGINT signals.
178
199
 
179
- ### 🌐 ASGI Application Example
200
+ ### 🌐 Connect-RPC ASGI Application Example
180
201
 
181
202
  ```python
182
203
  from pydantic_rpc import ASGIApp, Message
@@ -188,27 +209,17 @@ class HelloReply(Message):
188
209
  message: str
189
210
 
190
211
  class Greeter:
191
- def say_hello(self, request: HelloRequest) -> HelloReply:
212
+ async def say_hello(self, request: HelloRequest) -> HelloReply:
192
213
  return HelloReply(message=f"Hello, {request.name}!")
193
214
 
194
-
195
- async def app(scope, receive, send):
196
- """ASGI application.
197
-
198
- Args:
199
- scope (dict): The ASGI scope.
200
- receive (callable): The receive function.
201
- send (callable): The send function.
202
- """
203
- pass
204
-
205
- # Please note that `app` is any ASGI application, such as FastAPI or Starlette.
206
-
207
- app = ASGIApp(app)
215
+ app = ASGIApp()
208
216
  app.mount(Greeter())
217
+
218
+ # Run with uvicorn:
219
+ # uvicorn script:app --host 0.0.0.0 --port 8000
209
220
  ```
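Once the app is served, a unary call is a plain HTTP POST with a JSON body. The sketch below assumes the URL path `greeter.v1.Greeter/SayHello`; the actual path comes from the package and service names in the generated proto, so check the generated file and adjust host/port to match your uvicorn command.

```python
# Sketch: calling the Connect-RPC endpoint above using only the standard library.
# The path "greeter.v1.Greeter/SayHello" is a hypothetical placeholder.
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:8000/greeter.v1.Greeter/SayHello",
    data=json.dumps({"name": "World"}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read()))  # expected shape: {"message": "Hello, World!"}
```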
210
221
 
211
- ### 🌐 WSGI Application Example
222
+ ### 🌐 Connect-RPC WSGI Application Example
212
223
 
213
224
  ```python
214
225
  from pydantic_rpc import WSGIApp, Message
@@ -223,31 +234,69 @@ class Greeter:
223
234
  def say_hello(self, request: HelloRequest) -> HelloReply:
224
235
  return HelloReply(message=f"Hello, {request.name}!")
225
236
 
226
- def app(environ, start_response):
227
- """WSGI application.
228
-
229
- Args:
230
- environ (dict): The WSGI environment.
231
- start_response (callable): The start_response function.
232
- """
233
- pass
234
-
235
- # Please note that `app` is any WSGI application, such as Flask or Django.
236
-
237
- app = WSGIApp(app)
237
+ app = WSGIApp()
238
238
  app.mount(Greeter())
239
+
240
+ # Run with gunicorn:
241
+ # gunicorn script:app
239
242
  ```
240
243
 
241
- ### 🏆 Connecpy (Connect-RPC) Example
244
+ ### 🏆 Connect-RPC with Streaming Example
242
245
 
243
- PydanticRPC also partially supports Connect-RPC via connecpy. Check out “greeting_connecpy.py” for an example:
246
+ PydanticRPC provides native Connect-RPC support via Connecpy v2.2.0+, including full streaming capabilities and PEP 8 naming conventions. Check out our ASGI examples:
244
247
 
245
248
  ```bash
246
- uv run greeting_connecpy.py
249
+ # Run with uvicorn
250
+ uv run uvicorn greeting_asgi:app --port 3000
251
+
252
+ # Or run the streaming example
253
+ uv run python examples/streaming_connecpy.py
247
254
  ```
248
255
 
249
256
  This will launch a Connecpy-based ASGI application that uses the same Pydantic models to serve Connect-RPC requests.
250
257
 
258
+ #### Streaming Support with Connecpy
259
+
260
+ Connecpy v2.2.0 provides full support for streaming RPCs with automatic PEP 8 naming (snake_case):
261
+
262
+ ```python
263
+ from typing import AsyncIterator
264
+ from pydantic_rpc import ASGIApp, Message
265
+
266
+ class StreamRequest(Message):
267
+ text: str
268
+ count: int
269
+
270
+ class StreamResponse(Message):
271
+ text: str
272
+ index: int
273
+
274
+ class StreamingService:
275
+ # Server streaming
276
+ async def server_stream(self, request: StreamRequest) -> AsyncIterator[StreamResponse]:
277
+ for i in range(request.count):
278
+ yield StreamResponse(text=f"{request.text}_{i}", index=i)
279
+
280
+ # Client streaming
281
+ async def client_stream(self, requests: AsyncIterator[StreamRequest]) -> StreamResponse:
282
+ texts = []
283
+ async for req in requests:
284
+ texts.append(req.text)
285
+ return StreamResponse(text=" ".join(texts), index=len(texts))
286
+
287
+ # Bidirectional streaming
288
+ async def bidi_stream(
289
+ self, requests: AsyncIterator[StreamRequest]
290
+ ) -> AsyncIterator[StreamResponse]:
291
+ idx = 0
292
+ async for req in requests:
293
+ yield StreamResponse(text=f"Echo: {req.text}", index=idx)
294
+ idx += 1
295
+
296
+ app = ASGIApp()
297
+ app.mount(StreamingService())
298
+ ```
299
+
251
300
  > [!NOTE]
252
301
  > Please install `protoc-gen-connecpy` to run the Connecpy example.
253
302
  >
@@ -284,9 +333,9 @@ export PYDANTIC_RPC_RESERVED_FIELDS=1
284
333
 
285
334
  ## 💎 Advanced Features
286
335
 
287
- ### 🌊 Response Streaming
288
- PydanticRPC supports streaming responses only for asynchronous gRPC and gRPC-Web services.
289
- If a service class methods return type is `typing.AsyncIterator[T]`, the method is considered a streaming method.
336
+ ### 🌊 Response Streaming (gRPC)
337
+ PydanticRPC supports streaming responses for both gRPC and Connect-RPC services; this section covers the gRPC side (the Connect-RPC streaming example appears above).
338
+ If a service class method's return type is `typing.AsyncIterator[T]`, the method is considered a streaming method.
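A minimal sketch of the shape such a method takes (names are illustrative; mount the class on an `AsyncIOServer` as in the asynchronous example above):

```python
# Sketch: a server-streaming method. Returning AsyncIterator[T] (an async
# generator) is what marks the RPC as streaming; all names here are illustrative.
from typing import AsyncIterator

from pydantic_rpc import Message

class CountRequest(Message):
    up_to: int

class CountReply(Message):
    value: int

class Counter:
    async def count_up(self, request: CountRequest) -> AsyncIterator[CountReply]:
        for i in range(request.up_to):
            yield CountReply(value=i)
```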
290
339
 
291
340
 
292
341
  Please see the sample code below:
@@ -852,9 +901,45 @@ class GoodMessage(Message):
852
901
  - Test error cases thoroughly
853
902
  - Be aware that errors fail silently
854
903
 
904
+ ### 🔒 TLS/mTLS Support
905
+
906
+ PydanticRPC provides built-in support for TLS (Transport Layer Security) and mTLS (mutual TLS) for secure gRPC communication.
907
+
908
+ ```python
909
+ from pydantic_rpc import AsyncIOServer, GrpcTLSConfig, extract_peer_identity
910
+ import grpc
911
+
912
+ # Basic TLS (server authentication only)
913
+ tls_config = GrpcTLSConfig(
914
+ cert_chain=server_cert_bytes,
915
+ private_key=server_key_bytes,
916
+ require_client_cert=False
917
+ )
918
+
919
+ # mTLS (mutual authentication)
920
+ tls_config = GrpcTLSConfig(
921
+ cert_chain=server_cert_bytes,
922
+ private_key=server_key_bytes,
923
+ root_certs=ca_cert_bytes, # CA to verify client certificates
924
+ require_client_cert=True
925
+ )
926
+
927
+ # Create server with TLS
928
+ server = AsyncIOServer(tls=tls_config)
929
+
930
+ # Extract client identity in service methods
931
+ class SecureService:
932
+ async def secure_method(self, request, context: grpc.ServicerContext):
933
+ client_identity = extract_peer_identity(context)
934
+ if client_identity:
935
+ print(f"Authenticated client: {client_identity}")
936
+ ```
937
+
938
+ For a complete example, see [examples/tls_server.py](examples/tls_server.py) and [examples/tls_client.py](examples/tls_client.py).
939
+
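On the client side, the standard grpc-python credentials API pairs with this setup. The following is a sketch using plain `grpc.aio`, not a pydantic-rpc API; certificate paths, host, and port are placeholders.

```python
# Sketch: connecting to the TLS/mTLS server above with plain grpc.aio.
# Certificate paths are placeholders; for plain TLS, omit the client key/cert.
import asyncio
from pathlib import Path

import grpc

async def main() -> None:
    creds = grpc.ssl_channel_credentials(
        root_certificates=Path("certs/ca.crt").read_bytes(),
        private_key=Path("certs/client.key").read_bytes(),        # mTLS only
        certificate_chain=Path("certs/client.crt").read_bytes(),  # mTLS only
    )
    async with grpc.aio.secure_channel("localhost:50051", creds) as channel:
        await channel.channel_ready()
        # Use the generated stub on `channel` to call SecureService methods.

if __name__ == "__main__":
    asyncio.run(main())
```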
855
940
  ### 🔗 Multiple Services with Custom Interceptors
856
941
 
857
- PydanticRPC supports defining and running multiple services in a single server:
942
+ PydanticRPC supports defining and running multiple gRPC services in a single server:
858
943
 
859
944
  ```python
860
945
  from datetime import datetime
@@ -985,11 +1070,11 @@ Any MCP-compatible client can connect to your service. For example, to configure
985
1070
  MCP can also be mounted to existing ASGI applications:
986
1071
 
987
1072
  ```python
988
- from pydantic_rpc import ConnecpyASGIApp
1073
+ from pydantic_rpc import ASGIApp
989
1074
  from pydantic_rpc.mcp import MCPExporter
990
1075
 
991
1076
  # Create Connect-RPC ASGI app
992
- app = ConnecpyASGIApp()
1077
+ app = ASGIApp()
993
1078
  app.mount(MathService())
994
1079
 
995
1080
  # Add MCP support via HTTP/SSE
@@ -1,21 +1,3 @@
1
- Metadata-Version: 2.3
2
- Name: pydantic-rpc
3
- Version: 0.8.0
4
- Summary: A Python library for building gRPC/ConnectRPC services with Pydantic models.
5
- Author: Yasushi Itoh
6
- Requires-Dist: annotated-types>=0.5.0
7
- Requires-Dist: pydantic>=2.1.1
8
- Requires-Dist: grpcio>=1.56.2
9
- Requires-Dist: grpcio-tools>=1.56.2
10
- Requires-Dist: grpcio-reflection>=1.56.2
11
- Requires-Dist: grpcio-health-checking>=1.56.2
12
- Requires-Dist: sonora>=0.2.3
13
- Requires-Dist: connecpy==2.0.0
14
- Requires-Dist: mcp>=1.9.4
15
- Requires-Dist: starlette>=0.27.0
16
- Requires-Python: >=3.11
17
- Description-Content-Type: text/markdown
18
-
19
1
  # 🚀 PydanticRPC
20
2
 
21
3
  **PydanticRPC** is a Python library that enables you to rapidly expose [Pydantic](https://docs.pydantic.dev/) models via [gRPC](https://grpc.io/)/[Connect RPC](https://connectrpc.com/docs/protocol/) services without writing any protobuf files. Instead, it automatically generates protobuf files on the fly from the method signatures of your Python objects and the type signatures of your Pydantic models.
@@ -76,7 +58,7 @@ import asyncio
76
58
  from openai import AsyncOpenAI
77
59
  from pydantic_ai import Agent
78
60
  from pydantic_ai.models.openai import OpenAIModel
79
- from pydantic_rpc import ConnecpyASGIApp, Message
61
+ from pydantic_rpc import ASGIApp, Message
80
62
 
81
63
 
82
64
  class CityLocation(Message):
@@ -107,7 +89,7 @@ class OlympicsLocationAgent:
107
89
  result = await self._agent.run(req.prompt())
108
90
  return result.data
109
91
 
110
- app = ConnecpyASGIApp()
92
+ app = ASGIApp()
111
93
  app.mount(OlympicsLocationAgent())
112
94
 
113
95
  ```
@@ -123,10 +105,10 @@ app.mount(OlympicsLocationAgent())
123
105
  - 💚 **Health Checking:** Built-in support for gRPC health checks using `grpc_health.v1`.
124
106
  - 🔎 **Server Reflection:** Built-in support for gRPC server reflection.
125
107
  - ⚡ **Asynchronous Support:** Easily create asynchronous gRPC services with `AsyncIOServer`.
126
- - **For gRPC-Web:**
127
- - 🌐 **WSGI/ASGI Support:** Create gRPC-Web services that can run as WSGI or ASGI applications powered by `Sonora`.
128
108
  - **For Connect-RPC:**
129
- - 🌐 **Connecpy Support:** Partially supports Connect-RPC via `Connecpy`.
109
+ - 🌐 **Full Protocol Support:** Native Connect-RPC support via `Connecpy` v2.2.0+.
110
+ - 🔄 **All Streaming Patterns:** Unary, server streaming, client streaming, and bidirectional streaming.
111
+ - 🌐 **WSGI/ASGI Applications:** Run as standard WSGI or ASGI applications for easy deployment.
130
112
  - 🛠️ **Pre-generated Protobuf Files and Code:** Pre-generate proto files and corresponding code via the CLI. By setting the environment variable (PYDANTIC_RPC_SKIP_GENERATION), you can skip runtime generation.
131
113
  - 🤖 **MCP (Model Context Protocol) Support:** Expose your services as tools for AI assistants using the official MCP SDK, supporting both stdio and HTTP/SSE transports.
132
114
 
@@ -140,7 +122,11 @@ pip install pydantic-rpc
140
122
 
141
123
  ## 🚀 Getting Started
142
124
 
143
- ### 🔧 Synchronous Service Example
125
+ PydanticRPC supports two main protocols:
126
+ - **gRPC**: Traditional gRPC services with `Server` and `AsyncIOServer`
127
+ - **Connect-RPC**: Modern HTTP-based RPC with `ASGIApp` and `WSGIApp`
128
+
129
+ ### 🔧 Synchronous gRPC Service Example
144
130
 
145
131
  ```python
146
132
  from pydantic_rpc import Server, Message
@@ -161,7 +147,7 @@ if __name__ == "__main__":
161
147
  server.run(Greeter())
162
148
  ```
163
149
 
164
- ### ⚙️ Asynchronous Service Example
150
+ ### ⚙️ Asynchronous gRPC Service Example
165
151
 
166
152
  ```python
167
153
  import asyncio
@@ -194,7 +180,7 @@ if __name__ == "__main__":
194
180
 
195
181
  The AsyncIOServer automatically handles graceful shutdown on SIGTERM and SIGINT signals.
196
182
 
197
- ### 🌐 ASGI Application Example
183
+ ### 🌐 Connect-RPC ASGI Application Example
198
184
 
199
185
  ```python
200
186
  from pydantic_rpc import ASGIApp, Message
@@ -206,27 +192,17 @@ class HelloReply(Message):
206
192
  message: str
207
193
 
208
194
  class Greeter:
209
- def say_hello(self, request: HelloRequest) -> HelloReply:
195
+ async def say_hello(self, request: HelloRequest) -> HelloReply:
210
196
  return HelloReply(message=f"Hello, {request.name}!")
211
197
 
212
-
213
- async def app(scope, receive, send):
214
- """ASGI application.
215
-
216
- Args:
217
- scope (dict): The ASGI scope.
218
- receive (callable): The receive function.
219
- send (callable): The send function.
220
- """
221
- pass
222
-
223
- # Please note that `app` is any ASGI application, such as FastAPI or Starlette.
224
-
225
- app = ASGIApp(app)
198
+ app = ASGIApp()
226
199
  app.mount(Greeter())
200
+
201
+ # Run with uvicorn:
202
+ # uvicorn script:app --host 0.0.0.0 --port 8000
227
203
  ```
228
204
 
229
- ### 🌐 WSGI Application Example
205
+ ### 🌐 Connect-RPC WSGI Application Example
230
206
 
231
207
  ```python
232
208
  from pydantic_rpc import WSGIApp, Message
@@ -241,31 +217,69 @@ class Greeter:
241
217
  def say_hello(self, request: HelloRequest) -> HelloReply:
242
218
  return HelloReply(message=f"Hello, {request.name}!")
243
219
 
244
- def app(environ, start_response):
245
- """WSGI application.
246
-
247
- Args:
248
- environ (dict): The WSGI environment.
249
- start_response (callable): The start_response function.
250
- """
251
- pass
252
-
253
- # Please note that `app` is any WSGI application, such as Flask or Django.
254
-
255
- app = WSGIApp(app)
220
+ app = WSGIApp()
256
221
  app.mount(Greeter())
222
+
223
+ # Run with gunicorn:
224
+ # gunicorn script:app
257
225
  ```
258
226
 
259
- ### 🏆 Connecpy (Connect-RPC) Example
227
+ ### 🏆 Connect-RPC with Streaming Example
260
228
 
261
- PydanticRPC also partially supports Connect-RPC via connecpy. Check out “greeting_connecpy.py” for an example:
229
+ PydanticRPC provides native Connect-RPC support via Connecpy v2.2.0+, including full streaming capabilities and PEP 8 naming conventions. Check out our ASGI examples:
262
230
 
263
231
  ```bash
264
- uv run greeting_connecpy.py
232
+ # Run with uvicorn
233
+ uv run uvicorn greeting_asgi:app --port 3000
234
+
235
+ # Or run the streaming example
236
+ uv run python examples/streaming_connecpy.py
265
237
  ```
266
238
 
267
239
  This will launch a Connecpy-based ASGI application that uses the same Pydantic models to serve Connect-RPC requests.
268
240
 
241
+ #### Streaming Support with Connecpy
242
+
243
+ Connecpy v2.2.0 provides full support for streaming RPCs with automatic PEP 8 naming (snake_case):
244
+
245
+ ```python
246
+ from typing import AsyncIterator
247
+ from pydantic_rpc import ASGIApp, Message
248
+
249
+ class StreamRequest(Message):
250
+ text: str
251
+ count: int
252
+
253
+ class StreamResponse(Message):
254
+ text: str
255
+ index: int
256
+
257
+ class StreamingService:
258
+ # Server streaming
259
+ async def server_stream(self, request: StreamRequest) -> AsyncIterator[StreamResponse]:
260
+ for i in range(request.count):
261
+ yield StreamResponse(text=f"{request.text}_{i}", index=i)
262
+
263
+ # Client streaming
264
+ async def client_stream(self, requests: AsyncIterator[StreamRequest]) -> StreamResponse:
265
+ texts = []
266
+ async for req in requests:
267
+ texts.append(req.text)
268
+ return StreamResponse(text=" ".join(texts), index=len(texts))
269
+
270
+ # Bidirectional streaming
271
+ async def bidi_stream(
272
+ self, requests: AsyncIterator[StreamRequest]
273
+ ) -> AsyncIterator[StreamResponse]:
274
+ idx = 0
275
+ async for req in requests:
276
+ yield StreamResponse(text=f"Echo: {req.text}", index=idx)
277
+ idx += 1
278
+
279
+ app = ASGIApp()
280
+ app.mount(StreamingService())
281
+ ```
282
+
269
283
  > [!NOTE]
270
284
  > Please install `protoc-gen-connecpy` to run the Connecpy example.
271
285
  >
@@ -302,9 +316,9 @@ export PYDANTIC_RPC_RESERVED_FIELDS=1
302
316
 
303
317
  ## 💎 Advanced Features
304
318
 
305
- ### 🌊 Response Streaming
306
- PydanticRPC supports streaming responses only for asynchronous gRPC and gRPC-Web services.
307
- If a service class methods return type is `typing.AsyncIterator[T]`, the method is considered a streaming method.
319
+ ### 🌊 Response Streaming (gRPC)
320
+ PydanticRPC supports streaming responses for both gRPC and Connect-RPC services; this section covers the gRPC side (the Connect-RPC streaming example appears above).
321
+ If a service class method's return type is `typing.AsyncIterator[T]`, the method is considered a streaming method.
308
322
 
309
323
 
310
324
  Please see the sample code below:
@@ -870,9 +884,45 @@ class GoodMessage(Message):
870
884
  - Test error cases thoroughly
871
885
  - Be aware that errors fail silently
872
886
 
887
+ ### 🔒 TLS/mTLS Support
888
+
889
+ PydanticRPC provides built-in support for TLS (Transport Layer Security) and mTLS (mutual TLS) for secure gRPC communication.
890
+
891
+ ```python
892
+ from pydantic_rpc import AsyncIOServer, GrpcTLSConfig, extract_peer_identity
893
+ import grpc
894
+
895
+ # Basic TLS (server authentication only)
896
+ tls_config = GrpcTLSConfig(
897
+ cert_chain=server_cert_bytes,
898
+ private_key=server_key_bytes,
899
+ require_client_cert=False
900
+ )
901
+
902
+ # mTLS (mutual authentication)
903
+ tls_config = GrpcTLSConfig(
904
+ cert_chain=server_cert_bytes,
905
+ private_key=server_key_bytes,
906
+ root_certs=ca_cert_bytes, # CA to verify client certificates
907
+ require_client_cert=True
908
+ )
909
+
910
+ # Create server with TLS
911
+ server = AsyncIOServer(tls=tls_config)
912
+
913
+ # Extract client identity in service methods
914
+ class SecureService:
915
+ async def secure_method(self, request, context: grpc.ServicerContext):
916
+ client_identity = extract_peer_identity(context)
917
+ if client_identity:
918
+ print(f"Authenticated client: {client_identity}")
919
+ ```
920
+
921
+ For a complete example, see [examples/tls_server.py](examples/tls_server.py) and [examples/tls_client.py](examples/tls_client.py).
922
+
873
923
  ### 🔗 Multiple Services with Custom Interceptors
874
924
 
875
- PydanticRPC supports defining and running multiple services in a single server:
925
+ PydanticRPC supports defining and running multiple gRPC services in a single server:
876
926
 
877
927
  ```python
878
928
  from datetime import datetime
@@ -1003,11 +1053,11 @@ Any MCP-compatible client can connect to your service. For example, to configure
1003
1053
  MCP can also be mounted to existing ASGI applications:
1004
1054
 
1005
1055
  ```python
1006
- from pydantic_rpc import ConnecpyASGIApp
1056
+ from pydantic_rpc import ASGIApp
1007
1057
  from pydantic_rpc.mcp import MCPExporter
1008
1058
 
1009
1059
  # Create Connect-RPC ASGI app
1010
- app = ConnecpyASGIApp()
1060
+ app = ASGIApp()
1011
1061
  app.mount(MathService())
1012
1062
 
1013
1063
  # Add MCP support via HTTP/SSE
@@ -1,19 +1,18 @@
1
1
  [project]
2
2
  name = "pydantic-rpc"
3
- version = "0.8.0"
3
+ version = "0.10.0"
4
4
  description = "A Python library for building gRPC/ConnectRPC services with Pydantic models."
5
5
  authors = [
6
6
  { name = "Yasushi Itoh" }
7
7
  ]
8
8
  dependencies = [
9
- "annotated-types>=0.5.0",
9
+ "annotated-types==0.7.0",
10
10
  "pydantic>=2.1.1",
11
11
  "grpcio>=1.56.2",
12
12
  "grpcio-tools>=1.56.2",
13
13
  "grpcio-reflection>=1.56.2",
14
14
  "grpcio-health-checking>=1.56.2",
15
- "sonora>=0.2.3",
16
- "connecpy==2.0.0",
15
+ "connecpy>=2.2.0",
17
16
  "mcp>=1.9.4",
18
17
  "starlette>=0.27.0",
19
18
  ]
@@ -0,0 +1,43 @@
1
+ from importlib.util import find_spec
2
+
3
+ from .core import (
4
+ Server,
5
+ AsyncIOServer,
6
+ WSGIApp,
7
+ ASGIApp,
8
+ Message,
9
+ generate_proto,
10
+ )
11
+ from .decorators import (
12
+ http_option,
13
+ proto_option,
14
+ get_method_options,
15
+ has_http_option,
16
+ )
17
+ from .tls import (
18
+ GrpcTLSConfig,
19
+ extract_peer_identity,
20
+ extract_peer_certificate_chain,
21
+ )
22
+
23
+ __all__ = [
24
+ "Server",
25
+ "AsyncIOServer",
26
+ "WSGIApp",
27
+ "ASGIApp",
28
+ "Message",
29
+ "generate_proto",
30
+ "http_option",
31
+ "proto_option",
32
+ "get_method_options",
33
+ "has_http_option",
34
+ "GrpcTLSConfig",
35
+ "extract_peer_identity",
36
+ "extract_peer_certificate_chain",
37
+ ]
38
+
39
+ # Optional MCP support
40
+ if find_spec("mcp"):
41
+ from .mcp import MCPExporter # noqa: F401
42
+
43
+ __all__.append("MCPExporter")
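
Given these exports, user code can import the public API directly from the package root (a sketch; `MCPExporter` is only re-exported when the `mcp` package is importable):

```python
# Sketch: importing the public API re-exported by pydantic_rpc/__init__.py.
from pydantic_rpc import ASGIApp, AsyncIOServer, GrpcTLSConfig, Message
```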