langchain-mcp-tools 0.1.1__tar.gz → 0.1.3__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,124 @@
+ Metadata-Version: 2.2
+ Name: langchain-mcp-tools
+ Version: 0.1.3
+ Summary: Model Context Protocol (MCP) To LangChain Tools Conversion Utility
+ Project-URL: Bug Tracker, https://github.com/hideya/langchain-mcp-tools-py/issues
+ Project-URL: Source Code, https://github.com/hideya/langchain-mcp-tools-py
+ Keywords: modelcontextprotocol,mcp,mcp-client,langchain,langchain-python,tool-call,tool-calling,python
+ Requires-Python: >=3.11
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ Requires-Dist: jsonschema-pydantic>=0.6
+ Requires-Dist: langchain>=0.3.14
+ Requires-Dist: langchain-anthropic>=0.3.1
+ Requires-Dist: langchain-groq>=0.2.3
+ Requires-Dist: langchain-openai>=0.3.0
+ Requires-Dist: langgraph>=0.2.62
+ Requires-Dist: mcp>=1.2.0
+ Requires-Dist: pyjson5>=1.6.8
+ Requires-Dist: pympler>=1.1
+ Requires-Dist: python-dotenv>=1.0.1
+ Provides-Extra: dev
+ Requires-Dist: pytest>=8.3.4; extra == "dev"
+ Requires-Dist: pytest-asyncio>=0.25.2; extra == "dev"
+
+ # MCP To LangChain Tools Conversion Utility [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/langchain-mcp-tools-py/blob/main/LICENSE) [![pypi version](https://img.shields.io/pypi/v/langchain-mcp-tools.svg)](https://pypi.org/project/langchain-mcp-tools/)
+
+ This package is intended to simplify the use of
+ [Model Context Protocol (MCP)](https://modelcontextprotocol.io/)
+ server tools with LangChain / Python.
+
+ [Model Context Protocol (MCP)](https://modelcontextprotocol.io/),
+ introduced by
+ [Anthropic](https://www.anthropic.com/news/model-context-protocol),
+ extends the capabilities of LLMs by enabling interaction with external tools and resources,
+ such as web search and database access.
+ Thanks to its open-source nature, MCP has gained significant traction in the developer community,
+ with over 400 MCP servers already developed and shared:
+
+ - [Smithery: MCP Server Registry](https://smithery.ai/)
+ - [Glama’s list of Open-Source MCP servers](https://glama.ai/mcp/servers)
+
+ In the MCP framework, external features are encapsulated in an MCP server
+ that runs in a separate process.
+ This clear decoupling allows for easy adoption and reuse of
+ any of the significant collections of MCP servers listed above.
+
+ To make it easy for LangChain to take advantage of such a vast resource base,
+ this package offers quick and seamless access from LangChain to MCP servers.
+
+ It contains a utility function `convert_mcp_to_langchain_tools()`.
+ This async function handles parallel initialization of multiple specified MCP servers
+ and converts their available tools into a list of LangChain-compatible tools.
+
+ A TypeScript equivalent of this utility is available
+ [here](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools).
+
+ ## Requirements
+
+ - Python 3.11+
+
+ ## Installation
+
+ ```bash
+ pip install langchain-mcp-tools
+ ```
+
+ ## Quick Start
+
+ The `convert_mcp_to_langchain_tools()` utility function accepts MCP server configurations
+ that follow the same structure as
+ [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
+ but only the contents of the `mcpServers` property,
+ expressed as a `dict`, e.g.:
+
+ ```python
+ mcp_configs = {
+     'filesystem': {
+         'command': 'npx',
+         'args': ['-y', '@modelcontextprotocol/server-filesystem', '.']
+     },
+     'fetch': {
+         'command': 'uvx',
+         'args': ['mcp-server-fetch']
+     }
+ }
+
+ tools, cleanup = await convert_mcp_to_langchain_tools(
+     mcp_configs
+ )
+ ```
+
+ This utility function initializes all specified MCP servers in parallel
+ and returns LangChain Tools
+ ([`tools: List[BaseTool]`](https://python.langchain.com/api_reference/core/tools/langchain_core.tools.base.BaseTool.html#langchain_core.tools.base.BaseTool))
+ by gathering the available MCP tools from the servers
+ and wrapping them in LangChain tools.
+ It also returns an async callback function (`cleanup: McpServerCleanupFn`)
+ to be invoked to close all MCP server sessions when finished.
+
+ The returned tools can be used with LangChain, e.g.:
+
+ ```python
+ # from langchain.chat_models import init_chat_model
+ llm = init_chat_model(
+     model='claude-3-5-haiku-latest',
+     model_provider='anthropic'
+ )
+
+ # from langgraph.prebuilt import create_react_agent
+ agent = create_react_agent(
+     llm,
+     tools
+ )
+ ```
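The `cleanup` callback returned alongside the tools should be invoked even when the agent run fails, so a `try`/`finally` guard is the natural call pattern. A minimal, runnable sketch of that lifecycle shape, with the LangChain/MCP calls replaced by a hypothetical stub (`convert_stub` is not part of this package):

```python
import asyncio

# Hypothetical stand-in for convert_mcp_to_langchain_tools(); only the
# (tools, cleanup) return shape is taken from the description above.
async def convert_stub(server_configs):
    async def cleanup():
        print('all MCP server sessions closed')
    return ['filesystem_tool', 'fetch_tool'], cleanup

async def main():
    tools, cleanup = await convert_stub({'fetch': {'command': 'uvx'}})
    try:
        # agent = create_react_agent(llm, tools) and the agent invocation
        # would go here
        print(f'{len(tools)} tool(s) ready')
    finally:
        # Guarantee session teardown even if the agent run raises
        await cleanup()

asyncio.run(main())
```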
+
+ A simple usage example that is easy to experiment with can be found
+ [here](https://github.com/hideya/langchain-mcp-tools-py-usage/blob/main/src/example.py)
+
+ A more realistic usage example can be found
+ [here](https://github.com/hideya/mcp-client-langchain-py)
+
+
+ ## Limitations
+
+ Currently, only text results of tool calls are supported.
@@ -1,7 +1,17 @@
  [project]
  name = "langchain-mcp-tools"
- version = "0.1.1"
+ version = "0.1.3"
  description = "Model Context Protocol (MCP) To LangChain Tools Conversion Utility"
+ keywords = [
+     "modelcontextprotocol",
+     "mcp",
+     "mcp-client",
+     "langchain",
+     "langchain-python",
+     "tool-call",
+     "tool-calling",
+     "python",
+ ]
  readme = "README.md"
  requires-python = ">=3.11"
  dependencies = [
@@ -1,5 +1,8 @@
  # Standard library imports
- import asyncio
+ from anyio.streams.memory import (
+     MemoryObjectReceiveStream,
+     MemoryObjectSendStream,
+ )
  import logging
  import os
  import sys
@@ -13,6 +16,7 @@ from typing import (
      NoReturn,
      Tuple,
      Type,
+     TypeAlias,
  )

  # Third-party imports
@@ -21,6 +25,7 @@ try:
      from langchain_core.tools import BaseTool, ToolException
      from mcp import ClientSession, StdioServerParameters
      from mcp.client.stdio import stdio_client
+     import mcp.types as mcp_types
      from pydantic import BaseModel
      from pympler import asizeof
  except ImportError as e:
@@ -29,111 +34,34 @@ except ImportError as e:
      sys.exit(1)


- """
- Resource Management Pattern for Parallel Server Initialization
- --------------------------------------------------------------
- This code implements a specific pattern for managing async resources that
- require context managers while enabling parallel initialization.
- The key aspects are:
-
- 1. Challenge:
-
- A key requirement for parallel initialization is that each server must be
- initialized in its own dedicated task - there's no way around this as far as
- I know. However, this poses a challenge when combined with
- `asynccontextmanager`.
-
- - Resources management for `stdio_client` and `ClientSession` seems
-   to require relying exclusively on `asynccontextmanager` for cleanup,
-   with no manual cleanup options
-   (based on the mcp python-sdk impl as of Jan 14, 2025)
- - Initializing multiple MCP servers in parallel requires a dedicated
-   `asyncio.Task` per server
- - Server cleanup can be initiated later by a task other than the one that
-   initialized the resources, whereas `AsyncExitStack.aclose()` must be
-   called from the same task that created the context
-
- 2. Solution:
-
- The key insight is to keep the initialization tasks alive throughout the
- session lifetime, rather than letting them complete after initialization.
-
- By using `asyncio.Event`s for coordination, we can:
- - Allow parallel initialization while maintaining proper context management
- - Keep each initialization task running until explicit cleanup is requested
- - Ensure cleanup occurs in the same task that created the resources
- - Provide a clean interface for the caller to manage the lifecycle
-
- Alternative Considered:
- A generator/coroutine approach using `finally` block for cleanup was
- considered but rejected because:
- - It turned out that the `finally` block in a generator/coroutine can be
-   executed by a different task than the one that ran the main body of
-   the code
- - This breaks the requirement that `AsyncExitStack.aclose()` must be
-   called from the same task that created the context
-
- 3. Task Lifecycle:
-
- The following task lifecyle diagram illustrates how the above strategy
- was impelemented:
- ```
- [Task starts]
-
- Initialize server & convert tools
-
- Set ready_event (signals tools are ready)
-
- await cleanup_event.wait() (keeps task alive)
-
- When cleanup_event is set:
- exit_stack.aclose() (cleanup in original task)
- ```
- This approach indeed enables parallel initialization while maintaining proper
- async resource lifecycle management through context managers.
- However, I'm afraid I'm twisting things around too much.
- It usually means I'm doing something very worng...
-
- I think it is a natural assumption that MCP SDK is designed with consideration
- for parallel server initialization.
- I'm not sure what I'm missing...
- (FYI, with the TypeScript MCP SDK, parallel initialization was
- pretty straightforward.
- """
-
-
- async def spawn_mcp_server_tools_task(
+ # Type alias for the bidirectional communication channels with the MCP server
+ # FIXME: not defined in mcp.types, really?
+ StdioTransport: TypeAlias = tuple[
+     MemoryObjectReceiveStream[mcp_types.JSONRPCMessage | Exception],
+     MemoryObjectSendStream[mcp_types.JSONRPCMessage]
+ ]
+
+
+ async def spawn_mcp_server_and_get_transport(
      server_name: str,
      server_config: Dict[str, Any],
-     langchain_tools: List[BaseTool],
-     ready_event: asyncio.Event,
-     cleanup_event: asyncio.Event,
+     exit_stack: AsyncExitStack,
      logger: logging.Logger = logging.getLogger(__name__)
- ) -> None:
-     """Convert MCP server tools to LangChain compatible tools
-     and manage lifecycle.
-
-     This task initializes an MCP server connection, converts its tools
-     to LangChain format, and manages the connection lifecycle.
-     It adds the tools to the provided langchain_tools list and uses events
-     for synchronization.
+ ) -> StdioTransport:
+     """
+     Spawns an MCP server process and establishes communication channels.

      Args:
-         server_name: Name of the MCP server
-         server_config: Server configuration dictionary containing command,
-             args, and env
-         langchain_tools: List to which the converted LangChain tools will
-             be appended
-         ready_event: Event to signal when tools are ready for use
-         cleanup_event: Event to trigger cleanup and connection closure
-         logger: Logger instance to use for logging events and errors.
-             Defaults to module logger.
+         server_name: Server instance name to use for better logging
+         server_config: Configuration dictionary for server setup
+         exit_stack: Context manager for cleanup handling
+         logger: Logger instance for debugging and monitoring

      Returns:
-         None
+         A tuple of receive and send streams for server communication

      Raises:
-         Exception: If there's an error in server connection or tool conversion
+         Exception: If server spawning fails
      """
      try:
          logger.info(f'MCP server "{server_name}": initializing with:',
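The rewrite in this hunk drops the per-task `asyncio.Event` choreography described in the removed docstring and instead enters every server's context manager on one shared `AsyncExitStack`, so a single later `aclose()` unwinds them all in reverse order from the task that owns the stack. A stdlib-only sketch of that pattern (the `fake_transport` context manager is a hypothetical stand-in, not this package's code):

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager

@asynccontextmanager
async def fake_transport(name):
    # Hypothetical stand-in for stdio_client() / ClientSession()
    print(f'{name}: opened')
    try:
        yield name
    finally:
        print(f'{name}: closed')

async def main():
    exit_stack = AsyncExitStack()
    transports = []
    for name in ('filesystem', 'fetch'):
        # Each context is entered on the shared stack as it is spawned
        transports.append(
            await exit_stack.enter_async_context(fake_transport(name)))
    print('using:', transports)
    # A single aclose() unwinds every registered context in LIFO order
    await exit_stack.aclose()

asyncio.run(main())
```

The LIFO unwind mirrors nested `async with` blocks, which is why the cleanup order is the reverse of the spawn order.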
@@ -146,29 +74,59 @@ async def spawn_mcp_server_tools_task(
          if 'PATH' not in env:
              env['PATH'] = os.environ.get('PATH', '')

+         # Create server parameters with command, arguments and environment
          server_params = StdioServerParameters(
              command=server_config['command'],
              args=server_config.get('args', []),
              env=env
          )

+         # Initialize stdio client and register it with exit stack for cleanup
+         stdio_transport = await exit_stack.enter_async_context(
+             stdio_client(server_params)
+         )
+     except Exception as e:
+         logger.error(f'Error spawning MCP server: {str(e)}')
+         raise
+
+     return stdio_transport
+
+
+ async def get_mcp_server_tools(
+     server_name: str,
+     stdio_transport: StdioTransport,
+     exit_stack: AsyncExitStack,
+     logger: logging.Logger = logging.getLogger(__name__)
+ ) -> List[BaseTool]:
+     """
+     Retrieves and converts MCP server tools to LangChain format.
+
+     Args:
+         server_name: Server instance name to use for better logging
+         stdio_transport: Communication channels tuple
+         exit_stack: Context manager for cleanup handling
+         logger: Logger instance for debugging and monitoring
+
+     Returns:
+         List of LangChain tools converted from MCP tools
+
+     Raises:
+         Exception: If tool conversion fails
+     """
+     try:
+         read, write = stdio_transport
+
          # Use an intermediate `asynccontextmanager` to log the cleanup message
          @asynccontextmanager
          async def log_before_aexit(context_manager, message):
+             """Helper context manager that logs before cleanup"""
              yield await context_manager.__aenter__()
              try:
                  logger.info(message)
              finally:
                  await context_manager.__aexit__(None, None, None)

-         # Initialize the MCP server
-         exit_stack = AsyncExitStack()
-
-         stdio_transport = await exit_stack.enter_async_context(
-             stdio_client(server_params)
-         )
-         read, write = stdio_transport
-
+         # Initialize client session with cleanup logging
          session = await exit_stack.enter_async_context(
              log_before_aexit(
                  ClientSession(read, write),
@@ -182,11 +140,14 @@ async def spawn_mcp_server_tools_task(
          # Get MCP tools
          tools_response = await session.list_tools()

-         # Wrap MCP tools to into LangChain tools
+         # Wrap MCP tools into LangChain tools
+         langchain_tools: List[BaseTool] = []
          for tool in tools_response.tools:
+             # Define adapter class to convert MCP tool to LangChain format
              class McpToLangChainAdapter(BaseTool):
                  name: str = tool.name or 'NO NAME'
                  description: str = tool.description or ''
+                 # Convert JSON schema to Pydantic model for argument validation
                  args_schema: Type[BaseModel] = jsonschema_to_pydantic(
                      tool.inputSchema
                  )
@@ -197,12 +158,17 @@ async def spawn_mcp_server_tools_task(
                  )

                  async def _arun(self, **kwargs: Any) -> Any:
+                     """
+                     Asynchronously executes the tool with given arguments.
+                     Logs input/output and handles errors.
+                     """
                      logger.info(f'MCP tool "{server_name}"/"{tool.name}"'
                                  f' received input:', kwargs)
                      result = await session.call_tool(self.name, kwargs)
                      if result.isError:
                          raise ToolException(result.content)

+                     # Log result size for monitoring
                      size = asizeof.asizeof(result.content)
                      logger.info(f'MCP tool "{server_name}"/"{tool.name}" '
                                  f'received result (size: {size})')
@@ -210,24 +176,19 @@ async def spawn_mcp_server_tools_task(

          langchain_tools.append(McpToLangChainAdapter())

+         # Log available tools for debugging
          logger.info(f'MCP server "{server_name}": {len(langchain_tools)} '
                      f'tool(s) available:')
          for tool in langchain_tools:
              logger.info(f'- {tool.name}')
      except Exception as e:
-         logger.error(f'Error getting response: {str(e)}')
+         logger.error(f'Error getting MCP tools: {str(e)}')
          raise

-     # Set ready_event; signals tools are ready
-     ready_event.set()
-
-     # Keep this task alive until cleanup is requested
-     await cleanup_event.wait()
-
-     # Cleanup the resources
-     await exit_stack.aclose()
+     return langchain_tools


+ # Type hint for cleanup function
  McpServerCleanupFn = Callable[[], Awaitable[None]]

@@ -264,43 +225,46 @@ async def convert_mcp_to_langchain_tools(
          # Use tools...
          await cleanup()
      """

-     per_server_tools = []
-     ready_event_list = []
-     cleanup_event_list = []
-
-     # Concurrently initialize all the MCP servers
-     tasks = []
+     # Initialize AsyncExitStack for managing multiple server lifecycles
+     stdio_transports: List[StdioTransport] = []
+     async_exit_stack = AsyncExitStack()
+
+     # Spawn all MCP servers concurrently
      for server_name, server_config in server_configs.items():
-         server_tools_accumulator: List[BaseTool] = []
-         per_server_tools.append(server_tools_accumulator)
-         ready_event = asyncio.Event()
-         ready_event_list.append(ready_event)
-         cleanup_event = asyncio.Event()
-         cleanup_event_list.append(cleanup_event)
-         task = asyncio.create_task(spawn_mcp_server_tools_task(
+         # NOTE: the following `await` only blocks until the server subprocess
+         # is spawned, i.e. after returning from the `await`, the spawned
+         # subprocess starts its initialization independently of (so in
+         # parallel with) the Python execution of the following lines.
+         stdio_transport = await spawn_mcp_server_and_get_transport(
              server_name,
              server_config,
-             server_tools_accumulator,
-             ready_event,
-             cleanup_event,
+             async_exit_stack,
              logger
-         ))
-         tasks.append(task)
-
-     # Wait for all tasks to finish filling in the `server_tools_accumulator`
-     await asyncio.gather(*(event.wait() for event in ready_event_list))
-
-     # Flatten the tools list
-     langchain_tools = [
-         item for sublist in per_server_tools for item in sublist
-     ]
+         )
+         stdio_transports.append(stdio_transport)
+
+     # Convert tools from each server to LangChain format
+     langchain_tools: List[BaseTool] = []
+     for (server_name, server_config), stdio_transport in zip(
+         server_configs.items(),
+         stdio_transports,
+         strict=True
+     ):
+         tools = await get_mcp_server_tools(
+             server_name,
+             stdio_transport,
+             async_exit_stack,
+             logger
+         )
+         langchain_tools.extend(tools)

-     # Define a cleanup callback to set cleanup_event and signal that
-     # it is time to clean up the resources
+     # Define a cleanup function to properly shut down all servers
      async def mcp_cleanup() -> None:
-         for event in cleanup_event_list:
-             event.set()
+         """Closes all server connections and cleans up resources"""
+         await async_exit_stack.aclose()

+     # Log summary of initialized tools
      logger.info(f'MCP servers initialized: {len(langchain_tools)} tool(s) '
                  f'available in total')
      for tool in langchain_tools:
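A consequence of this restructuring is that the cleanup callback is simply a closure over the shared `AsyncExitStack`: the converter enters all sessions on the stack and hands the caller a function that unwinds it on demand. A reduced stdlib-only sketch of that shape (`fake_session` and `convert` are illustrative names, not this package's API):

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager

@asynccontextmanager
async def fake_session(name):
    # Hypothetical stand-in for an MCP server session
    yield f'{name}-session'

async def convert(names):
    exit_stack = AsyncExitStack()
    sessions = [await exit_stack.enter_async_context(fake_session(n))
                for n in names]

    async def cleanup():
        # The closure keeps the stack alive; the caller decides when to unwind
        await exit_stack.aclose()

    return sessions, cleanup

async def main():
    sessions, cleanup = await convert(['filesystem', 'fetch'])
    print(sessions)
    await cleanup()

asyncio.run(main())
```

Returning the closure instead of the stack itself keeps the exit-stack type out of the public signature, matching the `McpServerCleanupFn = Callable[[], Awaitable[None]]` alias above.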
@@ -1,181 +0,0 @@
1
- Metadata-Version: 2.2
2
- Name: langchain-mcp-tools
3
- Version: 0.1.1
4
- Summary: Model Context Protocol (MCP) To LangChain Tools Conversion Utility
5
- Project-URL: Bug Tracker, https://github.com/hideya/langchain-mcp-tools-py/issues
6
- Project-URL: Source Code, https://github.com/hideya/langchain-mcp-tools-py
7
- Requires-Python: >=3.11
8
- Description-Content-Type: text/markdown
9
- License-File: LICENSE
10
- Requires-Dist: jsonschema-pydantic>=0.6
11
- Requires-Dist: langchain>=0.3.14
12
- Requires-Dist: langchain-anthropic>=0.3.1
13
- Requires-Dist: langchain-groq>=0.2.3
14
- Requires-Dist: langchain-openai>=0.3.0
15
- Requires-Dist: langgraph>=0.2.62
16
- Requires-Dist: mcp>=1.2.0
17
- Requires-Dist: pyjson5>=1.6.8
18
- Requires-Dist: pympler>=1.1
19
- Requires-Dist: python-dotenv>=1.0.1
20
- Provides-Extra: dev
21
- Requires-Dist: pytest>=8.3.4; extra == "dev"
22
- Requires-Dist: pytest-asyncio>=0.25.2; extra == "dev"
23
-
24
- # MCP To LangChain Tools Conversion Utility [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/langchain-mcp-tools-py/blob/main/LICENSE) [![pypi version](https://img.shields.io/pypi/v/langchain-mcp-tools.svg)](https://pypi.org/project/langchain-mcp-tools/)
25
-
26
- This package is intended to simplify the use of
27
- [Model Context Protocol (MCP)](https://modelcontextprotocol.io/)
28
- server tools with LangChain / Python.
29
-
30
- It contains a utility function `convert_mcp_to_langchain_tools()`.
31
- This async function handles parallel initialization of specified multiple MCP servers
32
- and converts their available tools into a list of LangChain-compatible tools.
33
-
34
- A typescript equivalent of this utility library is available
35
- [here](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools)
36
-
37
- ## Requirements
38
-
39
- - Python 3.11+
40
-
41
- ## Installation
42
-
43
- ```bash
44
- pip install langchain-mcp-tools
45
- ```
46
-
47
- ## Quick Start
48
-
49
- `convert_mcp_to_langchain_tools()` utility function accepts MCP server configurations
50
- that follow the same structure as
51
- [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
52
- but only the contents of the `mcpServers` property,
53
- and is expressed as a `dict`, e.g.:
54
-
55
- ```python
56
- mcp_configs = {
57
- 'filesystem': {
58
- 'command': 'npx',
59
- 'args': ['-y', '@modelcontextprotocol/server-filesystem', '.']
60
- },
61
- 'fetch': {
62
- 'command': 'uvx',
63
- 'args': ['mcp-server-fetch']
64
- }
65
- }
66
-
67
- tools, cleanup = await convert_mcp_to_langchain_tools(
68
- mcp_configs
69
- )
70
- ```
71
-
72
- This utility function initializes all specified MCP servers in parallel,
73
- and returns LangChain Tools
74
- ([`tools: List[BaseTool]`](https://python.langchain.com/api_reference/core/tools/langchain_core.tools.base.BaseTool.html#langchain_core.tools.base.BaseTool))
75
- by gathering available MCP tools from the servers,
76
- and by wrapping them into LangChain tools.
77
- It also returns an async callback function (`cleanup: McpServerCleanupFn`)
78
- to be invoked to close all MCP server sessions when finished.
79
-
80
- The returned tools can be used with LangChain, e.g.:
81
-
82
- ```python
83
- # from langchain.chat_models import init_chat_model
84
- llm = init_chat_model(
85
- model='claude-3-5-haiku-latest',
86
- model_provider='anthropic'
87
- )
88
-
89
- # from langgraph.prebuilt import create_react_agent
90
- agent = create_react_agent(
91
- llm,
92
- tools
93
- )
94
- ```
95
- A simple usage example, handy for experimentation, can be found
96
- [here](https://github.com/hideya/langchain-mcp-tools-py-usage/blob/main/src/example.py)
97
-
98
- A more realistic usage example can be found
99
- [here](https://github.com/hideya/mcp-client-langchain-py)
100
-
101
-
102
- ## Limitations
103
-
104
- Currently, only text results of tool calls are supported.
105
-
106
- ## Technical Details
107
-
108
- It was very tricky (for me) to get the parallel MCP server initialization
109
- to work, including successful final resource cleanup...
110
-
111
- I'm new to Python, so it is very possible that my ignorance is playing
112
- a big role here...
113
- I'll summarize the difficulties I faced below.
114
- The source code is available
115
- [here](https://github.com/hideya/langchain-mcp-tools-py/blob/main/src/langchain_mcp_tools/langchain_mcp_tools.py).
116
- Any comments pointing out something I am missing would be greatly appreciated!
117
- [(comment here)](https://github.com/hideya/langchain-mcp-tools-ts/issues)
118
-
119
- 1. Challenge:
120
-
121
- A key requirement for parallel initialization is that each server must be
122
- initialized in its own dedicated task - there's no way around this as far as
123
- I know. However, this poses a challenge when combined with
124
- `asynccontextmanager`.
125
-
126
- Resource management for `stdio_client` and `ClientSession` seems
127
- to require relying exclusively on `asynccontextmanager` for cleanup,
128
- with no manual cleanup options
129
- (based on [the mcp python-sdk impl as of Jan 14, 2025](https://github.com/modelcontextprotocol/python-sdk/tree/99727a9/src/mcp/client))
130
- - Initializing multiple MCP servers in parallel requires a dedicated
131
- `asyncio.Task` per server
132
- - Server cleanup can be initiated later by a task other than the one
133
- that initialized the resources, whereas `AsyncExitStack.aclose()` must be
134
- called from the same task that created the context
135
-
136
- 2. Solution:
137
-
138
- The key insight is to keep the initialization tasks alive throughout the
139
- session lifetime, rather than letting them complete after initialization.
140
-
141
- By using `asyncio.Event`s for coordination, we can:
142
- - Allow parallel initialization while maintaining proper context management
143
- - Keep each initialization task running until explicit cleanup is requested
144
- - Ensure cleanup occurs in the same task that created the resources
145
- - Provide a clean interface for the caller to manage the lifecycle
146
-
147
- Alternative Considered:
148
- A generator/coroutine approach using `finally` block for cleanup was
149
- considered but rejected because:
150
- - It turned out that the `finally` block in a generator/coroutine can be
151
- executed by a different task than the one that ran the main body of
152
- the code
153
- - This breaks the requirement that `AsyncExitStack.aclose()` must be
154
- called from the same task that created the context
155
-
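This behavior is easy to reproduce with plain `asyncio`, independent of the MCP SDK. The following toy example (all names here are illustrative, not from the library) drives an async generator's body in one task and closes it from another, then records which task actually executed the `finally` block:

```python
import asyncio

finally_task_name = None  # records which task ran the cleanup

async def resource_gen():
    # Generator-based lifecycle: set up, yield, clean up in `finally`.
    global finally_task_name
    try:
        yield "session"
    finally:
        # Record the task that actually executes the cleanup.
        finally_task_name = asyncio.current_task().get_name()

async def main():
    gen = resource_gen()

    async def init():
        await gen.asend(None)   # run the generator body up to the yield

    async def close():
        await gen.aclose()      # throws GeneratorExit; runs the `finally`

    await asyncio.create_task(init(), name="init-task")
    await asyncio.create_task(close(), name="cleanup-task")
    return finally_task_name

ran_in = asyncio.run(main())
print(ran_in)  # cleanup-task
```

The `finally` block runs in `cleanup-task`, not in `init-task` which ran the body, which is exactly what breaks the same-task requirement of `AsyncExitStack.aclose()`.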
156
- 3. Task Lifecycle:
157
-
158
- The following task lifecycle diagram illustrates how the above strategy
159
- was implemented:
160
- ```
161
- [Task starts]
162
-
163
- Initialize server & convert tools
164
-
165
- Set ready_event (signals tools are ready)
166
-
167
- await cleanup_event.wait() (keeps task alive)
168
-
169
- When cleanup_event is set:
170
- exit_stack.aclose() (cleanup in original task)
171
- ```
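The lifecycle above can be sketched with plain `asyncio` primitives. This is an illustrative toy, not the library's actual implementation: `FakeSession` is a hypothetical stand-in for the real `stdio_client`/`ClientSession` pair, and every context entered on the exit stack is also closed in the same task that created it.

```python
import asyncio
from contextlib import AsyncExitStack

class FakeSession:
    """Hypothetical stand-in for an MCP server session."""
    def __init__(self, name):
        self.name = name
        self.closed = False
    async def __aenter__(self):
        return self
    async def __aexit__(self, *exc):
        self.closed = True

async def server_task(name, sessions, ready_event, cleanup_event):
    # Everything on this exit stack is created AND closed in this one task.
    async with AsyncExitStack() as exit_stack:
        sessions[name] = await exit_stack.enter_async_context(FakeSession(name))
        ready_event.set()            # signal: initialization finished
        await cleanup_event.wait()   # keep the task alive until cleanup
    # leaving the `async with` performs cleanup in the original task

async def convert_all(names):
    sessions, tasks, ready, cleanup = {}, [], [], []
    for name in names:
        r, c = asyncio.Event(), asyncio.Event()
        ready.append(r)
        cleanup.append(c)
        tasks.append(asyncio.create_task(server_task(name, sessions, r, c)))
    await asyncio.gather(*(r.wait() for r in ready))  # parallel init done

    async def cleanup_fn():          # caller-facing cleanup callback
        for c in cleanup:
            c.set()
        await asyncio.gather(*tasks)

    return sessions, cleanup_fn

async def demo():
    sessions, cleanup_fn = await convert_all(["filesystem", "fetch"])
    opened = sorted(sessions)
    await cleanup_fn()
    return opened, all(s.closed for s in sessions.values())

opened, closed = asyncio.run(demo())
print(opened, closed)  # ['fetch', 'filesystem'] True
```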
172
- This approach indeed enables parallel initialization while maintaining proper
173
- async resource lifecycle management through context managers.
174
- However, I'm afraid I'm twisting things around too much.
175
- It usually means I'm doing something very wrong...
176
-
177
- I think it is natural to assume that the MCP SDK is designed with
178
- parallel server initialization in mind.
179
- I'm not sure what I'm missing...
180
- (FYI, with the TypeScript MCP SDK, parallel initialization was
181
- [pretty straightforward](https://github.com/hideya/langchain-mcp-tools-ts/blob/main/src/langchain-mcp-tools.ts))
@@ -1,158 +0,0 @@
1
- # MCP To LangChain Tools Conversion Utility [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/langchain-mcp-tools-py/blob/main/LICENSE) [![pypi version](https://img.shields.io/pypi/v/langchain-mcp-tools.svg)](https://pypi.org/project/langchain-mcp-tools/)
2
-
3
- This package is intended to simplify the use of
4
- [Model Context Protocol (MCP)](https://modelcontextprotocol.io/)
5
- server tools with LangChain / Python.
6
-
7
- It contains a utility function `convert_mcp_to_langchain_tools()`.
8
- This async function handles parallel initialization of the specified MCP servers
9
- and converts their available tools into a list of LangChain-compatible tools.
10
-
11
- A TypeScript equivalent of this utility library is available
12
- [here](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools)
13
-
14
- ## Requirements
15
-
16
- - Python 3.11+
17
-
18
- ## Installation
19
-
20
- ```bash
21
- pip install langchain-mcp-tools
22
- ```
23
-
24
- ## Quick Start
25
-
26
- The `convert_mcp_to_langchain_tools()` utility function accepts MCP server configurations
27
- that follow the same structure as
28
- [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
29
- but containing only the contents of the `mcpServers` property,
30
- expressed as a `dict`, e.g.:
31
-
32
- ```python
33
- mcp_configs = {
34
- 'filesystem': {
35
- 'command': 'npx',
36
- 'args': ['-y', '@modelcontextprotocol/server-filesystem', '.']
37
- },
38
- 'fetch': {
39
- 'command': 'uvx',
40
- 'args': ['mcp-server-fetch']
41
- }
42
- }
43
-
44
- tools, cleanup = await convert_mcp_to_langchain_tools(
45
- mcp_configs
46
- )
47
- ```
48
-
49
- This utility function initializes all specified MCP servers in parallel,
50
- and returns LangChain Tools
51
- ([`tools: List[BaseTool]`](https://python.langchain.com/api_reference/core/tools/langchain_core.tools.base.BaseTool.html#langchain_core.tools.base.BaseTool))
52
- by gathering the available MCP tools from the servers
53
- and wrapping them into LangChain tools.
54
- It also returns an async callback function (`cleanup: McpServerCleanupFn`)
55
- to be invoked to close all MCP server sessions when finished.
56
-
57
- The returned tools can be used with LangChain, e.g.:
58
-
59
- ```python
60
- # from langchain.chat_models import init_chat_model
61
- llm = init_chat_model(
62
- model='claude-3-5-haiku-latest',
63
- model_provider='anthropic'
64
- )
65
-
66
- # from langgraph.prebuilt import create_react_agent
67
- agent = create_react_agent(
68
- llm,
69
- tools
70
- )
71
- ```
72
- A simple usage example, handy for experimentation, can be found
73
- [here](https://github.com/hideya/langchain-mcp-tools-py-usage/blob/main/src/example.py)
74
-
75
- A more realistic usage example can be found
76
- [here](https://github.com/hideya/mcp-client-langchain-py)
77
-
78
-
79
- ## Limitations
80
-
81
- Currently, only text results of tool calls are supported.
82
-
83
- ## Technical Details
84
-
85
- It was very tricky (for me) to get the parallel MCP server initialization
86
- to work, including successful final resource cleanup...
87
-
88
- I'm new to Python, so it is very possible that my ignorance is playing
89
- a big role here...
90
- I'll summarize the difficulties I faced below.
91
- The source code is available
92
- [here](https://github.com/hideya/langchain-mcp-tools-py/blob/main/src/langchain_mcp_tools/langchain_mcp_tools.py).
93
- Any comments pointing out something I am missing would be greatly appreciated!
94
- [(comment here)](https://github.com/hideya/langchain-mcp-tools-ts/issues)
95
-
96
- 1. Challenge:
97
-
98
- A key requirement for parallel initialization is that each server must be
99
- initialized in its own dedicated task - there's no way around this as far as
100
- I know. However, this poses a challenge when combined with
101
- `asynccontextmanager`.
102
-
103
- Resource management for `stdio_client` and `ClientSession` seems
104
- to require relying exclusively on `asynccontextmanager` for cleanup,
105
- with no manual cleanup options
106
- (based on [the mcp python-sdk impl as of Jan 14, 2025](https://github.com/modelcontextprotocol/python-sdk/tree/99727a9/src/mcp/client))
107
- - Initializing multiple MCP servers in parallel requires a dedicated
108
- `asyncio.Task` per server
109
- - Server cleanup can be initiated later by a task other than the one
110
- that initialized the resources, whereas `AsyncExitStack.aclose()` must be
111
- called from the same task that created the context
112
-
113
- 2. Solution:
114
-
115
- The key insight is to keep the initialization tasks alive throughout the
116
- session lifetime, rather than letting them complete after initialization.
117
-
118
- By using `asyncio.Event`s for coordination, we can:
119
- - Allow parallel initialization while maintaining proper context management
120
- - Keep each initialization task running until explicit cleanup is requested
121
- - Ensure cleanup occurs in the same task that created the resources
122
- - Provide a clean interface for the caller to manage the lifecycle
123
-
124
- Alternative Considered:
125
- A generator/coroutine approach using `finally` block for cleanup was
126
- considered but rejected because:
127
- - It turned out that the `finally` block in a generator/coroutine can be
128
- executed by a different task than the one that ran the main body of
129
- the code
130
- - This breaks the requirement that `AsyncExitStack.aclose()` must be
131
- called from the same task that created the context
132
-
133
- 3. Task Lifecycle:
134
-
135
- The following task lifecycle diagram illustrates how the above strategy
136
- was implemented:
137
- ```
138
- [Task starts]
139
-
140
- Initialize server & convert tools
141
-
142
- Set ready_event (signals tools are ready)
143
-
144
- await cleanup_event.wait() (keeps task alive)
145
-
146
- When cleanup_event is set:
147
- exit_stack.aclose() (cleanup in original task)
148
- ```
149
- This approach indeed enables parallel initialization while maintaining proper
150
- async resource lifecycle management through context managers.
151
- However, I'm afraid I'm twisting things around too much.
152
- It usually means I'm doing something very wrong...
153
-
154
- I think it is natural to assume that the MCP SDK is designed with
155
- parallel server initialization in mind.
156
- I'm not sure what I'm missing...
157
- (FYI, with the TypeScript MCP SDK, parallel initialization was
158
- [pretty straightforward](https://github.com/hideya/langchain-mcp-tools-ts/blob/main/src/langchain-mcp-tools.ts))
@@ -1,181 +0,0 @@
1
- Metadata-Version: 2.2
2
- Name: langchain-mcp-tools
3
- Version: 0.1.1
4
- Summary: Model Context Protocol (MCP) To LangChain Tools Conversion Utility
5
- Project-URL: Bug Tracker, https://github.com/hideya/langchain-mcp-tools-py/issues
6
- Project-URL: Source Code, https://github.com/hideya/langchain-mcp-tools-py
7
- Requires-Python: >=3.11
8
- Description-Content-Type: text/markdown
9
- License-File: LICENSE
10
- Requires-Dist: jsonschema-pydantic>=0.6
11
- Requires-Dist: langchain>=0.3.14
12
- Requires-Dist: langchain-anthropic>=0.3.1
13
- Requires-Dist: langchain-groq>=0.2.3
14
- Requires-Dist: langchain-openai>=0.3.0
15
- Requires-Dist: langgraph>=0.2.62
16
- Requires-Dist: mcp>=1.2.0
17
- Requires-Dist: pyjson5>=1.6.8
18
- Requires-Dist: pympler>=1.1
19
- Requires-Dist: python-dotenv>=1.0.1
20
- Provides-Extra: dev
21
- Requires-Dist: pytest>=8.3.4; extra == "dev"
22
- Requires-Dist: pytest-asyncio>=0.25.2; extra == "dev"
23
-
24
- # MCP To LangChain Tools Conversion Utility [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/langchain-mcp-tools-py/blob/main/LICENSE) [![pypi version](https://img.shields.io/pypi/v/langchain-mcp-tools.svg)](https://pypi.org/project/langchain-mcp-tools/)
25
-
26
- This package is intended to simplify the use of
27
- [Model Context Protocol (MCP)](https://modelcontextprotocol.io/)
28
- server tools with LangChain / Python.
29
-
30
- It contains a utility function `convert_mcp_to_langchain_tools()`.
31
- This async function handles parallel initialization of the specified MCP servers
32
- and converts their available tools into a list of LangChain-compatible tools.
33
-
34
- A TypeScript equivalent of this utility library is available
35
- [here](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools)
36
-
37
- ## Requirements
38
-
39
- - Python 3.11+
40
-
41
- ## Installation
42
-
43
- ```bash
44
- pip install langchain-mcp-tools
45
- ```
46
-
47
- ## Quick Start
48
-
49
- The `convert_mcp_to_langchain_tools()` utility function accepts MCP server configurations
50
- that follow the same structure as
51
- [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
52
- but containing only the contents of the `mcpServers` property,
53
- expressed as a `dict`, e.g.:
54
-
55
- ```python
56
- mcp_configs = {
57
- 'filesystem': {
58
- 'command': 'npx',
59
- 'args': ['-y', '@modelcontextprotocol/server-filesystem', '.']
60
- },
61
- 'fetch': {
62
- 'command': 'uvx',
63
- 'args': ['mcp-server-fetch']
64
- }
65
- }
66
-
67
- tools, cleanup = await convert_mcp_to_langchain_tools(
68
- mcp_configs
69
- )
70
- ```
71
-
72
- This utility function initializes all specified MCP servers in parallel,
73
- and returns LangChain Tools
74
- ([`tools: List[BaseTool]`](https://python.langchain.com/api_reference/core/tools/langchain_core.tools.base.BaseTool.html#langchain_core.tools.base.BaseTool))
75
- by gathering the available MCP tools from the servers
76
- and wrapping them into LangChain tools.
77
- It also returns an async callback function (`cleanup: McpServerCleanupFn`)
78
- to be invoked to close all MCP server sessions when finished.
79
-
80
- The returned tools can be used with LangChain, e.g.:
81
-
82
- ```python
83
- # from langchain.chat_models import init_chat_model
84
- llm = init_chat_model(
85
- model='claude-3-5-haiku-latest',
86
- model_provider='anthropic'
87
- )
88
-
89
- # from langgraph.prebuilt import create_react_agent
90
- agent = create_react_agent(
91
- llm,
92
- tools
93
- )
94
- ```
95
- A simple usage example, handy for experimentation, can be found
96
- [here](https://github.com/hideya/langchain-mcp-tools-py-usage/blob/main/src/example.py)
97
-
98
- A more realistic usage example can be found
99
- [here](https://github.com/hideya/mcp-client-langchain-py)
100
-
101
-
102
- ## Limitations
103
-
104
- Currently, only text results of tool calls are supported.
105
-
106
- ## Technical Details
107
-
108
- It was very tricky (for me) to get the parallel MCP server initialization
109
- to work, including successful final resource cleanup...
110
-
111
- I'm new to Python, so it is very possible that my ignorance is playing
112
- a big role here...
113
- I'll summarize the difficulties I faced below.
114
- The source code is available
115
- [here](https://github.com/hideya/langchain-mcp-tools-py/blob/main/src/langchain_mcp_tools/langchain_mcp_tools.py).
116
- Any comments pointing out something I am missing would be greatly appreciated!
117
- [(comment here)](https://github.com/hideya/langchain-mcp-tools-ts/issues)
118
-
119
- 1. Challenge:
120
-
121
- A key requirement for parallel initialization is that each server must be
122
- initialized in its own dedicated task - there's no way around this as far as
123
- I know. However, this poses a challenge when combined with
124
- `asynccontextmanager`.
125
-
126
- Resource management for `stdio_client` and `ClientSession` seems
127
- to require relying exclusively on `asynccontextmanager` for cleanup,
128
- with no manual cleanup options
129
- (based on [the mcp python-sdk impl as of Jan 14, 2025](https://github.com/modelcontextprotocol/python-sdk/tree/99727a9/src/mcp/client))
130
- - Initializing multiple MCP servers in parallel requires a dedicated
131
- `asyncio.Task` per server
132
- - Server cleanup can be initiated later by a task other than the one
133
- that initialized the resources, whereas `AsyncExitStack.aclose()` must be
134
- called from the same task that created the context
135
-
136
- 2. Solution:
137
-
138
- The key insight is to keep the initialization tasks alive throughout the
139
- session lifetime, rather than letting them complete after initialization.
140
-
141
- By using `asyncio.Event`s for coordination, we can:
142
- - Allow parallel initialization while maintaining proper context management
143
- - Keep each initialization task running until explicit cleanup is requested
144
- - Ensure cleanup occurs in the same task that created the resources
145
- - Provide a clean interface for the caller to manage the lifecycle
146
-
147
- Alternative Considered:
148
- A generator/coroutine approach using `finally` block for cleanup was
149
- considered but rejected because:
150
- - It turned out that the `finally` block in a generator/coroutine can be
151
- executed by a different task than the one that ran the main body of
152
- the code
153
- - This breaks the requirement that `AsyncExitStack.aclose()` must be
154
- called from the same task that created the context
155
-
156
- 3. Task Lifecycle:
157
-
158
- The following task lifecycle diagram illustrates how the above strategy
159
- was implemented:
160
- ```
161
- [Task starts]
162
-
163
- Initialize server & convert tools
164
-
165
- Set ready_event (signals tools are ready)
166
-
167
- await cleanup_event.wait() (keeps task alive)
168
-
169
- When cleanup_event is set:
170
- exit_stack.aclose() (cleanup in original task)
171
- ```
172
- This approach indeed enables parallel initialization while maintaining proper
173
- async resource lifecycle management through context managers.
174
- However, I'm afraid I'm twisting things around too much.
175
- It usually means I'm doing something very wrong...
176
-
177
- I think it is natural to assume that the MCP SDK is designed with
178
- parallel server initialization in mind.
179
- I'm not sure what I'm missing...
180
- (FYI, with the TypeScript MCP SDK, parallel initialization was
181
- [pretty straightforward](https://github.com/hideya/langchain-mcp-tools-ts/blob/main/src/langchain-mcp-tools.ts))