langchain-mcp-tools 0.1.0__tar.gz → 0.1.2__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,105 @@
1
+ Metadata-Version: 2.2
2
+ Name: langchain-mcp-tools
3
+ Version: 0.1.2
4
+ Summary: Model Context Protocol (MCP) To LangChain Tools Conversion Utility
5
+ Project-URL: Bug Tracker, https://github.com/hideya/langchain-mcp-tools-py/issues
6
+ Project-URL: Source Code, https://github.com/hideya/langchain-mcp-tools-py
7
+ Keywords: modelcontextprotocol,mcp,mcp-client,langchain,langchain-python,tool-call,tool-calling,python
8
+ Requires-Python: >=3.11
9
+ Description-Content-Type: text/markdown
10
+ License-File: LICENSE
11
+ Requires-Dist: jsonschema-pydantic>=0.6
12
+ Requires-Dist: langchain>=0.3.14
13
+ Requires-Dist: langchain-anthropic>=0.3.1
14
+ Requires-Dist: langchain-groq>=0.2.3
15
+ Requires-Dist: langchain-openai>=0.3.0
16
+ Requires-Dist: langgraph>=0.2.62
17
+ Requires-Dist: mcp>=1.2.0
18
+ Requires-Dist: pyjson5>=1.6.8
19
+ Requires-Dist: pympler>=1.1
20
+ Requires-Dist: python-dotenv>=1.0.1
21
+ Provides-Extra: dev
22
+ Requires-Dist: pytest>=8.3.4; extra == "dev"
23
+ Requires-Dist: pytest-asyncio>=0.25.2; extra == "dev"
24
+
25
+ # MCP To LangChain Tools Conversion Utility [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/langchain-mcp-tools-py/blob/main/LICENSE) [![pypi version](https://img.shields.io/pypi/v/langchain-mcp-tools.svg)](https://pypi.org/project/langchain-mcp-tools/)
26
+
27
+ This package is intended to simplify the use of
28
+ [Model Context Protocol (MCP)](https://modelcontextprotocol.io/)
29
+ server tools with LangChain / Python.
30
+
31
+ It contains a utility function `convert_mcp_to_langchain_tools()`.
32
+ This async function handles parallel initialization of multiple specified MCP servers
33
+ and converts their available tools into a list of LangChain-compatible tools.
34
+
35
+ A TypeScript equivalent of this utility library is available
36
+ [here](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools).
37
+
38
+ ## Requirements
39
+
40
+ - Python 3.11+
41
+
42
+ ## Installation
43
+
44
+ ```bash
45
+ pip install langchain-mcp-tools
46
+ ```
47
+
48
+ ## Quick Start
49
+
50
+ The `convert_mcp_to_langchain_tools()` utility function accepts MCP server configurations
51
+ that follow the same structure as
52
+ [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
53
+ but containing only the contents of the `mcpServers` property,
54
+ expressed as a `dict`, e.g.:
55
+
56
+ ```python
57
+ mcp_configs = {
58
+ 'filesystem': {
59
+ 'command': 'npx',
60
+ 'args': ['-y', '@modelcontextprotocol/server-filesystem', '.']
61
+ },
62
+ 'fetch': {
63
+ 'command': 'uvx',
64
+ 'args': ['mcp-server-fetch']
65
+ }
66
+ }
67
+
68
+ tools, cleanup = await convert_mcp_to_langchain_tools(
69
+ mcp_configs
70
+ )
71
+ ```
72
+
73
+ This utility function initializes all specified MCP servers in parallel,
74
+ and returns LangChain Tools
75
+ ([`tools: List[BaseTool]`](https://python.langchain.com/api_reference/core/tools/langchain_core.tools.base.BaseTool.html#langchain_core.tools.base.BaseTool))
76
+ by gathering available MCP tools from the servers,
77
+ and wrapping them into LangChain tools.
78
+ It also returns an async callback function (`cleanup: McpServerCleanupFn`)
79
+ to be invoked to close all MCP server sessions when finished.
80
+
81
+ The returned tools can be used with LangChain, e.g.:
82
+
83
+ ```python
84
+ # from langchain.chat_models import init_chat_model
85
+ llm = init_chat_model(
86
+ model='claude-3-5-haiku-latest',
87
+ model_provider='anthropic'
88
+ )
89
+
90
+ # from langgraph.prebuilt import create_react_agent
91
+ agent = create_react_agent(
92
+ llm,
93
+ tools
94
+ )
95
+ ```
96
+ A simple usage example, easy to experiment with, can be found
97
+ [here](https://github.com/hideya/langchain-mcp-tools-py-usage/blob/main/src/example.py).
98
+
99
+ A more realistic usage example can be found
100
+ [here](https://github.com/hideya/mcp-client-langchain-py).
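
Since `cleanup` should run even when agent execution fails, a `try`/`finally` wrapper is the safest shape for the lifecycle described above. In the sketch below, `convert_stub` is a hypothetical stand-in for `convert_mcp_to_langchain_tools()` so the snippet runs without spawning any MCP servers; the real function is awaited the same way and returns the same `(tools, cleanup)` pair.

```python
import asyncio

# Models the state of the spawned MCP server sessions.
sessions_open = True

# Hypothetical stand-in for `convert_mcp_to_langchain_tools()`;
# it returns a (tools, cleanup) pair shaped like the real API.
async def convert_stub(server_configs):
    async def cleanup():
        global sessions_open
        sessions_open = False
    return ['fake_tool'], cleanup

async def main():
    tools, cleanup = await convert_stub({'fetch': {'command': 'uvx'}})
    try:
        pass  # hand `tools` to a LangChain / LangGraph agent here
    finally:
        await cleanup()  # sessions are closed even if tool use raises

asyncio.run(main())
print(sessions_open)  # -> False
```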
101
+
102
+
103
+ ## Limitations
104
+
105
+ Currently, only text results of tool calls are supported.
@@ -0,0 +1,81 @@
1
+ # MCP To LangChain Tools Conversion Utility [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/langchain-mcp-tools-py/blob/main/LICENSE) [![pypi version](https://img.shields.io/pypi/v/langchain-mcp-tools.svg)](https://pypi.org/project/langchain-mcp-tools/)
2
+
3
+ This package is intended to simplify the use of
4
+ [Model Context Protocol (MCP)](https://modelcontextprotocol.io/)
5
+ server tools with LangChain / Python.
6
+
7
+ It contains a utility function `convert_mcp_to_langchain_tools()`.
8
+ This async function handles parallel initialization of multiple specified MCP servers
9
+ and converts their available tools into a list of LangChain-compatible tools.
10
+
11
+ A TypeScript equivalent of this utility library is available
12
+ [here](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools).
13
+
14
+ ## Requirements
15
+
16
+ - Python 3.11+
17
+
18
+ ## Installation
19
+
20
+ ```bash
21
+ pip install langchain-mcp-tools
22
+ ```
23
+
24
+ ## Quick Start
25
+
26
+ The `convert_mcp_to_langchain_tools()` utility function accepts MCP server configurations
27
+ that follow the same structure as
28
+ [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
29
+ but containing only the contents of the `mcpServers` property,
30
+ expressed as a `dict`, e.g.:
31
+
32
+ ```python
33
+ mcp_configs = {
34
+ 'filesystem': {
35
+ 'command': 'npx',
36
+ 'args': ['-y', '@modelcontextprotocol/server-filesystem', '.']
37
+ },
38
+ 'fetch': {
39
+ 'command': 'uvx',
40
+ 'args': ['mcp-server-fetch']
41
+ }
42
+ }
43
+
44
+ tools, cleanup = await convert_mcp_to_langchain_tools(
45
+ mcp_configs
46
+ )
47
+ ```
48
+
49
+ This utility function initializes all specified MCP servers in parallel,
50
+ and returns LangChain Tools
51
+ ([`tools: List[BaseTool]`](https://python.langchain.com/api_reference/core/tools/langchain_core.tools.base.BaseTool.html#langchain_core.tools.base.BaseTool))
52
+ by gathering available MCP tools from the servers,
53
+ and wrapping them into LangChain tools.
54
+ It also returns an async callback function (`cleanup: McpServerCleanupFn`)
55
+ to be invoked to close all MCP server sessions when finished.
56
+
57
+ The returned tools can be used with LangChain, e.g.:
58
+
59
+ ```python
60
+ # from langchain.chat_models import init_chat_model
61
+ llm = init_chat_model(
62
+ model='claude-3-5-haiku-latest',
63
+ model_provider='anthropic'
64
+ )
65
+
66
+ # from langgraph.prebuilt import create_react_agent
67
+ agent = create_react_agent(
68
+ llm,
69
+ tools
70
+ )
71
+ ```
72
+ A simple usage example, easy to experiment with, can be found
73
+ [here](https://github.com/hideya/langchain-mcp-tools-py-usage/blob/main/src/example.py).
74
+
75
+ A more realistic usage example can be found
76
+ [here](https://github.com/hideya/mcp-client-langchain-py).
77
+
78
+
79
+ ## Limitations
80
+
81
+ Currently, only text results of tool calls are supported.
@@ -1,7 +1,17 @@
1
1
  [project]
2
2
  name = "langchain-mcp-tools"
3
- version = "0.1.0"
3
+ version = "0.1.2"
4
4
  description = "Model Context Protocol (MCP) To LangChain Tools Conversion Utility"
5
+ keywords = [
6
+ "modelcontextprotocol",
7
+ "mcp",
8
+ "mcp-client",
9
+ "langchain",
10
+ "langchain-python",
11
+ "tool-call",
12
+ "tool-calling",
13
+ "python",
14
+ ]
5
15
  readme = "README.md"
6
16
  requires-python = ">=3.11"
7
17
  dependencies = [
@@ -1,5 +1,8 @@
1
1
  # Standard library imports
2
- import asyncio
2
+ from anyio.streams.memory import (
3
+ MemoryObjectReceiveStream,
4
+ MemoryObjectSendStream,
5
+ )
3
6
  import logging
4
7
  import os
5
8
  import sys
@@ -13,6 +16,7 @@ from typing import (
13
16
  NoReturn,
14
17
  Tuple,
15
18
  Type,
19
+ TypeAlias,
16
20
  )
17
21
 
18
22
  # Third-party imports
@@ -21,6 +25,7 @@ try:
21
25
  from langchain_core.tools import BaseTool, ToolException
22
26
  from mcp import ClientSession, StdioServerParameters
23
27
  from mcp.client.stdio import stdio_client
28
+ import mcp.types as mcp_types
24
29
  from pydantic import BaseModel
25
30
  from pympler import asizeof
26
31
  except ImportError as e:
@@ -29,110 +34,34 @@ except ImportError as e:
29
34
  sys.exit(1)
30
35
 
31
36
 
32
- """
33
- Resource Management Pattern for Parallel Server Initialization
34
- --------------------------------------------------------------
35
- This code implements a specific pattern for managing async resources that
36
- require context managers while enabling parallel initialization.
37
- The key aspects are:
38
-
39
- 1. Challenge:
40
-
41
- A key requirement for parallel initialization is that each server must be
42
- initialized in its own dedicated task - there's no way around this as far as
43
- I know. However, this poses a challenge when combined with
44
- `asynccontextmanager`.
45
-
46
- Resource management for `stdio_client` and `ClientSession` seems
47
- to require relying exclusively on `asynccontextmanager` for cleanup,
48
- with no manual cleanup options
49
- (based on the mcp python-sdk impl as of Jan 14, 2025)
50
- - Initializing multiple MCP servers in parallel requires a dedicated
51
- `asyncio.Task` per server
52
- - Server cleanup can be initiated later by a task other than the one that
53
- initialized the resources
54
-
55
- 2. Solution:
56
-
57
- The key insight is to keep the initialization tasks alive throughout the
58
- session lifetime, rather than letting them complete after initialization.
59
-
60
- By using `asyncio.Event`s for coordination, we can:
61
- - Allow parallel initialization while maintaining proper context management
62
- - Keep each initialization task running until explicit cleanup is requested
63
- - Ensure cleanup occurs in the same task that created the resources
64
- - Provide a clean interface for the caller to manage the lifecycle
65
-
66
- Alternative Considered:
67
- A generator/coroutine approach using `finally` block for cleanup was
68
- considered but rejected because:
69
- - It turned out that the `finally` block in a generator/coroutine can be
70
- executed by a different task than the one that ran the main body of
71
- the code
72
- - This breaks the requirement that `AsyncExitStack.aclose()` must be
73
- called from the same task that created the context
74
-
75
- 3. Task Lifecycle:
76
-
77
- The following task lifecycle diagram illustrates how the above strategy
78
- was implemented:
79
- ```
80
- [Task starts]
81
-
82
- Initialize server & convert tools
83
-
84
- Set ready_event (signals tools are ready)
85
-
86
- await cleanup_event.wait() (keeps task alive)
87
-
88
- When cleanup_event is set:
89
- exit_stack.aclose() (cleanup in original task)
90
- ```
91
- This approach indeed enables parallel initialization while maintaining proper
92
- async resource lifecycle management through context managers.
93
- However, I'm afraid I'm twisting things around too much.
94
- It usually means I'm doing something very wrong...
95
-
96
- I think it is a natural assumption that MCP SDK is designed with consideration
97
- for parallel server initialization.
98
- I'm not sure what I'm missing...
99
- (FYI, with the TypeScript MCP SDK, parallel initialization was
100
- pretty straightforward.)
101
- """
102
-
103
-
104
- async def spawn_mcp_server_tools_task(
37
+ # Type alias for the bidirectional communication channels with the MCP server
38
+ # FIXME: not defined in mcp.types, really?
39
+ StdioTransport: TypeAlias = tuple[
40
+ MemoryObjectReceiveStream[mcp_types.JSONRPCMessage | Exception],
41
+ MemoryObjectSendStream[mcp_types.JSONRPCMessage]
42
+ ]
43
+
44
+
45
+ async def spawn_mcp_server_and_get_transport(
105
46
  server_name: str,
106
47
  server_config: Dict[str, Any],
107
- langchain_tools: List[BaseTool],
108
- ready_event: asyncio.Event,
109
- cleanup_event: asyncio.Event,
48
+ exit_stack: AsyncExitStack,
110
49
  logger: logging.Logger = logging.getLogger(__name__)
111
- ) -> None:
112
- """Convert MCP server tools to LangChain compatible tools
113
- and manage lifecycle.
114
-
115
- This task initializes an MCP server connection, converts its tools
116
- to LangChain format, and manages the connection lifecycle.
117
- It adds the tools to the provided langchain_tools list and uses events
118
- for synchronization.
50
+ ) -> StdioTransport:
51
+ """
52
+ Spawns an MCP server process and establishes communication channels.
119
53
 
120
54
  Args:
121
- server_name: Name of the MCP server
122
- server_config: Server configuration dictionary containing command,
123
- args, and env
124
- langchain_tools: List to which the converted LangChain tools will
125
- be appended
126
- ready_event: Event to signal when tools are ready for use
127
- cleanup_event: Event to trigger cleanup and connection closure
128
- logger: Logger instance to use for logging events and errors.
129
- Defaults to module logger.
55
+ server_name: Server instance name to use for better logging
56
+ server_config: Configuration dictionary for server setup
57
+ exit_stack: Context manager for cleanup handling
58
+ logger: Logger instance for debugging and monitoring
130
59
 
131
60
  Returns:
132
- None
61
+ A tuple of receive and send streams for server communication
133
62
 
134
63
  Raises:
135
- Exception: If there's an error in server connection or tool conversion
64
+ Exception: If server spawning fails
136
65
  """
137
66
  try:
138
67
  logger.info(f'MCP server "{server_name}": initializing with:',
@@ -145,25 +74,59 @@ async def spawn_mcp_server_tools_task(
145
74
  if 'PATH' not in env:
146
75
  env['PATH'] = os.environ.get('PATH', '')
147
76
 
77
+ # Create server parameters with command, arguments and environment
148
78
  server_params = StdioServerParameters(
149
79
  command=server_config['command'],
150
80
  args=server_config.get('args', []),
151
81
  env=env
152
82
  )
153
83
 
154
- @asynccontextmanager
155
- async def log_before_aexit(context_manager, message):
156
- yield await context_manager.__aenter__()
157
- logger.info(message)
158
- await context_manager.__aexit__(None, None, None)
159
-
160
- exit_stack = AsyncExitStack()
161
-
84
+ # Initialize stdio client and register it with exit stack for cleanup
162
85
  stdio_transport = await exit_stack.enter_async_context(
163
86
  stdio_client(server_params)
164
87
  )
88
+ except Exception as e:
89
+ logger.error(f'Error spawning MCP server: {str(e)}')
90
+ raise
91
+
92
+ return stdio_transport
93
+
94
+
95
+ async def get_mcp_server_tools(
96
+ server_name: str,
97
+ stdio_transport: StdioTransport,
98
+ exit_stack: AsyncExitStack,
99
+ logger: logging.Logger = logging.getLogger(__name__)
100
+ ) -> List[BaseTool]:
101
+ """
102
+ Retrieves and converts MCP server tools to LangChain format.
103
+
104
+ Args:
105
+ server_name: Server instance name to use for better logging
106
+ stdio_transport: Communication channels tuple
107
+ exit_stack: Context manager for cleanup handling
108
+ logger: Logger instance for debugging and monitoring
109
+
110
+ Returns:
111
+ List of LangChain tools converted from MCP tools
112
+
113
+ Raises:
114
+ Exception: If tool conversion fails
115
+ """
116
+ try:
165
117
  read, write = stdio_transport
166
118
 
119
+ # Use an intermediate `asynccontextmanager` to log the cleanup message
120
+ @asynccontextmanager
121
+ async def log_before_aexit(context_manager, message):
122
+ """Helper context manager that logs before cleanup"""
123
+ yield await context_manager.__aenter__()
124
+ try:
125
+ logger.info(message)
126
+ finally:
127
+ await context_manager.__aexit__(None, None, None)
128
+
129
+ # Initialize client session with cleanup logging
167
130
  session = await exit_stack.enter_async_context(
168
131
  log_before_aexit(
169
132
  ClientSession(read, write),
@@ -174,12 +137,17 @@ async def spawn_mcp_server_tools_task(
174
137
  await session.initialize()
175
138
  logger.info(f'MCP server "{server_name}": connected')
176
139
 
140
+ # Get MCP tools
177
141
  tools_response = await session.list_tools()
178
142
 
143
+ # Wrap MCP tools into LangChain tools
144
+ langchain_tools: List[BaseTool] = []
179
145
  for tool in tools_response.tools:
146
+ # Define adapter class to convert MCP tool to LangChain format
180
147
  class McpToLangChainAdapter(BaseTool):
181
148
  name: str = tool.name or 'NO NAME'
182
149
  description: str = tool.description or ''
150
+ # Convert JSON schema to Pydantic model for argument validation
183
151
  args_schema: Type[BaseModel] = jsonschema_to_pydantic(
184
152
  tool.inputSchema
185
153
  )
@@ -190,12 +158,17 @@ async def spawn_mcp_server_tools_task(
190
158
  )
191
159
 
192
160
  async def _arun(self, **kwargs: Any) -> Any:
161
+ """
162
+ Asynchronously executes the tool with given arguments.
163
+ Logs input/output and handles errors.
164
+ """
193
165
  logger.info(f'MCP tool "{server_name}"/"{tool.name}"'
194
166
  f' received input:', kwargs)
195
167
  result = await session.call_tool(self.name, kwargs)
196
168
  if result.isError:
197
169
  raise ToolException(result.content)
198
170
 
171
+ # Log result size for monitoring
199
172
  size = asizeof.asizeof(result.content)
200
173
  logger.info(f'MCP tool "{server_name}"/"{tool.name}" '
201
174
  f'received result (size: {size})')
@@ -203,21 +176,19 @@ async def spawn_mcp_server_tools_task(
203
176
 
204
177
  langchain_tools.append(McpToLangChainAdapter())
205
178
 
179
+ # Log available tools for debugging
206
180
  logger.info(f'MCP server "{server_name}": {len(langchain_tools)} '
207
181
  f'tool(s) available:')
208
182
  for tool in langchain_tools:
209
183
  logger.info(f'- {tool.name}')
210
184
  except Exception as e:
211
- logger.error(f'Error getting response: {str(e)}')
185
+ logger.error(f'Error getting MCP tools: {str(e)}')
212
186
  raise
213
187
 
214
- ready_event.set()
215
-
216
- await cleanup_event.wait()
217
-
218
- await exit_stack.aclose()
188
+ return langchain_tools
219
189
 
220
190
 
191
+ # Type hint for cleanup function
221
192
  McpServerCleanupFn = Callable[[], Awaitable[None]]
222
193
 
223
194
 
@@ -254,38 +225,46 @@ async def convert_mcp_to_langchain_tools(
254
225
  # Use tools...
255
226
  await cleanup()
256
227
  """
257
- per_server_tools = []
258
- ready_event_list = []
259
- cleanup_event_list = []
260
228
 
261
- tasks = []
229
+ # Initialize AsyncExitStack for managing multiple server lifecycles
230
+ stdio_transports: List[StdioTransport] = []
231
+ async_exit_stack = AsyncExitStack()
232
+
233
+ # Spawn all MCP servers concurrently
262
234
  for server_name, server_config in server_configs.items():
263
- server_tools_accumulator: List[BaseTool] = []
264
- per_server_tools.append(server_tools_accumulator)
265
- ready_event = asyncio.Event()
266
- ready_event_list.append(ready_event)
267
- cleanup_event = asyncio.Event()
268
- cleanup_event_list.append(cleanup_event)
269
- task = asyncio.create_task(spawn_mcp_server_tools_task(
235
+ # NOTE: the following `await` only blocks until the server subprocess
236
+ # is spawned, i.e. after returning from the `await`, the spawned
237
+ # subprocess starts its initialization independently of (so in
238
+ # parallel with) the Python execution of the following lines.
239
+ stdio_transport = await spawn_mcp_server_and_get_transport(
270
240
  server_name,
271
241
  server_config,
272
- server_tools_accumulator,
273
- ready_event,
274
- cleanup_event,
242
+ async_exit_stack,
275
243
  logger
276
- ))
277
- tasks.append(task)
278
-
279
- await asyncio.gather(*(event.wait() for event in ready_event_list))
280
-
281
- langchain_tools = [
282
- item for sublist in per_server_tools for item in sublist
283
- ]
244
+ )
245
+ stdio_transports.append(stdio_transport)
246
+
247
+ # Convert tools from each server to LangChain format
248
+ langchain_tools: List[BaseTool] = []
249
+ for (server_name, server_config), stdio_transport in zip(
250
+ server_configs.items(),
251
+ stdio_transports,
252
+ strict=True
253
+ ):
254
+ tools = await get_mcp_server_tools(
255
+ server_name,
256
+ stdio_transport,
257
+ async_exit_stack,
258
+ logger
259
+ )
260
+ langchain_tools.extend(tools)
284
261
 
262
+ # Define a cleanup function to properly shut down all servers
285
263
  async def mcp_cleanup() -> None:
286
- for event in cleanup_event_list:
287
- event.set()
264
+ """Closes all server connections and cleans up resources"""
265
+ await async_exit_stack.aclose()
288
266
 
267
+ # Log summary of initialized tools
289
268
  logger.info(f'MCP servers initialized: {len(langchain_tools)} tool(s) '
290
269
  f'available in total')
291
270
  for tool in langchain_tools:
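The rewrite above drops the per-server `asyncio.Event` tasks in favor of one shared `AsyncExitStack`: every transport and session is registered on the stack, and the returned `mcp_cleanup` just calls `aclose()`, which unwinds the registered contexts in reverse order from the caller's own task. A stdlib-only sketch of that pattern (the fake sessions are illustrative, not part of the package):

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager

closed = []  # records the order in which "sessions" are cleaned up

@asynccontextmanager
async def fake_session(name):
    yield name           # resource body: the "live" session
    closed.append(name)  # runs when the exit stack unwinds

async def main():
    stack = AsyncExitStack()
    for name in ['filesystem', 'fetch']:
        await stack.enter_async_context(fake_session(name))

    async def cleanup():  # plays the role of mcp_cleanup / McpServerCleanupFn
        await stack.aclose()

    await cleanup()

asyncio.run(main())
print(closed)  # -> ['fetch', 'filesystem']: LIFO, like nested `async with`
```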
@@ -0,0 +1,105 @@
1
+ Metadata-Version: 2.2
2
+ Name: langchain-mcp-tools
3
+ Version: 0.1.2
4
+ Summary: Model Context Protocol (MCP) To LangChain Tools Conversion Utility
5
+ Project-URL: Bug Tracker, https://github.com/hideya/langchain-mcp-tools-py/issues
6
+ Project-URL: Source Code, https://github.com/hideya/langchain-mcp-tools-py
7
+ Keywords: modelcontextprotocol,mcp,mcp-client,langchain,langchain-python,tool-call,tool-calling,python
8
+ Requires-Python: >=3.11
9
+ Description-Content-Type: text/markdown
10
+ License-File: LICENSE
11
+ Requires-Dist: jsonschema-pydantic>=0.6
12
+ Requires-Dist: langchain>=0.3.14
13
+ Requires-Dist: langchain-anthropic>=0.3.1
14
+ Requires-Dist: langchain-groq>=0.2.3
15
+ Requires-Dist: langchain-openai>=0.3.0
16
+ Requires-Dist: langgraph>=0.2.62
17
+ Requires-Dist: mcp>=1.2.0
18
+ Requires-Dist: pyjson5>=1.6.8
19
+ Requires-Dist: pympler>=1.1
20
+ Requires-Dist: python-dotenv>=1.0.1
21
+ Provides-Extra: dev
22
+ Requires-Dist: pytest>=8.3.4; extra == "dev"
23
+ Requires-Dist: pytest-asyncio>=0.25.2; extra == "dev"
24
+
25
+ # MCP To LangChain Tools Conversion Utility [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/langchain-mcp-tools-py/blob/main/LICENSE) [![pypi version](https://img.shields.io/pypi/v/langchain-mcp-tools.svg)](https://pypi.org/project/langchain-mcp-tools/)
26
+
27
+ This package is intended to simplify the use of
28
+ [Model Context Protocol (MCP)](https://modelcontextprotocol.io/)
29
+ server tools with LangChain / Python.
30
+
31
+ It contains a utility function `convert_mcp_to_langchain_tools()`.
32
+ This async function handles parallel initialization of multiple specified MCP servers
33
+ and converts their available tools into a list of LangChain-compatible tools.
34
+
35
+ A TypeScript equivalent of this utility library is available
36
+ [here](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools).
37
+
38
+ ## Requirements
39
+
40
+ - Python 3.11+
41
+
42
+ ## Installation
43
+
44
+ ```bash
45
+ pip install langchain-mcp-tools
46
+ ```
47
+
48
+ ## Quick Start
49
+
50
+ The `convert_mcp_to_langchain_tools()` utility function accepts MCP server configurations
51
+ that follow the same structure as
52
+ [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
53
+ but containing only the contents of the `mcpServers` property,
54
+ expressed as a `dict`, e.g.:
55
+
56
+ ```python
57
+ mcp_configs = {
58
+ 'filesystem': {
59
+ 'command': 'npx',
60
+ 'args': ['-y', '@modelcontextprotocol/server-filesystem', '.']
61
+ },
62
+ 'fetch': {
63
+ 'command': 'uvx',
64
+ 'args': ['mcp-server-fetch']
65
+ }
66
+ }
67
+
68
+ tools, cleanup = await convert_mcp_to_langchain_tools(
69
+ mcp_configs
70
+ )
71
+ ```
72
+
73
+ This utility function initializes all specified MCP servers in parallel,
74
+ and returns LangChain Tools
75
+ ([`tools: List[BaseTool]`](https://python.langchain.com/api_reference/core/tools/langchain_core.tools.base.BaseTool.html#langchain_core.tools.base.BaseTool))
76
+ by gathering available MCP tools from the servers,
77
+ and wrapping them into LangChain tools.
78
+ It also returns an async callback function (`cleanup: McpServerCleanupFn`)
79
+ to be invoked to close all MCP server sessions when finished.
80
+
81
+ The returned tools can be used with LangChain, e.g.:
82
+
83
+ ```python
84
+ # from langchain.chat_models import init_chat_model
85
+ llm = init_chat_model(
86
+ model='claude-3-5-haiku-latest',
87
+ model_provider='anthropic'
88
+ )
89
+
90
+ # from langgraph.prebuilt import create_react_agent
91
+ agent = create_react_agent(
92
+ llm,
93
+ tools
94
+ )
95
+ ```
96
+ A simple usage example, easy to experiment with, can be found
97
+ [here](https://github.com/hideya/langchain-mcp-tools-py-usage/blob/main/src/example.py).
98
+
99
+ A more realistic usage example can be found
100
+ [here](https://github.com/hideya/mcp-client-langchain-py).
101
+
102
+
103
+ ## Limitations
104
+
105
+ Currently, only text results of tool calls are supported.
@@ -1,180 +0,0 @@
1
- Metadata-Version: 2.2
2
- Name: langchain-mcp-tools
3
- Version: 0.1.0
4
- Summary: Model Context Protocol (MCP) To LangChain Tools Conversion Utility
5
- Project-URL: Bug Tracker, https://github.com/hideya/langchain-mcp-tools-py/issues
6
- Project-URL: Source Code, https://github.com/hideya/langchain-mcp-tools-py
7
- Requires-Python: >=3.11
8
- Description-Content-Type: text/markdown
9
- License-File: LICENSE
10
- Requires-Dist: jsonschema-pydantic>=0.6
11
- Requires-Dist: langchain>=0.3.14
12
- Requires-Dist: langchain-anthropic>=0.3.1
13
- Requires-Dist: langchain-groq>=0.2.3
14
- Requires-Dist: langchain-openai>=0.3.0
15
- Requires-Dist: langgraph>=0.2.62
16
- Requires-Dist: mcp>=1.2.0
17
- Requires-Dist: pyjson5>=1.6.8
18
- Requires-Dist: pympler>=1.1
19
- Requires-Dist: python-dotenv>=1.0.1
20
- Provides-Extra: dev
21
- Requires-Dist: pytest>=8.3.4; extra == "dev"
22
- Requires-Dist: pytest-asyncio>=0.25.2; extra == "dev"
23
-
24
- # MCP To LangChain Tools Conversion Utility [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/langchain-mcp-tools-py/blob/main/LICENSE) [![pypi version](https://img.shields.io/pypi/v/langchain-mcp-tools.svg)](https://pypi.org/project/langchain-mcp-tools/)
25
-
26
- This package is intended to simplify the use of
27
- [Model Context Protocol (MCP)](https://modelcontextprotocol.io/)
28
- server tools with LangChain / Python.
29
-
30
- It contains a utility function `convert_mcp_to_langchain_tools()`.
31
- This function handles parallel initialization of multiple specified MCP servers
32
- and converts their available tools into a list of LangChain-compatible tools.
33
-
34
- A TypeScript equivalent of this utility library is available
35
- [here](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools)
36
-
37
- ## Requirements
38
-
39
- - Python 3.11+
40
-
41
- ## Installation
42
-
43
- ```bash
44
- pip install langchain-mcp-tools
45
- ```
46
-
47
- ## Quick Start
48
-
49
- `convert_mcp_to_langchain_tools()` utility function accepts MCP server configurations
50
- that follow the same structure as
51
- [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
52
- but only the contents of the `mcpServers` property,
53
- and is expressed as a `dict`, e.g.:
54
-
55
- ```python
56
- mcp_configs = {
57
- 'filesystem': {
58
- 'command': 'npx',
59
- 'args': ['-y', '@modelcontextprotocol/server-filesystem', '.']
60
- },
61
- 'fetch': {
62
- 'command': 'uvx',
63
- 'args': ['mcp-server-fetch']
64
- }
65
- }
66
-
67
- tools, cleanup = await convert_mcp_to_langchain_tools(
68
- mcp_configs
69
- )
70
- ```
71
-
72
- This utility function initializes all specified MCP servers in parallel,
73
- and returns LangChain Tools
74
- ([`tools: List[BaseTool]`](https://python.langchain.com/api_reference/core/tools/langchain_core.tools.base.BaseTool.html#langchain_core.tools.base.BaseTool))
75
- by gathering available MCP tools from the servers,
76
- and by wrapping them into LangChain tools.
77
- It also returns an async callback function (`cleanup: McpServerCleanupFn`)
78
- to be invoked to close all MCP server sessions when finished.
79
-
80
- The returned tools can be used with LangChain, e.g.:
81
-
82
- ```python
83
- # from langchain.chat_models import init_chat_model
84
- llm = init_chat_model(
85
- model='claude-3-5-haiku-latest',
86
- model_provider='anthropic'
87
- )
88
-
89
- # from langgraph.prebuilt import create_react_agent
90
- agent = create_react_agent(
91
- llm,
92
- tools
93
- )
94
- ```
95
- A simple usage example, easy to experiment with, can be found
96
- [here](https://github.com/hideya/langchain-mcp-tools-py-usage/blob/main/src/example.py)
97
-
98
- A more realistic usage example can be found
99
- [here](https://github.com/hideya/mcp-client-langchain-py)
100
-
101
-
102
- ## Limitations
103
-
104
- Currently, only text results of tool calls are supported.
105
-
106
- ## Technical Details
107
-
108
- It was very tricky (for me) to get the parallel MCP server initialization
109
- to work, including successful final resource cleanup...
110
-
111
- I'm new to Python, so it is very possible that my ignorance is playing
112
- a big role here...
113
- I'll summarize the difficulties I faced below.
114
- The source code is available
115
- [here](https://github.com/hideya/langchain-mcp-tools-py/blob/main/src/langchain_mcp_tools/langchain_mcp_tools.py).
116
- Any comments pointing out something I am missing would be greatly appreciated!
117
- [(comment here)](https://github.com/hideya/langchain-mcp-tools-ts/issues)
118
-
119
- 1. Challenge:
120
-
121
- A key requirement for parallel initialization is that each server must be
122
- initialized in its own dedicated task - there's no way around this as far as
123
- I know. However, this poses a challenge when combined with
124
- `asynccontextmanager`.
125
-
126
- Resource management for `stdio_client` and `ClientSession` seems
127
- to require relying exclusively on `asynccontextmanager` for cleanup,
128
- with no manual cleanup options
129
- (based on [the mcp python-sdk impl as of Jan 14, 2025](https://github.com/modelcontextprotocol/python-sdk/tree/99727a9/src/mcp/client))
130
- - Initializing multiple MCP servers in parallel requires a dedicated
131
- `asyncio.Task` per server
132
- - Server cleanup can be initiated later by a task other than the one
133
- that initialized the resources
134
-
135
- 2. Solution:
136
-
137
- The key insight is to keep the initialization tasks alive throughout the
138
- session lifetime, rather than letting them complete after initialization.
139
-
140
- By using `asyncio.Event`s for coordination, we can:
141
- - Allow parallel initialization while maintaining proper context management
142
- - Keep each initialization task running until explicit cleanup is requested
143
- - Ensure cleanup occurs in the same task that created the resources
144
- - Provide a clean interface for the caller to manage the lifecycle
145
-
146
- Alternative Considered:
147
- A generator/coroutine approach using `finally` block for cleanup was
148
- considered but rejected because:
149
- - It turned out that the `finally` block in a generator/coroutine can be
150
- executed by a different task than the one that ran the main body of
151
- the code
152
- - This breaks the requirement that `AsyncExitStack.aclose()` must be
153
- called from the same task that created the context
154
-
155
- 3. Task Lifecycle:
156
-
157
- The following task lifecycle diagram illustrates how the above strategy
158
- was implemented:
159
- ```
160
- [Task starts]
161
-
162
- Initialize server & convert tools
163
-
164
- Set ready_event (signals tools are ready)
165
-
166
- await cleanup_event.wait() (keeps task alive)
167
-
168
- When cleanup_event is set:
169
- exit_stack.aclose() (cleanup in original task)
170
- ```
171
- This approach indeed enables parallel initialization while maintaining proper
172
- async resource lifecycle management through context managers.
173
- However, I'm afraid I'm twisting things around too much.
174
- It usually means I'm doing something very worng...
175
-
176
- I think it is a natural assumption that MCP SDK is designed with consideration
177
- for parallel server initialization.
178
- I'm not sure what I'm missing...
179
- (FYI, with the TypeScript MCP SDK, parallel initialization was
180
- [pretty straightforward](https://github.com/hideya/langchain-mcp-tools-ts/blob/main/src/langchain-mcp-tools.ts))
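The keep-alive strategy in the lifecycle diagram above can be reduced to a small, runnable sketch using only the standard library. Server initialization is simulated here, and the names (`server_task`, the `tools` dict) are illustrative, not taken from the actual implementation; only the `ready_event` / `cleanup_event` coordination mirrors the described design:

```python
import asyncio


async def server_task(name: str, ready_event: asyncio.Event,
                      cleanup_event: asyncio.Event,
                      tools: dict) -> None:
    # Initialize the server and convert its tools (simulated here).
    tools[name] = [f'{name}-tool']
    # Signal the caller that this server's tools are ready.
    ready_event.set()
    # Keep the task alive until cleanup is requested ...
    await cleanup_event.wait()
    # ... so that cleanup runs in the same task that created the
    # resources (the real code would call exit_stack.aclose() here).
    tools.pop(name)


async def main() -> dict:
    tools: dict = {}
    cleanup_event = asyncio.Event()
    ready_events = {name: asyncio.Event() for name in ('filesystem', 'fetch')}
    tasks = [
        asyncio.create_task(server_task(name, ev, cleanup_event, tools))
        for name, ev in ready_events.items()
    ]
    # Parallel initialization: wait until every server reports ready.
    await asyncio.gather(*(ev.wait() for ev in ready_events.values()))
    snapshot = dict(tools)      # tools are usable from here on
    cleanup_event.set()         # the caller's `cleanup()` would do this
    await asyncio.gather(*tasks)
    return snapshot


print(asyncio.run(main()))
# → {'filesystem': ['filesystem-tool'], 'fetch': ['fetch-tool']}
```

Each task both acquires and releases its own (simulated) resources, while the caller only observes the events, which is the property the design above is after.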
@@ -1,157 +0,0 @@
- # MCP To LangChain Tools Conversion Utility [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/langchain-mcp-tools-py/blob/main/LICENSE) [![pypi version](https://img.shields.io/pypi/v/langchain-mcp-tools.svg)](https://pypi.org/project/langchain-mcp-tools/)
-
- This package is intended to simplify the use of
- [Model Context Protocol (MCP)](https://modelcontextprotocol.io/)
- server tools with LangChain / Python.
-
- It contains a utility function `convert_mcp_to_langchain_tools()`.
- This function handles parallel initialization of multiple specified MCP servers
- and converts their available tools into a list of LangChain-compatible tools.
-
- A TypeScript equivalent of this utility library is available
- [here](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools).
-
- ## Requirements
-
- - Python 3.11+
-
- ## Installation
-
- ```bash
- pip install langchain-mcp-tools
- ```
-
- ## Quick Start
-
- The `convert_mcp_to_langchain_tools()` utility function accepts MCP server
- configurations that follow the same structure as
- [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
- but only the contents of the `mcpServers` property,
- expressed as a `dict`, e.g.:
-
- ```python
- mcp_configs = {
-     'filesystem': {
-         'command': 'npx',
-         'args': ['-y', '@modelcontextprotocol/server-filesystem', '.']
-     },
-     'fetch': {
-         'command': 'uvx',
-         'args': ['mcp-server-fetch']
-     }
- }
-
- tools, cleanup = await convert_mcp_to_langchain_tools(
-     mcp_configs
- )
- ```
-
- This utility function initializes all specified MCP servers in parallel,
- and returns LangChain Tools
- ([`tools: List[BaseTool]`](https://python.langchain.com/api_reference/core/tools/langchain_core.tools.base.BaseTool.html#langchain_core.tools.base.BaseTool))
- by gathering the available MCP tools from the servers
- and wrapping them in LangChain tools.
- It also returns an async callback function (`cleanup: McpServerCleanupFn`)
- to be invoked to close all MCP server sessions when finished.
-
- The returned tools can be used with LangChain, e.g.:
-
- ```python
- # from langchain.chat_models import init_chat_model
- llm = init_chat_model(
-     model='claude-3-5-haiku-latest',
-     model_provider='anthropic'
- )
-
- # from langgraph.prebuilt import create_react_agent
- agent = create_react_agent(
-     llm,
-     tools
- )
- ```
- A simple usage example that is easy to experiment with can be found
- [here](https://github.com/hideya/langchain-mcp-tools-py-usage/blob/main/src/example.py).
-
- A more realistic usage example can be found
- [here](https://github.com/hideya/mcp-client-langchain-py).
-
- ## Limitations
-
- Currently, only text results of tool calls are supported.
-
- ## Technical Details
-
- It was very tricky (for me) to get the parallel MCP server initialization
- to work, including successful final resource cleanup...
-
- I'm new to Python, so it is very possible that my ignorance is playing
- a big role here...
- I'll summarize the difficulties I faced below.
- The source code is available
- [here](https://github.com/hideya/langchain-mcp-tools-py/blob/main/src/langchain_mcp_tools/langchain_mcp_tools.py).
- Any comments pointing out something I am missing would be greatly appreciated!
- [(comment here)](https://github.com/hideya/langchain-mcp-tools-ts/issues)
-
- 1. Challenge:
-
-    A key requirement for parallel initialization is that each server must be
-    initialized in its own dedicated task - there's no way around this as far
-    as I know. However, this poses a challenge when combined with
-    `asynccontextmanager`:
-
-    - Resource management for `stdio_client` and `ClientSession` seems
-      to require relying exclusively on `asynccontextmanager` for cleanup,
-      with no manual cleanup options
-      (based on [the mcp python-sdk impl as of Jan 14, 2025](https://github.com/modelcontextprotocol/python-sdk/tree/99727a9/src/mcp/client))
-    - Initializing multiple MCP servers in parallel requires a dedicated
-      `asyncio.Task` per server
-    - Server cleanup can be initiated later by a task other than the one
-      that initialized the resources
-
- 2. Solution:
-
-    The key insight is to keep the initialization tasks alive throughout the
-    session lifetime, rather than letting them complete after initialization.
-
-    By using `asyncio.Event`s for coordination, we can:
-    - Allow parallel initialization while maintaining proper context management
-    - Keep each initialization task running until explicit cleanup is requested
-    - Ensure cleanup occurs in the same task that created the resources
-    - Provide a clean interface for the caller to manage the lifecycle
-
-    Alternative considered:
-    A generator/coroutine approach using a `finally` block for cleanup was
-    considered but rejected because:
-    - It turned out that the `finally` block in a generator/coroutine can be
-      executed by a different task than the one that ran the main body of
-      the code
-    - This breaks the requirement that `AsyncExitStack.aclose()` must be
-      called from the same task that created the context
-
- 3. Task Lifecycle:
-
-    The following task lifecycle diagram illustrates how the above strategy
-    was implemented:
-    ```
-    [Task starts]
-
-    Initialize server & convert tools
-
-    Set ready_event (signals tools are ready)
-
-    await cleanup_event.wait() (keeps task alive)
-
-    When cleanup_event is set:
-    exit_stack.aclose() (cleanup in original task)
-    ```
- This approach indeed enables parallel initialization while maintaining proper
- async resource lifecycle management through context managers.
- However, I'm afraid I'm twisting things around too much;
- that usually means I'm doing something very wrong...
-
- I think it is natural to assume that the MCP SDK is designed with
- parallel server initialization in mind.
- I'm not sure what I'm missing...
- (FYI, with the TypeScript MCP SDK, parallel initialization was
- [pretty straightforward](https://github.com/hideya/langchain-mcp-tools-ts/blob/main/src/langchain-mcp-tools.ts))
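The pitfall behind the rejected alternative can be demonstrated in isolation. The self-contained snippet below (all names are illustrative) shows an async generator whose `finally` block runs in a different task than the one that ran its body, which is exactly why `finally`-based cleanup breaks the same-task requirement described above:

```python
import asyncio


async def resource_gen(log: list):
    # Record which task runs the body and which runs the `finally`.
    try:
        log.append(('body', asyncio.current_task().get_name()))
        yield 'resource'
    finally:
        log.append(('finally', asyncio.current_task().get_name()))


async def demo() -> list:
    log: list = []
    gen = resource_gen(log)

    async def consumer() -> None:
        # Run the generator body in a separate task, then abandon it
        # mid-iteration (`break` does not close an async generator).
        async for _ in gen:
            break

    await asyncio.create_task(consumer())
    # Finalize the abandoned generator from the current, different task:
    # the `finally` block now runs here, not in the consumer task.
    await gen.aclose()
    return log


log = asyncio.run(demo())
print(log)  # e.g. [('body', 'Task-2'), ('finally', 'Task-1')]
```

The two recorded task names differ, so any cleanup that must happen in the resource-creating task (such as `AsyncExitStack.aclose()` in the MCP SDK's case) cannot safely live in such a `finally` block.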