langchain-mcp-tools 0.0.3__py3-none-any.whl → 0.0.5__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,174 @@
1
+ Metadata-Version: 2.2
2
+ Name: langchain_mcp_tools
3
+ Version: 0.0.5
4
+ Summary: Model Context Protocol (MCP) To LangChain Tools Conversion Utility
5
+ Project-URL: Bug Tracker, https://github.com/hideya/langchain-mcp-tools-py/issues
6
+ Project-URL: Source Code, https://github.com/hideya/langchain-mcp-tools-py
7
+ Requires-Python: >=3.11
8
+ Description-Content-Type: text/markdown
9
+ License-File: LICENSE
10
+ Requires-Dist: jsonschema-pydantic>=0.6
11
+ Requires-Dist: langchain-core>=0.3.29
12
+ Requires-Dist: langchain>=0.3.14
13
+ Requires-Dist: langchain-anthropic>=0.3.1
14
+ Requires-Dist: langchain-groq>=0.2.3
15
+ Requires-Dist: langchain-openai>=0.3.0
16
+ Requires-Dist: langgraph>=0.2.62
17
+ Requires-Dist: mcp>=1.2.0
18
+ Requires-Dist: pyjson5>=1.6.8
19
+ Requires-Dist: pympler>=1.1
20
+ Requires-Dist: python-dotenv>=1.0.1
21
+
22
+ # MCP To LangChain Tools Conversion Utility [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/mcp-langchain-client-ts/blob/main/LICENSE) [![pypi version](https://img.shields.io/pypi/v/langchain-mcp-tools.svg)](https://pypi.org/project/langchain-mcp-tools/)
23
+
24
+ This package is intended to simplify the use of
25
+ [Model Context Protocol (MCP)](https://modelcontextprotocol.io/)
26
+ server tools with LangChain / Python.
27
+
28
+ It contains a utility function `convert_mcp_to_langchain_tools()`.
29
+ This function handles parallel initialization of multiple specified MCP servers
30
+ and converts their available tools into a list of
31
+ [LangChain-compatible tools](https://js.langchain.com/docs/how_to/tool_calling/).
32
+
33
+ A TypeScript equivalent of this utility library is available
34
+ [here](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools).
35
+
36
+ ## Requirements
37
+
38
+ - Python 3.11+
39
+
40
+ ## Installation
41
+
42
+ ```bash
43
+ pip install langchain-mcp-tools
44
+ ```
45
+
46
+ ## Quick Start
47
+
48
+ The `convert_mcp_to_langchain_tools()` utility function accepts MCP server configurations
49
+ that follow the same structure as
50
+ [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
51
+ but only the contents of the `mcpServers` property,
52
+ expressed as a `dict`, e.g.:
53
+
54
+ ```python
55
+ mcp_configs = {
56
+ 'filesystem': {
57
+ 'command': 'npx',
58
+ 'args': ['-y', '@modelcontextprotocol/server-filesystem', '.']
59
+ },
60
+ 'fetch': {
61
+ 'command': 'uvx',
62
+ 'args': ['mcp-server-fetch']
63
+ }
64
+ }
65
+
66
+ tools, cleanup = await convert_mcp_to_langchain_tools(
67
+ mcp_configs
68
+ )
69
+ ```
70
+
71
+ The utility function initializes all specified MCP servers in parallel,
72
+ gathers their available tools,
73
+ and returns them wrapped as
74
+ [LangChain tools](https://js.langchain.com/docs/how_to/tool_calling/) (`List[BaseTool]`).
75
+ It also returns a cleanup callback function (`McpServerCleanupFn`)
76
+ to be invoked to close all MCP server sessions when finished.
77
+
78
+ The returned tools can be used with LangChain, e.g.:
79
+
80
+ ```python
81
+ # from langchain.chat_models import init_chat_model
82
+ llm = init_chat_model(
83
+ model='claude-3-5-haiku-latest',
84
+ model_provider='anthropic'
85
+ )
86
+
87
+ # from langgraph.prebuilt import create_react_agent
88
+ agent = create_react_agent(
89
+ llm,
90
+ tools
91
+ )
92
+ ```
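
For completeness, here is a minimal sketch of tying the snippets above together, including the cleanup callback returned by `convert_mcp_to_langchain_tools()`. The query text and the message format passed to `ainvoke` follow common LangGraph usage and are assumptions, not taken from this package's docs:

```python
# Illustrative only: run the agent built above, then close the MCP sessions.
try:
    result = await agent.ainvoke({
        'messages': [('human', 'Summarize the README.md in the current directory')]
    })
    print(result['messages'][-1].content)
finally:
    # Always invoke the returned cleanup callback when done,
    # so that all MCP server sessions are closed.
    await cleanup()
```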
93
+ <!-- A simple and experimentable usage example can be found
94
+ [here](https://github.com/hideya/langchain-mcp-tools-ts-usage/blob/main/src/index.ts) -->
95
+
96
+ <!-- A more realistic usage example can be found
97
+ [here](https://github.com/hideya/langchain-mcp-client-ts) -->
98
+
99
+ A usage example can be found
100
+ [here](https://github.com/hideya/mcp-client-langchain-py)
101
+
102
+ ## Limitations
103
+
104
+ Currently, only text results of tool calls are supported.
105
+
106
+ ## Technical Details
107
+
108
+ It was very tricky (for me) to get the parallel MCP server initialization
109
+ to work successfully...
110
+
111
+ I'm new to Python, so it is very possible that my ignorance is playing
112
+ a big role here...
113
+ I'm summarizing why it was difficult for me below.
114
+ Any comments pointing out something I am missing would be greatly appreciated!
115
+ [(comment here)](https://github.com/hideya/langchain-mcp-tools-py/issues)
116
+
117
+ 1. Core Challenge:
118
+ - Async resource management for `stdio_client` and `ClientSession` seems
119
+ to rely exclusively on `asynccontextmanager` for cleanup with no manual
120
+ cleanup options (based on the mcp python-sdk impl as of Jan 14, 2025; see the sketch below)
121
+ - Initializing multiple MCP servers in parallel requires a dedicated
122
+ `asyncio.Task` per server
123
+ - Necessity of keeping sessions alive for later use after initialization
124
+ - Ensuring proper cleanup later in the same task that created them
125
+
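
To illustrate the first point above: in the MCP python-sdk (as of early 2025), both `stdio_client` and `ClientSession` are async context managers, so a session only exists inside nested `async with` blocks. A simplified, illustrative sketch (the server command is a placeholder taken from the Quick Start config):

```python
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command='uvx',
    args=['mcp-server-fetch'],
)

async def probe_server() -> None:
    # The session exists only inside these nested `async with` blocks;
    # there is no public API to keep it open past this point or to
    # close it manually later from somewhere else.
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])
```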
126
+ 2. Solution Strategy:
127
+ A key requirement for parallel initialization is that each server must be
128
+ initialized in its own dedicated task - there's no way around this if we
129
+ want true parallel initialization. However, this creates a challenge since
130
+ we also need to maintain long-lived sessions and handle cleanup properly.
131
+
132
+ The key insight is to keep the initialization tasks alive throughout the
133
+ session lifetime, rather than letting them complete after initialization.
134
+
135
+ By using `asyncio.Event`s for coordination, we can:
136
+ - Allow parallel initialization while maintaining proper context management
137
+ - Keep each initialization task running until explicit cleanup is requested
138
+ - Ensure cleanup occurs in the same task that created the resources
139
+ - Provide a clean interface for the caller to manage the lifecycle
140
+
141
+ Alternative Considered:
142
+ A generator/coroutine approach using a `finally` block for cleanup was
143
+ considered but rejected because:
144
+ - It turned out that the `finally` block in a generator/coroutine can be
145
+ executed by a different task than the one that ran the main body of
146
+ the code
147
+ - This breaks the requirement that `AsyncExitStack.aclose()` must be
148
+ called from the same task that created the context
149
+
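
This `finally` behavior can be demonstrated in isolation. In the self-contained sketch below (not code from this package), an async generator's body runs in one task while its `finally` clause runs in whichever task closes the generator, which is exactly the kind of task mismatch the same-task cleanup requirement guards against:

```python
import asyncio

async def resource_holder():
    try:
        yield 'resource'
    finally:
        # Runs in whichever task resumes/closes the generator,
        # not necessarily the task that executed the body above.
        print('finally ran in:', asyncio.current_task().get_name())

async def main():
    gen = resource_holder()

    async def creator():
        async for _ in gen:   # the generator body runs here
            break             # leave it suspended at the yield
        print('body ran in:', asyncio.current_task().get_name())

    await asyncio.create_task(creator(), name='creator-task')
    # Close the still-suspended generator from a *different* task:
    await asyncio.create_task(gen.aclose(), name='cleanup-task')

asyncio.run(main())
```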
150
+ 3. Task Lifecycle:
151
+
152
+ To allow the initialization task to stay alive waiting for cleanup:
153
+ ```
154
+ [Task starts]
155
+
156
+ Initialize server & convert tools
157
+
158
+ Set ready_event (signals tools are ready)
159
+
160
+ await cleanup_event.wait() (keeps task alive)
161
+
162
+ When cleanup_event is set:
163
+ exit_stack.aclose() (cleanup in original task)
164
+ ```
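
Condensed into code, the lifecycle above looks roughly like the following self-contained sketch; `fake_server_session` stands in for the real `stdio_client`/`ClientSession` pair, and all names are illustrative rather than this package's actual internals:

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager

@asynccontextmanager
async def fake_server_session(name: str):
    # Stand-in for stdio_client/ClientSession: cleanup only via context exit.
    print(f'{name}: session opened')
    try:
        yield f'{name}-session'
    finally:
        print(f'{name}: session closed in', asyncio.current_task().get_name())

async def init_and_hold(name: str, ready: asyncio.Event,
                        cleanup: asyncio.Event, sessions: dict) -> None:
    """One task per server: initialize, signal readiness, then stay alive
    until cleanup is requested, so aclose() runs in the creating task."""
    exit_stack = AsyncExitStack()
    sessions[name] = await exit_stack.enter_async_context(fake_server_session(name))
    ready.set()                # session/tools are ready for the caller
    await cleanup.wait()       # keep this task (and its contexts) alive
    await exit_stack.aclose()  # cleanup in the same task that created it

async def main():
    names = ['filesystem', 'fetch']
    sessions: dict[str, str] = {}
    ready_events = {n: asyncio.Event() for n in names}
    cleanup_events = {n: asyncio.Event() for n in names}
    tasks = [asyncio.create_task(
                 init_and_hold(n, ready_events[n], cleanup_events[n], sessions),
                 name=f'{n}-task')
             for n in names]
    await asyncio.gather(*(e.wait() for e in ready_events.values()))  # parallel init
    print('all sessions ready:', sessions)
    for e in cleanup_events.values():  # later, when finished: request cleanup...
        e.set()
    await asyncio.gather(*tasks)       # ...and let each task close its own resources

asyncio.run(main())
```

Presumably, the actual implementation enters `stdio_client` and `ClientSession` on the exit stack and converts the listed MCP tools before setting the ready event.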
165
+ This approach indeed enables parallel initialization while maintaining proper
166
+ async resource lifecycle management through context managers.
167
+ However, I'm afraid I'm twisting things around too much.
168
+ It usually means I'm doing something very wrong...
169
+
170
+ I think it is a natural assumption that the MCP SDK is designed with consideration
171
+ for parallel server initialization.
172
+ I'm not sure what I'm missing...
173
+ (FYI, with the TypeScript MCP SDK, parallel initialization was
174
+ [pretty straightforward](https://github.com/hideya/langchain-mcp-tools-ts/blob/main/src/langchain-mcp-tools.ts))
@@ -0,0 +1,6 @@
1
+ src/langchain_mcp_tools.py,sha256=0vIjVWyK5Ps7p3rxHVOEjbTywuLY3ON5G-hHJj-XkF0,10451
2
+ langchain_mcp_tools-0.0.5.dist-info/LICENSE,sha256=CRC91e8v116gCpnp7h49oIa6_zjhxqnHFTREeoZFJwA,1072
3
+ langchain_mcp_tools-0.0.5.dist-info/METADATA,sha256=QxO5dVf1V1dktQlsWD5u-VyquTrQoqSLNInDMmDAI34,6687
4
+ langchain_mcp_tools-0.0.5.dist-info/WHEEL,sha256=In9FTNxeP60KnTkGw7wk6mJPYd_dQSjEZmXdBdMCI-8,91
5
+ langchain_mcp_tools-0.0.5.dist-info/top_level.txt,sha256=74rtVfumQlgAPzR5_2CgYN24MB0XARCg0t-gzk6gTrM,4
6
+ langchain_mcp_tools-0.0.5.dist-info/RECORD,,
@@ -37,11 +37,12 @@ require context managers while enabling parallel initialization.
37
37
  The key aspects are:
38
38
 
39
39
  1. Core Challenge:
40
- - Managing async resources (stdio_client and ClientSession) that seems to
41
- rely exclusively on asynccontextmanager for cleanup with no manual cleanup
42
- options (based on the mcp python-sdk impl as of Jan 14, 2025 #62a0af6)
43
- - Initializing multiple servers in parallel
44
- - Keeping sessions alive for later use
40
+ - Async resource management for `stdio_client` and `ClientSession` seems
41
+ to rely exclusively on `asynccontextmanager` for cleanup with no manual
42
+ cleanup options (based on the mcp python-sdk impl as of Jan 14, 2025)
43
+ - Initializing multiple MCP servers in parallel requires a dedicated
44
+ `asyncio.Task` per server
45
+ - Necessity of keeping sessions alive for later use after initialization
45
46
  - Ensuring proper cleanup in the same task that created them
46
47
 
47
48
  2. Solution Strategy:
@@ -52,18 +53,19 @@ The key aspects are:
52
53
 
53
54
  The key insight is to keep the initialization tasks alive throughout the
54
55
  session lifetime, rather than letting them complete after initialization.
55
- By using events for coordination, we can:
56
+ By using `asyncio.Event`s for coordination, we can:
56
57
  - Allow parallel initialization while maintaining proper context management
57
58
  - Keep each initialization task running until explicit cleanup is requested
58
59
  - Ensure cleanup occurs in the same task that created the resources
59
60
  - Provide a clean interface for the caller to manage the lifecycle
60
61
 
61
62
  Alternative Considered:
62
- A generator/coroutine approach using 'finally' block for cleanup was
63
+ A generator/coroutine approach using a `finally` block for cleanup was
63
64
  considered but rejected because:
64
- - The 'finally' block in a generator/coroutine can be executed by a
65
- different task than the one that ran the main body of the code
66
- - This breaks the requirement that AsyncExitStack.aclose() must be
65
+ - It turned out that the `finally` block in a generator/coroutine can be
66
+ executed by a different task than the one that ran the main body of
67
+ the code
68
+ - This breaks the requirement that `AsyncExitStack.aclose()` must be
67
69
  called from the same task that created the context
68
70
 
69
71
  3. Task Lifecycle:
@@ -79,8 +81,16 @@ The key aspects are:
79
81
  When cleanup_event is set:
80
82
  exit_stack.aclose() (cleanup in original task)
81
83
 
82
- This pattern enables parallel initialization while maintaining proper async
83
- resource lifecycle management through context managers.
84
+ This approach indeed enables parallel initialization while maintaining proper
85
+ async resource lifecycle management through context managers.
86
+ However, I'm afraid I'm twisting things around too much.
87
+ It usually means I'm doing something very wrong...
88
+
89
+ I think it is a natural assumption that the MCP SDK is designed with consideration
90
+ for parallel server initialization.
91
+ I'm not sure what I'm missing...
92
+ (FYI, with the TypeScript SDK, parallel server initialization was quite
93
+ straightforward)
84
94
  """
85
95
 
86
96
 
@@ -1,104 +0,0 @@
1
- Metadata-Version: 2.2
2
- Name: langchain-mcp-tools
3
- Version: 0.0.3
4
- Summary: MCP To LangChain Tools Conversion Utility
5
- Project-URL: Bug Tracker, https://github.com/hideya/langchain-mcp-tools-py/issues
6
- Project-URL: Source Code, https://github.com/hideya/langchain-mcp-tools-py
7
- Requires-Python: >=3.11
8
- Description-Content-Type: text/markdown
9
- License-File: LICENSE
10
- Requires-Dist: jsonschema-pydantic>=0.6
11
- Requires-Dist: langchain-core>=0.3.29
12
- Requires-Dist: langchain>=0.3.14
13
- Requires-Dist: langchain-anthropic>=0.3.1
14
- Requires-Dist: langchain-groq>=0.2.3
15
- Requires-Dist: langchain-openai>=0.3.0
16
- Requires-Dist: langgraph>=0.2.62
17
- Requires-Dist: mcp>=1.2.0
18
- Requires-Dist: pyjson5>=1.6.8
19
- Requires-Dist: pympler>=1.1
20
- Requires-Dist: python-dotenv>=1.0.1
21
-
22
- # MCP To LangChain Tools Conversion Utility / Python [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/mcp-langchain-client-ts/blob/main/LICENSE) [![pypi version](https://img.shields.io/pypi/v/langchain-mcp-tools.svg)](https://pypi.org/project/langchain-mcp-tools/)
23
-
24
- This package is intended to simplify the use of
25
- [Model Context Protocol (MCP)](https://modelcontextprotocol.io/)
26
- server tools with LangChain / Python.
27
-
28
- It contains a utility function `convertMcpToLangchainTools()`.
29
- This function handles parallel initialization of specified multiple MCP servers
30
- and converts their available tools into an array of
31
- [LangChain-compatible tools](https://js.langchain.com/docs/how_to/tool_calling/).
32
-
33
- A typescript version of this MCP client is available
34
- [here](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools)
35
-
36
- ## Requirements
37
-
38
- - Python 3.11+
39
-
40
- ## Installation
41
-
42
- ```bash
43
- pip install langchain-mcp-tools
44
- ```
45
-
46
- ## Quick Start
47
-
48
- `convert_mcp_to_langchain_tools()` utility function accepts MCP server configurations
49
- that follow the same structure as
50
- [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
51
- but only the contents of the `mcpServers` property,
52
- and is expressed as a `dict`, e.g.:
53
-
54
- ```python
55
- mcp_configs = {
56
- 'filesystem': {
57
- 'command': 'npx',
58
- 'args': ['-y', '@modelcontextprotocol/server-filesystem', '.']
59
- },
60
- 'fetch': {
61
- 'command': 'uvx',
62
- 'args': ['mcp-server-fetch']
63
- }
64
- }
65
-
66
- tools, mcp_cleanup = await convert_mcp_to_langchain_tools(
67
- mcp_configs
68
- )
69
- ```
70
-
71
- The utility function initializes all specified MCP servers in parallel,
72
- and returns LangChain Tools (`List[BaseTool]`)
73
- by gathering all available MCP server tools,
74
- and by wrapping them into [LangChain Tools](https://js.langchain.com/docs/how_to/tool_calling/).
75
- It also returns a cleanup callback function (`McpServerCleanupFn`)
76
- to be invoked to close all MCP server sessions when finished.
77
-
78
- The returned tools can be used with LangChain, e.g.:
79
-
80
- ```python
81
- # from langchain.chat_models import init_chat_model
82
- llm = init_chat_model(
83
- model='claude-3-5-haiku-latest',
84
- model_provider='anthropic'
85
- )
86
-
87
- # from langgraph.prebuilt import create_react_agent
88
- agent = create_react_agent(
89
- llm,
90
- tools
91
- )
92
- ```
93
- <!-- A simple and experimentable usage example can be found
94
- [here](https://github.com/hideya/langchain-mcp-tools-ts-usage/blob/main/src/index.ts) -->
95
-
96
- <!-- A more realistic usage example can be found
97
- [here](https://github.com/hideya/langchain-mcp-client-ts) -->
98
-
99
- An usage example can be found
100
- [here](https://github.com/hideya/mcp-client-langchain-py)
101
-
102
- ## Limitations
103
-
104
- Currently, only text results of tool calls are supported.
@@ -1,6 +0,0 @@
1
- src/langchain_mcp_tools.py,sha256=P6jxJfITBwlZs2CvLQSvvRs3fWghUDrD8Yq4B7Q6-1Y,9975
2
- langchain_mcp_tools-0.0.3.dist-info/LICENSE,sha256=CRC91e8v116gCpnp7h49oIa6_zjhxqnHFTREeoZFJwA,1072
3
- langchain_mcp_tools-0.0.3.dist-info/METADATA,sha256=ZsyFl6jEEbqREbpng0BAQTWVCDKk1-m9KF5xuJoIdjw,3538
4
- langchain_mcp_tools-0.0.3.dist-info/WHEEL,sha256=In9FTNxeP60KnTkGw7wk6mJPYd_dQSjEZmXdBdMCI-8,91
5
- langchain_mcp_tools-0.0.3.dist-info/top_level.txt,sha256=74rtVfumQlgAPzR5_2CgYN24MB0XARCg0t-gzk6gTrM,4
6
- langchain_mcp_tools-0.0.3.dist-info/RECORD,,