langchain-mcp-tools 0.0.8__tar.gz → 0.0.10__tar.gz

This diff shows the changes between publicly released versions of this package as they appear in their respective public registries, and is provided for informational purposes only.
@@ -1,6 +1,6 @@
 Metadata-Version: 2.2
 Name: langchain-mcp-tools
-Version: 0.0.8
+Version: 0.0.10
 Summary: Model Context Protocol (MCP) To LangChain Tools Conversion Utility
 Project-URL: Bug Tracker, https://github.com/hideya/langchain-mcp-tools-py/issues
 Project-URL: Source Code, https://github.com/hideya/langchain-mcp-tools-py
@@ -27,7 +27,7 @@ server tools with LangChain / Python.
 
 It contains a utility function `convertMcpToLangchainTools()`.
 This function handles parallel initialization of specified multiple MCP servers
-and converts their available tools into an array of
+and converts their available tools into a list of
 [LangChain-compatible tools](https://js.langchain.com/docs/how_to/tool_calling/).
 
 A typescript equivalent of this utility library is available
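
Note: the source hunk further below defines this function in Python as `convert_mcp_to_langchain_tools()`. A minimal usage sketch, not taken from the diff: the server-config shape and the `(tools, cleanup)` return value are assumptions inferred from the README excerpt and the `mcp_cleanup` code shown in the source hunk.

```python
# Hypothetical sketch of calling the conversion utility; config shape and
# return value are assumptions, not confirmed by this diff.
import asyncio
from langchain_mcp_tools import convert_mcp_to_langchain_tools

async def main() -> None:
    server_configs = {
        "filesystem": {  # assumed MCP-style config: command + args per server
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
        },
    }
    # Assumed to return the flattened tool list plus an async cleanup callable.
    tools, cleanup = await convert_mcp_to_langchain_tools(server_configs)
    try:
        print([t.name for t in tools])  # LangChain-compatible tools, one list across servers
    finally:
        await cleanup()  # release the long-lived MCP sessions

asyncio.run(main())
```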
@@ -90,15 +90,13 @@ The returned tools can be used with LangChain, e.g.:
   tools
 )
 ```
-<!-- A simple and experimentable usage example can be found
-[here](https://github.com/hideya/langchain-mcp-tools-ts-usage/blob/main/src/index.ts) -->
+A simple and experimentable usage example can be found
+[here](https://github.com/hideya/langchain-mcp-tools-py-usage/blob/main/src/example.py)
 
-<!-- A more realistic usage example can be found
-[here](https://github.com/hideya/langchain-mcp-client-ts) -->
-
-An usage example can be found
+A more realistic usage example can be found
 [here](https://github.com/hideya/mcp-client-langchain-py)
 
+
 ## Limitations
 
 Currently, only text results of tool calls are supported.
@@ -124,10 +122,11 @@ Any comments pointing out something I am missing would be greatly appreciated!
    - Ensuring proper cleanup later in the same task that created them
 
 2. Solution Strategy:
+
    A key requirement for parallel initialization is that each server must be
-   initialized in its own dedicated task - there's no way around this if we
-   want true parallel initialization. However, this creates a challenge since
-   we also need to maintain long-lived sessions and handle cleanup properly.
+   initialized in its own dedicated task - there's no way around this as far
+   as I understand. However, this creates a challenge since we also need to
+   maintain long-lived sessions and handle cleanup properly.
 
    The key insight is to keep the initialization tasks alive throughout the
    session lifetime, rather than letting them complete after initialization.
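
An illustrative, self-contained sketch of the pattern this strategy describes (not the package's actual code): each server gets its own task that initializes, signals readiness via an event, then stays alive until a cleanup event tells it to tear down in the same task that created it.

```python
# Hypothetical sketch of per-server tasks kept alive for the session lifetime.
import asyncio

async def serve_one(name: str, ready: asyncio.Event, cleanup: asyncio.Event) -> None:
    # ... initialize the server session here (async contexts stay open) ...
    print(f"{name}: initialized")
    ready.set()            # let the caller proceed once this server is usable
    await cleanup.wait()   # keep the task (and its resources) alive
    print(f"{name}: cleaned up")  # teardown runs in the task that created it

async def main() -> None:
    names = ["server-a", "server-b"]
    ready_events = [asyncio.Event() for _ in names]
    cleanup_events = [asyncio.Event() for _ in names]
    tasks = [asyncio.create_task(serve_one(n, r, c))
             for n, r, c in zip(names, ready_events, cleanup_events)]

    # Wait until every server has signalled readiness (parallel initialization).
    await asyncio.gather(*(e.wait() for e in ready_events))
    # ... use the tools here ...

    for e in cleanup_events:   # trigger cleanup; each task finishes itself
        e.set()
    await asyncio.gather(*tasks)

asyncio.run(main())
```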
@@ -6,7 +6,7 @@ server tools with LangChain / Python.
 
 It contains a utility function `convertMcpToLangchainTools()`.
 This function handles parallel initialization of specified multiple MCP servers
-and converts their available tools into an array of
+and converts their available tools into a list of
 [LangChain-compatible tools](https://js.langchain.com/docs/how_to/tool_calling/).
 
 A typescript equivalent of this utility library is available
@@ -69,15 +69,13 @@ The returned tools can be used with LangChain, e.g.:
   tools
 )
 ```
-<!-- A simple and experimentable usage example can be found
-[here](https://github.com/hideya/langchain-mcp-tools-ts-usage/blob/main/src/index.ts) -->
+A simple and experimentable usage example can be found
+[here](https://github.com/hideya/langchain-mcp-tools-py-usage/blob/main/src/example.py)
 
-<!-- A more realistic usage example can be found
-[here](https://github.com/hideya/langchain-mcp-client-ts) -->
-
-An usage example can be found
+A more realistic usage example can be found
 [here](https://github.com/hideya/mcp-client-langchain-py)
 
+
 ## Limitations
 
 Currently, only text results of tool calls are supported.
@@ -103,10 +101,11 @@ Any comments pointing out something I am missing would be greatly appreciated!
    - Ensuring proper cleanup later in the same task that created them
 
 2. Solution Strategy:
+
    A key requirement for parallel initialization is that each server must be
-   initialized in its own dedicated task - there's no way around this if we
-   want true parallel initialization. However, this creates a challenge since
-   we also need to maintain long-lived sessions and handle cleanup properly.
+   initialized in its own dedicated task - there's no way around this as far
+   as I understand. However, this creates a challenge since we also need to
+   maintain long-lived sessions and handle cleanup properly.
 
    The key insight is to keep the initialization tasks alive throughout the
    session lifetime, rather than letting them complete after initialization.
@@ -47,9 +47,9 @@ The key aspects are:
 
 2. Solution Strategy:
    A key requirement for parallel initialization is that each server must be
-   initialized in its own dedicated task - there's no way around this if we
-   want true parallel initialization. However, this creates a challenge since
-   we also need to maintain long-lived sessions and handle cleanup properly.
+   initialized in its own dedicated task - there's no way around this as far
+   as I understand. However, this creates a challenge since we also need to
+   maintain long-lived sessions and handle cleanup properly.
 
    The key insight is to keep the initialization tasks alive throughout the
    session lifetime, rather than letting them complete after initialization.
@@ -272,16 +272,15 @@ async def convert_mcp_to_langchain_tools(
         ))
         tasks.append(task)
 
-    for ready_event in ready_event_list:
-        await ready_event.wait()
+    await asyncio.gather(*(event.wait() for event in ready_event_list))
 
     langchain_tools = [
        item for sublist in per_server_tools for item in sublist
    ]
 
    async def mcp_cleanup() -> None:
-        for cleanup_event in cleanup_event_list:
-            cleanup_event.set()
+        for event in cleanup_event_list:
+            event.set()
 
    logger.info(f'MCP servers initialized: {len(langchain_tools)} tool(s) '
                f'available in total')
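
The change above replaces a sequential loop over `ready_event.wait()` with a single `asyncio.gather(...)`. For a fixed set of events that will all eventually be set, both forms return once the last event is set; `gather` just states the "wait for all" intent in one expression. A standalone illustration with hypothetical events, not taken from the package code:

```python
# Both styles return once every event has been set.
import asyncio

async def demo() -> None:
    events = [asyncio.Event() for _ in range(3)]

    async def setter() -> None:
        for e in events:
            await asyncio.sleep(0.1)
            e.set()

    task = asyncio.create_task(setter())

    # Old style: await each event in turn.
    # for e in events:
    #     await e.wait()

    # New style: wait on all of them in one call.
    await asyncio.gather(*(e.wait() for e in events))
    await task
    print("all ready")

asyncio.run(demo())
```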
@@ -1,6 +1,6 @@
 Metadata-Version: 2.2
 Name: langchain-mcp-tools
-Version: 0.0.8
+Version: 0.0.10
 Summary: Model Context Protocol (MCP) To LangChain Tools Conversion Utility
 Project-URL: Bug Tracker, https://github.com/hideya/langchain-mcp-tools-py/issues
 Project-URL: Source Code, https://github.com/hideya/langchain-mcp-tools-py
@@ -27,7 +27,7 @@ server tools with LangChain / Python.
 
 It contains a utility function `convertMcpToLangchainTools()`.
 This function handles parallel initialization of specified multiple MCP servers
-and converts their available tools into an array of
+and converts their available tools into a list of
 [LangChain-compatible tools](https://js.langchain.com/docs/how_to/tool_calling/).
 
 A typescript equivalent of this utility library is available
@@ -90,15 +90,13 @@ The returned tools can be used with LangChain, e.g.:
   tools
 )
 ```
-<!-- A simple and experimentable usage example can be found
-[here](https://github.com/hideya/langchain-mcp-tools-ts-usage/blob/main/src/index.ts) -->
+A simple and experimentable usage example can be found
+[here](https://github.com/hideya/langchain-mcp-tools-py-usage/blob/main/src/example.py)
 
-<!-- A more realistic usage example can be found
-[here](https://github.com/hideya/langchain-mcp-client-ts) -->
-
-An usage example can be found
+A more realistic usage example can be found
 [here](https://github.com/hideya/mcp-client-langchain-py)
 
+
 ## Limitations
 
 Currently, only text results of tool calls are supported.
@@ -124,10 +122,11 @@ Any comments pointing out something I am missing would be greatly appreciated!
    - Ensuring proper cleanup later in the same task that created them
 
 2. Solution Strategy:
+
    A key requirement for parallel initialization is that each server must be
-   initialized in its own dedicated task - there's no way around this if we
-   want true parallel initialization. However, this creates a challenge since
-   we also need to maintain long-lived sessions and handle cleanup properly.
+   initialized in its own dedicated task - there's no way around this as far
+   as I understand. However, this creates a challenge since we also need to
+   maintain long-lived sessions and handle cleanup properly.
 
    The key insight is to keep the initialization tasks alive throughout the
    session lifetime, rather than letting them complete after initialization.
@@ -1,6 +1,6 @@
 [project]
 name = "langchain-mcp-tools"
-version = "0.0.8"
+version = "0.0.10"
 description = "Model Context Protocol (MCP) To LangChain Tools Conversion Utility"
 readme = "README.md"
 requires-python = ">=3.11"