langchain-mcp-tools 0.0.14__tar.gz → 0.0.15__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
 Metadata-Version: 2.2
 Name: langchain-mcp-tools
-Version: 0.0.14
+Version: 0.0.15
 Summary: Model Context Protocol (MCP) To LangChain Tools Conversion Utility
 Project-URL: Bug Tracker, https://github.com/hideya/langchain-mcp-tools-py/issues
 Project-URL: Source Code, https://github.com/hideya/langchain-mcp-tools-py
@@ -108,7 +108,7 @@ Currently, only text results of tool calls are supported.
 ## Technical Details
 
 It was very tricky (for me) to get the parallel MCP server initialization
-to work successfully...
+to work, including successful final resource cleanup...
 
 I'm new to Python, so it is very possible that my ignorance is playing
 a big role here...
@@ -119,7 +119,13 @@ Any comments pointing out something I am missing would be greatly appreciated!
 [(comment here)](https://github.com/hideya/langchain-mcp-tools-ts/issues)
 
 1. Core Challenge:
-- Async resources management for `stdio_client` and `ClientSession` seems
+
+   A key requirement for parallel initialization is that each server must be
+   initialized in its own dedicated task - there's no way around this as far as
+   I know. However, this poses a challenge when combined with
+   `asynccontextmanager`.
+
+- Resources management for `stdio_client` and `ClientSession` seems
   to require relying exclusively on `asynccontextmanager` for cleanup,
   with no manual cleanup options
   (based on [the mcp python-sdk impl as of Jan 14, 2025](https://github.com/modelcontextprotocol/python-sdk/tree/99727a9/src/mcp/client))
@@ -131,11 +137,6 @@ Any comments pointing out something I am missing would be greatly appreciated!
 
 2. Solution Strategy:
 
-A key requirement for parallel initialization is that each server must be
-initialized in its own dedicated task - there's no way around this as far
-as I understand. However, this creates a challenge since we also need to
-maintain long-lived sessions and handle cleanup properly.
-
 The key insight is to keep the initialization tasks alive throughout the
 session lifetime, rather than letting them complete after initialization.
 
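The strategy these hunks describe - each server initialized in its own dedicated task, with the task kept alive for the whole session lifetime so that the `asynccontextmanager`-based cleanup runs in the same task that entered it - can be sketched roughly as follows. This is an illustrative toy, not the package's actual API: `fake_server`, `init_server`, and the future/event plumbing are stand-ins for `stdio_client`/`ClientSession` and whatever coordination the real code uses.

```python
# Sketch (hypothetical names): keep each server's init task alive so the
# async context managers are entered AND exited in the same task.
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager


@asynccontextmanager
async def fake_server(name):
    # Stand-in for stdio_client/ClientSession, which must be
    # cleaned up in the same task that opened them.
    yield f"session-{name}"


async def init_server(name, ready, shutdown):
    async with AsyncExitStack() as stack:
        session = await stack.enter_async_context(fake_server(name))
        ready.set_result(session)  # hand the live session back to the caller
        await shutdown.wait()      # stay alive until shutdown is requested
        # leaving the `async with` here runs cleanup in this same task


async def main():
    loop = asyncio.get_running_loop()
    shutdown = asyncio.Event()
    readies = [loop.create_future() for _ in range(3)]
    tasks = [asyncio.create_task(init_server(f"s{i}", fut, shutdown))
             for i, fut in enumerate(readies)]
    sessions = await asyncio.gather(*readies)  # parallel initialization
    print(sessions)
    shutdown.set()                 # request shutdown...
    await asyncio.gather(*tasks)   # ...and each task cleans up its own resources


asyncio.run(main())
```

Each task enters its context managers, hands the live session back through a future, then parks on an event; setting the event lets every task exit its own `AsyncExitStack`, which is what satisfies the same-task cleanup constraint while still allowing all servers to initialize concurrently.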
@@ -83,7 +83,7 @@ Currently, only text results of tool calls are supported.
 ## Technical Details
 
 It was very tricky (for me) to get the parallel MCP server initialization
-to work successfully...
+to work, including successful final resource cleanup...
 
 I'm new to Python, so it is very possible that my ignorance is playing
 a big role here...
@@ -94,7 +94,13 @@ Any comments pointing out something I am missing would be greatly appreciated!
 [(comment here)](https://github.com/hideya/langchain-mcp-tools-ts/issues)
 
 1. Core Challenge:
-- Async resources management for `stdio_client` and `ClientSession` seems
+
+   A key requirement for parallel initialization is that each server must be
+   initialized in its own dedicated task - there's no way around this as far as
+   I know. However, this poses a challenge when combined with
+   `asynccontextmanager`.
+
+- Resources management for `stdio_client` and `ClientSession` seems
   to require relying exclusively on `asynccontextmanager` for cleanup,
   with no manual cleanup options
   (based on [the mcp python-sdk impl as of Jan 14, 2025](https://github.com/modelcontextprotocol/python-sdk/tree/99727a9/src/mcp/client))
@@ -106,11 +112,6 @@ Any comments pointing out something I am missing would be greatly appreciated!
 
 2. Solution Strategy:
 
-A key requirement for parallel initialization is that each server must be
-initialized in its own dedicated task - there's no way around this as far
-as I understand. However, this creates a challenge since we also need to
-maintain long-lived sessions and handle cleanup properly.
-
 The key insight is to keep the initialization tasks alive throughout the
 session lifetime, rather than letting them complete after initialization.
 
@@ -1,6 +1,6 @@
 Metadata-Version: 2.2
 Name: langchain-mcp-tools
-Version: 0.0.14
+Version: 0.0.15
 Summary: Model Context Protocol (MCP) To LangChain Tools Conversion Utility
 Project-URL: Bug Tracker, https://github.com/hideya/langchain-mcp-tools-py/issues
 Project-URL: Source Code, https://github.com/hideya/langchain-mcp-tools-py
@@ -108,7 +108,7 @@ Currently, only text results of tool calls are supported.
 ## Technical Details
 
 It was very tricky (for me) to get the parallel MCP server initialization
-to work successfully...
+to work, including successful final resource cleanup...
 
 I'm new to Python, so it is very possible that my ignorance is playing
 a big role here...
@@ -119,7 +119,13 @@ Any comments pointing out something I am missing would be greatly appreciated!
 [(comment here)](https://github.com/hideya/langchain-mcp-tools-ts/issues)
 
 1. Core Challenge:
-- Async resources management for `stdio_client` and `ClientSession` seems
+
+   A key requirement for parallel initialization is that each server must be
+   initialized in its own dedicated task - there's no way around this as far as
+   I know. However, this poses a challenge when combined with
+   `asynccontextmanager`.
+
+- Resources management for `stdio_client` and `ClientSession` seems
   to require relying exclusively on `asynccontextmanager` for cleanup,
   with no manual cleanup options
   (based on [the mcp python-sdk impl as of Jan 14, 2025](https://github.com/modelcontextprotocol/python-sdk/tree/99727a9/src/mcp/client))
@@ -131,11 +137,6 @@ Any comments pointing out something I am missing would be greatly appreciated!
 
 2. Solution Strategy:
 
-A key requirement for parallel initialization is that each server must be
-initialized in its own dedicated task - there's no way around this as far
-as I understand. However, this creates a challenge since we also need to
-maintain long-lived sessions and handle cleanup properly.
-
 The key insight is to keep the initialization tasks alive throughout the
 session lifetime, rather than letting them complete after initialization.
 
@@ -1,12 +1,13 @@
 LICENSE
 README.md
 pyproject.toml
-langchain_mcp_tools/__init__.py
-langchain_mcp_tools/langchain_mcp_tools.py
-langchain_mcp_tools/py.typed
 langchain_mcp_tools.egg-info/PKG-INFO
 langchain_mcp_tools.egg-info/SOURCES.txt
 langchain_mcp_tools.egg-info/dependency_links.txt
 langchain_mcp_tools.egg-info/requires.txt
 langchain_mcp_tools.egg-info/top_level.txt
-tests/test_langchain_mcp_tools.py
+src/langchain_mcp_tools/__init__.py
+src/langchain_mcp_tools/langchain_mcp_tools.py
+src/langchain_mcp_tools/py.typed
+src/tests/__init__.py
+src/tests/test_langchain_mcp_tools.py
@@ -1,6 +1,6 @@
 [project]
 name = "langchain-mcp-tools"
-version = "0.0.14"
+version = "0.0.15"
 description = "Model Context Protocol (MCP) To LangChain Tools Conversion Utility"
 readme = "README.md"
 requires-python = ">=3.11"
@@ -37,7 +37,13 @@ require context managers while enabling parallel initialization.
 The key aspects are:
 
 1. Core Challenge:
-- Async resources management for `stdio_client` and `ClientSession` seems
+
+   A key requirement for parallel initialization is that each server must be
+   initialized in its own dedicated task - there's no way around this as far as
+   I know. However, this poses a challenge when combined with
+   `asynccontextmanager`.
+
+- Resources management for `stdio_client` and `ClientSession` seems
   to require relying exclusively on `asynccontextmanager` for cleanup,
   with no manual cleanup options
   (based on [the mcp python-sdk impl as of Jan 14, 2025](https://github.com/modelcontextprotocol/python-sdk/tree/99727a9/src/mcp/client))
@@ -48,10 +54,6 @@ The key aspects are:
 - Need to ensure proper cleanup later in the same task that created them
 
 2. Solution Strategy:
-A key requirement for parallel initialization is that each server must be
-initialized in its own dedicated task - there's no way around this as far
-as I understand. However, this creates a challenge since we also need to
-maintain long-lived sessions and handle cleanup properly.
 
 The key insight is to keep the initialization tasks alive throughout the
 session lifetime, rather than letting them complete after initialization.
@@ -0,0 +1,2 @@
+
+# Test package initialization
@@ -1,3 +0,0 @@
-build
-dist
-langchain_mcp_tools