mcp-use 0.0.3 (tar.gz)

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

Potentially problematic release.



mcp_use-0.0.3/LICENSE ADDED
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2025 pietrozullo

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
mcp_use-0.0.3/PKG-INFO ADDED
@@ -0,0 +1,368 @@
Metadata-Version: 2.4
Name: mcp_use
Version: 0.0.3
Summary: Model-Agnostic MCP Library for LLMs
Home-page: https://github.com/pietrozullo/mcp_use
Author: Pietro Zullo
Author-email: pietro.zullo@gmail.com
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.11
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: mcp
Requires-Dist: langchain>=0.1.0
Requires-Dist: langchain-community>=0.0.10
Requires-Dist: websockets>=12.0
Requires-Dist: aiohttp>=3.9.0
Requires-Dist: pydantic>=2.0.0
Requires-Dist: typing-extensions>=4.8.0
Requires-Dist: jsonschema-pydantic>=0.1.0
Requires-Dist: python-dotenv>=1.0.0
Provides-Extra: dev
Requires-Dist: pytest>=7.4.0; extra == "dev"
Requires-Dist: pytest-asyncio>=0.21.0; extra == "dev"
Requires-Dist: pytest-cov>=4.1.0; extra == "dev"
Requires-Dist: black>=23.9.0; extra == "dev"
Requires-Dist: isort>=5.12.0; extra == "dev"
Requires-Dist: mypy>=1.5.0; extra == "dev"
Requires-Dist: ruff>=0.1.0; extra == "dev"
Provides-Extra: anthropic
Requires-Dist: anthropic>=0.15.0; extra == "anthropic"
Provides-Extra: openai
Requires-Dist: openai>=1.10.0; extra == "openai"
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: license-file
Dynamic: provides-extra
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="./static/mcp-use-dark.png">
  <source media="(prefers-color-scheme: light)" srcset="./static/mcp-use.png">
  <img alt="Shows a black MCP-Use Logo in light color mode and a white one in dark color mode." src="./static/mcp-use.png" width="full">
</picture>

<h1 align="center">Model-Agnostic MCP Library for LLMs 🤖</h1>

[![GitHub stars](https://img.shields.io/github/stars/pietrozullo/mcp-use?style=social)](https://github.com/pietrozullo/mcp-use/stargazers)

🌐 MCP-Use is the easiest way to connect any LLM to MCP tools through a unified interface, without relying on closed-source or application-specific clients.

💡 It lets developers easily connect any LLM to tools like web browsing, file operations, and more.

# Quick start

With pip:

```bash
pip install mcp_use
```

Or install from source:

```bash
git clone https://github.com/pietrozullo/mcp_use.git
cd mcp_use
pip install -e .
```

Spin up your agent:

```python
import asyncio
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

async def main():
    # Load environment variables
    load_dotenv()

    # Create MCPClient from config file
    client = MCPClient.from_config_file("browser_mcp.json")

    # Create LLM
    llm = ChatOpenAI(model="gpt-4o")

    # Create agent with the client
    agent = MCPAgent(llm=llm, client=client, max_steps=30)

    # Run the query
    result = await agent.run(
        "Find the best restaurant in San Francisco USING GOOGLE SEARCH",
    )
    print(f"\nResult: {result}")

if __name__ == "__main__":
    asyncio.run(main())
```

Example configuration file (`browser_mcp.json`):

```json
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"],
      "env": {
        "DISPLAY": ":1"
      }
    }
  }
}
```

Add the API keys for the provider you want to use to your `.env` file:

```bash
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
```
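The examples in this README assume these variables are present at runtime. A small helper along these lines (hypothetical, not part of `mcp_use`) fails fast with a clear message when a key is missing, instead of surfacing an opaque provider error mid-run:

```python
import os

def require_api_key(name: str) -> str:
    """Return the value of the named environment variable, raising if unset or blank."""
    value = os.environ.get(name, "").strip()
    if not value:
        raise RuntimeError(f"{name} is not set; add it to your .env file")
    return value
```

Calling `require_api_key("OPENAI_API_KEY")` right after `load_dotenv()` and before constructing the LLM makes a configuration mistake obvious at startup.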

For other settings, models, and more, check out the documentation.

# Example Use Cases

## Web Browsing with Playwright

```python
import asyncio
import os
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

async def main():
    # Load environment variables
    load_dotenv()

    # Create MCPClient from config file
    client = MCPClient.from_config_file(
        os.path.join(os.path.dirname(__file__), "browser_mcp.json")
    )

    # Create LLM
    llm = ChatOpenAI(model="gpt-4o")
    # Alternative models:
    # llm = ChatAnthropic(model="claude-3-5-sonnet-20240620")
    # llm = ChatGroq(model="llama3-8b-8192")

    # Create agent with the client
    agent = MCPAgent(llm=llm, client=client, max_steps=30)

    # Run the query
    result = await agent.run(
        "Find the best restaurant in San Francisco USING GOOGLE SEARCH",
        max_steps=30,
    )
    print(f"\nResult: {result}")

if __name__ == "__main__":
    asyncio.run(main())
```

## Airbnb Search

```python
import asyncio
import os
from dotenv import load_dotenv
from langchain_anthropic import ChatAnthropic
from mcp_use import MCPAgent, MCPClient

async def run_airbnb_example():
    # Load environment variables
    load_dotenv()

    # Create MCPClient with Airbnb configuration
    client = MCPClient.from_config_file(
        os.path.join(os.path.dirname(__file__), "airbnb_mcp.json")
    )

    # Create LLM - you can choose between different models
    llm = ChatAnthropic(model="claude-3-5-sonnet-20240620")

    # Create agent with the client
    agent = MCPAgent(llm=llm, client=client, max_steps=30)

    try:
        # Run a query to search for accommodations
        result = await agent.run(
            "Find me a nice place to stay in Barcelona for 2 adults "
            "for a week in August. I prefer places with a pool and "
            "good reviews. Show me the top 3 options.",
            max_steps=30,
        )
        print(f"\nResult: {result}")
    finally:
        # Ensure we clean up resources properly
        if client.sessions:
            await client.close_all_sessions()

if __name__ == "__main__":
    asyncio.run(run_airbnb_example())
```

Example configuration file (`airbnb_mcp.json`):

```json
{
  "mcpServers": {
    "airbnb": {
      "command": "npx",
      "args": ["-y", "@openbnb/mcp-server-airbnb"]
    }
  }
}
```

## Blender 3D Creation

```python
import asyncio
from dotenv import load_dotenv
from langchain_anthropic import ChatAnthropic
from mcp_use import MCPAgent, MCPClient

async def run_blender_example():
    # Load environment variables
    load_dotenv()

    # Create MCPClient with Blender MCP configuration
    config = {"mcpServers": {"blender": {"command": "uvx", "args": ["blender-mcp"]}}}
    client = MCPClient.from_dict(config)

    # Create LLM
    llm = ChatAnthropic(model="claude-3-5-sonnet-20240620")

    # Create agent with the client
    agent = MCPAgent(llm=llm, client=client, max_steps=30)

    try:
        # Run the query
        result = await agent.run(
            "Create an inflatable cube with soft material and a plane as ground.",
            max_steps=30,
        )
        print(f"\nResult: {result}")
    finally:
        # Ensure we clean up resources properly
        if client.sessions:
            await client.close_all_sessions()

if __name__ == "__main__":
    asyncio.run(run_blender_example())
```

# Configuration File Support

MCP-Use supports initialization from configuration files, making it easy to manage and switch between different MCP server setups:

```python
import asyncio
from mcp_use import create_session_from_config

async def main():
    # Create an MCP session from a config file
    session = create_session_from_config("mcp-config.json")

    # Initialize the session
    await session.initialize()

    # Use the session...

    # Disconnect when done
    await session.disconnect()

if __name__ == "__main__":
    asyncio.run(main())
```
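The `mcp-config.json` referenced above is not shown in this release. Assuming it follows the same `mcpServers` schema as the other configuration examples in this README, it might look like:

```json
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
```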

# MCPClient for Managing Multiple Servers

The `MCPClient` class provides a higher-level abstraction for managing multiple MCP servers from a single client:

```python
import asyncio
from langchain_anthropic import ChatAnthropic
from mcp_use import MCPAgent, MCPClient

async def main():
    # Create a client from a config file
    client = MCPClient.from_config_file("mcp-config.json")

    # Or initialize with a config file path
    # client = MCPClient("mcp-config.json")

    # Or programmatically add servers
    client.add_server(
        "local-ws",
        {
            "command": "npx",
            "args": ["@playwright/mcp@latest", "--headless"]
        }
    )

    # Create an LLM
    llm = ChatAnthropic(model="claude-3-5-sonnet-20240620")

    # Create an agent using the client
    agent = MCPAgent(
        llm=llm,
        client=client,
        server_name="playwright",  # Optional; uses the first server if not specified
        max_steps=30
    )

    # Run a query
    result = await agent.run("Your query here")

    # Close all sessions
    await client.close_all_sessions()

if __name__ == "__main__":
    asyncio.run(main())
```
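The `server_name` passed to `MCPAgent` has to match a key under `mcpServers` in the loaded configuration. A stdlib-only helper (hypothetical, not part of `mcp_use`) can list the names available in a config dict before you pick one:

```python
def server_names(config: dict) -> list[str]:
    """Return the MCP server names defined in an mcpServers-style config dict."""
    servers = config.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        raise ValueError("config has no 'mcpServers' section")
    return list(servers)

# Example using the schema shown throughout this README:
config = {"mcpServers": {"playwright": {"command": "npx"}, "blender": {"command": "uvx"}}}
print(server_names(config))  # ['playwright', 'blender']
```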

## Contributing

We love contributions! Feel free to open issues for bugs or feature requests.

## Requirements

- Python 3.11+
- An MCP implementation (such as Playwright MCP)
- LangChain and the appropriate model libraries (OpenAI, Anthropic, etc.)
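
The package metadata above declares `openai` and `anthropic` extras, so the provider SDKs can be pulled in at install time (commands shown for the extras declared in PKG-INFO):

```bash
# Install with the OpenAI provider SDK
pip install "mcp_use[openai]"

# Or with the Anthropic provider SDK
pip install "mcp_use[anthropic]"
```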

## Citation

If you use MCP-Use in your research or project, please cite:

```bibtex
@software{mcp_use2024,
  author    = {Zullo, Pietro},
  title     = {MCP-Use: Model-Agnostic MCP Library for LLMs},
  year      = {2024},
  publisher = {GitHub},
  url       = {https://github.com/pietrozullo/mcp-use}
}
```

## License

MIT