@mcp-use/cli 1.0.20 → 2.0.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/readme.md DELETED
@@ -1,250 +0,0 @@
<div align="center">
  <img src="static/readme.png" alt="Terminal" />
</div>

# Open Source and Open Model CLI for MCP

A CLI tool for interacting with Model Context Protocol (MCP) servers using natural language.

## Built with [mcp-use](https://mcp-use.com). Build your own MCP application with our SDKs:

<h4><strong>Python</strong> <a href="https://github.com/mcp-use/mcp-use"><img src="static/python.png" alt="Python" height="24" style="vertical-align: middle; margin-right: 4px;"/>mcp-use/mcp-use</a></h4>
<h4><strong>TypeScript</strong> <a href="https://github.com/mcp-use/mcp-use-ts"><img src="static/typescript.png" alt="TypeScript" height="20" style="vertical-align: middle; margin-right: 4px;"/>mcp-use/mcp-use-ts</a></h4>

## Features

- 🤖 Natural language interface for MCP servers
- 💬 Interactive chat interface with tool call visualization
- ⚡ Direct integration with mcp-use (no API layer needed)
- 🚀 Single-command installation
- 🔄 **Over a dozen LLM providers** (OpenAI, Anthropic, Google, Mistral, Groq, Cohere, and more)
- ⚙️ **Slash commands** for configuration (like Claude Code)
- 🔑 **Smart API key prompting** - automatically asks for keys when needed
- 💾 **Persistent secure storage** - encrypted keys and settings saved across sessions

## Install

```bash
$ npm install --global @mcp-use/cli
```

## Quick Start

1. **Install and run**:

   ```bash
   $ npm install --global @mcp-use/cli
   $ mcp-use
   ```

2. **Choose your model** (the CLI handles API key setup automatically):

   ```bash
   # Just pick a model - that's it!
   /model openai gpt-4o
   /model anthropic claude-3-5-sonnet-20240620
   /model google gemini-1.5-pro
   /model groq llama-3.1-70b-versatile
   /model ollama llama3

   # The CLI will prompt: "Please enter your OPENAI API key:"
   # Paste your key and start chatting immediately!
   ```

3. **Get API keys** when prompted from providers such as:
   - [OpenAI](https://platform.openai.com/api-keys)
   - [Anthropic](https://console.anthropic.com/)
   - [Google AI](https://aistudio.google.com/app/apikey)
   - [Mistral](https://console.mistral.ai/)
   - [Groq](https://console.groq.com/keys)
   - [Cohere](https://dashboard.cohere.com/api-keys)

> **Keys are stored securely encrypted** in `~/.mcp-use-cli/config.json` and persist across sessions.

## Alternative Setup

If you prefer environment variables:

```bash
export OPENAI_API_KEY=your_key_here
export ANTHROPIC_API_KEY=your_key_here
# Then just run: mcp-use
```
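If you would rather not paste keys into your shell profile or history, one hedged approach (the `~/.secrets` path and the example key below are hypothetical, not something the CLI requires) is to keep the key in a permission-restricted file and load it on demand:

```shell
# Sketch: keep the key in a file only you can read,
# then load it into the environment before launching the CLI.
mkdir -p ~/.secrets && chmod 700 ~/.secrets
printf '%s' 'sk-example-key' > ~/.secrets/openai_key   # hypothetical placeholder key
chmod 600 ~/.secrets/openai_key
export OPENAI_API_KEY="$(cat ~/.secrets/openai_key)"
# Then just run: mcp-use
```

This keeps the literal key out of your `~/.bashrc` and out of shell history.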

## Usage

```
$ mcp-use --help

  Usage
    $ mcp-use

  Options
    --name    Your name (optional)
    --config  Path to MCP configuration file (optional)

  Examples
    $ mcp-use
    $ mcp-use --name=Jane

  Environment Variables
    <PROVIDER>_API_KEY    Set API keys (e.g., OPENAI_API_KEY, ANTHROPIC_API_KEY)

  Setup
    1. Run: mcp-use
    2. Use /model or /setkey to configure an LLM.
    3. Use /server commands to connect to your tools.
    4. Start chatting!
```

## Connecting to Tools (MCP Servers)

This CLI is a client for [Model Context Protocol (MCP)](https://github.com/mcp-use/mcp-spec) servers. MCP servers act as tools that the AI can use. You need to connect the CLI to one or more servers to give it capabilities.

You can manage servers with the `/server` commands:

```bash
# Add a new server configuration by pasting its JSON definition
/server add

# List configured servers
/servers

# Connect to a configured server
/server connect <server-name>

# Disconnect from a server
/server disconnect <server-name>
```

When you add a server, you'll be prompted for its JSON configuration. Here are examples for local and remote servers:

**Local Server Example (e.g., a filesystem tool):**

```json
{
  "mcpServers": {
    "filesystem-tool": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/your/project"
      ],
      "env": {}
    }
  }
}
```

**Remote Server Example (e.g., an SSE endpoint):**

```json
{
  "mcpServers": {
    "remote-tool": {
      "url": "http://127.0.0.1:8000/sse"
    }
  }
}
```

This configuration would be pasted directly into the CLI after running `/server add`.
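Since `mcpServers` is an object keyed by server name, a single configuration combining both of the examples above should also be possible (that multiple entries are accepted is an assumption based on the plural key; the server names are illustrative):

```json
{
  "mcpServers": {
    "filesystem-tool": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/your/project"
      ],
      "env": {}
    },
    "remote-tool": {
      "url": "http://127.0.0.1:8000/sse"
    }
  }
}
```

Each entry could then be connected individually, e.g. `/server connect filesystem-tool` or `/server connect remote-tool`.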

## Slash Commands

Switch LLM providers and configure settings using slash commands:

```bash
# Set API keys (stored securely)
/setkey openai sk-1234567890abcdef...
/setkey anthropic ant_1234567890abcdef...
/clearkeys                 # Clear all stored keys

# Switch models
/model openai gpt-4o
/model anthropic claude-3-5-sonnet-20240620
/model google gemini-1.5-pro
/model mistral mistral-large-latest
/model groq llama-3.1-70b-versatile

# List available models
/models

# Server Management
/server add
/servers
/server connect <name>
/server disconnect <name>

# Configuration
/config temp 0.5
/config tokens 4000

# Status and help
/status
/help
```

## Chat Examples

- "List files in the current directory"
- "Create a new file called hello.txt with the content 'Hello, World!'"
- "Search for files containing 'TODO'"
- "What's the structure of this project?"

## Architecture

This CLI uses:

- **Frontend**: React + Ink for the terminal UI
- **Agent**: mcp-use MCPAgent for LLM + MCP integration
- **LLM**: your choice of 12+ providers
- **Transport**: direct TypeScript integration (no API layer)

## Privacy & Telemetry

This package uses [Scarf](https://scarf.sh) to collect basic installation analytics to help us understand how the package is being used. This data helps us improve the tool and prioritize features.

### What data is collected?

Scarf collects:

- Operating system information
- IP address (used only for company lookup, not stored)
- Limited dependency tree information (hashed for privacy)

**No personally identifying information is stored.**

### How to disable telemetry

You can opt out of analytics in several ways:

**Option 1: Environment variable**

```bash
export SCARF_ANALYTICS=false
```

**Option 2: Standard Do Not Track**

```bash
export DO_NOT_TRACK=1
```
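To make either opt-out persist across sessions, append it to your shell's startup file. A minimal sketch, assuming bash (use `~/.zshrc` for zsh, or your shell's equivalent):

```shell
# Persist the telemetry opt-out for future shell sessions (bash assumed)
echo 'export SCARF_ANALYTICS=false' >> ~/.bashrc
echo 'export DO_NOT_TRACK=1' >> ~/.bashrc
```

New terminal sessions will then have both variables set automatically.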

**Option 3: For package maintainers**

If you distribute a package that depends on this CLI, you can disable analytics for all your downstream users by adding this to your `package.json`:

```json
{
  "scarfSettings": {
    "enabled": false
  }
}
```

For more information about Scarf and privacy, visit [scarf.sh](https://scarf.sh).

## License

MIT