deepseek-cli 0.2.0__tar.gz → 0.2.2__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- {deepseek_cli-0.2.0/src/deepseek_cli.egg-info → deepseek_cli-0.2.2}/PKG-INFO +1 -1
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/README.md +335 -335
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/pyproject.toml +1 -1
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/setup.py +1 -1
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/__init__.py +1 -1
- deepseek_cli-0.2.0/src/cli/deepseek_cli.py → deepseek_cli-0.2.2/src/cli/ai_cli.py +207 -187
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/config/settings.py +1 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2/src/deepseek_cli.egg-info}/PKG-INFO +1 -1
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/deepseek_cli.egg-info/SOURCES.txt +1 -1
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/handlers/chat_handler.py +371 -361
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/handlers/error_handler.py +9 -7
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/LICENSE +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/MANIFEST.in +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/requirements.txt +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/setup.cfg +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/__main__.py +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/api/__init__.py +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/api/client.py +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/cli/__init__.py +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/config/__init__.py +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/deepseek_cli.egg-info/dependency_links.txt +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/deepseek_cli.egg-info/entry_points.txt +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/deepseek_cli.egg-info/requires.txt +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/deepseek_cli.egg-info/top_level.txt +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/handlers/__init__.py +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/handlers/command_handler.py +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/utils/__init__.py +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/utils/exceptions.py +0 -0
- {deepseek_cli-0.2.0 → deepseek_cli-0.2.2}/src/utils/version_checker.py +0 -0
@@ -1,336 +1,336 @@
# DeepSeek CLI

A powerful command-line interface for interacting with DeepSeek's AI models.

[@PierrunoYT/deepseek-cli](https://github.com/PierrunoYT/deepseek-cli)

## Features

- 🤖 Multiple Model Support
  - DeepSeek-V3.1 (deepseek-chat) - Non-thinking Mode
  - DeepSeek-V3.1 (deepseek-reasoner) - Thinking Mode with Chain of Thought
  - DeepSeek-V2.5 Coder (deepseek-coder)

- 🔄 Advanced Conversation Features
  - Multi-round conversations with context preservation
  - System message customization
  - Conversation history tracking
  - Context caching for better performance and cost savings
  - Inline mode for quick queries
  - 128K context window for all models

- 🧪 Beta Features
  - Prefix Completion: Complete assistant messages from a given prefix
  - Fill-in-the-Middle (FIM): Complete content between a prefix and suffix
  - Context Caching: Automatic disk-based caching with up to 90% cost savings
  - Anthropic API Compatibility: Use DeepSeek models with the Anthropic API format

- 🛠️ Advanced Controls
  - Temperature control with presets
  - JSON output mode
  - Streaming responses (enabled by default)
  - Function calling (up to 128 functions)
  - Stop sequences
  - Top-p sampling
  - Frequency and presence penalties

- 📦 Package Management
  - Automatic version checking
  - Update notifications
  - Easy installation and updates
- Development mode support
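
The automatic version check above boils down to comparing the installed version against the latest published one. A minimal sketch of that comparison (a hypothetical helper, not the CLI's actual `version_checker` implementation):

```python
def parse_version(version: str) -> tuple:
    """Turn a dotted version string like '0.2.2' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def update_available(installed: str, latest: str) -> bool:
    """True when the latest published version is newer than the installed one."""
    return parse_version(latest) > parse_version(installed)

print(update_available("0.2.0", "0.2.2"))  # True
print(update_available("0.2.2", "0.2.2"))  # False
```

Tuple comparison handles multi-digit components correctly (`0.2.10 > 0.2.9`), which naive string comparison does not.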

## Installation

You can install DeepSeek CLI in two ways:

### Option 1: Install from PyPI (Recommended)

```bash
pip install deepseek-cli
```

### Option 2: Install from Source (Development)

```bash
git clone https://github.com/PierrunoYT/deepseek-cli.git
cd deepseek-cli
pip install -e .
```

### Updating the Package

To update to the latest version:

```bash
pip install --upgrade deepseek-cli
```

For a development installation, pull the latest changes and reinstall:

```bash
git pull
pip install -e . --upgrade
```

The CLI will automatically check for updates on startup and notify you when a new version is available.

### API Key Setup

Set your DeepSeek API key as an environment variable:

#### macOS/Linux
```bash
export DEEPSEEK_API_KEY="your-api-key"
```

#### Windows
```cmd
set DEEPSEEK_API_KEY=your-api-key
```

To make it permanent, add it to your environment variables through System Settings.

## Usage

DeepSeek CLI supports two modes of operation: interactive mode and inline mode.

### Interactive Mode

After installation, you can start the CLI in interactive mode in two ways:

#### If installed from PyPI
```bash
deepseek
```

#### If installed in development mode
```bash
deepseek
# or
python -m deepseek_cli
```

### Inline Mode

You can also use DeepSeek CLI in inline mode to get quick answers without starting an interactive session:

```bash
# Basic usage
deepseek -q "What is the capital of France?"

# Specify a model
deepseek -q "Write a Python function to calculate factorial" -m deepseek-coder

# Get raw output without token usage information
deepseek -q "Write a Python function to calculate factorial" -r

# Combine options
deepseek -q "Write a Python function to calculate factorial" -m deepseek-coder -r
```

Available inline mode options:
- `-q, --query`: The query to send to the model
- `-m, --model`: The model to use (deepseek-chat, deepseek-coder, deepseek-reasoner)
- `-r, --raw`: Output the raw response without token usage information
- `-s, --stream`: Enable streaming mode (enabled by default)
- `--no-stream`: Disable streaming mode

### Troubleshooting

- If the API key is not recognized:
  - Make sure you've set the DEEPSEEK_API_KEY environment variable
  - Try closing and reopening your terminal
  - Check if the key is correct with: `echo $DEEPSEEK_API_KEY` (Unix) or `echo %DEEPSEEK_API_KEY%` (Windows)

- If you get import errors:
  - Ensure you've installed the package: `pip list | grep deepseek-cli`
  - Try reinstalling: `pip install --force-reinstall deepseek-cli`

- For development installation issues:
  - Make sure you're in the correct directory
  - Try: `pip install -e . --upgrade`

### Available Commands

Basic Commands:
- `/help` - Show help message
- `/models` - List available models
- `/model X` - Switch model (deepseek-chat, deepseek-coder, deepseek-reasoner)
- `/clear` - Clear conversation history
- `/history` - Display conversation history
- `/about` - Show API information
- `/balance` - Check account balance

Model Settings:
- `/temp X` - Set temperature (0-2) or use a preset (coding/data/chat/translation/creative)
- `/freq X` - Set frequency penalty (-2 to 2)
- `/pres X` - Set presence penalty (-2 to 2)
- `/top_p X` - Set top_p sampling (0 to 1)

Beta Features:
- `/beta` - Toggle beta features
- `/prefix` - Toggle prefix completion mode
- `/fim` - Toggle Fill-in-the-Middle completion
- `/cache` - Toggle context caching

Output Control:
- `/json` - Toggle JSON output mode
- `/stream` - Toggle streaming mode (streaming is enabled by default)
- `/stop X` - Add a stop sequence
- `/clearstop` - Clear stop sequences

Function Calling:
- `/function {}` - Add function definition (JSON format)
- `/clearfuncs` - Clear registered functions
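
As an illustration, a function definition can be built as a Python dict and serialized for the `/function` command. The schema below follows the common OpenAI-compatible function format; this README does not spell out the exact schema the CLI expects, so treat the field layout as an assumption:

```python
import json

# Hypothetical function definition in the OpenAI-compatible format
# (an assumption; the CLI's exact expected schema is not shown here).
get_weather = {
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
        },
        "required": ["city"],
    },
}

# The /function command takes the definition as a single JSON argument:
command = "/function " + json.dumps(get_weather)
print(command)
```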

### Model-Specific Features

#### DeepSeek-V3.1 (deepseek-chat)
- **Version**: DeepSeek-V3.1 (Non-thinking Mode)
- **Context Length**: 128K (128,000) tokens
- **Output Length**: Default 4K, Maximum 8K tokens
- **Supports all features**:
  - JSON Output ✓
  - Function Calling ✓ (up to 128 functions)
  - Chat Prefix Completion (Beta) ✓
  - Fill-in-the-Middle (Beta) ✓
- General-purpose chat model
- Latest improvements:
  - Enhanced instruction following (77.6% IFEval accuracy)
  - Improved JSON output (97% parsing rate)
  - Advanced reasoning capabilities
  - Role-playing capabilities

#### DeepSeek-V3.1 (deepseek-reasoner)
- **Version**: DeepSeek-V3.1 (Thinking Mode)
- **Context Length**: 128K (128,000) tokens
- **Output Length**: Default 32K, Maximum 64K tokens
- **Chain of Thought**: Displays the reasoning process before the final answer
- **Supported features**:
  - JSON Output ✓
  - Chat Prefix Completion (Beta) ✓
- **Unsupported features**:
  - Function Calling ✗ (automatically falls back to deepseek-chat if tools are provided)
  - Fill-in-the-Middle ✗
  - Temperature, top_p, presence/frequency penalties ✗
- Excels at complex reasoning and problem-solving tasks

#### DeepSeek-V2.5 Coder (deepseek-coder)
- **Context Length**: 128K tokens
- **Output Length**: Default 4K, Maximum 8K tokens
- **Supports all features**:
  - JSON Output ✓
  - Function Calling ✓
  - Chat Prefix Completion (Beta) ✓
  - Fill-in-the-Middle (Beta) ✓
- Optimized for code generation and analysis

### Feature Details

#### Fill-in-the-Middle (FIM)
Use XML-style tags to define the gap:
```
<fim_prefix>def calculate_sum(a, b):</fim_prefix><fim_suffix> return result</fim_suffix>
```
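
Splitting such tagged input into its prefix and suffix parts is a small parsing step. A minimal sketch (a hypothetical helper; the CLI's real parser may differ):

```python
import re

def split_fim(text: str) -> tuple:
    """Extract the prefix and suffix from FIM-tagged input."""
    prefix = re.search(r"<fim_prefix>(.*?)</fim_prefix>", text, re.S)
    suffix = re.search(r"<fim_suffix>(.*?)</fim_suffix>", text, re.S)
    return (prefix.group(1) if prefix else "",
            suffix.group(1) if suffix else "")

prefix, suffix = split_fim(
    "<fim_prefix>def calculate_sum(a, b):</fim_prefix>"
    "<fim_suffix> return result</fim_suffix>"
)
print(prefix)  # def calculate_sum(a, b):
print(suffix)  #  return result
```

The model then completes the body that belongs between `prefix` and `suffix`.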

#### JSON Mode
Forces the model to output valid JSON. Example system message:
```json
{
  "response": "structured output",
  "data": {
    "field1": "value1",
    "field2": "value2"
  }
}
```
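
Even with JSON mode on, it is prudent to parse the reply defensively, since a truncated or malformed response would otherwise crash the caller. A minimal sketch (a hypothetical helper, not the CLI's actual handling):

```python
import json

def parse_json_reply(raw: str):
    """Parse a JSON-mode reply, returning None if it is not valid JSON."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        return None

reply = '{"response": "structured output", "data": {"field1": "value1", "field2": "value2"}}'
parsed = parse_json_reply(reply)
print(parsed["data"]["field1"])      # value1
print(parse_json_reply("not json"))  # None
```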

#### Context Caching
- **Automatic disk-based caching** for all users
- **No code changes required** - works automatically
- **Minimum cache size**: 64 tokens
- **Pricing**:
  - Cache hits: $0.014 per million tokens (90% savings)
  - Cache misses: $0.14 per million tokens (standard rate)
- **Performance benefits**:
  - Significantly reduces first-token latency for long, repetitive inputs
  - Example: a 128K prompt reduced from 13s to 500ms
- **Best use cases**:
  - Q&A assistants with long preset prompts
  - Role-play with extensive character settings
  - Data analysis with recurring queries on the same documents
  - Code analysis and debugging with repeated repository references
  - Few-shot learning with multiple examples
- Enabled by default
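
Given the hit and miss rates above, the input cost of a prompt follows directly from how many of its tokens were served from cache. A quick arithmetic sketch (constants taken from the pricing list; the function name is illustrative):

```python
HIT_PRICE = 0.014   # $ per million tokens on a cache hit
MISS_PRICE = 0.14   # $ per million tokens on a cache miss

def prompt_cost(hit_tokens: int, miss_tokens: int) -> float:
    """Input cost in dollars given cached vs uncached prompt tokens."""
    return (hit_tokens * HIT_PRICE + miss_tokens * MISS_PRICE) / 1_000_000

# A 100K-token prompt, fully cached vs fully uncached:
print(prompt_cost(100_000, 0))   # ≈ $0.0014
print(prompt_cost(0, 100_000))   # ≈ $0.014
```

The fully cached prompt costs a tenth of the uncached one, which is the 90% savings quoted above.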

#### Anthropic API Compatibility
The DeepSeek API now supports the Anthropic API format, enabling integration with tools like Claude Code:

**Setup for Claude Code:**
```bash
# Install Claude Code
npm install -g @anthropic-ai/claude-code

# Configure environment variables
export ANTHROPIC_BASE_URL=https://api.deepseek.com/anthropic
export ANTHROPIC_AUTH_TOKEN=${DEEPSEEK_API_KEY}
export ANTHROPIC_MODEL=deepseek-chat
export ANTHROPIC_SMALL_FAST_MODEL=deepseek-chat

# Run in your project
cd my-project
claude
```

**Python SDK Example:**
```python
import anthropic

client = anthropic.Anthropic(
    base_url="https://api.deepseek.com/anthropic",
    api_key="your-deepseek-api-key"
)

message = client.messages.create(
    model="deepseek-chat",
    max_tokens=1000,
    system="You are a helpful assistant.",
    messages=[
        {
            "role": "user",
            "content": [{"type": "text", "text": "Hi, how are you?"}]
        }
    ]
)
print(message.content)
```

**Supported Fields:**
- ✓ model, max_tokens, stop_sequences, stream, system
- ✓ temperature (range 0.0-2.0), top_p
- ✓ tools (function calling)
- ✗ thinking, top_k, mcp_servers (ignored)

## Temperature Presets

- `coding`: 0.0 (deterministic)
- `data`: 1.0 (balanced)
- `chat`: 1.3 (creative)
- `translation`: 1.3 (creative)
- `creative`: 1.5 (very creative)
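
The `/temp` command accepts either a number in [0, 2] or one of these preset names. Resolving the argument can be sketched as follows (a hypothetical helper mirroring the table above, not the CLI's actual code):

```python
PRESETS = {
    "coding": 0.0,
    "data": 1.0,
    "chat": 1.3,
    "translation": 1.3,
    "creative": 1.5,
}

def resolve_temperature(value: str) -> float:
    """Resolve a /temp argument: a preset name or a number in [0, 2]."""
    if value in PRESETS:
        return PRESETS[value]
    temp = float(value)
    if not 0.0 <= temp <= 2.0:
        raise ValueError("temperature must be between 0 and 2")
    return temp

print(resolve_temperature("coding"))  # 0.0
print(resolve_temperature("1.3"))     # 1.3
```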

## Error Handling

- Automatic retry with exponential backoff
- Rate limit handling
- Clear error messages
- API status feedback
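
The retry-with-exponential-backoff behavior can be sketched like this (a simplified stand-in, not the CLI's actual `error_handler`; the injectable `sleep` parameter is an assumption for testability):

```python
import time

def retry_with_backoff(call, max_retries=3, base_delay=1.0, sleep=time.sleep):
    """Retry `call`, doubling the delay after each failure."""
    for attempt in range(max_retries + 1):
        try:
            return call()
        except Exception:
            if attempt == max_retries:
                raise  # out of retries: surface the original error
            sleep(base_delay * (2 ** attempt))  # wait 1s, 2s, 4s, ...

# Example: a flaky call that succeeds on the third try.
attempts = []
def flaky():
    attempts.append(1)
    if len(attempts) < 3:
        raise RuntimeError("rate limited")
    return "ok"

print(retry_with_backoff(flaky, sleep=lambda _: None))  # ok
```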

## Support

For support, please open an issue on the [GitHub repository](https://github.com/PierrunoYT/deepseek-cli/issues).

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.