@arikusi/deepseek-mcp-server 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md ADDED
@@ -0,0 +1,79 @@
1
+ # Changelog
2
+
3
+ All notable changes to this project will be documented in this file.
4
+
5
+ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
6
+ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
7
+
8
+ ## [Unreleased]
9
+
10
+ ### Added
11
+ - Nothing yet
12
+
13
+ ### Changed
14
+ - Nothing yet
15
+
16
+ ### Fixed
17
+ - Nothing yet
18
+
19
+ ## [1.0.0] - 2025-01-13
20
+
21
+ ### Added
22
+ - Initial release of DeepSeek MCP Server
23
+ - Support for `deepseek-chat` model
24
+ - Support for `deepseek-reasoner` (R1) model with reasoning traces
25
+ - Streaming mode support
26
+ - Full TypeScript implementation with type safety
27
+ - OpenAI-compatible API client
28
+ - Comprehensive error handling
29
+ - MCP protocol compliance via stdio transport
30
+ - Tool: `deepseek_chat` for chat completions
31
+ - Environment variable configuration for API key
32
+ - Detailed documentation and examples
33
+ - MIT License
34
+
35
+ ### Features
36
+ - **Models**:
37
+ - deepseek-chat: Fast general-purpose model
38
+ - deepseek-reasoner: Advanced reasoning with chain-of-thought
39
+ - **Parameters**:
40
+ - temperature: Control randomness (0-2)
41
+ - max_tokens: Limit response length
42
+ - stream: Enable streaming mode
43
+ - **Output**:
44
+ - Text content with formatting
45
+ - Reasoning traces for R1 model
46
+ - Token usage statistics
47
+ - Structured response data
48
+
49
+ ### Technical
50
+ - Built with @modelcontextprotocol/sdk v1.0.4
51
+ - Uses OpenAI SDK v4.77.3 for API compatibility
52
+ - Zod v3.24.1 for schema validation
53
+ - TypeScript v5.7.3
54
+ - Node.js 18+ required
55
+ - Stdio-based transport for process communication
56
+
57
+ ## [0.1.0] - Development
58
+
59
+ ### Added
60
+ - Initial project setup
61
+ - Basic MCP server structure
62
+ - DeepSeek API integration prototype
63
+
64
+ ---
65
+
66
+ ## Version History
67
+
68
+ - **1.0.0** (2025-01-13): Initial public release
69
+ - **0.1.0** (Development): Internal development version
70
+
71
+ ## Links
72
+
73
+ - [npm package](https://www.npmjs.com/package/@arikusi/deepseek-mcp-server)
74
+ - [GitHub repository](https://github.com/arikusi/deepseek-mcp-server)
75
+ - [Issue tracker](https://github.com/arikusi/deepseek-mcp-server/issues)
76
+
77
+ [Unreleased]: https://github.com/arikusi/deepseek-mcp-server/compare/v1.0.0...HEAD
78
+ [1.0.0]: https://github.com/arikusi/deepseek-mcp-server/releases/tag/v1.0.0
79
+ [0.1.0]: https://github.com/arikusi/deepseek-mcp-server/releases/tag/v0.1.0
package/LICENSE ADDED
@@ -0,0 +1,21 @@
1
+ MIT License
2
+
3
+ Copyright (c) 2025 DeepSeek MCP Server Contributors
4
+
5
+ Permission is hereby granted, free of charge, to any person obtaining a copy
6
+ of this software and associated documentation files (the "Software"), to deal
7
+ in the Software without restriction, including without limitation the rights
8
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9
+ copies of the Software, and to permit persons to whom the Software is
10
+ furnished to do so, subject to the following conditions:
11
+
12
+ The above copyright notice and this permission notice shall be included in all
13
+ copies or substantial portions of the Software.
14
+
15
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21
+ SOFTWARE.
package/README.md ADDED
@@ -0,0 +1,323 @@
1
+ # DeepSeek MCP Server
2
+
3
+ [![npm version](https://img.shields.io/npm/v/@arikusi/deepseek-mcp-server.svg)](https://www.npmjs.com/package/@arikusi/deepseek-mcp-server)
4
+ [![npm downloads](https://img.shields.io/npm/dm/@arikusi/deepseek-mcp-server.svg)](https://www.npmjs.com/package/@arikusi/deepseek-mcp-server)
5
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
6
+ [![Node.js Version](https://img.shields.io/node/v/@arikusi/deepseek-mcp-server.svg)](https://nodejs.org/)
7
+ [![TypeScript](https://img.shields.io/badge/TypeScript-5.7-blue.svg)](https://www.typescriptlang.org/)
8
+ [![Build Status](https://github.com/arikusi/deepseek-mcp-server/workflows/CI/badge.svg)](https://github.com/arikusi/deepseek-mcp-server/actions)
9
+
10
+ A Model Context Protocol (MCP) server that integrates DeepSeek AI models with MCP-compatible clients. Access DeepSeek's powerful chat and reasoning models directly from your development environment.
11
+
12
+ **Compatible with:**
13
+ - Claude Code CLI
14
+ - Gemini CLI (if MCP support is available)
15
+ - Any MCP-compatible client
16
+
17
+ > **⚠️ Note**: This is an unofficial community project and is not affiliated with DeepSeek.
18
+
19
+ ## ⚡ Quick Start
20
+
21
+ ### For Claude Code
22
+
23
+ ```bash
24
+ # Install and configure in one step
25
+ claude mcp add deepseek npx @arikusi/deepseek-mcp-server
26
+
27
+ # Enter your DeepSeek API key when prompted
28
+ ```
29
+
30
+ ### For Gemini CLI
31
+
32
+ ```bash
33
+ # Install and configure with API key
34
+ gemini mcp add deepseek npx @arikusi/deepseek-mcp-server -e DEEPSEEK_API_KEY=your-key-here
35
+ ```
36
+
37
+ **Get your API key:** [https://platform.deepseek.com](https://platform.deepseek.com)
38
+
39
+ That's it! Your MCP client can now use DeepSeek models! 🎉
40
+
41
+ ---
42
+
43
+ ## Features
44
+
45
+ - 🤖 **DeepSeek Chat**: Fast and capable general-purpose model
46
+ - 🧠 **DeepSeek Reasoner (R1)**: Advanced reasoning with chain-of-thought explanations
47
+ - 🔄 **Streaming Support**: Real-time response generation
48
+ - 🛡️ **Type-Safe**: Full TypeScript implementation
49
+ - 🎯 **MCP Compatible**: Works with any MCP-compatible CLI (Claude Code, Gemini CLI, etc.)
50
+
51
+ ## Installation
52
+
53
+ ### Prerequisites
54
+
55
+ - Node.js 18+
56
+ - A DeepSeek API key (get one at [https://platform.deepseek.com](https://platform.deepseek.com))
57
+
58
+ ### Manual Installation
59
+
60
+ If you prefer to install manually:
61
+
62
+ ```bash
63
+ npm install -g @arikusi/deepseek-mcp-server
64
+ ```
65
+
66
+ ### From Source
67
+
68
+ 1. **Clone the repository**
69
+
70
+ ```bash
71
+ git clone https://github.com/arikusi/deepseek-mcp-server.git
72
+ cd deepseek-mcp-server
73
+ ```
74
+
75
+ 2. **Install dependencies**
76
+
77
+ ```bash
78
+ npm install
79
+ ```
80
+
81
+ 3. **Build the project**
82
+
83
+ ```bash
84
+ npm run build
85
+ ```
86
+
87
+ ## Usage
88
+
89
+ Once configured, your MCP client will have access to the `deepseek_chat` tool and can use DeepSeek models.
90
+
91
+ **Example prompts:**
92
+ ```
93
+ "Use DeepSeek to explain quantum computing"
94
+ "Ask DeepSeek Reasoner to solve: If I have 10 apples and buy 5 more..."
95
+ ```
96
+
97
+ Your MCP client will automatically call the `deepseek_chat` tool.
98
+
99
+ ### Manual Configuration (Advanced)
100
+
101
+ If your MCP client doesn't support the `add` command, manually add to your config file:
102
+
103
+ ```json
104
+ {
105
+ "mcpServers": {
106
+ "deepseek": {
107
+ "command": "npx",
108
+ "args": ["@arikusi/deepseek-mcp-server"],
109
+ "env": {
110
+ "DEEPSEEK_API_KEY": "your-api-key-here"
111
+ }
112
+ }
113
+ }
114
+ }
115
+ ```
116
+
117
+ **Note**: Config file location varies by client (e.g., `~/.claude/mcp_settings.json` for Claude Code).
118
+
119
+ ## Available Tools
120
+
121
+ ### `deepseek_chat`
122
+
123
+ Chat with DeepSeek AI models.
124
+
125
+ **Parameters:**
126
+
127
+ - `messages` (required): Array of conversation messages
128
+ - `role`: "system" | "user" | "assistant"
129
+ - `content`: Message text
130
+ - `model` (optional): "deepseek-chat" (default) or "deepseek-reasoner"
131
+ - `temperature` (optional): 0-2, controls randomness (default: 1.0)
132
+ - `max_tokens` (optional): Maximum tokens to generate
133
+ - `stream` (optional): Enable streaming mode (default: false)
134
+
135
+ **Example:**
136
+
137
+ ```json
138
+ {
139
+ "messages": [
140
+ {
141
+ "role": "user",
142
+ "content": "Explain the theory of relativity in simple terms"
143
+ }
144
+ ],
145
+ "model": "deepseek-chat",
146
+ "temperature": 0.7,
147
+ "max_tokens": 1000
148
+ }
149
+ ```
150
+
151
+ **DeepSeek Reasoner Example:**
152
+
153
+ ```json
154
+ {
155
+ "messages": [
156
+ {
157
+ "role": "user",
158
+ "content": "If I have 10 apples and eat 3, then buy 5 more, how many do I have?"
159
+ }
160
+ ],
161
+ "model": "deepseek-reasoner"
162
+ }
163
+ ```
164
+
165
+ The reasoner model will show its thinking process in `<thinking>` tags followed by the final answer.
166
+
167
+ ## Models
168
+
169
+ ### deepseek-chat
170
+
171
+ - **Best for**: General conversations, coding, content generation
172
+ - **Speed**: Fast
173
+ - **Context**: 64K tokens
174
+ - **Cost**: Most economical
175
+
176
+ ### deepseek-reasoner (R1)
177
+
178
+ - **Best for**: Complex reasoning, math, logic problems, multi-step tasks
179
+ - **Speed**: Slower (shows thinking process)
180
+ - **Context**: 64K tokens
181
+ - **Special**: Provides chain-of-thought reasoning
182
+ - **Output**: Both reasoning process and final answer
183
+
184
+ ## Development
185
+
186
+ ### Project Structure
187
+
188
+ ```
189
+ deepseek-mcp-server/
190
+ ├── src/
191
+ │ ├── index.ts # Main MCP server
192
+ │ ├── deepseek-client.ts # DeepSeek API wrapper
193
+ │ └── types.ts # TypeScript definitions
194
+ ├── dist/ # Compiled JavaScript
195
+ ├── package.json
196
+ ├── tsconfig.json
197
+ └── README.md
198
+ ```
199
+
200
+ ### Building
201
+
202
+ ```bash
203
+ npm run build
204
+ ```
205
+
206
+ ### Watch Mode (for development)
207
+
208
+ ```bash
209
+ npm run watch
210
+ ```
211
+
212
+ ### Testing Locally
213
+
214
+ ```bash
215
+ # Set API key
216
+ export DEEPSEEK_API_KEY="your-key"
217
+
218
+ # Run the server
219
+ npm start
220
+ ```
221
+
222
+ The server will start and wait for MCP client connections via stdio.
223
+
224
+ ## Troubleshooting
225
+
226
+ ### "DEEPSEEK_API_KEY environment variable is not set"
227
+
228
+ Make sure you've set your API key in the MCP settings `env` section or as an environment variable.
229
+
230
+ ### "Failed to connect to DeepSeek API"
231
+
232
+ 1. Check your API key is valid
233
+ 2. Verify you have internet connection
234
+ 3. Check DeepSeek API status at [https://status.deepseek.com](https://status.deepseek.com)
235
+
236
+ ### Server not appearing in your MCP client
237
+
238
+ 1. Verify the path to `dist/index.js` is correct
239
+ 2. Make sure you ran `npm run build`
240
+ 3. Check your MCP client's logs for errors
241
+ 4. Restart your MCP client completely
242
+
243
+ ### Permission Denied on macOS/Linux
244
+
245
+ Make the file executable:
246
+
247
+ ```bash
248
+ chmod +x dist/index.js
249
+ ```
250
+
251
+ ## Publishing to npm
252
+
253
+ To share this MCP server with others:
254
+
255
+ 1. Run `npm login`
256
+ 2. Run `npm publish --access public`
257
+
258
+ Users can then install with:
259
+
260
+ ```bash
261
+ npm install -g @arikusi/deepseek-mcp-server
262
+ ```
263
+
264
+ ## Contributing
265
+
266
+ Contributions are welcome! Please read our [Contributing Guidelines](CONTRIBUTING.md) before submitting PRs.
267
+
268
+ ### Reporting Issues
269
+
270
+ Found a bug or have a feature request? Please [open an issue](https://github.com/arikusi/deepseek-mcp-server/issues/new/choose) using our templates.
271
+
272
+ ### Development
273
+
274
+ ```bash
275
+ # Clone the repo
276
+ git clone https://github.com/arikusi/deepseek-mcp-server.git
277
+ cd deepseek-mcp-server
278
+
279
+ # Install dependencies
280
+ npm install
281
+
282
+ # Build in watch mode
283
+ npm run watch
284
+
285
+ # Run tests
286
+ npm test
287
+
288
+ # Lint
289
+ npm run lint
290
+ ```
291
+
292
+ ## Changelog
293
+
294
+ See [CHANGELOG.md](CHANGELOG.md) for version history and updates.
295
+
296
+ ## License
297
+
298
+ MIT License - see [LICENSE](LICENSE) file for details
299
+
300
+ ## Support
301
+
302
+ - 📖 [Documentation](https://github.com/arikusi/deepseek-mcp-server#readme)
303
+ - 🐛 [Bug Reports](https://github.com/arikusi/deepseek-mcp-server/issues)
304
+ - 💬 [Discussions](https://github.com/arikusi/deepseek-mcp-server/discussions)
305
+ - 📧 Contact: [GitHub Issues](https://github.com/arikusi/deepseek-mcp-server/issues)
306
+
307
+ ## Resources
308
+
309
+ - [DeepSeek Platform](https://platform.deepseek.com) - Get your API key
310
+ - [Model Context Protocol](https://modelcontextprotocol.io) - MCP specification
311
+ - [DeepSeek API Documentation](https://api-docs.deepseek.com) - API reference
312
+
313
+ ## Acknowledgments
314
+
315
+ - Built with [Model Context Protocol SDK](https://github.com/modelcontextprotocol/typescript-sdk)
316
+ - Uses [OpenAI SDK](https://github.com/openai/openai-node) for API compatibility
317
+ - Created for the MCP community
318
+
319
+ ---
320
+
321
+ **Made with ❤️ by [@arikusi](https://github.com/arikusi)**
322
+
323
+ This is an unofficial community project and is not affiliated with DeepSeek.
@@ -0,0 +1,24 @@
1
+ /**
2
+ * DeepSeek API Client
3
+ * Wrapper around OpenAI SDK for DeepSeek API
4
+ */
5
+ import type { ChatCompletionParams, ChatCompletionResponse } from './types.js';
6
+ export declare class DeepSeekClient {
7
+ private client;
8
+ private baseURL;
9
+ constructor(apiKey: string);
10
+ /**
11
+ * Create a chat completion (non-streaming)
12
+ */
13
+ createChatCompletion(params: ChatCompletionParams): Promise<ChatCompletionResponse>;
14
+ /**
15
+ * Create a streaming chat completion
16
+ * Returns the full text after streaming completes
17
+ */
18
+ createStreamingChatCompletion(params: ChatCompletionParams): Promise<ChatCompletionResponse>;
19
+ /**
20
+ * Test API connection
21
+ */
22
+ testConnection(): Promise<boolean>;
23
+ }
24
+ //# sourceMappingURL=deepseek-client.d.ts.map
@@ -0,0 +1 @@
1
+ {"version":3,"file":"deepseek-client.d.ts","sourceRoot":"","sources":["../src/deepseek-client.ts"],"names":[],"mappings":"AAAA;;;GAGG;AAGH,OAAO,KAAK,EACV,oBAAoB,EACpB,sBAAsB,EAEvB,MAAM,YAAY,CAAC;AAEpB,qBAAa,cAAc;IACzB,OAAO,CAAC,MAAM,CAAS;IACvB,OAAO,CAAC,OAAO,CAA8B;gBAEjC,MAAM,EAAE,MAAM;IAW1B;;OAEG;IACG,oBAAoB,CACxB,MAAM,EAAE,oBAAoB,GAC3B,OAAO,CAAC,sBAAsB,CAAC;IA4ClC;;;OAGG;IACG,6BAA6B,CACjC,MAAM,EAAE,oBAAoB,GAC3B,OAAO,CAAC,sBAAsB,CAAC;IAsElC;;OAEG;IACG,cAAc,IAAI,OAAO,CAAC,OAAO,CAAC;CAazC"}
@@ -0,0 +1,142 @@
1
+ /**
2
+ * DeepSeek API Client
3
+ * Wrapper around OpenAI SDK for DeepSeek API
4
+ */
5
+ import OpenAI from 'openai';
6
+ export class DeepSeekClient {
7
+ client;
8
+ baseURL = 'https://api.deepseek.com';
9
+ constructor(apiKey) {
10
+ if (!apiKey) {
11
+ throw new Error('DeepSeek API key is required');
12
+ }
13
+ this.client = new OpenAI({
14
+ apiKey,
15
+ baseURL: this.baseURL,
16
+ });
17
+ }
18
+ /**
19
+ * Create a chat completion (non-streaming)
20
+ */
21
+ async createChatCompletion(params) {
22
+ try {
23
+ const response = await this.client.chat.completions.create({
24
+ model: params.model,
25
+ messages: params.messages,
26
+ temperature: params.temperature ?? 1.0,
27
+ max_tokens: params.max_tokens,
28
+ top_p: params.top_p,
29
+ frequency_penalty: params.frequency_penalty,
30
+ presence_penalty: params.presence_penalty,
31
+ stop: params.stop,
32
+ stream: false,
33
+ });
34
+ const choice = response.choices[0];
35
+ if (!choice) {
36
+ throw new Error('No response from DeepSeek API');
37
+ }
38
+ // Extract reasoning content if available (for deepseek-reasoner)
39
+ const reasoning_content = 'reasoning_content' in choice.message
40
+ ? choice.message.reasoning_content
41
+ : undefined;
42
+ return {
43
+ content: choice.message.content || '',
44
+ reasoning_content,
45
+ model: response.model,
46
+ usage: {
47
+ prompt_tokens: response.usage?.prompt_tokens || 0,
48
+ completion_tokens: response.usage?.completion_tokens || 0,
49
+ total_tokens: response.usage?.total_tokens || 0,
50
+ },
51
+ finish_reason: choice.finish_reason || 'stop',
52
+ };
53
+ }
54
+ catch (error) {
55
+ console.error('DeepSeek API Error:', error);
56
+ throw new Error(`DeepSeek API Error: ${error?.message || 'Unknown error'}`);
57
+ }
58
+ }
59
+ /**
60
+ * Create a streaming chat completion
61
+ * Returns the full text after streaming completes
62
+ */
63
+ async createStreamingChatCompletion(params) {
64
+ try {
65
+ const stream = await this.client.chat.completions.create({
66
+ model: params.model,
67
+ messages: params.messages,
68
+ temperature: params.temperature ?? 1.0,
69
+ max_tokens: params.max_tokens,
70
+ top_p: params.top_p,
71
+ frequency_penalty: params.frequency_penalty,
72
+ presence_penalty: params.presence_penalty,
73
+ stop: params.stop,
74
+ stream: true,
75
+ });
76
+ let fullContent = '';
77
+ let reasoningContent = '';
78
+ let modelName = params.model;
79
+ let finishReason = 'stop';
80
+ let usage = {
81
+ prompt_tokens: 0,
82
+ completion_tokens: 0,
83
+ total_tokens: 0,
84
+ };
85
+ // Collect all chunks
86
+ for await (const chunk of stream) {
87
+ const choice = chunk.choices[0];
88
+ if (!choice)
89
+ continue;
90
+ // Collect content
91
+ if (choice.delta?.content) {
92
+ fullContent += choice.delta.content;
93
+ }
94
+ // Collect reasoning content (for deepseek-reasoner)
95
+ if ('reasoning_content' in choice.delta) {
96
+ reasoningContent += choice.delta.reasoning_content || '';
97
+ }
98
+ // Get finish reason
99
+ if (choice.finish_reason) {
100
+ finishReason = choice.finish_reason;
101
+ }
102
+ // Get model name
103
+ if (chunk.model) {
104
+ modelName = chunk.model;
105
+ }
106
+ // Get usage info (usually in last chunk)
107
+ if (chunk.usage) {
108
+ usage = chunk.usage;
109
+ }
110
+ }
111
+ return {
112
+ content: fullContent,
113
+ reasoning_content: reasoningContent || undefined,
114
+ model: modelName,
115
+ usage,
116
+ finish_reason: finishReason,
117
+ };
118
+ }
119
+ catch (error) {
120
+ console.error('DeepSeek Streaming API Error:', error);
121
+ throw new Error(`DeepSeek Streaming API Error: ${error?.message || 'Unknown error'}`);
122
+ }
123
+ }
124
+ /**
125
+ * Test API connection
126
+ */
127
+ async testConnection() {
128
+ try {
129
+ const response = await this.createChatCompletion({
130
+ model: 'deepseek-chat',
131
+ messages: [{ role: 'user', content: 'Hi' }],
132
+ max_tokens: 10,
133
+ });
134
+ return !!response.content;
135
+ }
136
+ catch (error) {
137
+ console.error('Connection test failed:', error);
138
+ return false;
139
+ }
140
+ }
141
+ }
142
+ //# sourceMappingURL=deepseek-client.js.map
@@ -0,0 +1 @@
1
+ {"version":3,"file":"deepseek-client.js","sourceRoot":"","sources":["../src/deepseek-client.ts"],"names":[],"mappings":"AAAA;;;GAGG;AAEH,OAAO,MAAM,MAAM,QAAQ,CAAC;AAO5B,MAAM,OAAO,cAAc;IACjB,MAAM,CAAS;IACf,OAAO,GAAG,0BAA0B,CAAC;IAE7C,YAAY,MAAc;QACxB,IAAI,CAAC,MAAM,EAAE,CAAC;YACZ,MAAM,IAAI,KAAK,CAAC,8BAA8B,CAAC,CAAC;QAClD,CAAC;QAED,IAAI,CAAC,MAAM,GAAG,IAAI,MAAM,CAAC;YACvB,MAAM;YACN,OAAO,EAAE,IAAI,CAAC,OAAO;SACtB,CAAC,CAAC;IACL,CAAC;IAED;;OAEG;IACH,KAAK,CAAC,oBAAoB,CACxB,MAA4B;QAE5B,IAAI,CAAC;YACH,MAAM,QAAQ,GAAG,MAAM,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,WAAW,CAAC,MAAM,CAAC;gBACzD,KAAK,EAAE,MAAM,CAAC,KAAK;gBACnB,QAAQ,EAAE,MAAM,CAAC,QAAQ;gBACzB,WAAW,EAAE,MAAM,CAAC,WAAW,IAAI,GAAG;gBACtC,UAAU,EAAE,MAAM,CAAC,UAAU;gBAC7B,KAAK,EAAE,MAAM,CAAC,KAAK;gBACnB,iBAAiB,EAAE,MAAM,CAAC,iBAAiB;gBAC3C,gBAAgB,EAAE,MAAM,CAAC,gBAAgB;gBACzC,IAAI,EAAE,MAAM,CAAC,IAAI;gBACjB,MAAM,EAAE,KAAK;aACd,CAAC,CAAC;YAEH,MAAM,MAAM,GAAG,QAAQ,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC;YACnC,IAAI,CAAC,MAAM,EAAE,CAAC;gBACZ,MAAM,IAAI,KAAK,CAAC,+BAA+B,CAAC,CAAC;YACnD,CAAC;YAED,iEAAiE;YACjE,MAAM,iBAAiB,GACrB,mBAAmB,IAAI,MAAM,CAAC,OAAO;gBACnC,CAAC,CAAE,MAAM,CAAC,OAAe,CAAC,iBAAiB;gBAC3C,CAAC,CAAC,SAAS,CAAC;YAEhB,OAAO;gBACL,OAAO,EAAE,MAAM,CAAC,OAAO,CAAC,OAAO,IAAI,EAAE;gBACrC,iBAAiB;gBACjB,KAAK,EAAE,QAAQ,CAAC,KAAK;gBACrB,KAAK,EAAE;oBACL,aAAa,EAAE,QAAQ,CAAC,KAAK,EAAE,aAAa,IAAI,CAAC;oBACjD,iBAAiB,EAAE,QAAQ,CAAC,KAAK,EAAE,iBAAiB,IAAI,CAAC;oBACzD,YAAY,EAAE,QAAQ,CAAC,KAAK,EAAE,YAAY,IAAI,CAAC;iBAChD;gBACD,aAAa,EAAE,MAAM,CAAC,aAAa,IAAI,MAAM;aAC9C,CAAC;QACJ,CAAC;QAAC,OAAO,KAAU,EAAE,CAAC;YACpB,OAAO,CAAC,KAAK,CAAC,qBAAqB,EAAE,KAAK,CAAC,CAAC;YAC5C,MAAM,IAAI,KAAK,CACb,uBAAuB,KAAK,EAAE,OAAO,IAAI,eAAe,EAAE,CAC3D,CAAC;QACJ,CAAC;IACH,CAAC;IAED;;;OAGG;IACH,KAAK,CAAC,6BAA6B,CACjC,MAA4B;QAE5B,IAAI,CAAC;YACH,MAAM,MAAM,GAAG,MAAM,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,WAAW,CAAC,MAAM,CAAC;gBACvD,KAAK,EAAE,MAAM,CAAC,KAAK;gBACnB,QAAQ,EAAE,MAAM,CAAC,QAAQ;gBACzB,WAAW,EAAE,MAAM,CAAC,WAAW,IAAI,GAAG;gBACtC,UAAU,EAAE,MAAM,CAAC,UAAU;gBAC7B,KAAK,EAAE,MAAM,CAAC,KAAK;gBACnB,iBAAi
B,EAAE,MAAM,CAAC,iBAAiB;gBAC3C,gBAAgB,EAAE,MAAM,CAAC,gBAAgB;gBACzC,IAAI,EAAE,MAAM,CAAC,IAAI;gBACjB,MAAM,EAAE,IAAI;aACb,CAAC,CAAC;YAEH,IAAI,WAAW,GAAG,EAAE,CAAC;YACrB,IAAI,gBAAgB,GAAG,EAAE,CAAC;YAC1B,IAAI,SAAS,GAAG,MAAM,CAAC,KAAK,CAAC;YAC7B,IAAI,YAAY,GAAG,MAAM,CAAC;YAC1B,IAAI,KAAK,GAAG;gBACV,aAAa,EAAE,CAAC;gBAChB,iBAAiB,EAAE,CAAC;gBACpB,YAAY,EAAE,CAAC;aAChB,CAAC;YAEF,qBAAqB;YACrB,IAAI,KAAK,EAAE,MAAM,KAAK,IAAI,MAAM,EAAE,CAAC;gBACjC,MAAM,MAAM,GAAG,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC;gBAChC,IAAI,CAAC,MAAM;oBAAE,SAAS;gBAEtB,kBAAkB;gBAClB,IAAI,MAAM,CAAC,KAAK,EAAE,OAAO,EAAE,CAAC;oBAC1B,WAAW,IAAI,MAAM,CAAC,KAAK,CAAC,OAAO,CAAC;gBACtC,CAAC;gBAED,oDAAoD;gBACpD,IAAI,mBAAmB,IAAI,MAAM,CAAC,KAAK,EAAE,CAAC;oBACxC,gBAAgB,IAAK,MAAM,CAAC,KAAa,CAAC,iBAAiB,IAAI,EAAE,CAAC;gBACpE,CAAC;gBAED,oBAAoB;gBACpB,IAAI,MAAM,CAAC,aAAa,EAAE,CAAC;oBACzB,YAAY,GAAG,MAAM,CAAC,aAAa,CAAC;gBACtC,CAAC;gBAED,iBAAiB;gBACjB,IAAI,KAAK,CAAC,KAAK,EAAE,CAAC;oBAChB,SAAS,GAAG,KAAK,CAAC,KAAsB,CAAC;gBAC3C,CAAC;gBAED,yCAAyC;gBACzC,IAAK,KAAa,CAAC,KAAK,EAAE,CAAC;oBACzB,KAAK,GAAI,KAAa,CAAC,KAAK,CAAC;gBAC/B,CAAC;YACH,CAAC;YAED,OAAO;gBACL,OAAO,EAAE,WAAW;gBACpB,iBAAiB,EAAE,gBAAgB,IAAI,SAAS;gBAChD,KAAK,EAAE,SAAS;gBAChB,KAAK;gBACL,aAAa,EAAE,YAAY;aAC5B,CAAC;QACJ,CAAC;QAAC,OAAO,KAAU,EAAE,CAAC;YACpB,OAAO,CAAC,KAAK,CAAC,+BAA+B,EAAE,KAAK,CAAC,CAAC;YACtD,MAAM,IAAI,KAAK,CACb,iCAAiC,KAAK,EAAE,OAAO,IAAI,eAAe,EAAE,CACrE,CAAC;QACJ,CAAC;IACH,CAAC;IAED;;OAEG;IACH,KAAK,CAAC,cAAc;QAClB,IAAI,CAAC;YACH,MAAM,QAAQ,GAAG,MAAM,IAAI,CAAC,oBAAoB,CAAC;gBAC/C,KAAK,EAAE,eAAe;gBACtB,QAAQ,EAAE,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,OAAO,EAAE,IAAI,EAAE,CAAC;gBAC3C,UAAU,EAAE,EAAE;aACf,CAAC,CAAC;YACH,OAAO,CAAC,CAAC,QAAQ,CAAC,OAAO,CAAC;QAC5B,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO,CAAC,KAAK,CAAC,yBAAyB,EAAE,KAAK,CAAC,CAAC;YAChD,OAAO,KAAK,CAAC;QACf,CAAC;IACH,CAAC;CACF"}
@@ -0,0 +1,10 @@
1
+ #!/usr/bin/env node
2
+ /**
3
+ * DeepSeek MCP Server
4
+ * Model Context Protocol server for DeepSeek API integration
5
+ *
6
+ * This server exposes DeepSeek's chat and reasoning models as MCP tools
7
+ * that can be used by Claude Code and other MCP clients.
8
+ */
9
+ export {};
10
+ //# sourceMappingURL=index.d.ts.map
@@ -0,0 +1 @@
1
+ {"version":3,"file":"index.d.ts","sourceRoot":"","sources":["../src/index.ts"],"names":[],"mappings":";AAEA;;;;;;GAMG"}
package/dist/index.js ADDED
@@ -0,0 +1,169 @@
1
+ #!/usr/bin/env node
2
+ /**
3
+ * DeepSeek MCP Server
4
+ * Model Context Protocol server for DeepSeek API integration
5
+ *
6
+ * This server exposes DeepSeek's chat and reasoning models as MCP tools
7
+ * that can be used by Claude Code and other MCP clients.
8
+ */
9
+ import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
10
+ import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
11
+ import { z } from 'zod';
12
+ import { DeepSeekClient } from './deepseek-client.js';
13
+ // Get API key from environment variable
14
+ const DEEPSEEK_API_KEY = process.env.DEEPSEEK_API_KEY;
15
+ if (!DEEPSEEK_API_KEY) {
16
+ console.error('Error: DEEPSEEK_API_KEY environment variable is not set');
17
+ console.error('Please set your DeepSeek API key:');
18
+ console.error(' export DEEPSEEK_API_KEY="your-api-key-here"');
19
+ process.exit(1);
20
+ }
21
+ // Initialize DeepSeek client
22
+ const deepseek = new DeepSeekClient(DEEPSEEK_API_KEY);
23
+ // Create MCP server
24
+ const server = new McpServer({
25
+ name: 'deepseek-mcp-server',
26
+ version: '1.0.0',
27
+ });
28
+ // Define Zod schemas for input validation
29
+ const MessageSchema = z.object({
30
+ role: z.enum(['system', 'user', 'assistant']),
31
+ content: z.string(),
32
+ });
33
+ const ChatInputSchema = z.object({
34
+ messages: z.array(MessageSchema).min(1),
35
+ model: z.enum(['deepseek-chat', 'deepseek-reasoner']).default('deepseek-chat'),
36
+ temperature: z.number().min(0).max(2).optional(),
37
+ max_tokens: z.number().min(1).max(32768).optional(),
38
+ stream: z.boolean().optional().default(false),
39
+ });
40
+ /**
41
+ * Tool: deepseek_chat
42
+ *
43
+ * Chat completion with DeepSeek models.
44
+ * Supports both deepseek-chat and deepseek-reasoner (R1) models.
45
+ */
46
+ server.registerTool('deepseek_chat', {
47
+ title: 'DeepSeek Chat Completion',
48
+ description: 'Chat with DeepSeek AI models. Supports deepseek-chat for general conversations and ' +
49
+ 'deepseek-reasoner (R1) for complex reasoning tasks with chain-of-thought explanations. ' +
50
+ 'The reasoner model provides both reasoning_content (thinking process) and content (final answer).',
51
+ inputSchema: {
52
+ messages: z.array(MessageSchema).min(1).describe('Array of conversation messages'),
53
+ model: z.enum(['deepseek-chat', 'deepseek-reasoner'])
54
+ .default('deepseek-chat')
55
+ .describe('Model to use. deepseek-chat for general tasks, deepseek-reasoner for complex reasoning'),
56
+ temperature: z.number()
57
+ .min(0)
58
+ .max(2)
59
+ .optional()
60
+ .describe('Sampling temperature (0-2). Higher = more random. Default: 1.0'),
61
+ max_tokens: z.number()
62
+ .min(1)
63
+ .max(32768)
64
+ .optional()
65
+ .describe('Maximum tokens to generate. Default: model maximum'),
66
+ stream: z.boolean()
67
+ .optional()
68
+ .default(false)
69
+ .describe('Enable streaming mode. Returns full response after streaming completes.'),
70
+ },
71
+ outputSchema: {
72
+ content: z.string(),
73
+ reasoning_content: z.string().optional(),
74
+ model: z.string(),
75
+ usage: z.object({
76
+ prompt_tokens: z.number(),
77
+ completion_tokens: z.number(),
78
+ total_tokens: z.number(),
79
+ }),
80
+ finish_reason: z.string(),
81
+ },
82
+ }, async (input) => {
83
+ try {
84
+ // Validate input
85
+ const validated = ChatInputSchema.parse(input);
86
+ console.error(`[DeepSeek MCP] Request: model=${validated.model}, messages=${validated.messages.length}, stream=${validated.stream}`);
87
+ // Call appropriate method based on stream parameter
88
+ const response = validated.stream
89
+ ? await deepseek.createStreamingChatCompletion({
90
+ model: validated.model,
91
+ messages: validated.messages,
92
+ temperature: validated.temperature,
93
+ max_tokens: validated.max_tokens,
94
+ })
95
+ : await deepseek.createChatCompletion({
96
+ model: validated.model,
97
+ messages: validated.messages,
98
+ temperature: validated.temperature,
99
+ max_tokens: validated.max_tokens,
100
+ });
101
+ console.error(`[DeepSeek MCP] Response: tokens=${response.usage.total_tokens}, finish_reason=${response.finish_reason}`);
102
+ // Format response
103
+ let responseText = '';
104
+ // Add reasoning content if available (for deepseek-reasoner)
105
+ if (response.reasoning_content) {
106
+ responseText += `<thinking>\n${response.reasoning_content}\n</thinking>\n\n`;
107
+ }
108
+ responseText += response.content;
109
+ // Add usage stats
110
+ responseText += `\n\n---\n**Model:** ${response.model}\n`;
111
+ responseText += `**Tokens:** ${response.usage.prompt_tokens} prompt + ${response.usage.completion_tokens} completion = ${response.usage.total_tokens} total`;
112
+ return {
113
+ content: [
114
+ {
115
+ type: 'text',
116
+ text: responseText,
117
+ },
118
+ ],
119
+ structuredContent: response,
120
+ };
121
+ }
122
+ catch (error) {
123
+ console.error('[DeepSeek MCP] Error:', error);
124
+ const errorMessage = error?.message || 'Unknown error occurred';
125
+ return {
126
+ content: [
127
+ {
128
+ type: 'text',
129
+ text: `Error: ${errorMessage}`,
130
+ },
131
+ ],
132
+ isError: true,
133
+ };
134
+ }
135
+ });
136
+ // Start server with stdio transport
137
+ async function main() {
138
+ console.error('[DeepSeek MCP] Starting server...');
139
+ // Test connection
140
+ console.error('[DeepSeek MCP] Testing API connection...');
141
+ const isConnected = await deepseek.testConnection();
142
+ if (!isConnected) {
143
+ console.error('[DeepSeek MCP] Warning: Failed to connect to DeepSeek API');
144
+ console.error('[DeepSeek MCP] Please check your API key and internet connection');
145
+ }
146
+ else {
147
+ console.error('[DeepSeek MCP] API connection successful');
148
+ }
149
+ // Connect to stdio transport
150
+ const transport = new StdioServerTransport();
151
+ await server.connect(transport);
152
+ console.error('[DeepSeek MCP] Server running on stdio');
153
+ console.error('[DeepSeek MCP] Available tools: deepseek_chat');
154
+ }
155
+ // Error handling
156
+ process.on('uncaughtException', (error) => {
157
+ console.error('[DeepSeek MCP] Uncaught exception:', error);
158
+ process.exit(1);
159
+ });
160
+ process.on('unhandledRejection', (reason, promise) => {
161
+ console.error('[DeepSeek MCP] Unhandled rejection at:', promise, 'reason:', reason);
162
+ process.exit(1);
163
+ });
164
+ // Start the server
165
+ main().catch((error) => {
166
+ console.error('[DeepSeek MCP] Fatal error:', error);
167
+ process.exit(1);
168
+ });
169
+ //# sourceMappingURL=index.js.map
@@ -0,0 +1 @@
1
+ {"version":3,"file":"index.js","sourceRoot":"","sources":["../src/index.ts"],"names":[],"mappings":";AAEA;;;;;;GAMG;AAEH,OAAO,EAAE,SAAS,EAAE,MAAM,yCAAyC,CAAC;AACpE,OAAO,EAAE,oBAAoB,EAAE,MAAM,2CAA2C,CAAC;AACjF,OAAO,EAAE,CAAC,EAAE,MAAM,KAAK,CAAC;AACxB,OAAO,EAAE,cAAc,EAAE,MAAM,sBAAsB,CAAC;AAGtD,wCAAwC;AACxC,MAAM,gBAAgB,GAAG,OAAO,CAAC,GAAG,CAAC,gBAAgB,CAAC;AAEtD,IAAI,CAAC,gBAAgB,EAAE,CAAC;IACtB,OAAO,CAAC,KAAK,CAAC,yDAAyD,CAAC,CAAC;IACzE,OAAO,CAAC,KAAK,CAAC,mCAAmC,CAAC,CAAC;IACnD,OAAO,CAAC,KAAK,CAAC,+CAA+C,CAAC,CAAC;IAC/D,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;AAClB,CAAC;AAED,6BAA6B;AAC7B,MAAM,QAAQ,GAAG,IAAI,cAAc,CAAC,gBAAgB,CAAC,CAAC;AAEtD,oBAAoB;AACpB,MAAM,MAAM,GAAG,IAAI,SAAS,CAAC;IAC3B,IAAI,EAAE,qBAAqB;IAC3B,OAAO,EAAE,OAAO;CACjB,CAAC,CAAC;AAEH,0CAA0C;AAC1C,MAAM,aAAa,GAAG,CAAC,CAAC,MAAM,CAAC;IAC7B,IAAI,EAAE,CAAC,CAAC,IAAI,CAAC,CAAC,QAAQ,EAAE,MAAM,EAAE,WAAW,CAAC,CAAC;IAC7C,OAAO,EAAE,CAAC,CAAC,MAAM,EAAE;CACpB,CAAC,CAAC;AAEH,MAAM,eAAe,GAAG,CAAC,CAAC,MAAM,CAAC;IAC/B,QAAQ,EAAE,CAAC,CAAC,KAAK,CAAC,aAAa,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC;IACvC,KAAK,EAAE,CAAC,CAAC,IAAI,CAAC,CAAC,eAAe,EAAE,mBAAmB,CAAC,CAAC,CAAC,OAAO,CAAC,eAAe,CAAC;IAC9E,WAAW,EAAE,CAAC,CAAC,MAAM,EAAE,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,QAAQ,EAAE;IAChD,UAAU,EAAE,CAAC,CAAC,MAAM,EAAE,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,KAAK,CAAC,CAAC,QAAQ,EAAE;IACnD,MAAM,EAAE,CAAC,CAAC,OAAO,EAAE,CAAC,QAAQ,EAAE,CAAC,OAAO,CAAC,KAAK,CAAC;CAC9C,CAAC,CAAC;AAEH;;;;;GAKG;AACH,MAAM,CAAC,YAAY,CACjB,eAAe,EACf;IACE,KAAK,EAAE,0BAA0B;IACjC,WAAW,EACT,qFAAqF;QACrF,yFAAyF;QACzF,mGAAmG;IACrG,WAAW,EAAE;QACX,QAAQ,EAAE,CAAC,CAAC,KAAK,CAAC,aAAa,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,QAAQ,CAAC,gCAAgC,CAAC;QAClF,KAAK,EAAE,CAAC,CAAC,IAAI,CAAC,CAAC,eAAe,EAAE,mBAAmB,CAAC,CAAC;aAClD,OAAO,CAAC,eAAe,CAAC;aACxB,QAAQ,CAAC,wFAAwF,CAAC;QACrG,WAAW,EAAE,CAAC,CAAC,MAAM,EAAE;aACpB,GAAG,CAAC,CAAC,CAAC;aACN,GAAG,CAAC,CAAC,CAAC;aACN,QAAQ,EAAE;aACV,QAAQ,CAAC,gEAAgE,CAAC;QAC7E,UAAU,EAAE,CAAC,CAAC,MAAM,EAAE;aACnB,GAAG,CAAC,CAAC,CAAC;aACN,GAAG,CAAC,KAAK,CAAC;aACV
,QAAQ,EAAE;aACV,QAAQ,CAAC,oDAAoD,CAAC;QACjE,MAAM,EAAE,CAAC,CAAC,OAAO,EAAE;aAChB,QAAQ,EAAE;aACV,OAAO,CAAC,KAAK,CAAC;aACd,QAAQ,CAAC,yEAAyE,CAAC;KACvF;IACD,YAAY,EAAE;QACZ,OAAO,EAAE,CAAC,CAAC,MAAM,EAAE;QACnB,iBAAiB,EAAE,CAAC,CAAC,MAAM,EAAE,CAAC,QAAQ,EAAE;QACxC,KAAK,EAAE,CAAC,CAAC,MAAM,EAAE;QACjB,KAAK,EAAE,CAAC,CAAC,MAAM,CAAC;YACd,aAAa,EAAE,CAAC,CAAC,MAAM,EAAE;YACzB,iBAAiB,EAAE,CAAC,CAAC,MAAM,EAAE;YAC7B,YAAY,EAAE,CAAC,CAAC,MAAM,EAAE;SACzB,CAAC;QACF,aAAa,EAAE,CAAC,CAAC,MAAM,EAAE;KAC1B;CACF,EACD,KAAK,EAAE,KAAwB,EAAE,EAAE;IACjC,IAAI,CAAC;QACH,iBAAiB;QACjB,MAAM,SAAS,GAAG,eAAe,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC;QAE/C,OAAO,CAAC,KAAK,CACX,iCAAiC,SAAS,CAAC,KAAK,cAAc,SAAS,CAAC,QAAQ,CAAC,MAAM,YAAY,SAAS,CAAC,MAAM,EAAE,CACtH,CAAC;QAEF,oDAAoD;QACpD,MAAM,QAAQ,GAAG,SAAS,CAAC,MAAM;YAC/B,CAAC,CAAC,MAAM,QAAQ,CAAC,6BAA6B,CAAC;gBAC3C,KAAK,EAAE,SAAS,CAAC,KAAK;gBACtB,QAAQ,EAAE,SAAS,CAAC,QAAQ;gBAC5B,WAAW,EAAE,SAAS,CAAC,WAAW;gBAClC,UAAU,EAAE,SAAS,CAAC,UAAU;aACjC,CAAC;YACJ,CAAC,CAAC,MAAM,QAAQ,CAAC,oBAAoB,CAAC;gBAClC,KAAK,EAAE,SAAS,CAAC,KAAK;gBACtB,QAAQ,EAAE,SAAS,CAAC,QAAQ;gBAC5B,WAAW,EAAE,SAAS,CAAC,WAAW;gBAClC,UAAU,EAAE,SAAS,CAAC,UAAU;aACjC,CAAC,CAAC;QAEP,OAAO,CAAC,KAAK,CACX,mCAAmC,QAAQ,CAAC,KAAK,CAAC,YAAY,mBAAmB,QAAQ,CAAC,aAAa,EAAE,CAC1G,CAAC;QAEF,kBAAkB;QAClB,IAAI,YAAY,GAAG,EAAE,CAAC;QAEtB,6DAA6D;QAC7D,IAAI,QAAQ,CAAC,iBAAiB,EAAE,CAAC;YAC/B,YAAY,IAAI,eAAe,QAAQ,CAAC,iBAAiB,mBAAmB,CAAC;QAC/E,CAAC;QAED,YAAY,IAAI,QAAQ,CAAC,OAAO,CAAC;QAEjC,kBAAkB;QAClB,YAAY,IAAI,uBAAuB,QAAQ,CAAC,KAAK,IAAI,CAAC;QAC1D,YAAY,IAAI,eAAe,QAAQ,CAAC,KAAK,CAAC,aAAa,aAAa,QAAQ,CAAC,KAAK,CAAC,iBAAiB,iBAAiB,QAAQ,CAAC,KAAK,CAAC,YAAY,QAAQ,CAAC;QAE7J,OAAO;YACL,OAAO,EAAE;gBACP;oBACE,IAAI,EAAE,MAAM;oBACZ,IAAI,EAAE,YAAY;iBACnB;aACF;YACD,iBAAiB,EAAE,QAA8C;SAClE,CAAC;IACJ,CAAC;IAAC,OAAO,KAAU,EAAE,CAAC;QACpB,OAAO,CAAC,KAAK,CAAC,uBAAuB,EAAE,KAAK,CAAC,CAAC;QAC9C,MAAM,YAAY,GAAG,KAAK,EAAE,OAAO,IAAI,wBAAwB,CAAC;QAEhE,OAAO;YACL,OAAO,EAAE;gBACP;oBACE,IAAI,EAAE,MAAM;oBACZ,IAAI,EAAE,UAAU,YAAY,EAAE;iBAC/B;aACF;YACD,OAAO,EAAE,IAAI;SACd,CAAC;IACJ,CAAC;AAC
H,CAAC,CACF,CAAC;AAEF,oCAAoC;AACpC,KAAK,UAAU,IAAI;IACjB,OAAO,CAAC,KAAK,CAAC,mCAAmC,CAAC,CAAC;IAEnD,kBAAkB;IAClB,OAAO,CAAC,KAAK,CAAC,0CAA0C,CAAC,CAAC;IAC1D,MAAM,WAAW,GAAG,MAAM,QAAQ,CAAC,cAAc,EAAE,CAAC;IAEpD,IAAI,CAAC,WAAW,EAAE,CAAC;QACjB,OAAO,CAAC,KAAK,CAAC,2DAA2D,CAAC,CAAC;QAC3E,OAAO,CAAC,KAAK,CAAC,kEAAkE,CAAC,CAAC;IACpF,CAAC;SAAM,CAAC;QACN,OAAO,CAAC,KAAK,CAAC,0CAA0C,CAAC,CAAC;IAC5D,CAAC;IAED,6BAA6B;IAC7B,MAAM,SAAS,GAAG,IAAI,oBAAoB,EAAE,CAAC;IAC7C,MAAM,MAAM,CAAC,OAAO,CAAC,SAAS,CAAC,CAAC;IAEhC,OAAO,CAAC,KAAK,CAAC,wCAAwC,CAAC,CAAC;IACxD,OAAO,CAAC,KAAK,CAAC,+CAA+C,CAAC,CAAC;AACjE,CAAC;AAED,iBAAiB;AACjB,OAAO,CAAC,EAAE,CAAC,mBAAmB,EAAE,CAAC,KAAK,EAAE,EAAE;IACxC,OAAO,CAAC,KAAK,CAAC,oCAAoC,EAAE,KAAK,CAAC,CAAC;IAC3D,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;AAClB,CAAC,CAAC,CAAC;AAEH,OAAO,CAAC,EAAE,CAAC,oBAAoB,EAAE,CAAC,MAAM,EAAE,OAAO,EAAE,EAAE;IACnD,OAAO,CAAC,KAAK,CAAC,wCAAwC,EAAE,OAAO,EAAE,SAAS,EAAE,MAAM,CAAC,CAAC;IACpF,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;AAClB,CAAC,CAAC,CAAC;AAEH,mBAAmB;AACnB,IAAI,EAAE,CAAC,KAAK,CAAC,CAAC,KAAK,EAAE,EAAE;IACrB,OAAO,CAAC,KAAK,CAAC,6BAA6B,EAAE,KAAK,CAAC,CAAC;IACpD,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;AAClB,CAAC,CAAC,CAAC"}
package/dist/types.d.ts ADDED
@@ -0,0 +1,70 @@
1
+ /**
2
+ * DeepSeek MCP Server Types
3
+ * Type definitions for DeepSeek API integration with Model Context Protocol
4
+ */
5
+ /**
6
+ * Supported DeepSeek models
7
+ */
8
+ export type DeepSeekModel = 'deepseek-chat' | 'deepseek-reasoner';
9
+ /**
10
+ * Message role in conversation
11
+ */
12
+ export type MessageRole = 'system' | 'user' | 'assistant';
13
+ /**
14
+ * Chat message structure
15
+ */
16
+ export interface ChatMessage {
17
+ role: MessageRole;
18
+ content: string;
19
+ }
20
+ /**
21
+ * Parameters for chat completion request
22
+ */
23
+ export interface ChatCompletionParams {
24
+ model: DeepSeekModel;
25
+ messages: ChatMessage[];
26
+ temperature?: number;
27
+ max_tokens?: number;
28
+ top_p?: number;
29
+ frequency_penalty?: number;
30
+ presence_penalty?: number;
31
+ stop?: string | string[];
32
+ }
33
+ /**
34
+ * Response from DeepSeek chat completion
35
+ */
36
+ export interface ChatCompletionResponse {
37
+ content: string;
38
+ reasoning_content?: string;
39
+ model: string;
40
+ usage: {
41
+ prompt_tokens: number;
42
+ completion_tokens: number;
43
+ total_tokens: number;
44
+ };
45
+ finish_reason: string;
46
+ }
47
+ /**
48
+ * Tool input schema for deepseek_chat tool
49
+ */
50
+ export interface DeepSeekChatInput {
51
+ messages: Array<{
52
+ role: string;
53
+ content: string;
54
+ }>;
55
+ model?: 'deepseek-chat' | 'deepseek-reasoner';
56
+ temperature?: number;
57
+ max_tokens?: number;
58
+ stream?: boolean;
59
+ }
60
+ /**
61
+ * Error response structure
62
+ */
63
+ export interface DeepSeekError {
64
+ error: {
65
+ message: string;
66
+ type: string;
67
+ code: string;
68
+ };
69
+ }
70
+ //# sourceMappingURL=types.d.ts.map
package/dist/types.d.ts.map ADDED
@@ -0,0 +1 @@
1
+ {"version":3,"file":"types.d.ts","sourceRoot":"","sources":["../src/types.ts"],"names":[],"mappings":"AAAA;;;GAGG;AAEH;;GAEG;AACH,MAAM,MAAM,aAAa,GAAG,eAAe,GAAG,mBAAmB,CAAC;AAElE;;GAEG;AACH,MAAM,MAAM,WAAW,GAAG,QAAQ,GAAG,MAAM,GAAG,WAAW,CAAC;AAE1D;;GAEG;AACH,MAAM,WAAW,WAAW;IAC1B,IAAI,EAAE,WAAW,CAAC;IAClB,OAAO,EAAE,MAAM,CAAC;CACjB;AAED;;GAEG;AACH,MAAM,WAAW,oBAAoB;IACnC,KAAK,EAAE,aAAa,CAAC;IACrB,QAAQ,EAAE,WAAW,EAAE,CAAC;IACxB,WAAW,CAAC,EAAE,MAAM,CAAC;IACrB,UAAU,CAAC,EAAE,MAAM,CAAC;IACpB,KAAK,CAAC,EAAE,MAAM,CAAC;IACf,iBAAiB,CAAC,EAAE,MAAM,CAAC;IAC3B,gBAAgB,CAAC,EAAE,MAAM,CAAC;IAC1B,IAAI,CAAC,EAAE,MAAM,GAAG,MAAM,EAAE,CAAC;CAC1B;AAED;;GAEG;AACH,MAAM,WAAW,sBAAsB;IACrC,OAAO,EAAE,MAAM,CAAC;IAChB,iBAAiB,CAAC,EAAE,MAAM,CAAC;IAC3B,KAAK,EAAE,MAAM,CAAC;IACd,KAAK,EAAE;QACL,aAAa,EAAE,MAAM,CAAC;QACtB,iBAAiB,EAAE,MAAM,CAAC;QAC1B,YAAY,EAAE,MAAM,CAAC;KACtB,CAAC;IACF,aAAa,EAAE,MAAM,CAAC;CACvB;AAED;;GAEG;AACH,MAAM,WAAW,iBAAiB;IAChC,QAAQ,EAAE,KAAK,CAAC;QACd,IAAI,EAAE,MAAM,CAAC;QACb,OAAO,EAAE,MAAM,CAAC;KACjB,CAAC,CAAC;IACH,KAAK,CAAC,EAAE,eAAe,GAAG,mBAAmB,CAAC;IAC9C,WAAW,CAAC,EAAE,MAAM,CAAC;IACrB,UAAU,CAAC,EAAE,MAAM,CAAC;IACpB,MAAM,CAAC,EAAE,OAAO,CAAC;CAClB;AAED;;GAEG;AACH,MAAM,WAAW,aAAa;IAC5B,KAAK,EAAE;QACL,OAAO,EAAE,MAAM,CAAC;QAChB,IAAI,EAAE,MAAM,CAAC;QACb,IAAI,EAAE,MAAM,CAAC;KACd,CAAC;CACH"}
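The declarations in `dist/types.d.ts` above are plain structural types, so a request object can be checked entirely at compile time. The sketch below mirrors those shapes locally (the request literal itself is an illustrative assumption, not taken from the package):

```typescript
// Local mirror of the shapes declared in dist/types.d.ts above.
type DeepSeekModel = 'deepseek-chat' | 'deepseek-reasoner';
type MessageRole = 'system' | 'user' | 'assistant';

interface ChatMessage {
  role: MessageRole;
  content: string;
}

interface ChatCompletionParams {
  model: DeepSeekModel;
  messages: ChatMessage[];
  temperature?: number;
  max_tokens?: number;
}

// A well-typed request: the compiler rejects unknown roles or model names.
const params: ChatCompletionParams = {
  model: 'deepseek-reasoner',
  messages: [
    { role: 'system', content: 'You are a concise assistant.' },
    { role: 'user', content: 'Summarize the MCP handshake.' },
  ],
  temperature: 0.7,
};
```

Because the union types are literal strings, a typo such as `role: 'users'` fails type-checking rather than surfacing as a runtime API error.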
package/dist/types.js ADDED
@@ -0,0 +1,6 @@
1
+ /**
2
+ * DeepSeek MCP Server Types
3
+ * Type definitions for DeepSeek API integration with Model Context Protocol
4
+ */
5
+ export {};
6
+ //# sourceMappingURL=types.js.map
package/dist/types.js.map ADDED
@@ -0,0 +1 @@
1
+ {"version":3,"file":"types.js","sourceRoot":"","sources":["../src/types.ts"],"names":[],"mappings":"AAAA;;;GAGG"}
package/package.json ADDED
@@ -0,0 +1,63 @@
1
+ {
2
+ "name": "@arikusi/deepseek-mcp-server",
3
+ "version": "1.0.0",
4
+ "description": "MCP Server for DeepSeek API integration - enables Claude Code to use DeepSeek Chat and Reasoner models",
5
+ "main": "dist/index.js",
6
+ "type": "module",
7
+ "bin": {
8
+ "deepseek-mcp-server": "dist/index.js"
9
+ },
10
+ "scripts": {
11
+ "build": "tsc",
12
+ "watch": "tsc --watch",
13
+ "prepare": "npm run build",
14
+ "start": "node dist/index.js",
15
+ "test": "echo \"No tests yet\" && exit 0",
16
+ "lint": "tsc --noEmit",
17
+ "prepublishOnly": "npm run build"
18
+ },
19
+ "keywords": [
20
+ "mcp",
21
+ "model-context-protocol",
22
+ "deepseek",
23
+ "deepseek-api",
24
+ "deepseek-chat",
25
+ "deepseek-reasoner",
26
+ "deepseek-r1",
27
+ "claude-code",
28
+ "ai",
29
+ "llm",
30
+ "openai-compatible"
31
+ ],
32
+ "author": "arikusi",
33
+ "license": "MIT",
34
+ "repository": {
35
+ "type": "git",
36
+ "url": "git+https://github.com/arikusi/deepseek-mcp-server.git"
37
+ },
38
+ "bugs": {
39
+ "url": "https://github.com/arikusi/deepseek-mcp-server/issues"
40
+ },
41
+ "homepage": "https://github.com/arikusi/deepseek-mcp-server#readme",
42
+ "publishConfig": {
43
+ "access": "public"
44
+ },
45
+ "files": [
46
+ "dist",
47
+ "README.md",
48
+ "LICENSE",
49
+ "CHANGELOG.md"
50
+ ],
51
+ "dependencies": {
52
+ "@modelcontextprotocol/sdk": "^1.0.4",
53
+ "openai": "^4.77.3",
54
+ "zod": "^3.24.1"
55
+ },
56
+ "devDependencies": {
57
+ "@types/node": "^22.10.5",
58
+ "typescript": "^5.7.3"
59
+ },
60
+ "engines": {
61
+ "node": ">=18.0.0"
62
+ }
63
+ }
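Given the `bin` entry above, the server can be launched with `npx`. A hypothetical MCP client registration (e.g. in a Claude Desktop `claude_desktop_config.json`) might look like the sketch below; the `DEEPSEEK_API_KEY` variable name is an assumption based on the changelog's "environment variable configuration for API key" note — consult the package README for the exact name:

```json
{
  "mcpServers": {
    "deepseek": {
      "command": "npx",
      "args": ["-y", "@arikusi/deepseek-mcp-server"],
      "env": {
        "DEEPSEEK_API_KEY": "sk-..."
      }
    }
  }
}
```

The `engines` field above requires Node.js 18 or later, so the client's `npx` must resolve to a sufficiently recent Node installation.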