converse-mcp-server 1.5.0 → 1.5.2

This diff shows the changes between publicly released versions of this package, as they appear in their respective public registries, and is provided for informational purposes only.
package/LICENSE ADDED
@@ -0,0 +1,19 @@
+ Copyright 2025 Converse MCP Server Contributors
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
package/README.md CHANGED
@@ -1,20 +1,22 @@
  # Converse MCP Server

+ [![npm version](https://img.shields.io/npm/v/converse-mcp-server.svg)](https://www.npmjs.com/package/converse-mcp-server)
+
  A simplified, functional Node.js implementation of an MCP (Model Context Protocol) server with chat and consensus tools. Built with modern Node.js practices and official SDKs for seamless AI provider integration.

  ## 🚀 Quick Start

- ### Option 1: Direct from GitHub (Recommended)
+ ### Option 1: Direct from NPM (Recommended)

  ```bash
  # Using npx (recommended)
- npx FallDownTheSystem/converse
+ npx converse-mcp-server

  # Using pnpm dlx (alternative)
- pnpm dlx FallDownTheSystem/converse
+ pnpm dlx converse-mcp-server

  # Using yarn dlx (alternative)
- yarn dlx FallDownTheSystem/converse
+ yarn dlx converse-mcp-server
  ```

  ### Option 2: Clone and Install
@@ -63,7 +65,6 @@ LOG_LEVEL=info
  MAX_MCP_OUTPUT_TOKENS=200000

  # Optional: Provider-specific settings
- GOOGLE_LOCATION=us-central1
  XAI_BASE_URL=https://api.x.ai/v1
  OPENROUTER_REFERER=https://github.com/FallDownTheSystem/converse
  ```
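The `MAX_MCP_OUTPUT_TOKENS` variable above caps the server's response size and, per the configuration table later in this diff, defaults to 25000. A minimal sketch of how such a value might be parsed with a safe fallback (the helper name `readTokenLimit` is illustrative, not the package's actual code):

```javascript
// Illustrative sketch: parse MAX_MCP_OUTPUT_TOKENS with a safe fallback.
// The helper name and structure are assumptions; only the 25000 default
// and the variable name come from this diff.
function readTokenLimit(env = process.env) {
  const parsed = Number.parseInt(env.MAX_MCP_OUTPUT_TOKENS ?? '', 10);
  // Fall back to the documented default when unset or malformed.
  return Number.isFinite(parsed) && parsed > 0 ? parsed : 25000;
}

console.log(readTokenLimit({ MAX_MCP_OUTPUT_TOKENS: '200000' })); // 200000
console.log(readTokenLimit({})); // 25000
```

Guarding against a malformed value keeps the server from silently adopting `NaN` as a limit when the variable is set but not numeric.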
@@ -91,31 +92,7 @@ There are several ways to add the Converse MCP Server to Claude:
  "mcpServers": {
  "converse": {
  "command": "npx",
- "args": ["FallDownTheSystem/converse"],
- "env": {
- "OPENAI_API_KEY": "your_key_here",
- "GOOGLE_API_KEY": "your_key_here",
- "XAI_API_KEY": "your_key_here",
- "ANTHROPIC_API_KEY": "your_key_here",
- "MISTRAL_API_KEY": "your_key_here",
- "DEEPSEEK_API_KEY": "your_key_here",
- "OPENROUTER_API_KEY": "your_key_here",
- "OPENROUTER_REFERER": "https://github.com/YourUsername/YourApp",
- "MAX_MCP_OUTPUT_TOKENS": "200000"
- }
- }
- }
- }
- ```
-
- #### Option B: Using NPX with stdio transport
-
- ```json
- {
- "mcpServers": {
- "converse": {
- "command": "npx",
- "args": ["FallDownTheSystem/converse", "--transport", "stdio"],
+ "args": ["converse-mcp-server"],
  "env": {
  "OPENAI_API_KEY": "your_key_here",
  "GOOGLE_API_KEY": "your_key_here",
@@ -132,7 +109,7 @@ There are several ways to add the Converse MCP Server to Claude:
  }
  ```

- #### Option C: Direct Node.js execution
+ #### Option B: Direct Node.js execution

  ```json
  {
@@ -140,9 +117,7 @@ There are several ways to add the Converse MCP Server to Claude:
  "converse": {
  "command": "node",
  "args": [
- "C:\\Users\\YourUsername\\Documents\\Projects\\converse\\src\\index.js",
- "--transport",
- "stdio"
+ "C:\\Users\\YourUsername\\Documents\\Projects\\converse\\src\\index.js"
  ],
  "env": {
  "OPENAI_API_KEY": "your_key_here",
@@ -160,27 +135,7 @@ There are several ways to add the Converse MCP Server to Claude:
  }
  ```

- #### Option D: Using environment variable for transport
-
- ```json
- {
- "mcpServers": {
- "converse": {
- "command": "npx",
- "args": ["FallDownTheSystem/converse"],
- "env": {
- "MCP_TRANSPORT": "stdio",
- "OPENAI_API_KEY": "your_key_here",
- "GOOGLE_API_KEY": "your_key_here",
- "XAI_API_KEY": "your_key_here",
- "MAX_MCP_OUTPUT_TOKENS": "200000"
- }
- }
- }
- }
- ```
-
- #### Option E: Local HTTP Development (Advanced)
+ #### Option C: Local HTTP Development (Advanced)

  For local development with HTTP transport (optional, for debugging):

@@ -207,9 +162,16 @@ For local development with HTTP transport (optional, for debugging):
  #### Installation Steps

  1. **For Claude Code**:
- - Open the command palette (Ctrl/Cmd + Shift + P)
- - Run "Claude Code: Edit MCP Settings"
- - Add one of the configurations above
+ ```bash
+ # Add the server globally (for all projects)
+ claude mcp add converse npx converse-mcp-server -s user
+
+ # Then set your API keys
+ claude mcp set-env converse OPENAI_API_KEY=your_key_here -s user
+ claude mcp set-env converse GOOGLE_API_KEY=your_key_here -s user
+ claude mcp set-env converse XAI_API_KEY=your_key_here -s user
+ # Add other API keys as needed
+ ```

  2. **For Claude Desktop**:
  - Navigate to Settings → Developer → MCP Servers
@@ -291,7 +253,9 @@ Programmatic access to documentation:
  ### OpenAI Models
  - **o3**: Strong reasoning (200K context)
  - **o3-mini**: Fast O3 variant (200K context)
+ - **o3-pro**: Professional-grade reasoning (200K context) - EXTREMELY EXPENSIVE
  - **o4-mini**: Latest reasoning model (200K context)
+ - **gpt-4.1**: Advanced reasoning (1M context)
  - **gpt-4o**: Multimodal flagship (128K context)
  - **gpt-4o-mini**: Fast multimodal (128K context)

@@ -299,12 +263,34 @@ Programmatic access to documentation:
  - **gemini-2.5-flash** (alias: `flash`): Ultra-fast (1M context)
  - **gemini-2.5-pro** (alias: `pro`): Deep reasoning (1M context)
  - **gemini-2.0-flash**: Latest with experimental thinking
+ - **gemini-2.0-flash-lite**: Lightweight fast model, text-only

  ### X.AI/Grok Models
  - **grok-4-0709** (alias: `grok`): Latest advanced model (256K context)
  - **grok-3**: Previous generation (131K context)
  - **grok-3-fast**: Higher performance variant

+ ### Anthropic Models
+ - **claude-opus-4**: Highest intelligence with extended thinking (200K context)
+ - **claude-sonnet-4**: Balanced performance with extended thinking (200K context)
+ - **claude-3.7-sonnet**: Enhanced 3.x generation with thinking (200K context)
+ - **claude-3.5-sonnet**: Fast and intelligent (200K context)
+ - **claude-3.5-haiku**: Fastest model for simple queries (200K context)
+
+ ### Mistral Models
+ - **magistral-medium**: Frontier-class reasoning model (40K context)
+ - **magistral-small**: Small reasoning model (40K context)
+ - **mistral-medium-3**: Frontier-class multimodal model (128K context)
+
+ ### DeepSeek Models
+ - **deepseek-chat**: Strong MoE model with 671B/37B parameters (64K context)
+ - **deepseek-reasoner**: Advanced reasoning model with CoT (64K context)
+
+ ### OpenRouter Models
+ - **qwen3-235b-thinking**: Qwen3 with enhanced reasoning (32K context)
+ - **qwen3-coder**: Specialized for programming tasks (32K context)
+ - **kimi-k2**: Moonshot AI Kimi K2 with extended context (200K context)
+
  ## 🚀 Development

  ### Install from Source
@@ -342,8 +328,12 @@ npm run kill-server # Kill any server running on port 3157
  npm test # Run all tests
  npm run test:unit # Unit tests only
  npm run test:integration # Integration tests
+ npm run test:mcp-client # MCP client tests (HTTP-based client-server testing)
  npm run test:real-api # Real API tests (requires keys)
+ npm run test:providers # Provider tests
+ npm run test:tools # Tool tests
  npm run test:coverage # Coverage report
+ npm run test:watch # Run tests in watch mode

  # Code quality
  npm run lint # Check code style
@@ -396,7 +386,7 @@ XAI_API_KEY=xai-...
  npm run test:real-api

  # Run comprehensive integration tests
- node final-integration-test.js
+ node tests/integration/final-integration-test.js

  # Validate server functionality
  npm run validate
@@ -417,14 +407,14 @@ npm test
  npm run test:real-api

  # 4. Comprehensive validation
- node final-integration-test.js
+ node tests/integration/final-integration-test.js
  ```

  **Expected Results:**
  - Server starts without errors on port 3157
  - All unit tests pass
  - Real API tests connect successfully (if keys configured)
- - Integration tests achieve >70% success rate
+ - Some real API integration tests may occasionally timeout

  ## 📦 Publishing to NPM

@@ -530,7 +520,6 @@ converse/
  | `PORT` | Server port | `3157` | `3157` |
  | `LOG_LEVEL` | Logging level | `info` | `debug`, `info`, `error` |
  | `MAX_MCP_OUTPUT_TOKENS` | Token response limit | `25000` | `200000` |
- | `GOOGLE_LOCATION` | Google API region | `us-central1` | `us-central1` |
  | `XAI_BASE_URL` | XAI API endpoint | `https://api.x.ai/v1` | Custom endpoint |

  ### Model Selection
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "converse-mcp-server",
- "version": "1.5.0",
+ "version": "1.5.2",
  "description": "Converse MCP Server - Converse with other LLMs with chat and consensus tools",
  "type": "module",
  "main": "src/index.js",
@@ -151,7 +151,7 @@ function convertMessagesToMistral(messages) {
  // Convert Anthropic/Claude format to Mistral format
  mistralContent.push({
  type: 'image_url',
- image_url: `data:${item.source.media_type};base64,${item.source.data}`
+ imageUrl: `data:${item.source.media_type};base64,${item.source.data}`
  });
  debugLog(`[Mistral] Converting image: ${item.source.media_type}, data length: ${item.source.data.length}`);
  }
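The `image_url` → `imageUrl` rename above switches the content chunk to the camelCase field name used by the Mistral JavaScript SDK. A standalone sketch of just this conversion step, with an illustrative wrapper function and input shape (the surrounding `convertMessagesToMistral` logic is not reproduced here):

```javascript
// Illustrative reconstruction of the image-block conversion shown in the
// diff above: an Anthropic-style source block becomes a camelCase
// Mistral-style content chunk carrying a base64 data URL.
function convertImageBlock(item) {
  return {
    type: 'image_url',
    // Data URL assembled from the media type and base64 payload.
    imageUrl: `data:${item.source.media_type};base64,${item.source.data}`,
  };
}

const block = convertImageBlock({
  source: { media_type: 'image/png', data: 'iVBORw0KGgo=' },
});
console.log(block.imageUrl); // data:image/png;base64,iVBORw0KGgo=
```

Keeping the chunk's `type` as the string `'image_url'` while renaming the property itself is the notable detail: only the JavaScript key changed, not the protocol-level type tag.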