gemini-bridge 1.0.0.tar.gz → 1.0.2.tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,12 +1,12 @@
  Metadata-Version: 2.4
  Name: gemini-bridge
- Version: 1.0.0
+ Version: 1.0.2
  Summary: Lightweight MCP server bridging Claude Code to Google's Gemini AI via official CLI
  Author-email: Shelakh <info@shelakh.com>
  License-Expression: MIT
- Project-URL: Homepage, https://github.com/shelakh/gemini-bridge
- Project-URL: Repository, https://github.com/shelakh/gemini-bridge
- Project-URL: Issues, https://github.com/shelakh/gemini-bridge/issues
+ Project-URL: Homepage, https://github.com/eLyiN/gemini-bridge
+ Project-URL: Repository, https://github.com/eLyiN/gemini-bridge
+ Project-URL: Issues, https://github.com/eLyiN/gemini-bridge/issues
  Keywords: mcp,gemini,ai,coding,assistant
  Classifier: Development Status :: 5 - Production/Stable
  Classifier: Intended Audience :: Developers
@@ -24,13 +24,14 @@ Dynamic: license-file

  # Gemini Bridge

+ ![CI Status](https://github.com/eLyiN/gemini-bridge/actions/workflows/ci.yml/badge.svg)
  ![PyPI Version](https://img.shields.io/pypi/v/gemini-bridge)
  ![MIT License](https://img.shields.io/badge/license-MIT-blue.svg)
  ![Python 3.10+](https://img.shields.io/badge/python-3.10+-blue.svg)
  ![MCP Compatible](https://img.shields.io/badge/MCP-compatible-green.svg)
- ![CI Status](https://github.com/shelakh/gemini-bridge/actions/workflows/ci.yml/badge.svg)
+ ![Gemini CLI](https://img.shields.io/badge/Gemini-CLI-blue.svg)

- A lightweight MCP (Model Context Protocol) server that enables Claude Code to interact with Google's Gemini AI through the official CLI. Designed for simplicity, reliability, and seamless integration.
+ A lightweight MCP (Model Context Protocol) server that enables AI coding assistants to interact with Google's Gemini AI through the official CLI. Works with Claude Code, Cursor, VS Code, and other MCP-compatible clients. Designed for simplicity, reliability, and seamless integration.

  ## ✨ Features

@@ -40,6 +41,7 @@ A lightweight MCP (Model Context Protocol) server that enables Claude Code to in
  - **Production Ready**: Robust error handling with configurable 60-second timeouts
  - **Minimal Dependencies**: Only requires `mcp>=1.0.0` and Gemini CLI
  - **Easy Deployment**: Support for both uvx and traditional pip installation
+ - **Universal MCP Compatibility**: Works with any MCP-compatible AI coding assistant

  ## 🚀 Quick Start

@@ -96,6 +98,313 @@ pip install -e .
  claude mcp add gemini-bridge-dev -s user -- python -m src
  ```

+ ## 🌐 Multi-Client Support
+
+ **Gemini Bridge works with any MCP-compatible AI coding assistant** - the same server supports multiple clients through different configuration methods.
+
+ ### Supported MCP Clients
+ - **Claude Code** ✅ (Default)
+ - **Cursor** ✅
+ - **VS Code** ✅
+ - **Windsurf** ✅
+ - **Cline** ✅
+ - **Void** ✅
+ - **Cherry Studio** ✅
+ - **Augment** ✅
+ - **Roo Code** ✅
+ - **Zencoder** ✅
+ - **Any MCP-compatible client** ✅
+
+ ### Configuration Examples
+
+ <details>
+ <summary><strong>Claude Code</strong> (Default)</summary>
+
+ ```bash
+ # Recommended installation
+ claude mcp add gemini-bridge -s user -- uvx gemini-bridge
+
+ # Development installation
+ claude mcp add gemini-bridge-dev -s user -- python -m src
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Cursor</strong></summary>
+
+ **Global Configuration** (`~/.cursor/mcp.json`):
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ }
+ }
+ ```
+
+ **Project-Specific** (`.cursor/mcp.json` in your project):
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ }
+ }
+ ```
+
+ Go to: `Settings` → `Cursor Settings` → `MCP` → `Add new global MCP server`
+
+ </details>
+
+ <details>
+ <summary><strong>VS Code</strong></summary>
+
+ **Configuration** (`.vscode/mcp.json` in your workspace):
+ ```json
+ {
+ "servers": {
+ "gemini-bridge": {
+ "type": "stdio",
+ "command": "uvx",
+ "args": ["gemini-bridge"]
+ }
+ }
+ }
+ ```
+
+ **Alternative: Through Extensions**
+ 1. Open Extensions view (Ctrl+Shift+X)
+ 2. Search for MCP extensions
+ 3. Add custom server with command: `uvx gemini-bridge`
+
+ </details>
+
+ <details>
+ <summary><strong>Windsurf</strong></summary>
+
+ Add to your Windsurf MCP configuration:
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ }
+ }
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Cline</strong> (VS Code Extension)</summary>
+
+ 1. Open Cline and click **MCP Servers** in the top navigation
+ 2. Select **Installed** tab → **Advanced MCP Settings**
+ 3. Add to `cline_mcp_settings.json`:
+
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ }
+ }
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Void</strong></summary>
+
+ Go to: `Settings` → `MCP` → `Add MCP Server`
+
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ }
+ }
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Cherry Studio</strong></summary>
+
+ 1. Navigate to **Settings → MCP Servers → Add Server**
+ 2. Fill in the server details:
+ - **Name**: `gemini-bridge`
+ - **Type**: `STDIO`
+ - **Command**: `uvx`
+ - **Arguments**: `["gemini-bridge"]`
+ 3. Save the configuration
+
+ </details>
+
+ <details>
+ <summary><strong>Augment</strong></summary>
+
+ **Using the UI:**
+ 1. Click hamburger menu → **Settings** → **Tools**
+ 2. Click **+ Add MCP** button
+ 3. Enter command: `uvx gemini-bridge`
+ 4. Name: **Gemini Bridge**
+
+ **Manual Configuration:**
+ ```json
+ "augment.advanced": {
+ "mcpServers": [
+ {
+ "name": "gemini-bridge",
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ ]
+ }
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Roo Code</strong></summary>
+
+ 1. Go to **Settings → MCP Servers → Edit Global Config**
+ 2. Add to `mcp_settings.json`:
+
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ }
+ }
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Zencoder</strong></summary>
+
+ 1. Go to Zencoder menu (...) → **Tools** → **Add Custom MCP**
+ 2. Add configuration:
+
+ ```json
+ {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ ```
+
+ 3. Hit the **Install** button
+
+ </details>
+
+ <details>
+ <summary><strong>Alternative Installation Methods</strong></summary>
+
+ **For pip-based installations:**
+ ```json
+ {
+ "command": "gemini-bridge",
+ "args": [],
+ "env": {}
+ }
+ ```
+
+ **For development/local testing:**
+ ```json
+ {
+ "command": "python",
+ "args": ["-m", "src"],
+ "env": {},
+ "cwd": "/path/to/gemini-bridge"
+ }
+ ```
+
+ **For npm-style installation** (if needed):
+ ```json
+ {
+ "command": "npx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ ```
+
+ </details>
+
+ ### Universal Usage
+
+ Once configured with any client, use the same two tools:
+
+ 1. **Ask general questions**: "What authentication patterns are used in this codebase?"
+ 2. **Analyze specific files**: "Review these auth files for security issues"
+
+ **The server implementation is identical** - only the client configuration differs!
+
+ ## ⚙️ Configuration
+
+ ### Timeout Configuration
+
+ By default, Gemini Bridge uses a 60-second timeout for all CLI operations. For longer queries (large files, complex analysis), you can configure a custom timeout using the `GEMINI_BRIDGE_TIMEOUT` environment variable.
+
+ **Example configurations:**
+
+ <details>
+ <summary><strong>Claude Code</strong></summary>
+
+ ```bash
+ # Add with custom timeout (120 seconds)
+ claude mcp add gemini-bridge -s user --env GEMINI_BRIDGE_TIMEOUT=120 -- uvx gemini-bridge
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Manual Configuration (mcp_settings.json)</strong></summary>
+
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {
+ "GEMINI_BRIDGE_TIMEOUT": "120"
+ }
+ }
+ }
+ }
+ ```
+
+ </details>
+
+ **Timeout Options:**
+ - **Default**: 60 seconds (if not configured)
+ - **Range**: Any positive integer (seconds)
+ - **Recommended**: 120-300 seconds for large file analysis
+ - **Invalid values**: Fall back to 60 seconds with warning
+
  ## 🛠️ Available Tools

  ### `consult_gemini`
@@ -182,8 +491,6 @@ gemini-bridge/
  │ ├── __main__.py # Module execution entry point
  │ └── mcp_server.py # Main MCP server implementation
  ├── .github/ # GitHub templates and workflows
- ├── start_server_uvx.sh # Production startup script
- ├── start_server_dev.sh # Development startup script
  ├── pyproject.toml # Python package configuration
  ├── README.md # This file
  ├── CONTRIBUTING.md # Contribution guidelines
@@ -258,7 +565,7 @@ See [CHANGELOG.md](CHANGELOG.md) for detailed version history.

  - **Issues**: Report bugs or request features via [GitHub Issues](https://github.com/shelakh/gemini-bridge/issues)
  - **Discussions**: Join the community discussion
- - **Documentation**: Additional docs available in the `docs/` directory
+ - **Documentation**: Additional docs can be created in the `docs/` directory

  ---

@@ -1,12 +1,13 @@
  # Gemini Bridge

+ ![CI Status](https://github.com/eLyiN/gemini-bridge/actions/workflows/ci.yml/badge.svg)
  ![PyPI Version](https://img.shields.io/pypi/v/gemini-bridge)
  ![MIT License](https://img.shields.io/badge/license-MIT-blue.svg)
  ![Python 3.10+](https://img.shields.io/badge/python-3.10+-blue.svg)
  ![MCP Compatible](https://img.shields.io/badge/MCP-compatible-green.svg)
- ![CI Status](https://github.com/shelakh/gemini-bridge/actions/workflows/ci.yml/badge.svg)
+ ![Gemini CLI](https://img.shields.io/badge/Gemini-CLI-blue.svg)

- A lightweight MCP (Model Context Protocol) server that enables Claude Code to interact with Google's Gemini AI through the official CLI. Designed for simplicity, reliability, and seamless integration.
+ A lightweight MCP (Model Context Protocol) server that enables AI coding assistants to interact with Google's Gemini AI through the official CLI. Works with Claude Code, Cursor, VS Code, and other MCP-compatible clients. Designed for simplicity, reliability, and seamless integration.

  ## ✨ Features

@@ -16,6 +17,7 @@ A lightweight MCP (Model Context Protocol) server that enables Claude Code to in
  - **Production Ready**: Robust error handling with configurable 60-second timeouts
  - **Minimal Dependencies**: Only requires `mcp>=1.0.0` and Gemini CLI
  - **Easy Deployment**: Support for both uvx and traditional pip installation
+ - **Universal MCP Compatibility**: Works with any MCP-compatible AI coding assistant

  ## 🚀 Quick Start

@@ -72,6 +74,313 @@ pip install -e .
  claude mcp add gemini-bridge-dev -s user -- python -m src
  ```

+ ## 🌐 Multi-Client Support
+
+ **Gemini Bridge works with any MCP-compatible AI coding assistant** - the same server supports multiple clients through different configuration methods.
+
+ ### Supported MCP Clients
+ - **Claude Code** ✅ (Default)
+ - **Cursor** ✅
+ - **VS Code** ✅
+ - **Windsurf** ✅
+ - **Cline** ✅
+ - **Void** ✅
+ - **Cherry Studio** ✅
+ - **Augment** ✅
+ - **Roo Code** ✅
+ - **Zencoder** ✅
+ - **Any MCP-compatible client** ✅
+
+ ### Configuration Examples
+
+ <details>
+ <summary><strong>Claude Code</strong> (Default)</summary>
+
+ ```bash
+ # Recommended installation
+ claude mcp add gemini-bridge -s user -- uvx gemini-bridge
+
+ # Development installation
+ claude mcp add gemini-bridge-dev -s user -- python -m src
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Cursor</strong></summary>
+
+ **Global Configuration** (`~/.cursor/mcp.json`):
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ }
+ }
+ ```
+
+ **Project-Specific** (`.cursor/mcp.json` in your project):
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ }
+ }
+ ```
+
+ Go to: `Settings` → `Cursor Settings` → `MCP` → `Add new global MCP server`
+
+ </details>
+
+ <details>
+ <summary><strong>VS Code</strong></summary>
+
+ **Configuration** (`.vscode/mcp.json` in your workspace):
+ ```json
+ {
+ "servers": {
+ "gemini-bridge": {
+ "type": "stdio",
+ "command": "uvx",
+ "args": ["gemini-bridge"]
+ }
+ }
+ }
+ ```
+
+ **Alternative: Through Extensions**
+ 1. Open Extensions view (Ctrl+Shift+X)
+ 2. Search for MCP extensions
+ 3. Add custom server with command: `uvx gemini-bridge`
+
+ </details>
+
+ <details>
+ <summary><strong>Windsurf</strong></summary>
+
+ Add to your Windsurf MCP configuration:
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ }
+ }
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Cline</strong> (VS Code Extension)</summary>
+
+ 1. Open Cline and click **MCP Servers** in the top navigation
+ 2. Select **Installed** tab → **Advanced MCP Settings**
+ 3. Add to `cline_mcp_settings.json`:
+
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ }
+ }
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Void</strong></summary>
+
+ Go to: `Settings` → `MCP` → `Add MCP Server`
+
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ }
+ }
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Cherry Studio</strong></summary>
+
+ 1. Navigate to **Settings → MCP Servers → Add Server**
+ 2. Fill in the server details:
+ - **Name**: `gemini-bridge`
+ - **Type**: `STDIO`
+ - **Command**: `uvx`
+ - **Arguments**: `["gemini-bridge"]`
+ 3. Save the configuration
+
+ </details>
+
+ <details>
+ <summary><strong>Augment</strong></summary>
+
+ **Using the UI:**
+ 1. Click hamburger menu → **Settings** → **Tools**
+ 2. Click **+ Add MCP** button
+ 3. Enter command: `uvx gemini-bridge`
+ 4. Name: **Gemini Bridge**
+
+ **Manual Configuration:**
+ ```json
+ "augment.advanced": {
+ "mcpServers": [
+ {
+ "name": "gemini-bridge",
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ ]
+ }
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Roo Code</strong></summary>
+
+ 1. Go to **Settings → MCP Servers → Edit Global Config**
+ 2. Add to `mcp_settings.json`:
+
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ }
+ }
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Zencoder</strong></summary>
+
+ 1. Go to Zencoder menu (...) → **Tools** → **Add Custom MCP**
+ 2. Add configuration:
+
+ ```json
+ {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ ```
+
+ 3. Hit the **Install** button
+
+ </details>
+
+ <details>
+ <summary><strong>Alternative Installation Methods</strong></summary>
+
+ **For pip-based installations:**
+ ```json
+ {
+ "command": "gemini-bridge",
+ "args": [],
+ "env": {}
+ }
+ ```
+
+ **For development/local testing:**
+ ```json
+ {
+ "command": "python",
+ "args": ["-m", "src"],
+ "env": {},
+ "cwd": "/path/to/gemini-bridge"
+ }
+ ```
+
+ **For npm-style installation** (if needed):
+ ```json
+ {
+ "command": "npx",
+ "args": ["gemini-bridge"],
+ "env": {}
+ }
+ ```
+
+ </details>
+
+ ### Universal Usage
+
+ Once configured with any client, use the same two tools:
+
+ 1. **Ask general questions**: "What authentication patterns are used in this codebase?"
+ 2. **Analyze specific files**: "Review these auth files for security issues"
+
+ **The server implementation is identical** - only the client configuration differs!
+
+ ## ⚙️ Configuration
+
+ ### Timeout Configuration
+
+ By default, Gemini Bridge uses a 60-second timeout for all CLI operations. For longer queries (large files, complex analysis), you can configure a custom timeout using the `GEMINI_BRIDGE_TIMEOUT` environment variable.
+
+ **Example configurations:**
+
+ <details>
+ <summary><strong>Claude Code</strong></summary>
+
+ ```bash
+ # Add with custom timeout (120 seconds)
+ claude mcp add gemini-bridge -s user --env GEMINI_BRIDGE_TIMEOUT=120 -- uvx gemini-bridge
+ ```
+
+ </details>
+
+ <details>
+ <summary><strong>Manual Configuration (mcp_settings.json)</strong></summary>
+
+ ```json
+ {
+ "mcpServers": {
+ "gemini-bridge": {
+ "command": "uvx",
+ "args": ["gemini-bridge"],
+ "env": {
+ "GEMINI_BRIDGE_TIMEOUT": "120"
+ }
+ }
+ }
+ }
+ ```
+
+ </details>
+
+ **Timeout Options:**
+ - **Default**: 60 seconds (if not configured)
+ - **Range**: Any positive integer (seconds)
+ - **Recommended**: 120-300 seconds for large file analysis
+ - **Invalid values**: Fall back to 60 seconds with warning
+
  ## 🛠️ Available Tools

  ### `consult_gemini`
@@ -158,8 +467,6 @@ gemini-bridge/
  │ ├── __main__.py # Module execution entry point
  │ └── mcp_server.py # Main MCP server implementation
  ├── .github/ # GitHub templates and workflows
- ├── start_server_uvx.sh # Production startup script
- ├── start_server_dev.sh # Development startup script
  ├── pyproject.toml # Python package configuration
  ├── README.md # This file
  ├── CONTRIBUTING.md # Contribution guidelines
@@ -234,7 +541,7 @@ See [CHANGELOG.md](CHANGELOG.md) for detailed version history.

  - **Issues**: Report bugs or request features via [GitHub Issues](https://github.com/shelakh/gemini-bridge/issues)
  - **Discussions**: Join the community discussion
- - **Documentation**: Additional docs available in the `docs/` directory
+ - **Documentation**: Additional docs can be created in the `docs/` directory

  ---

@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

  [project]
  name = "gemini-bridge"
- version = "1.0.0"
+ version = "1.0.2"
  description = "Lightweight MCP server bridging Claude Code to Google's Gemini AI via official CLI"
  readme = "README.md"
  license = "MIT"
@@ -31,9 +31,9 @@ dependencies = [
  gemini-bridge = "src:main"

  [project.urls]
- Homepage = "https://github.com/shelakh/gemini-bridge"
- Repository = "https://github.com/shelakh/gemini-bridge"
- Issues = "https://github.com/shelakh/gemini-bridge/issues"
+ Homepage = "https://github.com/eLyiN/gemini-bridge"
+ Repository = "https://github.com/eLyiN/gemini-bridge"
+ Issues = "https://github.com/eLyiN/gemini-bridge/issues"

  [tool.setuptools.packages.find]
  where = ["."]
@@ -5,5 +5,5 @@ Version 1.0.0 - Production ready, radically simplified.

  from .mcp_server import main

- __version__ = "1.0.0"
+ __version__ = "1.0.2"
  __all__ = ["main"]
@@ -1,11 +1,12 @@
  #!/usr/bin/env python3
  """
  Gemini MCP Server - Simple CLI Bridge
- Version 1.0.0
+ Version 1.0.2
  A minimal MCP server to interface with Gemini AI via the gemini CLI.
  Created by @shelakh/elyin
  """

+ import logging
  import os
  import shutil
  import subprocess
@@ -40,6 +41,29 @@ def _normalize_model_name(model: Optional[str]) -> str:
      return "gemini-2.5-flash"


+ def _get_timeout() -> int:
+     """
+     Get the timeout value from environment variable GEMINI_BRIDGE_TIMEOUT.
+     Defaults to 60 seconds if not set or invalid.
+
+     Returns:
+         Timeout value in seconds (positive integer)
+     """
+     timeout_str = os.getenv("GEMINI_BRIDGE_TIMEOUT")
+     if not timeout_str:
+         return 60
+
+     try:
+         timeout = int(timeout_str)
+         if timeout <= 0:
+             logging.warning("Invalid GEMINI_BRIDGE_TIMEOUT value '%s' (must be positive). Using default 60 seconds.", timeout_str)
+             return 60
+         return timeout
+     except ValueError:
+         logging.warning("Invalid GEMINI_BRIDGE_TIMEOUT value '%s' (must be integer). Using default 60 seconds.", timeout_str)
+         return 60
+
+
  def execute_gemini_simple(query: str, directory: str = ".", model: Optional[str] = None) -> str:
      """
      Execute gemini CLI command for simple queries without file attachments.
@@ -65,13 +89,14 @@ def execute_gemini_simple(query: str, directory: str = ".", model: Optional[str]
      cmd = ["gemini", "-m", selected_model]

      # Execute CLI command - simple timeout, no retries
+     timeout = _get_timeout()
      try:
          result = subprocess.run(
              cmd,
              cwd=directory,
              capture_output=True,
              text=True,
-             timeout=60,
+             timeout=timeout,
              input=query
          )

@@ -81,7 +106,7 @@ def execute_gemini_simple(query: str, directory: str = ".", model: Optional[str]
              return f"Gemini CLI Error: {result.stderr.strip()}"

      except subprocess.TimeoutExpired:
-         return "Error: Gemini CLI command timed out after 60 seconds"
+         return f"Error: Gemini CLI command timed out after {timeout} seconds"
      except Exception as e:
          return f"Error executing Gemini CLI: {str(e)}"

@@ -136,13 +161,14 @@ def execute_gemini_with_files(query: str, directory: str = ".", files: Optional[
      stdin_content = "\n\n".join(file_contents) + "\n\n" + query

      # Execute CLI command - simple timeout, no retries
+     timeout = _get_timeout()
      try:
          result = subprocess.run(
              cmd,
              cwd=directory,
              capture_output=True,
              text=True,
-             timeout=60,
+             timeout=timeout,
              input=stdin_content
          )

@@ -152,7 +178,7 @@ def execute_gemini_with_files(query: str, directory: str = ".", files: Optional[
              return f"Gemini CLI Error: {result.stderr.strip()}"

      except subprocess.TimeoutExpired:
-         return "Error: Gemini CLI command timed out after 60 seconds"
+         return f"Error: Gemini CLI command timed out after {timeout} seconds"
      except Exception as e:
          return f"Error executing Gemini CLI: {str(e)}"

File without changes
File without changes
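
As a supplement to the timeout change in `mcp_server.py` above, a minimal sketch of how the new `_get_timeout()` helper resolves `GEMINI_BRIDGE_TIMEOUT`. It assumes an editable install so that `src.mcp_server` is importable; the helper is private, so this is illustrative only:

```python
# Illustrative only: exercises the _get_timeout() helper added in 1.0.2.
import os

from src.mcp_server import _get_timeout  # assumes `pip install -e .`

os.environ.pop("GEMINI_BRIDGE_TIMEOUT", None)
print(_get_timeout())  # 60 -> default when the variable is unset

os.environ["GEMINI_BRIDGE_TIMEOUT"] = "120"
print(_get_timeout())  # 120 -> a positive integer is used as-is

os.environ["GEMINI_BRIDGE_TIMEOUT"] = "abc"
print(_get_timeout())  # 60 -> invalid values log a warning and fall back to the default
```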