@cainmaila/gemini-cli-mcp 1.0.0 → 1.0.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +64 -20
  2. package/README.zh-TW.md +63 -19
  3. package/package.json +1 -1
package/README.md CHANGED
@@ -1,22 +1,23 @@
  <div align="center">
  <img src="./assets/banner.svg" alt="Gemini CLI MCP Banner" width="100%" />
 
- # 🤖 Gemini CLI MCP Server
+ # 🤖 Gemini CLI MCP Server
 
- *Seamless AI-to-AI Delegation via Local Gemini CLI*
+ _Seamless AI-to-AI Delegation via Local Gemini CLI_
 
- [![npm version](https://img.shields.io/npm/v/@cainmaila/gemini-cli-mcp.svg?style=flat-square)](https://www.npmjs.org/package/@cainmaila/gemini-cli-mcp)
- [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg?style=flat-square)](https://opensource.org/licenses/MIT)
- [![Node.js Version](https://img.shields.io/node/v/gemini-cli-mcp.svg?style=flat-square)](#requirements)
+ [![npm version](https://img.shields.io/npm/v/@cainmaila/gemini-cli-mcp.svg?style=flat-square)](https://www.npmjs.org/package/@cainmaila/gemini-cli-mcp)
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg?style=flat-square)](https://opensource.org/licenses/MIT)
+ [![Node.js Version](https://img.shields.io/node/v/gemini-cli-mcp.svg?style=flat-square)](#requirements)
+
+ [**English**](./README.md) · [**繁體中文**](./README.zh-TW.md)
 
- [**English**](./README.md) · [**繁體中文**](./README.zh-TW.md)
  </div>
 
  ---
 
  ## 🌟 Why Gemini CLI MCP?
 
- `gemini-cli-mcp` is an advanced Model Context Protocol (MCP) server that empowers your AI assistants by delegating complex tasks to your locally installed Gemini CLI.
+ `gemini-cli-mcp` is an advanced Model Context Protocol (MCP) server that empowers your AI assistants by delegating complex tasks to your locally installed Gemini CLI.
 
  Rather than collapsing failures into generic errors, this server returns **structured results along with execution metadata**, making it an essential tool for robust AI-to-AI handoffs and deep debugging.
 
@@ -32,16 +33,25 @@ Rather than collapsing failures into generic errors, this server returns **struc
 
  ## 📦 Installation
 
- Getting started is quick and easy. Ensure you have [Node.js 18.18+](https://nodejs.org/) installed along with a configured local Gemini CLI.
+ For normal usage, install the published package globally or run it through your package manager. Ensure you have [Node.js 18.18+](https://nodejs.org/) installed along with a configured local Gemini CLI.
 
  ```bash
- # Install dependencies
- npm install
+ # npm
+ npm install -g @cainmaila/gemini-cli-mcp
+
+ # pnpm
+ pnpm add -g @cainmaila/gemini-cli-mcp
 
- # Build the project
- npm run build
+ # one-off execution without a global install
+ npx -y @cainmaila/gemini-cli-mcp
  ```
 
+ The published package name is `@cainmaila/gemini-cli-mcp`, but the installed executable name is `gemini-cli-mcp`.
+
+ If you use `pnpm add -g`, make sure your MCP client can see your `PNPM_HOME`/global bin directory. Some desktop MCP clients do not inherit the same `PATH` as your interactive shell, which can make `gemini-cli-mcp` look missing even though the package is installed.
+
+ If you want to work on this repository itself instead of using the published package, use the local development flow in [DEVELOPMENT.md](DEVELOPMENT.md).
+
 
  ---
 
  ## 🚀 Quick Start & Usage
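The `PATH` caveat added in this hunk is easy to verify from a shell. A minimal sketch, assuming a POSIX shell; `gemini-cli-mcp` is the bin name the package installs:

```shell
# Check whether the executable an MCP client would spawn is resolvable.
# The package is @cainmaila/gemini-cli-mcp, but the installed bin is gemini-cli-mcp.
if command -v gemini-cli-mcp >/dev/null 2>&1; then
  echo "gemini-cli-mcp resolves to: $(command -v gemini-cli-mcp)"
else
  echo "gemini-cli-mcp is not on PATH; check your npm/pnpm global bin directory"
fi
```

If this succeeds in your terminal but a desktop MCP client still reports the command as missing, the client is launching with a different `PATH` than your interactive shell.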
@@ -51,25 +61,48 @@ npm run build
  Since this is an MCP server, it is designed to communicate over `stdio` and should be launched by your MCP client.
 
  ```bash
- node build/index.js
+ gemini-cli-mcp
  ```
 
  ### Client Configuration Example
 
  Add the following to your AI assistant's MCP configuration:
 
+ ```json
+ {
+   "mcpServers": {
+     "gemini-cli": {
+       "command": "gemini-cli-mcp"
+     }
+   }
+ }
+ ```
+
+ If your MCP client cannot resolve global binaries reliably, use one of these alternatives instead:
+
+ ```json
+ {
+   "mcpServers": {
+     "gemini-cli": {
+       "command": "npx",
+       "args": ["-y", "@cainmaila/gemini-cli-mcp"]
+     }
+   }
+ }
+ ```
+
  ```json
  {
    "mcpServers": {
      "gemini-cli": {
        "command": "node",
-       "args": ["/absolute/path/to/gemini-cli-mcp/build/index.js"]
+       "args": ["/absolute/path/to/installed/package/build/index.js"]
      }
    }
  }
  ```
 
- If you package or install this elsewhere, point the client to the built entry file at `build/index.js`.
+ Running `gemini-cli-mcp` directly in a terminal is only useful as a smoke test. It will wait for MCP traffic over `stdio`, so it may look idle until a client connects.
 
  ---
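As a by-hand illustration of that smoke test: the first message an MCP client sends over `stdio` is a JSON-RPC 2.0 `initialize` request. The sketch below only prints such a request; the `protocolVersion` value is an assumption taken from the MCP specification, not from this diff:

```shell
# Print a minimal MCP initialize request (one JSON-RPC message per line).
# Piping this into gemini-cli-mcp should produce an initialize response,
# rather than the silent idle seen when launching it with no client attached.
printf '%s\n' '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}'
```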
 
@@ -81,6 +114,7 @@ If you package or install this elsewhere, point the client to the built entry fi
  Best for upstream AI systems. Hands off a task to the Gemini CLI and returns a ready-to-use answer. Auto-applies edit approvals when necessary!
 
  **Input Example:**
+
  ```json
  {
    "task": "Query today's weather in Taipei and provide a short summary.",
@@ -90,6 +124,7 @@ Best for upstream AI systems. Hands off a task to the Gemini CLI and returns a r
  ```
 
  **Output Example:**
+
  ```json
  {
    "answer": "...",
@@ -100,6 +135,7 @@ Best for upstream AI systems. Hands off a task to the Gemini CLI and returns a r
    "elapsedMs": 23053
  }
  ```
+
  </details>
 
  <details>
@@ -108,6 +144,7 @@ Best for upstream AI systems. Hands off a task to the Gemini CLI and returns a r
  A lower-level interface designed for callers who demand exact prompt control.
 
  **Input Example:**
+
  ```json
  {
    "prompt": "Summarize the current repository",
@@ -117,6 +154,7 @@ A lower-level interface designed for callers who demand exact prompt control.
  ```
 
  **Output Example:**
+
  ```json
  {
    "ok": true,
@@ -127,6 +165,7 @@ A lower-level interface designed for callers who demand exact prompt control.
    "elapsedMs": 1532
  }
  ```
+
  </details>
 
  <details>
@@ -135,6 +174,7 @@ A lower-level interface designed for callers who demand exact prompt control.
  Flawless image-generation backed by the local `nanobanana` extension. Bypasses interactive prompts automatically!
 
  **Input Example:**
+
  ```json
  {
    "prompt": "a cute orange cat portrait, clean light background",
@@ -144,6 +184,7 @@ Flawless image-generation backed by the local `nanobanana` extension. Bypasses i
  ```
 
  **Output Example:**
+
  ```json
  {
    "ok": true,
@@ -154,6 +195,7 @@ Flawless image-generation backed by the local `nanobanana` extension. Bypasses i
    "elapsedMs": 12000
  }
  ```
+
  </details>
 
  <details>
@@ -162,12 +204,14 @@ Flawless image-generation backed by the local `nanobanana` extension. Bypasses i
  Discover your AI environment's capabilities on the fly. Returns top-level commands, installed extensions, available skills, and configured MCP servers.
 
  **Input Example:**
+
  ```json
  {
    "includeModelReportedTools": true,
    "timeoutMs": 60000
  }
  ```
+
  </details>
 
  ---
@@ -176,11 +220,11 @@ Discover your AI environment's capabilities on the fly. Returns top-level comman
 
  Tailor the server to your specific environment simply by setting these variables:
 
- | Variable | Description | Default |
- |----------|-------------|---------|
- | `GEMINI_CLI_PATH` | Path to the Gemini executable | `gemini` |
- | `GEMINI_PROMPT_FLAG` | The flag used for passing prompts | `-p` |
- | `GEMINI_MODEL_FLAG` | The flag used to specify the model | `--model` |
+ | Variable             | Description                        | Default   |
+ | -------------------- | ---------------------------------- | --------- |
+ | `GEMINI_CLI_PATH`    | Path to the Gemini executable      | `gemini`  |
+ | `GEMINI_PROMPT_FLAG` | The flag used for passing prompts  | `-p`      |
+ | `GEMINI_MODEL_FLAG`  | The flag used to specify the model | `--model` |
 
  ---
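The variables in the table above are plain environment variables read at launch, so they can be set in the shell (or in the MCP client's environment block) before starting the server. A hedged example; the binary path shown is illustrative, not from this diff:

```shell
# Illustrative overrides; defaults are gemini / -p / --model per the table above.
export GEMINI_CLI_PATH="$HOME/tools/gemini"  # example custom Gemini CLI location
export GEMINI_PROMPT_FLAG="-p"
export GEMINI_MODEL_FLAG="--model"
echo "Gemini CLI binary: $GEMINI_CLI_PATH"
```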
 
package/README.zh-TW.md CHANGED
@@ -1,15 +1,16 @@
  <div align="center">
  <img src="./assets/banner.svg" alt="Gemini CLI MCP Banner" width="100%" />
 
- # 🤖 Gemini CLI MCP Server
+ # 🤖 Gemini CLI MCP Server
 
- *輕量、無縫的 AI 本地代理協作工具*
+ _輕量、無縫的 AI 本地代理協作工具_
 
- [![npm version](https://img.shields.io/npm/v/gemini-cli-mcp.svg?style=flat-square)](https://www.npmjs.org/package/gemini-cli-mcp)
- [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg?style=flat-square)](https://opensource.org/licenses/MIT)
- [![Node.js Version](https://img.shields.io/node/v/gemini-cli-mcp.svg?style=flat-square)](#要求條件)
+ [![npm version](https://img.shields.io/npm/v/gemini-cli-mcp.svg?style=flat-square)](https://www.npmjs.org/package/gemini-cli-mcp)
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg?style=flat-square)](https://opensource.org/licenses/MIT)
+ [![Node.js Version](https://img.shields.io/node/v/gemini-cli-mcp.svg?style=flat-square)](#要求條件)
+
+ [**English**](./README.md) · [**繁體中文**](./README.zh-TW.md)
 
- [**English**](./README.md) · [**繁體中文**](./README.zh-TW.md)
  </div>
 
  ---
@@ -32,16 +33,25 @@
 
  ## 📦 安裝說明
 
- 開始使用非常簡單。請確認你已安裝了 [Node.js 18.18+](https://nodejs.org/) 以及配置好並能自動運行的本地端 Gemini CLI 驗證。
+ 一般使用情境下,應直接安裝已發佈的 npm 套件,或透過套件管理器即時執行。請先確認你已安裝 [Node.js 18.18+](https://nodejs.org/) 並完成本地 Gemini CLI 驗證設定。
 
  ```bash
- # 安裝相依套件
- npm install
+ # npm 全域安裝
+ npm install -g @cainmaila/gemini-cli-mcp
+
+ # pnpm 全域安裝
+ pnpm add -g @cainmaila/gemini-cli-mcp
 
- # 編譯專案
- npm run build
+ # 不做全域安裝,直接執行
+ npx -y @cainmaila/gemini-cli-mcp
  ```
 
+ 發佈到 npm 的套件名稱是 `@cainmaila/gemini-cli-mcp`,但安裝後實際可執行的命令名稱是 `gemini-cli-mcp`。
+
+ 如果你使用 `pnpm add -g`,要特別確認 MCP 客戶端啟動時看得到你的 `PNPM_HOME` 或全域 bin 目錄。許多桌面型 MCP client 不會沿用互動式 shell 的 `PATH`,因此即使安裝成功,也可能出現找不到 `gemini-cli-mcp` 的錯誤。
+
+ 如果你的目的不是使用已發佈套件,而是要開發這個 repository 本身,請改看 [DEVELOPMENT.md](DEVELOPMENT.md) 的本地開發流程。
+
 
  ---
 
  ## 🚀 快速上手與使用方法
@@ -51,25 +61,48 @@
  由於這是一個 MCP 伺服器,它被設計成透過 `stdio` 溝通,並且應該由你的 MCP 客戶端應用程式來啟動:
 
  ```bash
- node build/index.js
+ gemini-cli-mcp
  ```
 
  ### MCP 客戶端設定範例
 
  將以下配置加入你所使用的 AI 助理之 MCP 設定檔中:
 
+ ```json
+ {
+   "mcpServers": {
+     "gemini-cli": {
+       "command": "gemini-cli-mcp"
+     }
+   }
+ }
+ ```
+
+ 如果你的 MCP client 對全域命令的 PATH 解析不穩定,建議改用以下其中一種寫法:
+
+ ```json
+ {
+   "mcpServers": {
+     "gemini-cli": {
+       "command": "npx",
+       "args": ["-y", "@cainmaila/gemini-cli-mcp"]
+     }
+   }
+ }
+ ```
+
  ```json
  {
    "mcpServers": {
      "gemini-cli": {
        "command": "node",
-       "args": ["/絕對路徑/到/你的/gemini-cli-mcp/build/index.js"]
+       "args": ["/已安裝套件的絕對路徑/build/index.js"]
      }
    }
  }
  ```
 
- 如果將此伺服器打包或安裝在其他地方,請確保用戶端指向編譯好的 `build/index.js`。
+ 若你直接在終端機執行 `gemini-cli-mcp`,那通常只能當 smoke test。它會持續等待來自 MCP client 的 `stdio` 通訊,所以看起來像「沒反應」其實是正常行為。
 
  ---
 
@@ -81,6 +114,7 @@ node build/index.js
  這是上游 AI 系統的最佳預設工具。當你需要 Gemini CLI 幫你完成一項任務並直接回傳解答時,選這個就對了!如果任務牽涉修改檔案,伺服器還會自動啟用 Auto-Edit 核准模式。
 
  **輸入範例:**
+
  ```json
  {
    "task": "請查詢台北市今天的天氣,並用繁體中文簡短回答:天氣概況、溫度、降雨機率、以及一個外出建議。",
@@ -90,6 +124,7 @@ node build/index.js
  ```
 
  **輸出範例:**
+
  ```json
  {
    "answer": "台北今天多雲到晴,約 11°C 至 19°C,降雨機率低,建議早晚加外套。",
@@ -100,6 +135,7 @@ node build/index.js
    "elapsedMs": 23053
  }
  ```
+
  </details>
 
  <details>
@@ -108,6 +144,7 @@ node build/index.js
  專為需要精確控制 Prompt 參數的呼叫者所準備的底層介面。
 
  **輸入範例:**
+
  ```json
  {
    "prompt": "Summarize the current repository",
@@ -117,6 +154,7 @@ node build/index.js
  ```
 
  **輸出範例:**
+
  ```json
  {
    "ok": true,
@@ -127,6 +165,7 @@ node build/index.js
    "elapsedMs": 1532
  }
  ```
+
  </details>
 
  <details>
@@ -135,6 +174,7 @@ node build/index.js
  支援本地 `nanobanana` 擴充套件的穩健圖片生成介面。預設會啟用 YOLO 模式以自動繞過互動式認證提示,實現全自動生成!
 
  **輸入範例:**
+
  ```json
  {
    "prompt": "一隻可愛的橘貓肖像,乾淨明亮的背景",
@@ -144,6 +184,7 @@ node build/index.js
  ```
 
  **輸出範例:**
+
  ```json
  {
    "ok": true,
@@ -154,6 +195,7 @@ node build/index.js
    "elapsedMs": 12000
  }
  ```
+
  </details>
 
  <details>
@@ -162,12 +204,14 @@ node build/index.js
  用來動態探索本地 Gemini CLI 環境的利器。它會掃描並回報頂層指令、擴充套件、可用 Skills 以及所有的 MCP 伺服器狀態。
 
  **輸入範例:**
+
  ```json
  {
    "includeModelReportedTools": true,
    "timeoutMs": 60000
  }
  ```
+
  </details>
 
  ---
@@ -176,11 +220,11 @@ node build/index.js
 
  若需要客製化伺服器行為,你可以設定以下環境變數:
 
- | 變數名稱 | 說明 | 預設值 |
- |----------|-------------|---------|
- | `GEMINI_CLI_PATH` | 覆蓋 Gemini CLI 執行檔路徑 | `gemini` |
- | `GEMINI_PROMPT_FLAG` | 覆蓋傳遞 Prompt 使用的 Flag | `-p` |
- | `GEMINI_MODEL_FLAG` | 覆蓋指定外部模型的 Flag | `--model` |
+ | 變數名稱             | 說明                        | 預設值    |
+ | -------------------- | --------------------------- | --------- |
+ | `GEMINI_CLI_PATH`    | 覆蓋 Gemini CLI 執行檔路徑  | `gemini`  |
+ | `GEMINI_PROMPT_FLAG` | 覆蓋傳遞 Prompt 使用的 Flag | `-p`      |
+ | `GEMINI_MODEL_FLAG`  | 覆蓋指定外部模型的 Flag     | `--model` |
 
  ---
 
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@cainmaila/gemini-cli-mcp",
-   "version": "1.0.0",
+   "version": "1.0.1",
    "description": "MCP server that delegates prompt execution to a locally installed Gemini CLI",
    "type": "module",
    "bin": {