hasina-gemini-cli 1.0.0

package/.env.example ADDED
@@ -0,0 +1,4 @@
1
+ GEMINI_API_KEY=your_gemini_api_key_here
2
+ DEFAULT_MODEL=gemini-2.5-flash
3
+ MAX_HISTORY_MESSAGES=20
4
+ SYSTEM_PROMPT=You are a helpful terminal AI assistant focused on clear, accurate, and practical answers.
package/README.md ADDED
@@ -0,0 +1,334 @@
1
+ # Gemini CLI
2
+
3
+ Gemini CLI is a production-ready terminal chat application for Node.js that uses the official `@google/genai` SDK. It provides a polished CLI, streaming-friendly architecture, local JSON session storage, bounded conversation context, and a clean service-based structure that can grow into a multi-provider AI client later.
4
+
5
+ ## Features
6
+
7
+ - Interactive terminal chat loop
8
+ - Official Gemini API integration through `@google/genai`
9
+ - Runtime model switching with `/use-model`
10
+ - Numbered model chooser with `/models`
11
+ - Session-level system prompt support
12
+ - Bounded context memory for provider requests
13
+ - Local JSON session persistence in `data/sessions.json`
14
+ - Slash command system for session and runtime controls
15
+ - Streaming-ready chat architecture with Gemini streaming enabled
16
+ - Friendly, actionable terminal errors
17
+ - Cross-platform support for Windows, macOS, and Linux
18
+
19
+ ## Folder Structure
20
+
21
+ ```text
22
+ Gemini-Cli/
23
+ src/
24
+ config/
25
+ env.js
26
+ gemini.js
27
+ services/
28
+ chat.service.js
29
+ history.service.js
30
+ session.service.js
31
+ command.service.js
32
+ utils/
33
+ printer.js
34
+ file.js
35
+ validators.js
36
+ app.js
37
+ index.js
38
+ data/
39
+ sessions.json
40
+ .env.example
41
+ package.json
42
+ README.md
43
+ ```
44
+
45
+ ## Installation
46
+
47
+ ### Prerequisites
48
+
49
+ - Node.js 20 or later
50
+ - A Gemini API key from Google AI Studio
51
+
52
+ Note: the original product brief specified Node.js 18+, but the current official `@google/genai` package requires Node.js 20 or later (as of March 10, 2026).
53
+
54
+ ### Quick install from GitHub
55
+
56
+ Install globally:
57
+
58
+ ```bash
59
+ npm install -g github:Hasina69/Gemini-Cli
60
+ ```
61
+
62
+ Run:
63
+
64
+ ```bash
65
+ gemini-cli
66
+ ```
67
+
68
+ ### Install from npm after publishing
69
+
70
+ Once the package is published on npm, install globally with:
71
+
72
+ ```bash
73
+ npm install -g hasina-gemini-cli
74
+ ```
75
+
76
+ Or run it directly without a global install:
77
+
78
+ ```bash
79
+ npx hasina-gemini-cli
80
+ ```
81
+
82
+ ### Local development install
83
+
84
+ Clone and install:
85
+
86
+ ```bash
87
+ git clone https://github.com/Hasina69/Gemini-Cli.git
88
+ cd Gemini-Cli
89
+ npm install
90
+ ```
91
+
92
+ Then create your environment file.
93
+
94
+ ### Environment file setup
95
+
96
+ For local repo usage, create `.env` in the project root.
97
+
98
+ Windows PowerShell:
99
+
100
+ ```powershell
101
+ Copy-Item .env.example .env
102
+ ```
103
+
104
+ macOS/Linux:
105
+
106
+ ```bash
107
+ cp .env.example .env
108
+ ```
109
+
110
+ For global install usage, create `.env` in one of these locations:
111
+
112
+ - Current working directory: `.env`
113
+ - Windows shared config: `%APPDATA%\GeminiCli\.env`
114
+ - macOS shared config: `~/Library/Application Support/GeminiCli/.env`
115
+ - Linux shared config: `~/.gemini-cli/.env`
116
+
117
+ Update that `.env` with your Gemini API key and preferred defaults.
118
+
119
+ ## How To Create `GEMINI_API_KEY` In Google AI Studio
120
+
121
+ 1. Open [Google AI Studio](https://aistudio.google.com/apikey).
122
+ 2. Sign in with your Google account.
123
+ 3. Create a new API key.
124
+ 4. Copy the generated key.
125
+ 5. Paste it into your local `.env` file as `GEMINI_API_KEY`.
126
+
127
+ ## Environment Configuration
128
+
129
+ Use these variables in `.env`:
130
+
131
+ ```env
132
+ GEMINI_API_KEY=your_gemini_api_key_here
133
+ DEFAULT_MODEL=gemini-2.5-flash
134
+ MAX_HISTORY_MESSAGES=20
135
+ SYSTEM_PROMPT=You are a helpful terminal AI assistant focused on clear, accurate, and practical answers.
136
+ ```
137
+
138
+ ### Variable Notes
139
+
140
+ - `GEMINI_API_KEY`: required for authenticating to Gemini
141
+ - `DEFAULT_MODEL`: the model loaded at startup
142
+ - `MAX_HISTORY_MESSAGES`: max number of recent messages sent back to Gemini as context
143
+ - `SYSTEM_PROMPT`: default system instruction for each new session
144
+ - `GEMINI_CLI_HOME`: optional custom config directory for shared `.env` and session storage
145
+ - `GEMINI_CLI_SESSIONS_FILE`: optional custom path for `sessions.json`
146
+
147
+ ## Running The App
148
+
149
+ If you used the global install:
150
+
151
+ ```bash
152
+ gemini-cli
153
+ ```
154
+
155
+ If you are running from the local repo:
156
+
157
+ Development mode:
158
+
159
+ ```bash
160
+ npm run dev
161
+ ```
162
+
163
+ Production mode:
164
+
165
+ ```bash
166
+ npm start
167
+ ```
168
+
169
+ ## Command Reference
170
+
171
+ | Command | Description |
172
+ | --- | --- |
173
+ | `/help` | Show all commands |
174
+ | `/exit` | Exit the app cleanly |
175
+ | `/clear` | Clear in-memory history for the current session |
176
+ | `/history` | Show recent conversation messages |
177
+ | `/models` | Open a numbered Gemini model chooser |
178
+ | `/save` | Persist the current session to local session storage |
179
+ | `/new` | Start a fresh conversation session |
180
+ | `/model` | Show the currently active model |
181
+ | `/use-model <model_name>` | Switch the active Gemini model at runtime |
182
+ | `/system` | Show the active system prompt |
183
+ | `/set-system <text>` | Override the current system prompt for this session |
184
+ | `/sessions` | List saved local sessions |
185
+ | `/load <session_id>` | Load a saved session from local storage |
186
+
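The slash-command grammar in the table above is simple: a leading `/`, a command name, and an optional argument string. The `parseCommand` helper below is an illustrative sketch of that grammar, not the actual `command.service.js` code:

```javascript
// Illustrative parser for the slash-command grammar shown above.
// "/use-model gemini-2.5-pro" -> { name: 'use-model', args: 'gemini-2.5-pro' }
function parseCommand(input) {
  const trimmed = input.trim();
  if (!trimmed.startsWith('/')) {
    return null; // plain chat message, not a command
  }
  const spaceIndex = trimmed.indexOf(' ');
  if (spaceIndex === -1) {
    return { name: trimmed.slice(1), args: '' };
  }
  return {
    name: trimmed.slice(1, spaceIndex),
    args: trimmed.slice(spaceIndex + 1).trim(),
  };
}
```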
187
+ ## Example Terminal Session
188
+
189
+ ```text
190
+ GEMINI CLI banner...
191
+
192
+ Info > Active model: gemini-2.5-flash
193
+ Info > Session ID: session_001
194
+ Info > Type a message to chat, use /models to choose a model, or /help to list commands.
195
+
196
+ You > /models
197
+ Command > Choose a Gemini Model
198
+ 01. gemini-2.5-flash [active]
199
+ Gemini 2.5 Flash | input 1M | output 65.5K
200
+ 02. gemini-2.5-pro
201
+ Gemini 2.5 Pro | input 1M | output 65.5K
202
+ Model > 2
203
+ Success > Active model changed to "gemini-2.5-pro".
204
+
205
+ You > Explain event loops in Node.js.
206
+ Gemini > The Node.js event loop coordinates timers, I/O callbacks, microtasks,
207
+ and application work without blocking the main thread...
208
+
209
+ You > /save
210
+ Success > Session "session_001" saved to local session storage.
211
+
212
+ You > /exit
213
+ Info > Closing Gemini Terminal.
214
+ ```
215
+
216
+ ## Architecture Notes
217
+
218
+ - `src/config/gemini.js` contains the Gemini-specific provider wrapper
219
+ - `src/services/chat.service.js` is provider-oriented and keeps request building separate from the terminal UI
220
+ - `src/services/history.service.js` stores provider-neutral messages using `{ role, content }`
221
+ - `src/services/session.service.js` isolates local JSON persistence
222
+ - `src/services/command.service.js` parses commands without depending on terminal rendering
223
+ - `src/utils/printer.js` owns terminal presentation and loading behavior
224
+
225
+ This makes it straightforward to add future providers such as OpenAI or Claude without rewriting the app loop.
226
+
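Because history is stored as provider-neutral `{ role, content }` objects, supporting another provider is mostly a matter of adding a mapping layer. The sketch below shows a Gemini-style mapping; the actual request builder in `chat.service.js` is not included in this diff, so treat the exact shape as an assumption:

```javascript
// Map provider-neutral history into a Gemini-style "contents" array.
// Gemini uses the role "model" for assistant turns; any non-assistant
// history message is treated as a user turn here.
function toGeminiContents(messages) {
  return messages.map((message) => ({
    role: message.role === 'assistant' ? 'model' : 'user',
    parts: [{ text: message.content }],
  }));
}
```

A future OpenAI-style provider would only need a sibling mapper; the app loop and history service stay unchanged.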
227
+ ## Notes About Changing Models
228
+
229
+ - The startup model comes from `DEFAULT_MODEL` in `.env`
230
+ - Use `/model` to inspect the active model
231
+ - Use `/models` to open a numbered chooser and switch without typing the full model name
232
+ - Use `/use-model gemini-2.5-flash` or another valid Gemini model name to switch at runtime
233
+ - `/use-model` validates the model with the Gemini API before applying the change
234
+ - If a saved session is loaded with `/load`, the saved model becomes the active model
235
+
236
+ ## Notes About Local Session Storage
237
+
238
+ - Local repo mode stores sessions in `data/sessions.json`
239
+ - Global install mode stores sessions in your shared config directory:
240
+ - Windows: `%APPDATA%\GeminiCli\sessions.json`
241
+ - macOS: `~/Library/Application Support/GeminiCli/sessions.json`
242
+ - Linux: `~/.gemini-cli/sessions.json`
243
+ - The app does not auto-save on exit; use `/save` when you want persistence
244
+ - Each saved session stores:
245
+ - session ID
246
+ - creation and update timestamps
247
+ - active model
248
+ - active system prompt
249
+ - provider-neutral messages
250
+ - If `data/sessions.json` becomes malformed JSON, the app will stop with a clear storage error instead of silently overwriting data
251
+
252
+ ## One-Line Commands
253
+
254
+ Global install from GitHub:
255
+
256
+ ```bash
257
+ npm install -g github:Hasina69/Gemini-Cli
258
+ ```
259
+
260
+ Global install from npm after publishing:
261
+
262
+ ```bash
263
+ npm install -g hasina-gemini-cli
264
+ ```
265
+
266
+ Run with npx after publishing:
267
+
268
+ ```bash
269
+ npx hasina-gemini-cli
270
+ ```
271
+
272
+ Run after global install:
273
+
274
+ ```bash
275
+ gemini-cli
276
+ ```
277
+
278
+ Local clone and run:
279
+
280
+ ```bash
281
+ git clone https://github.com/Hasina69/Gemini-Cli.git
282
+ cd Gemini-Cli
283
+ npm install
284
+ npm start
285
+ ```
286
+
287
+ ## Troubleshooting
288
+
289
+ ### Missing `GEMINI_API_KEY`
290
+
291
+ If startup fails with a configuration error, confirm `.env` exists and includes a non-empty `GEMINI_API_KEY`.
292
+
293
+ ### Invalid Or Unsupported Model
294
+
295
+ If Gemini rejects a model:
296
+
297
+ - list the available models with `/models` and confirm the exact model name
298
+ - try a known working model such as `gemini-2.5-flash`
299
+ - confirm your API key has access to that model
300
+
301
+ ### Rate Limits Or Quota Errors
302
+
303
+ If you see rate limit or quota messages:
304
+
305
+ - wait and retry later for rate limiting
306
+ - check quota and billing in Google AI Studio for quota exhaustion
307
+
308
+ ### Node Version Errors
309
+
310
+ If `npm install` or startup fails on Node 18 or Node 19:
311
+
312
+ - upgrade to Node 20+
313
+ - reinstall dependencies after upgrading
314
+
315
+ ### Broken Session Storage
316
+
317
+ If the session storage file is manually edited and becomes invalid JSON:
318
+
319
+ - fix the JSON manually
320
+ - or replace it with:
321
+
322
+ ```json
323
+ {
324
+ "sessions": []
325
+ }
326
+ ```
327
+
328
+ ### Network Failures
329
+
330
+ If Gemini requests fail due to connectivity:
331
+
332
+ - check your internet connection
333
+ - retry after transient failures
334
+ - verify corporate firewall or proxy rules are not blocking outbound HTTPS requests
package/bin/gemini-cli.js ADDED
@@ -0,0 +1,3 @@
1
+ #!/usr/bin/env node
2
+
3
+ require('../src/index');
package/data/sessions.json ADDED
@@ -0,0 +1,3 @@
1
+ {
2
+ "sessions": []
3
+ }
package/package.json ADDED
@@ -0,0 +1,51 @@
1
+ {
2
+ "name": "hasina-gemini-cli",
3
+ "version": "1.0.0",
4
+ "description": "Production-ready terminal AI chat application powered by the official Gemini API SDK.",
5
+ "main": "src/index.js",
6
+ "bin": {
7
+ "gemini-cli": "bin/gemini-cli.js"
8
+ },
9
+ "files": [
10
+ "bin",
11
+ "data",
12
+ "src",
13
+ "README.md",
14
+ ".env.example"
15
+ ],
16
+ "type": "commonjs",
17
+ "preferGlobal": true,
18
+ "engines": {
19
+ "node": ">=20.0.0"
20
+ },
21
+ "scripts": {
22
+ "dev": "nodemon src/index.js",
23
+ "start": "node src/index.js"
24
+ },
25
+ "keywords": [
26
+ "gemini",
27
+ "terminal",
28
+ "cli",
29
+ "ai",
30
+ "chat"
31
+ ],
32
+ "author": "",
33
+ "license": "MIT",
34
+ "repository": {
35
+ "type": "git",
36
+ "url": "git+https://github.com/Hasina69/Gemini-Cli.git"
37
+ },
38
+ "homepage": "https://github.com/Hasina69/Gemini-Cli",
39
+ "bugs": {
40
+ "url": "https://github.com/Hasina69/Gemini-Cli/issues"
41
+ },
42
+ "dependencies": {
43
+ "@google/genai": "^1.44.0",
44
+ "chalk": "^4.1.2",
45
+ "dotenv": "^16.4.7",
46
+ "figlet": "^1.11.0"
47
+ },
48
+ "devDependencies": {
49
+ "nodemon": "^3.1.10"
50
+ }
51
+ }
package/src/app.js ADDED
@@ -0,0 +1,285 @@
1
+ const readline = require('readline/promises');
2
+ const { stdin: input, stdout: output } = require('process');
3
+ const { ChatService } = require('./services/chat.service');
4
+ const { CommandService } = require('./services/command.service');
5
+ const { HistoryService } = require('./services/history.service');
6
+ const { SessionService } = require('./services/session.service');
7
+
8
+ function normalizeAssistantText(text) {
9
+ if (typeof text !== 'string') {
10
+ return '';
11
+ }
12
+
13
+ return text.trim();
14
+ }
15
+
16
+ class App {
17
+ constructor({ config, provider, printer }) {
18
+ this.config = config;
19
+ this.provider = provider;
20
+ this.printer = printer;
21
+ this.readline = null;
22
+ this.isShuttingDown = false;
23
+
24
+ this.historyService = new HistoryService({
25
+ maxMessages: config.maxHistoryMessages,
26
+ });
27
+
28
+ this.sessionService = new SessionService({
29
+ storagePath: config.sessionsFilePath,
30
+ });
31
+
32
+ this.chatService = new ChatService({
33
+ provider,
34
+ maxHistoryMessages: config.maxHistoryMessages,
35
+ });
36
+
37
+ this.commandService = new CommandService({
38
+ sessionService: this.sessionService,
39
+ provider,
40
+ });
41
+
42
+ this.state = null;
43
+ }
44
+
45
+ async initializeState() {
46
+ await this.sessionService.ensureStorage();
47
+
48
+ this.state = {
49
+ sessionId: await this.sessionService.generateSessionId(),
50
+ sessionCreatedAt: new Date().toISOString(),
51
+ model: this.config.defaultModel,
52
+ systemPrompt: this.config.systemPrompt,
53
+ historyService: this.historyService,
54
+ };
55
+ }
56
+
57
+ async printStartup() {
58
+ await this.printer.printBanner({
59
+ animated: process.stdout.isTTY && process.env.GEMINI_CLI_BANNER_ANIMATION === '1',
60
+ });
61
+ this.printer.printInfo(`Active model: ${this.state.model}`);
62
+ this.printer.printInfo(`Session ID: ${this.state.sessionId}`);
63
+ this.printer.printInfo('Type a message to chat, use /models to choose a model, or /help to list commands.');
64
+ }
65
+
66
+ async start() {
67
+ await this.initializeState();
68
+ await this.printStartup();
69
+
70
+ this.readline = readline.createInterface({
71
+ input,
72
+ output,
73
+ terminal: true,
74
+ historySize: 1000,
75
+ });
76
+
77
+ this.readline.on('SIGINT', () => {
78
+ this.isShuttingDown = true;
79
+ this.printer.printInfo('Received Ctrl+C. Exiting cleanly.');
80
+ this.readline.close();
81
+ });
82
+
83
+ try {
84
+ while (!this.isShuttingDown) {
85
+ let rawInput = '';
86
+
87
+ try {
88
+ rawInput = await this.readline.question(this.printer.getUserPrompt());
89
+ } catch (error) {
90
+ if (this.isShuttingDown || this.isReadlineClosedError(error)) {
91
+ break;
92
+ }
93
+
94
+ throw error;
95
+ }
96
+
97
+ const userInput = rawInput.trim();
98
+
99
+ if (!userInput) {
100
+ continue;
101
+ }
102
+
103
+ if (this.commandService.isCommand(userInput)) {
104
+ const commandResult = await this.commandService.execute(userInput, this.state);
105
+ const shouldExit = await this.handleCommandResult(commandResult);
106
+
107
+ if (shouldExit) {
108
+ break;
109
+ }
110
+
111
+ continue;
112
+ }
113
+
114
+ await this.handleChatTurn(userInput);
115
+ }
116
+ } finally {
117
+ await this.shutdown();
118
+ }
119
+ }
120
+
121
+ async handleCommandResult(result) {
122
+ if (!result) {
123
+ return false;
124
+ }
125
+
126
+ if (result.kind === 'block') {
127
+ this.printer.printCommandBlock(result.title, result.lines);
128
+ return Boolean(result.exit);
129
+ }
130
+
131
+ if (result.kind === 'model-picker') {
132
+ return this.handleModelPicker(result);
133
+ }
134
+
135
+ if (result.level === 'success') {
136
+ this.printer.printSuccess(result.message);
137
+ return Boolean(result.exit);
138
+ }
139
+
140
+ if (result.level === 'error') {
141
+ this.printer.printError(result.message);
142
+ return Boolean(result.exit);
143
+ }
144
+
145
+ this.printer.printInfo(result.message);
146
+ return Boolean(result.exit);
147
+ }
148
+
149
+ async handleModelPicker(result) {
150
+ this.printer.printCommandBlock(result.title, result.lines);
151
+ let answer = '';
152
+
153
+ try {
154
+ answer = await this.readline.question('Model > ');
155
+ } catch (error) {
156
+ if (this.isShuttingDown || this.isReadlineClosedError(error)) {
157
+ return true;
158
+ }
159
+
160
+ throw error;
161
+ }
162
+
163
+ const selection = answer.trim();
164
+
165
+ if (!selection) {
166
+ this.printer.printInfo('Model selection cancelled.');
167
+ return false;
168
+ }
169
+
170
+ const selectedModel = this.resolveModelSelection(selection, result.models);
171
+
172
+ if (!selectedModel) {
173
+ this.printer.printError(
174
+ 'Invalid model selection. Use /models and choose a listed number or model name.'
175
+ );
176
+ return false;
177
+ }
178
+
179
+ if (selectedModel.id === this.state.model) {
180
+ this.printer.printInfo(`"${selectedModel.id}" is already the active model.`);
181
+ return false;
182
+ }
183
+
184
+ this.state.model = selectedModel.id;
185
+ this.printer.printSuccess(`Active model changed to "${selectedModel.id}".`);
186
+ return false;
187
+ }
188
+
189
+ resolveModelSelection(selection, models) {
190
+ if (!Array.isArray(models) || models.length === 0) {
191
+ return null;
192
+ }
193
+
194
+ if (/^\d+$/.test(selection)) {
195
+ const index = Number.parseInt(selection, 10) - 1;
196
+ return models[index] || null;
197
+ }
198
+
199
+ const normalizedSelection = selection.replace(/^models\//i, '').toLowerCase();
200
+
201
+ return (
202
+ models.find((model) => model.id.toLowerCase() === normalizedSelection) ||
203
+ models.find((model) => (model.displayName || '').toLowerCase() === normalizedSelection) ||
204
+ null
205
+ );
206
+ }
207
+
208
+ async handleChatTurn(userInput) {
209
+ const loading = this.printer.createLoadingIndicator();
210
+ let streamStarted = false;
211
+
212
+ loading.start();
213
+
214
+ try {
215
+ const result = await this.chatService.sendMessage({
216
+ model: this.state.model,
217
+ systemPrompt: this.state.systemPrompt,
218
+ historyMessages: this.state.historyService.getMessages(),
219
+ userMessage: userInput,
220
+ preferStreaming: true,
221
+ onTextChunk: (chunk) => {
222
+ if (!streamStarted) {
223
+ loading.stop();
224
+ this.printer.startAssistantStream();
225
+ streamStarted = true;
226
+ }
227
+
228
+ this.printer.writeAssistantChunk(chunk);
229
+ },
230
+ });
231
+
232
+ loading.stop();
233
+
234
+ const assistantText =
235
+ normalizeAssistantText(result.text) || 'No text response returned by Gemini.';
236
+
237
+ if (streamStarted) {
238
+ this.printer.endAssistantStream();
239
+
240
+ if (!normalizeAssistantText(result.text)) {
241
+ this.printer.printInfo('Gemini returned an empty text response.');
242
+ }
243
+ } else {
244
+ this.printer.printAssistant(assistantText);
245
+ }
246
+
247
+ this.state.historyService.addMessage('user', userInput);
248
+ this.state.historyService.addMessage('assistant', assistantText);
249
+ } catch (error) {
250
+ loading.stop();
251
+
252
+ if (streamStarted) {
253
+ this.printer.endAssistantStream();
254
+ }
255
+
256
+ this.printer.printError(error.message || 'Unexpected error while talking to Gemini.');
257
+ }
258
+ }
259
+
260
+ async shutdown() {
261
+ if (this.readline) {
262
+ try {
263
+ this.readline.close();
264
+ } catch (error) {
265
+ if (!this.isReadlineClosedError(error)) {
266
+ throw error;
267
+ }
268
+ } finally {
269
+ this.readline = null;
270
+ }
271
+ }
272
+ }
273
+
274
+ isReadlineClosedError(error) {
275
+ return (
276
+ error &&
277
+ (error.code === 'ERR_USE_AFTER_CLOSE' ||
278
+ /The "readline" interface is closed/i.test(error.message || ''))
279
+ );
280
+ }
281
+ }
282
+
283
+ module.exports = {
284
+ App,
285
+ };