@cbuk100011/claude-mode 1.2.0

package/README.md ADDED

# Claude Mode

A Node.js CLI launcher for Claude Code that supports multiple AI providers and models.

## Features

- **Interactive menu** - Guided step-by-step selection of provider, model, and mode
- **Quick mode** - Command-line arguments for fast launching
- **Multiple providers** - OpenRouter, Ollama Cloud, Ollama Local, Ollama Custom
- **Custom providers** - Define your own providers via config file
- **Dynamic model discovery** - Ollama models discovered automatically via API
- **Offline model cache** - Falls back to cached models when the API is unreachable
- **Health checks** - Provider availability shown in interactive mode
- **Headless mode** - Single-prompt execution for scripting/automation
- **Terminal mode** - Interactive Claude Code session
- **Shell completions** - Tab completion for bash, zsh, and fish

## Installation

### From Source

```bash
# Clone the repository
git clone <repo-url>
cd claude-mode

# Install dependencies
npm install

# Build
npm run build

# Link globally (optional)
npm link
```

### Global Installation (after npm publish)

```bash
npm install -g @cbuk100011/claude-mode
```
42
+
43
+ ## Usage
44
+
45
+ ### Interactive Menu
46
+
47
+ Run without arguments to get a guided menu with provider health checks:
48
+
49
+ ```bash
50
+ claude-mode
51
+ ```
52
+
This will:

1. Check availability of all providers
2. Prompt you to select a provider
3. Prompt you to select a model (dynamically discovered for Ollama)
4. Ask whether to run in terminal or headless mode
5. Ask for a prompt (if headless)

### Quick Mode

Specify provider, model, and optionally mode/prompt:

```bash
# Interactive terminal session
claude-mode openrouter sonnet
claude-mode ollama-local qwen3

# Headless mode with default provider/model
claude-mode -p "list files"

# Headless mode with specified provider/model
claude-mode ollama-local qwen3 h "list all TypeScript files"
claude-mode ollama-cloud glm47 headless "explain this code"

# Using full provider names
claude-mode ollama-custom gpt120 h "run tests"
```

### Commands

```bash
claude-mode --help           # Show usage
claude-mode --list           # List all available models
claude-mode list             # Same as above
claude-mode health           # Check provider availability
claude-mode config show      # Show current configuration
claude-mode config init      # Create config file
claude-mode completion bash  # Generate bash completions
claude-mode completion zsh   # Generate zsh completions
claude-mode completion fish  # Generate fish completions
```

## Providers

| Provider | Description | Environment Variable |
|----------|-------------|----------------------|
| `openrouter` | OpenRouter API | `ANTHROPIC_BASE_URL` |
| `ollama-cloud` | Ollama Cloud | `OLLAMA_HOST` |
| `ollama-local` | Local Ollama | `OLLAMA_BASE_URL_LOCAL` |
| `ollama-custom` | Custom Ollama server | `OLLAMA_BASE_URL_CUSTOM` |

### Provider Shortcuts

| Shortcut | Provider |
|----------|----------|
| `or`, `open` | openrouter |
| `oc`, `cloud` | ollama-cloud |
| `ol`, `local` | ollama-local |
| `custom`, `remote` | ollama-custom |

## Models

### OpenRouter (Static)

| Shortcut | Model ID |
|----------|----------|
| `opus` | anthropic/claude-opus-4.5 |
| `sonnet` | anthropic/claude-sonnet-4.5 |
| `haiku` | anthropic/claude-haiku-4.5 |
| `gpt52` | openai/gpt-5.2 |
| `gpt52-pro` | openai/gpt-5.2-pro |
| `gpt120` | @preset/gpt-oss-120b-cerebras |
| `glm47` | @preset/cerebras-glm-4-7-cerebras |
| `deepseek` | deepseek/deepseek-v3.2 |
| `gemini-pro` | google/gemini-3-pro-preview |
| `gemini-flash` | google/gemini-3-flash-preview |

### Ollama (Dynamic)

Ollama models are **discovered automatically** via the `/v1/models` API endpoint.
Use `claude-mode --list` to see currently available models.

Models are cached locally and will work offline if previously discovered.

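The discovery endpoint can also be queried by hand, which is handy when debugging. This sketch assumes a local Ollama on its default port:

```bash
# Ask a local Ollama server which models it exposes (OpenAI-compatible endpoint);
# prints a message instead when the server is not reachable
curl -s http://localhost:11434/v1/models || echo "Ollama is not reachable"
```
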

## Configuration

### Environment Variables

Create a `.env` file in your project directory:

```bash
# OpenRouter
ANTHROPIC_AUTH_TOKEN=sk-or-v1-your-openrouter-key
# or
OPEN_ROUTER_API_KEY=sk-or-v1-your-openrouter-key

# Ollama Cloud
OLLAMA_HOST=https://ollama.com
OLLAMA_API_KEY=your-ollama-cloud-key

# Ollama Local (default: http://localhost:11434)
OLLAMA_BASE_URL_LOCAL=http://localhost:11434

# Ollama Custom (your own server)
OLLAMA_BASE_URL_CUSTOM=http://192.168.1.100:11434
```

### Config File

Create a config file for persistent settings:

```bash
claude-mode config init
```

This creates `~/.claude-mode/claude-mode.json`:

```json
{
  "defaultProvider": "",
  "defaultModel": "",
  "modelDiscoveryTimeout": 5000,
  "healthCheckTimeout": 2000,
  "cacheTTL": 30000,
  "customProviders": [],
  "skipHealthCheck": false,
  "offlineMode": false
}
```

#### Custom Providers

Add your own providers in the config file:

```json
{
  "customProviders": [
    {
      "key": "my-server",
      "name": "My Server",
      "baseUrl": "http://my-server:8080",
      "authEnvVar": "MY_SERVER_API_KEY",
      "description": "My custom LLM server"
    }
  ]
}
```

## Mode Options

| Mode | Aliases | Description |
|------|---------|-------------|
| Terminal | `terminal`, `t`, `interactive`, `i` | Interactive Claude Code session |
| Headless | `headless`, `h`, `prompt`, `p` | Single prompt execution |

## Shell Completions

Add tab completion to your shell:

### Bash

```bash
# Add to ~/.bashrc
eval "$(claude-mode completion bash)"
```

### Zsh

```bash
# Add to ~/.zshrc
eval "$(claude-mode completion zsh)"
```

### Fish

```bash
# Save to completions directory
claude-mode completion fish > ~/.config/fish/completions/claude-mode.fish
```

## Examples

```bash
# Start interactive menu with health checks
claude-mode

# Quick start with Claude Sonnet (OpenRouter)
claude-mode openrouter sonnet

# Use local Qwen3 for a quick task
claude-mode ollama-local qwen3 h "list files in src/"

# Use Ollama Cloud GLM for code explanation
claude-mode ollama-cloud glm47 h "explain the main function"

# Use remote server with large model
claude-mode ollama-custom gpt120

# Use shortcuts
claude-mode or sonnet h "tell me a joke"
claude-mode ol qwen3

# Check which providers are available
claude-mode health

# List available models
claude-mode --list

# Skip permission prompts (use with caution)
claude-mode --dangerously-skip-permissions
```

## Headless Mode Tools

In headless mode, the following tools are enabled by default:

- `Read` - Read files
- `Edit` - Edit files
- `Write` - Write files
- `Bash` - Execute bash commands
- `Glob` - File pattern matching
- `Grep` - Search file contents

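For reference, a headless run corresponds roughly to invoking the Claude Code CLI directly in print mode with those tools allowed. The command below is an illustration of that mapping under that assumption, not necessarily claude-mode's exact invocation:

```bash
# Illustrative only: direct Claude Code print-mode call with the same tool set
claude -p "list all TypeScript files" \
  --allowedTools "Read,Edit,Write,Bash,Glob,Grep"
```
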

## Development

```bash
# Install dependencies
npm install

# Development mode (watch)
npm run dev

# Build
npm run build

# Run tests
npm test

# Watch tests
npm run test:watch

# Type check
npm run typecheck
```

## Project Structure

```
claude-mode/
├── src/
│   ├── index.ts       # Main CLI entry point
│   ├── providers.ts   # Provider and model configurations
│   ├── config.ts      # Configuration management
│   ├── errors.ts      # Error handling utilities
│   └── *.test.ts      # Test files
├── dist/              # Built output
├── package.json
├── tsconfig.json
├── .env.example
└── README.md
```

## Troubleshooting

### Model not found

Ensure the model is available on the selected provider. Use `--list` to see available models.

### Authentication errors

Check that your API keys are correctly set in the `.env` file or environment variables.

### Local Ollama not responding

Ensure Ollama is running: `ollama serve`

### Claude not found

Ensure the Claude Code CLI is installed and available in your PATH:

```bash
npm install -g @anthropic-ai/claude-code
```

### Provider health check failing

Run `claude-mode health` to see which providers are available and any error details.

### Timeout errors

Increase `modelDiscoveryTimeout` or `healthCheckTimeout` in your config file:

```bash
claude-mode config init
# Then edit ~/.claude-mode/claude-mode.json
```
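As a non-interactive alternative, you can write the file directly. A minimal sketch (the 15000/10000 values are illustrative; note this overwrites the file, so carry over any settings you already have):

```bash
# Create the config directory if it does not exist yet
mkdir -p ~/.claude-mode

# Write a config with more generous timeouts (values are in milliseconds)
cat > ~/.claude-mode/claude-mode.json <<'EOF'
{
  "defaultProvider": "",
  "defaultModel": "",
  "modelDiscoveryTimeout": 15000,
  "healthCheckTimeout": 10000,
  "cacheTTL": 30000,
  "customProviders": [],
  "skipHealthCheck": false,
  "offlineMode": false
}
EOF
```
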
336
+
337
+ ## License
338
+
339
+ MIT