cckit 0.2.0 → 0.3.1

This diff shows the content of publicly available package versions that have been released to one of the supported registries. It is provided for informational purposes only and reflects the changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,12 +1,12 @@
  # cckit (Node.js Edition)
 
- Code Kit for Claude Model Switching - Support 智谱LLM, MiniMax, Kimi, Kuaishou StreamLake, ZenMux.ai, and official Claude
+ Code Kit for Claude Model Switching - Support 智谱 Coding Plan, MiniMax Coding Plan, Kimi Coding Plan, 快手 StreamLake Coding Plan, 火山引擎 Coding Plan, ZenMux.ai, and official Claude
 
  A CLI tool for managing and switching between different Claude model providers and their configurations. Features support for model gateways like ZenMux.ai and Kuaishou StreamLake that provide unified access to multiple AI providers.
 
  ## Features
 
- - **Multiple Provider Support**: Configure and switch between 智谱LLM, MiniMax, Kimi, Kuaishou StreamLake, 火山方舟(Volcengine), 阿里云(Aliyun), ZenMux.ai, official Claude, and custom providers
+ - **Multiple Provider Support**: Configure and switch between 智谱 Coding Plan, MiniMax Coding Plan, Kimi Coding Plan, 快手 StreamLake Coding Plan, 火山引擎 Coding Plan, 阿里云 Coding Plan, 腾讯云 Coding Plan, 讯飞 Coding Plan, ZenMux.ai, official Claude, and custom providers
  - **Easy Configuration**: Simple command-line interface for managing API keys and settings
  - **Claude Integration**: Automatically updates Claude Code configuration files
  - **Export/Import**: Backup and restore your provider configurations
@@ -26,6 +26,189 @@ pnpm add -g cckit
  yarn global add cckit
  ```
 
+ ## Quick Start (Interactive Mode)
+
+ ```bash
+ # Start interactive configuration wizard (recommended for first-time users)
+ cckit interactive
+ ```
+
+ ### Interactive Switch Mode
+
+ `cckit switch` now supports interactive mode when no provider argument is given:
+
+ ```bash
+ # Interactive mode - shows selection list
+ cckit switch
+
+ # Direct mode - with provider argument
+ cckit switch zhipu
+ ```
+
+ Interactive switch example:
+ ```
+ ? 选择要切换的 Provider
+ 智谱 Coding Plan - glm-5
+ ✓ MiniMax Coding Plan - abab6.5s-chat (当前)
+ Kimi Coding Plan - kimi-for-coding
+ Claude - claude-3-opus-20250219
+ ZenMux.ai - claude-3-5-sonnet-20241022
+ ```
+
+
+ ## Interactive Mode Details
+
+ The interactive wizard (`cckit interactive`) provides a user-friendly, step-by-step configuration experience with arrow key selection:
+
+ ### Step-by-Step Guide
+
+ ```
+ ▸ Step 1: 选择要配置的 AI 提供商
+ ```
+ Use ↑/↓ arrow keys to select from:
+ - **Preset Providers**: 智谱 Coding Plan, MiniMax Coding Plan, Kimi Coding Plan, ZenMux, 快手 StreamLake Coding Plan, 火山引擎 Coding Plan, 阿里云 Coding Plan, 腾讯云 Coding Plan, 讯飞 Coding Plan, Claude
+ - **Custom Provider**: For local LLMs (Ollama, LocalAI, LM Studio) or any OpenAI-compatible API
+
+ ```
+ ▸ Step 2: 配置 [Provider Name]
+ ```
+ - Enter your API key (required)
+ - For custom providers: Enter the base URL (e.g., `http://localhost:11434/v1`)
+
+ ```
+ ▸ Step 3: 设置模型
+ ```
+ - Enter the model name (default value provided based on provider)
+ - Multiple models can be entered, separated by commas
+
+ ```
+ ▸ Step 4: 配置模型类型 (可选)
+ ```
+ - Optional: Configure different models for Sonnet/Haiku/Opus model tiers
+ - Each tier can be skipped to use the default model
+ - This sets `ANTHROPIC_DEFAULT_SONNET_MODEL`, `ANTHROPIC_DEFAULT_HAIKU_MODEL`, `ANTHROPIC_DEFAULT_OPUS_MODEL`
+
+ ```
+ ▸ Step 5: 保存配置
+ ```
+ - If an existing configuration is found, choose whether to update it or create a new one
+ - Configuration saved to `~/.cckit/config.json`
+
+ ```
+ ▸ Step 6: 测试连接
+ ```
+ - Optional: Test the connection to verify API key and endpoint
+
+ ```
+ ▸ Step 7: 切换 Provider
+ ```
+ - Optional: Switch to the newly configured provider immediately
+
+ ### Features
+
+ - **Arrow Key Navigation**: Easy selection with keyboard
+ - **Smart Defaults**: Pre-filled model names based on provider
+ - **Existing Config Detection**: Detects and offers to update existing configurations
+ - **Connection Testing**: Verify setup before switching
+ - **Immediate Switch**: Option to activate the provider right away
+
+ ### Example Session
+
+ ```bash
+ $ cckit interactive
+
+ ___ ___ / ___ ( ) __ ___
+ // ) ) // ) ) //\ \ / / / /
+ // // // \ \ / / / /
+ ((____ ((____ // \ \ / / / /
+ 交互式配置向导
+
+ ▸ Step 1: 选择要配置的 AI 提供商
+
+ ? 请选择提供商: (Use arrow keys)
+ ❯ 智谱 Coding Plan (GLM)
+ MiniMax Coding Plan
+ Kimi Coding Plan
+ ZenMux.ai
+ 快手 StreamLake Coding Plan
+ 火山引擎 Coding Plan
+ 阿里云 Coding Plan
+ 腾讯云 Coding Plan
+ 讯飞 Coding Plan
+ Claude (Official)
+ 自定义 Provider (Local LLM / OpenAI Compatible)
+
+ ▸ Step 2: 配置 智谱 Coding Plan
+
+ ? 请输入 API Key: [your-api-key]
+
+ ▸ Step 3: 设置模型
+
+ ? 请输入模型名称 (多个模型用逗号分隔): glm-5, kimi-k2.5, MiniMax-M2.5
+
+ ▸ Step 4: 配置模型类型 (可选)
+ 提示: 可为 Sonnet/Haiku/Opus 模型类型配置不同的模型,不配置则使用默认模型
+
+ ? 是否为不同模型类型 (Sonnet/Haiku/Opus) 配置不同的默认模型? Yes
+
+ ? 选择 Sonnet 模型
+ ❯ 跳过 (使用默认模型)
+ glm-5
+ kimi-k2.5
+ MiniMax-M2.5
+
+ ? 选择 Haiku 模型
+ ❯ 跳过 (使用默认模型)
+ glm-5
+ kimi-k2.5
+ MiniMax-M2.5
+
+ ? 选择 Opus 模型
+ ❯ 跳过 (使用默认模型)
+ glm-5
+ kimi-k2.5
+ MiniMax-M2.5
+
+ 模型类型配置:
+ Sonnet: glm-5
+ Haiku: kimi-k2.5
+ Opus: MiniMax-M2.5
+
+ ▸ Step 5: 保存配置
+
+ ✓ 配置已保存 (id: zhipu_xxx)
+ Provider: 智谱 Coding Plan
+ Model: glm-5, kimi-k2.5, MiniMax-M2.5
+
+ ▸ Step 6: 测试连接
+
+ ? 是否测试连接? (Y/n)
+
+ ✓ 连接成功!
+ 响应时间: 234ms
+
+ ▸ Step 7: 切换 Provider
+
+ ? 是否立即切换到该 Provider? (Y/n)
+
+ ✓ 已切换到 智谱 Coding Plan
+ 请重启终端或新开会话以使配置生效
+
+ ___ ___ / ___ ( ) __ ___
+ // ) ) // ) ) //\ \ / / / /
+ // // // \ \ / / / /
+ ((____ ((____ // \ \ / / / /
+ 配置完成! ✓
+ ```
+
+ The interactive wizard will guide you through:
+ 1. Selecting an AI provider (智谱 Coding Plan, MiniMax Coding Plan, Kimi Coding Plan, Claude, ZenMux, 快手 StreamLake Coding Plan, 火山引擎 Coding Plan, 阿里云 Coding Plan, 腾讯云 Coding Plan, 讯飞 Coding Plan, or Custom)
+ 2. Entering your API key
+ 3. Setting up the model name(s)
+ 4. (Optional) Configuring different models for Sonnet/Haiku/Opus tiers
+ 5. Testing the connection
+ 6. Switching to the configured provider
+
  ## Usage
 
  ### Basic Commands
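The Sonnet/Haiku/Opus tier settings added above map onto `ANTHROPIC_*` environment variables. A minimal TypeScript sketch of that mapping follows; the function name, config shape, and fall-back-to-default behavior are illustrative assumptions drawn from the README text, not cckit's actual implementation:

```typescript
// Sketch: build the environment-variable map for a provider's model tiers.
// The ANTHROPIC_* variable names come from the README above; a skipped tier
// falls back to the default model (an assumption — cckit may instead omit
// the variable entirely).
type TierConfig = {
  defaultModel: string;
  sonnet?: string;
  haiku?: string;
  opus?: string;
};

function buildModelEnv(cfg: TierConfig): Record<string, string> {
  return {
    ANTHROPIC_MODEL: cfg.defaultModel,
    ANTHROPIC_DEFAULT_SONNET_MODEL: cfg.sonnet ?? cfg.defaultModel,
    ANTHROPIC_DEFAULT_HAIKU_MODEL: cfg.haiku ?? cfg.defaultModel,
    ANTHROPIC_DEFAULT_OPUS_MODEL: cfg.opus ?? cfg.defaultModel,
  };
}
```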
@@ -46,9 +229,12 @@ cckit configure <provider> --api-key <key> [--base-url <url>] [--model <model>]
  # Configure a custom provider (local LLM, OpenAI-compatible API, etc.)
  cckit configure my-ollama --api-key "ollama-key" --base-url "http://localhost:11434/v1" --model "llama3"
 
- # Switch to a provider
+ # Switch to a provider (direct mode with argument)
  cckit switch <provider>
 
+ # Switch to a provider (interactive mode without argument)
+ cckit switch
+
  # Show provider configuration
  cckit show <provider>
 
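The optional-argument behavior of `switch` shown in this hunk (interactive selection when no provider is given, direct switch otherwise) can be sketched as a small dispatcher. The names below are illustrative only:

```typescript
// Sketch of the README's `cckit switch` dispatch rule; not cckit's code.
type SwitchMode =
  | { mode: "interactive" }
  | { mode: "direct"; providerId: string };

function resolveSwitchMode(providerId?: string): SwitchMode {
  // No argument (or a blank one) -> show the interactive selection list;
  // otherwise switch directly to the named provider.
  if (providerId === undefined || providerId.trim() === "") {
    return { mode: "interactive" };
  }
  return { mode: "direct", providerId };
}
```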
@@ -61,15 +247,20 @@ cckit models <provider>
  # Set active model for a provider
  cckit set-model <provider> <model>
 
+ # Set model for a specific model tier (Sonnet/Haiku/Opus)
+ cckit set-model <provider> <model> --type sonnet
+ cckit set-model <provider> <model> --type haiku
+ cckit set-model <provider> <model> --type opus
+
  # Remove a model from a provider
  cckit remove-model <provider> <model>
  ```
 
  ### Supported Providers
 
- #### 智谱LLM (Zhipu) - [https://www.bigmodel.cn/claude-code?ic=AFDPNDPWIF](https://www.bigmodel.cn/claude-code?ic=AFDPNDPWIF)
+ #### 智谱 Coding Plan - [https://www.bigmodel.cn/claude-code?ic=AFDPNDPWIF](https://www.bigmodel.cn/claude-code?ic=AFDPNDPWIF)
  ```bash
- # Configure 智谱LLM with a single model
+ # Configure 智谱 Coding Plan with a single model
  cckit configure zhipu --api-key "your-api-key" --model "GLM-4.7"
 
  # Configure with multiple models
@@ -81,16 +272,20 @@ cckit models zhipu
  # Set active model
  cckit set-model zhipu "GLM-4-Plus"
 
+ # Set model for specific tier
+ cckit set-model zhipu "GLM-4-Plus" --type sonnet
+ cckit set-model zhipu "GLM-4.7" --type haiku
+
  # Remove a model
  cckit remove-model zhipu "GLM-4.7"
 
- # Switch to 智谱LLM
+ # Switch to 智谱 Coding Plan
  cckit switch zhipu
  ```
 
- #### MiniMax - [https://platform.minimaxi.com/subscribe/coding-plan](https://platform.minimaxi.com/subscribe/coding-plan?code=3FiunKqPhD&source=link)
+ #### MiniMax Coding Plan - [https://platform.minimaxi.com/subscribe/coding-plan](https://platform.minimaxi.com/subscribe/coding-plan?code=3FiunKqPhD&source=link)
  ```bash
- # Configure MiniMax with multiple models
+ # Configure MiniMax Coding Plan with multiple models
  cckit configure minimax --api-key "your-api-key" --model "MiniMax-M2" --model "MiniMax-M4"
 
  # List models
@@ -99,16 +294,19 @@ cckit models minimax
  # Set active model
  cckit set-model minimax "MiniMax-M4"
 
+ # Set model for specific tier
+ cckit set-model minimax "MiniMax-M4" --type sonnet
+
  # Remove a model
  cckit remove-model minimax "MiniMax-M2"
 
- # Switch to MiniMax
+ # Switch to MiniMax Coding Plan
  cckit switch minimax
  ```
 
- #### Kimi (Moonshot) - [https://www.kimi.com/coding/docs/](https://www.kimi.com/coding/docs/)
+ #### Kimi Coding Plan - [https://www.kimi.com/coding/docs/](https://www.kimi.com/coding/docs/)
  ```bash
- # Configure Kimi with multiple models
+ # Configure Kimi Coding Plan with multiple models
  cckit configure kimi --api-key "your-api-key" --model "kimi-for-coding" --model "kimi-plus"
 
  # List models
@@ -117,7 +315,7 @@ cckit models kimi
  # Set active model
  cckit set-model kimi "kimi-plus"
 
- # Switch to Kimi
+ # Switch to Kimi Coding Plan
  cckit switch kimi
  ```
 
@@ -151,9 +349,9 @@ cckit set-model claude "claude-3-opus-20250219"
  cckit switch claude
  ```
 
- #### Kuaishou StreamLake - [快手万擎引擎](https://streamlake.com/marketing/coding-plan)
+ #### 快手 StreamLake Coding Plan - [快手万擎引擎](https://streamlake.com/marketing/coding-plan)
  ```bash
- # Configure Kuaishou StreamLake with multiple models
+ # Configure 快手 StreamLake Coding Plan with multiple models
  cckit configure streamlake --api-key "your-streamlake-api-key" --model "kat-coder-pro-v1" --model "claude-3-opus-20250219"
 
  # List models
@@ -162,13 +360,13 @@ cckit models streamlake
  # Set active model
  cckit set-model streamlake "claude-3-opus-20250219"
 
- # Switch to Kuaishou StreamLake
+ # Switch to 快手 StreamLake Coding Plan
  cckit switch streamlake
  ```
 
- #### Volcengine [火山方舟](https://www.volcengine.com/activity/codingplan)
+ #### 火山引擎 Coding Plan [火山方舟](https://www.volcengine.com/activity/codingplan)
  ```bash
- # Configure Volcengine with multiple models
+ # Configure 火山引擎 Coding Plan with multiple models
  cckit configure volcengine --api-key "your-ark-api-key" --model "ark-code-latest" --model "ark-code-pro"
 
  # List models
@@ -177,7 +375,7 @@ cckit models volcengine
  # Set active model
  cckit set-model volcengine "ark-code-pro"
 
- # Switch to Volcengine
+ # Switch to 火山引擎 Coding Plan
  cckit switch volcengine
  ```
 
@@ -192,10 +390,45 @@ cckit models aliyun
  # Set active model
  cckit set-model aliyun "glm-5"
 
+ # Set model for specific tier
+ cckit set-model aliyun "glm-5" --type sonnet
+ cckit set-model aliyun "kimi-k2.5" --type haiku
+ cckit set-model aliyun "MiniMax-M2.5" --type opus
+
  # Switch to Aliyun
  cckit switch aliyun
  ```
 
+ #### Tencent [腾讯云 Coding Plan]
+ ```bash
+ # Configure Tencent Coding Plan with multiple models
+ cckit configure tencent --api-key "your-api-key" --model "GLM-5" --model "MiniMax-M2.5" --model "Kimi-K2.5"
+
+ # List models
+ cckit models tencent
+
+ # Set active model
+ cckit set-model tencent "MiniMax-M2.5"
+
+ # Switch to Tencent
+ cckit switch tencent
+ ```
+
+ #### XFYun [讯飞 Coding Plan]
+ ```bash
+ # Configure XFYun Coding Plan with multiple models
+ cckit configure xfyun --api-key "your-api-key" --model "GLM-5" --model "MiniMax-M2.5" --model "Kimi-K2.5"
+
+ # List models
+ cckit models xfyun
+
+ # Set active model
+ cckit set-model xfyun "MiniMax-M2.5"
+
+ # Switch to XFYun
+ cckit switch xfyun
+ ```
+
  #### Custom Provider - Local/OpaqueAI/OpenAI Compatible APIs
 
  Configure custom providers for local LLMs (Ollama, LocalAI, LM Studio), opaque API endpoints, or any Anthropic-compatible API:
@@ -239,6 +472,9 @@ cckit import backup.json
 
  # Reset to default Claude configuration
  cckit reset
+
+ # Interactive configuration wizard (new!)
+ cckit interactive
  ```
 
  ## Configuration
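The `export`/`import` commands in the hunk above write and restore a JSON backup of provider configurations. A minimal round-trip sketch follows; the `ProviderConfig` shape and both function names are hypothetical, since the diff does not document cckit's actual backup schema:

```typescript
import { mkdtempSync, readFileSync, writeFileSync } from "fs";
import { tmpdir } from "os";
import { join } from "path";

// Hypothetical backup schema for illustration only; cckit's real
// export format may differ.
interface ProviderConfig {
  id: string;
  models: string[];
  activeModel?: string;
}

function exportConfig(path: string, providers: ProviderConfig[]): void {
  // Write a pretty-printed JSON snapshot, as a backup command might.
  writeFileSync(path, JSON.stringify({ providers }, null, 2));
}

function importConfig(path: string): ProviderConfig[] {
  const parsed = JSON.parse(readFileSync(path, "utf8")) as { providers: ProviderConfig[] };
  return parsed.providers;
}

// Round trip: export to a temporary backup file, then import it back.
const backup = join(mkdtempSync(join(tmpdir(), "cckit-demo-")), "backup.json");
exportConfig(backup, [{ id: "zhipu", models: ["GLM-4.7"], activeModel: "GLM-4.7" }]);
```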
@@ -255,17 +491,78 @@ Each provider can now be configured with multiple models. You can:
 
  The active model is used when Claude Code switches to that provider. If no explicit active model is set, the first configured model is used.
 
+ ### Model Tier Configuration
+
+ Claude Code supports different model tiers (Sonnet, Haiku, Opus) for different use cases. You can configure different models for each tier:
+
+ #### Command Line
+
+ ```bash
+ # Set default model (ANTHROPIC_MODEL)
+ cckit set-model aliyun "glm-5"
+
+ # Set Sonnet model (ANTHROPIC_DEFAULT_SONNET_MODEL)
+ cckit set-model aliyun "glm-5" --type sonnet
+
+ # Set Haiku model (ANTHROPIC_DEFAULT_HAIKU_MODEL)
+ cckit set-model aliyun "kimi-k2.5" --type haiku
+
+ # Set Opus model (ANTHROPIC_DEFAULT_OPUS_MODEL)
+ cckit set-model aliyun "MiniMax-M2.5" --type opus
+ ```
+
+ #### Interactive Mode
+
+ In interactive mode (`cckit interactive`), after entering model names, you'll be asked:
+ ```
+ ? 是否为不同模型类型 (Sonnet/Haiku/Opus) 配置不同的默认模型?
+ ```
+
+ Select "Yes" to configure each model tier. You can choose "跳过 (使用默认模型)" for any tier you don't want to customize.
+
+ #### View Configuration
+
+ ```bash
+ # Show provider configuration including model tier settings
+ cckit show aliyun
+ ```
+
+ Output:
+ ```
+ Configuration: 阿里云 Coding Plan
+
+ ID: bf76bc20-87f1-4bd4-b091-20b9d1eabdb9
+ Name: Aliyun Coding Plan
+ Type: aliyun
+ ...
+ Models: qwen3.5-plus, kimi-k2.5, glm-5, MiniMax-M2.5
+ Active: glm-5
+ Sonnet: glm-5
+ Haiku: kimi-k2.5
+ Opus: MiniMax-M2.5
+ ```
+
+ #### Validation
+
+ The model must be in the provider's model list. If you try to set a model that doesn't exist:
+ ```bash
+ $ cckit set-model aliyun "non-existent-model" --type sonnet
+ Error: Model 'non-existent-model' not found in provider. Available models: qwen3.5-plus, kimi-k2.5, glm-5, MiniMax-M2.5
+ ```
+
  ### Provider Details
 
  | Provider | Default Model | Auth Method | Capabilities | Default Base URL |
  |----------|---------------|-------------|--------------|------------------|
- | 智谱LLM | GLM-4.7 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support | https://open.bigmodel.cn/api/anthropic |
- | MiniMax | MiniMax-M2 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Multi-language | https://api.minimaxi.com/anthropic |
- | Kimi | kimi-for-coding | ANTHROPIC_API_KEY | Chat, Long Context, Code Generation | https://api.kimi.com/coding/ |
+ | 智谱 Coding Plan | GLM-4.7 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support | https://open.bigmodel.cn/api/anthropic |
+ | MiniMax Coding Plan | MiniMax-M2 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Multi-language | https://api.minimaxi.com/anthropic |
+ | Kimi Coding Plan | kimi-for-coding | ANTHROPIC_API_KEY | Chat, Long Context, Code Generation | https://api.kimi.com/coding/ |
  | ZenMux.ai | claude-3-5-sonnet-20241022 | ANTHROPIC_API_KEY | Model Gateway, Multi-provider, Claude Compatible | https://zenmux.ai/api/anthropic |
- | Kuaishou StreamLake | kat-coder-pro-v1 | ANTHROPIC_API_KEY | Chat, Code Generation, Chinese Support, Video Understanding | https://wanqing.streamlakeapi.com/api/gateway/v1/endpoints/kat-coder-pro-v1/claude-code-proxy |
- | Volcengine | ark-code-latest | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support | https://ark.cn-beijing.volces.com/api/coding |
- | Aliyun | claude-sonnet-4-20250514 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support, Multimodal | https://coding.dashscope.aliyuncs.com/apps/anthropic |
+ | 快手 StreamLake Coding Plan | kat-coder-pro-v1 | ANTHROPIC_API_KEY | Chat, Code Generation, Chinese Support, Video Understanding | https://wanqing.streamlakeapi.com/api/gateway/v1/endpoints/kat-coder-pro-v1/claude-code-proxy |
+ | 火山引擎 Coding Plan | ark-code-latest | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support | https://ark.cn-beijing.volces.com/api/coding |
+ | 阿里云 Coding Plan | claude-sonnet-4-20250514 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support, Multimodal | https://coding.dashscope.aliyuncs.com/apps/anthropic |
+ | 腾讯云 Coding Plan | GLM-5 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support, Multimodal | https://api.lkeap.cloud.tencent.com/coding/anthropic |
+ | 讯飞 Coding Plan | GLM-5 | ANTHROPIC_AUTH_TOKEN | Chat, Code Generation, Chinese Support, Multimodal | https://maas-coding-api.cn-huabei-1.xf-yun.com/anthropic |
  | Claude | claude-3-5-sonnet-20241022 | ANTHROPIC_API_KEY | Chat, Code Generation, Analysis, Multimodal | Official |
  | Custom | - | ANTHROPIC_API_KEY | Custom, User-defined, Local LLM, Anthropic Compatible API | User-defined |
 
@@ -299,7 +596,7 @@ For more build options and details, see [BUILD.md](BUILD.md).
 
  ## Requirements
 
- - Node.js 18 or higher
+ - Node.js 20 or higher
  - pnpm (or npm/yarn)
  - Claude Code installed
  - Valid API keys for the providers you want to use
@@ -2,19 +2,24 @@ import { ConfigManager } from './config.js';
  export declare class Commands {
  private configManager;
  constructor(configManager: ConfigManager);
+ /**
+ * Interactive configuration wizard with arrow key selection
+ */
+ interactive(): Promise<void>;
  list(): void;
- switch(providerId: string): void;
+ switch(providerId?: string): Promise<void>;
  current(): void;
  configure(providerId: string, apiKey?: string, baseUrl?: string, models?: string[]): void;
  show(providerId: string): void;
  test(providerId: string): Promise<void>;
  models(providerId: string): void;
- setModel(providerId: string, modelName: string): Promise<void>;
+ setModel(providerId: string, modelName: string, modelType?: 'default' | 'sonnet' | 'haiku' | 'opus'): Promise<void>;
  removeModel(providerId: string, modelName: string): void;
  export(output?: string): void;
  import(file: string): void;
  reset(): void;
  private getProviderDisplayName;
  private askForSwitch;
+ private getModelTypeLabel;
  }
  //# sourceMappingURL=commands.d.ts.map
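The widened `setModel` signature in the declaration diff above (the new optional `modelType` parameter) can be exercised against a stub with the same shape. The stub below is purely hypothetical and exists only to show how callers pass the tier argument:

```typescript
// Stub mirroring the 0.3.1 setModel declaration shape; not cckit's
// Commands class. Records each call so the behavior is observable.
type ModelType = "default" | "sonnet" | "haiku" | "opus";

class StubCommands {
  readonly calls: Array<[string, string, ModelType]> = [];

  async setModel(
    providerId: string,
    modelName: string,
    modelType: ModelType = "default"
  ): Promise<void> {
    // Omitting modelType behaves like the pre-0.3.1 two-argument call.
    this.calls.push([providerId, modelName, modelType]);
  }
}
```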
@@ -1 +1 @@
- {"version":3,"file":"commands.d.ts","sourceRoot":"","sources":["../src/commands.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,aAAa,EAAsD,MAAM,aAAa,CAAC;AAKhG,qBAAa,QAAQ;IACP,OAAO,CAAC,aAAa;gBAAb,aAAa,EAAE,aAAa;IAEhD,IAAI,IAAI,IAAI;IAgCZ,MAAM,CAAC,UAAU,EAAE,MAAM,GAAG,IAAI;IAmBhC,OAAO,IAAI,IAAI;IA8Bf,SAAS,CACP,UAAU,EAAE,MAAM,EAClB,MAAM,CAAC,EAAE,MAAM,EACf,OAAO,CAAC,EAAE,MAAM,EAChB,MAAM,CAAC,EAAE,MAAM,EAAE,GAChB,IAAI;IAoEP,IAAI,CAAC,UAAU,EAAE,MAAM,GAAG,IAAI;IAuCxB,IAAI,CAAC,UAAU,EAAE,MAAM,GAAG,OAAO,CAAC,IAAI,CAAC;IA8B7C,MAAM,CAAC,UAAU,EAAE,MAAM,GAAG,IAAI;IA6B1B,QAAQ,CAAC,UAAU,EAAE,MAAM,EAAE,SAAS,EAAE,MAAM,GAAG,OAAO,CAAC,IAAI,CAAC;IAuEpE,WAAW,CAAC,UAAU,EAAE,MAAM,EAAE,SAAS,EAAE,MAAM,GAAG,IAAI;IA2CxD,MAAM,CAAC,MAAM,CAAC,EAAE,MAAM,GAAG,IAAI;IAkB7B,MAAM,CAAC,IAAI,EAAE,MAAM,GAAG,IAAI;IAa1B,KAAK,IAAI,IAAI;IASb,OAAO,CAAC,sBAAsB;YAgChB,YAAY;CAgB3B"}
+ {"version":3,"file":"commands.d.ts","sourceRoot":"","sources":["../src/commands.ts"],"names":[],"mappings":"AAIA,OAAO,EAAE,aAAa,EAAsD,MAAM,aAAa,CAAC;AAahG,qBAAa,QAAQ;IACP,OAAO,CAAC,aAAa;gBAAb,aAAa,EAAE,aAAa;IAEhD;;OAEG;IACG,WAAW,IAAI,OAAO,CAAC,IAAI,CAAC;IAiSlC,IAAI,IAAI,IAAI;IAsCN,MAAM,CAAC,UAAU,CAAC,EAAE,MAAM,GAAG,OAAO,CAAC,IAAI,CAAC;IAqDhD,OAAO,IAAI,IAAI;IAkCf,SAAS,CACP,UAAU,EAAE,MAAM,EAClB,MAAM,CAAC,EAAE,MAAM,EACf,OAAO,CAAC,EAAE,MAAM,EAChB,MAAM,CAAC,EAAE,MAAM,EAAE,GAChB,IAAI;IAuEP,IAAI,CAAC,UAAU,EAAE,MAAM,GAAG,IAAI;IAuDxB,IAAI,CAAC,UAAU,EAAE,MAAM,GAAG,OAAO,CAAC,IAAI,CAAC;IA8B7C,MAAM,CAAC,UAAU,EAAE,MAAM,GAAG,IAAI;IA6B1B,QAAQ,CACZ,UAAU,EAAE,MAAM,EAClB,SAAS,EAAE,MAAM,EACjB,SAAS,CAAC,EAAE,SAAS,GAAG,QAAQ,GAAG,OAAO,GAAG,MAAM,GAClD,OAAO,CAAC,IAAI,CAAC;IAsGhB,WAAW,CAAC,UAAU,EAAE,MAAM,EAAE,SAAS,EAAE,MAAM,GAAG,IAAI;IA2CxD,MAAM,CAAC,MAAM,CAAC,EAAE,MAAM,GAAG,IAAI;IAkB7B,MAAM,CAAC,IAAI,EAAE,MAAM,GAAG,IAAI;IAa1B,KAAK,IAAI,IAAI;IASb,OAAO,CAAC,sBAAsB;YAoChB,YAAY;IAO1B,OAAO,CAAC,iBAAiB;CAY1B"}