claude-glm-alt-installer 2.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md ADDED
# Claude-GLM Wrapper

Use [Z.AI's GLM models](https://z.ai) with [Claude Code](https://www.anthropic.com/claude-code) — **without losing your existing Claude setup!**

Switch freely between multiple AI providers: GLM, OpenAI, Gemini, OpenRouter, and Anthropic Claude.

## Why This Wrapper?

- **💰 Cost-effective**: Access to multiple providers with competitive pricing
- **🔄 Risk-free**: Your existing Claude Code setup remains completely untouched
- **⚡ Multiple options**: Choose between dedicated wrappers and a multi-provider proxy
- **🔀 In-session switching**: With ccx, switch models without restarting
- **🎯 Perfect for**: Development, testing, or whenever you want model flexibility

## Quick Start

### Universal Installation (All Platforms)

**One command works on Windows, macOS, and Linux:**

```bash
npx claude-glm-alt-installer
```

Then activate (platform-specific):
```bash
# macOS / Linux:
source ~/.zshrc # or ~/.bashrc

# Windows PowerShell:
. $PROFILE
```

### Start Using GLM Models

**All Platforms:**
```bash
ccg    # Claude Code with GLM-4.7 (latest)
ccg46  # Claude Code with GLM-4.6
ccg45  # Claude Code with GLM-4.5
ccf    # Claude Code with GLM-4.5-Air (faster)
cc     # Regular Claude Code
```

That's it! 🎉

---

### Alternative: Platform-Specific Installers

<details>
<summary>Click to expand platform-specific installation methods</summary>

#### macOS / Linux

```bash
bash <(curl -fsSL https://raw.githubusercontent.com/MohMaya/claude-glm-wrapper/main/install.sh)
source ~/.zshrc # or ~/.bashrc
```

#### Windows (PowerShell)

```powershell
iwr -useb https://raw.githubusercontent.com/MohMaya/claude-glm-wrapper/main/install.ps1 | iex
. $PROFILE
```

</details>

## Features

- 🚀 **Easy switching** between GLM and Claude models
- ⚡ **Multiple GLM models**: GLM-4.7 (latest), GLM-4.6, GLM-4.5, and GLM-4.5-Air (fast)
- 🔒 **No sudo/admin required**: Installs to your home directory
- 🖥️ **Cross-platform**: Works on Windows, macOS, and Linux
- 📁 **Isolated configs**: Each model uses its own config directory — no conflicts!
- 🔧 **Shell aliases**: Quick access with simple commands

## Prerequisites

1. **Node.js** (v14+): For the npx installer - [nodejs.org](https://nodejs.org/)
2. **Claude Code**: Install from [anthropic.com/claude-code](https://www.anthropic.com/claude-code)
3. **Z.AI API Key**: Get your free key from [z.ai/manage-apikey/apikey-list](https://z.ai/manage-apikey/apikey-list)

*Note: If you don't have Node.js, you can use the platform-specific installers (see Quick Start above).*

## Installation

### Method 1: npx (Recommended - All Platforms)

**One command for Windows, macOS, and Linux:**

```bash
npx claude-glm-alt-installer
```

The installer will:
- Auto-detect your operating system
- Check whether Claude Code is installed
- Ask for your Z.AI API key
- Create platform-appropriate wrapper scripts
- Add convenient aliases to your shell config or PowerShell profile (see the sketch below)
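
For reference, the alias block added to `~/.zshrc` / `~/.bashrc` on macOS/Linux looks like the sketch below (the same lines the Uninstallation section tells you to remove later). On Windows, the PowerShell profile gets equivalent `Set-Alias` entries.

```bash
# Claude Code Model Switcher Aliases
alias cc='claude'
alias ccg='claude-glm'
alias ccg46='claude-glm-4.6'
alias ccg45='claude-glm-4.5'
alias ccf='claude-glm-fast'
```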

After installation, **activate the changes**:

```bash
# macOS / Linux:
source ~/.zshrc # or ~/.bashrc

# Windows PowerShell:
. $PROFILE
```

### Method 2: Platform-Specific Installers

<details>
<summary>macOS / Linux</summary>

**One-Line Install:**
```bash
bash <(curl -fsSL https://raw.githubusercontent.com/MohMaya/claude-glm-wrapper/main/install.sh)
source ~/.zshrc # or ~/.bashrc
```

**Clone and Install:**
```bash
git clone https://github.com/MohMaya/claude-glm-wrapper.git
cd claude-glm-wrapper
bash install.sh
source ~/.zshrc
```

</details>

<details>
<summary>Windows (PowerShell)</summary>

**One-Line Install:**
```powershell
iwr -useb https://raw.githubusercontent.com/MohMaya/claude-glm-wrapper/main/install.ps1 | iex
. $PROFILE
```

**Clone and Install:**
```powershell
git clone https://github.com/MohMaya/claude-glm-wrapper.git
cd claude-glm-wrapper
.\install.ps1
. $PROFILE
```

**Note:** If you get an execution policy error, run:
```powershell
Set-ExecutionPolicy -Scope CurrentUser RemoteSigned
```

</details>

## Usage

### Available Commands & Aliases

The installer creates these commands and aliases:

| Alias | Full Command | What It Does | When to Use |
|-------|--------------|--------------|-------------|
| `cc` | `claude` | Regular Claude Code | Default - your normal Claude setup |
| `ccg` | `claude-glm` | GLM-4.7 (latest) | Best quality GLM model |
| `ccg46` | `claude-glm-4.6` | GLM-4.6 | Previous version of GLM |
| `ccg45` | `claude-glm-4.5` | GLM-4.5 | Legacy version of GLM |
| `ccf` | `claude-glm-fast` | GLM-4.5-Air (fast) | Quicker responses, lower cost |
| `ccx` | `ccx` | Multi-provider proxy | Switch between providers in-session |

**💡 Tip**: Use the short aliases! They're faster to type and easier to remember.

**🆕 New: ccx Multi-Provider Proxy**

The `ccx` command starts a local proxy that lets you switch between multiple AI providers in a single session:
- **OpenAI**: GPT-4o, GPT-4o-mini, and more
- **OpenRouter**: Access to hundreds of models
- **Google Gemini**: Gemini 1.5 Pro and Flash
- **Z.AI GLM**: GLM-4.7, GLM-4.6, GLM-4.5, GLM-4.5-Air
- **Anthropic**: Claude 3.5 Sonnet, etc.

Switch models mid-session using `/model <provider>:<model-name>`. Perfect for comparing responses or using the right model for each task!

### How It Works

Each command starts a **separate Claude Code session** with a different configuration:
- `ccg`, `ccg46`, `ccg45`, and `ccf` use Z.AI's API with your Z.AI key
- `cc` uses Anthropic's API with your Anthropic key (default Claude setup)
- Your configurations **never conflict** — they're stored in separate directories

### Basic Examples

**Start a coding session with the latest GLM:**
```bash
ccg
# Opens Claude Code using GLM-4.7
```

**Use GLM-4.6:**
```bash
ccg46
# Opens Claude Code using GLM-4.6
```

**Use GLM-4.5:**
```bash
ccg45
# Opens Claude Code using GLM-4.5
```

**Need faster responses? Use the fast model:**
```bash
ccf
# Opens Claude Code using GLM-4.5-Air
```

**Use regular Claude:**
```bash
cc
# Opens Claude Code with Anthropic models (your default setup)
```

**Pass arguments like normal:**
```bash
ccg --help
ccg "refactor this function"
ccf "quick question about Python"
```

## Common Workflows

### Workflow 1: Testing with GLM, Production with Claude
```bash
# Develop and test with cost-effective GLM-4.7
ccg
# ... work on your code ...
# exit

# Switch to Claude for final review
cc
# ... final review with Claude ...
```

### Workflow 2: Quick Questions with Fast Model
```bash
# Quick syntax questions
ccf "how do I use async/await in Python?"

# Complex refactoring with latest GLM
ccg
# ... longer coding session ...
```

### Workflow 3: Multiple Projects
```bash
# Project 1: Use GLM to save costs
cd ~/project1
ccg

# Project 2: Use Claude for critical work
cd ~/project2
cc
```

**Each session is independent** — your chat history stays separate!

## Using ccx (Multi-Provider Proxy)

### Setup

After installation, configure your API keys:

```bash
# First time setup
ccx --setup
```

This creates `~/.claude-proxy/.env`. Edit it to add your API keys:

```bash
# macOS / Linux
nano ~/.claude-proxy/.env

# Windows (PowerShell)
notepad "$env:USERPROFILE\.claude-proxy\.env"
```

Add keys for the providers you want to use:

```ini
# OpenAI
OPENAI_API_KEY=sk-...

# OpenRouter
OPENROUTER_API_KEY=sk-or-...

# Gemini
GEMINI_API_KEY=AIza...

# Z.AI GLM
GLM_UPSTREAM_URL=https://api.z.ai/api/anthropic
ZAI_API_KEY=...

# Anthropic (if you want to route through the proxy)
ANTHROPIC_UPSTREAM_URL=https://api.anthropic.com
ANTHROPIC_API_KEY=sk-ant-...
```
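
Since this file holds API keys, it's worth restricting its permissions on macOS/Linux (an optional hardening step, not something the installer is documented to do for you):

```bash
# Optional: make the key file readable and writable only by your user
chmod 600 ~/.claude-proxy/.env
```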

### Starting ccx

```bash
ccx
```

The proxy starts automatically and Claude Code connects to it.
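
Conceptually, `ccx` points Claude Code at the local proxy instead of a remote API before launching it. The sketch below is illustrative only; the real `ccx` script may differ, and port 17870 is the default mentioned under Troubleshooting:

```bash
# Rough sketch of the ccx launch step, not the actual script
export ANTHROPIC_BASE_URL="http://127.0.0.1:${CLAUDE_PROXY_PORT:-17870}"  # talk to the local proxy
exec claude "$@"  # the proxy then routes each request to the selected provider
```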

### Switching Models

Use Claude Code's built-in `/model` command with provider prefixes:

```
/model openai:gpt-4o
/model openai:gpt-4o-mini
/model openrouter:anthropic/claude-3.5-sonnet
/model openrouter:meta-llama/llama-3.1-70b-instruct
/model gemini:gemini-1.5-pro
/model gemini:gemini-1.5-flash
/model glm:glm-4.7
/model glm:glm-4.6
/model glm:glm-4.5
/model anthropic:claude-3-5-sonnet-20241022
```

### ccx Workflows

**Workflow 1: Compare Model Responses**
```bash
ccx
# Ask a question
/model openai:gpt-4o
# Ask the same question
/model gemini:gemini-1.5-pro
# Ask again - compare the responses!
```

**Workflow 2: Cost Optimization**
```bash
ccx
# Start with a fast, cheap model for exploration
/model glm:glm-4.5-air
# ... work on the problem ...
# Switch to a more powerful model when needed
/model openai:gpt-4o
```

**Workflow 3: Leverage Model Strengths**
```bash
ccx
# Use GPT-4o for coding
/model openai:gpt-4o
# ... write code ...
# Use Claude for writing/docs
/model openrouter:anthropic/claude-3.5-sonnet
# ... write documentation ...
```

### ccx Advantages

- ✅ **Single Session**: No need to exit and restart
- ✅ **Context Preserved**: Chat history continues across model switches
- ✅ **Easy Comparison**: Switch models to compare responses
- ✅ **Flexibility**: Use the best model for each task
- ✅ **Provider Options**: OpenAI, OpenRouter, Gemini, GLM, Anthropic

### ccx vs Dedicated Wrappers

| Feature | ccx | ccg/ccg46/ccg45/ccf |
|---------|-----|---------------------|
| Switch models in-session | ✅ Yes | ❌ No |
| Multiple providers | ✅ Yes | ❌ GLM only |
| Separate chat history | ❌ No | ✅ Yes |
| Simple setup | ✅ .env file | ✅ Installer |
| Overhead | Proxy startup | None |

- **Use ccx when**: You want flexibility and in-session switching
- **Use dedicated wrappers when**: You want separate histories for different models

## Configuration Details

### Where Things Are Stored

Each wrapper uses its own configuration directory to prevent conflicts:

**macOS / Linux:**

| Command | Config Directory | Purpose |
|---------|-----------------|---------|
| `claude-glm` | `~/.claude-glm/` | GLM-4.7 settings and history |
| `claude-glm-4.6` | `~/.claude-glm-46/` | GLM-4.6 settings and history |
| `claude-glm-4.5` | `~/.claude-glm-45/` | GLM-4.5 settings and history |
| `claude-glm-fast` | `~/.claude-glm-fast/` | GLM-4.5-Air settings and history |
| `claude` | `~/.claude/` (default) | Your original Claude setup |

**Windows:**

| Command | Config Directory | Purpose |
|---------|-----------------|---------|
| `claude-glm` | `%USERPROFILE%\.claude-glm\` | GLM-4.7 settings and history |
| `claude-glm-4.6` | `%USERPROFILE%\.claude-glm-46\` | GLM-4.6 settings and history |
| `claude-glm-4.5` | `%USERPROFILE%\.claude-glm-45\` | GLM-4.5 settings and history |
| `claude-glm-fast` | `%USERPROFILE%\.claude-glm-fast\` | GLM-4.5-Air settings and history |
| `claude` | `%USERPROFILE%\.claude\` (default) | Your original Claude setup |

**This means:**
- ✅ Your original Claude settings are **never touched**
- ✅ Chat histories stay separate for each model
- ✅ API keys are isolated — no mixing!
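
You can see this isolation for yourself on macOS/Linux: once a wrapper has been run at least once, its directory appears alongside your untouched default config.

```bash
# List the per-model config directories (macOS / Linux)
ls -d ~/.claude ~/.claude-glm* 2>/dev/null
```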

### Wrapper Scripts Location

**macOS / Linux:** `~/.local/bin/`
- `claude-glm` (GLM-4.7)
- `claude-glm-4.6` (GLM-4.6)
- `claude-glm-4.5` (GLM-4.5)
- `claude-glm-fast` (GLM-4.5-Air)

**Windows:** `%USERPROFILE%\.local\bin\`
- `claude-glm.ps1` (GLM-4.7)
- `claude-glm-4.6.ps1` (GLM-4.6)
- `claude-glm-4.5.ps1` (GLM-4.5)
- `claude-glm-fast.ps1` (GLM-4.5-Air)

These are just tiny wrapper scripts (bash or PowerShell) that set the right environment variables before launching Claude Code.

## Updating Your API Key

### macOS / Linux

**Option 1: Use the Installer**
```bash
cd claude-glm-wrapper && bash install.sh
# Choose option "1) Update API key only"
```

**Option 2: Edit Manually**
```bash
nano ~/.local/bin/claude-glm
nano ~/.local/bin/claude-glm-4.6
nano ~/.local/bin/claude-glm-4.5
nano ~/.local/bin/claude-glm-fast
# Find and replace ANTHROPIC_AUTH_TOKEN value
```
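
If you'd rather not open each file, a one-liner along these lines should also work, assuming each wrapper contains a line that sets `ANTHROPIC_AUTH_TOKEN` (the exact line format in the generated scripts may vary, so check the result):

```bash
# Replace the Z.AI key in all four wrappers; .bak backups are kept
NEW_KEY="your-new-zai-api-key"
for f in ~/.local/bin/claude-glm ~/.local/bin/claude-glm-4.6 \
         ~/.local/bin/claude-glm-4.5 ~/.local/bin/claude-glm-fast; do
  sed -i.bak "s|ANTHROPIC_AUTH_TOKEN=.*|ANTHROPIC_AUTH_TOKEN=\"$NEW_KEY\"|" "$f"
done
```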

### Windows (PowerShell)

**Option 1: Use the Installer**
```powershell
cd claude-glm-wrapper
.\install.ps1
# Choose option "1) Update API key only"
```

**Option 2: Edit Manually**
```powershell
notepad "$env:USERPROFILE\.local\bin\claude-glm.ps1"
notepad "$env:USERPROFILE\.local\bin\claude-glm-4.6.ps1"
notepad "$env:USERPROFILE\.local\bin\claude-glm-4.5.ps1"
notepad "$env:USERPROFILE\.local\bin\claude-glm-fast.ps1"
# Find and replace $ZaiApiKey value
```

## How It Works (Technical Details)

The wrapper scripts work by setting environment variables before launching Claude Code:

| Environment Variable | What It Does |
|---------------------|--------------|
| `ANTHROPIC_BASE_URL` | Points to Z.AI's API endpoint |
| `ANTHROPIC_AUTH_TOKEN` | Your Z.AI API key |
| `ANTHROPIC_MODEL` | Which model to use (glm-4.7, glm-4.6, glm-4.5, or glm-4.5-air) |
| `CLAUDE_HOME` | Where to store config files |

Claude Code reads these variables and uses them instead of the defaults. Simple! 🎯
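
Putting that together, a wrapper is little more than a few exports followed by `claude` itself. The sketch below is illustrative, not the exact script the installer generates; the endpoint is the Z.AI URL shown in the ccx `.env` example, and the key is a placeholder:

```bash
#!/usr/bin/env bash
# Illustrative sketch of ~/.local/bin/claude-glm (details may differ from the generated script)
export ANTHROPIC_BASE_URL="https://api.z.ai/api/anthropic"  # Z.AI's Anthropic-compatible endpoint
export ANTHROPIC_AUTH_TOKEN="your-zai-api-key"              # placeholder for your Z.AI key
export ANTHROPIC_MODEL="glm-4.7"                            # glm-4.6 / glm-4.5 / glm-4.5-air in the other wrappers
export CLAUDE_HOME="$HOME/.claude-glm"                      # isolated config directory
exec claude "$@"                                            # hand over to Claude Code
```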

## Troubleshooting

### ❌ "claude command not found"

**Problem**: Claude Code isn't installed or isn't in your PATH.

**Solutions**:
1. Install Claude Code from [anthropic.com/claude-code](https://www.anthropic.com/claude-code)
2. Or add Claude to your PATH if it's installed elsewhere

**Test it**: Run `which claude` (macOS/Linux) or `Get-Command claude` (Windows PowerShell) — it should show a path.

### ❌ "ccg: command not found" (or ccg46, ccg45, ccf, cc)

**Problem**: You didn't source your shell config after installation.

**Solution**: Run the source command the installer showed you:
```bash
source ~/.zshrc # or ~/.bashrc
```

**Still not working?** Try opening a new terminal window.
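
You can also check whether the alias block made it into your shell config (macOS/Linux):

```bash
type ccg                                             # should report an alias pointing at claude-glm
grep -n "claude-glm" ~/.zshrc ~/.bashrc 2>/dev/null  # shows the installer's alias lines, if present
```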

### ❌ API Authentication Errors

**Problem**: API key issues.

**Solutions for ccg/ccg46/ccg45/ccf**:
1. **Check your key**: Visit [z.ai/manage-apikey/apikey-list](https://z.ai/manage-apikey/apikey-list)
2. **Verify credits**: Make sure your Z.AI account has available credits
3. **Update the key**: Run `bash install.sh` and choose "Update API key only"

**Solutions for ccx**:
1. **Check your .env file**: Edit `~/.claude-proxy/.env`
2. **Verify keys are set**: Make sure the API keys for the providers you're using are filled in (see the check below)
3. **Unused providers**: If you're not using a provider, leave its key blank or remove the line entirely
4. **Reload**: Restart ccx after editing .env
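
A quick way to see which providers have keys filled in, without printing the secrets themselves:

```bash
# Lines still ending in "=" are unset; anything set is masked as <set>
grep -E '_(API_KEY|UPSTREAM_URL)=' ~/.claude-proxy/.env | sed -E 's/=.+/=<set>/'
```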

### ❌ ccx Proxy Won't Start

**Problem**: The proxy fails to start or times out.

**Solutions**:
1. **Check logs**: Look at `/tmp/claude-proxy.log` (Unix) or `%TEMP%\claude-proxy.log` (Windows)
2. **Port in use**: Another process might be using port 17870. Set `CLAUDE_PROXY_PORT=17871` in .env (see the check below)
3. **Missing dependencies**: Run `npm install -g tsx` to ensure the TypeScript runner is available
4. **Check adapters**: Ensure the `~/.claude-proxy/adapters/` directory exists and contains TS files
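
To check whether the default port is already taken (macOS/Linux):

```bash
# Anything listed here is already bound to the proxy's default port
lsof -i :17870
# If something shows up, set CLAUDE_PROXY_PORT=17871 in ~/.claude-proxy/.env instead
```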

### ❌ Models Don't Switch in ccx

**Problem**: The `/model` command doesn't seem to work.

**Solutions**:
1. **Check the provider prefix**: Use the format `/model provider:model-name` (e.g., `/model openai:gpt-4o`)
2. **Verify the API key**: Make sure the provider's API key is set in `~/.claude-proxy/.env`
3. **Check proxy logs**: Look for errors in `/tmp/claude-proxy.log`

### ❌ Wrong Model Being Used

**Problem**: You ran `ccg` but requests are going to the wrong API.

**Solution**: Each command is independent. Make sure you:
- Exit any running Claude Code session
- Start fresh with the command you want (`ccg`, `ccg46`, `ccg45`, `ccf`, or `cc`)

### 🪟 Windows-Specific Issues

**❌ "cannot be loaded because running scripts is disabled"**

**Problem**: PowerShell execution policy prevents running scripts.

**Solution**:
```powershell
Set-ExecutionPolicy -Scope CurrentUser RemoteSigned
```

**❌ "ccg: The term 'ccg' is not recognized"**

**Problem**: The PowerShell profile wasn't reloaded after installation.

**Solutions**:
1. Reload the profile: `. $PROFILE`
2. Or restart PowerShell
3. Or run the full command: `claude-glm`

**❌ PATH not updated**

**Problem**: The `~/.local/bin` or `$env:USERPROFILE\.local\bin` directory isn't in your PATH.

**Solution**: The installer adds it automatically, but you may need to restart PowerShell for the change to take effect.

### 💡 General Tips

- **Open a new terminal**: After installation, the aliases work in new terminals automatically
- **Check the greeting**: Each command prints which model it's using when it starts
- **Test it**: Run `ccg --version` to verify the command works

## Uninstallation

### macOS / Linux

**Remove wrapper scripts:**
```bash
rm ~/.local/bin/claude-glm
rm ~/.local/bin/claude-glm-4.6
rm ~/.local/bin/claude-glm-4.5
rm ~/.local/bin/claude-glm-fast
```

**Remove config directories** (optional - deletes chat history):
```bash
rm -rf ~/.claude-glm
rm -rf ~/.claude-glm-46
rm -rf ~/.claude-glm-45
rm -rf ~/.claude-glm-fast
```

**Remove aliases** from `~/.zshrc` or `~/.bashrc`:
```bash
# Delete these lines:
# Claude Code Model Switcher Aliases
alias cc='claude'
alias ccg='claude-glm'
alias ccg46='claude-glm-4.6'
alias ccg45='claude-glm-4.5'
alias ccf='claude-glm-fast'
```

Then run: `source ~/.zshrc`

### Windows (PowerShell)

**Remove wrapper scripts:**
```powershell
Remove-Item "$env:USERPROFILE\.local\bin\claude-glm.ps1"
Remove-Item "$env:USERPROFILE\.local\bin\claude-glm-4.6.ps1"
Remove-Item "$env:USERPROFILE\.local\bin\claude-glm-4.5.ps1"
Remove-Item "$env:USERPROFILE\.local\bin\claude-glm-fast.ps1"
```

**Remove config directories** (optional - deletes chat history):
```powershell
Remove-Item -Recurse "$env:USERPROFILE\.claude-glm"
Remove-Item -Recurse "$env:USERPROFILE\.claude-glm-46"
Remove-Item -Recurse "$env:USERPROFILE\.claude-glm-45"
Remove-Item -Recurse "$env:USERPROFILE\.claude-glm-fast"
```

**Remove aliases** from PowerShell profile:
```powershell
notepad $PROFILE
# Delete these lines:
# Claude Code Model Switcher Aliases
Set-Alias cc claude
Set-Alias ccg claude-glm
Set-Alias ccg46 claude-glm-4.6
Set-Alias ccg45 claude-glm-4.5
Set-Alias ccf claude-glm-fast
```

Then reload: `. $PROFILE`

## FAQ

### Q: Will this affect my existing Claude Code setup?
**A**: No! Your regular Claude Code setup is completely untouched. The wrappers use separate config directories.

### Q: Can I use both GLM and Claude in the same project?
**A**: Yes! Just use `ccg` for GLM sessions and `cc` for Claude sessions. Each maintains its own chat history. Or use `ccx` to switch between providers in a single session.

### Q: Which should I use: ccx or the dedicated wrappers (ccg/ccg46/ccg45/ccf)?
**A**:
- **Use ccx** if you want to switch between multiple providers (OpenAI, Gemini, OpenRouter, GLM, Anthropic) in the same session
- **Use dedicated wrappers** if you want separate chat histories for different models/providers

### Q: Which model should I use?
**A**:
- Use **`ccx`** for: Maximum flexibility, model comparison, leveraging different model strengths
- Use **`ccg` (GLM-4.7)** for: The latest model, complex coding, refactoring, detailed explanations
- Use **`ccg46` (GLM-4.6)** for: The previous version, if you need consistency with older projects
- Use **`ccg45` (GLM-4.5)** for: The legacy version, if you need consistency with older projects
- Use **`ccf` (GLM-4.5-Air)** for: Quick questions, simple tasks, faster responses
- Use **`cc` (Claude)** for: Your regular Anthropic Claude setup

### Q: How do I switch models in ccx?
**A**: Use the `/model` command with the format `<provider>:<model-name>`. For example:
- `/model openai:gpt-4o`
- `/model gemini:gemini-1.5-pro`
- `/model glm:glm-4.7`
- `/model glm:glm-4.6`

### Q: Is this secure?
**A**: Yes! Your API keys are stored locally on your machine in the wrapper scripts (bash or PowerShell, depending on your OS). Keep your scripts directory secure with appropriate permissions.

### Q: Does this work on Windows?
**A**: Yes! Use `npx claude-glm-alt-installer` or the PowerShell installer (install.ps1). Windows, macOS, and Linux are all fully supported.

### Q: Can I use a different Z.AI model?
**A**: Yes! Edit the wrapper scripts in `~/.local/bin/` and change the `ANTHROPIC_MODEL` variable to any model Z.AI supports.
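
For example, the relevant line inside a wrapper would look something like this (the exact formatting in the generated script may differ):

```bash
# Inside ~/.local/bin/claude-glm: point the wrapper at a different Z.AI model
export ANTHROPIC_MODEL="glm-4.6"
```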

### Q: What happens if I run out of Z.AI credits?
**A**: The GLM commands will fail with an API error. Just switch to regular Claude using `cc` until you add more credits.

## Contributing

Found a bug? Have an idea? Contributions are welcome!

- 🐛 **Report issues**: [GitHub Issues](https://github.com/MohMaya/claude-glm-wrapper/issues)
- 🔧 **Submit PRs**: Fork, improve, and open a pull request
- 💡 **Share feedback**: Tell us how you're using this tool!

## License

MIT License - see [LICENSE](LICENSE) file for details.

**TL;DR**: Free to use, modify, and distribute. No warranty provided.

## Acknowledgments

- 🙏 [Z.AI](https://z.ai) for providing GLM model API access
- 🙏 [Anthropic](https://anthropic.com) for Claude Code
- 🙏 You, for using this tool!

---

**⭐ Found this useful?** Give it a star on GitHub and share it with others!