@paean-ai/zero-cli 0.8.0 → 0.9.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +35 -15
- package/dist/_cli.js +1061 -1061
- package/package.json +3 -3
package/README.md
CHANGED

@@ -97,9 +97,9 @@ zero provider cache clear
 
 | Tier | Model | Context |
 |------|-------|---------|
-| **
-| **
-| **
+| **pro** (default) | `GLM-5.1` | 200K |
+| **flash** | `GLM-4.7` | 200K |
+| **lite** | `GLM-4.5-Air` | 128K |
 
 Get an API key at https://bigmodel.cn/usercenter/proj-mgmt/apikeys.
 
@@ -107,9 +107,9 @@ Get an API key at https://bigmodel.cn/usercenter/proj-mgmt/apikeys.
 
 | Tier | Model | Context |
 |------|-------|---------|
-| **
-| **
-| **
+| **pro** (default) | `ark-code-latest` | 256K |
+| **flash** | `ark-code-latest` | 256K |
+| **lite** | `ark-code-latest` | 256K |
 
 Volcengine's coding endpoint serves a single unified model (`ark-code-latest`) across all tiers. Pin to a specific snapshot with `zero provider set volcengine --opus <snapshot-id>` or via the wizard.
 
@@ -119,24 +119,44 @@ Get an API key at https://console.volcengine.com/ark.
 
 | Tier | Model | Context |
 |------|-------|---------|
-| **
-| **
-| **
+| **pro** | `deepseek-v4-pro` | 1M |
+| **flash** (default) | `deepseek-v4-flash` | 1M |
+| **lite** | `deepseek-v4-flash` | 1M |
 
-Two-model lineup: `pro` is the flagship
+Two-model lineup: `deepseek-v4-pro` is the flagship, `deepseek-v4-flash` is the daily driver. The default tier is `flash`, mirroring DeepSeek's own pricing-page recommendation.
 
 Get an API key at https://platform.deepseek.com/api_keys.
 
-## Model
+## Model Tiers
 
-Zero maps three capability tiers
+Zero maps three capability tiers to any supported model:
+
+| Preferred alias | Legacy alias | Role |
+|---|---|---|
+| **pro** | `opus` | Flagship — most capable, highest cost |
+| **flash** | `sonnet` | Standard — best for everyday tasks |
+| **lite** | `haiku` | Fast / cheap — quick answers, simple tasks |
+
+Switch tiers in-session with either name — both resolve to the same model:
+
+```
+/model pro    # or /model opus
+/model flash  # or /model sonnet
+/model lite   # or /model haiku
+```
+
+The `pro` / `flash` / `lite` naming aligns with how third-party Anthropic-protocol providers (DeepSeek, Volcengine, GLM, Gemini, Qwen) tier their own catalogs. Zero CLI's display layer prefers these names; the legacy `opus` / `sonnet` / `haiku` remain accepted inputs for backward compatibility.
+
+`zero provider set` writes the right env vars for you; to override manually:
 
 ```bash
-export ZERO_CLI_DEFAULT_OPUS_MODEL=GLM-5.1
-export ZERO_CLI_DEFAULT_SONNET_MODEL=GLM-4.7
-export ZERO_CLI_DEFAULT_HAIKU_MODEL=GLM-4.5-Air
+export ZERO_CLI_DEFAULT_OPUS_MODEL=GLM-5.1      # pro tier
+export ZERO_CLI_DEFAULT_SONNET_MODEL=GLM-4.7    # flash tier
+export ZERO_CLI_DEFAULT_HAIKU_MODEL=GLM-4.5-Air # lite tier
 ```
 
+(Env-var infixes still use the canonical internal form `OPUS` / `SONNET` / `HAIKU` so existing scripts and CI keep working.)
+
 Or switch models in-session with `/model`.
 
 ### Supported Models
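The new README text says the `pro` / `flash` / `lite` aliases and the legacy `opus` / `sonnet` / `haiku` names resolve to the same tier, while env vars keep the canonical `OPUS` / `SONNET` / `HAIKU` infix. A minimal shell sketch of that mapping, for readers auditing the rename — the `tier_to_infix` helper and its table are illustrative assumptions, not taken from the CLI source:

```shell
# Hypothetical helper: resolve either alias family to the canonical
# infix used by ZERO_CLI_DEFAULT_<INFIX>_MODEL environment variables.
tier_to_infix() {
  case "$1" in
    pro|opus)     echo "OPUS" ;;
    flash|sonnet) echo "SONNET" ;;
    lite|haiku)   echo "HAIKU" ;;
    *)            echo "unknown tier: $1" >&2; return 1 ;;
  esac
}

# Both alias families land on the same env-var infix:
tier_to_infix pro     # OPUS
tier_to_infix sonnet  # SONNET
```

Under this assumed mapping, a script written against `ZERO_CLI_DEFAULT_SONNET_MODEL` keeps working after the display-layer rename to `flash`.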