ghc-proxy 0.3.0 → 0.3.2

package/README.md CHANGED
@@ -48,7 +48,7 @@ This is the most common use case. There are two ways to set it up:
  bunx ghc-proxy@latest start --claude-code
  ```

- This starts the proxy, opens an interactive model picker, and copies a ready-to-paste environment command to your clipboard. Run that command in another terminal to launch Claude Code with the correct configuration.
+ This starts the proxy, opens an interactive model picker, and prints a ready-to-paste environment command. Run that command in another terminal to launch Claude Code with the correct configuration.

  ### Option B: Permanent config (Recommended)

@@ -62,7 +62,6 @@ Create or edit `~/.claude/settings.json` (this applies globally to all projects)
  "ANTHROPIC_MODEL": "claude-opus-4.6",
  "ANTHROPIC_DEFAULT_SONNET_MODEL": "claude-sonnet-4.6",
  "ANTHROPIC_DEFAULT_HAIKU_MODEL": "claude-haiku-4.5",
- "DISABLE_NON_ESSENTIAL_MODEL_CALLS": "1",
  "CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC": "1"
  },
  "permissions": {
@@ -86,7 +85,6 @@ bunx ghc-proxy@latest start --wait
  | `ANTHROPIC_MODEL` | The model Claude Code uses for primary/Opus tasks |
  | `ANTHROPIC_DEFAULT_SONNET_MODEL` | The model used for Sonnet-tier tasks |
  | `ANTHROPIC_DEFAULT_HAIKU_MODEL` | The model used for Haiku-tier (fast/cheap) tasks |
- | `DISABLE_NON_ESSENTIAL_MODEL_CALLS` | Prevents Claude Code from making extra API calls |
  | `CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC` | Disables telemetry and non-essential network traffic |

  > **Tip:** The model names above (e.g. `claude-opus-4.6`) are mapped to actual Copilot models by the proxy. See [Model Mapping](#model-mapping) below for details.
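
After this change, the `env` block from the hunk above reduces to the four remaining variables. Assembled as a fragment for reference (only the keys visible in this diff; not a complete `settings.json`):

```json
{
  "env": {
    "ANTHROPIC_MODEL": "claude-opus-4.6",
    "ANTHROPIC_DEFAULT_SONNET_MODEL": "claude-sonnet-4.6",
    "ANTHROPIC_DEFAULT_HAIKU_MODEL": "claude-haiku-4.5",
    "CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC": "1"
  }
}
```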
@@ -112,7 +110,7 @@ The proxy authenticates with GitHub using the [device code OAuth flow](https://d

  When the Copilot token response includes `endpoints.api`, `ghc-proxy` now prefers that runtime API base automatically instead of relying only on the configured account type. This keeps enterprise/business routing aligned with the endpoint GitHub actually returned for the current token.

- Incoming requests hit a [Hono](https://hono.dev/) server. `chat/completions` requests are validated, normalized into the shared planning pipeline, and then forwarded to Copilot. `responses` requests use a native Responses path with explicit compatibility policies. `messages` requests are routed per-model and can use native Anthropic passthrough, the Responses translation path, or the existing chat-completions fallback. The translator tracks exact vs lossy vs unsupported behavior explicitly; see the [Messages Routing and Translation Guide](./docs/messages-routing-and-translation.md) and the [Anthropic Translation Matrix](./docs/anthropic-translation-matrix.md) for the current support surface.
+ Incoming requests hit an [Elysia](https://elysiajs.com/) server. `chat/completions` requests are validated, normalized into the shared planning pipeline, and then forwarded to Copilot. `responses` requests use a native Responses path with explicit compatibility policies. `messages` requests are routed per-model and can use native Anthropic passthrough, the Responses translation path, or the existing chat-completions fallback. The translator tracks exact vs lossy vs unsupported behavior explicitly; see the [Messages Routing and Translation Guide](./docs/messages-routing-and-translation.md) and the [Anthropic Translation Matrix](./docs/anthropic-translation-matrix.md) for the current support surface.
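
The per-model routing of `messages` requests described in the added line can be pictured as a strategy selector. This is only an illustrative sketch: the function name and the model-matching rules here are hypothetical, not ghc-proxy's actual code.

```typescript
// Hypothetical sketch of per-model routing among the three documented
// strategies; the matching rules below are illustrative placeholders.
type MessagesStrategy =
  | "anthropic-passthrough"   // forward the Anthropic request body as-is
  | "responses-translation"   // translate to the Responses API path
  | "chat-completions";       // existing chat-completions fallback

function pickMessagesStrategy(model: string): MessagesStrategy {
  // Claude models might pass through natively when Copilot exposes them.
  if (model.startsWith("claude-")) return "anthropic-passthrough";
  // Models assumed to support the Responses API take the translation path.
  if (model.startsWith("gpt-5")) return "responses-translation";
  // Everything else falls back to chat-completions translation.
  return "chat-completions";
}
```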

  ### Request Routing

@@ -186,7 +184,7 @@ bunx ghc-proxy@latest debug # Print diagnostic info (version, paths, to
  | `--github-token` | `-g` | -- | Pass a GitHub token directly (from `auth`) |
  | `--claude-code` | `-c` | `false` | Generate a Claude Code launch command |
  | `--show-token` | -- | `false` | Display tokens on auth and refresh |
- | `--proxy-env` | -- | `false` | Use `HTTP_PROXY`/`HTTPS_PROXY` from env |
+ | `--proxy-env` | -- | `false` | Use `HTTP_PROXY`/`HTTPS_PROXY` from env (Node.js only; Bun reads proxy env natively) |
  | `--idle-timeout` | -- | `120` | Bun server idle timeout in seconds |
  | `--upstream-timeout` | -- | `300` | Upstream request timeout in seconds (0 to disable) |

@@ -228,7 +226,7 @@ When Claude Code sends a request for a model like `claude-sonnet-4.6`, the proxy
  | Prefix | Default Fallback |
  |--------|-----------------|
  | `claude-opus-*` | `claude-opus-4.6` |
- | `claude-sonnet-*` | `claude-sonnet-4.5` |
+ | `claude-sonnet-*` | `claude-sonnet-4.6` |
  | `claude-haiku-*` | `claude-haiku-4.5` |
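
The prefix table above (as it reads after this change) amounts to a first-match prefix lookup. A minimal sketch, with a hypothetical function name not taken from ghc-proxy's source:

```typescript
// Illustrative prefix-to-fallback matcher for the table above.
// Entries are checked in order; the first matching prefix wins.
const PREFIX_FALLBACKS: Array<[string, string]> = [
  ["claude-opus-", "claude-opus-4.6"],
  ["claude-sonnet-", "claude-sonnet-4.6"],
  ["claude-haiku-", "claude-haiku-4.5"],
];

function prefixFallback(model: string): string | undefined {
  for (const [prefix, fallback] of PREFIX_FALLBACKS) {
    if (model.startsWith(prefix)) return fallback;
  }
  return undefined; // unknown models are not remapped in this sketch
}
```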

  ### Customizing Fallbacks
@@ -237,7 +235,7 @@ You can override the defaults with **environment variables**:

  ```bash
  MODEL_FALLBACK_CLAUDE_OPUS=claude-opus-4.6
- MODEL_FALLBACK_CLAUDE_SONNET=claude-sonnet-4.5
+ MODEL_FALLBACK_CLAUDE_SONNET=claude-sonnet-4.6
  MODEL_FALLBACK_CLAUDE_HAIKU=claude-haiku-4.5
  ```

@@ -247,7 +245,7 @@ Or in the proxy's **config file** (`~/.local/share/ghc-proxy/config.json`):
  {
  "modelFallback": {
  "claudeOpus": "claude-opus-4.6",
- "claudeSonnet": "claude-sonnet-4.5",
+ "claudeSonnet": "claude-sonnet-4.6",
  "claudeHaiku": "claude-haiku-4.5"
  }
  }
@@ -255,6 +253,8 @@ Or in the proxy's **config file** (`~/.local/share/ghc-proxy/config.json`):

  **Priority order:** environment variable > config.json > built-in default.

+ > **Note:** Model fallbacks only apply to the **chat completions translation path**. The native Messages and Responses API strategies pass the model ID through to Copilot as-is.
+
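
The documented priority order (environment variable > config.json > built-in default) can be sketched as a small resolver. The function and constant names here are hypothetical; the env-var names, config keys, and defaults come from the README content above.

```typescript
// Illustrative resolver for the documented fallback priority order.
const BUILT_IN_DEFAULTS: Record<string, string> = {
  claudeOpus: "claude-opus-4.6",
  claudeSonnet: "claude-sonnet-4.6",
  claudeHaiku: "claude-haiku-4.5",
};

const ENV_KEYS: Record<string, string> = {
  claudeOpus: "MODEL_FALLBACK_CLAUDE_OPUS",
  claudeSonnet: "MODEL_FALLBACK_CLAUDE_SONNET",
  claudeHaiku: "MODEL_FALLBACK_CLAUDE_HAIKU",
};

function resolveFallback(
  tier: "claudeOpus" | "claudeSonnet" | "claudeHaiku",
  env: Record<string, string | undefined>,
  configFile: { modelFallback?: Record<string, string> },
): string {
  // 1. An environment variable wins outright.
  const fromEnv = env[ENV_KEYS[tier]];
  if (fromEnv) return fromEnv;
  // 2. Then the config file's modelFallback block.
  const fromConfig = configFile.modelFallback?.[tier];
  if (fromConfig) return fromConfig;
  // 3. Finally the built-in default.
  return BUILT_IN_DEFAULTS[tier];
}
```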
  ### Small-Model Routing

  `/v1/messages` can optionally reroute specific low-value requests to a cheaper model: