@khanglvm/llm-router 1.1.1 → 1.3.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -5,6 +5,46 @@ All notable changes to this project will be documented in this file.
  The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
  and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+ ## [1.3.0] - 2026-03-05
+
+ ### Added
+ - Added end-to-end Claude Code OAuth subscription provider support:
+   - new subscription type: `claude-code`
+   - Claude OAuth constants and runtime request config (`anthropic-beta`, OAuth token endpoint, Claude messages endpoint)
+   - default Claude subscription model seed list for new subscription providers
+ - Added CLI support for Claude subscription auth operations:
+   - `llm-router subscription login --subscription-type=claude-code`
+   - `llm-router subscription logout --subscription-type=claude-code`
+   - `llm-router subscription status --subscription-type=claude-code`
+ - Added runtime and CLI test coverage for Claude subscription request translation/headers and setup flows.
+
+ ### Changed
+ - Updated the subscription probe and provider upsert flow to build type-specific probe payloads:
+   - ChatGPT Codex keeps the Responses/Codex probe shape
+   - Claude Code uses the Claude messages probe shape
+ - Updated subscription config normalization and workflows so the default format and model seed list are selected by `subscriptionType`.
+ - Updated the README and CLI help text/examples to document both supported OAuth subscription types (`chatgpt-codex`, `claude-code`).
+
+ ### Fixed
+ - Fixed default model seeding when creating a new Claude subscription provider so it uses Claude defaults instead of ChatGPT defaults.
+
+ ## [1.2.0] - 2026-03-04
+
+ ### Added
+ - Added a Codex Responses API compatibility layer:
+   - request transformation into the Codex Responses payload shape
+   - response transformation from Codex responses/SSE events to OpenAI Chat Completions-compatible output
+   - dedicated runtime tests covering both request and response transformation
+ - Added an explicit project-level ignore for the local `AGENTS.md`.
+
+ ### Changed
+ - Improved TUI/CLI operation reports with user-friendly structured layouts and tables across provider, model-alias, rate-limit, and config flows and operational actions.
+ - Improved startup/deploy/worker-key/status outputs to show friendly fields instead of raw config-variable dumps.
+ - Updated subscription auth/provider flow behavior and tests for more robust OAuth/Codex subscription handling.
+
+ ### Fixed
+ - Fixed migration/reporting test expectations and summary rendering stability after the report format refactor.
+
  ## [1.1.1] - 2026-03-04
 
  ### Fixed
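The type-specific probe payloads introduced in 1.3.0 can be sketched roughly as follows. This is a minimal illustration of the idea only: the field shapes follow the public Responses and Claude messages request formats, and the function name and model IDs are assumptions, not the package's actual internals.

```typescript
// Illustrative sketch of a type-specific probe payload builder.
// Shapes and model IDs here are assumptions for illustration.
type SubscriptionType = "chatgpt-codex" | "claude-code";

function buildProbePayload(subscriptionType: SubscriptionType, model: string) {
  if (subscriptionType === "claude-code") {
    // Claude Code: a minimal Claude messages-shaped probe request.
    return {
      model,
      max_tokens: 1,
      messages: [{ role: "user", content: "ping" }],
    };
  }
  // ChatGPT Codex: keeps a Responses-style probe shape.
  return {
    model,
    input: [{ role: "user", content: [{ type: "input_text", text: "ping" }] }],
  };
}

const claudeProbe = buildProbePayload("claude-code", "claude-sonnet");
const codexProbe = buildProbePayload("chatgpt-codex", "gpt-codex");
```

The point of branching on the subscription type is that the two upstream APIs reject each other's request shapes, so a single shared probe payload cannot validate both provider kinds.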
package/README.md CHANGED
@@ -77,31 +77,43 @@ Then follow this order.
  Flow:
  1. `Config manager`
  2. `Add/Edit provider`
- 3. Select provider type:
-    - `standard` -> endpoint + API key + model list
-    - `subscription` -> OAuth profile for predefined ChatGPT Codex models
- 4. Enter provider details
+ 3. Select a provider auth mode:
+    - `API Key` -> endpoint + API key + model list
+    - `OAuth` -> browser OAuth + editable model list
+ 4. For `OAuth`:
+    - Choose a subscription provider (`ChatGPT` or `Claude Code`)
+    - Enter a Friendly Name and Provider ID
+    - Complete the browser OAuth login inside the same flow
+    - Edit the model list (pre-filled defaults; you can add/remove)
+    - llm-router live-tests every selected model before saving
  5. Save
 
- ### 1b) Add Subscription Provider (ChatGPT Codex)
- Commandline example:
+ ### 1b) Add Subscription Provider (OAuth)
+ Command-line examples:
 
  ```bash
+ # ChatGPT Codex subscription
  llm-router config \
  --operation=upsert-provider \
  --provider-id=chatgpt \
- --name="ChatGPT Subscription" \
- --type=subscription \
- --subscription-type=chatgpt-codex \
- --subscription-profile=default
+ --name="GPT Sub" \
+ --type=subscription
 
- llm-router subscription login --profile=default
- llm-router subscription status --profile=default
+ # Claude Code subscription
+ llm-router config \
+ --operation=upsert-provider \
+ --provider-id=claude-sub \
+ --name="Claude Sub" \
+ --type=subscription \
+ --subscription-type=claude-code
  ```
 
  Notes:
- - `chatgpt-codex` subscription providers use predefined model IDs managed by llm-router releases.
- - No provider API key or endpoint probing is required for this provider type.
+ - OAuth login runs during provider upsert (browser flow by default).
+ - Supported `subscription-type` values: `chatgpt-codex` and `claude-code` (default: `chatgpt-codex`).
+ - Default model lists are pre-filled by subscription type and remain editable.
+ - Device-code login is available for `chatgpt-codex` only.
+ - No provider API key or endpoint probe input is required for subscription mode.
 
  ### 2) Configure Model Fallback (Optional)
  Flow:
@@ -161,6 +173,7 @@ Local endpoints:
  - Unified: `http://127.0.0.1:<port>/route`
  - Anthropic-style: `http://127.0.0.1:<port>/anthropic`
  - OpenAI-style: `http://127.0.0.1:<port>/openai`
+ - OpenAI Responses-style: `http://127.0.0.1:<port>/openai/v1/responses` (Codex CLI-compatible)
 
  ## Connect your coding tool
 
@@ -190,6 +203,8 @@ When local server is running:
 
  No stop/start cycle needed.
 
+ Config/status outputs are shown in structured table layouts for easier operator review.
+
  ## Cloudflare Worker (Hosted)
 
  Use when you want a hosted endpoint instead of a local server.
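The Responses compatibility layer added in 1.2.0 translates Codex Responses SSE events into OpenAI Chat Completions chunks. A minimal sketch of that translation idea is below; the event names follow the public OpenAI Responses streaming conventions, and the function, types, and field mapping are illustrative assumptions rather than the package's actual code.

```typescript
// Illustrative sketch: mapping Responses-style SSE events onto
// Chat Completions-compatible streaming chunks.
interface ResponsesEvent {
  type: string;
  delta?: string;
}

interface ChatChunk {
  object: "chat.completion.chunk";
  model: string;
  choices: { index: number; delta: { content?: string }; finish_reason: string | null }[];
}

function toChatCompletionChunk(event: ResponsesEvent, model: string): ChatChunk | null {
  if (event.type === "response.output_text.delta") {
    // Text deltas become incremental assistant-content chunks.
    return {
      object: "chat.completion.chunk",
      model,
      choices: [{ index: 0, delta: { content: event.delta ?? "" }, finish_reason: null }],
    };
  }
  if (event.type === "response.completed") {
    // Stream completion maps to a final chunk with finish_reason "stop".
    return {
      object: "chat.completion.chunk",
      model,
      choices: [{ index: 0, delta: {}, finish_reason: "stop" }],
    };
  }
  return null; // bookkeeping events have no Chat Completions equivalent
}

const chunk = toChatCompletionChunk(
  { type: "response.output_text.delta", delta: "Hel" },
  "gpt-codex",
);
```

A real adapter would also carry tool calls, usage, and error events; the sketch only shows the text-delta and completion paths that make `/openai/v1/responses` clients and Chat Completions consumers interoperable.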
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@khanglvm/llm-router",
-   "version": "1.1.1",
+   "version": "1.3.0",
    "description": "Single gateway endpoint for multi-provider LLMs with unified OpenAI+Anthropic format and seamless fallback",
    "keywords": [
      "llm-router",