@khanglvm/ai-router 1.0.0

package/.env.test-suite.example ADDED
@@ -0,0 +1,15 @@
+ AI_ROUTER_TEST_HOST=127.0.0.1
+ AI_ROUTER_TEST_PORT=8787
+ AI_ROUTER_TEST_TIMEOUT_MS=90000
+
+ AI_ROUTER_TEST_PROVIDER_KEYS=EXAMPLE
+
+ AI_ROUTER_TEST_EXAMPLE_PROVIDER_ID=example
+ AI_ROUTER_TEST_EXAMPLE_NAME="Example Provider"
+ AI_ROUTER_TEST_EXAMPLE_API_KEY=sk-...
+ AI_ROUTER_TEST_EXAMPLE_OPENAI_BASE_URL=https://api.example.com/v1
+ AI_ROUTER_TEST_EXAMPLE_CLAUDE_BASE_URL=https://api.example.com
+ AI_ROUTER_TEST_EXAMPLE_MODELS=model-a,model-b
+ AI_ROUTER_TEST_EXAMPLE_HEADERS_JSON={"X-Env":"local"}
+ AI_ROUTER_TEST_EXAMPLE_OPENAI_HEADERS_JSON={"User-Agent":"Mozilla/5.0"}
+ AI_ROUTER_TEST_EXAMPLE_CLAUDE_HEADERS_JSON={"x-foo":"bar"}
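The `AI_ROUTER_TEST_<KEY>_*` names above form a per-provider template keyed by the entries in `AI_ROUTER_TEST_PROVIDER_KEYS`. A minimal sketch of how such variables could be resolved into provider specs (illustrative only; the suite's actual loader in `scripts/provider-smoke-suite.mjs` may differ):

```javascript
// Illustrative resolver for the AI_ROUTER_TEST_<KEY>_* naming scheme.
// Not the suite's real code; field names mirror the example file above.
function readProviderSpec(env, key) {
  const get = (suffix) => env[`AI_ROUTER_TEST_${key}_${suffix}`];
  return {
    id: get('PROVIDER_ID'),
    name: get('NAME'),
    apiKey: get('API_KEY'),
    openaiBaseUrl: get('OPENAI_BASE_URL'),
    claudeBaseUrl: get('CLAUDE_BASE_URL'),
    models: (get('MODELS') || '').split(',').filter(Boolean),
    headers: get('HEADERS_JSON') ? JSON.parse(get('HEADERS_JSON')) : {},
  };
}

// Example: values taken from the file above.
const env = {
  AI_ROUTER_TEST_PROVIDER_KEYS: 'EXAMPLE',
  AI_ROUTER_TEST_EXAMPLE_PROVIDER_ID: 'example',
  AI_ROUTER_TEST_EXAMPLE_MODELS: 'model-a,model-b',
  AI_ROUTER_TEST_EXAMPLE_HEADERS_JSON: '{"X-Env":"local"}',
};
const specs = env.AI_ROUTER_TEST_PROVIDER_KEYS
  .split(',')
  .map((key) => readProviderSpec(env, key.trim()));
```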
package/README.md ADDED
@@ -0,0 +1,268 @@
+ # ai-router-proxy
+
+ Generic API proxy for AI providers with:
+ - Local runtime (`~/.ai-router.json`)
+ - Cloudflare Worker runtime (`AI_ROUTER_CONFIG_JSON` secret)
+ - Snap-based CLI + TUI setup/deploy flows
+ - Unified + dual response endpoints:
+   - root/unified: `/`, `/v1`, `/v1/messages`, `/v1/chat/completions`
+   - `/anthropic` (Anthropic-compatible responses)
+   - `/openai` (OpenAI-compatible responses)
+ - OS startup install/uninstall/status (macOS/Linux)
+ - Auto-restart of the local server when the config file changes
+
+ ## Install / Run
+
+ ```bash
+ npm install
+ ```
+
+ Global CLI (after `npm i -g`):
+ - `ai-router` or `ai-router start` -> start local proxy
+ - `ai-router setup` -> provider/model/master-key/startup manager (TUI by default)
+ - `ai-router deploy` -> Cloudflare deploy helper (TUI by default)
+
+ ### Interactive setup (TUI)
+
+ ```bash
+ npx ai-router-proxy setup
+ # or globally:
+ ai-router setup
+ ```
+
+ This command:
+ - asks for provider name/base URL/API key
+ - requires a provider id (slug/camelCase, e.g. `openrouter` or `myProvider`)
+ - probes the provider with live requests
+ - auto-detects supported format(s): `openai` and/or `claude`
+ - tries to discover model list
+ - saves config to `~/.ai-router.json`
+
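The README gives `openrouter` and `myProvider` as valid ids but does not document the exact rules; a plausible validator matching the "slug/camelCase" description might look like this (hypothetical; the CLI's actual check may be stricter or looser):

```javascript
// Hypothetical provider-id check: lowercase dash-separated slugs
// (e.g. "my-provider") or camelCase identifiers (e.g. "myProvider").
function isValidProviderId(id) {
  return /^[a-z][a-z0-9]*(?:-[a-z0-9]+)*$|^[a-z][a-zA-Z0-9]*$/.test(id);
}
```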
+ ### Non-interactive setup
+
+ ```bash
+ npx ai-router-proxy setup \
+   --operation=upsert-provider \
+   --provider-id=openrouter \
+   --name=OpenRouter \
+   --base-url=https://openrouter.ai/api/v1 \
+   --api-key=sk-... \
+   --models=gpt-4o,claude-3-5-sonnet-latest \
+   --headers='{"User-Agent":"Mozilla/5.0"}'
+ ```
+
+ ### Start local proxy (default command)
+
+ ```bash
+ ai-router
+ # same as:
+ ai-router start --port=8787
+ ```
+
+ `ai-router start`:
+ - starts the local proxy on localhost
+ - prints proxy URLs and a provider/model summary
+ - exits with guidance if config/providers are missing
+ - watches `~/.ai-router.json` and auto-restarts on changes (enabled by default)
+
+ ```bash
+ ai-router start --port=8787 --watch-config=true
+ ```
+
+ Secure local mode (require the master key on every endpoint):
+
+ ```bash
+ ai-router setup --operation=set-master-key --master-key=local_secret_key
+ ai-router start --require-auth=true
+ ```
+
+ Then call local endpoints with:
+ - `Authorization: Bearer local_secret_key`
+ - or `x-api-key: local_secret_key`
+
+ Local mode reads `~/.ai-router.json` on each request. Auth is optional by default and enforced when `--require-auth=true`.
+
+ Endpoint bases:
+ - Anthropic-compatible: `http://127.0.0.1:8787/anthropic`
+ - OpenAI-compatible: `http://127.0.0.1:8787/openai`
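Putting the pieces together, a request against the local proxy combines one of the endpoint bases above with the master key header. A sketch of the request shape only (the base URL, key, and model id are placeholders; no network call is made here):

```javascript
// Assemble an authenticated request for the local proxy.
// 'local_secret_key' matches the set-master-key example above.
const base = 'http://127.0.0.1:8787';
const masterKey = 'local_secret_key';

function chatRequest(path, body) {
  return {
    url: `${base}${path}`,
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Either header works when --require-auth=true:
      Authorization: `Bearer ${masterKey}`,
      // 'x-api-key': masterKey,
    },
    body: JSON.stringify(body),
  };
}

const req = chatRequest('/openai/v1/chat/completions', {
  model: 'openrouter/gpt-4o', // qualified ${provider}/${model} id
  messages: [{ role: 'user', content: 'ping' }],
});
```

Pass `req` to `fetch(req.url, req)` (Node 18+) once the proxy is running.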
+
+ ### Setup manager operations (CLI mode)
+
+ ```bash
+ ai-router setup --operation=list
+ ai-router setup --operation=remove-provider --provider-id=openrouter
+ ai-router setup --operation=remove-model --provider-id=openrouter --model=gpt-4o
+ ai-router setup --operation=set-master-key --master-key=your_master_key
+ ```
+
+ ### OS startup (start with OS)
+
+ ```bash
+ ai-router setup --operation=startup-install
+ ai-router setup --operation=startup-install --require-auth=true
+ ai-router setup --operation=startup-status
+ ai-router setup --operation=startup-uninstall
+ ```
+
+ On macOS this installs a LaunchAgent; on Linux, a `systemd --user` service. The startup service runs `ai-router start` with config watching enabled.
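For orientation, a `systemd --user` unit of the kind described might look roughly like the sketch below. This is hypothetical: the unit `startup-install` actually writes is not shown in this README, and its name, paths, and options will differ.

```ini
# Hypothetical sketch only; not the unit ai-router generates.
[Unit]
Description=ai-router local proxy

[Service]
ExecStart=/usr/bin/env ai-router start --watch-config=true
Restart=on-failure

[Install]
WantedBy=default.target
```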
+
+ ## Provider Smoke Test Suite (Real-Usage Simulation)
+
+ This repo includes a CLI smoke test that simulates real local usage:
+ - backs up `~/.ai-router.json` (if it exists)
+ - removes the active file to start clean
+ - runs `ai-router setup --operation=upsert-provider` for the provided providers/models
+ - starts `ai-router`
+ - sends live API requests for every configured provider/model
+ - prints a terminal report
+ - removes test config and restores the original `~/.ai-router.json` backup
+
+ ### Local secrets file (git-ignored)
+
+ Use a local env file (already git-ignored by `.gitignore`) to store provider credentials and model lists:
+
+ ```bash
+ cp .env.test-suite.example .env.test-suite
+ ```
+
+ Fill in your provider details in `.env.test-suite`.
+
+ Optional header fields per provider key:
+ - `AI_ROUTER_TEST_<KEY>_HEADERS_JSON`
+ - `AI_ROUTER_TEST_<KEY>_OPENAI_HEADERS_JSON`
+ - `AI_ROUTER_TEST_<KEY>_CLAUDE_HEADERS_JSON`
+
+ These must be JSON objects and are passed into `ai-router setup --headers=...`.
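"Must be JSON objects" implies the value parses to a plain object, not an array or scalar. An illustrative check of that constraint (not the suite's actual validation code):

```javascript
// Accept only values that JSON.parse to a plain object,
// e.g. '{"x-foo":"bar"}'; reject arrays, strings, numbers, null.
function parseHeadersJson(value) {
  const parsed = JSON.parse(value);
  if (typeof parsed !== 'object' || parsed === null || Array.isArray(parsed)) {
    throw new Error('HEADERS_JSON must be a JSON object');
  }
  return parsed;
}
```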
+
+ ### Run the smoke suite
+
+ ```bash
+ npm run test:provider-smoke -- --env-file=.env.test-suite --provider-keys=RAMCLOUDS,ZAI
+ ```
+
+ You can also pass provider specs directly via CLI JSON:
+
+ ```bash
+ node scripts/provider-smoke-suite.mjs --providers-json='[{"id":"myprovider","name":"My Provider","apiKey":"sk-...","openaiBaseUrl":"https://api.example.com/v1","models":["model-a"]}]'
+ ```
+
+ Notes:
+ - If a provider exposes separate OpenAI and Anthropic base URLs, the suite creates endpoint-specific setup entries (e.g. `provider-openai`, `provider-claude`) so both paths are tested.
+ - `--skip-probe=true` is the default for repeatable test runs. Set `--skip-probe=false` to include live setup probing.
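The endpoint-specific expansion described in the first note can be sketched as follows. The `-openai`/`-claude` suffixes follow the README's example ids; the suite's real implementation may differ:

```javascript
// Sketch: expand one provider spec into endpoint-specific entries
// when it exposes both an OpenAI and an Anthropic base URL.
function expandEntries(spec) {
  if (spec.openaiBaseUrl && spec.claudeBaseUrl) {
    return [
      { id: `${spec.id}-openai`, baseUrl: spec.openaiBaseUrl, format: 'openai' },
      { id: `${spec.id}-claude`, baseUrl: spec.claudeBaseUrl, format: 'claude' },
    ];
  }
  // Single-format providers keep their original id.
  return [{ id: spec.id, baseUrl: spec.openaiBaseUrl || spec.claudeBaseUrl }];
}
```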
+
+ ## Config File
+
+ Default location: `~/.ai-router.json`
+
+ ```json
+ {
+   "version": 1,
+   "masterKey": "your-worker-master-key",
+   "defaultModel": "openrouter/gpt-4o",
+   "providers": [
+     {
+       "id": "openrouter",
+       "name": "OpenRouter",
+       "baseUrl": "https://openrouter.ai/api/v1",
+       "apiKey": "sk-...",
+       "format": "openai",
+       "formats": ["openai"],
+       "auth": { "type": "bearer" },
+       "models": [{ "id": "gpt-4o" }, { "id": "claude-3-5-sonnet-latest" }]
+     }
+   ]
+ }
+ ```
+
+ Model names used by clients must follow:
+
+ ```txt
+ ${provider}/${model-name}
+ ```
+
+ Examples:
+ - `openrouter/gpt-4o`
+ - `myProvider/claude-3-5-sonnet-latest`
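A client-side helper for the qualified-id convention, assuming the router splits on the first `/` so that model names which themselves contain slashes (common on aggregators) stay intact. That split rule is an assumption, not confirmed by this README:

```javascript
// Parse "provider/model" into its parts, splitting on the FIRST "/"
// so model ids like "org/model-x" survive. Assumed behavior.
function parseModelRef(ref) {
  const i = ref.indexOf('/');
  if (i < 0) throw new Error(`expected "provider/model", got "${ref}"`);
  return { provider: ref.slice(0, i), model: ref.slice(i + 1) };
}
```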
+
+ ## Worker Deploy
+
+ 1) Export worker payload (requires `masterKey` and at least one provider):
+
+ ```bash
+ ai-router deploy --export-only=true --out=.ai-router.worker.json
+ ```
+
+ 2) Upload payload as Worker secret:
+
+ ```bash
+ wrangler secret put AI_ROUTER_CONFIG_JSON < .ai-router.worker.json
+ ```
+
+ 3) Deploy:
+
+ ```bash
+ npm run deploy
+ ```
+
+ Or in one command (requires Wrangler to be installed and logged in):
+
+ ```bash
+ ai-router deploy
+ ai-router deploy --env=production
+ ```
+
+ Quick master-key rotation on the Worker (runtime override secret, no full config re-export required):
+
+ ```bash
+ ai-router worker-key --master-key=new_master_key
+ ai-router worker-key --env=production --master-key=new_master_key
+ # or reuse the local config's masterKey:
+ ai-router worker-key --use-config-key=true
+ ```
+
+ The Worker reads config from:
+ - `AI_ROUTER_CONFIG_JSON` (primary)
+ - `PROXY_CONFIG_JSON` or `AI_ROUTER_JSON` (aliases)
+ - `AI_ROUTER_MASTER_KEY` (optional runtime override)
+
+ ## Endpoints
+
+ - `GET /health`
+ - `POST /` (unified, auto-detect request format)
+ - `POST /v1` (unified, auto-detect request format)
+ - `GET /anthropic/v1/models`
+ - `GET /openai/v1/models`
+ - `POST /anthropic/v1/messages` (Anthropic-compatible)
+ - `POST /openai/v1/chat/completions` (OpenAI-compatible)
+
+ Backward-compatible routes also work:
+ - `GET /v1/models`
+ - `POST /v1/messages`
+ - `POST /v1/chat/completions`
+
+ Worker auth note:
+ - In Cloudflare Worker mode, all endpoints require the master key (including `GET /health`, `GET /`, and `GET /.../models`).
+ - Send it as `Authorization: Bearer <masterKey>` (or `x-api-key: <masterKey>`).
+
+ ## Claude Code Example
+
+ Point Claude Code at the Anthropic endpoint and use a qualified model id:
+
+ ```json
+ {
+   "env": {
+     "ANTHROPIC_AUTH_TOKEN": "your_master_key",
+     "ANTHROPIC_BASE_URL": "https://your-worker.example.workers.dev/anthropic",
+     "ANTHROPIC_DEFAULT_OPUS_MODEL": "provider-name/model-name"
+   }
+ }
+ ```
+
+ ## Notes
+
+ - Request translation currently supports:
+   - Claude request -> OpenAI provider
+   - OpenAI request -> Claude provider
+   - OpenAI stream/non-stream response -> Claude response
+   - Claude stream/non-stream response -> OpenAI response
+ - If a provider/model supports the endpoint format directly, the proxy bypasses the translator.
package/package.json ADDED
@@ -0,0 +1,26 @@
+ {
+   "name": "@khanglvm/ai-router",
+   "version": "1.0.0",
+   "description": "Generic AI Router Proxy (local + Cloudflare Worker)",
+   "type": "module",
+   "main": "src/index.js",
+   "bin": {
+     "ai-router": "./src/cli-entry.js",
+     "ai-router-proxy": "./src/cli-entry.js"
+   },
+   "scripts": {
+     "dev": "wrangler dev",
+     "deploy": "wrangler deploy",
+     "tail": "wrangler tail",
+     "start": "node ./src/cli-entry.js start",
+     "setup": "node ./src/cli-entry.js setup",
+     "deploy:worker": "node ./src/cli-entry.js deploy",
+     "test:provider-smoke": "node ./scripts/provider-smoke-suite.mjs"
+   },
+   "dependencies": {
+     "@levu/snap": "^0.1.1"
+   },
+   "devDependencies": {
+     "wrangler": "^3.0.0"
+   }
+ }