@khanglvm/llm-router 1.0.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/.env.test-suite.example +15 -0
- package/README.md +132 -0
- package/package.json +26 -0
- package/scripts/provider-smoke-suite.mjs +612 -0
- package/src/cli/router-module.js +2412 -0
- package/src/cli-entry.js +144 -0
- package/src/index.js +18 -0
- package/src/node/config-store.js +74 -0
- package/src/node/config-workflows.js +245 -0
- package/src/node/instance-state.js +206 -0
- package/src/node/local-server.js +97 -0
- package/src/node/provider-probe.js +905 -0
- package/src/node/start-command.js +520 -0
- package/src/node/startup-manager.js +369 -0
- package/src/runtime/config.js +612 -0
- package/src/runtime/handler.js +964 -0
- package/src/translator/formats.js +7 -0
- package/src/translator/index.js +73 -0
- package/src/translator/request/claude-to-openai.js +228 -0
- package/src/translator/request/openai-to-claude.js +241 -0
- package/src/translator/response/claude-to-openai.js +204 -0
- package/src/translator/response/openai-to-claude.js +197 -0
- package/wrangler.toml +18 -0
package/.env.test-suite.example
ADDED

@@ -0,0 +1,15 @@
+LLM_ROUTER_TEST_HOST=127.0.0.1
+LLM_ROUTER_TEST_PORT=8787
+LLM_ROUTER_TEST_TIMEOUT_MS=90000
+
+LLM_ROUTER_TEST_PROVIDER_KEYS=EXAMPLE
+
+LLM_ROUTER_TEST_EXAMPLE_PROVIDER_ID=example
+LLM_ROUTER_TEST_EXAMPLE_NAME="Example Provider"
+LLM_ROUTER_TEST_EXAMPLE_API_KEY=sk-...
+LLM_ROUTER_TEST_EXAMPLE_OPENAI_BASE_URL=https://api.example.com/v1
+LLM_ROUTER_TEST_EXAMPLE_CLAUDE_BASE_URL=https://api.example.com
+LLM_ROUTER_TEST_EXAMPLE_MODELS=model-a,model-b
+LLM_ROUTER_TEST_EXAMPLE_HEADERS_JSON={"X-Env":"local"}
+LLM_ROUTER_TEST_EXAMPLE_OPENAI_HEADERS_JSON={"User-Agent":"Mozilla/5.0"}
+LLM_ROUTER_TEST_EXAMPLE_CLAUDE_HEADERS_JSON={"x-foo":"bar"}
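The template above lists one provider key (`EXAMPLE`) and a family of `LLM_ROUTER_TEST_EXAMPLE_*` variables. A minimal sketch of how such a keyed env family could be expanded into per-provider settings — the helper name `readTestProviders` is hypothetical, not the smoke suite's actual API:

```javascript
// Hypothetical helper: expand LLM_ROUTER_TEST_PROVIDER_KEYS into provider
// descriptors. Each key K maps to LLM_ROUTER_TEST_<K>_* variables, mirroring
// the naming scheme in .env.test-suite.example.
function readTestProviders(env) {
  const keys = (env.LLM_ROUTER_TEST_PROVIDER_KEYS || '')
    .split(',')
    .map((k) => k.trim())
    .filter(Boolean);
  return keys.map((key) => {
    const pick = (suffix) => env[`LLM_ROUTER_TEST_${key}_${suffix}`];
    return {
      id: pick('PROVIDER_ID'),
      name: pick('NAME'),
      apiKey: pick('API_KEY'),
      openaiBaseUrl: pick('OPENAI_BASE_URL'),
      claudeBaseUrl: pick('CLAUDE_BASE_URL'),
      models: (pick('MODELS') || '').split(',').filter(Boolean),
      headers: JSON.parse(pick('HEADERS_JSON') || '{}'),
    };
  });
}

const providers = readTestProviders({
  LLM_ROUTER_TEST_PROVIDER_KEYS: 'EXAMPLE',
  LLM_ROUTER_TEST_EXAMPLE_PROVIDER_ID: 'example',
  LLM_ROUTER_TEST_EXAMPLE_MODELS: 'model-a,model-b',
  LLM_ROUTER_TEST_EXAMPLE_HEADERS_JSON: '{"X-Env":"local"}',
});
console.log(providers[0].id, providers[0].models.length); // example 2
```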
package/README.md
ADDED
@@ -0,0 +1,132 @@
+# llm-router
+
+`llm-router` routes OpenAI-format and Anthropic-format requests across your configured providers.
+
+It supports:
+- local route server (`~/.llm-router.json`)
+- Cloudflare Worker route runtime (`LLM_ROUTER_CONFIG_JSON` secret)
+- CLI + TUI management (`config`, `start`, `deploy`, `worker-key`)
+
+## Install
+
+```bash
+npm i -g @khanglvm/llm-router
+```
+
+## Quick Start
+
+```bash
+# 1) Open config TUI (default behavior)
+llm-router
+
+# 2) Start local route server
+llm-router start
+```
+
+Local endpoints:
+- Anthropic: `http://127.0.0.1:8787/anthropic`
+- OpenAI: `http://127.0.0.1:8787/openai`
+- Unified: `http://127.0.0.1:8787/route` (or `/` and `/v1`)
+
+## Main Commands
+
+```bash
+llm-router config
+llm-router start
+llm-router stop
+llm-router reload
+llm-router update
+llm-router deploy
+llm-router worker-key
+```
+
+## Non-Interactive Config (Agent/CI Friendly)
+
+```bash
+llm-router config \
+  --operation=upsert-provider \
+  --provider-id=openrouter \
+  --name="OpenRouter" \
+  --base-url=https://openrouter.ai/api/v1 \
+  --api-key=sk-or-v1-... \
+  --models=gpt-4o,claude-3-7-sonnet \
+  --format=openai \
+  --skip-probe=true
+```
+
+Set the local auth key:
+
+```bash
+llm-router config --operation=set-master-key --master-key=your_local_key
+```
+
+Start with auth required:
+
+```bash
+llm-router start --require-auth=true
+```
+
+## Cloudflare Worker Deploy
+
+Worker project name in `wrangler.toml`: `llm-router-route`.
+
+### Option A: Guided deploy
+
+```bash
+llm-router deploy
+```
+
+### Option B: Explicit steps
+
+```bash
+llm-router deploy --export-only=true --out=.llm-router.worker.json
+wrangler secret put LLM_ROUTER_CONFIG_JSON < .llm-router.worker.json
+wrangler deploy
+```
+
+Rotate the worker auth key quickly:
+
+```bash
+llm-router worker-key --master-key=new_key
+```
+
+## Runtime Secrets / Env
+
+Primary:
+- `LLM_ROUTER_CONFIG_JSON`
+- `LLM_ROUTER_MASTER_KEY` (optional override)
+
+Also supported:
+- `ROUTE_CONFIG_JSON`
+- `LLM_ROUTER_JSON`
+
+## Default Config Path
+
+`~/.llm-router.json`
+
+Minimal shape:
+
+```json
+{
+  "masterKey": "local_or_worker_key",
+  "defaultModel": "openrouter/gpt-4o",
+  "providers": [
+    {
+      "id": "openrouter",
+      "name": "OpenRouter",
+      "baseUrl": "https://openrouter.ai/api/v1",
+      "apiKey": "sk-or-v1-...",
+      "formats": ["openai"],
+      "models": [{ "id": "gpt-4o" }]
+    }
+  ]
+}
```
+
+## Smoke Test
+
+```bash
+npm run test:provider-smoke
+```
+
+Use `.env.test-suite.example` as a template for provider-based smoke tests.
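The README's `defaultModel` value `openrouter/gpt-4o` suggests a `<providerId>/<modelId>` naming scheme for routed models. A sketch of that parsing — the function name and the first-separator-only behavior are assumptions about the scheme, not the package's actual API:

```javascript
// Hypothetical: split a routed model id of the form "<providerId>/<modelId>".
// Model ids may themselves contain "/" (e.g. OpenRouter-style ids), so only
// the first separator is treated as the provider boundary.
function splitModelId(routedModel) {
  const i = routedModel.indexOf('/');
  if (i === -1) return { providerId: null, modelId: routedModel };
  return {
    providerId: routedModel.slice(0, i),
    modelId: routedModel.slice(i + 1),
  };
}

console.log(splitModelId('openrouter/gpt-4o'));
// { providerId: 'openrouter', modelId: 'gpt-4o' }
console.log(splitModelId('openrouter/anthropic/claude-3-7-sonnet').modelId);
// 'anthropic/claude-3-7-sonnet'
```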
package/package.json
ADDED
@@ -0,0 +1,26 @@
+{
+  "name": "@khanglvm/llm-router",
+  "version": "1.0.0",
+  "description": "LLM Router route (local + Cloudflare Worker)",
+  "type": "module",
+  "main": "src/index.js",
+  "bin": {
+    "llm-router": "./src/cli-entry.js",
+    "llm-router-route": "./src/cli-entry.js"
+  },
+  "scripts": {
+    "dev": "wrangler dev",
+    "deploy": "wrangler deploy",
+    "tail": "wrangler tail",
+    "start": "node ./src/cli-entry.js start",
+    "config": "node ./src/cli-entry.js config",
+    "deploy:worker": "node ./src/cli-entry.js deploy",
+    "test:provider-smoke": "node ./scripts/provider-smoke-suite.mjs"
+  },
+  "dependencies": {
+    "@levu/snap": "^0.3.0"
+  },
+  "devDependencies": {
+    "wrangler": "^3.0.0"
+  }
+}
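The README's "Minimal shape" section shows the fields `~/.llm-router.json` needs. A small validator for just that shape — a sketch; the package's actual validation rules in `src/runtime/config.js` are not shown in this diff, and the function name is hypothetical:

```javascript
// Sketch: check the minimal ~/.llm-router.json shape from the README.
// Only validates the fields the README's example shows; returns a list of
// human-readable problems (empty means the minimal shape is satisfied).
function validateRouterConfig(config) {
  const errors = [];
  if (!Array.isArray(config.providers) || config.providers.length === 0) {
    errors.push('providers must be a non-empty array');
    return errors;
  }
  for (const p of config.providers) {
    if (!p.id) errors.push('provider missing id');
    if (!p.baseUrl) errors.push(`provider ${p.id ?? '?'} missing baseUrl`);
    if (!Array.isArray(p.formats) || p.formats.length === 0) {
      errors.push(`provider ${p.id ?? '?'} missing formats`);
    }
    if (!Array.isArray(p.models)) {
      errors.push(`provider ${p.id ?? '?'} missing models`);
    }
  }
  return errors;
}

const problems = validateRouterConfig({
  masterKey: 'local_or_worker_key',
  defaultModel: 'openrouter/gpt-4o',
  providers: [
    {
      id: 'openrouter',
      name: 'OpenRouter',
      baseUrl: 'https://openrouter.ai/api/v1',
      apiKey: 'sk-or-v1-...',
      formats: ['openai'],
      models: [{ id: 'gpt-4o' }],
    },
  ],
});
console.log(problems.length); // 0
```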