@khanglvm/llm-router 1.0.1 → 1.0.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

Potentially problematic release: this version of @khanglvm/llm-router might be problematic.
Files changed (2)
  1. package/README.md +11 -1
  2. package/package.json +2 -2
package/README.md CHANGED
@@ -1,6 +1,6 @@
 # llm-router
 
-`llm-router` routes OpenAI-format and Anthropic-format requests across your configured providers.
+`llm-router` is a gateway to access multiple models across multiple providers, with a unified format (OpenAI + Anthropic) and seamless model fallback.
 
 It supports:
 - local route server (`~/.llm-router.json`)
@@ -28,6 +28,16 @@ Local endpoints:
 - OpenAI: `http://127.0.0.1:8787/openai`
 - Unified: `http://127.0.0.1:8787/route` (or `/` and `/v1`)
 
+## Smart Fallback Behavior
+
+`llm-router` can fail over from a primary model to configured fallback models with status-aware logic:
+- `429` (rate-limited): immediate fallback (no origin retry), with `Retry-After` respected when present.
+- Temporary failures (`408`, `409`, `5xx`, network errors): origin-only bounded retries with jittered backoff, then fallback.
+- Billing/quota exhaustion (`402`, or provider-specific billing signals): immediate fallback with longer origin cooldown memory.
+- Auth and permission failures (`401` and relevant `403` cases): no retry; fallback to other providers/models when possible.
+- Policy/moderation blocks: no retry; cross-provider fallback is disabled by default (`LLM_ROUTER_ALLOW_POLICY_FALLBACK=false`).
+- Invalid client requests (`400`, `413`, `422`): no retry and no fallback short-circuit.
+
 ## Main Commands
 
 ```bash
package/package.json CHANGED
@@ -1,7 +1,7 @@
 {
   "name": "@khanglvm/llm-router",
-  "version": "1.0.1",
-  "description": "LLM Router route (local + Cloudflare Worker)",
+  "version": "1.0.3",
+  "description": "Single gateway endpoint for multi-provider LLMs with unified OpenAI+Anthropic format and seamless fallback",
   "type": "module",
   "main": "src/index.js",
   "bin": {
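The status-aware fallback rules added to the README could be sketched as a status classifier plus a jittered-backoff helper. This is a minimal illustration only: `classifyFailure` and `backoffDelay` are hypothetical names, not exports of `@khanglvm/llm-router`, and the real routing logic may differ.

```javascript
// Hypothetical sketch of the status-aware fallback rules described in the
// README diff above. None of these names come from the actual package.

// Map an upstream HTTP status to a routing decision.
function classifyFailure(status) {
  if (status === 429) {
    return { retryOrigin: false, fallback: true };       // rate limit: immediate fallback, honor Retry-After
  }
  if (status === 402) {
    return { retryOrigin: false, fallback: true, cooldownOrigin: true }; // billing/quota: long origin cooldown
  }
  if (status === 401 || status === 403) {
    return { retryOrigin: false, fallback: true };       // auth/permission: no retry, try other providers
  }
  if (status === 400 || status === 413 || status === 422) {
    return { retryOrigin: false, fallback: false };      // invalid client request: short-circuit entirely
  }
  if (status === 408 || status === 409 || status >= 500) {
    return { retryOrigin: true, fallback: true };        // temporary: bounded origin retries, then fallback
  }
  return { retryOrigin: false, fallback: false };
}

// Full-jitter backoff: random delay in [0, min(cap, base * 2^attempt)].
function backoffDelay(attempt, baseMs = 250, capMs = 8000) {
  return Math.random() * Math.min(capMs, baseMs * 2 ** attempt);
}
```

Policy/moderation blocks are keyed off provider-specific error bodies rather than a single status code, so they are omitted from this status-only classifier.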