aistatus 0.0.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
MIT License

Copyright (c) 2026 LeTau Robotics

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
package/README.md ADDED
# aistatus

Status-aware LLM routing for more reliable agents and coding CLIs.

`aistatus` is a TypeScript SDK that checks provider and model availability
through `aistatus.cc`, picks a healthy route, and then calls the provider
directly with `fetch`. Prompts and API keys stay in your own process.
`aistatus` only helps with status checks, routing, and fallback selection.

This package is useful when you are building:

- multi-step agents that can fail if one model call breaks mid-run
- coding CLIs that need stable model access during edit, retry, and repair loops
- internal tools that want graceful failover across multiple providers

## Why This Package Exists

Agent workflows are brittle when they assume one provider is always healthy.
That brittleness gets worse in long-running pipelines: a research agent, coding
assistant, or automation bot might make 10 to 50 model calls in one task. If a
single provider is degraded or temporarily unavailable, the whole run can fail.

`aistatus` adds a small routing layer in front of those calls:

- do a pre-flight health check before dispatching a request
- select a compatible fallback when the primary route is unavailable
- keep one TypeScript API even when you use multiple providers
- return routing metadata so your app can observe fallback behavior

In practice, that means better stability for agent systems and coding CLI
tools: the workflow can keep moving instead of failing hard on one provider
incident.

## How It Works

1. `aistatus` auto-discovers providers from environment variables, or you can
   register providers manually.
2. Before sending a request, it queries `aistatus.cc` for provider or model
   status and compatible alternatives.
3. If the primary route is healthy, it uses it.
4. If the primary route is unavailable, or a provider call fails,
   `aistatus` can automatically try the next available provider.
5. The actual LLM request is executed directly from your runtime, not proxied
   through `aistatus`.
6. You get back a unified `RouteResponse` with the chosen model, provider, and
   fallback metadata.

If the status API is unreachable, the router falls back to model-prefix
guessing and only uses adapters that are available locally.
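
Conceptually, steps 2 through 4 reduce to picking the first healthy candidate from an ordered list. Here is a minimal sketch of that selection logic — illustrative only; `Candidate`, `StatusLookup`, and `selectRoute` are names invented for this sketch, not part of the SDK's public API:

```ts
// A candidate route: one model served by one provider.
interface Candidate {
  provider: string;
  model: string;
}

// Health lookup, e.g. populated from a status API response.
type StatusLookup = (provider: string) => "operational" | "degraded" | "down";

// Pick the first operational candidate and report whether we fell back.
function selectRoute(
  candidates: Candidate[],
  status: StatusLookup,
): { chosen: Candidate; wasFallback: boolean; fallbackReason: string | null } {
  for (let i = 0; i < candidates.length; i++) {
    const c = candidates[i];
    if (status(c.provider) === "operational") {
      return {
        chosen: c,
        wasFallback: i > 0,
        fallbackReason: i > 0 ? `${candidates[0].provider} unavailable` : null,
      };
    }
  }
  // Mirrors the SDK's behavior of surfacing an error when nothing is healthy.
  throw new Error("all providers down");
}
```

The real router additionally filters candidates by local configuration (an API key must be present) and by model compatibility, but the ordered-first-healthy shape is the core idea.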

## What You Get

- Real-time pre-flight checks for providers and models
- Automatic fallback across compatible providers
- Tier-based routing for fast / standard / premium model groups
- One async API across multiple model vendors
- Direct provider HTTP calls with local API keys
- Auto-discovery from standard environment variables
- Manual registration for custom or self-hosted OpenAI-compatible endpoints
- Unified response metadata for logging and reliability analysis

## Supported Providers

Current built-in adapters cover:

- Anthropic
- OpenAI
- Google Gemini
- OpenRouter
- DeepSeek
- Mistral
- xAI
- Groq
- Together
- Moonshot
- Qwen / DashScope

OpenAI-compatible providers reuse an OpenAI-style HTTP adapter under the hood.

## Install

```bash
npm install aistatus
```

Notes:

- Node.js `>=18` is required because the SDK uses the built-in `fetch` API.
- No separate provider SDK packages are required for model calls.
- The package name `aistatus` was available on npm when this package scaffold
  was generated on March 16, 2026.

## Quickstart

Set at least one provider API key, then route by model name:

```ts
import { route } from "aistatus";

const resp = await route(
  "Summarize the latest deployment status.",
  {
    model: "claude-sonnet-4-6",
  },
);

console.log(resp.content);
console.log(resp.modelUsed);
console.log(resp.providerUsed);
console.log(resp.wasFallback);
console.log(resp.fallbackReason);
```

If the primary provider is unavailable, `aistatus` will try compatible
providers that are both healthy and configured in your environment.

## Why This Helps Agents And Coding CLIs

For simple scripts, a retry loop may be enough. For agents and coding tools, it
usually is not.

- An agent often chains planning, retrieval, synthesis, and repair into one run.
- A coding CLI may need several model calls for diagnosis, patch generation,
  test-fix loops, and final explanation.
- When those systems depend on one provider, a brief outage can break the whole
  interaction.

`aistatus` improves stability by checking route health before the call and
falling back automatically when the preferred route is not available. That gives
you a more resilient default for production agents, internal coding tools, and
developer-facing CLIs.
132
+
133
+ ## Tier Routing
134
+
135
+ Tier routing is explicit and predictable: you define ordered model groups and
136
+ let the router try them in sequence.
137
+
138
+ ```ts
139
+ import { Router } from "aistatus";
140
+
141
+ const router = new Router({ checkTimeout: 2 });
142
+
143
+ router.addTier("fast", [
144
+ "claude-haiku-4-5",
145
+ "gpt-4o-mini",
146
+ "gemini-2.0-flash",
147
+ ]);
148
+ router.addTier("standard", [
149
+ "claude-sonnet-4-6",
150
+ "gpt-4o",
151
+ "gemini-2.5-pro",
152
+ ]);
153
+
154
+ const resp = await router.route(
155
+ "Explain quantum computing in one sentence.",
156
+ {
157
+ tier: "fast",
158
+ },
159
+ );
160
+ ```
161
+
162
+ This is a good fit when you want stable behavioral buckets such as `fast`,
163
+ `standard`, or `premium`, without hard-coding one vendor per workflow step.
164
+
165
+ ## Agent Pipeline Example
166
+
167
+ `aistatus` is especially useful for multi-step agents. A simple pattern is:
168
+
169
+ ```ts
170
+ import { route } from "aistatus";
171
+
172
+ const plan = await route(
173
+ "How is embodied AI changing manufacturing?",
174
+ {
175
+ model: "claude-haiku-4-5",
176
+ system: "Break the topic into 3 research sub-questions. Be concise.",
177
+ },
178
+ );
179
+
180
+ const answer = await route(
181
+ plan.content,
182
+ {
183
+ model: "claude-sonnet-4-6",
184
+ prefer: ["anthropic", "google"],
185
+ },
186
+ );
187
+ ```
188
+
189
+ See [`examples/agent_pipeline.ts`](examples/agent_pipeline.ts) for a full
190
+ multi-step example that uses different model tiers for planning, research, and
191
+ synthesis.
192
+
193
+ ## Manual Provider Registration
194
+
195
+ You can register custom providers directly when auto-discovery is not enough.
196
+ This is useful for self-hosted gateways or OpenAI-compatible endpoints.
197
+
198
+ ```ts
199
+ import { Router } from "aistatus";
200
+
201
+ const router = new Router({ autoDiscover: false });
202
+
203
+ router.registerProvider({
204
+ slug: "local-vllm",
205
+ adapterType: "openai",
206
+ apiKey: "dummy",
207
+ baseUrl: "http://localhost:8000/v1",
208
+ });
209
+
210
+ const resp = await router.route("Hello", {
211
+ model: "gpt-4o-mini",
212
+ });
213
+ ```
214
+
215
+ If you need a custom provider key to match `aistatus.cc` routing, add aliases:
216
+
217
+ ```ts
218
+ router.registerProvider({
219
+ slug: "my-openai",
220
+ aliases: ["openai"],
221
+ adapterType: "openai",
222
+ apiKey: process.env.OPENAI_API_KEY,
223
+ baseUrl: "https://api.openai.com/v1",
224
+ });
225
+ ```

## Async

The SDK is async-first. `route()` already returns a `Promise<RouteResponse>`.
`aroute()` is provided as an alias if you want naming symmetry with the Python
SDK.

```ts
import { aroute } from "aistatus";

const resp = await aroute(
  [{ role: "user", content: "Hello" }],
  {
    model: "gpt-4o-mini",
  },
);
```
243
+
244
+ ## Status API
245
+
246
+ You can also query `aistatus.cc` directly without sending any model request:
247
+
248
+ ```ts
249
+ import { StatusAPI } from "aistatus";
250
+
251
+ const api = new StatusAPI();
252
+
253
+ const check = await api.checkProvider("anthropic");
254
+ console.log(check.status);
255
+ console.log(check.isAvailable);
256
+
257
+ for (const provider of await api.providers()) {
258
+ console.log(provider.name, provider.status);
259
+ }
260
+
261
+ for (const model of await api.searchModels("sonnet")) {
262
+ console.log(model.id, model.promptPrice, model.completionPrice);
263
+ }
264
+ ```
265
+
266
+ This is useful for dashboards, health checks, pre-deployment validation, or
267
+ building your own routing policy on top of the status data.
268
+
269
+ ## Response Object
270
+
271
+ Every `route()` call returns a `RouteResponse`:
272
+
273
+ ```ts
274
+ class RouteResponse {
275
+ content: string;
276
+ modelUsed: string;
277
+ providerUsed: string;
278
+ wasFallback: boolean;
279
+ fallbackReason: string | null;
280
+ inputTokens: number;
281
+ outputTokens: number;
282
+ costUsd: number;
283
+ raw: unknown;
284
+ }
285
+ ```
286
+
287
+ The routing metadata makes it easy to log fallback events and understand how
288
+ stable your agent or CLI is in real traffic.
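
As one illustration, a small fallback logger can be built on those fields. This is a sketch, not part of the SDK; `RouteMeta` and `fallbackLogLine` are names invented here, typed against the `RouteResponse` fields shown above:

```ts
// Subset of RouteResponse fields relevant to reliability logging.
interface RouteMeta {
  modelUsed: string;
  providerUsed: string;
  wasFallback: boolean;
  fallbackReason: string | null;
}

// Produce a one-line log entry; returns null when no fallback happened,
// so callers can skip logging the common healthy-path case.
function fallbackLogLine(meta: RouteMeta): string | null {
  if (!meta.wasFallback) return null;
  const reason = meta.fallbackReason ?? "unknown reason";
  return `fallback -> ${meta.providerUsed}/${meta.modelUsed} (${reason})`;
}
```

Feeding these lines into your existing log pipeline gives you a cheap measure of how often each provider actually served traffic versus being routed around.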

## Errors

```ts
import {
  AllProvidersDown,
  ProviderCallFailed,
  ProviderNotConfigured,
  route,
} from "aistatus";

try {
  const resp = await route("Hello", {
    model: "claude-sonnet-4-6",
  });
} catch (error) {
  if (error instanceof AllProvidersDown) {
    console.log(error.tried);
  } else if (error instanceof ProviderNotConfigured) {
    console.log(`Missing API key for: ${error.provider}`);
  } else if (error instanceof ProviderCallFailed) {
    console.log(error.provider, error.model);
  }
}
```

Common failure modes:

- `AllProvidersDown`: no configured provider could successfully serve the call
- `ProviderNotConfigured`: the required API key or explicit provider config is missing
- `ProviderCallFailed`: the selected provider failed and fallback was disabled

## Environment Variables

The router auto-discovers providers from standard environment variables:

```bash
ANTHROPIC_API_KEY=...
OPENAI_API_KEY=...
GEMINI_API_KEY=...
OPENROUTER_API_KEY=...
DEEPSEEK_API_KEY=...
MISTRAL_API_KEY=...
XAI_API_KEY=...
GROQ_API_KEY=...
TOGETHER_API_KEY=...
MOONSHOT_API_KEY=...
DASHSCOPE_API_KEY=...
```
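
Auto-discovery can be pictured as a mapping from these variables to provider slugs. The sketch below is illustrative, not the SDK's actual implementation, and the slug names (e.g. `qwen` for `DASHSCOPE_API_KEY`, `google` for `GEMINI_API_KEY`) are assumptions:

```ts
// Assumed mapping from environment variable to provider slug.
const ENV_TO_PROVIDER: Record<string, string> = {
  ANTHROPIC_API_KEY: "anthropic",
  OPENAI_API_KEY: "openai",
  GEMINI_API_KEY: "google",
  OPENROUTER_API_KEY: "openrouter",
  DEEPSEEK_API_KEY: "deepseek",
  MISTRAL_API_KEY: "mistral",
  XAI_API_KEY: "xai",
  GROQ_API_KEY: "groq",
  TOGETHER_API_KEY: "together",
  MOONSHOT_API_KEY: "moonshot",
  DASHSCOPE_API_KEY: "qwen",
};

// Return the provider slugs whose key is set and non-empty in the given env,
// e.g. discoverProviders(process.env as Record<string, string | undefined>).
function discoverProviders(env: Record<string, string | undefined>): string[] {
  return Object.entries(ENV_TO_PROVIDER)
    .filter(([name]) => Boolean(env[name]))
    .map(([, slug]) => slug);
}
```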

## License

MIT. See [LICENSE](LICENSE).