@clinebot/llms 0.0.0 → 0.0.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,198 +1,71 @@
  # @clinebot/llms
 
- Package-level docs are centralized:
-
- - Overview: [`packages/README.md`](../README.md)
- - Architecture and interactions: [`ARCHITECTURE.md`](/Users/beatrix/dev/clinee/sdk-wip/ARCHITECTURE.md)
-
- `@clinebot/llms` remains the canonical source for model/provider cataloging and provider settings schema.
-
- ## Runtime entrypoints
-
- - Default package entrypoint: `@clinebot/llms`
- - Node/runtime explicit entrypoint: `@clinebot/llms/node`
-
- The default export map resolves to a browser-safe bundle under browser/react-server conditions, and to the Node runtime bundle under standard Node import conditions.
- The package `development` export conditions resolve to `src/*` for workspace dev loops; source entrypoints use extensionless relative imports so Turbopack can resolve them in monorepo source mode.
- The build emits both `dist/index.js` (Node/default) and `dist/index.browser.js` (browser/react-server) to match this export map.
-
- Vertex Claude routing in the Node runtime uses `@ai-sdk/google-vertex/anthropic`.
-
- ## Public API Boundaries
-
- `@clinebot/llms` now exposes a curated `providers` namespace via:
-
- - Node/default: `src/providers/public.ts`
- - Browser: `src/providers/public.browser.ts`
-
- Use `providers` for runtime handler creation + shared contracts:
-
- - handler creation: `createHandler`, `createHandlerAsync`
- - provider resolution: `resolveProviderConfig`, `OPENAI_COMPATIBLE_PROVIDERS`
- - shared contracts/schemas: `ProviderConfig`, `ProviderSettings`, `ProviderSettingsSchema`, `Message`, `ApiStreamChunk`
-
- Internal provider implementation modules (`handlers/*`, `transform/*`, `utils/*`) remain internal and are not part of the top-level package contract.
-
- ## Catalog and Provider Defaults
-
- OpenAI-compatible provider discovery is centralized in `src/providers/shared/openai-compatible.ts` and reused by:
-
- - `src/providers/handlers/providers.ts` (runtime defaults + live/private model merge)
- - `src/catalog.ts` (catalog view with known models)
- - `scripts/models/generate-models-dev.ts` and live-catalog loading via shared models.dev key maps from `@clinebot/shared`
-
- This keeps provider default derivation and protocol filtering in one place.
-
- Provider model-catalog key remapping is centralized in `@clinebot/shared` (`src/llms/model-id.ts`, `MODELS_DEV_PROVIDER_KEY_ENTRIES`) and reused by:
-
- - `src/providers/types/settings.ts` (`toProviderConfig`) for generated known-model hydration
- - `src/providers/handlers/providers.ts` (`mergeKnownModels`) for runtime generated/live/private model merging
-
- Generated models imported from models.dev now exclude entries marked with `status: "deprecated"` at normalization time, so `src/models/generated.ts` only contains non-deprecated tool-capable entries.
- `src/models/generated.ts` exports `GENERATED_PROVIDER_MODELS` as `{ version, providers }`, where `version` is an epoch-milliseconds number refreshed on every `bun run build:models` generation run; callers can read it via `getGeneratedModelsVersion()` from `src/models/generated-access.ts`.
-
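A minimal sketch of the `{ version, providers }` shape described above and of reading the version back. The model data and field names inside `providers` are illustrative assumptions; the real `src/models/generated.ts` is emitted by `bun run build:models`:

```typescript
// Illustrative sketch of the generated-catalog shape described above.
// The model entries here are made up; only the { version, providers }
// envelope mirrors the documented contract.
type GeneratedProviderModels = {
  version: number; // epoch milliseconds of the generation run
  providers: Record<string, Record<string, { contextWindow?: number }>>;
};

const GENERATED_PROVIDER_MODELS: GeneratedProviderModels = {
  version: 1735689600000,
  providers: {
    anthropic: { "claude-sonnet-4-20250514": { contextWindow: 200000 } },
  },
};

// Mirrors the role of getGeneratedModelsVersion() from
// src/models/generated-access.ts: expose the version without the payload.
function getGeneratedModelsVersion(): number {
  return GENERATED_PROVIDER_MODELS.version;
}

console.log(getGeneratedModelsVersion());
```

Callers can compare this version against a cached one to decide whether a live catalog refresh is needed.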
- ## Provider Runtime Notes
-
- - Provider IDs and alias normalization (for example, `openai` -> `openai-native`) are centralized in `src/providers/types/provider-ids.ts` and reused across provider auth, handler factory routing, and app call sites.
- - Built-in handler routing is table-driven in `src/providers/index.ts` (`BUILT_IN_HANDLER_FACTORIES`) and shared by both `createHandler` and `createHandlerAsync` (async only adds catalog-refresh resolution before falling back to the same sync routing path).
- - The model registry lazy-loader (`src/models/registry.ts`) includes `openai-native` (plus `openai` alias), `openrouter`, `zai`, `doubao`, `moonshot`, `qwen`, `qwen-code`, `sapaicore`, and `minimax` as built-in provider loaders.
- - Registry loader registration is generated at build time into `src/models/generated-provider-loaders.ts` from provider module metadata and consumed by `src/models/registry.ts`, eliminating drift-prone manual loader tables.
- - Built-in provider ID drift is guarded by `src/providers/types/provider-ids.test.ts`, which checks `BUILT_IN_PROVIDER_IDS` against `getProviderIds()` from the model registry.
- - Fetch-based providers can share `FetchBaseHandler` (`src/providers/handlers/fetch-base.ts`) for common JSON request plumbing, error handling, and retry behavior; `AskSageHandler` is implemented on top of this base.
- - Handler base classes (`FetchBaseHandler`, `AiSdkProviderHandler`) use proper abstract methods; retry behavior is composed explicitly in `createMessage` via `retryStream(...)` instead of method decorators.
- - `openai-codex`, `claude-code`, `opencode`, `mistral`, `dify`, `sapaicore`, and Vertex Claude routes share a common AI SDK runtime bridge (`handlers/ai-sdk-community.ts`) for message mapping and stream normalization.
- - `openai-codex`, `claude-code`, `opencode`, `mistral`, `dify`, and `sapaicore` are consolidated in `handlers/community-sdk.ts` and share a common SDK-backed handler base (`handlers/ai-sdk-provider-base.ts`) for provider loading, model resolution, and stream wiring.
- - Tests for Claude Code, OpenCode, Mistral, Dify, and SAP AI Core community handlers are consolidated in `handlers/community-sdk.test.ts`.
- - `providerId: "openai-codex"` uses `ai-sdk-provider-codex-cli` (Codex CLI), not OpenCode.
- - Codex CLI executes its own tools; AI SDK custom tool schemas are ignored for this provider path.
- - OAuth-backed `openai-codex` settings do not force `OPENAI_API_KEY`; only explicit OpenAI API keys (`sk-...`) map to Codex CLI env.
- - `openai-codex` retains a catalog `baseUrl` only for provider-registry inclusion; runtime requests are handled by `CodexHandler`.
- - In Bun + Zod v3 workspaces, `CodexHandler` applies a runtime compatibility shim for `ai-sdk-provider-codex-cli` schema loading.
- - `providerId: "opencode"` uses the OpenCode provider (`ai-sdk-provider-opencode-sdk`).
- - `opencode` ignores AI SDK custom tool schemas; tools are executed provider-side.
- - `providerId: "claude-code"` uses `ai-sdk-provider-claude-code`, defaults to `sonnet`, and supports `sonnet`, `opus`, `haiku`.
- - `providerId: "sapaicore"` uses `@jerome-benoit/sap-ai-provider`; auth is handled by SAP AI SDK environment credentials (`AICORE_SERVICE_KEY` or `VCAP_SERVICES`).
- - Set `AI_SDK_LOG_WARNINGS=false` to suppress AI SDK warning logs.
- - Provider settings `headers` must be a string-to-string map (`Record<string, string>`).
- - Provider settings OAuth auth schema includes `auth.expiresAt` (epoch ms) for runtime token refresh orchestration in `@clinebot/core`.
- - `toProviderConfig(...)` now backfills `knownModels` from generated model catalogs for non-OpenAI-compatible providers (and alias routes like `openai-native` -> `openai`, `claude-code` -> `anthropic`, `cline` -> `vercel-ai-gateway`) so pricing metadata is available for usage cost calculation.
- - Stream chunks are modeled as discriminated unions (`ApiStreamChunk`); tests should narrow by `type` instead of casting to generic records.
- - OpenAI-compatible tool schemas default to strict mode; `openrouter` requests disable tool strictness for broader routed-model compatibility.
- - OpenAI message conversion now normalizes malformed historical `tool_use.input` payloads (for example, top-level arrays) into object-shaped function arguments before replay.
- - Anthropic message conversion now normalizes malformed historical `tool_use.input` payloads (for example, top-level arrays) into object-shaped arguments before replaying them to Anthropic/Bedrock APIs.
- - Provider transform converters now coerce internal `file` content blocks into provider-native text payloads for both user messages and `tool_result` replay content (OpenAI, Gemini, Anthropic, R1).
- - AI SDK community-provider message conversion (`toAiSdkMessages`) now lives in `src/providers/transform/ai-sdk-community-format.ts` and applies the same `file` block coercion for user/tool-result replay payloads.
- - Retry utility uses explicit function composition: `retryAsync(...)` for promises and `retryStream(...)` for async generators.
-
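The notes above mention composing retry behavior explicitly via `retryStream(...)` for async generators. Here is a minimal sketch of what such a composition can look like; the signature, the `maxAttempts` option, and the "never retry after first yield" policy are assumptions for illustration, not the package's actual implementation:

```typescript
// Illustrative retry wrapper for an async generator, in the spirit of
// the `retryStream(...)` composition described above. Names and options
// are assumptions, not the package's real API.
async function* retryStream<T>(
  makeStream: () => AsyncGenerator<T>,
  maxAttempts = 3,
): AsyncGenerator<T> {
  for (let attempt = 1; ; attempt++) {
    let yielded = false;
    try {
      for await (const chunk of makeStream()) {
        yielded = true;
        yield chunk;
      }
      return;
    } catch (error) {
      // Once chunks have been emitted, retrying would duplicate output,
      // so only errors raised before the first yield are retried.
      if (yielded || attempt >= maxAttempts) throw error;
    }
  }
}

// Usage: a flaky stream that fails once before any output, then succeeds.
let calls = 0;
async function* flaky(): AsyncGenerator<string> {
  calls++;
  if (calls === 1) throw new Error("transient");
  yield "hello";
}

const chunks: string[] = [];
for await (const c of retryStream(flaky)) chunks.push(c);
console.log(chunks); // [ 'hello' ]
```

Explicit composition like this keeps the retry policy visible at the call site, which is the stated reason the package avoids method decorators.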
- ## Legacy Provider Migration Status
-
- Source of truth for the legacy list: `src/core/api/index.ts`.
-
- | Provider | Old code | New package |
- | --- | --- | --- |
- | `aihubmix` | ✅ | ✅ |
- | `anthropic` | ✅ | ✅ |
- | `asksage` | ✅ | ✅ |
- | `baseten` | ✅ | ✅ |
- | `bedrock` | ✅ | ✅ |
- | `cerebras` | ✅ | ✅ |
- | `claude-code` | ✅ | ✅ |
- | `cline` | ✅ | ✅ |
- | `deepseek` | ✅ | ✅ |
- | `dify` | ✅ | ✅ |
- | `doubao` | ✅ | ✅ |
- | `fireworks` | ✅ | ✅ |
- | `gemini` | ✅ | ✅ |
- | `groq` | ✅ | ✅ |
- | `hicap` | ✅ | ✅ |
- | `huawei-cloud-maas` | ✅ | ✅ |
- | `huggingface` | ✅ | ✅ |
- | `litellm` | ✅ | ✅ |
- | `lmstudio` | ✅ | ✅ |
- | `minimax` | ✅ | ✅ |
- | `mistral` | ✅ | ✅ |
- | `moonshot` | ✅ | ✅ |
- | `nebius` | ✅ | ✅ |
- | `nousResearch` | ✅ | ✅ |
- | `oca` | ✅ | ✅ |
- | `ollama` | ✅ | ✅ |
- | `openai` | ✅ | ✅ |
- | `openai-codex` | ✅ | ✅ |
- | `openai-native` | ✅ | ✅ |
- | `opencode` | ❌ | ✅ |
- | `openrouter` | ✅ | ✅ |
- | `qwen` | ✅ | ✅ |
- | `qwen-code` | ✅ | ✅ |
- | `requesty` | ✅ | ✅ |
- | `sambanova` | ✅ | ✅ |
- | `sapaicore` | ✅ | ✅ |
- | `together` | ✅ | ✅ |
- | `vercel-ai-gateway` | ✅ | ✅ |
- | `vertex` | ✅ | ✅ |
- | `vscode-lm` | ✅ | ❌ |
- | `xai` | ✅ | ✅ |
- | `zai` | ✅ | ✅ |
-
- `vscode-lm` is client-hosted (VS Code LM/Copilot runtime), so it is not a built-in package provider. VS Code clients can add it via `registerHandler()` or `registerAsyncHandler()`.
-
- `New package` is marked `✅` only when the provider has a built-in handler route in `src/providers/index.ts` (explicit or OpenAI-compatible defaults), not just an enum entry.
-
- ### Support Snapshot (Docs Sync)
-
- - Legacy provider rows tracked: `42`
- - Built-in in `@clinebot/llms`: `41`
- - Not built-in in `@clinebot/llms`: `1` (`vscode-lm`)
- - Newly built-in vs legacy: `opencode`, `dify`, `mistral`, `asksage`
-
- ## Live Provider Smoke Test
-
- A live smoke test is available at `src/live-providers.test.ts`. It reads provider
- configurations from a JSON file, sends a simple prompt to each configured provider,
- and reports which providers returned errors.
-
- The test is opt-in and only runs when:
-
- - `LLMS_LIVE_TESTS=1`
- - `LLMS_LIVE_PROVIDERS_PATH=/absolute/path/to/providers.json`
-
- Optional:
-
- - `LLMS_LIVE_PROVIDER_TIMEOUT_MS=90000` (per-provider timeout; default `90000`)
-
- Run:
+ `@clinebot/llms` is the model and provider layer for the Cline SDK. It gives
+ you typed provider settings, model catalogs, and handler creation for supported
+ LLM backends.
+
+ ## What You Get
+
+ - `providers` for creating runtime handlers and working with provider settings
+ - `models` for browsing generated model catalogs
+ - `defineLlmsConfig(...)` for declarative SDK config
+ - `createLlmsSdk(...)` for higher-level provider/model workflows
+
+ ## Installation
 
  ```bash
- cd sdk-wip/packages/llms
- LLMS_LIVE_TESTS=1 LLMS_LIVE_PROVIDERS_PATH=/abs/path/providers.json bun test src/live-providers.test.ts
+ npm install @clinebot/llms zod
  ```
 
- Supported JSON formats:
-
- 1. Stored providers.json style:
-
- ```json
- {
-   "version": 1,
-   "providers": {
-     "anthropic": {
-       "settings": {
-         "provider": "anthropic",
-         "apiKey": "sk-...",
-         "model": "claude-sonnet-4-20250514"
-       }
-     }
-   }
+ ## Quick Start
+
+ ```ts
+ import { providers } from "@clinebot/llms";
+
+ const handler = providers.createHandler({
+   provider: "anthropic",
+   apiKey: process.env.ANTHROPIC_API_KEY ?? "",
+   model: "claude-sonnet-4-6",
+ });
+
+ const stream = handler.createMessage({
+   systemPrompt: "You are a concise assistant.",
+   messages: [{ role: "user", content: [{ type: "text", text: "Say hello." }] }],
+ });
+
+ for await (const chunk of stream) {
+   console.log(chunk);
  }
  ```
 
- 2. Direct array of `ProviderSettings` entries:
+ ## Main APIs
 
- ```json
- [
-   {
-     "provider": "openrouter",
-     "apiKey": "sk-...",
-     "model": "anthropic/claude-sonnet-4"
-   }
- ]
- ```
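The removed smoke-test docs above accept two providers.json shapes: a stored `{ version, providers }` file and a bare array of settings entries. A loader that handles both can be sketched as follows; `normalizeProvidersJson` and the slimmed-down `ProviderSettings` type are illustrative stand-ins, not the package's actual code:

```typescript
// Illustrative normalizer for the two providers.json shapes documented
// above. The type below is a stand-in for the package's ProviderSettings.
type ProviderSettings = { provider: string; apiKey?: string; model?: string };

type StoredProvidersFile = {
  version: number;
  providers: Record<string, { settings: ProviderSettings }>;
};

function normalizeProvidersJson(raw: unknown): ProviderSettings[] {
  // Format 2: a direct array of settings entries passes through as-is.
  if (Array.isArray(raw)) return raw as ProviderSettings[];
  // Format 1: unwrap each stored entry's `settings` payload.
  const stored = raw as StoredProvidersFile;
  return Object.values(stored.providers).map((entry) => entry.settings);
}

const stored = {
  version: 1,
  providers: {
    anthropic: {
      settings: { provider: "anthropic", model: "claude-sonnet-4-20250514" },
    },
  },
};

console.log(normalizeProvidersJson(stored).map((s) => s.provider)); // [ 'anthropic' ]
```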
+ ### Providers
+
+ Use the `providers` namespace for:
+
+ - `createHandler(...)` and `createHandlerAsync(...)`
+ - `ProviderSettings` and `ProviderSettingsSchema`
+ - `ProviderConfig`
+ - `Message` and `ApiStreamChunk`
+
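Since `ApiStreamChunk` is a discriminated union, consumers should narrow on `type` rather than cast. A minimal sketch of that pattern, using a made-up two-variant union rather than the real `ApiStreamChunk` definition:

```typescript
// Illustrative narrowing over a stream-chunk discriminated union. The
// variant names and fields are assumptions; the real ApiStreamChunk
// union is exported from the `providers` namespace.
type StreamChunk =
  | { type: "text"; text: string }
  | { type: "usage"; inputTokens: number; outputTokens: number };

function renderChunk(chunk: StreamChunk): string {
  switch (chunk.type) {
    case "text":
      return chunk.text; // narrowed: `text` is available here
    case "usage":
      return `[usage in=${chunk.inputTokens} out=${chunk.outputTokens}]`;
  }
}

console.log(renderChunk({ type: "text", text: "hi" })); // hi
```

Exhaustive `switch` narrowing like this also makes the compiler flag any union variant a consumer forgets to handle.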
+ ### Models
+
+ Use the `models` namespace when you need generated provider/model metadata for
+ selection UIs, defaults, or validation.
+
+ ## Entry Points
+
+ - `@clinebot/llms`: default package entrypoint
+ - `@clinebot/llms/node`: explicit Node/runtime entrypoint
+ - `@clinebot/llms/browser`: browser-safe bundle
+
+ ## Related Packages
+
+ - `@clinebot/agents`: agent loop and tool execution
+ - `@clinebot/core`: stateful runtime assembly and provider settings storage
+
+ ## More Examples
+
+ - Workspace overview: [README.md](https://github.com/cline/cline/blob/main/README.md)
+ - API and architecture references: [DOC.md](https://github.com/cline/cline/blob/main/DOC.md), [ARCHITECTURE.md](https://github.com/cline/cline/blob/main/ARCHITECTURE.md)