openclaw-abacusai-auth 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2026 tonyhu2006
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
package/README.md ADDED
@@ -0,0 +1,338 @@
+ # AbacusAI Auth (OpenClaw Plugin)
+
+ Third-party provider plugin that integrates **AbacusAI** models into OpenClaw via
+ **direct connection** to AbacusAI's **RouteLLM** endpoint. The plugin configures
+ core compatibility options (`requiresAdditionalPropertiesFalse`, `supportsStrictMode`)
+ so that the OpenClaw Agent can use AbacusAI-hosted models (Claude, Gemini, GPT,
+ DeepSeek, Qwen, Grok, Kimi, Llama, and more) with full **multi-tool calling** support.
+
+ | Field           | Value                                       |
+ | --------------- | ------------------------------------------- |
+ | **Package**     | `openclaw-abacusai-auth`                    |
+ | **Entry**       | `./index.ts`                                |
+ | **Provider ID** | `abacusai`                                  |
+ | **Aliases**     | `abacus`, `abacus-ai`, `abacusai-code-mode` |
+ | **API style**   | `openai-completions` (direct connection)    |
+ | **Upstream**    | `https://routellm.abacus.ai/v1`             |
+
+ ---
+
+ ## Installation
+
+ ### Option 1: Install from npm (recommended)
+
+ ```bash
+ openclaw plugins install openclaw-abacusai-auth
+ ```
+
+ ### Option 2: Manual installation
+
+ 1. Clone this repository:
+    ```bash
+    git clone https://github.com/tonyhu2006/openclaw-abacusai-auth.git ~/.openclaw/extensions/abacusai-auth
+    ```
+
+ 2. Install dependencies:
+    ```bash
+    cd ~/.openclaw/extensions/abacusai-auth
+    npm install
+    ```
+
+ 3. Enable the plugin:
+    ```bash
+    openclaw plugins enable abacusai-auth
+    ```
+
+ ---
+
+ ## Quick Start
+
+ ### 1. Authenticate
+
+ ```bash
+ openclaw models auth login --provider abacusai --set-default
+ ```
+
+ The interactive login flow will:
+
+ 1. Attempt to **auto-detect** credentials from a local AbacusAI Code Mode installation.
+ 2. Fall back to the `ABACUSAI_API_KEY` environment variable.
+ 3. Prompt for **manual entry** if neither is found.
+ 4. **Validate** the API key against `https://api.abacus.ai/api/v0/describeUser`.
+ 5. Let you select which models to register (defaults to all supported models).
+ 6. Write the provider config to `openclaw.json` with compat options.
+
+ ### 2. Restart the Gateway
+
+ ```bash
+ openclaw gateway run
+ ```
+
+ ### 3. Use AbacusAI models
+
+ ```bash
+ openclaw send "Hello" --model abacusai/gemini-3-flash-preview
+ ```
+
+ ---
+
+ ## Architecture
+
+ ```
+ ┌──────────────────────────────────────────────────────────────────┐
+ │                   OpenClaw Agent (Pi Agent)                      │
+ │            Sends standard OpenAI-compatible requests             │
+ │            (POST /v1/chat/completions with tools[])              │
+ └──────────────┬───────────────────────────────────────────────────┘
+                │
+                ▼
+ ┌──────────────────────────────────────────────────────────────────┐
+ │       Core Tool Schema Normalization (pi-tools.schema.ts)        │
+ │                                                                  │
+ │  1. Reads provider compat from config                            │
+ │  2. If requiresAdditionalPropertiesFalse: true                   │
+ │     → Sets additionalProperties: false in tool schemas           │
+ │  3. If supportsStrictMode: false                                 │
+ │     → pi-ai omits `strict` field from tool definitions           │
+ └──────────────┬───────────────────────────────────────────────────┘
+                │  https://routellm.abacus.ai/v1
+                ▼
+ ┌──────────────────────────────────────────────────────────────────┐
+ │                   AbacusAI RouteLLM Endpoint                     │
+ │             OpenAI-compatible API with function calling          │
+ │       Routes to Claude, Gemini, GPT, DeepSeek, Llama, etc.       │
+ └──────────────────────────────────────────────────────────────────┘
+ ```
+
+ **Why core compat options instead of a local proxy?** AbacusAI's RouteLLM is
+ _mostly_ OpenAI-compatible but has two schema requirements:
+
+ 1. Rejects the `strict` field in tool schemas → `supportsStrictMode: false`
+ 2. Requires `additionalProperties: false` → `requiresAdditionalPropertiesFalse: true`
+
+ By using core `ModelCompatConfig` options, we:
+
+ - Avoid the complexity of a local proxy
+ - Enable the same fix for other providers with similar requirements
+ - Let pi-ai handle the `strict` field natively (it already supports `supportsStrictMode`)
+
+ ---
121
+ ## Supported Models
122
+
123
+ The following models are registered by default (verified February 2026):
124
+
125
+ | Model ID | Family |
126
+ | ----------------------------- | -------------------- |
127
+ | `gemini-3-flash-preview` | Google Gemini |
128
+ | `gemini-3-pro-preview` | Google Gemini |
129
+ | `gemini-2.5-flash` | Google Gemini |
130
+ | `gemini-2.5-pro` | Google Gemini |
131
+ | `gpt-5.2` | OpenAI GPT |
132
+ | `gpt-5.1` | OpenAI GPT |
133
+ | `gpt-5-mini` | OpenAI GPT |
134
+ | `claude-sonnet-4-5-20250929` | Anthropic Claude |
135
+ | `claude-opus-4-6` | Anthropic Claude |
136
+ | `claude-haiku-4-5-20251001` | Anthropic Claude |
137
+ | `deepseek-ai/DeepSeek-V3.2` | DeepSeek |
138
+ | `deepseek-ai/DeepSeek-R1` | DeepSeek |
139
+ | `kimi-k2.5` | Moonshot Kimi |
140
+ | `qwen3-max` | Alibaba Qwen |
141
+ | `grok-4-1-fast-non-reasoning` | xAI Grok |
142
+ | `route-llm` | AbacusAI Auto-Router |
143
+
144
+ All models are configured with:
145
+
146
+ - **Context window**: 200,000 tokens
147
+ - **Max output tokens**: 8,192 tokens
148
+ - **Input modalities**: text, image
149
+ - **API**: `openai-completions`
150
+
151
+ You can customize the model list during the interactive login flow.
152
+
153
+ ---
+
+ ## Credential Resolution
+
+ The plugin resolves API keys using a multi-tier fallback strategy, checked in order:
+
+ ### During Login (`openclaw models auth login`)
+
+ 1. **Local AbacusAI Code Mode installation** — scans platform-specific paths:
+    - **Windows**: `%APPDATA%\AbacusAI\User\globalStorage\credentials.json`,
+      `%APPDATA%\AbacusAI Code Mode\User\globalStorage\credentials.json`,
+      `%USERPROFILE%\.abacusai\credentials.json`, `%USERPROFILE%\.abacusai\config.json`
+    - **macOS**: `~/Library/Application Support/AbacusAI/...`, `~/.abacusai/...`
+    - **Linux**: `~/.config/AbacusAI/...`, `~/.abacusai/...`
+    - Accepts fields: `apiKey`, `api_key`, `token`, `accessToken`, `access_token`
+ 2. **Environment variable** — `ABACUSAI_API_KEY`
+ 3. **Manual entry** — interactive prompt
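
The fallback order above can be sketched as follows. `resolveLoginKey` is a hypothetical helper used only for illustration; the real plugin also asks for interactive confirmation before using an auto-detected or environment key.

```typescript
// Illustrative sketch of the login-time credential fallback order.
// `resolveLoginKey` is hypothetical, not part of the plugin's public API.
function resolveLoginKey(
  readLocalCredential: () => string | null,
  env: Record<string, string | undefined>,
  promptForKey: () => string,
): string {
  const local = readLocalCredential(); // 1. Code Mode credential files
  if (local) return local;
  const envKey = env.ABACUSAI_API_KEY?.trim(); // 2. environment variable
  if (envKey) return envKey;
  return promptForKey(); // 3. interactive manual entry
}
```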
+
+ ---
+
+ ## Core Compatibility Options
+
+ This plugin configures two core `ModelCompatConfig` options that are essential
+ for AbacusAI RouteLLM compatibility:
+
+ ### `requiresAdditionalPropertiesFalse`
+
+ AbacusAI RouteLLM requires `additionalProperties: false` in tool parameter schemas.
+ When this option is set to `true`, the `normalizeToolParameters` function in
+ `pi-tools.schema.ts` sets `additionalProperties: false` instead of the default `true`.
+
+ ### `supportsStrictMode`
+
+ AbacusAI RouteLLM rejects the `strict` field in tool definitions. When this option
+ is set to `false`, the pi-ai library omits the `strict` field from tool definitions.
+
+ ### Provider Configuration
+
+ The plugin configures the AbacusAI provider with these compat options:
+
+ ```json
+ {
+   "baseUrl": "https://routellm.abacus.ai/v1",
+   "api": "openai-completions",
+   "auth": "token",
+   "compat": {
+     "requiresAdditionalPropertiesFalse": true,
+     "supportsStrictMode": false
+   }
+ }
+ ```
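
To make the effect of these two options concrete, here is a minimal sketch (assumed shapes, not the actual OpenClaw `normalizeToolParameters` implementation) of how a tool definition is rewritten before dispatch:

```typescript
// Assumed shapes for illustration only; the real OpenClaw types differ.
type ToolDef = {
  name: string;
  strict?: boolean;
  parameters: {
    type: "object";
    properties: Record<string, unknown>;
    additionalProperties?: boolean;
  };
};

type Compat = {
  requiresAdditionalPropertiesFalse?: boolean;
  supportsStrictMode?: boolean;
};

function applyCompat(tool: ToolDef, compat: Compat): ToolDef {
  const out: ToolDef = { ...tool, parameters: { ...tool.parameters } };
  if (compat.requiresAdditionalPropertiesFalse) {
    out.parameters.additionalProperties = false; // RouteLLM rejects open schemas
  }
  if (compat.supportsStrictMode === false) {
    delete out.strict; // RouteLLM rejects the `strict` field
  }
  return out;
}
```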
+
+ ---
+
+ ## Configuration Reference
+
+ After login, the plugin writes the following to `~/.openclaw/openclaw.json`:
+
+ ```jsonc
+ {
+   "models": {
+     "providers": {
+       "abacusai": {
+         "baseUrl": "https://routellm.abacus.ai/v1",
+         "api": "openai-completions",
+         "auth": "token",
+         "compat": {
+           "requiresAdditionalPropertiesFalse": true,
+           "supportsStrictMode": false,
+         },
+         "models": [
+           {
+             "id": "gemini-3-flash-preview",
+             "name": "gemini-3-flash-preview",
+             "api": "openai-completions",
+             "reasoning": false,
+             "input": ["text", "image"],
+             "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
+             "contextWindow": 200000,
+             "maxTokens": 8192,
+           },
+           // ... other models
+         ],
+       },
+     },
+   },
+ }
+ ```
+
+ Credentials are stored separately in `~/.openclaw/agents/<agent>/agent/auth-profiles.json`:
+
+ ```jsonc
+ {
+   "profiles": {
+     "abacusai:<email-or-default>": {
+       "type": "token",
+       "provider": "abacusai",
+       "token": "<api-key>",
+     },
+   },
+ }
+ ```
+
+ ### Environment Variables
+
+ | Variable             | Description                                                    |
+ | -------------------- | -------------------------------------------------------------- |
+ | `ABACUSAI_API_KEY`   | API key fallback (used if no saved profile is found)           |
+ | `OPENCLAW_STATE_DIR` | Override the OpenClaw state directory (default: `~/.openclaw`) |
+
+ ---
+
+ ## Troubleshooting
+
+ ### Tool calling not working
+
+ If tool calls fail with schema-related errors, ensure the provider has the
+ correct compat options configured:
+
+ ```json
+ "compat": {
+   "requiresAdditionalPropertiesFalse": true,
+   "supportsStrictMode": false
+ }
+ ```
+
+ ### API key validation failed
+
+ Your API key may have been revoked or expired. Generate a new one at
+ <https://abacus.ai/app/profile/apikey> and re-authenticate:
+
+ ```bash
+ openclaw models auth login --provider abacusai --set-default
+ ```
+
+ ### Plugin not found
+
+ If AbacusAI models are not available, ensure the plugin is installed:
+
+ ```bash
+ openclaw plugins install openclaw-abacusai-auth
+ ```
+
+ ---
+
+ ## Getting an API Key
+
+ 1. Sign in at <https://abacus.ai>
+ 2. Navigate to **Profile → API Keys** (<https://abacus.ai/app/profile/apikey>)
+ 3. Click **Generate new API Key**
+ 4. Copy the key (starts with `s2_...`)
+
+ ---
+
+ ## File Structure
+
+ ```
+ openclaw-abacusai-auth/
+ ├── index.ts             # Plugin source (auth, credential detection, model definitions)
+ ├── package.json         # Package metadata
+ ├── openclaw.plugin.json # Plugin manifest
+ └── README.md            # This file
+ ```
+
+ ## Key Constants
+
+ | Constant                 | Value                           | Description                            |
+ | ------------------------ | ------------------------------- | -------------------------------------- |
+ | `ROUTELLM_BASE`          | `https://routellm.abacus.ai/v1` | Upstream RouteLLM endpoint             |
+ | `ABACUS_API`             | `https://api.abacus.ai/api/v0`  | AbacusAI REST API (for key validation) |
+ | `DEFAULT_CONTEXT_WINDOW` | 200,000                         | Default context window for all models  |
+ | `DEFAULT_MAX_TOKENS`     | 8,192                           | Default max output tokens              |
+
+ ---
+
+ ## License
+
+ MIT
+
+ ## Author
+
+ [tonyhu2006](https://github.com/tonyhu2006)
+
+ ## Contributing
+
+ Issues and PRs welcome at <https://github.com/tonyhu2006/openclaw-abacusai-auth>
package/index.ts ADDED
@@ -0,0 +1,683 @@
+ import { readFileSync, readdirSync } from "node:fs";
+ import { createServer, type IncomingMessage, type ServerResponse } from "node:http";
+ import { homedir } from "node:os";
+ import { join } from "node:path";
+ import { emptyPluginConfigSchema, fetchWithSsrFGuard } from "openclaw/plugin-sdk";
+
+ // ---------------------------------------------------------------------------
+ // Constants
+ // ---------------------------------------------------------------------------
+
+ const ABACUS_API = "https://api.abacus.ai/api/v0";
+ const ROUTELLM_BASE = "https://routellm.abacus.ai/v1";
+ const DEFAULT_CONTEXT_WINDOW = 200_000;
+ const DEFAULT_MAX_TOKENS = 8192;
+
+ // Proxy configuration
+ const PROXY_PORT = 18790;
+ const PROXY_HOST = "127.0.0.1";
+
+ // Models available on AbacusAI RouteLLM endpoint (OpenAI-compatible, with
+ // function calling support). Verified 2026-02.
+ const DEFAULT_MODEL_IDS = [
+   // Google Gemini
+   "gemini-3-flash-preview",
+   "gemini-3-pro-preview",
+   "gemini-2.5-flash",
+   "gemini-2.5-pro",
+   // OpenAI GPT
+   "gpt-5.2",
+   "gpt-5.2-codex",
+   "gpt-5.1",
+   "gpt-5.1-codex",
+   "gpt-5.1-codex-max",
+   "gpt-5",
+   "gpt-5-codex",
+   "gpt-5-mini",
+   "gpt-5-nano",
+   // OpenAI o-series (reasoning)
+   "o3-pro",
+   "o3",
+   "o3-mini",
+   "o4-mini",
+   // Anthropic Claude
+   "claude-sonnet-4-5-20250929",
+   "claude-opus-4-6",
+   "claude-opus-4-5-20251101",
+   "claude-sonnet-4-20250514",
+   "claude-haiku-4-5-20251001",
+   // DeepSeek
+   "deepseek-ai/DeepSeek-V3.2",
+   "deepseek-ai/DeepSeek-V3.1-Terminus",
+   "deepseek-ai/DeepSeek-R1",
+   // xAI Grok
+   "grok-4-0709",
+   "grok-4-1-fast-non-reasoning",
+   "grok-4-fast-non-reasoning",
+   "grok-code-fast-1",
+   // Meta Llama
+   "meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8",
+   "meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo",
+   "llama-3.3-70b-versatile",
+   // Qwen
+   "qwen3-max",
+   "Qwen/Qwen3-235B-A22B-Instruct-2507",
+   "Qwen/Qwen3-32B",
+   "qwen/qwen3-coder-480b-a35b-instruct",
+   // Moonshot Kimi
+   "kimi-k2.5",
+   "kimi-k2-turbo-preview",
+   // Z.AI GLM
+   "zai-org/glm-4.7",
+   "zai-org/glm-4.6",
+   // Other
+   "openai/gpt-oss-120b",
+   // AbacusAI auto-router
+   "route-llm",
+ ];
+
+ // ---------------------------------------------------------------------------
+ // Credential detection (Code Mode / env / manual)
+ // ---------------------------------------------------------------------------
+
+ const CODE_MODE_CREDENTIAL_PATHS = {
+   win32: [
+     join(homedir(), "AppData", "Roaming", "AbacusAI", "User", "globalStorage", "credentials.json"),
+     join(
+       homedir(),
+       "AppData",
+       "Roaming",
+       "AbacusAI Code Mode",
+       "User",
+       "globalStorage",
+       "credentials.json",
+     ),
+     join(homedir(), ".abacusai", "credentials.json"),
+     join(homedir(), ".abacusai", "config.json"),
+   ],
+   darwin: [
+     join(
+       homedir(),
+       "Library",
+       "Application Support",
+       "AbacusAI",
+       "User",
+       "globalStorage",
+       "credentials.json",
+     ),
+     join(
+       homedir(),
+       "Library",
+       "Application Support",
+       "AbacusAI Code Mode",
+       "User",
+       "globalStorage",
+       "credentials.json",
+     ),
+     join(homedir(), ".abacusai", "credentials.json"),
+     join(homedir(), ".abacusai", "config.json"),
+   ],
+   linux: [
+     join(homedir(), ".config", "AbacusAI", "User", "globalStorage", "credentials.json"),
+     join(homedir(), ".config", "AbacusAI Code Mode", "User", "globalStorage", "credentials.json"),
+     join(homedir(), ".abacusai", "credentials.json"),
+     join(homedir(), ".abacusai", "config.json"),
+   ],
+ };
+
+ type CredentialFile = {
+   apiKey?: string;
+   api_key?: string;
+   token?: string;
+   accessToken?: string;
+   access_token?: string;
+ };
+
+ function tryReadLocalCredential(): string | null {
+   const platform = process.platform as "win32" | "darwin" | "linux";
+   const paths = CODE_MODE_CREDENTIAL_PATHS[platform] ?? CODE_MODE_CREDENTIAL_PATHS.linux;
+   for (const credPath of paths) {
+     try {
+       const raw = readFileSync(credPath, "utf8");
+       const data = JSON.parse(raw) as CredentialFile;
+       const key =
+         data.apiKey?.trim() ||
+         data.api_key?.trim() ||
+         data.token?.trim() ||
+         data.accessToken?.trim() ||
+         data.access_token?.trim();
+       if (key) {
+         return key;
+       }
+     } catch {
+       // not found — try next
+     }
+   }
+   return null;
+ }
+
+ // ---------------------------------------------------------------------------
+ // Saved credential recovery (for proxy auto-restart after reboot)
+ // ---------------------------------------------------------------------------
+
+ function resolveOpenClawStateDir(): string {
+   const override = process.env.OPENCLAW_STATE_DIR?.trim() || process.env.CLAWDBOT_STATE_DIR?.trim();
+   if (override) {
+     return override;
+   }
+   return join(homedir(), ".openclaw");
+ }
+
+ function tryRecoverApiKey(): string | null {
+   const stateDir = resolveOpenClawStateDir();
+
+   // Helper: extract abacusai API key from an auth-profiles.json file
+   function extractFromAuthFile(authPath: string): string | null {
+     try {
+       const raw = JSON.parse(readFileSync(authPath, "utf8")) as {
+         profiles?: Record<string, { token?: string; key?: string; provider?: string }>;
+       };
+       if (raw.profiles) {
+         for (const [id, profile] of Object.entries(raw.profiles)) {
+           if (!id.startsWith("abacusai:")) {
+             continue;
+           }
+           // Credentials may use "token" or "key" field depending on auth flow
+           const secret = profile.token?.trim() || profile.key?.trim();
+           if (secret) {
+             return secret;
+           }
+         }
+       }
+     } catch {
+       // file not found or unreadable
+     }
+     return null;
+   }
+
+   // Primary: search agents/*/agent/auth-profiles.json (actual storage location)
+   try {
+     const agentsDir = join(stateDir, "agents");
+     for (const agentName of readdirSync(agentsDir)) {
+       const authPath = join(agentsDir, agentName, "agent", "auth-profiles.json");
+       const key = extractFromAuthFile(authPath);
+       if (key) {
+         return key;
+       }
+     }
+   } catch {
+     // agents dir not found
+   }
+
+   // Fallback: try root-level auth-profiles.json (legacy or future layout)
+   const rootKey = extractFromAuthFile(join(stateDir, "auth-profiles.json"));
+   if (rootKey) {
+     return rootKey;
+   }
+
+   // Fallback: try environment variable
+   const envKey = process.env.ABACUSAI_API_KEY?.trim();
+   if (envKey) {
+     return envKey;
+   }
+   // Fallback: try local Code Mode credentials
+   return tryReadLocalCredential();
+ }
+
+ // ---------------------------------------------------------------------------
+ // AbacusAI API helpers
+ // ---------------------------------------------------------------------------
+
+ async function validateApiKey(
+   apiKey: string,
+ ): Promise<{ valid: boolean; email?: string; error?: string }> {
+   try {
+     const { response: r, release } = await fetchWithSsrFGuard({
+       url: `${ABACUS_API}/describeUser`,
+       init: {
+         method: "GET",
+         headers: { apiKey, "Content-Type": "application/json" },
+       },
+       timeoutMs: 15_000,
+     });
+     try {
+       if (!r.ok) {
+         return { valid: false, error: r.status === 403 ? "Invalid API key" : `HTTP ${r.status}` };
+       }
+       const d = (await r.json()) as { success?: boolean; result?: { email?: string } };
+       if (!d.success) {
+         return { valid: false, error: "API returned unsuccessful response" };
+       }
+       return { valid: true, email: d.result?.email?.trim() };
+     } finally {
+       await release();
+     }
+   } catch (err) {
+     return {
+       valid: false,
+       error: `Validation failed: ${err instanceof Error ? err.message : String(err)}`,
+     };
+   }
+ }
+
+ // ---------------------------------------------------------------------------
+ // Local proxy for RouteLLM response normalization
+ // ---------------------------------------------------------------------------
+ // RouteLLM responses are missing the `id` and `object` fields that the OpenAI
+ // SDK expects. This proxy adds those fields to ensure compatibility.
+
+ let proxyServer: ReturnType<typeof createServer> | null = null;
+ let proxyApiKey = "";
+
+ function readBody(req: IncomingMessage): Promise<Buffer> {
+   return new Promise((resolve, reject) => {
+     const chunks: Buffer[] = [];
+     req.on("data", (chunk: Buffer) => chunks.push(chunk));
+     req.on("end", () => resolve(Buffer.concat(chunks)));
+     req.on("error", reject);
+   });
+ }
+
+ function sendJsonResponse(res: ServerResponse, status: number, data: unknown) {
+   const body = JSON.stringify(data);
+   res.writeHead(status, { "Content-Type": "application/json" });
+   res.end(body);
+ }
+
+ function generateChunkId(): string {
+   // `crypto` is the global WebCrypto object (available in Node.js 19+)
+   return `chatcmpl-${crypto.randomUUID()}`;
+ }
+
+ /**
+  * Normalize an SSE chunk to add the missing `id` and `object` fields.
+  */
+ function normalizeSseChunk(line: string, chunkId: string): string {
+   if (!line.startsWith("data: ") || line === "data: [DONE]") {
+     return line;
+   }
+   try {
+     const json = JSON.parse(line.slice(6)) as Record<string, unknown>;
+     if (!("id" in json)) {
+       json.id = chunkId;
+     }
+     if (!("object" in json)) {
+       json.object = "chat.completion.chunk";
+     }
+     return `data: ${JSON.stringify(json)}`;
+   } catch {
+     return line;
+   }
+ }
+
+ /**
+  * Strip the `strict` field from tools — RouteLLM doesn't support it.
+  */
+ function stripStrictFromTools(tools: unknown[]): unknown[] {
+   return tools.map((t) => {
+     if (!t || typeof t !== "object") {
+       return t;
+     }
+     const copy = { ...(t as Record<string, unknown>) };
+     delete copy.strict;
+     if (copy.function && typeof copy.function === "object") {
+       const fn = { ...(copy.function as Record<string, unknown>) };
+       delete fn.strict;
+       copy.function = fn;
+     }
+     return copy;
+   });
+ }
+
+ async function handleProxyRequest(req: IncomingMessage, res: ServerResponse) {
+   const path = req.url ?? "/";
+   const target = `${ROUTELLM_BASE}${path}`;
+   const headers: Record<string, string> = {
+     Authorization: `Bearer ${proxyApiKey}`,
+     "Content-Type": "application/json",
+   };
+
+   let body: string | undefined;
+   if (req.method === "POST") {
+     const raw = await readBody(req);
+     const parsed = JSON.parse(raw.toString()) as Record<string, unknown>;
+     // Strip `strict` field from tools - RouteLLM doesn't support it
+     if (Array.isArray(parsed.tools)) {
+       parsed.tools = stripStrictFromTools(parsed.tools);
+     }
+     body = JSON.stringify(parsed);
+   }
+
+   const { response: upstream, release } = await fetchWithSsrFGuard({
+     url: target,
+     init: {
+       method: req.method ?? "GET",
+       headers: body ? headers : { Authorization: headers.Authorization },
+       body: body ?? undefined,
+     },
+     timeoutMs: 180_000,
+   });
+
+   // Detect expired/revoked API key at runtime
+   if (upstream.status === 401 || upstream.status === 403) {
+     const errBody = await upstream.text().catch(() => "");
+     await release();
+     console.error(
+       `[abacusai] Upstream returned ${upstream.status} — API key may be expired or revoked.`,
+     );
+     sendJsonResponse(res, upstream.status, {
+       error: {
+         message: `AbacusAI API key expired or invalid (HTTP ${upstream.status}).`,
+         type: "auth_expired",
+         upstream_body: errBody.slice(0, 500),
+       },
+     });
+     return;
+   }
+
+   const ct = upstream.headers.get("content-type") ?? "application/json";
+   const chunkId = generateChunkId();
+
+   // Stream SSE response with normalization
+   if (ct.includes("text/event-stream") && upstream.body) {
+     res.writeHead(upstream.status, {
+       "Content-Type": "text/event-stream",
+       "Cache-Control": "no-cache",
+       Connection: "keep-alive",
+     });
+     const reader = (upstream.body as ReadableStream<Uint8Array>).getReader();
+     const decoder = new TextDecoder();
+     let buffer = "";
+
+     const pump = async () => {
+       for (;;) {
+         const { done, value } = await reader.read();
+         if (done) {
+           // Process any remaining buffer
+           if (buffer.trim()) {
+             const normalized = normalizeSseChunk(buffer.trim(), chunkId);
+             res.write(normalized + "\n\n");
+           }
+           res.end();
+           await release();
+           return;
+         }
+         buffer += decoder.decode(value, { stream: true });
+         const lines = buffer.split("\n");
+         buffer = lines.pop() ?? "";
+         for (const line of lines) {
+           if (line.trim()) {
+             const normalized = normalizeSseChunk(line, chunkId);
+             res.write(normalized + "\n");
+           } else {
+             res.write("\n");
+           }
+         }
+       }
+     };
+     pump().catch(async () => {
+       res.end();
+       await release();
+     });
+   } else {
+     // Non-streaming response - add id and object fields
+     const data = await upstream.text();
+     await release();
+     try {
+       const json = JSON.parse(data) as Record<string, unknown>;
+       if (!("id" in json)) {
+         json.id = chunkId;
+       }
+       if (!("object" in json)) {
+         json.object = "chat.completion";
+       }
+       res.writeHead(upstream.status, { "Content-Type": "application/json" });
+       res.end(JSON.stringify(json));
+     } catch {
+       res.writeHead(upstream.status, { "Content-Type": ct });
+       res.end(data);
+     }
+   }
+ }
+
+ function startProxy(apiKey: string): Promise<void> {
+   return new Promise((resolve, reject) => {
+     if (proxyServer) {
+       proxyApiKey = apiKey;
+       resolve();
+       return;
+     }
+     proxyApiKey = apiKey;
+     proxyServer = createServer((req, res) => {
+       handleProxyRequest(req, res).catch((err) => {
+         console.error("[abacusai] proxy error:", err);
+         sendJsonResponse(res, 500, { error: { message: String(err) } });
+       });
+     });
+     proxyServer.listen(PROXY_PORT, PROXY_HOST, () => {
+       console.log(`[abacusai] proxy listening on http://${PROXY_HOST}:${PROXY_PORT}`);
+       resolve();
+     });
+     proxyServer.on("error", (err: NodeJS.ErrnoException) => {
+       if (err.code === "EADDRINUSE") {
+         // Port already in use, assume proxy is already running
+         proxyServer = null;
+         resolve();
+       } else {
+         reject(err);
+       }
+     });
+   });
+ }
+
+ // ---------------------------------------------------------------------------
+ // Helpers
+ // ---------------------------------------------------------------------------
+
+ function parseModelIds(input: string): string[] {
+   return Array.from(
+     new Set(
+       input
+         .split(/[\n,]/)
+         .map((m) => m.trim())
+         .filter(Boolean),
+     ),
+   );
+ }
+
+ function buildModelDefinition(modelId: string) {
+   return {
+     id: modelId,
+     name: modelId,
+     api: "openai-completions",
+     reasoning: false,
+     input: ["text", "image"],
+     cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
+     contextWindow: DEFAULT_CONTEXT_WINDOW,
+     maxTokens: DEFAULT_MAX_TOKENS,
+   };
+ }
+
+ // ---------------------------------------------------------------------------
+ // Plugin
+ // ---------------------------------------------------------------------------
+
+ // Type definitions for plugin API
+ interface PluginPrompter {
+   progress: (msg: string) => { update: (msg: string) => void; stop: (msg: string) => void };
+   confirm: (opts: { message: string; initialValue: boolean }) => Promise<boolean>;
+   text: (opts: {
+     message: string;
+     placeholder?: string;
+     initialValue?: string;
+     validate?: (v: string) => string | undefined;
+   }) => Promise<string>;
+ }
+
+ interface PluginAuthContext {
+   prompter: PluginPrompter;
+ }
+
+ const abacusaiPlugin = {
+   id: "abacusai-auth",
+   name: "AbacusAI Auth",
+   description: "AbacusAI RouteLLM provider plugin with direct connection and schema normalization",
+   configSchema: emptyPluginConfigSchema(),
+   register(api: unknown) {
+     const pluginApi = api as {
+       registerProvider: (config: unknown) => void;
+       config?: {
+         models?: { providers?: { abacusai?: { compat?: { supportsStrictMode?: boolean } } } };
+       };
+     };
+
+     // Direct connection mode - no proxy needed
+     // Core code handles schema cleaning via requiresCleanSchema compat option
+
+     pluginApi.registerProvider({
+       id: "abacusai",
+       label: "AbacusAI",
+       docsPath: "/providers/models",
+       aliases: ["abacus", "abacus-ai", "abacusai-code-mode"],
+       envVars: ["ABACUSAI_API_KEY"],
+       auth: [
+         {
+           id: "api-key",
+           label: "AbacusAI API key",
+           hint: "Enter your AbacusAI API key or auto-detect from Code Mode",
+           kind: "custom",
+           run: async (ctx: PluginAuthContext) => {
+             const spin = ctx.prompter.progress("Setting up AbacusAI…");
+
+             try {
+               // --- Credential resolution (3-tier) ---
+               const localKey = tryReadLocalCredential();
+               let apiKey = "";
+
+               if (localKey) {
+                 spin.update("Found local AbacusAI credentials…");
+                 const useLocal = await ctx.prompter.confirm({
+                   message: `Found AbacusAI credentials locally (${localKey.slice(0, 8)}…). Use them?`,
+                   initialValue: true,
+                 });
+                 if (useLocal) {
+                   apiKey = localKey;
+                 }
+               }
+
+               if (!apiKey) {
+                 const envKey = process.env.ABACUSAI_API_KEY?.trim();
+                 if (envKey) {
+                   spin.update("Found ABACUSAI_API_KEY environment variable…");
+                   const useEnv = await ctx.prompter.confirm({
+                     message: "Found ABACUSAI_API_KEY in environment. Use it?",
+                     initialValue: true,
+                   });
+                   if (useEnv) {
+                     apiKey = envKey;
+                   }
+                 }
+               }
+
+               if (!apiKey) {
+                 const input = await ctx.prompter.text({
+                   message: "AbacusAI API key",
+                   placeholder: "Paste your API key from https://abacus.ai/app/profile/apikey",
+                   validate: (value: string) => {
+                     const t = value.trim();
+                     if (!t) {
+                       return "API key is required";
+                     }
+                     if (t.length < 10) {
+                       return "API key looks too short";
+                     }
+                     return undefined;
+                   },
+                 });
+                 apiKey = String(input).trim();
+               }
+
+               if (!apiKey) {
+                 throw new Error("No API key provided");
+               }
+
+               // --- Validate ---
+               spin.update("Validating API key…");
+               const validation = await validateApiKey(apiKey);
+               if (!validation.valid) {
+                 spin.stop("API key validation failed");
+                 const saveAnyway = await ctx.prompter.confirm({
+                   message: `Validation failed: ${validation.error}\nSave this key anyway? (You can re-authenticate later)`,
+                   initialValue: false,
+                 });
+                 if (!saveAnyway) {
+                   throw new Error("Aborted: API key validation failed");
+                 }
+               }
+
+               // --- Model selection ---
+               const modelInput = await ctx.prompter.text({
+                 message: "Model IDs (comma-separated)",
+                 initialValue: DEFAULT_MODEL_IDS.join(", "),
+                 validate: (v: string) =>
+                   parseModelIds(v).length > 0 ? undefined : "Enter at least one model id",
+               });
+               const modelIds = parseModelIds(modelInput);
+               const defaultModelId = modelIds[0] ?? DEFAULT_MODEL_IDS[0];
+               const defaultModelRef = `abacusai/${defaultModelId}`;
+
+               const profileId = `abacusai:${validation.email ?? "default"}`;
+               spin.stop("AbacusAI configured");
+
+               return {
+                 profiles: [
+                   {
+                     profileId,
+                     credential: {
+                       type: "api_key",
+                       provider: "abacusai",
+                       key: apiKey,
+                       ...(validation.email ? { email: validation.email } : {}),
+                     },
+                   },
+                 ],
+                 configPatch: {
+                   models: {
+                     providers: {
+                       abacusai: {
+                         baseUrl: ROUTELLM_BASE,
+                         api: "openai-completions",
+                         auth: "token",
+                         models: modelIds.map((id) => buildModelDefinition(id)),
+                         compat: {
+                           requiresAdditionalPropertiesFalse: true,
+                           supportsStrictMode: false,
+                           requiresCleanSchema: true,
+                         },
+                       },
+                     },
+                   },
+                   agents: {
+                     defaults: {
+                       models: Object.fromEntries(modelIds.map((id) => [`abacusai/${id}`, {}])),
+                     },
+                   },
+                 },
+                 defaultModel: defaultModelRef,
+                 notes: [
+                   "Direct connection to AbacusAI RouteLLM with schema normalization.",
+                   "Full OpenAI function-calling support is enabled.",
+                   "Manage your API keys at https://abacus.ai/app/profile/apikey",
+                 ],
+               };
+             } catch (err) {
+               spin.stop("AbacusAI setup failed");
+               throw err;
+             }
+           },
+         },
+       ],
+     });
+   },
+ };
+
+ export default abacusaiPlugin;
package/openclaw.plugin.json ADDED
@@ -0,0 +1,9 @@
+ {
+   "id": "abacusai-auth",
+   "providers": ["abacusai"],
+   "configSchema": {
+     "type": "object",
+     "additionalProperties": false,
+     "properties": {}
+   }
+ }
package/package.json ADDED
@@ -0,0 +1,29 @@
+ {
+   "name": "openclaw-abacusai-auth",
+   "version": "1.0.0",
+   "description": "OpenClaw AbacusAI provider plugin - Third-party plugin for AbacusAI RouteLLM integration",
+   "type": "module",
+   "main": "index.ts",
+   "repository": {
+     "type": "git",
+     "url": "https://github.com/tonyhu2006/openclaw-abacusai-auth.git"
+   },
+   "keywords": [
+     "openclaw",
+     "openclaw-plugin",
+     "abacusai",
+     "routellm",
+     "llm",
+     "ai"
+   ],
+   "author": "tonyhu2006",
+   "license": "MIT",
+   "peerDependencies": {
+     "openclaw": ">=2026.2.0"
+   },
+   "openclaw": {
+     "extensions": [
+       "./index.ts"
+     ]
+   }
+ }