@askalf/dario 2.1.1 → 2.2.0

This diff shows the published contents of two package versions as they appear in their public registry. It is provided for informational purposes only.
package/README.md CHANGED
@@ -32,13 +32,13 @@ export ANTHROPIC_BASE_URL=http://localhost:3456 # or OPENAI_BASE_URL=http://lo
  export ANTHROPIC_API_KEY=dario # or OPENAI_API_KEY=dario
  ```

- Opus, Sonnet, Haiku — all models, streaming, tool use. Works with OpenClaw, Cursor, Continue, Aider, Hermes, or any tool that speaks the Anthropic or OpenAI API. When rate limited, `--cli` routes through Claude Code for uninterrupted Opus access.
+ Opus, Sonnet, Haiku — all models, streaming, tool use. Works with Cursor, Continue, Aider, LiteLLM, Hermes, OpenClaw, or any tool that speaks the Anthropic or OpenAI API. When rate limited, `--cli` routes through Claude Code for uninterrupted Opus access.

  ---

  ## The Problem

- You pay $100-200/mo for Claude Max or Pro. But that subscription only works on claude.ai and Claude Code. If you want to use Claude with **any other tool** — OpenClaw, Cursor, Continue, Aider, your own scripts — you need a separate API key with separate billing.
+ You pay $100-200/mo for Claude Max or Pro. But that subscription only works on claude.ai and Claude Code. If you want to use Claude with **any other tool** — Cursor, Continue, Aider, your own scripts — you need a separate API key with separate billing.

  **Note:** Claude subscriptions have [usage limits](https://support.claude.com/en/articles/11647753-how-do-usage-and-length-limits-work) that reset on rolling 5-hour and 7-day windows. When exceeded, Opus and Sonnet may return 429 errors while Haiku continues working. You can check your utilization via Claude Code's `/usage` command or [statusline](https://code.claude.com/docs/en/statusline). Use `--cli` mode to route through Claude Code's binary, which is not affected by these limits.

@@ -240,8 +240,8 @@ curl http://localhost:3456/v1/messages \
  ### With Other Tools

  ```bash
- # OpenClaw
- ANTHROPIC_BASE_URL=http://localhost:3456 ANTHROPIC_API_KEY=dario openclaw
+ # Cursor / Continue / any OpenAI-compatible tool
+ OPENAI_BASE_URL=http://localhost:3456/v1 OPENAI_API_KEY=dario cursor

  # Aider
  ANTHROPIC_BASE_URL=http://localhost:3456 ANTHROPIC_API_KEY=dario aider --model claude-opus-4-6
@@ -250,6 +250,19 @@ ANTHROPIC_BASE_URL=http://localhost:3456 ANTHROPIC_API_KEY=dario aider --model c
  ANTHROPIC_BASE_URL=http://localhost:3456 ANTHROPIC_API_KEY=dario your-tool-here
  ```

+ ### Hermes
+
+ Add to `~/.hermes/config.yaml`:
+
+ ```yaml
+ model:
+   base_url: "http://localhost:3456/v1"
+   api_key: "dario"
+   default: claude-opus-4-6
+ ```
+
+ Then run `hermes` normally — it routes through dario using your Claude subscription.
+
  ## How It Works

  ### Direct API Mode (default)
@@ -415,7 +428,7 @@ Dario handles your OAuth tokens. Here's why you can trust it:

  | Signal | Status |
  |--------|--------|
- | **Source code** | ~1000 lines of TypeScript — small enough to read in one sitting |
+ | **Source code** | ~1100 lines of TypeScript — small enough to read in one sitting |
  | **Dependencies** | 1 production dep (`@anthropic-ai/sdk`). Verify: `npm ls --production` |
  | **npm provenance** | Every release is [SLSA attested](https://www.npmjs.com/package/@askalf/dario) via GitHub Actions |
  | **Security scanning** | [CodeQL](https://github.com/askalf/dario/actions/workflows/codeql.yml) runs on every push and weekly |
@@ -437,7 +450,7 @@ cd $(npm root -g)/@askalf/dario && npm ls --production

  ## Contributing

- PRs welcome. The codebase is ~1000 lines of TypeScript across 4 files:
+ PRs welcome. The codebase is ~1100 lines of TypeScript across 4 files:

  | File | Purpose |
  |------|---------|
@@ -453,6 +466,15 @@ npm install
  npm run dev # runs with tsx (no build needed)
  ```

+ ## Also by AskAlf
+
+ | Project | What it does |
+ |---------|-------------|
+ | [platform](https://github.com/askalf/platform) | AI workforce with autonomous agents, teams, memory, and self-healing |
+ | [agent](https://github.com/askalf/agent) | Connect any device to the workforce over WebSocket |
+ | [claude-re](https://github.com/askalf/claude-re) | Claude Code reimplemented in Python |
+ | [amnesia](https://github.com/askalf/amnesia) | Privacy search engine — 155 engines, zero tracking |
+
  ## License

  MIT
package/dist/cli.js CHANGED
@@ -154,6 +154,19 @@ async function help() {
  Tokens auto-refresh in the background — set it and forget it.
  `);
  }
+ async function version() {
+     const { readFile } = await import('node:fs/promises');
+     const { fileURLToPath } = await import('node:url');
+     const { dirname, join } = await import('node:path');
+     try {
+         const dir = dirname(fileURLToPath(import.meta.url));
+         const pkg = JSON.parse(await readFile(join(dir, '..', 'package.json'), 'utf-8'));
+         console.log(pkg.version);
+     }
+     catch {
+         console.log('unknown');
+     }
+ }
  // Main
  const commands = {
      login,
@@ -162,8 +175,11 @@ const commands = {
      refresh,
      logout,
      help,
+     version,
      '--help': help,
      '-h': help,
+     '--version': version,
+     '-V': version,
  };
  const handler = commands[command];
  if (!handler) {
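
The new `version` subcommand above reads the version field out of the package's own `package.json`, printing `unknown` if the file is missing or malformed. The parsing-with-fallback step can be sketched standalone (`readVersion` is a hypothetical helper name, not part of the package):

```javascript
// Sketch of the 2.2.0 version-lookup fallback: parse a package.json string,
// return its "version" field, or 'unknown' on any parse failure or absence.
function readVersion(pkgJson) {
    try {
        return JSON.parse(pkgJson).version ?? 'unknown';
    }
    catch {
        return 'unknown';
    }
}

console.log(readVersion('{"version":"2.2.0"}')); // 2.2.0
console.log(readVersion('{}'));                  // unknown
console.log(readVersion('not json'));            // unknown
```

The `catch {}` without a binding mirrors the compiled output above; it swallows both I/O and JSON errors so `--version` never crashes the CLI.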
package/dist/proxy.d.ts CHANGED
@@ -1,12 +1,3 @@
- /**
-  * Dario — API Proxy Server
-  *
-  * Sits between your app and the Anthropic API.
-  * Transparently swaps API key auth for OAuth bearer tokens.
-  *
-  * Point any Anthropic SDK client at http://localhost:3456 and it just works.
-  * No API key needed — your Claude subscription pays for it.
-  */
  interface ProxyOptions {
      port?: number;
      verbose?: boolean;
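
One change in the `proxy.js` diff below widens the Bearer-token redaction in `sanitizeError` from `[a-zA-Z0-9_-]+` to `[^\s,;]+`. A minimal before/after sketch of just those two patterns (the sample message is illustrative, not from the package):

```javascript
// The old character class stops at '.', so a dotted token leaks its tail;
// the new one redacts everything up to whitespace, ',' or ';'.
const OLD = /Bearer\s+[a-zA-Z0-9_-]+/gi;
const NEW = /Bearer\s+[^\s,;]+/gi;
const msg = 'auth failed: Bearer abc.def.ghi rejected';

console.log(msg.replace(OLD, 'Bearer [REDACTED]')); // auth failed: Bearer [REDACTED].def.ghi rejected
console.log(msg.replace(NEW, 'Bearer [REDACTED]')); // auth failed: Bearer [REDACTED] rejected
```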
package/dist/proxy.js CHANGED
@@ -1,12 +1,3 @@
- /**
-  * Dario — API Proxy Server
-  *
-  * Sits between your app and the Anthropic API.
-  * Transparently swaps API key auth for OAuth bearer tokens.
-  *
-  * Point any Anthropic SDK client at http://localhost:3456 and it just works.
-  * No API key needed — your Claude subscription pays for it.
-  */
  import { createServer } from 'node:http';
  import { randomUUID, timingSafeEqual } from 'node:crypto';
  import { execSync, spawn } from 'node:child_process';
@@ -16,6 +7,7 @@ const ANTHROPIC_API = 'https://api.anthropic.com';
  const DEFAULT_PORT = 3456;
  const MAX_BODY_BYTES = 10 * 1024 * 1024; // 10 MB — generous for large prompts, prevents abuse
  const UPSTREAM_TIMEOUT_MS = 300_000; // 5 min — matches Anthropic SDK default
+ const BODY_READ_TIMEOUT_MS = 30_000; // 30s — prevents slow-loris on body reads
  const LOCALHOST = '127.0.0.1';
  // Detect installed Claude Code version at startup
  function detectClaudeVersion() {
@@ -28,73 +20,38 @@ function detectClaudeVersion() {
          return '2.1.96';
      }
  }
- function getOsName() {
-     const p = platform;
-     if (p === 'win32')
-         return 'Windows';
-     if (p === 'darwin')
-         return 'MacOS';
-     return 'Linux';
- }
- // Persistent session ID per proxy lifetime (like Claude Code does per session)
  const SESSION_ID = randomUUID();
- // Detect @anthropic-ai/sdk version from installed package
- function detectSdkVersion() {
-     try {
-         const pkg = require('@anthropic-ai/sdk/package.json');
-         return pkg.version ?? '0.81.0';
-     }
-     catch {
-         return '0.81.0';
-     }
- }
+ const OS_NAME = platform === 'win32' ? 'Windows' : platform === 'darwin' ? 'MacOS' : 'Linux';
  // Model shortcuts — users can pass short names
  const MODEL_ALIASES = {
      'opus': 'claude-opus-4-6',
      'sonnet': 'claude-sonnet-4-6',
      'haiku': 'claude-haiku-4-5',
  };
- // OpenAI model name → Anthropic model name
+ // OpenAI model names → Anthropic (fallback if client sends GPT names)
  const OPENAI_MODEL_MAP = {
-     'gpt-4.1': 'claude-opus-4-6',
-     'gpt-4.1-mini': 'claude-sonnet-4-6',
-     'gpt-4.1-nano': 'claude-haiku-4-5',
-     'gpt-4o': 'claude-opus-4-6',
-     'gpt-4o-mini': 'claude-haiku-4-5',
-     'gpt-4-turbo': 'claude-opus-4-6',
+     'gpt-5.4': 'claude-opus-4-6',
+     'gpt-5.4-mini': 'claude-sonnet-4-6',
+     'gpt-5.4-nano': 'claude-haiku-4-5',
+     'gpt-5.3': 'claude-opus-4-6',
      'gpt-4': 'claude-opus-4-6',
      'gpt-3.5-turbo': 'claude-haiku-4-5',
-     'o3': 'claude-opus-4-6',
-     'o3-mini': 'claude-sonnet-4-6',
-     'o4-mini': 'claude-sonnet-4-6',
-     'o1': 'claude-opus-4-6',
-     'o1-mini': 'claude-sonnet-4-6',
-     'o1-pro': 'claude-opus-4-6',
  };
- /**
-  * Translate OpenAI chat completion request → Anthropic Messages request.
-  */
+ /** Translate OpenAI chat completion request → Anthropic Messages request. */
  function openaiToAnthropic(body, modelOverride) {
      const messages = body.messages;
      if (!messages)
          return body;
-     // Extract system messages
      const systemMessages = messages.filter(m => m.role === 'system');
      const nonSystemMessages = messages.filter(m => m.role !== 'system');
-     // Map model name
-     const requestModel = String(body.model || '');
-     const model = modelOverride || OPENAI_MODEL_MAP[requestModel] || requestModel;
+     const model = modelOverride || OPENAI_MODEL_MAP[String(body.model || '')] || String(body.model || 'claude-opus-4-6');
      const result = {
          model,
-         messages: nonSystemMessages.map(m => ({
-             role: m.role === 'assistant' ? 'assistant' : 'user',
-             content: m.content,
-         })),
+         messages: nonSystemMessages.map(m => ({ role: m.role === 'assistant' ? 'assistant' : 'user', content: m.content })),
          max_tokens: body.max_tokens ?? body.max_completion_tokens ?? 8192,
      };
-     if (systemMessages.length > 0) {
+     if (systemMessages.length > 0)
          result.system = systemMessages.map(m => typeof m.content === 'string' ? m.content : JSON.stringify(m.content)).join('\n');
-     }
      if (body.stream)
          result.stream = true;
      if (body.temperature != null)
@@ -105,33 +62,20 @@ function openaiToAnthropic(body, modelOverride) {
          result.stop_sequences = Array.isArray(body.stop) ? body.stop : [body.stop];
      return result;
  }
- /**
-  * Translate Anthropic Messages response → OpenAI chat completion response.
-  */
+ /** Translate Anthropic Messages response → OpenAI chat completion response. */
  function anthropicToOpenai(body) {
-     const content = body.content;
-     const text = content?.find(c => c.type === 'text')?.text ?? '';
-     const usage = body.usage;
+     const text = body.content?.find(c => c.type === 'text')?.text ?? '';
+     const u = body.usage;
      return {
          id: `chatcmpl-${(body.id || '').replace('msg_', '')}`,
          object: 'chat.completion',
          created: Math.floor(Date.now() / 1000),
          model: body.model,
-         choices: [{
-                 index: 0,
-                 message: { role: 'assistant', content: text },
-                 finish_reason: body.stop_reason === 'end_turn' ? 'stop' : body.stop_reason === 'max_tokens' ? 'length' : 'stop',
-             }],
-         usage: {
-             prompt_tokens: usage?.input_tokens ?? 0,
-             completion_tokens: usage?.output_tokens ?? 0,
-             total_tokens: (usage?.input_tokens ?? 0) + (usage?.output_tokens ?? 0),
-         },
+         choices: [{ index: 0, message: { role: 'assistant', content: text }, finish_reason: body.stop_reason === 'end_turn' ? 'stop' : 'length' }],
+         usage: { prompt_tokens: u?.input_tokens ?? 0, completion_tokens: u?.output_tokens ?? 0, total_tokens: (u?.input_tokens ?? 0) + (u?.output_tokens ?? 0) },
      };
  }
- /**
-  * Translate Anthropic SSE stream → OpenAI SSE stream.
-  */
+ /** Translate Anthropic SSE → OpenAI SSE. */
  function translateStreamChunk(line) {
      if (!line.startsWith('data: '))
          return null;
@@ -139,54 +83,26 @@ function translateStreamChunk(line) {
      if (json === '[DONE]')
          return 'data: [DONE]\n\n';
      try {
-         const event = JSON.parse(json);
-         if (event.type === 'content_block_delta') {
-             const delta = event.delta;
-             if (delta?.type === 'text_delta' && delta.text) {
-                 return `data: ${JSON.stringify({
-                     id: 'chatcmpl-dario',
-                     object: 'chat.completion.chunk',
-                     created: Math.floor(Date.now() / 1000),
-                     model: 'claude',
-                     choices: [{ index: 0, delta: { content: delta.text }, finish_reason: null }],
-                 })}\n\n`;
-             }
-         }
-         if (event.type === 'message_stop') {
-             return `data: ${JSON.stringify({
-                 id: 'chatcmpl-dario',
-                 object: 'chat.completion.chunk',
-                 created: Math.floor(Date.now() / 1000),
-                 model: 'claude',
-                 choices: [{ index: 0, delta: {}, finish_reason: 'stop' }],
-             })}\n\ndata: [DONE]\n\n`;
+         const e = JSON.parse(json);
+         if (e.type === 'content_block_delta') {
+             const d = e.delta;
+             if (d?.type === 'text_delta' && d.text)
+                 return `data: ${JSON.stringify({ id: 'chatcmpl-dario', object: 'chat.completion.chunk', created: Math.floor(Date.now() / 1000), model: 'claude', choices: [{ index: 0, delta: { content: d.text }, finish_reason: null }] })}\n\n`;
          }
+         if (e.type === 'message_stop')
+             return `data: ${JSON.stringify({ id: 'chatcmpl-dario', object: 'chat.completion.chunk', created: Math.floor(Date.now() / 1000), model: 'claude', choices: [{ index: 0, delta: {}, finish_reason: 'stop' }] })}\n\ndata: [DONE]\n\n`;
      }
-     catch { /* skip unparseable */ }
+     catch { }
      return null;
  }
- /**
-  * OpenAI-compatible models list.
-  */
- function openaiModelsList() {
-     const models = ['claude-opus-4-6', 'claude-sonnet-4-6', 'claude-haiku-4-5'];
-     return {
-         object: 'list',
-         data: models.map(id => ({
-             id,
-             object: 'model',
-             created: 1700000000,
-             owned_by: 'anthropic',
-         })),
-     };
- }
+ const OPENAI_MODELS_LIST = { object: 'list', data: ['claude-opus-4-6', 'claude-sonnet-4-6', 'claude-haiku-4-5'].map(id => ({ id, object: 'model', created: 1700000000, owned_by: 'anthropic' })) };
  export function sanitizeError(err) {
      const msg = err instanceof Error ? err.message : String(err);
      // Never leak tokens, JWTs, or bearer values in error messages
      return msg
          .replace(/sk-ant-[a-zA-Z0-9_-]+/g, '[REDACTED]')
          .replace(/eyJ[a-zA-Z0-9_-]+\.eyJ[a-zA-Z0-9_-]+\.[a-zA-Z0-9_-]+/g, '[REDACTED_JWT]')
-         .replace(/Bearer\s+[a-zA-Z0-9_-]+/gi, 'Bearer [REDACTED]');
+         .replace(/Bearer\s+[^\s,;]+/gi, 'Bearer [REDACTED]');
  }
  /**
   * CLI Backend: route requests through `claude --print` instead of direct API.
@@ -231,8 +147,11 @@ async function handleViaCli(body, model, verbose) {
      });
      let stdout = '';
      let stderr = '';
-     child.stdout.on('data', (d) => { stdout += d.toString(); });
-     child.stderr.on('data', (d) => { stderr += d.toString(); });
+     const MAX_CLI_OUTPUT = 5_000_000; // 5MB cap per stream — prevents OOM from runaway CLI
+     child.stdout.on('data', (d) => { if (stdout.length < MAX_CLI_OUTPUT)
+         stdout += d.toString(); });
+     child.stderr.on('data', (d) => { if (stderr.length < MAX_CLI_OUTPUT)
+         stderr += d.toString(); });
      child.stdin.write(prompt);
      child.stdin.end();
      child.on('close', (code) => {
@@ -289,37 +208,65 @@ export async function startProxy(opts = {}) {
          process.exit(1);
      }
      const cliVersion = detectClaudeVersion();
-     const sdkVersion = detectSdkVersion();
      const modelOverride = opts.model ? (MODEL_ALIASES[opts.model] ?? opts.model) : null;
+     // Pre-build static headers (only auth, version, beta, request-id change per request)
+     const staticHeaders = {
+         'accept': 'application/json',
+         'Content-Type': 'application/json',
+         'anthropic-dangerous-direct-browser-access': 'true',
+         'anthropic-client-platform': 'cli',
+         'user-agent': `claude-cli/${cliVersion} (external, cli)`,
+         'x-app': 'cli',
+         'x-claude-code-session-id': SESSION_ID,
+         'x-stainless-arch': arch,
+         'x-stainless-lang': 'js',
+         'x-stainless-os': OS_NAME,
+         'x-stainless-package-version': '0.81.0',
+         'x-stainless-retry-count': '0',
+         'x-stainless-runtime': 'node',
+         'x-stainless-runtime-version': nodeVersion,
+         'x-stainless-timeout': '600',
+     };
      const useCli = opts.cliBackend ?? false;
      let requestCount = 0;
-     let tokenCostEstimate = 0;
-     // Optional proxy authentication
+     // Optional proxy authentication — pre-encode key buffer for performance
      const apiKey = process.env.DARIO_API_KEY;
+     const apiKeyBuf = apiKey ? Buffer.from(apiKey) : null;
      const corsOrigin = `http://localhost:${port}`;
+     // Security headers for all responses
+     const SECURITY_HEADERS = {
+         'X-Content-Type-Options': 'nosniff',
+         'X-Frame-Options': 'DENY',
+         'Cache-Control': 'no-store',
+     };
+     // Pre-serialize static responses
+     const CORS_HEADERS = {
+         'Access-Control-Allow-Origin': corsOrigin,
+         'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
+         'Access-Control-Allow-Headers': 'Content-Type, Authorization, x-api-key, anthropic-version, anthropic-beta',
+         'Access-Control-Max-Age': '86400',
+         ...SECURITY_HEADERS,
+     };
+     const JSON_HEADERS = { 'Content-Type': 'application/json', ...SECURITY_HEADERS };
+     const MODELS_JSON = JSON.stringify(OPENAI_MODELS_LIST);
+     const ERR_UNAUTH = JSON.stringify({ error: 'Unauthorized', message: 'Invalid or missing API key' });
+     const ERR_FORBIDDEN = JSON.stringify({ error: 'Forbidden', message: 'Path not allowed' });
+     const ERR_METHOD = JSON.stringify({ error: 'Method not allowed' });
      function checkAuth(req) {
-         if (!apiKey)
-             return true; // no key set = open access
+         if (!apiKeyBuf)
+             return true;
          const provided = req.headers['x-api-key']
              || req.headers.authorization?.replace(/^Bearer\s+/i, '');
          if (!provided)
              return false;
-         try {
-             return timingSafeEqual(Buffer.from(provided), Buffer.from(apiKey));
-         }
-         catch {
+         const providedBuf = Buffer.from(provided);
+         if (providedBuf.length !== apiKeyBuf.length)
              return false;
-         }
+         return timingSafeEqual(providedBuf, apiKeyBuf);
      }
      const server = createServer(async (req, res) => {
-         // CORS preflight
          if (req.method === 'OPTIONS') {
-             res.writeHead(204, {
-                 'Access-Control-Allow-Origin': corsOrigin,
-                 'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
-                 'Access-Control-Allow-Headers': 'Content-Type, Authorization, x-api-key, anthropic-version, anthropic-beta',
-                 'Access-Control-Max-Age': '86400',
-             });
+             res.writeHead(204, CORS_HEADERS);
              res.end();
              return;
          }
@@ -328,7 +275,7 @@ export async function startProxy(opts = {}) {
          // Health check
          if (urlPath === '/health' || urlPath === '/') {
              const s = await getStatus();
-             res.writeHead(200, { 'Content-Type': 'application/json' });
+             res.writeHead(200, JSON_HEADERS);
              res.end(JSON.stringify({
                  status: 'ok',
                  oauth: s.status,
@@ -337,24 +284,22 @@ export async function startProxy(opts = {}) {
              }));
              return;
          }
-         // Auth gate — everything below health requires auth when DARIO_API_KEY is set
          if (!checkAuth(req)) {
-             res.writeHead(401, { 'Content-Type': 'application/json' });
-             res.end(JSON.stringify({ error: 'Unauthorized', message: 'Invalid or missing API key' }));
+             res.writeHead(401, JSON_HEADERS);
+             res.end(ERR_UNAUTH);
              return;
          }
          // Status endpoint
          if (urlPath === '/status') {
              const s = await getStatus();
-             res.writeHead(200, { 'Content-Type': 'application/json' });
+             res.writeHead(200, JSON_HEADERS);
              res.end(JSON.stringify(s));
              return;
          }
-         // OpenAI-compatible models list
          if (urlPath === '/v1/models' && req.method === 'GET') {
              requestCount++;
-             res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': corsOrigin });
-             res.end(JSON.stringify(openaiModelsList()));
+             res.writeHead(200, { ...JSON_HEADERS, 'Access-Control-Allow-Origin': corsOrigin });
+             res.end(MODELS_JSON);
              return;
          }
          // Detect OpenAI-format requests
@@ -366,60 +311,75 @@ export async function startProxy(opts = {}) {
          };
          const targetBase = isOpenAI ? `${ANTHROPIC_API}/v1/messages` : allowedPaths[urlPath];
          if (!targetBase) {
-             res.writeHead(403, { 'Content-Type': 'application/json' });
-             res.end(JSON.stringify({ error: 'Forbidden', message: 'Path not allowed' }));
+             res.writeHead(403, JSON_HEADERS);
+             res.end(ERR_FORBIDDEN);
              return;
          }
-         // Only allow POST (Messages/Chat API) and GET (models)
          if (req.method !== 'POST') {
-             res.writeHead(405, { 'Content-Type': 'application/json' });
-             res.end(JSON.stringify({ error: 'Method not allowed' }));
+             res.writeHead(405, JSON_HEADERS);
+             res.end(ERR_METHOD);
              return;
          }
          // Proxy to Anthropic
          try {
              const accessToken = await getAccessToken();
-             // Read request body with size limit
+             // Read request body with size limit and timeout (prevents slow-loris)
              const chunks = [];
              let totalBytes = 0;
-             for await (const chunk of req) {
-                 const buf = typeof chunk === 'string' ? Buffer.from(chunk) : chunk;
-                 totalBytes += buf.length;
-                 if (totalBytes > MAX_BODY_BYTES) {
-                     res.writeHead(413, { 'Content-Type': 'application/json' });
-                     res.end(JSON.stringify({ error: 'Request body too large', max: `${MAX_BODY_BYTES / 1024 / 1024}MB` }));
-                     return;
+             const bodyTimeout = setTimeout(() => { req.destroy(); }, BODY_READ_TIMEOUT_MS);
+             try {
+                 for await (const chunk of req) {
+                     const buf = typeof chunk === 'string' ? Buffer.from(chunk) : chunk;
+                     totalBytes += buf.length;
+                     if (totalBytes > MAX_BODY_BYTES) {
+                         clearTimeout(bodyTimeout);
+                         res.writeHead(413, JSON_HEADERS);
+                         res.end(JSON.stringify({ error: 'Request body too large', max: `${MAX_BODY_BYTES / 1024 / 1024}MB` }));
+                         return;
+                     }
+                     chunks.push(buf);
                  }
-                 chunks.push(buf);
+             }
+             finally {
+                 clearTimeout(bodyTimeout);
              }
              const body = Buffer.concat(chunks);
-             // CLI backend mode: route through claude --print
-             if (useCli && urlPath === '/v1/messages' && req.method === 'POST' && body.length > 0) {
-                 const cliResult = await handleViaCli(body, modelOverride, verbose);
+             // CLI backend mode: route through claude --print (works for both Anthropic and OpenAI endpoints)
+             if (useCli && req.method === 'POST' && body.length > 0) {
+                 let cliBody = body;
+                 // Translate OpenAI format before passing to CLI
+                 if (isOpenAI) {
+                     try {
+                         const parsed = JSON.parse(body.toString());
+                         cliBody = Buffer.from(JSON.stringify(openaiToAnthropic(parsed, modelOverride)));
+                     }
+                     catch { /* send as-is */ }
+                 }
+                 const cliResult = await handleViaCli(cliBody, modelOverride, verbose);
                  requestCount++;
+                 // Translate CLI response back to OpenAI format if needed
+                 if (isOpenAI && cliResult.status >= 200 && cliResult.status < 300) {
+                     try {
+                         const parsed = JSON.parse(cliResult.body);
+                         cliResult.body = JSON.stringify(anthropicToOpenai(parsed));
+                     }
+                     catch { /* send as-is */ }
+                 }
                  res.writeHead(cliResult.status, {
                      'Content-Type': cliResult.contentType,
                      'Access-Control-Allow-Origin': corsOrigin,
+                     ...SECURITY_HEADERS,
                  });
                  res.end(cliResult.body);
                  return;
              }
-             // Translate OpenAI → Anthropic format if needed
+             // Parse body once, apply OpenAI translation or model override
              let finalBody = body.length > 0 ? body : undefined;
-             if (isOpenAI && body.length > 0) {
+             if (body.length > 0 && (isOpenAI || modelOverride)) {
                  try {
                      const parsed = JSON.parse(body.toString());
-                     const translated = openaiToAnthropic(parsed, modelOverride);
-                     finalBody = Buffer.from(JSON.stringify(translated));
-                 }
-                 catch { /* not JSON, send as-is */ }
-             }
-             else if (modelOverride && body.length > 0) {
-                 // Override model in request body if --model flag was set
-                 try {
-                     const parsed = JSON.parse(body.toString());
-                     parsed.model = modelOverride;
-                     finalBody = Buffer.from(JSON.stringify(parsed));
+                     const result = isOpenAI ? openaiToAnthropic(parsed, modelOverride) : (modelOverride ? { ...parsed, model: modelOverride } : parsed);
+                     finalBody = Buffer.from(JSON.stringify(result));
                  }
                  catch { /* not JSON, send as-is */ }
              }
@@ -427,46 +387,19 @@ export async function startProxy(opts = {}) {
                  const modelInfo = modelOverride ? ` (model: ${modelOverride})` : '';
                  console.log(`[dario] #${requestCount} ${req.method} ${req.url}${modelInfo}`);
              }
-             // Build target URL from allowlist (no user input in URL construction)
-             const targetUrl = targetBase;
-             // Merge any client-provided beta flags with the required oauth flag
+             // Merge client beta flags with defaults
              const clientBeta = req.headers['anthropic-beta'];
-             const betaFlags = new Set([
-                 'oauth-2025-04-20',
-                 'interleaved-thinking-2025-05-14',
-                 'prompt-caching-scope-2026-01-05',
-                 'claude-code-20250219',
-                 'context-management-2025-06-27',
-             ]);
-             if (clientBeta) {
-                 for (const flag of clientBeta.split(',')) {
-                     const trimmed = flag.trim();
-                     if (trimmed.length > 0 && trimmed.length < 100)
-                         betaFlags.add(trimmed);
-                 }
-             }
+             let beta = 'oauth-2025-04-20,interleaved-thinking-2025-05-14,prompt-caching-scope-2026-01-05,claude-code-20250219,context-management-2025-06-27';
+             if (clientBeta)
+                 beta += ',' + clientBeta.split(',').map(f => f.trim()).filter(f => f.length > 0 && f.length < 100).join(',');
              const headers = {
-                 'accept': 'application/json',
+                 ...staticHeaders,
                  'Authorization': `Bearer ${accessToken}`,
-                 'Content-Type': 'application/json',
                  'anthropic-version': req.headers['anthropic-version'] || '2023-06-01',
-                 'anthropic-beta': [...betaFlags].join(','),
-                 'anthropic-dangerous-direct-browser-access': 'true',
-                 'anthropic-client-platform': 'cli',
-                 'user-agent': `claude-cli/${cliVersion} (external, cli)`,
-                 'x-app': 'cli',
-                 'x-claude-code-session-id': SESSION_ID,
+                 'anthropic-beta': beta,
                  'x-client-request-id': randomUUID(),
-                 'x-stainless-arch': arch,
-                 'x-stainless-lang': 'js',
-                 'x-stainless-os': getOsName(),
-                 'x-stainless-package-version': sdkVersion,
-                 'x-stainless-retry-count': '0',
-                 'x-stainless-runtime': 'node',
-                 'x-stainless-runtime-version': nodeVersion,
-                 'x-stainless-timeout': '600',
              };
-             const upstream = await fetch(targetUrl, {
+             const upstream = await fetch(targetBase, {
                  method: req.method ?? 'POST',
                  headers,
                  body: finalBody ? new Uint8Array(finalBody) : undefined,
@@ -479,6 +412,7 @@ export async function startProxy(opts = {}) {
              const responseHeaders = {
                  'Content-Type': contentType || 'application/json',
                  'Access-Control-Allow-Origin': corsOrigin,
+                 ...SECURITY_HEADERS,
              };
              // Forward rate limit headers (including unified subscription headers)
              for (const [key, value] of upstream.headers.entries()) {
@@ -547,24 +481,14 @@ export async function startProxy(opts = {}) {
                  else {
                      res.end(responseBody);
                  }
-                 // Quick token estimate for logging
-                 if (verbose && responseBody) {
-                     try {
-                         const parsed = JSON.parse(responseBody);
-                         if (parsed.usage) {
-                             const tokens = (parsed.usage.input_tokens ?? 0) + (parsed.usage.output_tokens ?? 0);
-                             tokenCostEstimate += tokens;
-                             console.log(`[dario] #${requestCount} ${upstream.status} — ${tokens} tokens (session total: ${tokenCostEstimate})`);
-                         }
-                     }
-                     catch { /* not JSON, skip */ }
-                 }
+                 if (verbose)
+                     console.log(`[dario] #${requestCount} ${upstream.status}`);
              }
          }
          catch (err) {
              // Log full error server-side, return generic message to client
              console.error('[dario] Proxy error:', sanitizeError(err));
-             res.writeHead(502, { 'Content-Type': 'application/json' });
+             res.writeHead(502, JSON_HEADERS);
              res.end(JSON.stringify({ error: 'Proxy error', message: 'Failed to reach upstream API' }));
          }
      });
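
The reworked `openaiToAnthropic` above can be exercised standalone. This sketch reproduces the 2.2.0 behavior for text-only messages, with the model map trimmed to a single entry for illustration; it is not the packaged code:

```javascript
// Sketch of the request translation from the diff above: system messages are
// hoisted into `system`, the model name is mapped (or defaulted), and
// max_tokens falls back to 8192.
const OPENAI_MODEL_MAP = { 'gpt-4': 'claude-opus-4-6' }; // trimmed for the sketch
function openaiToAnthropic(body, modelOverride) {
    const messages = body.messages;
    if (!messages)
        return body;
    const system = messages.filter(m => m.role === 'system');
    const rest = messages.filter(m => m.role !== 'system');
    const model = modelOverride || OPENAI_MODEL_MAP[String(body.model || '')] || String(body.model || 'claude-opus-4-6');
    const result = {
        model,
        messages: rest.map(m => ({ role: m.role === 'assistant' ? 'assistant' : 'user', content: m.content })),
        max_tokens: body.max_tokens ?? 8192,
    };
    if (system.length > 0)
        result.system = system.map(m => String(m.content)).join('\n');
    return result;
}

const out = openaiToAnthropic({
    model: 'gpt-4',
    messages: [
        { role: 'system', content: 'Be terse.' },
        { role: 'user', content: 'hi' },
    ],
});
console.log(out.model);      // claude-opus-4-6
console.log(out.system);     // Be terse.
console.log(out.max_tokens); // 8192
```

Note the deliberate fallback chain: an explicit `--model` override wins, then the GPT-name map, then the client's model string passed through unchanged.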
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@askalf/dario",
-   "version": "2.1.1",
+   "version": "2.2.0",
    "description": "Use your Claude subscription as an API. No API key needed. Local proxy for Claude Max/Pro subscriptions.",
    "type": "module",
    "bin": {