@askalf/dario 2.1.0 → 2.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -27,18 +27,18 @@
 ```bash
 npx @askalf/dario login # detects Claude Code credentials, starts proxy
 
-# now use it from anywhere
-export ANTHROPIC_BASE_URL=http://localhost:3456
-export ANTHROPIC_API_KEY=dario
+# now use it from anywhere — Anthropic or OpenAI SDK
+export ANTHROPIC_BASE_URL=http://localhost:3456 # or OPENAI_BASE_URL=http://localhost:3456/v1
+export ANTHROPIC_API_KEY=dario # or OPENAI_API_KEY=dario
 ```
 
-That's it. Any tool that speaks the Anthropic API now uses your subscription.
+Opus, Sonnet, Haiku — all models, streaming, tool use. Works with Cursor, Continue, Aider, LiteLLM, Hermes, OpenClaw, or any tool that speaks the Anthropic or OpenAI API. When rate limited, `--cli` routes through Claude Code for uninterrupted Opus access.
 
 ---
 
 ## The Problem
 
-You pay $100-200/mo for Claude Max or Pro. But that subscription only works on claude.ai and Claude Code. If you want to use Claude with **any other tool** — OpenClaw, Cursor, Continue, Aider, your own scripts — you need a separate API key with separate billing.
+You pay $100-200/mo for Claude Max or Pro. But that subscription only works on claude.ai and Claude Code. If you want to use Claude with **any other tool** — Cursor, Continue, Aider, your own scripts — you need a separate API key with separate billing.
 
 **Note:** Claude subscriptions have [usage limits](https://support.claude.com/en/articles/11647753-how-do-usage-and-length-limits-work) that reset on rolling 5-hour and 7-day windows. When exceeded, Opus and Sonnet may return 429 errors while Haiku continues working. You can check your utilization via Claude Code's `/usage` command or [statusline](https://code.claude.com/docs/en/statusline). Use `--cli` mode to route through Claude Code's binary, which is not affected by these limits.
 
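Either pair of environment variables targets the same local proxy, just through different surfaces. As an illustration only (`pickEndpoint` is a hypothetical helper, not part of dario; the routes are the `/v1/messages` and `/v1/chat/completions` paths this README documents), a client-side sketch of the two configurations:

```javascript
// Illustrative sketch: choose endpoint + auth header from whichever
// env-var pair the quick start exported. Not part of dario itself.
function pickEndpoint(env) {
  if (env.OPENAI_BASE_URL) {
    // OpenAI-style clients: /v1 base URL, Bearer auth
    return {
      url: `${env.OPENAI_BASE_URL}/chat/completions`,
      headers: { Authorization: `Bearer ${env.OPENAI_API_KEY}` },
    };
  }
  // Anthropic-style clients: /v1/messages, x-api-key auth
  return {
    url: `${env.ANTHROPIC_BASE_URL}/v1/messages`,
    headers: { 'x-api-key': env.ANTHROPIC_API_KEY },
  };
}
```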
@@ -88,6 +88,7 @@ Usage:
 ANTHROPIC_BASE_URL=http://localhost:3456
 ANTHROPIC_API_KEY=dario
 
+Auth: open (no DARIO_API_KEY set)
 OAuth: healthy (expires in 11h 42m)
 Model: passthrough (client decides)
 ```
@@ -239,8 +240,8 @@ curl http://localhost:3456/v1/messages \
 ### With Other Tools
 
 ```bash
-# OpenClaw
-ANTHROPIC_BASE_URL=http://localhost:3456 ANTHROPIC_API_KEY=dario openclaw
+# Cursor / Continue / any OpenAI-compatible tool
+OPENAI_BASE_URL=http://localhost:3456/v1 OPENAI_API_KEY=dario cursor
 
 # Aider
 ANTHROPIC_BASE_URL=http://localhost:3456 ANTHROPIC_API_KEY=dario aider --model claude-opus-4-6
@@ -249,6 +250,19 @@ ANTHROPIC_BASE_URL=http://localhost:3456 ANTHROPIC_API_KEY=dario your-tool-here
 ANTHROPIC_BASE_URL=http://localhost:3456 ANTHROPIC_API_KEY=dario your-tool-here
 ```
 
+### Hermes
+
+Add to `~/.hermes/config.yaml`:
+
+```yaml
+model:
+  base_url: "http://localhost:3456/v1"
+  api_key: "dario"
+  default: claude-opus-4-6
+```
+
+Then run `hermes` normally — it routes through dario using your Claude subscription.
+
 ## How It Works
 
 ### Direct API Mode (default)
@@ -294,12 +308,13 @@ ANTHROPIC_BASE_URL=http://localhost:3456 ANTHROPIC_API_KEY=dario your-tool-here
 
 ### Proxy Options
 
-| Flag | Description | Default |
-|------|-------------|---------|
+| Flag/Env | Description | Default |
+|----------|-------------|---------|
 | `--cli` | Use Claude CLI as backend (bypasses rate limits) | off |
 | `--model=MODEL` | Force a model (`opus`, `sonnet`, `haiku`, or full ID) | passthrough |
 | `--port=PORT` | Port to listen on | `3456` |
 | `--verbose` / `-v` | Log every request | off |
+| `DARIO_API_KEY` | If set, all endpoints (except `/health`) require matching `x-api-key` header or `Authorization: Bearer` header | unset (open) |
 
 ## Supported Features
 
@@ -413,7 +428,7 @@ Dario handles your OAuth tokens. Here's why you can trust it:
 
 | Signal | Status |
 |--------|--------|
-| **Source code** | ~1000 lines of TypeScript — small enough to read in one sitting |
+| **Source code** | ~1100 lines of TypeScript — small enough to read in one sitting |
 | **Dependencies** | 1 production dep (`@anthropic-ai/sdk`). Verify: `npm ls --production` |
 | **npm provenance** | Every release is [SLSA attested](https://www.npmjs.com/package/@askalf/dario) via GitHub Actions |
 | **Security scanning** | [CodeQL](https://github.com/askalf/dario/actions/workflows/codeql.yml) runs on every push and weekly |
@@ -435,7 +450,7 @@ cd $(npm root -g)/@askalf/dario && npm ls --production
 
 ## Contributing
 
-PRs welcome. The codebase is ~1000 lines of TypeScript across 4 files:
+PRs welcome. The codebase is ~1100 lines of TypeScript across 4 files:
 
 | File | Purpose |
 |------|---------|
@@ -451,6 +466,15 @@ npm install
 npm run dev # runs with tsx (no build needed)
 ```
 
+## Also by AskAlf
+
+| Project | What it does |
+|---------|-------------|
+| [platform](https://github.com/askalf/platform) | AI workforce with autonomous agents, teams, memory, and self-healing |
+| [agent](https://github.com/askalf/agent) | Connect any device to the workforce over WebSocket |
+| [claude-re](https://github.com/askalf/claude-re) | Claude Code reimplemented in Python |
+| [amnesia](https://github.com/askalf/amnesia) | Privacy search engine — 155 engines, zero tracking |
+
 ## License
 
 MIT
package/dist/proxy.d.ts CHANGED
@@ -1,12 +1,3 @@
-/**
- * Dario — API Proxy Server
- *
- * Sits between your app and the Anthropic API.
- * Transparently swaps API key auth for OAuth bearer tokens.
- *
- * Point any Anthropic SDK client at http://localhost:3456 and it just works.
- * No API key needed — your Claude subscription pays for it.
- */
 interface ProxyOptions {
     port?: number;
     verbose?: boolean;
package/dist/proxy.js CHANGED
@@ -1,12 +1,3 @@
-/**
- * Dario — API Proxy Server
- *
- * Sits between your app and the Anthropic API.
- * Transparently swaps API key auth for OAuth bearer tokens.
- *
- * Point any Anthropic SDK client at http://localhost:3456 and it just works.
- * No API key needed — your Claude subscription pays for it.
- */
 import { createServer } from 'node:http';
 import { randomUUID, timingSafeEqual } from 'node:crypto';
 import { execSync, spawn } from 'node:child_process';
@@ -28,73 +19,38 @@ function detectClaudeVersion() {
         return '2.1.96';
     }
 }
-function getOsName() {
-    const p = platform;
-    if (p === 'win32')
-        return 'Windows';
-    if (p === 'darwin')
-        return 'MacOS';
-    return 'Linux';
-}
-// Persistent session ID per proxy lifetime (like Claude Code does per session)
 const SESSION_ID = randomUUID();
-// Detect @anthropic-ai/sdk version from installed package
-function detectSdkVersion() {
-    try {
-        const pkg = require('@anthropic-ai/sdk/package.json');
-        return pkg.version ?? '0.81.0';
-    }
-    catch {
-        return '0.81.0';
-    }
-}
+const OS_NAME = platform === 'win32' ? 'Windows' : platform === 'darwin' ? 'MacOS' : 'Linux';
 // Model shortcuts — users can pass short names
 const MODEL_ALIASES = {
     'opus': 'claude-opus-4-6',
     'sonnet': 'claude-sonnet-4-6',
     'haiku': 'claude-haiku-4-5',
 };
-// OpenAI model name → Anthropic model name
+// OpenAI model names → Anthropic (fallback if client sends GPT names)
 const OPENAI_MODEL_MAP = {
-    'gpt-4.1': 'claude-opus-4-6',
-    'gpt-4.1-mini': 'claude-sonnet-4-6',
-    'gpt-4.1-nano': 'claude-haiku-4-5',
-    'gpt-4o': 'claude-opus-4-6',
-    'gpt-4o-mini': 'claude-haiku-4-5',
-    'gpt-4-turbo': 'claude-opus-4-6',
+    'gpt-5.4': 'claude-opus-4-6',
+    'gpt-5.4-mini': 'claude-sonnet-4-6',
+    'gpt-5.4-nano': 'claude-haiku-4-5',
+    'gpt-5.3': 'claude-opus-4-6',
     'gpt-4': 'claude-opus-4-6',
     'gpt-3.5-turbo': 'claude-haiku-4-5',
-    'o3': 'claude-opus-4-6',
-    'o3-mini': 'claude-sonnet-4-6',
-    'o4-mini': 'claude-sonnet-4-6',
-    'o1': 'claude-opus-4-6',
-    'o1-mini': 'claude-sonnet-4-6',
-    'o1-pro': 'claude-opus-4-6',
 };
-/**
- * Translate OpenAI chat completion request → Anthropic Messages request.
- */
+/** Translate OpenAI chat completion request → Anthropic Messages request. */
 function openaiToAnthropic(body, modelOverride) {
     const messages = body.messages;
     if (!messages)
         return body;
-    // Extract system messages
     const systemMessages = messages.filter(m => m.role === 'system');
     const nonSystemMessages = messages.filter(m => m.role !== 'system');
-    // Map model name
-    const requestModel = String(body.model || '');
-    const model = modelOverride || OPENAI_MODEL_MAP[requestModel] || requestModel;
+    const model = modelOverride || OPENAI_MODEL_MAP[String(body.model || '')] || String(body.model || 'claude-opus-4-6');
     const result = {
         model,
-        messages: nonSystemMessages.map(m => ({
-            role: m.role === 'assistant' ? 'assistant' : 'user',
-            content: m.content,
-        })),
+        messages: nonSystemMessages.map(m => ({ role: m.role === 'assistant' ? 'assistant' : 'user', content: m.content })),
         max_tokens: body.max_tokens ?? body.max_completion_tokens ?? 8192,
     };
-    if (systemMessages.length > 0) {
+    if (systemMessages.length > 0)
         result.system = systemMessages.map(m => typeof m.content === 'string' ? m.content : JSON.stringify(m.content)).join('\n');
-    }
     if (body.stream)
         result.stream = true;
     if (body.temperature != null)
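The rewritten `openaiToAnthropic` above combines three rules: map the incoming model name through `OPENAI_MODEL_MAP` (falling back to the name as sent), hoist `system` messages into Anthropic's top-level `system` field, and normalize every other role to `user`/`assistant`. A minimal standalone sketch of those rules (the `toAnthropic` name and the trimmed two-entry map are illustrative, not dario's API):

```javascript
// Sketch of the request-translation rules, with a trimmed model map.
const MODEL_MAP = { 'gpt-4': 'claude-opus-4-6', 'gpt-3.5-turbo': 'claude-haiku-4-5' };

function toAnthropic(body, override = null) {
  // Override wins, then the GPT→Claude map, then the name as sent.
  const model = override || MODEL_MAP[String(body.model || '')] || String(body.model || 'claude-opus-4-6');
  // Anthropic takes system text as a top-level field, not a message role.
  const system = body.messages.filter(m => m.role === 'system').map(m => m.content).join('\n');
  const messages = body.messages
    .filter(m => m.role !== 'system')
    .map(m => ({ role: m.role === 'assistant' ? 'assistant' : 'user', content: m.content }));
  const out = { model, messages, max_tokens: body.max_tokens ?? 8192 };
  if (system) out.system = system;
  return out;
}
```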
@@ -105,33 +61,20 @@ function openaiToAnthropic(body, modelOverride) {
         result.stop_sequences = Array.isArray(body.stop) ? body.stop : [body.stop];
     return result;
 }
-/**
- * Translate Anthropic Messages response → OpenAI chat completion response.
- */
+/** Translate Anthropic Messages response → OpenAI chat completion response. */
 function anthropicToOpenai(body) {
-    const content = body.content;
-    const text = content?.find(c => c.type === 'text')?.text ?? '';
-    const usage = body.usage;
+    const text = body.content?.find(c => c.type === 'text')?.text ?? '';
+    const u = body.usage;
     return {
         id: `chatcmpl-${(body.id || '').replace('msg_', '')}`,
         object: 'chat.completion',
         created: Math.floor(Date.now() / 1000),
         model: body.model,
-        choices: [{
-            index: 0,
-            message: { role: 'assistant', content: text },
-            finish_reason: body.stop_reason === 'end_turn' ? 'stop' : body.stop_reason === 'max_tokens' ? 'length' : 'stop',
-        }],
-        usage: {
-            prompt_tokens: usage?.input_tokens ?? 0,
-            completion_tokens: usage?.output_tokens ?? 0,
-            total_tokens: (usage?.input_tokens ?? 0) + (usage?.output_tokens ?? 0),
-        },
+        choices: [{ index: 0, message: { role: 'assistant', content: text }, finish_reason: body.stop_reason === 'end_turn' ? 'stop' : 'length' }],
+        usage: { prompt_tokens: u?.input_tokens ?? 0, completion_tokens: u?.output_tokens ?? 0, total_tokens: (u?.input_tokens ?? 0) + (u?.output_tokens ?? 0) },
     };
 }
-/**
- * Translate Anthropic SSE stream → OpenAI SSE stream.
- */
+/** Translate Anthropic SSE → OpenAI SSE. */
 function translateStreamChunk(line) {
     if (!line.startsWith('data: '))
         return null;
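The response direction mirrors the request translation: pull the first `text` content block, rename the `msg_` id to a `chatcmpl-` id, map `stop_reason` to a `finish_reason`, and reshape usage into prompt/completion/total token counts. A standalone sketch of that mapping (the `toOpenai` name is illustrative; `created` is omitted to keep the output deterministic):

```javascript
// Sketch of the Anthropic → OpenAI response reshaping above.
function toOpenai(body) {
  const text = body.content?.find(c => c.type === 'text')?.text ?? '';
  const u = body.usage ?? {};
  return {
    id: `chatcmpl-${(body.id || '').replace('msg_', '')}`,
    object: 'chat.completion',
    choices: [{
      index: 0,
      message: { role: 'assistant', content: text },
      // 'end_turn' means a natural stop; anything else is treated as a cutoff
      finish_reason: body.stop_reason === 'end_turn' ? 'stop' : 'length',
    }],
    usage: {
      prompt_tokens: u.input_tokens ?? 0,
      completion_tokens: u.output_tokens ?? 0,
      total_tokens: (u.input_tokens ?? 0) + (u.output_tokens ?? 0),
    },
  };
}
```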
@@ -139,47 +82,19 @@ function translateStreamChunk(line) {
     if (json === '[DONE]')
         return 'data: [DONE]\n\n';
     try {
-        const event = JSON.parse(json);
-        if (event.type === 'content_block_delta') {
-            const delta = event.delta;
-            if (delta?.type === 'text_delta' && delta.text) {
-                return `data: ${JSON.stringify({
-                    id: 'chatcmpl-dario',
-                    object: 'chat.completion.chunk',
-                    created: Math.floor(Date.now() / 1000),
-                    model: 'claude',
-                    choices: [{ index: 0, delta: { content: delta.text }, finish_reason: null }],
-                })}\n\n`;
-            }
-        }
-        if (event.type === 'message_stop') {
-            return `data: ${JSON.stringify({
-                id: 'chatcmpl-dario',
-                object: 'chat.completion.chunk',
-                created: Math.floor(Date.now() / 1000),
-                model: 'claude',
-                choices: [{ index: 0, delta: {}, finish_reason: 'stop' }],
-            })}\n\ndata: [DONE]\n\n`;
+        const e = JSON.parse(json);
+        if (e.type === 'content_block_delta') {
+            const d = e.delta;
+            if (d?.type === 'text_delta' && d.text)
+                return `data: ${JSON.stringify({ id: 'chatcmpl-dario', object: 'chat.completion.chunk', created: Math.floor(Date.now() / 1000), model: 'claude', choices: [{ index: 0, delta: { content: d.text }, finish_reason: null }] })}\n\n`;
         }
+        if (e.type === 'message_stop')
+            return `data: ${JSON.stringify({ id: 'chatcmpl-dario', object: 'chat.completion.chunk', created: Math.floor(Date.now() / 1000), model: 'claude', choices: [{ index: 0, delta: {}, finish_reason: 'stop' }] })}\n\ndata: [DONE]\n\n`;
     }
-    catch { /* skip unparseable */ }
+    catch { }
     return null;
 }
-/**
- * OpenAI-compatible models list.
- */
-function openaiModelsList() {
-    const models = ['claude-opus-4-6', 'claude-sonnet-4-6', 'claude-haiku-4-5'];
-    return {
-        object: 'list',
-        data: models.map(id => ({
-            id,
-            object: 'model',
-            created: 1700000000,
-            owned_by: 'anthropic',
-        })),
-    };
-}
+const OPENAI_MODELS_LIST = { object: 'list', data: ['claude-opus-4-6', 'claude-sonnet-4-6', 'claude-haiku-4-5'].map(id => ({ id, object: 'model', created: 1700000000, owned_by: 'anthropic' })) };
 export function sanitizeError(err) {
     const msg = err instanceof Error ? err.message : String(err);
     // Never leak tokens, JWTs, or bearer values in error messages
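The streaming translation above works line by line: an Anthropic `content_block_delta` SSE line becomes an OpenAI `chat.completion.chunk` carrying the same text delta, and other event types pass through unhandled. A minimal sketch of just the delta case (`sseDeltaToOpenai` is an illustrative name; the static ids are omitted for brevity):

```javascript
// Sketch: translate one Anthropic text-delta SSE line to an OpenAI chunk.
function sseDeltaToOpenai(line) {
  if (!line.startsWith('data: ')) return null;        // not an SSE data line
  const e = JSON.parse(line.slice(6));
  if (e.type !== 'content_block_delta' || e.delta?.type !== 'text_delta') return null;
  return `data: ${JSON.stringify({
    object: 'chat.completion.chunk',
    choices: [{ index: 0, delta: { content: e.delta.text }, finish_reason: null }],
  })}\n\n`;
}
```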
@@ -201,7 +116,9 @@ async function handleViaCli(body, model, verbose) {
     if (!lastUser) {
         return { status: 400, body: JSON.stringify({ error: 'No user message' }), contentType: 'application/json' };
     }
-    const effectiveModel = model ?? parsed.model ?? 'claude-opus-4-6';
+    const rawModel = model ?? parsed.model ?? 'claude-opus-4-6';
+    // Validate model name — only allow alphanumeric, hyphens, dots, underscores
+    const effectiveModel = /^[a-zA-Z0-9._-]+$/.test(rawModel) ? rawModel : 'claude-opus-4-6';
     const prompt = typeof lastUser.content === 'string'
         ? lastUser.content
         : JSON.stringify(lastUser.content);
@@ -237,7 +154,7 @@ async function handleViaCli(body, model, verbose) {
         if (code !== 0 || !stdout.trim()) {
             resolve({
                 status: 502,
-                body: JSON.stringify({ type: 'error', error: { type: 'api_error', message: stderr.substring(0, 200) || 'CLI backend failed' } }),
+                body: JSON.stringify({ type: 'error', error: { type: 'api_error', message: sanitizeError(stderr.substring(0, 200)) || 'CLI backend failed' } }),
                 contentType: 'application/json',
             });
             return;
@@ -287,37 +204,57 @@ export async function startProxy(opts = {}) {
         process.exit(1);
     }
     const cliVersion = detectClaudeVersion();
-    const sdkVersion = detectSdkVersion();
     const modelOverride = opts.model ? (MODEL_ALIASES[opts.model] ?? opts.model) : null;
+    // Pre-build static headers (only auth, version, beta, request-id change per request)
+    const staticHeaders = {
+        'accept': 'application/json',
+        'Content-Type': 'application/json',
+        'anthropic-dangerous-direct-browser-access': 'true',
+        'anthropic-client-platform': 'cli',
+        'user-agent': `claude-cli/${cliVersion} (external, cli)`,
+        'x-app': 'cli',
+        'x-claude-code-session-id': SESSION_ID,
+        'x-stainless-arch': arch,
+        'x-stainless-lang': 'js',
+        'x-stainless-os': OS_NAME,
+        'x-stainless-package-version': '0.81.0',
+        'x-stainless-retry-count': '0',
+        'x-stainless-runtime': 'node',
+        'x-stainless-runtime-version': nodeVersion,
+        'x-stainless-timeout': '600',
+    };
     const useCli = opts.cliBackend ?? false;
     let requestCount = 0;
-    let tokenCostEstimate = 0;
-    // Optional proxy authentication
+    // Optional proxy authentication — pre-encode key buffer for performance
     const apiKey = process.env.DARIO_API_KEY;
+    const apiKeyBuf = apiKey ? Buffer.from(apiKey) : null;
     const corsOrigin = `http://localhost:${port}`;
+    // Pre-serialize static responses
+    const CORS_HEADERS = {
+        'Access-Control-Allow-Origin': corsOrigin,
+        'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
+        'Access-Control-Allow-Headers': 'Content-Type, Authorization, x-api-key, anthropic-version, anthropic-beta',
+        'Access-Control-Max-Age': '86400',
+    };
+    const MODELS_JSON = JSON.stringify(OPENAI_MODELS_LIST);
+    const ERR_UNAUTH = JSON.stringify({ error: 'Unauthorized', message: 'Invalid or missing API key' });
+    const ERR_FORBIDDEN = JSON.stringify({ error: 'Forbidden', message: 'Path not allowed' });
+    const ERR_METHOD = JSON.stringify({ error: 'Method not allowed' });
     function checkAuth(req) {
-        if (!apiKey)
-            return true; // no key set = open access
+        if (!apiKeyBuf)
+            return true;
         const provided = req.headers['x-api-key']
             || req.headers.authorization?.replace(/^Bearer\s+/i, '');
         if (!provided)
            return false;
-        try {
-            return timingSafeEqual(Buffer.from(provided), Buffer.from(apiKey));
-        }
-        catch {
+        const providedBuf = Buffer.from(provided);
+        if (providedBuf.length !== apiKeyBuf.length)
            return false;
-        }
+        return timingSafeEqual(providedBuf, apiKeyBuf);
     }
     const server = createServer(async (req, res) => {
-        // CORS preflight
         if (req.method === 'OPTIONS') {
-            res.writeHead(204, {
-                'Access-Control-Allow-Origin': corsOrigin,
-                'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
-                'Access-Control-Allow-Headers': 'Content-Type, Authorization, x-api-key, anthropic-version, anthropic-beta',
-                'Access-Control-Max-Age': '86400',
-            });
+            res.writeHead(204, CORS_HEADERS);
             res.end();
             return;
         }
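The reworked `checkAuth` above replaces a try/catch around `timingSafeEqual` with an explicit length check, because `crypto.timingSafeEqual` throws when its buffers differ in length. A self-contained sketch of the same pattern (`makeChecker` is an illustrative wrapper, not dario's export):

```javascript
import { timingSafeEqual } from 'node:crypto';

// Sketch of the constant-time key check: pre-encode the expected key once,
// reject length mismatches up front (timingSafeEqual throws on unequal
// lengths), then compare in constant time.
function makeChecker(apiKey) {
  const keyBuf = apiKey ? Buffer.from(apiKey) : null;
  return (provided) => {
    if (!keyBuf) return true;   // no key configured = open access
    if (!provided) return false;
    const buf = Buffer.from(provided);
    if (buf.length !== keyBuf.length) return false;
    return timingSafeEqual(buf, keyBuf);
  };
}
```

The early length check leaks only the key's length, which `timingSafeEqual`'s thrown exception would reveal anyway; the byte comparison itself stays constant-time.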
@@ -335,10 +272,9 @@ export async function startProxy(opts = {}) {
            }));
            return;
        }
-        // Auth gate — everything below health requires auth when DARIO_API_KEY is set
        if (!checkAuth(req)) {
            res.writeHead(401, { 'Content-Type': 'application/json' });
-            res.end(JSON.stringify({ error: 'Unauthorized', message: 'Invalid or missing API key' }));
+            res.end(ERR_UNAUTH);
            return;
        }
        // Status endpoint
@@ -348,11 +284,10 @@ export async function startProxy(opts = {}) {
            res.end(JSON.stringify(s));
            return;
        }
-        // OpenAI-compatible models list
        if (urlPath === '/v1/models' && req.method === 'GET') {
            requestCount++;
            res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': corsOrigin });
-            res.end(JSON.stringify(openaiModelsList()));
+            res.end(MODELS_JSON);
            return;
        }
        // Detect OpenAI-format requests
@@ -365,13 +300,12 @@ export async function startProxy(opts = {}) {
        const targetBase = isOpenAI ? `${ANTHROPIC_API}/v1/messages` : allowedPaths[urlPath];
        if (!targetBase) {
            res.writeHead(403, { 'Content-Type': 'application/json' });
-            res.end(JSON.stringify({ error: 'Forbidden', message: 'Path not allowed' }));
+            res.end(ERR_FORBIDDEN);
            return;
        }
-        // Only allow POST (Messages/Chat API) and GET (models)
        if (req.method !== 'POST') {
            res.writeHead(405, { 'Content-Type': 'application/json' });
-            res.end(JSON.stringify({ error: 'Method not allowed' }));
+            res.end(ERR_METHOD);
            return;
        }
        // Proxy to Anthropic
@@ -402,22 +336,13 @@ export async function startProxy(opts = {}) {
            res.end(cliResult.body);
            return;
        }
-        // Translate OpenAI → Anthropic format if needed
+        // Parse body once, apply OpenAI translation or model override
        let finalBody = body.length > 0 ? body : undefined;
-        if (isOpenAI && body.length > 0) {
+        if (body.length > 0 && (isOpenAI || modelOverride)) {
            try {
                const parsed = JSON.parse(body.toString());
-                const translated = openaiToAnthropic(parsed, modelOverride);
-                finalBody = Buffer.from(JSON.stringify(translated));
-            }
-            catch { /* not JSON, send as-is */ }
-        }
-        else if (modelOverride && body.length > 0) {
-            // Override model in request body if --model flag was set
-            try {
-                const parsed = JSON.parse(body.toString());
-                parsed.model = modelOverride;
-                finalBody = Buffer.from(JSON.stringify(parsed));
+                const result = isOpenAI ? openaiToAnthropic(parsed, modelOverride) : (modelOverride ? { ...parsed, model: modelOverride } : parsed);
+                finalBody = Buffer.from(JSON.stringify(result));
            }
            catch { /* not JSON, send as-is */ }
        }
@@ -425,46 +350,19 @@ export async function startProxy(opts = {}) {
            const modelInfo = modelOverride ? ` (model: ${modelOverride})` : '';
            console.log(`[dario] #${requestCount} ${req.method} ${req.url}${modelInfo}`);
        }
-        // Build target URL from allowlist (no user input in URL construction)
-        const targetUrl = targetBase;
-        // Merge any client-provided beta flags with the required oauth flag
+        // Merge client beta flags with defaults
        const clientBeta = req.headers['anthropic-beta'];
-        const betaFlags = new Set([
-            'oauth-2025-04-20',
-            'interleaved-thinking-2025-05-14',
-            'prompt-caching-scope-2026-01-05',
-            'claude-code-20250219',
-            'context-management-2025-06-27',
-        ]);
-        if (clientBeta) {
-            for (const flag of clientBeta.split(',')) {
-                const trimmed = flag.trim();
-                if (trimmed.length > 0 && trimmed.length < 100)
-                    betaFlags.add(trimmed);
-            }
-        }
+        let beta = 'oauth-2025-04-20,interleaved-thinking-2025-05-14,prompt-caching-scope-2026-01-05,claude-code-20250219,context-management-2025-06-27';
+        if (clientBeta)
+            beta += ',' + clientBeta.split(',').map(f => f.trim()).filter(f => f.length > 0 && f.length < 100).join(',');
        const headers = {
-            'accept': 'application/json',
+            ...staticHeaders,
            'Authorization': `Bearer ${accessToken}`,
-            'Content-Type': 'application/json',
            'anthropic-version': req.headers['anthropic-version'] || '2023-06-01',
-            'anthropic-beta': [...betaFlags].join(','),
-            'anthropic-dangerous-direct-browser-access': 'true',
-            'anthropic-client-platform': 'cli',
-            'user-agent': `claude-cli/${cliVersion} (external, cli)`,
-            'x-app': 'cli',
-            'x-claude-code-session-id': SESSION_ID,
+            'anthropic-beta': beta,
            'x-client-request-id': randomUUID(),
-            'x-stainless-arch': arch,
-            'x-stainless-lang': 'js',
-            'x-stainless-os': getOsName(),
-            'x-stainless-package-version': sdkVersion,
-            'x-stainless-retry-count': '0',
-            'x-stainless-runtime': 'node',
-            'x-stainless-runtime-version': nodeVersion,
-            'x-stainless-timeout': '600',
        };
-        const upstream = await fetch(targetUrl, {
+        const upstream = await fetch(targetBase, {
            method: req.method ?? 'POST',
            headers,
            body: finalBody ? new Uint8Array(finalBody) : undefined,
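The new string-based beta merge above trades the old `Set` (which also deduplicated) for a single concatenation: defaults first, then client flags that pass the length sanity check. A sketch of that merge behavior (`mergeBeta` and the abbreviated default list are illustrative; note that unlike the `Set` version, duplicates sent by the client are kept):

```javascript
// Sketch of the beta-header merge: defaults, then sanitized client flags.
const DEFAULT_BETA = 'oauth-2025-04-20,claude-code-20250219'; // abbreviated
function mergeBeta(clientBeta) {
  let beta = DEFAULT_BETA;
  if (clientBeta) {
    const extra = clientBeta
      .split(',')
      .map(f => f.trim())
      .filter(f => f.length > 0 && f.length < 100); // drop empty/oversized flags
    if (extra.length) beta += ',' + extra.join(',');
  }
  return beta;
}
```

The `extra.length` guard here avoids a trailing comma when every client flag is filtered out; the diffed code appends unconditionally whenever `clientBeta` is set.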
@@ -492,6 +390,7 @@ export async function startProxy(opts = {}) {
        const decoder = new TextDecoder();
        try {
            let buffer = '';
+            const MAX_LINE_LENGTH = 1_000_000; // 1MB max per SSE line
            while (true) {
                const { done, value } = await reader.read();
                if (done)
@@ -499,6 +398,10 @@ export async function startProxy(opts = {}) {
                if (isOpenAI) {
                    // Translate Anthropic SSE → OpenAI SSE
                    buffer += decoder.decode(value, { stream: true });
+                    // Guard against unbounded buffer growth
+                    if (buffer.length > MAX_LINE_LENGTH) {
+                        buffer = buffer.slice(-MAX_LINE_LENGTH);
+                    }
                    const lines = buffer.split('\n');
                    buffer = lines.pop() ?? '';
                    for (const line of lines) {
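The loop this hunk hardens follows a standard SSE pattern: append each decoded chunk to a buffer, split on newlines, emit the complete lines, and carry the trailing partial line into the next read; the new guard caps the buffer so one pathological line cannot grow it without bound. A sketch of one iteration (`feedLines` is an illustrative name, not dario's API):

```javascript
// Sketch of one read-loop iteration: buffer, cap, split, carry the remainder.
function feedLines(buffer, chunk, max = 1_000_000) {
  buffer += chunk;
  if (buffer.length > max) buffer = buffer.slice(-max); // keep only the tail
  const lines = buffer.split('\n');
  const rest = lines.pop() ?? ''; // partial line waits for the next chunk
  return { lines, rest };
}
```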
@@ -540,18 +443,8 @@ export async function startProxy(opts = {}) {
                else {
                    res.end(responseBody);
                }
-                // Quick token estimate for logging
-                if (verbose && responseBody) {
-                    try {
-                        const parsed = JSON.parse(responseBody);
-                        if (parsed.usage) {
-                            const tokens = (parsed.usage.input_tokens ?? 0) + (parsed.usage.output_tokens ?? 0);
-                            tokenCostEstimate += tokens;
-                            console.log(`[dario] #${requestCount} ${upstream.status} — ${tokens} tokens (session total: ${tokenCostEstimate})`);
-                        }
-                    }
-                    catch { /* not JSON, skip */ }
-                }
+                if (verbose)
+                    console.log(`[dario] #${requestCount} ${upstream.status}`);
            }
        }
        catch (err) {
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@askalf/dario",
-  "version": "2.1.0",
+  "version": "2.1.2",
   "description": "Use your Claude subscription as an API. No API key needed. Local proxy for Claude Max/Pro subscriptions.",
   "type": "module",
   "bin": {