@codeproxy/cli 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md ADDED
@@ -0,0 +1,124 @@
# @codeproxy/cli

> **中文版** → [README.zh-CN.md](./README.zh-CN.md)

**@codeproxy/cli** is a local proxy server that converts **any** Chat Completions or Anthropic Messages API into the **OpenAI Responses API** format. It lets Codex, Claude Code, or any Responses-API client use models from DeepSeek, GLM, Kimi, and more.

Built on [@codeproxy/core](https://github.com/codeproxy-ai/core).

## Quick Start

```bash
npx @codeproxy/cli --upstream-format openai-chat \
  --base-url https://api.deepseek.com/v1 \
  --apikey sk-your-key
```

Point your Responses-API client at `http://127.0.0.1:8787`:

```bash
curl -N http://127.0.0.1:8787/v1/responses \
  -H 'content-type: application/json' \
  -H "authorization: Bearer $API_KEY" \
  -d '{"model":"deepseek-v4-pro","input":"Hello!","stream":true}'
```

### With a config file

```bash
npx @codeproxy/cli --config config.json
```

See [config.example.json](./config.example.json) for a full example.

#### Top-level fields

| Field | Type | Description |
|---|---|---|
| `version` | `string` | Config format version (currently `"1.0"`) |
| `currentUpstream` | `string` | Name of the upstream to use (must match a key in `upstreams`) |
| `headers` | `object` | Default headers applied to **all** upstreams (merged with per-upstream headers; per-upstream wins) |
| `reasoningEffort` | `string` | Default reasoning effort for all upstreams: `"low"`, `"medium"`, `"high"`, or `"xhigh"` |
| `thinking` | `object \| null` | Default thinking config for all upstreams. Anthropic format: `{"type": "enabled", "budget_tokens": 16384}`. Set to `null` to disable |
| `timeoutMs` | `number` | Default upstream request timeout in milliseconds |

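A minimal config exercising these top-level fields might look like the sketch below (the `x-client` header and all values are illustrative; see [config.example.json](./config.example.json) for the authoritative example):

```json
{
  "version": "1.0",
  "currentUpstream": "deepseek",
  "headers": { "x-client": "codeproxy" },
  "reasoningEffort": "medium",
  "thinking": { "type": "enabled", "budget_tokens": 16384 },
  "timeoutMs": 120000,
  "upstreams": {
    "deepseek": {
      "baseUrl": "https://api.deepseek.com/v1",
      "apiKey": "sk-your-key"
    }
  }
}
```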
#### Per-upstream fields

| Field | Type | Description |
|---|---|---|
| `format` | `"anthropic" \| "openai-chat"` | Upstream API format. If omitted, inferred from `baseUrl` (path ending in `/messages` → `anthropic`, `/chat/completions` → `openai-chat`; otherwise falls back to `openai-chat`). The proper path suffix is appended automatically |
| `baseUrl` | `string` | **Required.** Upstream endpoint URL |
| `apiKey` | `string` | Upstream API key. Sent as `Authorization: Bearer <key>` (Anthropic: rewritten to `x-api-key`) |
| `model` | `string` | Override the `model` field in all incoming requests |
| `apiVersion` | `string` | Override the `anthropic-version` header (Anthropic only) |
| `headers` | `object` | Extra HTTP headers for this upstream. Merged on top of top-level `headers`; per-upstream wins |
| `timeoutMs` | `number` | Request timeout for this upstream (overrides top-level `timeoutMs`) |
| `dropImages` | `boolean` | When `true`, strip image/file parts from user messages (for text-only models). Use with `fallback` to auto-route image requests to a vision-capable upstream |
| `fallback` | `string` | Name of another upstream to route to when `dropImages: true` and the request contains images |
| `reasoningEffort` | `string` | Per-upstream reasoning effort (`"low"`, `"medium"`, `"high"`, `"xhigh"`). Overrides the top-level value |
| `thinking` | `object \| null` | Per-upstream thinking config. Overrides the top-level value. Set to `null` to disable |

#### Precedence

```
CLI flags > per-upstream fields > top-level fields > built-in defaults
```
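As a sketch, resolving a single setting such as `timeoutMs` walks that chain and takes the first defined value (function and parameter names here are hypothetical, not the actual implementation, and `builtinDefault` is an assumed placeholder rather than a documented default):

```typescript
// Hypothetical sketch of the documented precedence chain for one setting.
// The first defined value wins: CLI flag > per-upstream > top-level > built-in.
function resolveTimeoutMs(
  cliFlag: number | undefined,
  upstream: { timeoutMs?: number },
  topLevel: { timeoutMs?: number },
  builtinDefault: number,
): number {
  return cliFlag ?? upstream.timeoutMs ?? topLevel.timeoutMs ?? builtinDefault;
}
```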

#### Example: auto-fallback for text-only models

When `deepseek` has `dropImages: true` and the user sends an image, the proxy automatically routes to `deepseek-vision` (which supports vision):

```json
{
  "currentUpstream": "deepseek",
  "upstreams": {
    "deepseek": {
      "baseUrl": "https://api.deepseek.com/v1",
      "model": "deepseek-v4-pro",
      "dropImages": true,
      "fallback": "deepseek-vision"
    },
    "deepseek-vision": {
      "baseUrl": "https://api.deepseek.com/v1",
      "model": "deepseek-v4-vision"
    }
  }
}
```
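The routing rule above can be sketched roughly as follows (types and the helper name are hypothetical; the real logic lives in `@codeproxy/core`):

```typescript
// Hypothetical sketch of the fallback routing rule: route to the fallback
// only when the current upstream drops images, the request actually contains
// images, and a fallback upstream is configured.
interface Upstream { dropImages?: boolean; fallback?: string }

function pickUpstream(
  current: string,
  upstreams: Record<string, Upstream>,
  requestHasImages: boolean,
): string {
  const u = upstreams[current];
  if (u?.dropImages && requestHasImages && u.fallback && upstreams[u.fallback]) {
    return u.fallback;
  }
  return current;
}
```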

## Install

```bash
npm install -g @codeproxy/cli
```

## CLI Options

| Flag | Default | Description |
|---|---|---|
| `--base-url <url>` | — | Upstream endpoint (required unless `--config` is given) |
| `--upstream-format <fmt>` | inferred | `anthropic` or `openai-chat` |
| `--config <file>` | — | JSON config file |
| `--host <host>` | `127.0.0.1` | Bind host |
| `-p, --port <port>` | `8787` | Bind port |
| `--api-version <ver>` | `2023-06-01` | Override the Anthropic version header |
| `--apikey <key>` | — | Upstream API key |
| `--model <name>` | — | Override the model for all requests |
| `--drop-images` | — | Strip images (for text-only models) |

## Programmatic usage

```ts
import { createResponsesFetch, startProxy } from '@codeproxy/cli';

const proxy = await startProxy({
  upstreamFormat: 'openai-chat',
  baseUrl: 'https://api.deepseek.com/v1',
  defaultHeaders: { authorization: 'Bearer sk-...' },
});
```

## License

MIT
package/README.zh-CN.md ADDED
@@ -0,0 +1,123 @@
# @codeproxy/cli

> **English** → [README.md](./README.md)

**@codeproxy/cli** is a local proxy server that converts **any** Chat Completions or Anthropic Messages API into the **OpenAI Responses API** format, letting Responses-API clients such as Codex and Claude Code use any model from DeepSeek, GLM, Kimi, and more.

Built on [@codeproxy/core](https://github.com/codeproxy-ai/core).

## Quick Start

```bash
npx @codeproxy/cli --upstream-format openai-chat \
  --base-url https://api.deepseek.com/v1 \
  --apikey sk-your-key
```

Point your Responses-API client at `http://127.0.0.1:8787`:

```bash
curl -N http://127.0.0.1:8787/v1/responses \
  -H 'content-type: application/json' \
  -H "authorization: Bearer $API_KEY" \
  -d '{"model":"deepseek-v4-pro","input":"Hello!","stream":true}'
```

### With a config file

```bash
npx @codeproxy/cli --config config.json
```

See [config.example.json](./config.example.json) for a full example.

#### Top-level fields

| Field | Type | Description |
|---|---|---|
| `version` | `string` | Config format version (currently `"1.0"`) |
| `currentUpstream` | `string` | Name of the upstream currently in use (must be a key in `upstreams`) |
| `headers` | `object` | Default headers applied to **all** upstreams (merged with each upstream's headers; the upstream level wins) |
| `reasoningEffort` | `string` | Default reasoning effort for all upstreams: `"low"`, `"medium"`, `"high"`, or `"xhigh"` |
| `thinking` | `object \| null` | Default thinking config for all upstreams. Anthropic format: `{"type": "enabled", "budget_tokens": 16384}`. Set to `null` to disable |
| `timeoutMs` | `number` | Default upstream request timeout in milliseconds |

#### Per-upstream fields

| Field | Type | Description |
|---|---|---|
| `format` | `"anthropic" \| "openai-chat"` | Upstream API format. If omitted, inferred from `baseUrl` (path ending in `/messages` → `anthropic`, `/chat/completions` → `openai-chat`; otherwise falls back to `openai-chat`). The path suffix is appended automatically |
| `baseUrl` | `string` | **Required.** Upstream endpoint URL |
| `apiKey` | `string` | Upstream API key. Sent as `Authorization: Bearer <key>` (converted to `x-api-key` for Anthropic) |
| `model` | `string` | Override the `model` field in all incoming requests |
| `apiVersion` | `string` | Override the `anthropic-version` header (Anthropic only) |
| `headers` | `object` | Extra HTTP headers for this upstream. Merged on top of the top-level `headers`; the upstream level wins |
| `timeoutMs` | `number` | Request timeout for this upstream (overrides the top-level `timeoutMs`) |
| `dropImages` | `boolean` | When `true`, strip image/file parts from user messages (for text-only models). Used with `fallback`, requests containing images are automatically routed to a vision-capable upstream |
| `fallback` | `string` | Name of another upstream. When the current upstream has `dropImages: true` and the request contains images, the proxy automatically switches to that upstream |
| `reasoningEffort` | `string` | Reasoning-effort override for this upstream (`"low"`, `"medium"`, `"high"`, `"xhigh"`). Overrides the top-level value |
| `thinking` | `object \| null` | Thinking config for this upstream. Overrides the top-level value. Set to `null` to disable |

#### Precedence

```
CLI flags > per-upstream fields > top-level fields > built-in defaults
```

#### Example: auto-fallback for text-only models

When `deepseek` has `dropImages: true` and the user sends an image, the proxy automatically routes to `deepseek-vision` (which supports vision):

```json
{
  "currentUpstream": "deepseek",
  "upstreams": {
    "deepseek": {
      "baseUrl": "https://api.deepseek.com/v1",
      "model": "deepseek-v4-pro",
      "dropImages": true,
      "fallback": "deepseek-vision"
    },
    "deepseek-vision": {
      "baseUrl": "https://api.deepseek.com/v1",
      "model": "deepseek-v4-vision"
    }
  }
}
```

## Install

```bash
npm install -g @codeproxy/cli
```

## CLI Options

| Flag | Default | Description |
|---|---|---|
| `--base-url <url>` | — | Upstream endpoint (required unless `--config` is used) |
| `--upstream-format <fmt>` | inferred | `anthropic` or `openai-chat` |
| `--config <file>` | — | JSON config file |
| `--host <host>` | `127.0.0.1` | Bind host |
| `-p, --port <port>` | `8787` | Bind port |
| `--api-version <ver>` | `2023-06-01` | Override the Anthropic version header |
| `--apikey <key>` | — | Upstream API key |
| `--model <name>` | — | Override the model for all requests |
| `--drop-images` | — | Strip image parts (text-only models) |

## Programmatic usage

```ts
import { createResponsesFetch, startProxy } from '@codeproxy/cli';

const proxy = await startProxy({
  upstreamFormat: 'openai-chat',
  baseUrl: 'https://api.deepseek.com/v1',
  defaultHeaders: { authorization: 'Bearer sk-...' },
});
```

## License

MIT
package/dist/index.cjs ADDED
@@ -0,0 +1,431 @@
'use strict';

var core = require('@codeproxy/core');
var http = require('http');
var stream = require('stream');
var fs = require('fs');
var path = require('path');

function _interopDefault (e) { return e && e.__esModule ? e : { default: e }; }

var http__default = /*#__PURE__*/_interopDefault(http);

// src/index.ts
function fmtTime(date) {
  return date.toLocaleTimeString("en-US", { hour12: false });
}
function fmtDuration(ms) {
  if (ms < 1e3) {
    return `${Math.round(ms)}ms`;
  }
  if (ms < 6e4) {
    return `${(ms / 1e3).toFixed(1)}s`;
  }
  const minutes = Math.floor(ms / 6e4);
  const seconds = Math.round(ms % 6e4 / 1e3);
  return `${minutes}:${String(seconds).padStart(2, "0")}`;
}
async function startProxy(options) {
  const host = options.host ?? "127.0.0.1";
  const port = options.port ?? 8787;
  const cors = options.cors ?? true;
  const logger = options.logger === null ? null : options.logger ?? console;
  const durationHistory = [];
  function updateRollingAverage(ms) {
    durationHistory.push(ms);
    if (durationHistory.length > 50) {
      durationHistory.shift();
    }
    return durationHistory.reduce((sum, val) => sum + val, 0) / durationHistory.length;
  }
  const activeRequests = /* @__PURE__ */ new Map();
  let statusTimerId = null;
  function drawStatusLine() {
    if (activeRequests.size === 0) {
      return;
    }
    const parts = Array.from(activeRequests.entries()).map(([, req]) => {
      const elapsed = Date.now() - req.startTime;
      return `[${fmtDuration(elapsed)}]`;
    });
    process.stdout.write(`\r\x1B[K\u23F3 ${parts.join(", ")}`);
  }
  const requestTracker = {
    add(method, url) {
      const id = `${Date.now()}:${Math.random().toString(36).slice(2, 8)}`;
      activeRequests.set(id, { method, url, startTime: Date.now() });
      drawStatusLine();
      if (!statusTimerId) {
        statusTimerId = setInterval(drawStatusLine, 150);
      }
      return id;
    },
    remove(id) {
      activeRequests.delete(id);
      if (activeRequests.size === 0) {
        process.stdout.write("\r\x1B[K");
        if (statusTimerId) {
          clearInterval(statusTimerId);
          statusTimerId = null;
        }
      }
    }
  };
  const requestInfo = { method: "", url: "", startTime: 0, resultLog: "" };
  const upstreamCapture = {};
  const baseFetch = options.fetch ?? globalThis.fetch;
  const capturingFetch = async (input, init) => {
    const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
    const method = (init?.method ?? input?.method ?? "GET").toUpperCase();
    const reqHeaders = headersInitToObject(init?.headers);
    let reqBody = void 0;
    if (init?.body != null) {
      if (typeof init.body === "string") {
        reqBody = tryParseJson(init.body);
      } else if (init.body instanceof ArrayBuffer) {
        reqBody = tryParseJson(new TextDecoder().decode(init.body));
      } else if (ArrayBuffer.isView(init.body)) {
        reqBody = tryParseJson(new TextDecoder().decode(init.body));
      } else {
        reqBody = String(init.body);
      }
    }
    upstreamCapture.request = { url, method, headers: reqHeaders, body: reqBody };
    const resp = await baseFetch(input, init);
    if (!resp.ok) {
      const clone = resp.clone();
      const text = await clone.text().catch(() => "");
      upstreamCapture.response = {
        status: resp.status,
        statusText: resp.statusText,
        headers: headersToObject(resp.headers),
        body: tryParseJson(text)
      };
    } else {
      upstreamCapture.response = void 0;
    }
    return resp;
  };
  const apiFetch = core.createResponsesFetch({
    upstreamFormat: options.upstreamFormat,
    baseUrl: options.baseUrl,
    apiVersion: options.apiVersion,
    model: options.model,
    defaultHeaders: options.defaultHeaders,
    timeoutMs: options.timeoutMs,
    dropImages: options.dropImages,
    fallbackUpstream: options.fallbackUpstream,
    fetch: capturingFetch,
    passthroughFetch: async () => new Response(JSON.stringify({ error: { message: "Not found" } }), {
      status: 404,
      headers: { "content-type": "application/json" }
    }),
    onCacheStats: (stats) => {
      const durationMs = requestInfo.startTime ? Date.now() - requestInfo.startTime : 0;
      const billedTokens = stats.inputTokens + stats.outputTokens - stats.cachedTokens;
      const parts = [
        `total=${stats.totalTokens}`,
        `input=${stats.inputTokens}`,
        `output=${stats.outputTokens}`,
        `cached=${stats.cachedTokens}`,
        `billed=${billedTokens}`
      ];
      if (stats.cacheCreationTokens > 0) {
        parts.push(`cache_creation=${stats.cacheCreationTokens}`);
      }
      const avg = updateRollingAverage(durationMs);
      const ratio = avg > 0 ? durationMs / avg : 1;
      const color = ratio < 0.8 ? "\x1B[32m" : ratio < 1.5 ? "\x1B[33m" : "\x1B[31m";
      const reset = "\x1B[0m";
      const logMsg = `[${fmtTime(/* @__PURE__ */ new Date())}] -> 200 (${color}${fmtDuration(durationMs)}${reset} avg=${fmtDuration(Math.round(avg))}) [${parts.join(", ")}]`;
      requestInfo.resultLog = stats.cachedTokens < 1024 && billedTokens > 0 ? `\u26A0\uFE0F NO CACHE -- ${logMsg}` : logMsg;
      if (options.onCacheStats) {
        options.onCacheStats({
          ...stats,
          method: requestInfo.method || void 0,
          url: requestInfo.url || void 0,
          durationMs: durationMs || void 0
        });
      }
    }
  });
  const server = http__default.default.createServer(async (req, res) => {
    const start = Date.now();
    requestInfo.method = req.method ?? "POST";
    requestInfo.url = req.url ?? "/v1/responses";
    requestInfo.startTime = start;
    const timeoutMs = options.timeoutMs;
    let timeoutTimer;
    if (timeoutMs && timeoutMs > 0) {
      timeoutTimer = setTimeout(() => {
        logger?.warn(`[timeout] request exceeded ${timeoutMs}ms, aborting`);
        res.destroy();
        req.destroy();
      }, timeoutMs);
    }
    try {
      await handleRequest(req, res, {
        apiFetch,
        cors,
        logger,
        method: req.method ?? "POST",
        url: req.url ?? "/",
        upstreamCapture,
        requestInfo,
        requestTracker
      });
    } catch (err) {
      logger?.error("[proxy-error]", err);
      try {
        if (!res.headersSent) {
          res.writeHead(500, { "content-type": "application/json" });
          res.end(JSON.stringify({ error: { message: "Internal server error" } }));
        }
      } catch {
      }
    } finally {
      if (timeoutTimer) {
        clearTimeout(timeoutTimer);
      }
    }
  });
  return new Promise((resolve2, reject) => {
    server.listen(port, host, () => {
      const actualPort = (() => {
        const addr = server.address();
        return addr.port;
      })();
      const url = `http://${host}:${actualPort}`;
      logger?.log(`Proxy listening on ${url}`);
      logger?.log(`Upstream format: ${options.upstreamFormat}`);
      logger?.log(`Upstream URL: ${options.baseUrl}`);
      resolve2({
        host,
        port: actualPort,
        url,
        server,
        close: () => new Promise((res) => {
          server.close((err) => {
            if (err) {
              logger?.warn("Error closing server:", err);
            }
            res();
          });
        })
      });
    });
    server.once("error", reject);
  });
}
async function handleRequest(req, res, opts) {
  if (opts.cors) {
    setCorsHeaders(res);
  }
  if (req.method === "OPTIONS") {
    res.writeHead(204);
    res.end();
    return;
  }
  const method = req.method ?? "GET";
  const urlPath = req.url ?? "/";
  const headers = flattenIncomingHeaders(req.headers);
  let body;
  if (method !== "GET" && method !== "HEAD") {
    body = await readIncomingBody(req);
  }
  if (!/^\/v1\/responses\/?(?:\?|$)/.test(urlPath)) {
    res.writeHead(404, { "content-type": "application/json" });
    res.end(JSON.stringify({ error: { message: `Not found: ${method} ${urlPath}` } }));
    return;
  }
  const requestBodyText = body ? body.toString("utf8") : void 0;
  const requestStart = Date.now();
  const requestId = opts.requestTracker.add(method, urlPath);
  try {
    const response = await opts.apiFetch(`http://local${urlPath}`, {
      method,
      headers,
      body: body ? new Uint8Array(body) : void 0
    });
    const responseBodyText = response.body ? await response.clone().text() : "";
    opts.requestTracker.remove(requestId);
    if (opts.logger) {
      if (response.status >= 400) {
        process.stdout.write(`\r\x1B[K<-- ${response.status} (${fmtDuration(Date.now() - requestStart)})\n`);
      } else if (opts.requestInfo.resultLog) {
        process.stdout.write(`\r\x1B[K${opts.requestInfo.resultLog}\n`);
      } else {
        process.stdout.write(`\r\x1B[K<-- ${response.status} (${fmtDuration(Date.now() - requestStart)})\n`);
      }
    }
    if (response.status >= 400) {
      try {
        const filePath = saveErrorDump({
          method: opts.method,
          url: opts.url,
          clientRequest: {
            headers,
            body: tryParseJson(requestBodyText ?? "")
          },
          upstreamRequest: opts.upstreamCapture.request,
          upstreamResponse: opts.upstreamCapture.response,
          proxyResponse: {
            status: response.status,
            headers: headersToObject(response.headers),
            body: tryParseJson(responseBodyText)
          }
        });
        opts.logger?.error(`[proxy-failure] full exchange saved to ${filePath}`);
      } catch (dumpErr) {
        opts.logger?.error("[proxy-failure] failed to persist error dump", dumpErr);
      }
    }
    const outHeaders = {};
    response.headers.forEach((value, key) => {
      outHeaders[key] = value;
    });
    if (opts.cors) {
      Object.assign(outHeaders, corsHeaders());
    }
    res.writeHead(response.status, outHeaders);
    if (!response.body) {
      res.end();
      return;
    }
    const typedBody = response.body;
    const nodeStream = stream.Readable.fromWeb(typedBody);
    nodeStream.pipe(res);
    await new Promise((resolve2, reject) => {
      nodeStream.once("end", resolve2);
      nodeStream.once("error", reject);
      res.once("close", resolve2);
    });
  } catch (err) {
    opts.requestTracker.remove(requestId);
    throw err;
  }
}
function readIncomingBody(req) {
  return new Promise((resolve2, reject) => {
    const chunks = [];
    req.on("data", (chunk) => {
      const buf = chunk;
      chunks.push(buf);
    });
    req.on("end", () => resolve2(Buffer.concat(chunks)));
    req.on("error", reject);
  });
}
function flattenIncomingHeaders(headers) {
  const out = {};
  for (const [key, value] of Object.entries(headers)) {
    if (value == null) {
      continue;
    }
    out[key.toLowerCase()] = Array.isArray(value) ? value.join(", ") : String(value);
  }
  return out;
}
function headersToObject(headers) {
  const out = {};
  headers.forEach((value, key) => {
    out[key] = value;
  });
  return out;
}
function setCorsHeaders(res) {
  const headers = corsHeaders();
  for (const [key, value] of Object.entries(headers)) {
    res.setHeader(key, value);
  }
}
function corsHeaders() {
  return {
    "access-control-allow-origin": "*",
    "access-control-allow-methods": "GET,POST,OPTIONS",
    "access-control-allow-headers": "authorization,content-type,x-api-key,anthropic-version,anthropic-beta,anthropic-dangerous-direct-browser-access",
    "access-control-expose-headers": "content-type"
  };
}
function tryParseJson(str) {
  if (!str) {
    return str ?? null;
  }
  try {
    return JSON.parse(str);
  } catch {
    return str;
  }
}
function headersInitToObject(headersInit) {
  const out = {};
  if (!headersInit) {
    return out;
  }
  if (typeof Headers !== "undefined" && headersInit instanceof Headers) {
    headersInit.forEach((value, key) => {
      out[key.toLowerCase()] = value;
    });
    return out;
  }
  if (Array.isArray(headersInit)) {
    for (const [key, value] of headersInit) {
      out[String(key).toLowerCase()] = String(value);
    }
    return out;
  }
  for (const [key, value] of Object.entries(headersInit)) {
    out[key.toLowerCase()] = String(value);
  }
  return out;
}
function saveErrorDump(dump) {
  const dir = path.resolve(process.cwd(), "logs");
  fs.mkdirSync(dir, { recursive: true });
  const ts = (/* @__PURE__ */ new Date()).toISOString().replace(/[:.]/g, "-");
  const status = dump.upstreamResponse?.status ?? dump.proxyResponse.status;
  const filename = `proxy-error-${ts}-${status}.json`;
  const filePath = path.join(dir, filename);
  const payload = {
    timestamp: (/* @__PURE__ */ new Date()).toISOString(),
    ...dump
  };
  redactAuth(payload.clientRequest?.headers);
  redactAuth(payload.upstreamRequest?.headers);
  fs.writeFileSync(filePath, JSON.stringify(payload, null, 2));
  return filePath;
}
function redactAuth(headers) {
  if (!headers) {
    return;
  }
  for (const key of Object.keys(headers)) {
    const lowerKey = key.toLowerCase();
    if (lowerKey === "authorization" || lowerKey === "x-api-key" || lowerKey === "api-key" || lowerKey === "cookie") {
      headers[key] = "[REDACTED]";
    }
  }
}

Object.defineProperty(exports, "createResponsesFetch", {
  enumerable: true,
  get: function () { return core.createResponsesFetch; }
});
Object.defineProperty(exports, "encodeSseEvent", {
  enumerable: true,
  get: function () { return core.encodeSseEvent; }
});
Object.defineProperty(exports, "parseSseStream", {
  enumerable: true,
  get: function () { return core.parseSseStream; }
});
exports.startProxy = startProxy;
//# sourceMappingURL=index.cjs.map