@duanluan/openai-local-bridge 0.2.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md ADDED
@@ -0,0 +1,269 @@
1
+ # openai-local-bridge
2
+
3
+ [中文说明](./README_CN.md)
4
+
5
+ `openai-local-bridge` routes local `api.openai.com` requests to a third-party OpenAI-compatible endpoint, allowing tools such as Trae and AI Assistant to use GPT Codex through a non-OpenAI API.
6
+
7
+ ## Prerequisites
8
+
9
+ - `Git`
10
+ - `Python`
11
+ - `uv`: optional, but recommended. When available, the npm launcher prefers it so the latest CLI can be run directly.
12
+ - `OpenSSL`: required when running `olb enable` or `olb start` to generate local certificates. On Windows, install [OpenSSL](https://slproweb.com/products/Win32OpenSSL.html) first and make sure the directory containing `openssl.exe` is in `PATH`.
13
+
14
+ Check your environment:
15
+
16
+ ```bash
17
+ git --version
18
+ python --version
19
+ openssl version
20
+ ```
21
+
22
+ ## Installation
23
+
24
+ If you want a standalone binary with the Python runtime bundled, download the platform archive from GitHub Releases. Those archives do not require `Python` or `uv`; only `OpenSSL` is still needed for `olb enable` / `olb start`.
25
+
26
+ ### Method 1: `uv`
27
+
28
+ ```bash
29
+ uv tool install openai-local-bridge
30
+ ```
31
+
32
+ ### Method 2: `pip`
33
+
34
+ ```bash
35
+ python -m pip install --user openai-local-bridge
36
+ ```
37
+
38
+ ### Method 3: `npm`
39
+
40
+ ```bash
41
+ npm install -g @duanluan/openai-local-bridge
42
+ ```
43
+
44
+ The npm package downloads the matching standalone binary from GitHub Releases during installation, so runtime use does not require `Python` or `uv`.
45
+
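The asset-selection logic described above can be sketched in a few lines. This mirrors `npm/install.js` and `npm/platforms.json` from this package (the mapping and URL template below are copied from those files); it is an illustration, not the installer itself:

```javascript
// Map of Node's platform-arch key to the GitHub Release asset name,
// as shipped in npm/platforms.json.
const platforms = {
  'linux-x64': 'olb-linux-x86_64',
  'darwin-x64': 'olb-macos-x86_64',
  'darwin-arm64': 'olb-macos-arm64',
  'win32-x64': 'olb-windows-x86_64.exe',
};

// Build the release download URL the installer fetches at postinstall time.
function downloadUrl(version, platform, arch) {
  const asset = platforms[`${platform}-${arch}`];
  if (!asset) throw new Error(`unsupported platform: ${platform}/${arch}`);
  return `https://github.com/duanluan/openai-local-bridge/releases/download/v${version}/${asset}`;
}

console.log(downloadUrl('0.2.3', 'linux', 'x64'));
// https://github.com/duanluan/openai-local-bridge/releases/download/v0.2.3/olb-linux-x86_64
```

Unsupported platform/arch combinations (for example Linux arm64, which has no entry above) fail at install time rather than at first run.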
46
+ ### Method 4: `curl` / PowerShell
47
+
48
+ Linux / macOS:
49
+
50
+ ```bash
51
+ curl -fsSL https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.sh | bash
52
+ ```
53
+
54
+ Windows PowerShell:
55
+
56
+ ```powershell
57
+ irm https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.ps1 | iex
58
+ ```
59
+
60
+ ### Method 5: standalone binary
61
+
62
+ Download the matching archive from GitHub Releases, then unpack and run `olb` directly:
63
+
64
+ - `olb-linux-x86_64.tar.gz`
65
+ - `olb-macos-x86_64.tar.gz`
66
+ - `olb-macos-arm64.tar.gz`
67
+ - `olb-windows-x86_64.zip`
68
+
69
+ ## Quick Start
70
+
71
+ The most direct way to use it is:
72
+
73
+ ```bash
74
+ olb start
75
+ ```
76
+
77
+ Run it in the background:
78
+
79
+ ```bash
80
+ olb start --background
81
+ ```
82
+
83
+ If the machine has not been configured yet, `olb start` first runs initialization, then continues with enablement and startup. In interactive mode, it asks for:
84
+
85
+ - `Base URL`
86
+ - `API Key`
87
+ - `Reasoning effort`
88
+
89
+ If you only want to update the configuration, run:
90
+
91
+ ```bash
92
+ olb init
93
+ ```
94
+
95
+ To stop the takeover:
96
+
97
+ ```bash
98
+ olb disable
99
+ ```
100
+
101
+ To stop a running bridge process:
102
+
103
+ ```bash
104
+ olb stop
105
+ ```
106
+
107
+ ## Common Commands
108
+
109
+ Command overview:
110
+
111
+ - `olb`: runs initialization when no config exists; otherwise shows the current status
112
+ - `init`: initial setup or reconfiguration
113
+ - `config`: show the current configuration
114
+ - `config-path`: show the configuration file path
115
+ - `status`: show the current status
116
+ - `enable`: install certificates, update hosts, and manage NSS on supported platforms
117
+ - `disable`: remove the hosts takeover
118
+ - `start`: if not initialized, run setup first, then execute `enable` and start the bridge immediately
119
+ - `start --background`: start the bridge in the background and write logs to the config directory
120
+ - `stop`: stop the current bridge process, including one started in the background
121
+
122
+ ## Wrapper Script Entry Points
123
+
124
+ If you are running directly from the repository, you can also use:
125
+
126
+ ### Linux / macOS
127
+
128
+ ```bash
129
+ ./openai-local-bridge.sh <command>
130
+ ```
131
+
132
+ ### Windows PowerShell
133
+
134
+ ```powershell
135
+ .\openai-local-bridge.ps1 <command>
136
+ ```
137
+
138
+ ### Windows BAT
139
+
140
+ ```bat
141
+ openai-local-bridge.bat <command>
142
+ ```
143
+
144
+ All of these entry points forward to the same CLI.
145
+
146
+ If you install through npm, `olb` starts the bundled platform binary directly. Supported npm targets are Linux x64, macOS x64, macOS arm64, and Windows x64.
147
+
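The launcher also honors an `OLB_BINARY_PATH` environment variable (checked in `npm/olb.js` in this package) before falling back to the bundled binary. A simplified sketch of that lookup order, with the env object passed in explicitly for illustration:

```javascript
// Sketch of the launcher's resolution order: an explicit OLB_BINARY_PATH
// override always wins over the bundled per-platform binary.
function resolveBinaryPath(env, bundledPath) {
  return env.OLB_BINARY_PATH || bundledPath;
}

console.log(resolveBinaryPath({ OLB_BINARY_PATH: '/usr/local/bin/olb' }, 'npm/bin/olb'));
// /usr/local/bin/olb
console.log(resolveBinaryPath({}, 'npm/bin/olb'));
// npm/bin/olb
```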
148
+ ## Release
149
+
150
+ The release workflow lives in [.github/workflows/release-binaries.yml](./.github/workflows/release-binaries.yml).
151
+
152
+ Before pushing a release tag:
153
+
154
+ - set matching versions in `pyproject.toml` and `package.json`
155
+ - configure GitHub secrets `PYPI_API_TOKEN` and `NPM_TOKEN`
156
+ - push a tag such as `v0.2.2`
157
+
158
+ When the workflow runs on that tag, it:
159
+
160
+ - builds standalone binaries for Linux x64, macOS x64, macOS arm64, and Windows x64
161
+ - uploads those binaries to GitHub Releases
162
+ - publishes the Python package to PyPI
163
+ - publishes the root npm package; the npm installer downloads the matching binary from GitHub Releases
164
+
165
+ If npm publishing fails but the tag and GitHub Release already exist, use the workflow's `Run workflow` button and pass the existing tag such as `v0.2.3` to retry npm publishing without creating a new version.
166
+
167
+ ## Using It in a Client
168
+
169
+ Using Trae as an example, the recommended flow is split into two phases.
170
+
171
+ ### Phase 1: Add the model in the client first
172
+
173
+ 1. Keep this project disabled:
174
+
175
+ ```bash
176
+ olb disable
177
+ ```
178
+
179
+ 2. Confirm that the machine can reach the official OpenAI service.
180
+ 3. Add the model in the client, for example:
181
+ - Provider: `OpenAI`
182
+ - Model: `Custom model`
183
+ - Model ID: `gpt-5.4`
184
+ - API Key: your official OpenAI key
185
+
186
+ ### Phase 2: Enable the bridge for subsequent requests
187
+
188
+ ```bash
189
+ olb start
190
+ ```
191
+
192
+ Then choose the model you just added in the client.
193
+
194
+ ## FAQ
195
+
196
+ ### What does `olb` do by default?
197
+
198
+ - If no configuration file exists, it starts initialization.
199
+ - If configuration is already complete, it shows the current status.
200
+
201
+ ### What can I check with `olb status`?
202
+
203
+ These fields are usually the most important:
204
+
205
+ - `hosts`: whether takeover is active
206
+ - `root_ca`: whether the root certificate exists
207
+ - `nss`: NSS status
208
+ - `listener`: whether the local listener is running
209
+ - `listen_addr`: listening address
210
+ - `config`: configuration file location
211
+
212
+ ### Other software is affected too
213
+
214
+ That is expected with the current approach, because the takeover happens at the system level for `api.openai.com`.
215
+
216
+ Restore normal behavior immediately:
217
+
218
+ ```bash
219
+ olb disable
220
+ ```
221
+
222
+ ### Model requests fail
223
+
224
+ Check these items first:
225
+
226
+ - Whether `Base URL` is correct
227
+ - Whether `API Key` is correct
228
+ - Whether the upstream service is OpenAI-compatible
229
+ - Whether the upstream model you configured actually exists
230
+
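A quick way to probe the first three items is to request the upstream's model list. Assuming the upstream follows the standard OpenAI-style `/v1/models` convention (adjust if yours differs), the URL to try with `curl` or `fetch` can be derived from your configured `Base URL` like this:

```javascript
// Hypothetical helper: turn a configured Base URL into the OpenAI-style
// model-listing endpoint, stripping any trailing slashes first.
function modelsUrl(baseUrl) {
  return `${baseUrl.replace(/\/+$/, '')}/v1/models`;
}

console.log(modelsUrl('https://api.example.com/'));
// https://api.example.com/v1/models
```

A `200` response with your model listed rules out the first, second, and fourth items at once.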
231
+ ### Failed to modify `hosts` or import certificates on Windows
232
+
233
+ This is usually a permission issue. Run the command again in a terminal with sufficient privileges.
234
+
235
+ ### Windows says `missing command: openssl`
236
+
237
+ The current implementation requires OpenSSL to be installed locally. Install OpenSSL first, then confirm that this works in your terminal:
238
+
239
+ ```powershell
240
+ openssl version
241
+ ```
242
+
243
+ ### Startup fails on Linux / macOS
244
+
245
+ If you use the default port `443`, the system may require elevated privileges. Follow the prompt, or switch to a higher port.
246
+
247
+ ## Configuration File
248
+
249
+ The CLI writes its configuration to a file under your user configuration directory.
250
+
251
+ View the path:
252
+
253
+ ```bash
254
+ olb config-path
255
+ ```
256
+
257
+ View the current configuration:
258
+
259
+ ```bash
260
+ olb config
261
+ ```
262
+
263
+ ## Security Notes
264
+
265
+ Before using this project, keep in mind:
266
+
267
+ - It installs a local certificate on your machine.
268
+ - It modifies the system `hosts` file.
269
+ - Run `olb disable` when you are not using it.
package/README_CN.md ADDED
@@ -0,0 +1,269 @@
1
+ # openai-local-bridge
2
+
3
+ [English README](./README.md)
4
+
5
+ `openai-local-bridge` 用于在本机将 `api.openai.com` 请求桥接到第三方 OpenAI-compatible 接口,以实现 Trae、AI Assistant 中 GPT Codex 使用第三方 API。
6
+
7
+ ## 前置环境
8
+
9
+ - `Git`
10
+ - `Python`
11
+ - `uv`:可选,但推荐。有它时 `npm` 启动器会优先直接运行最新 CLI。
12
+ - `OpenSSL`:执行 `olb enable` 或 `olb start` 时需要它来生成本地证书。Windows 请先安装 [OpenSSL](https://slproweb.com/products/Win32OpenSSL.html),并确认 `openssl.exe` 所在目录已加入 `PATH`。
13
+
14
+ 检查环境:
15
+
16
+ ```bash
17
+ git --version
18
+ python --version
19
+ openssl version
20
+ ```
21
+
22
+ ## 安装
23
+
24
+ 如果想直接使用自带 Python 运行时的独立二进制包,可以从 GitHub Releases 下载对应平台压缩包。此方式不再依赖本机 `Python` 或 `uv`,但执行 `olb enable` / `olb start` 仍然需要 `OpenSSL`。
25
+
26
+ ### 方式 1:`uv`
27
+
28
+ ```bash
29
+ uv tool install openai-local-bridge
30
+ ```
31
+
32
+ ### 方式 2:`pip`
33
+
34
+ ```bash
35
+ python -m pip install --user openai-local-bridge
36
+ ```
37
+
38
+ ### 方式 3:`npm`
39
+
40
+ ```bash
41
+ npm install -g @duanluan/openai-local-bridge
42
+ ```
43
+
44
+ npm 包会在安装阶段从 GitHub Releases 下载当前平台对应的独立二进制文件,所以运行时不再依赖 `Python` 或 `uv`。
45
+
46
+ ### 方式 4:`curl` / PowerShell
47
+
48
+ Linux / macOS:
49
+
50
+ ```bash
51
+ curl -fsSL https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.sh | bash
52
+ ```
53
+
54
+ Windows PowerShell:
55
+
56
+ ```powershell
57
+ irm https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.ps1 | iex
58
+ ```
59
+
60
+ ### 方式 5:独立二进制包
61
+
62
+ 从 GitHub Releases 下载对应平台压缩包,解压后直接运行 `olb`:
63
+
64
+ - `olb-linux-x86_64.tar.gz`
65
+ - `olb-macos-x86_64.tar.gz`
66
+ - `olb-macos-arm64.tar.gz`
67
+ - `olb-windows-x86_64.zip`
68
+
69
+ ## 快速开始
70
+
71
+ 最直接的用法就是:
72
+
73
+ ```bash
74
+ olb start
75
+ ```
76
+
77
+ 后台启动:
78
+
79
+ ```bash
80
+ olb start --background
81
+ ```
82
+
83
+ 如果本机还没有配置,`olb start` 会先进入初始化,再继续执行启用和启动;交互式会采集:
84
+
85
+ - `Base URL`
86
+ - `API Key`
87
+ - `推理强度`
88
+
89
+ 如果你只想单独修改配置,可以执行:
90
+
91
+ ```bash
92
+ olb init
93
+ ```
94
+
95
+ 关闭接管:
96
+
97
+ ```bash
98
+ olb disable
99
+ ```
100
+
101
+ 关闭当前运行中的 bridge:
102
+
103
+ ```bash
104
+ olb stop
105
+ ```
106
+
107
+ ## 常用命令
108
+
109
+ 命令说明:
110
+
111
+ - `olb`:未配置时进入初始化,已配置时显示状态
112
+ - `init`:首次初始化或重新配置
113
+ - `config`:查看当前配置
114
+ - `config-path`:查看配置文件路径
115
+ - `status`:查看当前状态
116
+ - `enable`:安装证书、处理 hosts,并在支持的平台上处理 NSS
117
+ - `disable`:取消 hosts 接管
118
+ - `start`:未初始化时先进入初始化,然后执行 `enable` 并直接启动 bridge
119
+ - `start --background`:以后台模式启动 bridge,并将日志写到配置目录
120
+ - `stop`:停止当前 bridge 进程,包括后台运行中的实例
121
+
122
+ ## 包装脚本入口
123
+
124
+ 如果你是在仓库目录里直接使用,也可以:
125
+
126
+ ### Linux / macOS
127
+
128
+ ```bash
129
+ ./openai-local-bridge.sh <command>
130
+ ```
131
+
132
+ ### Windows PowerShell
133
+
134
+ ```powershell
135
+ .\openai-local-bridge.ps1 <command>
136
+ ```
137
+
138
+ ### Windows BAT
139
+
140
+ ```bat
141
+ openai-local-bridge.bat <command>
142
+ ```
143
+
144
+ 这些入口最终都会转发到同一个 CLI。
145
+
146
+ 如果通过 npm 安装,`olb` 会直接启动当前平台对应的内置二进制文件。当前支持 Linux x64、macOS x64、macOS arm64、Windows x64。
147
+
148
+ ## 发布
149
+
150
+ 发布工作流位于 [release-binaries.yml](./.github/workflows/release-binaries.yml)。
151
+
152
+ 推送发布 tag 前需要:
153
+
154
+ - 在 `pyproject.toml` 和 `package.json` 中写入相同版本号
155
+ - 在 GitHub Secrets 中配置 `PYPI_API_TOKEN` 和 `NPM_TOKEN`
156
+ - 推送类似 `v0.2.2` 的 tag
157
+
158
+ 该工作流会:
159
+
160
+ - 构建 Linux x64、macOS x64、macOS arm64、Windows x64 的独立二进制文件
161
+ - 上传到 GitHub Releases
162
+ - 发布 Python 包到 PyPI
163
+ - 发布 npm 主包;npm 安装阶段会再从 GitHub Releases 下载对应平台的二进制文件
164
+
165
+ 如果 npm 发布失败,但 tag 和 GitHub Release 已经存在,可以在 Actions 页面使用 `Run workflow`,传入已有 tag(例如 `v0.2.3`)单独重试 npm 发布,而不需要再增加版本号。
166
+
167
+ ## 在客户端中使用
168
+
169
+ 以 Trae 为例,建议分两个阶段:
170
+
171
+ ### 阶段 1:先在客户端里完成模型添加
172
+
173
+ 1. 保持本项目未接管,执行:
174
+
175
+ ```bash
176
+ olb disable
177
+ ```
178
+
179
+ 2. 确认当前机器可以正常访问官方 OpenAI。
180
+ 3. 在客户端中添加模型,例如:
181
+ - 服务商:`OpenAI`
182
+ - 模型:`自定义模型`
183
+ - 模型 ID:`gpt-5.4`
184
+ - API Key:官方 OpenAI Key
185
+
186
+ ### 阶段 2:再启用 bridge 接管后续请求
187
+
188
+ ```bash
189
+ olb start
190
+ ```
191
+
192
+ 然后在客户端中选择你刚才添加的模型即可。
193
+
194
+ ## 常见问题
195
+
196
+ ### `olb` 默认会做什么
197
+
198
+ - 如果还没有配置文件:进入初始化
199
+ - 如果已经配置完成:显示当前状态
200
+
201
+ ### `olb status` 可以看什么
202
+
203
+ 通常重点关注这些字段:
204
+
205
+ - `hosts`:是否已接管
206
+ - `root_ca`:根证书是否存在
207
+ - `nss`:NSS 状态
208
+ - `listener`:本地监听是否已启动
209
+ - `listen_addr`:监听地址
210
+ - `config`:配置文件位置
211
+
212
+ ### 其他软件也被影响了
213
+
214
+ 这是当前方案的正常表现,因为接管的是系统级 `api.openai.com`。
215
+
216
+ 立即恢复:
217
+
218
+ ```bash
219
+ olb disable
220
+ ```
221
+
222
+ ### 模型调用失败
223
+
224
+ 优先检查:
225
+
226
+ - `Base URL` 是否正确
227
+ - `API Key` 是否正确
228
+ - 上游是否兼容 OpenAI 接口
229
+ - 你配置的上游模型是否真实存在
230
+
231
+ ### Windows 写 hosts 或导入证书失败
232
+
233
+ 通常是权限不足。请使用有足够权限的终端再执行。
234
+
235
+ ### Windows 提示 `missing command: openssl`
236
+
237
+ 当前实现要求本机已安装 OpenSSL。请先安装 OpenSSL,并确认终端里可以直接执行:
238
+
239
+ ```powershell
240
+ openssl version
241
+ ```
242
+
243
+ ### Linux / macOS 启动失败
244
+
245
+ 如果你使用默认 `443` 端口,系统可能要求更高权限。可直接按提示执行,或改用更高端口。
246
+
247
+ ## 配置文件
248
+
249
+ CLI 会把配置写到用户目录下的配置文件中。
250
+
251
+ 查看路径:
252
+
253
+ ```bash
254
+ olb config-path
255
+ ```
256
+
257
+ 查看当前配置:
258
+
259
+ ```bash
260
+ olb config
261
+ ```
262
+
263
+ ## 安全提示
264
+
265
+ 使用前请注意:
266
+
267
+ - 本项目会在本机安装本地证书
268
+ - 本项目会修改系统 hosts
269
+ - 不使用时建议执行 `olb disable`
package/npm/install.js ADDED
@@ -0,0 +1,96 @@
1
+ #!/usr/bin/env node
2
+
3
+ const fs = require('node:fs');
4
+ const fsp = require('node:fs/promises');
5
+ const https = require('node:https');
6
+ const os = require('node:os');
7
+ const path = require('node:path');
8
+
9
+ const packageRoot = path.resolve(__dirname, '..');
10
+ const packageJson = require('../package.json');
11
+ const platformPackages = require('./platforms.json');
12
+
13
+ function platformKey(platform = process.platform, arch = process.arch) {
14
+ return `${platform}-${arch}`;
15
+ }
16
+
17
+ function packageSpecFor(platform = process.platform, arch = process.arch) {
18
+ const spec = platformPackages[platformKey(platform, arch)];
19
+ if (!spec) {
20
+ throw new Error(`unsupported platform: ${platform}/${arch}`);
21
+ }
22
+ return spec;
23
+ }
24
+
25
+ function downloadUrl(version = packageJson.version, platform = process.platform, arch = process.arch) {
26
+ const spec = packageSpecFor(platform, arch);
27
+ return `https://github.com/duanluan/openai-local-bridge/releases/download/v${version}/${spec.releaseAsset}`;
28
+ }
29
+
30
+ function downloadFile(url, destination) {
31
+ return new Promise((resolve, reject) => {
32
+ const request = https.get(
33
+ url,
34
+ {
35
+ headers: {
36
+ 'User-Agent': 'openai-local-bridge-npm-installer',
37
+ },
38
+ },
39
+ (response) => {
40
+ if (response.statusCode && response.statusCode >= 300 && response.statusCode < 400 && response.headers.location) {
41
+ response.resume();
42
+ downloadFile(response.headers.location, destination).then(resolve, reject);
43
+ return;
44
+ }
45
+
46
+ if (response.statusCode !== 200) {
47
+ response.resume();
48
+ reject(new Error(`download failed: ${url} (${response.statusCode})`));
49
+ return;
50
+ }
51
+
52
+ const file = fs.createWriteStream(destination, { mode: 0o755 });
53
+ response.pipe(file);
54
+ file.on('finish', () => file.close(resolve));
55
+ file.on('error', reject);
56
+ },
57
+ );
58
+ request.on('error', reject);
59
+ });
60
+ }
61
+
62
+ async function installBinary() {
63
+ const spec = packageSpecFor();
64
+ const binaryPath = path.join(__dirname, spec.binaryPath);
65
+ const tempPath = path.join(os.tmpdir(), `${spec.releaseAsset}-${process.pid}`);
66
+
67
+ await fsp.mkdir(path.dirname(binaryPath), { recursive: true });
68
+ await downloadFile(downloadUrl(), tempPath);
69
+ // fs.rename can fail with EXDEV when os.tmpdir() is on a different
+ // filesystem than the package directory, so copy and remove instead.
+ await fsp.copyFile(tempPath, binaryPath);
+ await fsp.rm(tempPath, { force: true });
70
+
71
+ if (process.platform !== 'win32') {
72
+ await fsp.chmod(binaryPath, 0o755);
73
+ }
74
+ }
75
+
76
+ async function main() {
77
+ if (process.env.OLB_SKIP_BINARY_DOWNLOAD === '1') {
78
+ return;
79
+ }
80
+ await installBinary();
81
+ }
82
+
83
+ module.exports = {
84
+ downloadUrl,
85
+ installBinary,
86
+ main,
87
+ packageSpecFor,
88
+ platformKey,
89
+ };
90
+
91
+ if (require.main === module) {
92
+ main().catch((error) => {
93
+ console.error(error.message || String(error));
94
+ process.exit(1);
95
+ });
96
+ }
package/npm/olb.js ADDED
@@ -0,0 +1,63 @@
1
+ #!/usr/bin/env node
2
+
3
+ const { spawnSync } = require('node:child_process');
4
+ const { existsSync } = require('node:fs');
5
+ const path = require('node:path');
6
+
7
+ const packageRoot = path.resolve(__dirname, '..');
8
+ const platformPackages = require('./platforms.json');
9
+
10
+ function platformKey(platform = process.platform, arch = process.arch) {
11
+ return `${platform}-${arch}`;
12
+ }
13
+
14
+ function packageSpecFor(platform = process.platform, arch = process.arch) {
15
+ const spec = platformPackages[platformKey(platform, arch)];
16
+ if (!spec) {
17
+ throw new Error(`unsupported platform: ${platform}/${arch}`);
18
+ }
19
+ return spec;
20
+ }
21
+
22
+ function resolveBinaryPath(options = {}) {
23
+ const override = options.binaryPath || process.env.OLB_BINARY_PATH;
24
+ if (override) {
25
+ return override;
26
+ }
27
+
28
+ const spec = packageSpecFor(options.platform, options.arch);
29
+ const binaryPath = path.join(packageRoot, 'npm', spec.binaryPath);
30
+ if (!existsSync(binaryPath)) {
31
+ throw new Error(`missing binary: ${binaryPath}; rerun npm install @duanluan/openai-local-bridge`);
32
+ }
33
+ return binaryPath;
34
+ }
35
+
36
+ function runBinary(binaryPath, args = process.argv.slice(2)) {
37
+ const result = spawnSync(binaryPath, args, { stdio: 'inherit' });
38
+ if (result.error) {
39
+ throw result.error;
40
+ }
41
+ return result.status ?? 1;
42
+ }
43
+
44
+ function main(args = process.argv.slice(2)) {
45
+ try {
46
+ const binaryPath = resolveBinaryPath();
47
+ process.exit(runBinary(binaryPath, args));
48
+ } catch (error) {
49
+ console.error(error.message || String(error));
50
+ process.exit(1);
51
+ }
52
+ }
53
+
54
+ module.exports = {
55
+ packageSpecFor,
56
+ platformKey,
57
+ resolveBinaryPath,
58
+ runBinary,
59
+ };
60
+
61
+ if (require.main === module) {
62
+ main();
63
+ }
package/npm/platforms.json ADDED
@@ -0,0 +1,26 @@
1
+ {
2
+ "darwin-arm64": {
3
+ "releaseAsset": "olb-macos-arm64",
4
+ "binaryPath": "bin/olb",
5
+ "os": "darwin",
6
+ "cpu": "arm64"
7
+ },
8
+ "darwin-x64": {
9
+ "releaseAsset": "olb-macos-x86_64",
10
+ "binaryPath": "bin/olb",
11
+ "os": "darwin",
12
+ "cpu": "x64"
13
+ },
14
+ "linux-x64": {
15
+ "releaseAsset": "olb-linux-x86_64",
16
+ "binaryPath": "bin/olb",
17
+ "os": "linux",
18
+ "cpu": "x64"
19
+ },
20
+ "win32-x64": {
21
+ "releaseAsset": "olb-windows-x86_64.exe",
22
+ "binaryPath": "bin/olb.exe",
23
+ "os": "win32",
24
+ "cpu": "x64"
25
+ }
26
+ }
package/package.json ADDED
@@ -0,0 +1,38 @@
1
+ {
2
+ "name": "@duanluan/openai-local-bridge",
3
+ "version": "0.2.3",
4
+ "description": "Standalone npm launcher for openai-local-bridge",
5
+ "bin": {
6
+ "olb": "npm/olb.js"
7
+ },
8
+ "scripts": {
9
+ "postinstall": "node npm/install.js"
10
+ },
11
+ "files": [
12
+ "npm/install.js",
13
+ "npm/olb.js",
14
+ "npm/platforms.json",
15
+ "README.md",
16
+ "README_CN.md"
17
+ ],
18
+ "publishConfig": {
19
+ "access": "public"
20
+ },
21
+ "repository": {
22
+ "type": "git",
23
+ "url": "git+https://github.com/duanluan/openai-local-bridge.git"
24
+ },
25
+ "homepage": "https://github.com/duanluan/openai-local-bridge",
26
+ "bugs": {
27
+ "url": "https://github.com/duanluan/openai-local-bridge/issues"
28
+ },
29
+ "keywords": [
30
+ "github",
31
+ "launcher",
32
+ "openai",
33
+ "bridge",
34
+ "cli",
35
+ "binary"
36
+ ],
37
+ "license": "UNLICENSED"
38
+ }