@mmmbuto/zai-codex-bridge 0.3.2 → 0.4.2

package/CHANGELOG.md ADDED
@@ -0,0 +1,39 @@
+ # Changelog
+
+ All notable changes to this project will be documented in this file.
+
+ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
+ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
+
+ ## [0.4.2] - 2026-01-16
+
+ ### Changed
+ - Replaced the README with expanded setup, usage, and troubleshooting guidance
+ - Clarified Codex provider configuration and proxy endpoint usage
+
+ ## [0.4.1] - 2026-01-16
+
+ ### Added
+ - Tool calling support (MCP/function calls) when `ALLOW_TOOLS=1`
+ - Bridging for `function_call_output` items to Chat `role: tool` messages
+ - Streaming support for `delta.tool_calls` with proper Responses API events
+ - Non-streaming support for `msg.tool_calls` in the final response
+ - Tool call events: `response.output_item.added` (function_call), `response.function_call_arguments.delta`, `response.function_call_arguments.done`
+ - Automated tool call test in test suite
+
+ ### Changed
+ - `translateResponsesToChat()` now handles `type: function_call_output` items
+ - `streamChatToResponses()` now detects and emits tool call events
+ - `translateChatToResponses()` now includes `function_call` items in the output array
+
+ ### Fixed
+ - Tool responses (from MCP/function calls) are now correctly forwarded upstream as `role: tool` messages
+ - Function call items are now properly included in the `response.completed` output array
+
+ ## [0.4.0] - Previous
+
+ ### Added
+ - Initial release with Responses API to Chat Completions translation
+ - Streaming support with SSE
+ - Health check endpoint
+ - Zero-dependency implementation
package/README.md CHANGED
@@ -1,6 +1,6 @@
- # Z.AI Codex Bridge
+ # ZAI Codex Bridge
 
- > Local proxy that translates OpenAI Responses API format to Z.AI Chat Completions format
+ > Local proxy that translates the OpenAI **Responses API** to Z.AI **Chat Completions** for Codex CLI
 
  [![npm](https://img.shields.io/npm/v/@mmmbuto/zai-codex-bridge?style=flat-square&logo=npm)](https://www.npmjs.org/package/@mmmbuto/zai-codex-bridge)
  [![node](https://img.shields.io/node/v/@mmmbuto/zai-codex-bridge?style=flat-square&logo=node.js)](https://github.com/DioNanos/zai-codex-bridge)
@@ -10,36 +10,39 @@
 
  ## What It Solves
 
- Codex uses the OpenAI **Responses API** format (with `instructions` and `input` fields), but Z.AI only supports the legacy **Chat Completions** format (with `messages` array).
+ Newer **Codex CLI** versions speak the OpenAI **Responses API** (e.g. `/v1/responses`, with `instructions` + `input` + event-stream semantics).
+ Some gateways/providers (including Z.AI endpoints) only expose the legacy **Chat Completions** API (`messages[]`).
 
  This proxy:
- 1. Accepts Codex requests in **Responses format**
- 2. Translates them to **Chat format**
+ 1. Accepts Codex requests in **Responses** format
+ 2. Translates them to **Chat Completions**
  3. Forwards to Z.AI
- 4. Translates the response back to **Responses format**
+ 4. Translates back to **Responses** format (stream + non-stream)
  5. Returns to Codex
 
- **Without this proxy**, Codex fails with error from Z.AI:
+ **Without this proxy**, Codex may fail with an upstream error such as:
 
  ```json
  {"error":{"code":"1214","message":"Incorrect role information"}}
  ```
 
+ > If you’re using **codex-termux** with a gateway that doesn’t fully match the Responses API, this proxy is the recommended compatibility layer.
+
  ---
 
  ## Features
 
- - Transparent translation between Responses and Chat formats
+ - Responses API ↔ Chat Completions translation (request + response)
  - Streaming support with SSE (Server-Sent Events)
- - Zero dependencies - uses Node.js built-ins only
- - Health checks at `/health` endpoint
- - Configurable via CLI flags and environment variables
+ - Health check endpoint (`/health`)
+ - Works on Linux/macOS/Windows (WSL) + Termux (ARM64)
+ - **Optional tool/MCP bridging** (see “Tools / MCP” below)
+ - Zero/low dependencies (Node built-ins only, unless noted in `package.json`)
 
  ---
 
  ## Requirements
 
- - **Node.js**: 18.0.0 or higher (for native `fetch`)
- - **Platform**: Linux, macOS, Windows (WSL), Termux (ARM64)
+ - **Node.js**: 18+ (native `fetch`)
  - **Port**: 31415 (default, configurable)
 
  ---
@@ -54,28 +57,34 @@ npm install -g @mmmbuto/zai-codex-bridge
 
  ## Quick Start
 
- ### 1. Start the Proxy
+ ### 1) Start the Proxy
 
  ```bash
  zai-codex-bridge
  ```
 
- The proxy will listen on `http://127.0.0.1:31415`
+ Default listen address:
+
+ - `http://127.0.0.1:31415`
 
- ### 2. Configure Codex
+ ### 2) Configure Codex
 
- Add to `~/.codex/config.toml`:
+ Add this provider to `~/.codex/config.toml`:
 
  ```toml
  [model_providers.zai_proxy]
  name = "ZAI via local proxy"
- base_url = "http://127.0.0.1:31415/v1"
+ base_url = "http://127.0.0.1:31415"
  env_key = "OPENAI_API_KEY"
  wire_api = "responses"
  stream_idle_timeout_ms = 3000000
  ```
 
- ### 3. Use with Codex
+ > Notes:
+ > - `base_url` is the server root. Codex will call `/v1/responses`; this proxy supports that path.
+ > - We keep `env_key = "OPENAI_API_KEY"` because Codex expects that key name. You can store your Z.AI key there.
+
+ ### 3) Run Codex via the Proxy
 
  ```bash
  export OPENAI_API_KEY="your-zai-api-key"
@@ -84,6 +93,29 @@ codex -m "GLM-4.7" -c model_provider="zai_proxy"
 
  ---
 
+ ## Tools / MCP (optional)
+
+ Codex tool-calling / MCP memory requires an additional compatibility layer:
+ - Codex uses **Responses API tool events** (function-call items, arguments delta/done events, plus `function_call_output` inputs)
+ - Some upstream models/providers may not emit tool calls (or may emit them in a different shape)
+
+ This proxy can **attempt** to bridge tools when enabled:
+
+ ```bash
+ export ALLOW_TOOLS=1
+ ```
+
+ Important:
+ - Tool support is **provider/model dependent**. If upstream never emits tool calls, the proxy can’t invent them.
+ - If tools are enabled, the proxy must translate:
+   - Responses `tools` + `tool_choice` → Chat `tools` + `tool_choice`
+   - Chat `tool_calls` (stream and non-stream) → Responses function-call events
+   - Responses `function_call_output` → Chat `role: tool` messages
+
+ (See the repo changelog and docs for the exact implemented behavior.)
+
+ ---
+
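To illustrate the last bridging direction above, here is a minimal sketch (not the proxy's actual source; the helper name `toolOutputToChatMessage` is invented for this example, while the field names follow the public Responses and Chat Completions schemas):

```javascript
// Sketch: bridge a Responses "function_call_output" input item into a
// Chat Completions tool message. Illustrative only, not the proxy's code.
function toolOutputToChatMessage(item) {
  return {
    role: 'tool',
    // Chat Completions ties a tool result back to its call via tool_call_id.
    tool_call_id: item.call_id,
    // Output may already be a string; otherwise serialize it.
    content: typeof item.output === 'string' ? item.output : JSON.stringify(item.output),
  };
}

// Example:
const msg = toolOutputToChatMessage({
  type: 'function_call_output',
  call_id: 'call_1',
  output: '42',
});
// msg => { role: 'tool', tool_call_id: 'call_1', content: '42' }
```

The reverse direction (Chat `tool_calls` to Responses function-call events) is stateful in streaming mode and is not shown here.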
 
  ## CLI Usage
 
  ```bash
  zai-codex-bridge
@@ -97,7 +129,7 @@ zai-codex-bridge --port 8080
  zai-codex-bridge --log-level debug
 
  # Custom Z.AI endpoint
- zai-codex-bridge --zai-base-url https://custom.z.ai/v1
+ zai-codex-bridge --zai-base-url https://api.z.ai/api/coding/paas/v4
 
  # Show help
  zai-codex-bridge --help
@@ -106,17 +138,20 @@ zai-codex-bridge --help
  ### Environment Variables
 
  ```bash
- export PORT=31415
  export HOST=127.0.0.1
+ export PORT=31415
  export ZAI_BASE_URL=https://api.z.ai/api/coding/paas/v4
  export LOG_LEVEL=info
+
+ # Optional
+ export ALLOW_TOOLS=1
  ```
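A sketch of how a server could resolve these variables with the defaults this README documents (the `configFromEnv` helper is illustrative, not the proxy's actual code):

```javascript
// Sketch: resolve configuration from the environment, with the defaults
// documented in this README. Illustrative only.
function configFromEnv(env) {
  return {
    host: env.HOST || '127.0.0.1',
    port: Number(env.PORT || 31415),
    zaiBaseUrl: env.ZAI_BASE_URL || 'https://api.z.ai/api/coding/paas/v4',
    logLevel: env.LOG_LEVEL || 'info',
    // ALLOW_TOOLS is a flag: only the string "1" enables tool bridging.
    allowTools: env.ALLOW_TOOLS === '1',
  };
}
```

In the real server, CLI flags such as `--port` would take precedence over these values; that layering is omitted here.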
 
  ---
 
- ## Auto-Starting Proxy with Codex
+ ## Auto-start the Proxy with Codex (recommended)
 
- You can create a shell function that starts the proxy automatically when needed:
+ Use a shell function that starts the proxy only when needed:
 
  ```bash
  codex-with-zai() {
@@ -125,114 +160,107 @@ codex-with-zai() {
    local HEALTH="http://${HOST}:${PORT}/health"
    local PROXY_PID=""
 
-   # Start proxy only if not responding
    if ! curl -fsS "$HEALTH" >/dev/null 2>&1; then
      zai-codex-bridge --host "$HOST" --port "$PORT" >/dev/null 2>&1 &
      PROXY_PID=$!
      trap 'kill $PROXY_PID 2>/dev/null' EXIT INT TERM
-     sleep 2
+     sleep 1
    fi
 
-   # Run codex
-   codex -m "GLM-4.7" -c model_provider="zai_proxy" "$@"
+   codex -c model_provider="zai_proxy" "$@"
  }
  ```
 
  Usage:
+
  ```bash
- codex-with-zai
- # Proxy auto-starts, Codex runs
- # Ctrl+D exits both
+ export OPENAI_API_KEY="your-zai-api-key"
+ codex-with-zai -m "GLM-4.7"
  ```
 
  ---
 
  ## API Endpoints
 
- ### `POST /responses`
- Accepts OpenAI Responses API format, translates to Chat, returns Responses format.
-
- ### `POST /v1/responses`
- Same as `/responses` (for compatibility with Codex's path structure).
-
- ### `GET /health`
- Health check endpoint.
+ - `POST /responses` — accepts Responses API requests
+ - `POST /v1/responses` — same as above (Codex default path)
+ - `GET /health` — health check
 
  ---
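The endpoint list above implies a small routing table. Sketched here for illustration (assumed shape; the real server's internals may differ):

```javascript
// Sketch: dispatch implied by the endpoint list above. Handler names are
// illustrative; null maps to a 404 response.
function routeOf(method, url) {
  if (method === 'GET' && url === '/health') return 'health';
  if (method === 'POST' && (url === '/responses' || url === '/v1/responses')) return 'responses';
  return null;
}
```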
 
- ## Translation Details
+ ## Translation Overview
 
  ### Request: Responses → Chat
 
- ```javascript
- // Input (Responses format)
+ ```js
+ // Input (Responses)
  {
- model: "GLM-4.7",
- instructions: "Be helpful",
- input: [
- { role: "user", content: "Hello" }
- ],
- max_output_tokens: 1000
+ "model": "GLM-4.7",
+ "instructions": "Be helpful",
+ "input": [{ "role": "user", "content": "Hello" }],
+ "max_output_tokens": 1000
  }
 
- // Output (Chat format)
+ // Output (Chat)
  {
- model: "GLM-4.7",
- messages: [
- { role: "system", content: "Be helpful" },
- { role: "user", content: "Hello" }
+ "model": "GLM-4.7",
+ "messages": [
+ { "role": "system", "content": "Be helpful" },
+ { "role": "user", "content": "Hello" }
  ],
- max_tokens: 1000
+ "max_tokens": 1000
  }
  ```
 
- ### Response: Chat → Responses
+ ### Response: Chat → Responses (simplified)
 
- ```javascript
- // Input (Chat format)
+ ```js
+ // Input (Chat)
  {
- choices: [{
- message: { content: "Hi there!" }
- }],
- usage: {
- prompt_tokens: 10,
- completion_tokens: 5
- }
+ "choices": [{ "message": { "content": "Hi there!" } }],
+ "usage": { "prompt_tokens": 10, "completion_tokens": 5 }
  }
 
- // Output (Responses format)
+ // Output (Responses - simplified)
  {
- output: [{ value: "Hi there!", content_type: "text" }],
- status: "completed",
- usage: {
- input_tokens: 10,
- output_tokens: 5
- }
+ "status": "completed",
+ "output": [{ "type": "message", "content": [{ "type": "output_text", "text": "Hi there!" }] }],
+ "usage": { "input_tokens": 10, "output_tokens": 5 }
  }
  ```
 
  ---
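The two mappings shown above can be condensed into a pair of pure functions. This is a simplified sketch restricted to the fields documented in this README, not the proxy's actual implementation (which also handles streaming and tool calls):

```javascript
// Sketch: Responses request -> Chat Completions request.
function responsesToChat(req) {
  const messages = [];
  // "instructions" becomes a leading system message.
  if (req.instructions) messages.push({ role: 'system', content: req.instructions });
  // Plain role/content input items pass through as messages.
  for (const item of req.input || []) {
    if (item.role && item.content !== undefined) {
      messages.push({ role: item.role, content: item.content });
    }
  }
  const out = { model: req.model, messages };
  if (req.max_output_tokens !== undefined) out.max_tokens = req.max_output_tokens;
  return out;
}

// Sketch: Chat Completions response -> Responses response.
function chatToResponses(resp) {
  const text = resp.choices?.[0]?.message?.content ?? '';
  return {
    status: 'completed',
    output: [{ type: 'message', content: [{ type: 'output_text', text }] }],
    usage: {
      input_tokens: resp.usage?.prompt_tokens ?? 0,
      output_tokens: resp.usage?.completion_tokens ?? 0,
    },
  };
}
```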
 
- ## Testing
+ ## Troubleshooting
 
- ```bash
- # Set your Z.AI API key
- export ZAI_API_KEY="sk-your-key"
+ ### 401 / “token expired or incorrect”
+ - Verify the key is exported as `OPENAI_API_KEY` (or matches `env_key` in config.toml).
+ - Make sure the proxy is not overwriting Authorization headers.
 
- # Run test suite
- npm run test:curl
- ```
+ ### 404 on `/v1/responses`
+ - Ensure `base_url` points to the proxy root (example: `http://127.0.0.1:31415`).
+ - Confirm the proxy is running and `/health` returns `ok`.
+
+ ### 502 Bad Gateway
+ - The proxy reached upstream but upstream failed. Enable debug logging:
+ ```bash
+ LOG_LEVEL=debug zai-codex-bridge
+ ```
 
  ---
 
- ## Documentation
+ ## Versioning Policy
 
- Complete usage guide: [docs/guide.md](docs/guide.md)
+ This repo follows **small, safe patch increments** while stabilizing provider compatibility:
+
+ - Patch bumps only: `0.4.0 → 0.4.1 → 0.4.2 → ...`
+ - No larger jumps unless strictly necessary.
+
+ (See `CHANGELOG.md` for details.)
 
  ---
 
  ## License
 
- MIT License - Copyright (c) 2026 Davide A. Guglielmi
-
+ MIT License - Copyright (c) 2026 Davide A. Guglielmi
  See [LICENSE](LICENSE) for details.
package/RELEASING.md ADDED
@@ -0,0 +1,80 @@
+ # Releasing
+
+ This document describes the release process for zai-codex-bridge.
+
+ ## Version Policy
+
+ - **Patch releases only** (0.4.0 → 0.4.1 → 0.4.2, etc.)
+ - No minor or major bumps without explicit discussion
+ - Always increment by +0.0.1 from the current version
+
+ ## Release Steps
+
+ ### 1. Run Tests
+
+ ```bash
+ # Set your API key
+ export ZAI_API_KEY="sk-your-key"
+
+ # Run test suite
+ npm run test:curl
+ # or
+ npm test
+ ```
+
+ ### 2. Bump Version
+
+ ```bash
+ # Use the release script (recommended)
+ npm run release:patch
+
+ # Or manually edit package.json and change:
+ # "version": "0.4.0" -> "version": "0.4.1"
+ ```
+
+ ### 3. Update CHANGELOG.md
+
+ Add an entry for the new version following the [Keep a Changelog](https://keepachangelog.com/en/1.0.0/) format.
+
+ ### 4. Commit
+
+ ```bash
+ git add package.json CHANGELOG.md
+ git commit -m "chore: release v0.4.1"
+ ```
+
+ ### 5. Tag
+
+ ```bash
+ git tag v0.4.1
+ ```
+
+ ### 6. Push (Optional)
+
+ ```bash
+ git push
+ git push --tags
+ ```
+
+ ### 7. Publish to npm
+
+ ```bash
+ npm publish
+ ```
+
+ ## release:patch Script
+
+ The `npm run release:patch` script:
+
+ 1. Verifies the current version is 0.4.x
+ 2. Bumps the patch version by +0.0.1
+ 3. Refuses to bump minor/major versions
+ 4. Updates package.json in place
+
+ Example:
+
+ ```bash
+ $ npm run release:patch
+ Current version: 0.4.0
+ Bumping to: 0.4.1
+ Updated package.json
+ ```
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@mmmbuto/zai-codex-bridge",
- "version": "0.3.2",
+ "version": "0.4.2",
  "description": "Local proxy that translates OpenAI Responses API format to Z.AI Chat Completions format for Codex",
  "main": "src/server.js",
  "bin": {
@@ -8,7 +8,9 @@
  },
  "scripts": {
  "start": "node src/server.js",
- "test:curl": "node scripts/test-curl.js"
+ "test": "node scripts/test-curl.js",
+ "test:curl": "node scripts/test-curl.js",
+ "release:patch": "node scripts/release-patch.js"
  },
  "keywords": [
  "codex",
package/scripts/release-patch.js ADDED
@@ -0,0 +1,60 @@
+ #!/usr/bin/env node
+
+ /**
+  * Safe patch version bumper.
+  * Only allows patch releases (0.4.0 -> 0.4.1);
+  * refuses minor/major bumps.
+  */
+
+ const fs = require('fs');
+ const path = require('path');
+
+ const PACKAGE_PATH = path.join(__dirname, '..', 'package.json');
+
+ function bumpPatch(version) {
+   const parts = version.split('.').map(Number);
+
+   if (parts.length !== 3) {
+     throw new Error(`Invalid version format: ${version}`);
+   }
+
+   const [major, minor, patch] = parts;
+
+   // Only allow 0.4.x versions
+   if (major !== 0 || minor !== 4) {
+     console.error(`ERROR: Current version is ${version}`);
+     console.error('This script only supports patch releases for 0.4.x versions.');
+     console.error('For other version changes, edit package.json manually.');
+     process.exit(1);
+   }
+
+   return `0.4.${patch + 1}`;
+ }
+
+ function main() {
+   // Read package.json
+   const pkg = JSON.parse(fs.readFileSync(PACKAGE_PATH, 'utf8'));
+   const currentVersion = pkg.version;
+
+   console.log(`Current version: ${currentVersion}`);
+
+   // Bump patch
+   const newVersion = bumpPatch(currentVersion);
+   console.log(`Bumping to: ${newVersion}`);
+
+   // Update package.json and write back
+   pkg.version = newVersion;
+   fs.writeFileSync(PACKAGE_PATH, JSON.stringify(pkg, null, 2) + '\n');
+
+   console.log('Updated package.json');
+   console.log('\nNext steps:');
+   console.log(` 1. Update CHANGELOG.md`);
+   console.log(` 2. Commit: git add package.json CHANGELOG.md && git commit -m "chore: release v${newVersion}"`);
+   console.log(` 3. Tag: git tag v${newVersion}`);
+   console.log(` 4. Publish: npm publish`);
+ }
+
+ main();
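The guard logic in `bumpPatch()` can be exercised in isolation. Below is a standalone mirror that throws instead of calling `process.exit`, so the accepted and rejected inputs are easy to check (illustrative only, not part of the released script):

```javascript
// Sketch: standalone mirror of the bumpPatch guard above, throwing on
// rejected versions instead of exiting the process.
function bumpPatchStrict(version) {
  const parts = version.split('.').map(Number);
  if (parts.length !== 3 || parts.some(Number.isNaN)) {
    throw new Error(`Invalid version format: ${version}`);
  }
  const [major, minor, patch] = parts;
  // Mirror the 0.4.x-only policy enforced by the release script.
  if (major !== 0 || minor !== 4) {
    throw new Error(`Only 0.4.x patch releases are supported (got ${version})`);
  }
  return `0.4.${patch + 1}`;
}

// bumpPatchStrict('0.4.1') -> '0.4.2'
// bumpPatchStrict('0.5.0') -> throws
```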