@mmmbuto/zai-codex-bridge 0.1.0 → 0.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +104 -160
  2. package/package.json +1 -1
  3. package/src/server.js +3 -1
package/README.md CHANGED
@@ -1,16 +1,16 @@
- # 🌉 Z.AI Codex Bridge
+ # Z.AI Codex Bridge
 
- > **Local proxy that translates OpenAI Responses API format to Z.AI Chat Completions format**
+ > Local proxy that translates OpenAI Responses API format to Z.AI Chat Completions format
 
- [![npm](https://img.shields.io/npm/v/@mmmbuto/zai-codex-bridge?style=flat-square&logo=npm)](https://www.npmjs.com/package/@mmmbuto/zai-codex-bridge)
- [![node](https://img.shields.io/node/v/@mmmbuto/zai-codex-bridge?style=flat-square&logo=node.js)](https://github.com/mmmbuto/zai-codex-bridge)
+ [![npm](https://img.shields.io/npm/v/@mmmbuto/zai-codex-bridge?style=flat-square&logo=npm)](https://www.npmjs.org/package/@mmmbuto/zai-codex-bridge)
+ [![node](https://img.shields.io/node/v/@mmmbuto/zai-codex-bridge?style=flat-square&logo=node.js)](https://github.com/DioNanos/zai-codex-bridge)
  [![license](https://img.shields.io/npm/l/@mmmbuto/zai-codex-bridge?style=flat-square)](LICENSE)
 
  ---
 
  ## What It Solves
 
- Codex uses the newer OpenAI **Responses API** format (with `instructions` and `input` fields), but Z.AI only supports the legacy **Chat Completions** format (with `messages` array).
+ Codex uses the OpenAI **Responses API** format (with `instructions` and `input` fields), but Z.AI only supports the legacy **Chat Completions** format (with `messages` array).
 
  This proxy:
  1. Accepts Codex requests in **Responses format**
@@ -28,12 +28,11 @@ This proxy:
 
  ## Features
 
- **Transparent translation** between Responses and Chat formats
- **Streaming support** with SSE (Server-Sent Events)
- **Zero dependencies** - uses Node.js built-ins only
- ✅ **Auto-starting ZSH functions** - proxy runs only when Codex runs
- **Health checks** at `/health` endpoint
- ✅ **Configurable** via CLI flags and environment variables
+ - Transparent translation between Responses and Chat formats
+ - Streaming support with SSE (Server-Sent Events)
+ - Zero dependencies - uses Node.js built-ins only
+ - Health checks at `/health` endpoint
+ - Configurable via CLI flags and environment variables
 
  ---
 
@@ -47,88 +46,40 @@ This proxy:
 
  ## Installation
 
- ### Global Install (Recommended)
-
  ```bash
  npm install -g @mmmbuto/zai-codex-bridge
  ```
 
- ### Local Install
+ ---
 
- ```bash
- npm install @mmmbuto/zai-codex-bridge
- npx zai-codex-bridge
- ```
+ ## Quick Start
 
- ### From Source
+ ### 1. Start the Proxy
 
  ```bash
- git clone https://github.com/DioNanos/zai-codex-bridge.git
- cd zai-codex-bridge
- npm install
- npm link
+ zai-codex-bridge
  ```
 
- ---
-
- ## Quick Start
+ The proxy will listen on `http://127.0.0.1:31415`
 
- ### 1. Configure Codex
+ ### 2. Configure Codex
 
  Add to `~/.codex/config.toml`:
 
  ```toml
- [model_providers.zai_glm_proxy]
- name = "ZAI GLM via local proxy"
+ [model_providers.zai_proxy]
+ name = "ZAI via local proxy"
  base_url = "http://127.0.0.1:31415/v1"
  env_key = "OPENAI_API_KEY"
  wire_api = "responses"
  stream_idle_timeout_ms = 3000000
  ```
 
- ### 2. Add ZSH Functions
-
- Add to `~/.zshrc`:
+ ### 3. Use with Codex
 
  ```bash
- _codex_glm_with_proxy () {
- local KEY="$1"; shift
- local HOST="127.0.0.1"
- local PORT="31415"
- local HEALTH="http://${HOST}:${PORT}/health"
- local LOGFILE="${TMPDIR:-/tmp}/zai-codex-bridge.log"
- local PROXY_PID=""
-
- # Start proxy only if not responding
- if ! curl -fsS "$HEALTH" >/dev/null 2>&1; then
- zai-codex-bridge --host "$HOST" --port "$PORT" --log-level info >"$LOGFILE" 2>&1 &
- PROXY_PID=$!
- trap '[[ -n "$PROXY_PID" ]] && kill "$PROXY_PID" 2>/dev/null' EXIT INT TERM
- for i in {1..40}; do
- curl -fsS "$HEALTH" >/dev/null 2>&1 && break
- sleep 0.05
- done
- fi
-
- OPENAI_API_KEY="$KEY" codex -m "GLM-4.7" -c model_provider="zai_glm_proxy" "$@"
- }
-
- codex-glm-a () { _codex_glm_with_proxy "$ZAI_API_KEY_A" "$@"; }
- codex-glm-p () { _codex_glm_with_proxy "$ZAI_API_KEY_P" "$@"; }
- ```
-
- ### 3. Set API Keys
-
- ```bash
- export ZAI_API_KEY_A="sk-your-key-here"
- export ZAI_API_KEY_P="sk-your-key-here"
- ```
-
- ### 4. Use It
-
- ```bash
- source ~/.zshrc
- codex-glm-a
+ export OPENAI_API_KEY="your-zai-api-key"
+ codex -m "GLM-4.7" -c model_provider="zai_proxy"
  ```
 
  ---
@@ -136,7 +87,7 @@ codex-glm-a
  ## CLI Usage
 
  ```bash
- # Start with defaults (http://127.0.0.1:31415)
+ # Start with defaults
  zai-codex-bridge
 
  # Custom port
@@ -163,130 +114,123 @@ export LOG_LEVEL=info
 
  ---
 
- ## API Endpoints
-
- ### `POST /responses`
- Accepts OpenAI Responses API format, translates to Chat, returns Responses format.
+ ## Auto-Starting Proxy with Codex
 
- ### `POST /v1/responses`
- Same as `/responses` (for compatibility with Codex's path structure).
-
- ### `GET /health`
- Health check endpoint.
-
- ---
-
- ## Testing
+ You can create a shell function that starts the proxy automatically when needed:
 
  ```bash
- # Set your Z.AI API key
- export ZAI_API_KEY="sk-your-key"
+ codex-with-zai() {
+ local HOST="127.0.0.1"
+ local PORT="31415"
+ local HEALTH="http://${HOST}:${PORT}/health"
+ local PROXY_PID=""
 
- # Run test suite
- npm run test:curl
- ```
+ # Start proxy only if not responding
+ if ! curl -fsS "$HEALTH" >/dev/null 2>&1; then
+ zai-codex-bridge --host "$HOST" --port "$PORT" >/dev/null 2>&1 &
+ PROXY_PID=$!
+ trap 'kill $PROXY_PID 2>/dev/null' EXIT INT TERM
+ sleep 2
+ fi
 
- ### Manual Test
+ # Run codex
+ codex -m "GLM-4.7" -c model_provider="zai_proxy" "$@"
+ }
+ ```
 
+ Usage:
  ```bash
- # Start proxy
- zai-codex-bridge &
-
- # Test health
- curl http://127.0.0.1:31415/health
-
- # Test Responses API
- curl -X POST http://127.0.0.1:31415/v1/responses \
- -H "Content-Type: application/json" \
- -H "Authorization: Bearer $ZAI_API_KEY" \
- -d '{
- "model": "GLM-4.7",
- "instructions": "Be brief",
- "input": [{"role": "user", "content": "What is 2+2?"}],
- "stream": false
- }'
+ codex-with-zai
+ # Proxy auto-starts, Codex runs
+ # Ctrl+D exits both
  ```
 
  ---
 
- ## Architecture
-
- ```
- Codex (Responses API)
-
- Proxy (localhost:31415)
-
- Translation: Responses → Chat
-
- Z.AI (Chat Completions)
-
- Translation: Chat → Responses
-
- Codex
- ```
-
- ---
+ ## API Endpoints
 
- ## Documentation
+ ### `POST /responses`
+ Accepts OpenAI Responses API format, translates to Chat, returns Responses format.
 
- 📚 **Complete Guide**: [docs/guide.md](docs/guide.md)
+ ### `POST /v1/responses`
+ Same as `/responses` (for compatibility with Codex's path structure).
 
- The guide includes:
- - Detailed setup instructions
- - Usage examples
- - Troubleshooting
- - Function reference
- - Advanced configuration
+ ### `GET /health`
+ Health check endpoint.
 
  ---
 
- ## Troubleshooting
-
- ### Port Already in Use
+ ## Translation Details
 
- ```bash
- lsof -i :31415
- kill -9 <PID>
- ```
+ ### Request: Responses → Chat
 
- ### Connection Refused
+ ```javascript
+ // Input (Responses format)
+ {
+ model: "GLM-4.7",
+ instructions: "Be helpful",
+ input: [
+ { role: "user", content: "Hello" }
+ ],
+ max_output_tokens: 1000
+ }
 
- ```bash
- curl http://127.0.0.1:31415/health
- zai-codex-bridge --log-level debug
+ // Output (Chat format)
+ {
+ model: "GLM-4.7",
+ messages: [
+ { role: "system", content: "Be helpful" },
+ { role: "user", content: "Hello" }
+ ],
+ max_tokens: 1000
+ }
  ```
 
- ### Z.AI Errors
+ ### Response: Chat → Responses
 
- Verify API key and test Z.AI directly:
- ```bash
- curl -X POST https://api.z.ai/api/coding/paas/v4/chat/completions \
- -H "Authorization: Bearer $ZAI_API_KEY" \
- -H "Content-Type: application/json" \
- -d '{"model":"GLM-4.7","messages":[{"role":"user","content":"test"}]}'
+ ```javascript
+ // Input (Chat format)
+ {
+ choices: [{
+ message: { content: "Hi there!" }
+ }],
+ usage: {
+ prompt_tokens: 10,
+ completion_tokens: 5
+ }
+ }
+
+ // Output (Responses format)
+ {
+ output: [{ value: "Hi there!", content_type: "text" }],
+ status: "completed",
+ usage: {
+ input_tokens: 10,
+ output_tokens: 5
+ }
+ }
  ```
 
  ---
 
- ## Development
+ ## Testing
 
  ```bash
- # Clone repo
- git clone https://github.com/DioNanos/zai-codex-bridge.git
- cd zai-codex-bridge
-
- # Run in development
- npm start
+ # Set your Z.AI API key
+ export ZAI_API_KEY="sk-your-key"
 
- # Run tests
+ # Run test suite
  npm run test:curl
-
- # Link globally
- npm link
  ```
 
  ---
 
+ ## Documentation
+
+ Complete usage guide: [docs/guide.md](docs/guide.md)
+
+ ---
+
  ## License
 
  MIT License - Copyright (c) 2026 Davide A. Guglielmi
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@mmmbuto/zai-codex-bridge",
- "version": "0.1.0",
+ "version": "0.1.2",
  "description": "Local proxy that translates OpenAI Responses API format to Z.AI Chat Completions format for Codex",
  "main": "src/server.js",
  "bin": {
package/src/server.js CHANGED
@@ -188,8 +188,10 @@ async function makeUpstreamRequest(path, body, headers) {
  'Authorization': headers['authorization'] || headers['Authorization'] || ''
  };
 
- log('debug', 'Upstream request:', {
+ log('info', 'Upstream request:', {
  url: url.href,
+ path: path,
+ base: ZAI_BASE_URL,
  hasAuth: !!upstreamHeaders.Authorization
  });
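The README changes above describe the proxy's two translation directions (Responses → Chat for requests, Chat → Responses for replies). The mapping can be sketched as a pair of small functions; this is an illustrative sketch based only on the field names in the README's "Translation Details" examples, not the package's actual implementation in `package/src/server.js`:

```javascript
// Request: Responses format -> Chat Completions format.
function responsesToChat(req) {
  const messages = [];
  // The Responses-format `instructions` field becomes a leading system message.
  if (req.instructions) {
    messages.push({ role: 'system', content: req.instructions });
  }
  // Each `input` entry is already shaped like a chat message.
  for (const item of req.input || []) {
    messages.push({ role: item.role, content: item.content });
  }
  return {
    model: req.model,
    messages,
    // Responses uses max_output_tokens; Chat Completions uses max_tokens.
    max_tokens: req.max_output_tokens,
  };
}

// Response: Chat Completions format -> Responses format.
function chatToResponses(res) {
  const text = res.choices?.[0]?.message?.content ?? '';
  return {
    output: [{ value: text, content_type: 'text' }],
    status: 'completed',
    usage: {
      input_tokens: res.usage?.prompt_tokens ?? 0,
      output_tokens: res.usage?.completion_tokens ?? 0,
    },
  };
}

// Round-trip the README's example payloads through both directions.
const chat = responsesToChat({
  model: 'GLM-4.7',
  instructions: 'Be helpful',
  input: [{ role: 'user', content: 'Hello' }],
  max_output_tokens: 1000,
});

const resp = chatToResponses({
  choices: [{ message: { content: 'Hi there!' } }],
  usage: { prompt_tokens: 10, completion_tokens: 5 },
});
```

Note that this sketch ignores streaming (SSE) entirely; the package advertises streaming support, which requires translating each chunk event rather than a single body.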