codex-claude-proxy 1.0.0

# Using with OpenClaw

[OpenClaw](https://docs.openclaw.ai/) (formerly ClawdBot/Moltbot) is an AI agent gateway that connects to messaging apps like Telegram, WhatsApp, Discord, Slack, and iMessage. You can configure it to use this proxy for GPT-5 Codex models.

## What is OpenClaw?

OpenClaw acts as a bridge between messaging platforms and AI models:

```
┌──────────────┐     ┌─────────────┐     ┌─────────────────────┐     ┌──────────────┐
│   Telegram   │────▶│             │     │  Codex-Claude-Proxy │     │   ChatGPT    │
│   WhatsApp   │     │  OpenClaw   │────▶│   (Anthropic API)   │────▶│  Codex API   │
│   Discord    │     │  Gateway    │     │                     │     │              │
│   Slack      │     │             │     │                     │     │              │
│   iMessage   │     │             │     │                     │     │              │
└──────────────┘     └─────────────┘     └─────────────────────┘     └──────────────┘
```

OpenClaw expects providers to expose an Anthropic-compatible or OpenAI-compatible API, which this proxy provides.
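
As an illustration of what travels over that bridge, here is a minimal sketch of an Anthropic Messages-format request body such as OpenClaw would send to the proxy (the model id and prompt are placeholders):

```python
import json

# A minimal sketch of the Anthropic Messages-format payload sent to the
# proxy's POST /v1/messages endpoint. Field names follow the Anthropic
# Messages API; the model id and prompt here are illustrative placeholders.
payload = {
    "model": "gpt-5.2-codex",
    "max_tokens": 1024,
    "system": "You are a helpful coding assistant.",
    "messages": [
        {"role": "user", "content": "Write a hello-world in Python."}
    ],
    "stream": True,
}

body = json.dumps(payload)
print(body)
```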

## Prerequisites

- OpenClaw installed:
  ```bash
  npm install -g openclaw@latest
  ```
- Codex-Claude-Proxy running on port 8081 (or your configured port)
- At least one ChatGPT account linked to the proxy

## Configure OpenClaw

Edit your OpenClaw config file:
- **macOS/Linux**: `~/.openclaw/openclaw.json`
- **Windows**: `%USERPROFILE%\.openclaw\openclaw.json`

### Basic Configuration

```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "codex-proxy": {
        "baseUrl": "http://127.0.0.1:8081",
        "apiKey": "test",
        "api": "anthropic-messages",
        "models": [
          {
            "id": "gpt-5.3-codex",
            "name": "GPT-5.3 Codex",
            "reasoning": true,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 200000,
            "maxTokens": 128000
          },
          {
            "id": "gpt-5.2-codex",
            "name": "GPT-5.2 Codex",
            "reasoning": true,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 200000,
            "maxTokens": 128000
          },
          {
            "id": "gpt-5.2",
            "name": "GPT-5.2",
            "reasoning": false,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 128000,
            "maxTokens": 16384
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "codex-proxy/gpt-5.2-codex",
        "fallbacks": ["codex-proxy/gpt-5.2"]
      },
      "models": {
        "codex-proxy/gpt-5.2-codex": {}
      }
    }
  }
}
```
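
Before restarting the gateway, it can help to sanity-check the provider entry. A quick illustrative check (not an official OpenClaw tool) that verifies the fields this guide relies on:

```python
import json

# Parse a trimmed-down provider entry and verify the fields this guide uses.
# This is an illustrative sanity check, not part of OpenClaw or the proxy.
config = json.loads("""
{
  "models": {
    "mode": "merge",
    "providers": {
      "codex-proxy": {
        "baseUrl": "http://127.0.0.1:8081",
        "apiKey": "test",
        "api": "anthropic-messages",
        "models": [{"id": "gpt-5.2-codex", "name": "GPT-5.2 Codex"}]
      }
    }
  }
}
""")

provider = config["models"]["providers"]["codex-proxy"]
for key in ("baseUrl", "apiKey", "api", "models"):
    assert key in provider, f"missing required field: {key}"
assert provider["api"] == "anthropic-messages"
print("provider looks valid:", [m["id"] for m in provider["models"]])
```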

### Using Claude Model Names (Auto-Mapped)

The proxy automatically maps Claude model names to Codex equivalents. You can use familiar names:

```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "codex-proxy": {
        "baseUrl": "http://127.0.0.1:8081",
        "apiKey": "test",
        "api": "anthropic-messages",
        "models": [
          {
            "id": "claude-opus-4-5",
            "name": "Claude Opus 4.5 (GPT-5.3 Codex)",
            "reasoning": true,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 200000,
            "maxTokens": 128000
          },
          {
            "id": "claude-sonnet-4-5",
            "name": "Claude Sonnet 4.5 (GPT-5.2 Codex)",
            "reasoning": true,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 200000,
            "maxTokens": 128000
          },
          {
            "id": "claude-haiku-4",
            "name": "Claude Haiku 4 (GPT-5.2)",
            "reasoning": false,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 128000,
            "maxTokens": 16384
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "codex-proxy/claude-sonnet-4-5",
        "fallbacks": ["codex-proxy/claude-haiku-4"]
      }
    }
  }
}
```

> **Note**: The `reasoning` field indicates whether the model supports extended thinking/reasoning. Set it to `true` for the Codex models, which excel at complex reasoning tasks.

## Model Reference

| Proxy Model | Mapped From | Best For |
|-------------|-------------|----------|
| `gpt-5.3-codex` | `claude-opus-4-5` | Most capable, complex coding tasks |
| `gpt-5.2-codex` | `claude-sonnet-4-5` | Balanced performance, recommended default |
| `gpt-5.2` | `claude-haiku-4` | Fast responses, simpler tasks |
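
The table above can be mirrored as a small lookup (illustrative only; the proxy's actual mapping logic lives in its source):

```python
# Illustrative lookup for the Claude -> Codex mapping shown in the table.
# Not the proxy's real implementation; it just mirrors the documented table.
CLAUDE_TO_CODEX = {
    "claude-opus-4-5": "gpt-5.3-codex",
    "claude-sonnet-4-5": "gpt-5.2-codex",
    "claude-haiku-4": "gpt-5.2",
}

def resolve_model(requested: str) -> str:
    """Return the Codex model for a Claude name; pass other names through."""
    return CLAUDE_TO_CODEX.get(requested, requested)

print(resolve_model("claude-sonnet-4-5"))  # gpt-5.2-codex
print(resolve_model("gpt-5.3-codex"))      # passed through unchanged
```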

## Start Both Services

```bash
# Terminal 1: Start the proxy
cd codex-claude-proxy
npm start

# Terminal 2: Start OpenClaw gateway
openclaw gateway
```

## Verify Configuration

```bash
# Check available models
openclaw models list

# Check gateway status
openclaw status
```

You should see models prefixed with `codex-proxy/` in the list.

## Switch Models

To change the default model:

```bash
openclaw models set codex-proxy/gpt-5.3-codex
```

Or edit the `model.primary` field in your config file.

## API Compatibility

The proxy exposes an **Anthropic Messages API** compatible interface:

| Endpoint | Description |
|----------|-------------|
| `POST /v1/messages` | Main chat endpoint |
| `GET /v1/models` | List available models |
| `POST /v1/messages/count_tokens` | Token counting |

OpenClaw's `api: "anthropic-messages"` setting tells it to use the Anthropic format, which this proxy fully supports, including:
- Streaming responses (SSE)
- Tool/function calling
- Multi-turn conversations
- Image inputs
- System prompts
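
As a sketch of what the SSE stream looks like on the wire, here is a parser for a hand-written sample of an Anthropic-style event stream (the sample events are illustrative, not captured from the proxy):

```python
import json

# Hand-written sample of an Anthropic-style SSE stream for illustration.
sample_stream = """\
event: content_block_delta
data: {"type": "content_block_delta", "delta": {"type": "text_delta", "text": "Hello"}}

event: content_block_delta
data: {"type": "content_block_delta", "delta": {"type": "text_delta", "text": ", world"}}

event: message_stop
data: {"type": "message_stop"}
"""

def extract_text(stream: str) -> str:
    """Collect text deltas from the `data:` lines of an SSE stream."""
    chunks = []
    for line in stream.splitlines():
        if line.startswith("data: "):
            event = json.loads(line[len("data: "):])
            if event.get("type") == "content_block_delta":
                chunks.append(event["delta"]["text"])
    return "".join(chunks)

print(extract_text(sample_stream))  # Hello, world
```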

## Multi-Account Support

The proxy supports multiple ChatGPT accounts with automatic switching:

```bash
# Add accounts via Web UI at http://localhost:8081
# Or via CLI:
npm run accounts:add

# List accounts
curl http://localhost:8081/accounts
```

When one account hits rate limits, the proxy can automatically switch to another. This provides seamless continuity for OpenClaw users.
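
The switching behavior can be pictured as a simple rotation (an illustrative sketch, not the proxy's actual implementation):

```python
from itertools import cycle

class AccountRotator:
    """Illustrative round-robin switcher: on a rate-limit error, move to
    the next linked account. Not the proxy's real implementation."""

    def __init__(self, accounts):
        self._cycle = cycle(accounts)
        self.current = next(self._cycle)

    def on_rate_limit(self):
        # Advance to the next account and make it current.
        self.current = next(self._cycle)
        return self.current

rotator = AccountRotator(["account-a", "account-b", "account-c"])
print(rotator.current)          # account-a
print(rotator.on_rate_limit())  # account-b
print(rotator.on_rate_limit())  # account-c
print(rotator.on_rate_limit())  # wraps back to account-a
```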

## Advanced Configuration

### Custom Port

If running the proxy on a different port:

```json
{
  "models": {
    "providers": {
      "codex-proxy": {
        "baseUrl": "http://127.0.0.1:3000",
        ...
      }
    }
  }
}
```

### VPS/Remote Server

When running on a VPS, bind the proxy to localhost only:

```bash
HOST=127.0.0.1 PORT=8081 npm start
```

Then use SSH port forwarding on your local machine:

```bash
ssh -L 8081:127.0.0.1:8081 user@your-vps
```

Configure OpenClaw to use the local tunnel:

```json
{
  "baseUrl": "http://127.0.0.1:8081"
}
```

### Authentication

To protect the proxy with an API key:

```bash
API_KEY=your-secret-key npm start
```

Update OpenClaw config:

```json
{
  "providers": {
    "codex-proxy": {
      "baseUrl": "http://127.0.0.1:8081",
      "apiKey": "your-secret-key",
      ...
    }
  }
}
```

## Troubleshooting

### Connection Refused

Ensure the proxy is running:
```bash
curl http://127.0.0.1:8081/health
```
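
A programmatic equivalent of that check (a sketch; it only tests whether the TCP port accepts connections, not whether the proxy is healthy):

```python
import socket

def proxy_reachable(host: str = "127.0.0.1", port: int = 8081,
                    timeout: float = 1.0) -> bool:
    """Return True if something is listening on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # connection refused, timeout, host unreachable, ...
        return False

print(proxy_reachable())
```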

### Models Not Showing

1. Verify the config file is valid JSON
2. Check that `mode` is set to `"merge"`
3. Restart OpenClaw after config changes:
   ```bash
   openclaw gateway restart
   ```

### Rate Limiting

If you see rate limit errors:
1. Add more ChatGPT accounts to the proxy
2. Let the proxy auto-switch between accounts
3. Check the `/accounts` endpoint for account status

### Use 127.0.0.1 Not localhost

Always use `127.0.0.1` instead of `localhost` in `baseUrl`:
- Avoids DNS resolution issues
- Explicitly stays on the loopback interface
- Prevents accidental exposure on a VPS
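
The difference is easy to demonstrate: `localhost` goes through name resolution (and may yield IPv6 `::1` first on some systems), while `127.0.0.1` is a literal loopback address. A small check (results for `localhost` depend on the machine's hosts file):

```python
import socket

# `localhost` is resolved via the hosts file / resolver and may yield
# IPv6 (::1) and/or IPv4 entries, in an order that varies per machine.
localhost_addrs = {info[4][0] for info in socket.getaddrinfo("localhost", 8081)}
print("localhost resolves to:", localhost_addrs)

# `127.0.0.1` is a literal address: no lookup, always the IPv4 loopback.
literal_addrs = {info[4][0] for info in socket.getaddrinfo("127.0.0.1", 8081)}
print("127.0.0.1 resolves to:", literal_addrs)
```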

## Comparison: Codex Proxy vs Antigravity Proxy

| Feature | Codex-Claude-Proxy | Antigravity Proxy |
|---------|-------------------|-------------------|
| Backend API | ChatGPT Codex API | Google Cloud Code API |
| Models | GPT-5.2/5.3 Codex | Gemini 3, Claude (via Google) |
| Auth | ChatGPT OAuth | Google OAuth |
| Account Source | ChatGPT accounts | Google accounts |
| Rate Limits | ChatGPT limits | Google Cloud quotas |
| Best For | Codex-native access | Google Cloud users |

Both proxies expose the same Anthropic-compatible API, so OpenClaw configuration is nearly identical.

## Further Reading

- [OpenClaw Documentation](https://docs.openclaw.ai/)
- [OpenClaw Configuration Reference](https://docs.openclaw.ai/gateway/configuration)
- [Proxy API Reference](./API.md)
- [Account Management](./ACCOUNTS.md)
- [OAuth Setup](./OAUTH.md)
package/package.json ADDED
{
  "name": "codex-claude-proxy",
  "version": "1.0.0",
  "description": "Multi-account proxy server for OpenAI Codex CLI with Claude API compatibility",
  "type": "module",
  "main": "src/index.js",
  "bin": {
    "codex-claude-proxy": "./src/cli/index.js",
    "codex-proxy-accounts": "./src/cli/accounts.js"
  },
  "files": [
    "src",
    "public",
    "docs",
    "images",
    "README.md",
    "LICENSE",
    "package.json"
  ],
  "engines": {
    "node": ">=18"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/Ayush-Kotlin-Dev/codex-claude-proxy.git"
  },
  "bugs": {
    "url": "https://github.com/Ayush-Kotlin-Dev/codex-claude-proxy/issues"
  },
  "homepage": "https://github.com/Ayush-Kotlin-Dev/codex-claude-proxy#readme",
  "scripts": {
    "start": "node src/index.js",
    "accounts": "node src/cli/accounts.js",
    "accounts:add": "node src/cli/accounts.js add",
    "accounts:add:headless": "node src/cli/accounts.js add --no-browser",
    "accounts:list": "node src/cli/accounts.js list"
  },
  "dependencies": {
    "express": "^4.18.2",
    "cors": "^2.8.5"
  },
  "keywords": ["codex", "openai", "proxy", "claude", "chatgpt", "oauth"],
  "license": "MIT"
}