@samooth/open-codex 0.1.47 → 0.1.49
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +475 -55
- package/dist/cli.js +130 -124
- package/dist/cli.js.map +3 -3
- package/package.json +1 -1
package/README.md
CHANGED

<h1 align="center">Open Codex CLI</h1>
<p align="center">Lightweight coding agent that runs in your terminal</p>

<p align="center"><code>npm i -g @samooth/open-codex</code></p>

> **Important Note**: This is a fork of the [original OpenAI Codex CLI](https://github.com/openai/codex) with expanded model support and changed installation instructions. The main differences in this fork are:
>
> - Support for multiple AI providers (OpenAI, Gemini, OpenRouter, Ollama, xAI, DeepSeek, Hugging Face)
> - Uses the [Chat Completions API instead of the Responses API](https://platform.openai.com/docs/guides/responses-vs-chat-completions), which lets us support any OpenAI-compatible provider and model.
> - All other functionality remains similar to the original project.
> - You can install this fork globally with `npm i -g @samooth/open-codex`.

---

<details>
<summary><strong>Table of Contents</strong></summary>

- [Experimental Technology Disclaimer](#experimental-technology-disclaimer)
- [Quickstart](#quickstart)
- [Why Codex?](#whycodex)
- [Security Model & Permissions](#securitymodelpermissions)
  - [Platform sandboxing details](#platform-sandboxing-details)
- [System Requirements](#systemrequirements)
- [CLI Reference](#clireference)
- [Memory & Project Docs](#memoryprojectdocs)
- [Non‑interactive / CI mode](#noninteractivecimode)
- [Recipes](#recipes)
- [Installation](#installation)
- [Configuration](#configuration)
- [FAQ](#faq)
- [Contributing](#contributing)
  - [Development workflow](#development-workflow)
  - [Writing high‑impact code changes](#writing-highimpact-code-changes)
  - [Opening a pull request](#opening-a-pull-request)
  - [Review process](#review-process)
  - [Community values](#community-values)
  - [Getting help](#getting-help)
  - [Releasing `codex`](#releasing-codex)
- [Security & Responsible AI](#securityresponsibleai)
- [License](#license)
- [Zero Data Retention (ZDR) Organization Limitation](#zero-data-retention-zdr-organization-limitation)

</details>

---

## Experimental Technology Disclaimer

Codex CLI is an experimental project under active development. It is not yet stable: it may contain bugs or incomplete features, and it may undergo breaking changes. We're building it in the open with the community and welcome:

- Bug reports
- Feature requests
- Pull requests
- Good vibes

Help us improve by filing issues or submitting PRs (see the section below for how to contribute)!

## Quickstart

Install globally:

```shell
npm install -g @samooth/open-codex
```

Next, set your API key as an environment variable (shown here with OpenAI, but other providers are supported):

```shell
export OPENAI_API_KEY="your-api-key-here"
```

> **Note:** This command sets the key only for your current terminal session. To make it permanent, add the `export` line to your shell's configuration file (e.g., `~/.zshrc`).
>
> **Tip:** You can also place your API key into a `.env` file at the root of your project:
>
> ```env
> OPENAI_API_KEY=your-api-key-here
> ```
>
> The CLI will automatically load variables from `.env` (via `dotenv/config`).
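
If you want the key to persist across sessions, one minimal way (assuming zsh; use `~/.bashrc` or your shell's equivalent instead) is:

```shell
# Append the export to your shell profile and reload it
echo 'export OPENAI_API_KEY="your-api-key-here"' >> ~/.zshrc
source ~/.zshrc
```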

Run interactively:

```shell
open-codex
```

Or, run with a prompt as input (and optionally in `Full Auto` mode):

```shell
open-codex "explain this codebase to me"
```

```shell
open-codex --approval-mode full-auto "create the fanciest todo-list app"
```

That's it – Codex will scaffold a file, run it inside a sandbox, install any
missing dependencies, and show you the live result. Approve the changes and
they'll be committed to your working directory.

---

## Why Codex?

Codex CLI is built for developers who already **live in the terminal** and want
ChatGPT‑level reasoning **plus** the power to actually run code, manipulate
files, and iterate – all under version control. In short, it's _chat‑driven
development_ that understands and executes your repo.

- **Zero setup** — bring your API key and it just works!
- **Multiple AI providers** — use OpenAI, Gemini, OpenRouter, Ollama, xAI, DeepSeek, or Hugging Face!
- **High Performance** — parallel tool execution and asynchronous file indexing for speed ✨
- **Syntax Highlighting** — full terminal color support for code diffs and file contents 🎨
- **Full auto-approval, while safe + secure** by running network-disabled and directory-sandboxed
- **Multimodal** — pass in screenshots or diagrams to implement features ✨
- **Dry Run mode** — preview all changes without actually modifying files or running commands!
- **Interactive Config** — toggle settings like dry-run and debug mode in-session with `/config` ⚙️
- **Loop Protection** — automatic detection and prevention of repetitive failing tool calls 🔄

And it's **fully open-source** so you can see and contribute to how it develops!

---

## Security Model & Permissions

Codex lets you decide _how much autonomy_ the agent receives and which auto-approval policy applies via the
`--approval-mode` flag (or the interactive onboarding prompt):

| Mode                      | What the agent may do without asking            | Still requires approval                                         |
| ------------------------- | ----------------------------------------------- | --------------------------------------------------------------- |
| **Suggest** <br>(default) | • Read any file in the repo                     | • **All** file writes/patches <br>• **All** shell/Bash commands |
| **Auto Edit**             | • Read **and** apply‑patch writes to files      | • **All** shell/Bash commands                                   |
| **Full Auto**             | • Read/write files <br>• Execute shell commands | –                                                               |

In **Full Auto** every command is run **network‑disabled** and confined to the
current working directory (plus temporary files) for defense‑in‑depth. Codex
will also show a warning/confirmation if you start in **auto‑edit** or
**full‑auto** while the directory is _not_ tracked by Git, so you always have a
safety net.
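
As a rough illustration of how the modes differ in practice (the prompts below are placeholders):

```shell
# Suggest (default): Codex proposes edits and commands, you approve each one
open-codex --approval-mode suggest "add input validation to the signup form"

# Auto Edit: file patches are applied automatically, shell commands still ask
open-codex --approval-mode auto-edit "rename utils/helpers.ts to utils/string-helpers.ts"

# Full Auto: edits and commands run unattended, network-disabled and sandboxed
open-codex --approval-mode full-auto "get the test suite passing again"
```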

### Dry Run Mode

If you're unsure about what the agent might do, you can use the `--dry-run` flag. In this mode, Codex will simulate all operations (file writes, shell commands, etc.) and show you exactly what it *would* have done without actually touching your filesystem or executing any code.

```shell
open-codex --dry-run "Refactor all components to TypeScript"
```

### Platform sandboxing details

The hardening mechanism Codex uses depends on your OS:

- **macOS 12+** – commands are wrapped with **Apple Seatbelt** (`sandbox-exec`).

  - Everything is placed in a read‑only jail except for a small set of
    writable roots (`$PWD`, `$TMPDIR`, `~/.codex`, etc.).
  - Outbound network is _fully blocked_ by default – even if a child process
    tries to `curl` somewhere it will fail.

- **Linux** – there is no sandboxing by default.
  We recommend using Docker for sandboxing, where Codex launches itself inside a **minimal
  container image** and mounts your repo _read/write_ at the same path. A
  custom `iptables`/`ipset` firewall script denies all egress except the
  OpenAI API. This gives you deterministic, reproducible runs without needing
  root on the host. You can use the [`run_in_container.sh`](./codex-cli/scripts/run_in_container.sh) script to set up the sandbox.
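
A quick way to see the network restriction for yourself – on macOS, or inside the Linux container, where egress is blocked – is to ask for an outbound request in a full-auto run and watch it fail (illustrative prompt only):

```shell
# Inside the sandbox, outbound traffic is blocked, so this should report a failure
open-codex --approval-mode full-auto "run 'curl -I https://example.com' and tell me what happens"
```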

---

## System Requirements

| Requirement                 | Details                                                         |
| --------------------------- | --------------------------------------------------------------- |
| Operating systems           | macOS 12+, Ubuntu 20.04+/Debian 10+, or Windows 11 **via WSL2** |
| Node.js                     | **22 or newer** (LTS recommended)                               |
| Git (optional, recommended) | 2.23+ for built‑in PR helpers                                   |
| RAM                         | 4‑GB minimum (8‑GB recommended)                                 |

> Never run `sudo npm install -g`; fix npm permissions instead.
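
One common way to do that (a sketch of the standard npm approach, not specific to this package) is to give npm a per-user global prefix:

```shell
# Install global packages under your home directory so sudo is never needed
mkdir -p ~/.npm-global
npm config set prefix "$HOME/.npm-global"
export PATH="$HOME/.npm-global/bin:$PATH"   # add this line to your shell profile too
npm install -g @samooth/open-codex
```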

---

## CLI Reference

| Command                                   | Purpose                             | Example                                        |
| ----------------------------------------- | ----------------------------------- | ---------------------------------------------- |
| `open-codex`                              | Interactive REPL                    | `open-codex`                                   |
| `open-codex "…"`                          | Initial prompt for interactive REPL | `open-codex "fix lint errors"`                 |
| `open-codex -q "…"`                       | Non‑interactive "quiet mode"        | `open-codex -q --json "explain utils.ts"`      |
| `open-codex completion <bash\|zsh\|fish>` | Print shell completion script       | `open-codex completion bash`                   |

Inside the chat, use slash commands like `/help`, `/model`, `/approval`, `/config`, `/history`, and `/clear`.

Key flags:

- `--provider / -p`: AI provider to use.
- `--model / -m`: Model to use for completions.
- `--approval-mode / -a`: Override the approval policy.
- `--dry-run`: Preview changes without applying them.
- `--quiet / -q`: Non-interactive mode.
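
These flags compose; for example (the prompt and file path are placeholders):

```shell
# Pick a provider and model explicitly, and preview the run without touching files
open-codex --provider openai --model o4-mini --dry-run "add JSDoc comments to src/utils/date.ts"
```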

---

## Memory & Project Docs

Codex merges Markdown instructions in this order:

1. `~/.codex/instructions.md` – personal global guidance
2. `codex.md` at repo root – shared project notes
3. `codex.md` in cwd – sub‑package specifics
4. `.codex/memory.md` – persistent project-specific facts learned by the agent

Disable with `--no-project-doc` or `CODEX_DISABLE_PROJECT_DOC=1`.
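
For instance, a minimal `codex.md` at the repo root might capture project conventions, and either switch above skips project docs for a single run (the note text and prompts are just examples):

```shell
# A tiny shared project doc...
cat > codex.md <<'EOF'
# Project notes for Codex
- Use pnpm, not npm, for all package operations
- Tests live under tests/, mirroring src/
EOF

# ...and two ways to run without project docs
open-codex --no-project-doc "upgrade the test runner"
CODEX_DISABLE_PROJECT_DOC=1 open-codex "upgrade the test runner"
```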

---

## Non‑interactive / CI mode

Run Codex head‑less in pipelines. Example GitHub Action step:

```yaml
- name: Update changelog via Codex
  run: |
    npm install -g @samooth/open-codex
    export OPENAI_API_KEY="${{ secrets.OPENAI_KEY }}"
    open-codex -a auto-edit --quiet "update CHANGELOG for next release"
```

Set `CODEX_QUIET_MODE=1` to silence interactive UI noise.
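
Outside of GitHub Actions, the same pattern works in any script; a sketch (the prompt and output file are illustrative, and `--json` is the flag shown in the CLI Reference above):

```shell
# Quiet, machine-friendly run whose output is captured for later processing
CODEX_QUIET_MODE=1 open-codex -q --json "summarize the changes in this branch" > codex-output.json
```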

## Tracing / Verbose Logging

Setting the environment variable `DEBUG=true` prints full API request and response details:

```shell
DEBUG=true open-codex
```
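
The trace can be long, so you may want to capture it to a file for later inspection (a plain shell idiom, nothing Codex-specific):

```shell
# Keep the full request/response trace while still seeing it on screen
DEBUG=true open-codex -q "explain utils/date.ts" 2>&1 | tee codex-debug.log
```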

---

## Recipes

Below are a few bite‑size examples you can copy‑paste. Replace the text in quotes with your own task. See the [prompting guide](https://github.com/openai/codex/blob/main/codex-cli/examples/prompting_guide.md) for more tips and usage patterns.

| ✨  | What you type                                                                         | What happens                                                               |
| --- | ------------------------------------------------------------------------------------- | --------------------------------------------------------------------------- |
| 1   | `open-codex "Refactor the Dashboard component to React Hooks"`                         | Codex rewrites the class component, runs `npm test`, and shows the diff.   |
| 2   | `open-codex "Generate SQL migrations for adding a users table"`                        | Infers your ORM, creates migration files, and runs them in a sandboxed DB. |
| 3   | `open-codex "Write unit tests for utils/date.ts"`                                      | Generates tests, executes them, and iterates until they pass.              |
| 4   | `open-codex "Bulk‑rename *.jpeg → *.jpg with git mv"`                                  | Safely renames files and updates imports/usages.                           |
| 5   | `open-codex "Explain what this regex does: ^(?=.*[A-Z]).{8,}$"`                        | Outputs a step‑by‑step human explanation.                                  |
| 6   | `open-codex "Carefully review this repo, and propose 3 high impact well-scoped PRs"`   | Suggests impactful PRs in the current codebase.                            |
| 7   | `open-codex "Look for vulnerabilities and create a security review report"`            | Finds and explains security bugs.                                          |

---

## Installation

<details open>
<summary><strong>From npm (Recommended)</strong></summary>

```bash
npm install -g @samooth/open-codex
# or
yarn global add @samooth/open-codex
```

</details>

<details>
<summary><strong>Build from source</strong></summary>

```bash
# Clone the repository and navigate to the CLI package
git clone https://github.com/ymichael/open-codex.git
cd open-codex/codex-cli

# Install dependencies and build
npm install
npm run build

# Get the usage and the options
node ./dist/cli.js --help

# Run the locally‑built CLI directly
node ./dist/cli.js

# Or link the command globally for convenience
npm link
```

</details>

---

## Configuration

Codex looks for config files in **`~/.codex/`** (either YAML or JSON format). The configuration is validated using Zod to ensure correctness.

```json
// ~/.codex/config.json
{
  "model": "o4-mini", // Default model
  "provider": "openai", // Default provider
  "approvalMode": "suggest", // or auto-edit, full-auto
  "fullAutoErrorMode": "ask-user", // or ignore-and-continue
  "memory": {
    "enabled": true
  }
}
```
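
If you prefer to bootstrap the file from the command line, here is a minimal sketch using only the keys shown above:

```shell
# Create ~/.codex and write a minimal config.json
mkdir -p ~/.codex
cat > ~/.codex/config.json <<'EOF'
{
  "model": "o4-mini",
  "provider": "openai",
  "approvalMode": "suggest"
}
EOF
```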

You can also define custom instructions:

```md
# ~/.codex/instructions.md

- Always respond with emojis
- Only use git commands if I explicitly mention you should
```

### Alternative AI Providers

This fork of Codex supports multiple AI providers:

- openai (default)
- gemini
- openrouter
- ollama
- xai
- deepseek
- hf (Hugging Face)

To use a different provider, set the `provider` key in your config file:

```json
{
  "provider": "gemini"
}
```

Or use the `--provider` flag, e.g. `open-codex --provider gemini`.
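
The flag pairs naturally with `--model` for providers where you must pick a model yourself, such as ollama (the model name below is only an illustration):

```shell
# Use a locally served Ollama model for this run only
open-codex --provider ollama --model llama3 "document the public functions in src/"
```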

#### Dynamic Model Discovery

For many providers, you can use the `/models` command within the interactive chat to see a list of available models and switch between them. For the **Hugging Face** provider, this dynamically fetches the latest `tool-use` compatible models directly from the Hugging Face Hub.

Here's a list of all the providers and their default models:

| Provider   | Environment Variable Required | Default Agentic Model | Default Full Context Model |
| ---------- | ----------------------------- | --------------------- | -------------------------- |
| openai     | OPENAI_API_KEY                | o4-mini               | o3                         |
| gemini     | GOOGLE_GENERATIVE_AI_API_KEY  | gemini-3-pro-preview  | gemini-2.5-pro             |
| openrouter | OPENROUTER_API_KEY            | openai/o4-mini        | openai/o3                  |
| ollama     | Not required                  | User must specify     | User must specify          |
| xai        | XAI_API_KEY                   | grok-3-mini-beta      | grok-3-beta                |
| deepseek   | DS_API_KEY                    | deepseek-chat         | deepseek-reasoner          |
| hf         | HF_API_KEY                    | moonshotai/Kimi-K2.5  | moonshotai/Kimi-K2.5       |

When using an alternative provider, make sure you have the correct environment variable set:

```bash
export GOOGLE_GENERATIVE_AI_API_KEY="your-gemini-api-key-here"
```
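
The other providers follow the same pattern, using the variable names from the table above:

```bash
export OPENROUTER_API_KEY="your-openrouter-api-key-here"
export XAI_API_KEY="your-xai-api-key-here"
export DS_API_KEY="your-deepseek-api-key-here"
export HF_API_KEY="your-huggingface-api-key-here"
```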

---

## FAQ

<details>
<summary>What's the difference between this and the original OpenAI Codex CLI?</summary>

This is a fork of the original OpenAI Codex CLI project with expanded support for multiple AI providers beyond just OpenAI. The installation package is also different (`@samooth/open-codex` instead of `@openai/codex`), but the core functionality remains similar.

</details>

<details>
<summary>How do I stop Codex from touching my repo?</summary>

Codex always runs in a **sandbox first**. If a proposed command or file change looks suspicious, you can simply answer **n** when prompted and nothing happens to your working tree. For extra safety, use the `--dry-run` flag.

</details>

<details>
<summary>Does it work on Windows?</summary>

Not directly. It requires [Windows Subsystem for Linux (WSL2)](https://learn.microsoft.com/en-us/windows/wsl/install) – Codex has been tested on macOS and Linux with Node ≥ 22.

</details>

<details>
<summary>Which models are supported?</summary>

The default is `o4-mini`, but pass `--model gpt-4o` or set `model: gpt-4o` in your config file to override.

You can also use models from other providers like Gemini, DeepSeek, and Hugging Face. See the [Configuration](#configuration) section for more details.

</details>

---

## Zero Data Retention (ZDR) Organization Limitation

> **Note:** Codex CLI does **not** currently support OpenAI organizations with [Zero Data Retention (ZDR)](https://platform.openai.com/docs/guides/your-data#zero-data-retention) enabled.

If your OpenAI organization has Zero Data Retention enabled, you may encounter errors such as:

```
OpenAI rejected the request. Error details: Status: 400, Code: unsupported_parameter, Type: invalid_request_error, Message: 400 Previous response cannot be used for this organization due to Zero Data Retention.
```

**Why?**

- Codex CLI relies on the Responses API with `store:true` to enable internal reasoning steps.
- As noted in the [docs](https://platform.openai.com/docs/guides/your-data#responses-api), the Responses API requires a 30-day retention period by default, or when the store parameter is set to true.
- ZDR organizations cannot use `store:true`, so requests will fail.

**What can I do?**

- If you are part of a ZDR organization, Codex CLI will not work until support is added.
- We are tracking this limitation and will update the documentation if support becomes available.

---

## Contributing

This project is under active development and the code will likely change pretty significantly. We'll update this message once that's complete!

More broadly we welcome contributions – whether you are opening your very first pull request or you're a seasoned maintainer. At the same time we care about reliability and long‑term maintainability, so the bar for merging code is intentionally **high**. The guidelines below spell out what "high‑quality" means in practice and should make the whole process transparent and friendly.

### Development workflow

- Create a _topic branch_ from `main` – e.g. `feat/interactive-prompt`.
- Keep your changes focused. Multiple unrelated fixes should be opened as separate PRs.
- Use `npm run test:watch` during development for super‑fast feedback.
- We use **Vitest** for unit tests, **ESLint** + **Prettier** for style, and **TypeScript** for type‑checking.
- Before pushing, run the full test/type/lint suite:

  ```bash
  npm test && npm run lint && npm run typecheck
  ```

  ```bash
  # Watch mode (tests rerun on change)
  npm run test:watch

  # Type‑check without emitting files
  npm run typecheck

  # Automatically fix lint + prettier issues
  npm run lint:fix
  npm run format:fix
  ```

### Writing high‑impact code changes

1. **Start with an issue.** Open a new one or comment on an existing discussion so we can agree on the solution before code is written.
2. **Add or update tests.** Every new feature or bug‑fix should come with test coverage that fails before your change and passes afterwards. 100 % coverage is not required, but aim for meaningful assertions.
3. **Document behaviour.** If your change affects user‑facing behaviour, update the README, inline help (`codex --help`), or relevant example projects.
4. **Keep commits atomic.** Each commit should compile and the tests should pass. This makes reviews and potential rollbacks easier.

### Opening a pull request

- Fill in the PR template (or include similar information) – **What? Why? How?**
- Run **all** checks locally (`npm test && npm run lint && npm run typecheck`). CI failures that could have been caught locally slow down the process.
- Make sure your branch is up‑to‑date with `main` and that you have resolved merge conflicts.
- Mark the PR as **Ready for review** only when you believe it is in a merge‑able state.

### Review process

1. One maintainer will be assigned as a primary reviewer.
2. We may ask for changes – please do not take this personally. We value the work, we just also value consistency and long‑term maintainability.
3. When there is consensus that the PR meets the bar, a maintainer will squash‑and‑merge.

### Community values

- **Be kind and inclusive.** Treat others with respect; we follow the [Contributor Covenant](https://www.contributor-covenant.org/).
- **Assume good intent.** Written communication is hard – err on the side of generosity.
- **Teach & learn.** If you spot something confusing, open an issue or PR with improvements.

### Getting help

If you run into problems setting up the project, would like feedback on an idea, or just want to say _hi_ – please open a Discussion or jump into the relevant issue. We are happy to help.

Together we can make Codex CLI an incredible tool. **Happy hacking!** :rocket:

### Releasing `codex`

To publish a new version of the CLI, run the release scripts defined in `codex-cli/package.json`:

1. Open the `codex-cli` directory
2. Make sure you're on a branch like `git checkout -b bump-version`
3. Bump the version and `CLI_VERSION` to current datetime: `npm run release:version`
4. Commit the version bump (with DCO sign-off):

   ```bash
   git add codex-cli/src/utils/session.ts codex-cli/package.json
   git commit -s -m "chore(release): codex-cli v$(node -p \"require('./codex-cli/package.json').version\")"
   ```

5. Copy README, build, and publish to npm: `npm run release`
6. Push to branch: `git push origin HEAD`

---

## Security & Responsible AI

Have you discovered a vulnerability or have concerns about model output? Please e‑mail **security@openai.com** and we will respond promptly.

---

## License

This repository is licensed under the [Apache-2.0 License](LICENSE).

Original project: [OpenAI Codex CLI](https://github.com/openai/codex)