@openai/codex 0.20.0 → 0.22.0

package/README.md CHANGED
@@ -18,6 +18,9 @@
  - [Quickstart](#quickstart)
  - [Installing and running Codex CLI](#installing-and-running-codex-cli)
  - [Using Codex with your ChatGPT plan](#using-codex-with-your-chatgpt-plan)
+ - [Connecting on a "Headless" Machine](#connecting-on-a-headless-machine)
+ - [Authenticate locally and copy your credentials to the "headless" machine](#authenticate-locally-and-copy-your-credentials-to-the-headless-machine)
+ - [Connecting through VPS or remote](#connecting-through-vps-or-remote)
  - [Usage-based billing alternative: Use an OpenAI API key](#usage-based-billing-alternative-use-an-openai-api-key)
  - [Choosing Codex's level of autonomy](#choosing-codexs-level-of-autonomy)
  - [**1. Read/write**](#1-readwrite)
@@ -98,16 +101,57 @@ Each archive contains a single entry with the platform baked into the name (e.g.
  <img src="./.github/codex-cli-login.png" alt="Codex CLI login" width="50%" />
  </p>

- After you run `codex` select Sign in with ChatGPT. You'll need a Plus, Pro, or Team ChatGPT account, and will get access to our latest models, including `gpt-5`, at no extra cost to your plan. (Enterprise is coming soon.)
+ Run `codex` and select **Sign in with ChatGPT**. You'll need a Plus, Pro, or Team ChatGPT account, and will get access to our latest models, including `gpt-5`, at no extra cost to your plan. (Enterprise is coming soon.)

- > Important: If you've used the Codex CLI before, you'll need to follow these steps to migrate from usage-based billing with your API key:
+ > Important: If you've used the Codex CLI before, follow these steps to migrate from usage-based billing with your API key:
  >
- > 1. Update the CLI with `codex update` and ensure `codex --version` is greater than 0.13
- > 2. Ensure that there is no `OPENAI_API_KEY` environment variable set. (Check that `env | grep 'OPENAI_API_KEY'` returns empty)
+ > 1. Update the CLI and ensure `codex --version` is `0.20.0` or later
+ > 2. Delete `~/.codex/auth.json` (this should be `C:\Users\USERNAME\.codex\auth.json` on Windows)
  > 3. Run `codex login` again

  If you encounter problems with the login flow, please comment on [this issue](https://github.com/openai/codex/issues/1243).

+ ### Connecting on a "Headless" Machine
+
+ Today, the login process entails running a server on `localhost:1455`. If you are on a "headless" server, such as a Docker container or are `ssh`'d into a remote machine, loading `localhost:1455` in the browser on your local machine will not automatically connect to the webserver running on the _headless_ machine, so you must use one of the following workarounds:
+
+ #### Authenticate locally and copy your credentials to the "headless" machine
+
+ The easiest solution is likely to run through the `codex login` process on your local machine such that `localhost:1455` _is_ accessible in your web browser. When you complete the authentication process, an `auth.json` file should be available at `$CODEX_HOME/auth.json` (on Mac/Linux, `$CODEX_HOME` defaults to `~/.codex` whereas on Windows, it defaults to `%USERPROFILE%\.codex`).
+
+ Because the `auth.json` file is not tied to a specific host, once you complete the authentication flow locally, you can copy the `$CODEX_HOME/auth.json` file to the headless machine and then `codex` should "just work" on that machine. Note to copy a file to a Docker container, you can do:
+
+ ```shell
+ # substitute MY_CONTAINER with the name or id of your Docker container:
+ CONTAINER_HOME=$(docker exec MY_CONTAINER printenv HOME)
+ docker exec MY_CONTAINER mkdir -p "$CONTAINER_HOME/.codex"
+ docker cp auth.json MY_CONTAINER:"$CONTAINER_HOME/.codex/auth.json"
+ ```
+
+ whereas if you are `ssh`'d into a remote machine, you likely want to use [`scp`](https://en.wikipedia.org/wiki/Secure_copy_protocol):
+
+ ```shell
+ ssh user@remote 'mkdir -p ~/.codex'
+ scp ~/.codex/auth.json user@remote:~/.codex/auth.json
+ ```
+
+ or try this one-liner:
+
+ ```shell
+ ssh user@remote 'mkdir -p ~/.codex && cat > ~/.codex/auth.json' < ~/.codex/auth.json
+ ```
+
+ #### Connecting through VPS or remote
+
+ If you run Codex on a remote machine (VPS/server) without a local browser, the login helper starts a server on `localhost:1455` on the remote host. To complete login in your local browser, forward that port to your machine before starting the login flow:
+
+ ```bash
+ # From your local machine
+ ssh -L 1455:localhost:1455 <user>@<remote-host>
+ ```
+
+ Then, in that SSH session, run `codex` and select "Sign in with ChatGPT". When prompted, open the printed URL (it will be `http://localhost:1455/...`) in your local browser. The traffic will be tunneled to the remote server.
+
  ### Usage-based billing alternative: Use an OpenAI API key

  If you prefer to pay-as-you-go, you can still authenticate with your OpenAI API key by setting it as an environment variable:
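The copy-credentials workflow above can be sanity-checked on the headless machine with a short script that resolves the same path the section describes (`$CODEX_HOME/auth.json`, with `$CODEX_HOME` defaulting to `~/.codex`). This is a sketch based on the documented defaults, not a command shipped with the CLI:

```shell
# Resolve the credentials file the way the README describes:
# $CODEX_HOME/auth.json, with $CODEX_HOME defaulting to ~/.codex.
AUTH_FILE="${CODEX_HOME:-$HOME/.codex}/auth.json"
if [ -f "$AUTH_FILE" ]; then
  echo "found: $AUTH_FILE"
else
  echo "missing: $AUTH_FILE"
fi
```

If this prints `missing`, re-check that the file was copied into the directory Codex actually reads (including a custom `CODEX_HOME`, if you set one).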
@@ -522,9 +566,13 @@ We're excited to launch a **$1 million initiative** supporting open source proje

  ## Contributing

- This project is under active development and the code will likely change pretty significantly. We'll update this message once that's complete!
+ This project is under active development and the code will likely change pretty significantly.
+
+ **At the moment, we only plan to prioritize reviewing external contributions for bugs or security fixes.**
+
+ If you want to add a new feature or change the behavior of an existing one, please open an issue proposing the feature and get approval from an OpenAI team member before spending time building it.

- More broadly we welcome contributions - whether you are opening your very first pull request or you're a seasoned maintainer. At the same time we care about reliability and long-term maintainability, so the bar for merging code is intentionally **high**. The guidelines below spell out what "high-quality" means in practice and should make the whole process transparent and friendly.
+ **New contributions that don't go through this process may be closed** if they aren't aligned with our current roadmap or conflict with other priorities/upcoming features.

  ### Development workflow

@@ -549,8 +597,9 @@ More broadly we welcome contributions - whether you are opening your very first
  ### Review process

  1. One maintainer will be assigned as a primary reviewer.
- 2. We may ask for changes - please do not take this personally. We value the work, we just also value consistency and long-term maintainability.
- 3. When there is consensus that the PR meets the bar, a maintainer will squash-and-merge.
+ 2. If your PR adds a new feature that was not previously discussed and approved, we may choose to close your PR (see [Contributing](#contributing)).
+ 3. We may ask for changes - please do not take this personally. We value the work, but we also value consistency and long-term maintainability.
+ 4. When there is consensus that the PR meets the bar, a maintainer will squash-and-merge.

  ### Community values

Binary files CHANGED
package/bin/codex.js CHANGED
@@ -43,7 +43,7 @@ switch (platform) {
      targetTriple = "x86_64-pc-windows-msvc.exe";
      break;
    case "arm64":
-     // We do not build this today, fall through...
+     // We do not build this today, fall through...
    default:
      break;
  }
@@ -65,9 +65,43 @@ const binaryPath = path.join(__dirname, "..", "bin", `codex-${targetTriple}`);
  // receives a fatal signal, both processes exit in a predictable manner.
  const { spawn } = await import("child_process");

+ async function tryImport(moduleName) {
+   try {
+     // eslint-disable-next-line node/no-unsupported-features/es-syntax
+     return await import(moduleName);
+   } catch (err) {
+     return null;
+   }
+ }
+
+ async function resolveRgDir() {
+   const ripgrep = await tryImport("@vscode/ripgrep");
+   if (!ripgrep?.rgPath) {
+     return null;
+   }
+   return path.dirname(ripgrep.rgPath);
+ }
+
+ function getUpdatedPath(newDirs) {
+   const pathSep = process.platform === "win32" ? ";" : ":";
+   const existingPath = process.env.PATH || "";
+   const updatedPath = [
+     ...newDirs,
+     ...existingPath.split(pathSep).filter(Boolean),
+   ].join(pathSep);
+   return updatedPath;
+ }
+
+ const additionalDirs = [];
+ const rgDir = await resolveRgDir();
+ if (rgDir) {
+   additionalDirs.push(rgDir);
+ }
+ const updatedPath = getUpdatedPath(additionalDirs);
+
  const child = spawn(binaryPath, process.argv.slice(2), {
    stdio: "inherit",
-   env: { ...process.env, CODEX_MANAGED_BY_NPM: "1" },
+   env: { ...process.env, PATH: updatedPath, CODEX_MANAGED_BY_NPM: "1" },
  });

  child.on("error", (err) => {
@@ -120,4 +154,3 @@ if (childResult.type === "signal") {
  } else {
    process.exit(childResult.exitCode);
  }
-
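The launcher change above makes `bin/codex.js` prepend the directory of the bundled `@vscode/ripgrep` binary to `PATH` before spawning the native Codex binary. The PATH-merging helper can be sketched standalone as below; unlike the shipped version, this sketch takes the existing `PATH` as a parameter (for testability), and the directory in the example is hypothetical:

```javascript
// Standalone sketch of the getUpdatedPath helper from bin/codex.js:
// prepend new directories to an existing PATH string, using the
// platform-appropriate separator, and drop empty segments.
function getUpdatedPath(newDirs, existingPath = process.env.PATH || "") {
  const pathSep = process.platform === "win32" ? ";" : ":";
  return [...newDirs, ...existingPath.split(pathSep).filter(Boolean)].join(
    pathSep,
  );
}

// Hypothetical ripgrep directory, for illustration only:
console.log(getUpdatedPath(["/opt/rg/bin"], "/usr/bin:/bin"));
// → "/opt/rg/bin:/usr/bin:/bin" (on non-Windows platforms)
```

Prepending (rather than appending) means the bundled `rg` wins over any system-installed copy for the spawned process only; the parent shell's `PATH` is untouched.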
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@openai/codex",
-   "version": "0.20.0",
+   "version": "0.22.0",
    "license": "Apache-2.0",
    "bin": {
      "codex": "bin/codex.js"
@@ -16,5 +16,11 @@
    "repository": {
      "type": "git",
      "url": "git+https://github.com/openai/codex.git"
+   },
+   "dependencies": {
+     "@vscode/ripgrep": "^1.15.14"
+   },
+   "devDependencies": {
+     "prettier": "^3.3.3"
    }
  }