@openai/codex 0.12.0 → 0.13.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -18,6 +18,7 @@ This is the home of the **Codex CLI**, which is a coding agent from OpenAI that
  - [Quickstart](#quickstart)
  - [OpenAI API Users](#openai-api-users)
  - [OpenAI Plus/Pro Users](#openai-pluspro-users)
+ - [Using Open Source Models](#using-open-source-models)
  - [Why Codex?](#why-codex)
  - [Security model & permissions](#security-model--permissions)
  - [Platform sandboxing details](#platform-sandboxing-details)
@@ -186,6 +187,41 @@ they'll be committed to your working directory.

  ---

+ ## Using Open Source Models
+
+ Codex can run fully locally against an OpenAI-compatible OSS host (such as Ollama) using the `--oss` flag:
+
+ - Interactive UI:
+   - `codex --oss`
+ - Non-interactive (programmatic) mode:
+   - `echo "Refactor utils" | codex exec --oss`
+
+ Model selection when using `--oss`:
+
+ - If you omit `-m`/`--model`, Codex defaults to `-m gpt-oss:20b` and verifies that the model exists locally, downloading it if needed.
+ - To pick a different size, pass one of:
+   - `-m "gpt-oss:20b"`
+   - `-m "gpt-oss:120b"`
+
+ Point Codex at your own OSS host:
+
+ - By default, `--oss` talks to `http://localhost:11434/v1`.
+ - To use a different host, set one of these environment variables before running Codex:
+   - `CODEX_OSS_BASE_URL`, for example:
+     - `CODEX_OSS_BASE_URL="http://my-ollama.example.com:11434/v1" codex --oss -m gpt-oss:20b`
+   - or `CODEX_OSS_PORT` (when the host is localhost):
+     - `CODEX_OSS_PORT=11434 codex --oss`
+
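The precedence between the default URL and the two environment variables described above can be sketched as a small shell function. This is an illustration of the documented behavior, not Codex's actual implementation, and `resolve_oss_url` is a hypothetical helper name:

```shell
# Sketch of the documented precedence (not Codex's actual code):
# CODEX_OSS_BASE_URL wins outright; otherwise CODEX_OSS_PORT adjusts
# the localhost default; otherwise http://localhost:11434/v1 is used.
resolve_oss_url() {
  if [ -n "${CODEX_OSS_BASE_URL:-}" ]; then
    echo "$CODEX_OSS_BASE_URL"
  else
    echo "http://localhost:${CODEX_OSS_PORT:-11434}/v1"
  fi
}

unset CODEX_OSS_BASE_URL CODEX_OSS_PORT
resolve_oss_url   # → http://localhost:11434/v1
```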
+ Advanced: you can persist this in your config instead of using environment variables by overriding the built-in `oss` provider in `~/.codex/config.toml`:
+
+ ```toml
+ [model_providers.oss]
+ name = "Open Source"
+ base_url = "http://my-ollama.example.com:11434/v1"
+ ```
+
+ ---
+
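A fuller `~/.codex/config.toml` sketch might also select that provider and a default model. The top-level `model` and `model_provider` keys below are an assumption on my part, not shown in this diff — verify them against Codex's config documentation before relying on them:

```toml
# Assumed top-level keys (verify against Codex's config docs):
model = "gpt-oss:20b"      # default model to use
model_provider = "oss"     # route requests to the provider defined below

[model_providers.oss]
name = "Open Source"
base_url = "http://my-ollama.example.com:11434/v1"
```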
  ## Why Codex?

  Codex CLI is built for developers who already **live in the terminal** and want
Binary files CHANGED (5 files; binary contents not shown)
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@openai/codex",
-   "version": "0.12.0",
+   "version": "0.13.0",
    "license": "Apache-2.0",
    "bin": {
      "codex": "bin/codex.js"