ai-sdk-provider-codex-cli 1.0.0-beta.1 → 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -4,33 +4,31 @@
  [![npm downloads](https://img.shields.io/npm/dm/ai-sdk-provider-codex-cli.svg)](https://www.npmjs.com/package/ai-sdk-provider-codex-cli)
  [![License: MIT](https://img.shields.io/badge/license-MIT-blue.svg)](LICENSE)
  ![Node >= 18](https://img.shields.io/badge/node-%3E%3D18-43853d?logo=node.js&logoColor=white)
- ![AI SDK v6 beta](https://img.shields.io/badge/AI%20SDK-v6%20beta-000?logo=vercel&logoColor=white)
+ ![AI SDK v6](https://img.shields.io/badge/AI%20SDK-v6-000?logo=vercel&logoColor=white)
  ![Modules: ESM + CJS](https://img.shields.io/badge/modules-ESM%20%2B%20CJS-3178c6)
  ![TypeScript](https://img.shields.io/badge/TypeScript-blue)
  [![PRs welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg)](https://github.com/ben-vargas/ai-sdk-provider-codex-cli/issues)
  [![Latest Release](https://img.shields.io/github/v/release/ben-vargas/ai-sdk-provider-codex-cli?display_name=tag)](https://github.com/ben-vargas/ai-sdk-provider-codex-cli/releases/latest)
 
- A community provider for Vercel AI SDK v6 (beta) that uses OpenAI's Codex CLI (non‑interactive `codex exec`) to talk to GPT‑5.1 class models (`gpt-5.1`, the Codex-specific `gpt-5.1-codex`, the flagship `gpt-5.1-codex-max`, and the lightweight `gpt-5.1-codex-mini` slugs) with your ChatGPT Plus/Pro subscription. The provider spawns the Codex CLI process, parses its JSONL output, and adapts it to the AI SDK LanguageModelV3 interface. Legacy GPT-5 / GPT-5-codex slugs remain compatible for existing workflows.
-
- > **⚠️ AI SDK v6 Beta**: This version (`1.0.0-beta.x`) requires AI SDK v6 beta packages. For AI SDK v5, use the `0.x` releases from the main branch.
+ A community provider for Vercel AI SDK v6 that uses OpenAI's Codex CLI (non‑interactive `codex exec`) to talk to GPT‑5.1 class models (`gpt-5.1`, the Codex-specific `gpt-5.1-codex`, the flagship `gpt-5.1-codex-max`, and the lightweight `gpt-5.1-codex-mini` slugs) with your ChatGPT Plus/Pro subscription. The provider spawns the Codex CLI process, parses its JSONL output, and adapts it to the AI SDK LanguageModelV3 interface. Legacy GPT-5 / GPT-5-codex slugs remain compatible for existing workflows.
 
  - Works with `generateText`, `streamText`, and `generateObject` (native JSON Schema support via `--output-schema`)
  - Uses ChatGPT OAuth from `codex login` (tokens in `~/.codex/auth.json`) or `OPENAI_API_KEY`
  - Node-only (spawns a local process); supports CI and local dev
- - **v1.0.0-beta.1**: AI SDK v6 migration with LanguageModelV3 interface
+ - **v1.0.0**: AI SDK v6 stable migration with LanguageModelV3 interface
  - **v0.5.0**: Adds comprehensive logging system with verbose mode and custom logger support
  - **v0.3.0**: Adds comprehensive tool streaming support for monitoring autonomous tool execution
 
  ## Version Compatibility
 
- | Provider Version | AI SDK Version | NPM Tag  | Status | Branch      |
- | ---------------- | -------------- | -------- | ------ | ----------- |
- | 1.x.x-beta       | v6 (beta)      | `beta`   | Beta   | `ai-sdk-v6` |
- | 0.x.x            | v5             | `latest` | Stable | `main`      |
+ | Provider Version | AI SDK Version | NPM Tag     | NPM Installation                                      |
+ | ---------------- | -------------- | ----------- | ----------------------------------------------------- |
+ | 1.x.x            | v6             | `latest`    | `npm i ai-sdk-provider-codex-cli ai@^6.0.0`           |
+ | 0.x.x            | v5             | `ai-sdk-v5` | `npm i ai-sdk-provider-codex-cli@ai-sdk-v5 ai@^5.0.0` |
 
  ## Installation
 
- ### For AI SDK v6 (beta)
+ ### For AI SDK v6 (default)
 
  1. Install and authenticate Codex CLI
 
@@ -39,16 +37,16 @@ npm i -g @openai/codex
  codex login # or set OPENAI_API_KEY
  ```
 
- 2. Install provider and AI SDK v6 beta
+ 2. Install provider and AI SDK v6
 
  ```bash
- npm i ai@beta ai-sdk-provider-codex-cli@beta
+ npm i ai ai-sdk-provider-codex-cli
  ```
 
- ### For AI SDK v5 (stable)
+ ### For AI SDK v5
 
  ```bash
- npm i ai ai-sdk-provider-codex-cli
+ npm i ai@^5.0.0 ai-sdk-provider-codex-cli@ai-sdk-v5
  ```
 
  > **⚠️ Codex CLI Version**: Requires Codex CLI **>= 0.42.0** for `--experimental-json` and `--output-schema` support. **>= 0.60.0 recommended** for `gpt-5.1-codex-max` and `xhigh` reasoning effort. If you supply your own Codex CLI (global install or custom `codexPath`/`allowNpx`), check it with `codex --version` and upgrade if needed. The optional dependency `@openai/codex` in this package pulls a compatible version automatically.
package/dist/index.cjs CHANGED
@@ -356,6 +356,41 @@ function isAuthenticationError(err) {
  }
 
  // src/codex-cli-language-model.ts
+ function createEmptyCodexUsage() {
+   return {
+     inputTokens: {
+       total: 0,
+       noCache: 0,
+       cacheRead: 0,
+       cacheWrite: 0
+     },
+     outputTokens: {
+       total: 0,
+       text: void 0,
+       reasoning: void 0
+     },
+     raw: void 0
+   };
+ }
+ function mapCodexCliFinishReason(reason) {
+   switch (reason) {
+     case "stop":
+     case "end_turn":
+     case void 0:
+       return { unified: "stop", raw: reason };
+     case "length":
+     case "max_tokens":
+       return { unified: "length", raw: reason };
+     case "content_filter":
+       return { unified: "content-filter", raw: reason };
+     case "tool_calls":
+       return { unified: "tool-calls", raw: reason };
+     case "error":
+       return { unified: "error", raw: reason };
+     default:
+       return { unified: "other", raw: reason };
+   }
+ }
  var codexCliProviderOptionsSchema = zod.z.object({
    reasoningEffort: zod.z.enum(["minimal", "low", "medium", "high", "xhigh"]).optional(),
    reasoningSummary: zod.z.enum(["auto", "detailed"]).optional(),
@@ -759,13 +794,14 @@ var CodexCliLanguageModel = class {
        total: inputTotal,
        noCache: inputTotal - cachedInputTokens,
        cacheRead: cachedInputTokens,
-       cacheWrite: void 0
+       cacheWrite: 0
      },
      outputTokens: {
        total: outputTotal,
-       text: outputTotal,
+       text: void 0,
        reasoning: void 0
-     }
+     },
+     raw: reported
    };
  }
  getToolName(item) {
@@ -963,11 +999,8 @@ var CodexCliLanguageModel = class {
      `[codex-cli] Executing Codex CLI: ${cmd} with ${args.length} arguments, cwd: ${cwd ?? "default"}`
    );
    let text = "";
-   let usage = {
-     inputTokens: { total: 0, noCache: 0, cacheRead: 0, cacheWrite: void 0 },
-     outputTokens: { total: 0, text: 0, reasoning: void 0 }
-   };
-   const finishReason = "stop";
+   let usage = createEmptyCodexUsage();
+   let finishReason = mapCodexCliFinishReason(void 0);
    const startTime = Date.now();
    const child = child_process.spawn(cmd, args, { env, cwd, stdio: ["ignore", "pipe", "pipe"] });
    let onAbort;
@@ -1280,10 +1313,7 @@ var CodexCliLanguageModel = class {
      controller.enqueue({ type: "text-delta", id: textId, delta: finalText });
      controller.enqueue({ type: "text-end", id: textId });
    }
-   const usageSummary = lastUsage ?? {
-     inputTokens: { total: 0, noCache: 0, cacheRead: 0, cacheWrite: void 0 },
-     outputTokens: { total: 0, text: 0, reasoning: void 0 }
-   };
+   const usageSummary = lastUsage ?? createEmptyCodexUsage();
    const totalTokens = (usageSummary.inputTokens.total ?? 0) + (usageSummary.outputTokens.total ?? 0);
    this.logger.info(
      `[codex-cli] Stream completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${totalTokens}`
@@ -1293,7 +1323,7 @@ var CodexCliLanguageModel = class {
    );
    controller.enqueue({
      type: "finish",
-     finishReason: "stop",
+     finishReason: mapCodexCliFinishReason(void 0),
      usage: usageSummary
    });
    controller.close();
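The finish-reason mapper introduced in this release is small enough to sanity-check in isolation. A minimal sketch (reproducing the switch from the compiled output above, not importing the package itself): the CLI's raw reason strings are normalized to the AI SDK's unified finish reasons, while the raw value is preserved for debugging.

```javascript
// Sketch of the 1.0.0 finish-reason mapping: raw CLI reason strings are
// collapsed into the AI SDK's unified set, keeping the original value in `raw`.
function mapCodexCliFinishReason(reason) {
  switch (reason) {
    case "stop":
    case "end_turn":
    case undefined: // no reason reported yet → assume a clean stop
      return { unified: "stop", raw: reason };
    case "length":
    case "max_tokens":
      return { unified: "length", raw: reason };
    case "content_filter":
      return { unified: "content-filter", raw: reason };
    case "tool_calls":
      return { unified: "tool-calls", raw: reason };
    case "error":
      return { unified: "error", raw: reason };
    default: // any unrecognized reason falls through to "other"
      return { unified: "other", raw: reason };
  }
}

console.log(mapCodexCliFinishReason("end_turn").unified); // "stop"
console.log(mapCodexCliFinishReason("max_tokens").unified); // "length"
console.log(mapCodexCliFinishReason("something_new").unified); // "other"
```

Note that both `undefined` and `"stop"` map to a unified `"stop"`, which is why the stream's default `finishReason` is now `mapCodexCliFinishReason(void 0)` rather than a bare string.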
package/dist/index.js CHANGED
@@ -353,6 +353,41 @@ function isAuthenticationError(err) {
  }
 
  // src/codex-cli-language-model.ts
+ function createEmptyCodexUsage() {
+   return {
+     inputTokens: {
+       total: 0,
+       noCache: 0,
+       cacheRead: 0,
+       cacheWrite: 0
+     },
+     outputTokens: {
+       total: 0,
+       text: void 0,
+       reasoning: void 0
+     },
+     raw: void 0
+   };
+ }
+ function mapCodexCliFinishReason(reason) {
+   switch (reason) {
+     case "stop":
+     case "end_turn":
+     case void 0:
+       return { unified: "stop", raw: reason };
+     case "length":
+     case "max_tokens":
+       return { unified: "length", raw: reason };
+     case "content_filter":
+       return { unified: "content-filter", raw: reason };
+     case "tool_calls":
+       return { unified: "tool-calls", raw: reason };
+     case "error":
+       return { unified: "error", raw: reason };
+     default:
+       return { unified: "other", raw: reason };
+   }
+ }
  var codexCliProviderOptionsSchema = z.object({
    reasoningEffort: z.enum(["minimal", "low", "medium", "high", "xhigh"]).optional(),
    reasoningSummary: z.enum(["auto", "detailed"]).optional(),
@@ -756,13 +791,14 @@ var CodexCliLanguageModel = class {
        total: inputTotal,
        noCache: inputTotal - cachedInputTokens,
        cacheRead: cachedInputTokens,
-       cacheWrite: void 0
+       cacheWrite: 0
      },
      outputTokens: {
        total: outputTotal,
-       text: outputTotal,
+       text: void 0,
        reasoning: void 0
-     }
+     },
+     raw: reported
    };
  }
  getToolName(item) {
@@ -960,11 +996,8 @@ var CodexCliLanguageModel = class {
      `[codex-cli] Executing Codex CLI: ${cmd} with ${args.length} arguments, cwd: ${cwd ?? "default"}`
    );
    let text = "";
-   let usage = {
-     inputTokens: { total: 0, noCache: 0, cacheRead: 0, cacheWrite: void 0 },
-     outputTokens: { total: 0, text: 0, reasoning: void 0 }
-   };
-   const finishReason = "stop";
+   let usage = createEmptyCodexUsage();
+   let finishReason = mapCodexCliFinishReason(void 0);
    const startTime = Date.now();
    const child = spawn(cmd, args, { env, cwd, stdio: ["ignore", "pipe", "pipe"] });
    let onAbort;
@@ -1277,10 +1310,7 @@ var CodexCliLanguageModel = class {
      controller.enqueue({ type: "text-delta", id: textId, delta: finalText });
      controller.enqueue({ type: "text-end", id: textId });
    }
-   const usageSummary = lastUsage ?? {
-     inputTokens: { total: 0, noCache: 0, cacheRead: 0, cacheWrite: void 0 },
-     outputTokens: { total: 0, text: 0, reasoning: void 0 }
-   };
+   const usageSummary = lastUsage ?? createEmptyCodexUsage();
    const totalTokens = (usageSummary.inputTokens.total ?? 0) + (usageSummary.outputTokens.total ?? 0);
    this.logger.info(
      `[codex-cli] Stream completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${totalTokens}`
@@ -1290,7 +1320,7 @@ var CodexCliLanguageModel = class {
    );
    controller.enqueue({
      type: "finish",
-     finishReason: "stop",
+     finishReason: mapCodexCliFinishReason(void 0),
      usage: usageSummary
    });
    controller.close();
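The stream-completion accounting shown in the hunks above can also be sketched standalone. This is a hedged reconstruction (mirroring the compiled code, not importing the package): `createEmptyCodexUsage()` keeps every counter defined, so the total-token sum stays numeric even when the CLI never reported usage.

```javascript
// Sketch of the usage fallback at stream completion: all-zero counters
// instead of undefined fields, so arithmetic on totals never yields NaN.
function createEmptyCodexUsage() {
  return {
    inputTokens: { total: 0, noCache: 0, cacheRead: 0, cacheWrite: 0 },
    outputTokens: { total: 0, text: undefined, reasoning: undefined },
    raw: undefined
  };
}

// Mirrors the dist code: fall back to the empty shape, then sum the totals.
function totalTokens(lastUsage) {
  const usageSummary = lastUsage ?? createEmptyCodexUsage();
  return (usageSummary.inputTokens.total ?? 0) + (usageSummary.outputTokens.total ?? 0);
}

console.log(totalTokens(undefined)); // 0 — no usage ever reported
console.log(totalTokens({ inputTokens: { total: 120 }, outputTokens: { total: 45 } })); // 165
```

This is the behavioral change behind replacing the old inline `{ inputTokens: ..., cacheWrite: void 0 }` literals: the shared helper reports `cacheWrite: 0` and carries the raw usage payload through unchanged.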
package/package.json CHANGED
@@ -1,7 +1,7 @@
  {
    "name": "ai-sdk-provider-codex-cli",
-   "version": "1.0.0-beta.1",
-   "description": "AI SDK v6 provider for OpenAI Codex CLI with native JSON Schema support",
+   "version": "1.0.0",
+   "description": "AI SDK v6 provider for OpenAI Codex CLI (use ChatGPT Plus/Pro subscription)",
    "keywords": [
      "ai-sdk",
      "codex",
@@ -59,8 +59,8 @@
      "test:coverage": "vitest run --coverage"
    },
    "dependencies": {
-     "@ai-sdk/provider": "3.0.0-beta.26",
-     "@ai-sdk/provider-utils": "4.0.0-beta.51",
+     "@ai-sdk/provider": "^3.0.0",
+     "@ai-sdk/provider-utils": "^4.0.1",
      "jsonc-parser": "^3.3.1"
    },
    "optionalDependencies": {
@@ -70,7 +70,7 @@
      "@eslint/js": "^9.14.0",
      "@types/node": "20.19.6",
      "@vitest/coverage-v8": "^3.2.4",
-     "ai": "6.0.0-beta.156",
+     "ai": "^6.0.3",
      "eslint": "^9.14.0",
      "prettier": "^3.3.3",
      "tsup": "8.5.0",