ai-sdk-provider-codex-cli 0.7.0 → 1.0.0-beta.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -4,23 +4,34 @@
  [![npm downloads](https://img.shields.io/npm/dm/ai-sdk-provider-codex-cli.svg)](https://www.npmjs.com/package/ai-sdk-provider-codex-cli)
  [![License: MIT](https://img.shields.io/badge/license-MIT-blue.svg)](LICENSE)
  ![Node >= 18](https://img.shields.io/badge/node-%3E%3D18-43853d?logo=node.js&logoColor=white)
- ![AI SDK v5](https://img.shields.io/badge/AI%20SDK-v5-000?logo=vercel&logoColor=white)
+ ![AI SDK v6 beta](https://img.shields.io/badge/AI%20SDK-v6%20beta-000?logo=vercel&logoColor=white)
  ![Modules: ESM + CJS](https://img.shields.io/badge/modules-ESM%20%2B%20CJS-3178c6)
  ![TypeScript](https://img.shields.io/badge/TypeScript-blue)
  [![PRs welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg)](https://github.com/ben-vargas/ai-sdk-provider-codex-cli/issues)
  [![Latest Release](https://img.shields.io/github/v/release/ben-vargas/ai-sdk-provider-codex-cli?display_name=tag)](https://github.com/ben-vargas/ai-sdk-provider-codex-cli/releases/latest)

- A community provider for Vercel AI SDK v5 that uses OpenAIs Codex CLI (non‑interactive `codex exec`) to talk to GPT‑5.1 class models (`gpt-5.1`, the Codex-specific `gpt-5.1-codex`, the flagship `gpt-5.1-codex-max`, and the lightweight `gpt-5.1-codex-mini` slugs) with your ChatGPT Plus/Pro subscription. The provider spawns the Codex CLI process, parses its JSONL output, and adapts it to the AI SDK LanguageModelV2 interface. Legacy GPT-5 / GPT-5-codex slugs remain compatible for existing workflows.
+ A community provider for Vercel AI SDK v6 (beta) that uses OpenAI's Codex CLI (non‑interactive `codex exec`) to talk to GPT‑5.1 class models (`gpt-5.1`, the Codex-specific `gpt-5.1-codex`, the flagship `gpt-5.1-codex-max`, and the lightweight `gpt-5.1-codex-mini` slugs) with your ChatGPT Plus/Pro subscription. The provider spawns the Codex CLI process, parses its JSONL output, and adapts it to the AI SDK LanguageModelV3 interface. Legacy GPT-5 / GPT-5-codex slugs remain compatible for existing workflows.
+
+ > **⚠️ AI SDK v6 Beta**: This version (`1.0.0-beta.x`) requires AI SDK v6 beta packages. For AI SDK v5, use the `0.x` releases from the main branch.

  - Works with `generateText`, `streamText`, and `generateObject` (native JSON Schema support via `--output-schema`)
  - Uses ChatGPT OAuth from `codex login` (tokens in `~/.codex/auth.json`) or `OPENAI_API_KEY`
  - Node-only (spawns a local process); supports CI and local dev
+ - **v1.0.0-beta.1**: AI SDK v6 migration with LanguageModelV3 interface
  - **v0.5.0**: Adds comprehensive logging system with verbose mode and custom logger support
  - **v0.3.0**: Adds comprehensive tool streaming support for monitoring autonomous tool execution
- - **v0.2.0 Breaking Changes**: Switched to `--experimental-json` and native schema enforcement (see [CHANGELOG](CHANGELOG.md))
+
+ ## Version Compatibility
+
+ | Provider Version | AI SDK Version | NPM Tag | Status | Branch |
+ | ---------------- | -------------- | --------- | ------ | ------------ |
+ | 1.x.x-beta | v6 (beta) | `beta` | Beta | `ai-sdk-v6` |
+ | 0.x.x | v5 | `latest` | Stable | `main` |

  ## Installation

+ ### For AI SDK v6 (beta)
+
  1. Install and authenticate Codex CLI

  ```bash
@@ -28,18 +39,24 @@ npm i -g @openai/codex
  codex login # or set OPENAI_API_KEY
  ```

- > **⚠️ Version Requirement**: Requires Codex CLI **>= 0.42.0** for `--experimental-json` and `--output-schema` support. **>= 0.60.0 recommended** for `gpt-5.1-codex-max` and `xhigh` reasoning effort. If you supply your own Codex CLI (global install or custom `codexPath`/`allowNpx`), check it with `codex --version` and upgrade if needed. The optional dependency `@openai/codex` in this package pulls a compatible version automatically.
- >
- > ```bash
- > npm i -g @openai/codex@latest
- > ```
+ 2. Install provider and AI SDK v6 beta

- 2. Install provider and AI SDK
+ ```bash
+ npm i ai@beta ai-sdk-provider-codex-cli@beta
+ ```
+
+ ### For AI SDK v5 (stable)

  ```bash
  npm i ai ai-sdk-provider-codex-cli
  ```

+ > **⚠️ Codex CLI Version**: Requires Codex CLI **>= 0.42.0** for `--experimental-json` and `--output-schema` support. **>= 0.60.0 recommended** for `gpt-5.1-codex-max` and `xhigh` reasoning effort. If you supply your own Codex CLI (global install or custom `codexPath`/`allowNpx`), check it with `codex --version` and upgrade if needed. The optional dependency `@openai/codex` in this package pulls a compatible version automatically.
+ >
+ > ```bash
+ > npm i -g @openai/codex@latest
+ > ```
+
  ## Quick Start

  Text generation
@@ -95,7 +112,7 @@ console.log(object);

  ## Features

- - AI SDK v5 compatible (LanguageModelV2)
+ - AI SDK v6 compatible (LanguageModelV3)
  - Streaming and non‑streaming
  - **Configurable logging** (v0.5.0+) - Verbose mode, custom loggers, or silent operation
  - **Tool streaming support** (v0.3.0+) - Monitor autonomous tool execution in real-time
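The Codex CLI version floor called out in the README note above (>= 0.42.0 required, >= 0.60.0 recommended) can be checked mechanically. A minimal sketch; the `atLeast` helper is hypothetical and not part of the package:

```javascript
// Compare dotted version strings, enough for the Codex CLI floor check.
// Missing components are treated as 0 ("0.60" === "0.60.0").
function atLeast(version, floor) {
  const a = version.split(".").map(Number);
  const b = floor.split(".").map(Number);
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    const d = (a[i] ?? 0) - (b[i] ?? 0);
    if (d !== 0) return d > 0;
  }
  return true;
}

// Feed this the output of `codex --version` in practice.
console.log(atLeast("0.60.1", "0.42.0")); // true  — --experimental-json supported
console.log(atLeast("0.41.0", "0.42.0")); // false — upgrade needed
```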
package/dist/index.cjs CHANGED
@@ -388,7 +388,7 @@ function resolveCodexPath(explicitPath, allowNpx) {
  }
  }
  var CodexCliLanguageModel = class {
- specificationVersion = "v2";
+ specificationVersion = "v3";
  provider = "codex-cli";
  defaultObjectGenerationMode = "json";
  supportsImageUrls = false;
@@ -727,8 +727,8 @@ var CodexCliLanguageModel = class {
  const add = (setting, name) => {
  if (setting !== void 0)
  unsupported.push({
- type: "unsupported-setting",
- setting: name,
+ type: "unsupported",
+ feature: name,
  details: `Codex CLI does not support ${name}; it will be ignored.`
  });
  };
@@ -751,15 +751,21 @@ var CodexCliLanguageModel = class {
  extractUsage(evt) {
  const reported = evt.usage;
  if (!reported) return void 0;
- const inputTokens = reported.input_tokens ?? 0;
- const outputTokens = reported.output_tokens ?? 0;
+ const inputTotal = reported.input_tokens ?? 0;
+ const outputTotal = reported.output_tokens ?? 0;
  const cachedInputTokens = reported.cached_input_tokens ?? 0;
  return {
- inputTokens,
- outputTokens,
- // totalTokens should not double-count cached tokens; track cached separately
- totalTokens: inputTokens + outputTokens,
- cachedInputTokens
+ inputTokens: {
+ total: inputTotal,
+ noCache: inputTotal - cachedInputTokens,
+ cacheRead: cachedInputTokens,
+ cacheWrite: void 0
+ },
+ outputTokens: {
+ total: outputTotal,
+ text: outputTotal,
+ reasoning: void 0
+ }
  };
  }
  getToolName(item) {
@@ -957,7 +963,10 @@ var CodexCliLanguageModel = class {
  `[codex-cli] Executing Codex CLI: ${cmd} with ${args.length} arguments, cwd: ${cwd ?? "default"}`
  );
  let text = "";
- const usage = { inputTokens: 0, outputTokens: 0, totalTokens: 0 };
+ let usage = {
+ inputTokens: { total: 0, noCache: 0, cacheRead: 0, cacheWrite: void 0 },
+ outputTokens: { total: 0, text: 0, reasoning: void 0 }
+ };
  const finishReason = "stop";
  const startTime = Date.now();
  const child = child_process.spawn(cmd, args, { env, cwd, stdio: ["ignore", "pipe", "pipe"] });
@@ -1003,9 +1012,7 @@ var CodexCliLanguageModel = class {
  if (event.type === "turn.completed") {
  const usageEvent = this.extractUsage(event);
  if (usageEvent) {
- usage.inputTokens = usageEvent.inputTokens;
- usage.outputTokens = usageEvent.outputTokens;
- usage.totalTokens = usageEvent.totalTokens;
+ usage = usageEvent;
  }
  }
  if (event.type === "item.completed" && this.getItemType(event.item) === "assistant_message" && typeof event.item?.text === "string") {
@@ -1040,11 +1047,12 @@ var CodexCliLanguageModel = class {
  );
  return;
  }
+ const totalTokens = (usage.inputTokens.total ?? 0) + (usage.outputTokens.total ?? 0);
  this.logger.info(
- `[codex-cli] Request completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${usage.totalTokens}`
+ `[codex-cli] Request completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${totalTokens}`
  );
  this.logger.debug(
- `[codex-cli] Token usage - Input: ${usage.inputTokens}, Output: ${usage.outputTokens}, Total: ${usage.totalTokens}`
+ `[codex-cli] Token usage - Input: ${usage.inputTokens.total ?? 0}, Output: ${usage.outputTokens.total ?? 0}, Total: ${totalTokens}`
  );
  resolve();
  } else {
@@ -1272,12 +1280,16 @@ var CodexCliLanguageModel = class {
  controller.enqueue({ type: "text-delta", id: textId, delta: finalText });
  controller.enqueue({ type: "text-end", id: textId });
  }
- const usageSummary = lastUsage ?? { inputTokens: 0, outputTokens: 0, totalTokens: 0 };
+ const usageSummary = lastUsage ?? {
+ inputTokens: { total: 0, noCache: 0, cacheRead: 0, cacheWrite: void 0 },
+ outputTokens: { total: 0, text: 0, reasoning: void 0 }
+ };
+ const totalTokens = (usageSummary.inputTokens.total ?? 0) + (usageSummary.outputTokens.total ?? 0);
  this.logger.info(
- `[codex-cli] Stream completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${usageSummary.totalTokens}`
+ `[codex-cli] Stream completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${totalTokens}`
  );
  this.logger.debug(
- `[codex-cli] Token usage - Input: ${usageSummary.inputTokens}, Output: ${usageSummary.outputTokens}, Total: ${usageSummary.totalTokens}`
+ `[codex-cli] Token usage - Input: ${usageSummary.inputTokens.total ?? 0}, Output: ${usageSummary.outputTokens.total ?? 0}, Total: ${totalTokens}`
  );
  controller.enqueue({
  type: "finish",
@@ -1381,8 +1393,8 @@ function createCodexCli(options = {}) {
  };
  provider$1.languageModel = createModel;
  provider$1.chat = createModel;
- provider$1.textEmbeddingModel = ((modelId) => {
- throw new provider.NoSuchModelError({ modelId, modelType: "textEmbeddingModel" });
+ provider$1.embeddingModel = ((modelId) => {
+ throw new provider.NoSuchModelError({ modelId, modelType: "embeddingModel" });
  });
  provider$1.imageModel = ((modelId) => {
  throw new provider.NoSuchModelError({ modelId, modelType: "imageModel" });
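The usage migration in the hunks above replaces the flat v5 counters (`inputTokens`, `outputTokens`, `totalTokens`) with the nested v6 shape, and derives the total at the call site. A standalone sketch of that mapping, extracted from the diff; the sample event values below are invented for illustration:

```javascript
// Map Codex CLI's reported counters to the nested LanguageModelV3 usage
// shape shown in the diff. Cached tokens are tracked under cacheRead so
// the uncached portion (noCache) can be billed/reported separately.
function extractUsage(evt) {
  const reported = evt.usage;
  if (!reported) return undefined;
  const inputTotal = reported.input_tokens ?? 0;
  const outputTotal = reported.output_tokens ?? 0;
  const cachedInputTokens = reported.cached_input_tokens ?? 0;
  return {
    inputTokens: {
      total: inputTotal,
      noCache: inputTotal - cachedInputTokens,
      cacheRead: cachedInputTokens,
      cacheWrite: undefined, // Codex CLI does not report cache writes
    },
    outputTokens: {
      total: outputTotal,
      text: outputTotal, // reasoning tokens are not broken out
      reasoning: undefined,
    },
  };
}

// totalTokens is no longer stored on the usage object; it is derived:
const usage = extractUsage({
  usage: { input_tokens: 120, output_tokens: 30, cached_input_tokens: 100 },
});
const totalTokens =
  (usage.inputTokens.total ?? 0) + (usage.outputTokens.total ?? 0);
console.log(totalTokens); // 150
console.log(usage.inputTokens.noCache); // 20
```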
package/dist/index.d.cts CHANGED
@@ -1,4 +1,4 @@
- import { ProviderV2, LanguageModelV2 } from '@ai-sdk/provider';
+ import { ProviderV3, LanguageModelV3 } from '@ai-sdk/provider';

  /**
  * Logger interface for custom logging.
@@ -251,11 +251,11 @@ interface CodexCliProviderOptions {
  rmcpClient?: boolean;
  }

- interface CodexCliProvider extends ProviderV2 {
- (modelId: string, settings?: CodexCliSettings): LanguageModelV2;
- languageModel(modelId: string, settings?: CodexCliSettings): LanguageModelV2;
- chat(modelId: string, settings?: CodexCliSettings): LanguageModelV2;
- textEmbeddingModel(modelId: string): never;
+ interface CodexCliProvider extends ProviderV3 {
+ (modelId: string, settings?: CodexCliSettings): LanguageModelV3;
+ languageModel(modelId: string, settings?: CodexCliSettings): LanguageModelV3;
+ chat(modelId: string, settings?: CodexCliSettings): LanguageModelV3;
+ embeddingModel(modelId: string): never;
  imageModel(modelId: string): never;
  }
  declare function createCodexCli(options?: CodexCliProviderSettings): CodexCliProvider;
@@ -265,8 +265,8 @@ interface CodexLanguageModelOptions {
  id: string;
  settings?: CodexCliSettings;
  }
- declare class CodexCliLanguageModel implements LanguageModelV2 {
- readonly specificationVersion: "v2";
+ declare class CodexCliLanguageModel implements LanguageModelV3 {
+ readonly specificationVersion: "v3";
  readonly provider = "codex-cli";
  readonly defaultObjectGenerationMode: "json";
  readonly supportsImageUrls = false;
@@ -301,8 +301,8 @@ declare class CodexCliLanguageModel implements LanguageModelV2 {
  private emitToolInvocation;
  private emitToolResult;
  private handleSpawnError;
- doGenerate(options: Parameters<LanguageModelV2['doGenerate']>[0]): Promise<Awaited<ReturnType<LanguageModelV2['doGenerate']>>>;
- doStream(options: Parameters<LanguageModelV2['doStream']>[0]): Promise<Awaited<ReturnType<LanguageModelV2['doStream']>>>;
+ doGenerate(options: Parameters<LanguageModelV3['doGenerate']>[0]): Promise<Awaited<ReturnType<LanguageModelV3['doGenerate']>>>;
+ doStream(options: Parameters<LanguageModelV3['doStream']>[0]): Promise<Awaited<ReturnType<LanguageModelV3['doStream']>>>;
  }

  declare function isAuthenticationError(err: unknown): boolean;
package/dist/index.d.ts CHANGED
@@ -1,4 +1,4 @@
- import { ProviderV2, LanguageModelV2 } from '@ai-sdk/provider';
+ import { ProviderV3, LanguageModelV3 } from '@ai-sdk/provider';

  /**
  * Logger interface for custom logging.
@@ -251,11 +251,11 @@ interface CodexCliProviderOptions {
  rmcpClient?: boolean;
  }

- interface CodexCliProvider extends ProviderV2 {
- (modelId: string, settings?: CodexCliSettings): LanguageModelV2;
- languageModel(modelId: string, settings?: CodexCliSettings): LanguageModelV2;
- chat(modelId: string, settings?: CodexCliSettings): LanguageModelV2;
- textEmbeddingModel(modelId: string): never;
+ interface CodexCliProvider extends ProviderV3 {
+ (modelId: string, settings?: CodexCliSettings): LanguageModelV3;
+ languageModel(modelId: string, settings?: CodexCliSettings): LanguageModelV3;
+ chat(modelId: string, settings?: CodexCliSettings): LanguageModelV3;
+ embeddingModel(modelId: string): never;
  imageModel(modelId: string): never;
  }
  declare function createCodexCli(options?: CodexCliProviderSettings): CodexCliProvider;
@@ -265,8 +265,8 @@ interface CodexLanguageModelOptions {
  id: string;
  settings?: CodexCliSettings;
  }
- declare class CodexCliLanguageModel implements LanguageModelV2 {
- readonly specificationVersion: "v2";
+ declare class CodexCliLanguageModel implements LanguageModelV3 {
+ readonly specificationVersion: "v3";
  readonly provider = "codex-cli";
  readonly defaultObjectGenerationMode: "json";
  readonly supportsImageUrls = false;
@@ -301,8 +301,8 @@ declare class CodexCliLanguageModel implements LanguageModelV2 {
  private emitToolInvocation;
  private emitToolResult;
  private handleSpawnError;
- doGenerate(options: Parameters<LanguageModelV2['doGenerate']>[0]): Promise<Awaited<ReturnType<LanguageModelV2['doGenerate']>>>;
- doStream(options: Parameters<LanguageModelV2['doStream']>[0]): Promise<Awaited<ReturnType<LanguageModelV2['doStream']>>>;
+ doGenerate(options: Parameters<LanguageModelV3['doGenerate']>[0]): Promise<Awaited<ReturnType<LanguageModelV3['doGenerate']>>>;
+ doStream(options: Parameters<LanguageModelV3['doStream']>[0]): Promise<Awaited<ReturnType<LanguageModelV3['doStream']>>>;
  }

  declare function isAuthenticationError(err: unknown): boolean;
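The declaration changes above rename the never-implemented embedding hook from `textEmbeddingModel` to `embeddingModel`, keeping the same throw-on-call behavior. A self-contained sketch of that pattern; `NoSuchModelError` is stubbed locally here, whereas the real class comes from `@ai-sdk/provider`:

```typescript
// Local stand-in for @ai-sdk/provider's NoSuchModelError, matching the
// { modelId, modelType } constructor shape used in the diff.
class NoSuchModelError extends Error {
  constructor(args: { modelId: string; modelType: string }) {
    super(`No such ${args.modelType}: ${args.modelId}`);
    this.name = "NoSuchModelError";
  }
}

// The provider only implements language models; the v6-renamed
// embeddingModel (and imageModel) hooks always throw.
const provider = {
  embeddingModel(modelId: string): never {
    throw new NoSuchModelError({ modelId, modelType: "embeddingModel" });
  },
  imageModel(modelId: string): never {
    throw new NoSuchModelError({ modelId, modelType: "imageModel" });
  },
};

let message = "";
try {
  provider.embeddingModel("my-embedder");
} catch (err) {
  message = (err as Error).message;
}
console.log(message); // No such embeddingModel: my-embedder
```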
package/dist/index.js CHANGED
@@ -385,7 +385,7 @@ function resolveCodexPath(explicitPath, allowNpx) {
  }
  }
  var CodexCliLanguageModel = class {
- specificationVersion = "v2";
+ specificationVersion = "v3";
  provider = "codex-cli";
  defaultObjectGenerationMode = "json";
  supportsImageUrls = false;
@@ -724,8 +724,8 @@ var CodexCliLanguageModel = class {
  const add = (setting, name) => {
  if (setting !== void 0)
  unsupported.push({
- type: "unsupported-setting",
- setting: name,
+ type: "unsupported",
+ feature: name,
  details: `Codex CLI does not support ${name}; it will be ignored.`
  });
  };
@@ -748,15 +748,21 @@ var CodexCliLanguageModel = class {
  extractUsage(evt) {
  const reported = evt.usage;
  if (!reported) return void 0;
- const inputTokens = reported.input_tokens ?? 0;
- const outputTokens = reported.output_tokens ?? 0;
+ const inputTotal = reported.input_tokens ?? 0;
+ const outputTotal = reported.output_tokens ?? 0;
  const cachedInputTokens = reported.cached_input_tokens ?? 0;
  return {
- inputTokens,
- outputTokens,
- // totalTokens should not double-count cached tokens; track cached separately
- totalTokens: inputTokens + outputTokens,
- cachedInputTokens
+ inputTokens: {
+ total: inputTotal,
+ noCache: inputTotal - cachedInputTokens,
+ cacheRead: cachedInputTokens,
+ cacheWrite: void 0
+ },
+ outputTokens: {
+ total: outputTotal,
+ text: outputTotal,
+ reasoning: void 0
+ }
  };
  }
  getToolName(item) {
@@ -954,7 +960,10 @@ var CodexCliLanguageModel = class {
  `[codex-cli] Executing Codex CLI: ${cmd} with ${args.length} arguments, cwd: ${cwd ?? "default"}`
  );
  let text = "";
- const usage = { inputTokens: 0, outputTokens: 0, totalTokens: 0 };
+ let usage = {
+ inputTokens: { total: 0, noCache: 0, cacheRead: 0, cacheWrite: void 0 },
+ outputTokens: { total: 0, text: 0, reasoning: void 0 }
+ };
  const finishReason = "stop";
  const startTime = Date.now();
  const child = spawn(cmd, args, { env, cwd, stdio: ["ignore", "pipe", "pipe"] });
@@ -1000,9 +1009,7 @@ var CodexCliLanguageModel = class {
  if (event.type === "turn.completed") {
  const usageEvent = this.extractUsage(event);
  if (usageEvent) {
- usage.inputTokens = usageEvent.inputTokens;
- usage.outputTokens = usageEvent.outputTokens;
- usage.totalTokens = usageEvent.totalTokens;
+ usage = usageEvent;
  }
  }
  if (event.type === "item.completed" && this.getItemType(event.item) === "assistant_message" && typeof event.item?.text === "string") {
@@ -1037,11 +1044,12 @@ var CodexCliLanguageModel = class {
  );
  return;
  }
+ const totalTokens = (usage.inputTokens.total ?? 0) + (usage.outputTokens.total ?? 0);
  this.logger.info(
- `[codex-cli] Request completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${usage.totalTokens}`
+ `[codex-cli] Request completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${totalTokens}`
  );
  this.logger.debug(
- `[codex-cli] Token usage - Input: ${usage.inputTokens}, Output: ${usage.outputTokens}, Total: ${usage.totalTokens}`
+ `[codex-cli] Token usage - Input: ${usage.inputTokens.total ?? 0}, Output: ${usage.outputTokens.total ?? 0}, Total: ${totalTokens}`
  );
  resolve();
  } else {
@@ -1269,12 +1277,16 @@ var CodexCliLanguageModel = class {
  controller.enqueue({ type: "text-delta", id: textId, delta: finalText });
  controller.enqueue({ type: "text-end", id: textId });
  }
- const usageSummary = lastUsage ?? { inputTokens: 0, outputTokens: 0, totalTokens: 0 };
+ const usageSummary = lastUsage ?? {
+ inputTokens: { total: 0, noCache: 0, cacheRead: 0, cacheWrite: void 0 },
+ outputTokens: { total: 0, text: 0, reasoning: void 0 }
+ };
+ const totalTokens = (usageSummary.inputTokens.total ?? 0) + (usageSummary.outputTokens.total ?? 0);
  this.logger.info(
- `[codex-cli] Stream completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${usageSummary.totalTokens}`
+ `[codex-cli] Stream completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${totalTokens}`
  );
  this.logger.debug(
- `[codex-cli] Token usage - Input: ${usageSummary.inputTokens}, Output: ${usageSummary.outputTokens}, Total: ${usageSummary.totalTokens}`
+ `[codex-cli] Token usage - Input: ${usageSummary.inputTokens.total ?? 0}, Output: ${usageSummary.outputTokens.total ?? 0}, Total: ${totalTokens}`
  );
  controller.enqueue({
  type: "finish",
@@ -1378,8 +1390,8 @@ function createCodexCli(options = {}) {
  };
  provider.languageModel = createModel;
  provider.chat = createModel;
- provider.textEmbeddingModel = ((modelId) => {
- throw new NoSuchModelError({ modelId, modelType: "textEmbeddingModel" });
+ provider.embeddingModel = ((modelId) => {
+ throw new NoSuchModelError({ modelId, modelType: "embeddingModel" });
  });
  provider.imageModel = ((modelId) => {
  throw new NoSuchModelError({ modelId, modelType: "imageModel" });
package/package.json CHANGED
@@ -1,7 +1,7 @@
  {
  "name": "ai-sdk-provider-codex-cli",
- "version": "0.7.0",
- "description": "AI SDK v5 provider for OpenAI Codex CLI with native JSON Schema support",
+ "version": "1.0.0-beta.1",
+ "description": "AI SDK v6 provider for OpenAI Codex CLI with native JSON Schema support",
  "keywords": [
  "ai-sdk",
  "codex",
@@ -59,8 +59,8 @@
  "test:coverage": "vitest run --coverage"
  },
  "dependencies": {
- "@ai-sdk/provider": "2.0.0",
- "@ai-sdk/provider-utils": "3.0.3",
+ "@ai-sdk/provider": "3.0.0-beta.26",
+ "@ai-sdk/provider-utils": "4.0.0-beta.51",
  "jsonc-parser": "^3.3.1"
  },
  "optionalDependencies": {
@@ -70,7 +70,7 @@
  "@eslint/js": "^9.14.0",
  "@types/node": "20.19.6",
  "@vitest/coverage-v8": "^3.2.4",
- "ai": "5.0.93",
+ "ai": "6.0.0-beta.156",
  "eslint": "^9.14.0",
  "prettier": "^3.3.3",
  "tsup": "8.5.0",