ai-sdk-provider-codex-cli 0.7.0 → 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -4,23 +4,32 @@
  [![npm downloads](https://img.shields.io/npm/dm/ai-sdk-provider-codex-cli.svg)](https://www.npmjs.com/package/ai-sdk-provider-codex-cli)
  [![License: MIT](https://img.shields.io/badge/license-MIT-blue.svg)](LICENSE)
  ![Node >= 18](https://img.shields.io/badge/node-%3E%3D18-43853d?logo=node.js&logoColor=white)
- ![AI SDK v5](https://img.shields.io/badge/AI%20SDK-v5-000?logo=vercel&logoColor=white)
+ ![AI SDK v6](https://img.shields.io/badge/AI%20SDK-v6-000?logo=vercel&logoColor=white)
  ![Modules: ESM + CJS](https://img.shields.io/badge/modules-ESM%20%2B%20CJS-3178c6)
  ![TypeScript](https://img.shields.io/badge/TypeScript-blue)
  [![PRs welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg)](https://github.com/ben-vargas/ai-sdk-provider-codex-cli/issues)
  [![Latest Release](https://img.shields.io/github/v/release/ben-vargas/ai-sdk-provider-codex-cli?display_name=tag)](https://github.com/ben-vargas/ai-sdk-provider-codex-cli/releases/latest)

- A community provider for Vercel AI SDK v5 that uses OpenAIs Codex CLI (non‑interactive `codex exec`) to talk to GPT‑5.1 class models (`gpt-5.1`, the Codex-specific `gpt-5.1-codex`, the flagship `gpt-5.1-codex-max`, and the lightweight `gpt-5.1-codex-mini` slugs) with your ChatGPT Plus/Pro subscription. The provider spawns the Codex CLI process, parses its JSONL output, and adapts it to the AI SDK LanguageModelV2 interface. Legacy GPT-5 / GPT-5-codex slugs remain compatible for existing workflows.
+ A community provider for Vercel AI SDK v6 that uses OpenAI's Codex CLI (non‑interactive `codex exec`) to talk to GPT‑5.1 class models (`gpt-5.1`, the Codex-specific `gpt-5.1-codex`, the flagship `gpt-5.1-codex-max`, and the lightweight `gpt-5.1-codex-mini` slugs) with your ChatGPT Plus/Pro subscription. The provider spawns the Codex CLI process, parses its JSONL output, and adapts it to the AI SDK LanguageModelV3 interface. Legacy GPT-5 / GPT-5-codex slugs remain compatible for existing workflows.

  - Works with `generateText`, `streamText`, and `generateObject` (native JSON Schema support via `--output-schema`)
  - Uses ChatGPT OAuth from `codex login` (tokens in `~/.codex/auth.json`) or `OPENAI_API_KEY`
  - Node-only (spawns a local process); supports CI and local dev
+ - **v1.0.0**: AI SDK v6 stable migration with LanguageModelV3 interface
  - **v0.5.0**: Adds comprehensive logging system with verbose mode and custom logger support
  - **v0.3.0**: Adds comprehensive tool streaming support for monitoring autonomous tool execution
- - **v0.2.0 Breaking Changes**: Switched to `--experimental-json` and native schema enforcement (see [CHANGELOG](CHANGELOG.md))
+
+ ## Version Compatibility
+
+ | Provider Version | AI SDK Version | NPM Tag | NPM Installation |
+ | ---------------- | -------------- | ------------ | ---------------- |
+ | 1.x.x | v6 | `latest` | `npm i ai-sdk-provider-codex-cli ai@^6.0.0` |
+ | 0.x.x | v5 | `ai-sdk-v5` | `npm i ai-sdk-provider-codex-cli@ai-sdk-v5 ai@^5.0.0` |

  ## Installation

+ ### For AI SDK v6 (default)
+
  1. Install and authenticate Codex CLI

  ```bash
@@ -28,18 +37,24 @@ npm i -g @openai/codex
  codex login # or set OPENAI_API_KEY
  ```

- > **⚠️ Version Requirement**: Requires Codex CLI **>= 0.42.0** for `--experimental-json` and `--output-schema` support. **>= 0.60.0 recommended** for `gpt-5.1-codex-max` and `xhigh` reasoning effort. If you supply your own Codex CLI (global install or custom `codexPath`/`allowNpx`), check it with `codex --version` and upgrade if needed. The optional dependency `@openai/codex` in this package pulls a compatible version automatically.
- >
- > ```bash
- > npm i -g @openai/codex@latest
- > ```
-
- 2. Install provider and AI SDK
+ 2. Install provider and AI SDK v6

  ```bash
  npm i ai ai-sdk-provider-codex-cli
  ```

+ ### For AI SDK v5
+
+ ```bash
+ npm i ai@^5.0.0 ai-sdk-provider-codex-cli@ai-sdk-v5
+ ```
+
+ > **⚠️ Codex CLI Version**: Requires Codex CLI **>= 0.42.0** for `--experimental-json` and `--output-schema` support. **>= 0.60.0 recommended** for `gpt-5.1-codex-max` and `xhigh` reasoning effort. If you supply your own Codex CLI (global install or custom `codexPath`/`allowNpx`), check it with `codex --version` and upgrade if needed. The optional dependency `@openai/codex` in this package pulls a compatible version automatically.
+ >
+ > ```bash
+ > npm i -g @openai/codex@latest
+ > ```
+
  ## Quick Start

  Text generation
@@ -95,7 +110,7 @@ console.log(object);

  ## Features

- - AI SDK v5 compatible (LanguageModelV2)
+ - AI SDK v6 compatible (LanguageModelV3)
  - Streaming and non‑streaming
  - **Configurable logging** (v0.5.0+) - Verbose mode, custom loggers, or silent operation
  - **Tool streaming support** (v0.3.0+) - Monitor autonomous tool execution in real-time
package/dist/index.cjs CHANGED
@@ -356,6 +356,41 @@ function isAuthenticationError(err) {
  }

  // src/codex-cli-language-model.ts
+ function createEmptyCodexUsage() {
+   return {
+     inputTokens: {
+       total: 0,
+       noCache: 0,
+       cacheRead: 0,
+       cacheWrite: 0
+     },
+     outputTokens: {
+       total: 0,
+       text: void 0,
+       reasoning: void 0
+     },
+     raw: void 0
+   };
+ }
+ function mapCodexCliFinishReason(reason) {
+   switch (reason) {
+     case "stop":
+     case "end_turn":
+     case void 0:
+       return { unified: "stop", raw: reason };
+     case "length":
+     case "max_tokens":
+       return { unified: "length", raw: reason };
+     case "content_filter":
+       return { unified: "content-filter", raw: reason };
+     case "tool_calls":
+       return { unified: "tool-calls", raw: reason };
+     case "error":
+       return { unified: "error", raw: reason };
+     default:
+       return { unified: "other", raw: reason };
+   }
+ }
  var codexCliProviderOptionsSchema = zod.z.object({
    reasoningEffort: zod.z.enum(["minimal", "low", "medium", "high", "xhigh"]).optional(),
    reasoningSummary: zod.z.enum(["auto", "detailed"]).optional(),
@@ -388,7 +423,7 @@ function resolveCodexPath(explicitPath, allowNpx) {
    }
  }
  var CodexCliLanguageModel = class {
-   specificationVersion = "v2";
+   specificationVersion = "v3";
    provider = "codex-cli";
    defaultObjectGenerationMode = "json";
    supportsImageUrls = false;
@@ -727,8 +762,8 @@ var CodexCliLanguageModel = class {
    const add = (setting, name) => {
      if (setting !== void 0)
        unsupported.push({
-         type: "unsupported-setting",
-         setting: name,
+         type: "unsupported",
+         feature: name,
          details: `Codex CLI does not support ${name}; it will be ignored.`
        });
    };
@@ -751,15 +786,22 @@ var CodexCliLanguageModel = class {
  extractUsage(evt) {
    const reported = evt.usage;
    if (!reported) return void 0;
-   const inputTokens = reported.input_tokens ?? 0;
-   const outputTokens = reported.output_tokens ?? 0;
+   const inputTotal = reported.input_tokens ?? 0;
+   const outputTotal = reported.output_tokens ?? 0;
    const cachedInputTokens = reported.cached_input_tokens ?? 0;
    return {
-     inputTokens,
-     outputTokens,
-     // totalTokens should not double-count cached tokens; track cached separately
-     totalTokens: inputTokens + outputTokens,
-     cachedInputTokens
+     inputTokens: {
+       total: inputTotal,
+       noCache: inputTotal - cachedInputTokens,
+       cacheRead: cachedInputTokens,
+       cacheWrite: 0
+     },
+     outputTokens: {
+       total: outputTotal,
+       text: void 0,
+       reasoning: void 0
+     },
+     raw: reported
    };
  }
  getToolName(item) {
@@ -957,8 +999,8 @@ var CodexCliLanguageModel = class {
    `[codex-cli] Executing Codex CLI: ${cmd} with ${args.length} arguments, cwd: ${cwd ?? "default"}`
  );
  let text = "";
- const usage = { inputTokens: 0, outputTokens: 0, totalTokens: 0 };
- const finishReason = "stop";
+ let usage = createEmptyCodexUsage();
+ let finishReason = mapCodexCliFinishReason(void 0);
  const startTime = Date.now();
  const child = child_process.spawn(cmd, args, { env, cwd, stdio: ["ignore", "pipe", "pipe"] });
  let onAbort;
@@ -1003,9 +1045,7 @@ var CodexCliLanguageModel = class {
  if (event.type === "turn.completed") {
    const usageEvent = this.extractUsage(event);
    if (usageEvent) {
-     usage.inputTokens = usageEvent.inputTokens;
-     usage.outputTokens = usageEvent.outputTokens;
-     usage.totalTokens = usageEvent.totalTokens;
+     usage = usageEvent;
    }
  }
  if (event.type === "item.completed" && this.getItemType(event.item) === "assistant_message" && typeof event.item?.text === "string") {
@@ -1040,11 +1080,12 @@ var CodexCliLanguageModel = class {
    );
    return;
  }
+ const totalTokens = (usage.inputTokens.total ?? 0) + (usage.outputTokens.total ?? 0);
  this.logger.info(
-   `[codex-cli] Request completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${usage.totalTokens}`
+   `[codex-cli] Request completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${totalTokens}`
  );
  this.logger.debug(
-   `[codex-cli] Token usage - Input: ${usage.inputTokens}, Output: ${usage.outputTokens}, Total: ${usage.totalTokens}`
+   `[codex-cli] Token usage - Input: ${usage.inputTokens.total ?? 0}, Output: ${usage.outputTokens.total ?? 0}, Total: ${totalTokens}`
  );
  resolve();
  } else {
@@ -1272,16 +1313,17 @@ var CodexCliLanguageModel = class {
    controller.enqueue({ type: "text-delta", id: textId, delta: finalText });
    controller.enqueue({ type: "text-end", id: textId });
  }
- const usageSummary = lastUsage ?? { inputTokens: 0, outputTokens: 0, totalTokens: 0 };
+ const usageSummary = lastUsage ?? createEmptyCodexUsage();
+ const totalTokens = (usageSummary.inputTokens.total ?? 0) + (usageSummary.outputTokens.total ?? 0);
  this.logger.info(
-   `[codex-cli] Stream completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${usageSummary.totalTokens}`
+   `[codex-cli] Stream completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${totalTokens}`
  );
  this.logger.debug(
-   `[codex-cli] Token usage - Input: ${usageSummary.inputTokens}, Output: ${usageSummary.outputTokens}, Total: ${usageSummary.totalTokens}`
+   `[codex-cli] Token usage - Input: ${usageSummary.inputTokens.total ?? 0}, Output: ${usageSummary.outputTokens.total ?? 0}, Total: ${totalTokens}`
  );
  controller.enqueue({
    type: "finish",
-   finishReason: "stop",
+   finishReason: mapCodexCliFinishReason(void 0),
    usage: usageSummary
  });
  controller.close();
@@ -1381,8 +1423,8 @@ function createCodexCli(options = {}) {
  };
  provider$1.languageModel = createModel;
  provider$1.chat = createModel;
- provider$1.textEmbeddingModel = ((modelId) => {
-   throw new provider.NoSuchModelError({ modelId, modelType: "textEmbeddingModel" });
+ provider$1.embeddingModel = ((modelId) => {
+   throw new provider.NoSuchModelError({ modelId, modelType: "embeddingModel" });
  });
  provider$1.imageModel = ((modelId) => {
    throw new provider.NoSuchModelError({ modelId, modelType: "imageModel" });
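Both bundles gain the same `mapCodexCliFinishReason` helper. Pulled out of the diff above as a standalone TypeScript sketch, it maps raw Codex CLI finish reasons onto the AI SDK's unified values, with `undefined` (no reason reported) treated as a normal stop:

```typescript
// Standalone copy of the mapCodexCliFinishReason helper from the diff above.
type UnifiedFinishReason =
  | 'stop' | 'length' | 'content-filter' | 'tool-calls' | 'error' | 'other';

function mapCodexCliFinishReason(
  reason: string | undefined,
): { unified: UnifiedFinishReason; raw: string | undefined } {
  switch (reason) {
    case 'stop':
    case 'end_turn':
    case undefined:
      // No reason (or a normal completion marker) counts as a clean stop.
      return { unified: 'stop', raw: reason };
    case 'length':
    case 'max_tokens':
      return { unified: 'length', raw: reason };
    case 'content_filter':
      return { unified: 'content-filter', raw: reason };
    case 'tool_calls':
      return { unified: 'tool-calls', raw: reason };
    case 'error':
      return { unified: 'error', raw: reason };
    default:
      // Unknown CLI reasons are preserved in `raw` but reported as 'other'.
      return { unified: 'other', raw: reason };
  }
}

console.log(mapCodexCliFinishReason(undefined).unified); // "stop"
console.log(mapCodexCliFinishReason('max_tokens').unified); // "length"
```

Keeping the raw reason alongside the unified one is what lets the v3 `finish` event report both without losing the CLI's original value.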
package/dist/index.d.cts CHANGED
@@ -1,4 +1,4 @@
- import { ProviderV2, LanguageModelV2 } from '@ai-sdk/provider';
+ import { ProviderV3, LanguageModelV3 } from '@ai-sdk/provider';

  /**
   * Logger interface for custom logging.
@@ -251,11 +251,11 @@ interface CodexCliProviderOptions {
    rmcpClient?: boolean;
  }

- interface CodexCliProvider extends ProviderV2 {
-   (modelId: string, settings?: CodexCliSettings): LanguageModelV2;
-   languageModel(modelId: string, settings?: CodexCliSettings): LanguageModelV2;
-   chat(modelId: string, settings?: CodexCliSettings): LanguageModelV2;
-   textEmbeddingModel(modelId: string): never;
+ interface CodexCliProvider extends ProviderV3 {
+   (modelId: string, settings?: CodexCliSettings): LanguageModelV3;
+   languageModel(modelId: string, settings?: CodexCliSettings): LanguageModelV3;
+   chat(modelId: string, settings?: CodexCliSettings): LanguageModelV3;
+   embeddingModel(modelId: string): never;
    imageModel(modelId: string): never;
  }
  declare function createCodexCli(options?: CodexCliProviderSettings): CodexCliProvider;
@@ -265,8 +265,8 @@ interface CodexLanguageModelOptions {
    id: string;
    settings?: CodexCliSettings;
  }
- declare class CodexCliLanguageModel implements LanguageModelV2 {
-   readonly specificationVersion: "v2";
+ declare class CodexCliLanguageModel implements LanguageModelV3 {
+   readonly specificationVersion: "v3";
    readonly provider = "codex-cli";
    readonly defaultObjectGenerationMode: "json";
    readonly supportsImageUrls = false;
@@ -301,8 +301,8 @@ declare class CodexCliLanguageModel implements LanguageModelV2 {
    private emitToolInvocation;
    private emitToolResult;
    private handleSpawnError;
-   doGenerate(options: Parameters<LanguageModelV2['doGenerate']>[0]): Promise<Awaited<ReturnType<LanguageModelV2['doGenerate']>>>;
-   doStream(options: Parameters<LanguageModelV2['doStream']>[0]): Promise<Awaited<ReturnType<LanguageModelV2['doStream']>>>;
+   doGenerate(options: Parameters<LanguageModelV3['doGenerate']>[0]): Promise<Awaited<ReturnType<LanguageModelV3['doGenerate']>>>;
+   doStream(options: Parameters<LanguageModelV3['doStream']>[0]): Promise<Awaited<ReturnType<LanguageModelV3['doStream']>>>;
  }

  declare function isAuthenticationError(err: unknown): boolean;
package/dist/index.d.ts CHANGED
@@ -1,4 +1,4 @@
- import { ProviderV2, LanguageModelV2 } from '@ai-sdk/provider';
+ import { ProviderV3, LanguageModelV3 } from '@ai-sdk/provider';

  /**
   * Logger interface for custom logging.
@@ -251,11 +251,11 @@ interface CodexCliProviderOptions {
    rmcpClient?: boolean;
  }

- interface CodexCliProvider extends ProviderV2 {
-   (modelId: string, settings?: CodexCliSettings): LanguageModelV2;
-   languageModel(modelId: string, settings?: CodexCliSettings): LanguageModelV2;
-   chat(modelId: string, settings?: CodexCliSettings): LanguageModelV2;
-   textEmbeddingModel(modelId: string): never;
+ interface CodexCliProvider extends ProviderV3 {
+   (modelId: string, settings?: CodexCliSettings): LanguageModelV3;
+   languageModel(modelId: string, settings?: CodexCliSettings): LanguageModelV3;
+   chat(modelId: string, settings?: CodexCliSettings): LanguageModelV3;
+   embeddingModel(modelId: string): never;
    imageModel(modelId: string): never;
  }
  declare function createCodexCli(options?: CodexCliProviderSettings): CodexCliProvider;
@@ -265,8 +265,8 @@ interface CodexLanguageModelOptions {
    id: string;
    settings?: CodexCliSettings;
  }
- declare class CodexCliLanguageModel implements LanguageModelV2 {
-   readonly specificationVersion: "v2";
+ declare class CodexCliLanguageModel implements LanguageModelV3 {
+   readonly specificationVersion: "v3";
    readonly provider = "codex-cli";
    readonly defaultObjectGenerationMode: "json";
    readonly supportsImageUrls = false;
@@ -301,8 +301,8 @@ declare class CodexCliLanguageModel implements LanguageModelV2 {
    private emitToolInvocation;
    private emitToolResult;
    private handleSpawnError;
-   doGenerate(options: Parameters<LanguageModelV2['doGenerate']>[0]): Promise<Awaited<ReturnType<LanguageModelV2['doGenerate']>>>;
-   doStream(options: Parameters<LanguageModelV2['doStream']>[0]): Promise<Awaited<ReturnType<LanguageModelV2['doStream']>>>;
+   doGenerate(options: Parameters<LanguageModelV3['doGenerate']>[0]): Promise<Awaited<ReturnType<LanguageModelV3['doGenerate']>>>;
+   doStream(options: Parameters<LanguageModelV3['doStream']>[0]): Promise<Awaited<ReturnType<LanguageModelV3['doStream']>>>;
  }

  declare function isAuthenticationError(err: unknown): boolean;
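The typings rename `textEmbeddingModel` to `embeddingModel`, and both of the declared entry points return `never` because the runtime throws `NoSuchModelError`. A minimal stand-in sketching that shape (the `StubNoSuchModelError` class below is a local placeholder, not the real class exported by `@ai-sdk/provider`):

```typescript
// Local placeholder mimicking NoSuchModelError's role; the real class lives
// in '@ai-sdk/provider' and has a different constructor signature surface.
class StubNoSuchModelError extends Error {
  constructor(
    readonly modelId: string,
    readonly modelType: string,
  ) {
    super(`No such ${modelType}: ${modelId}`);
  }
}

// Shape of the unsupported entry points: typed `never`, always throws.
const stubProvider = {
  embeddingModel(modelId: string): never {
    throw new StubNoSuchModelError(modelId, 'embeddingModel');
  },
  imageModel(modelId: string): never {
    throw new StubNoSuchModelError(modelId, 'imageModel');
  },
};

try {
  stubProvider.embeddingModel('any-model');
} catch (e) {
  console.log((e as StubNoSuchModelError).modelType); // "embeddingModel"
}
```

Typing the method as `never` lets TypeScript flag any code path that expects a usable embedding model from this provider.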
package/dist/index.js CHANGED
@@ -353,6 +353,41 @@ function isAuthenticationError(err) {
  }

  // src/codex-cli-language-model.ts
+ function createEmptyCodexUsage() {
+   return {
+     inputTokens: {
+       total: 0,
+       noCache: 0,
+       cacheRead: 0,
+       cacheWrite: 0
+     },
+     outputTokens: {
+       total: 0,
+       text: void 0,
+       reasoning: void 0
+     },
+     raw: void 0
+   };
+ }
+ function mapCodexCliFinishReason(reason) {
+   switch (reason) {
+     case "stop":
+     case "end_turn":
+     case void 0:
+       return { unified: "stop", raw: reason };
+     case "length":
+     case "max_tokens":
+       return { unified: "length", raw: reason };
+     case "content_filter":
+       return { unified: "content-filter", raw: reason };
+     case "tool_calls":
+       return { unified: "tool-calls", raw: reason };
+     case "error":
+       return { unified: "error", raw: reason };
+     default:
+       return { unified: "other", raw: reason };
+   }
+ }
  var codexCliProviderOptionsSchema = z.object({
    reasoningEffort: z.enum(["minimal", "low", "medium", "high", "xhigh"]).optional(),
    reasoningSummary: z.enum(["auto", "detailed"]).optional(),
@@ -385,7 +420,7 @@ function resolveCodexPath(explicitPath, allowNpx) {
    }
  }
  var CodexCliLanguageModel = class {
-   specificationVersion = "v2";
+   specificationVersion = "v3";
    provider = "codex-cli";
    defaultObjectGenerationMode = "json";
    supportsImageUrls = false;
@@ -724,8 +759,8 @@ var CodexCliLanguageModel = class {
    const add = (setting, name) => {
      if (setting !== void 0)
        unsupported.push({
-         type: "unsupported-setting",
-         setting: name,
+         type: "unsupported",
+         feature: name,
          details: `Codex CLI does not support ${name}; it will be ignored.`
        });
    };
@@ -748,15 +783,22 @@ var CodexCliLanguageModel = class {
  extractUsage(evt) {
    const reported = evt.usage;
    if (!reported) return void 0;
-   const inputTokens = reported.input_tokens ?? 0;
-   const outputTokens = reported.output_tokens ?? 0;
+   const inputTotal = reported.input_tokens ?? 0;
+   const outputTotal = reported.output_tokens ?? 0;
    const cachedInputTokens = reported.cached_input_tokens ?? 0;
    return {
-     inputTokens,
-     outputTokens,
-     // totalTokens should not double-count cached tokens; track cached separately
-     totalTokens: inputTokens + outputTokens,
-     cachedInputTokens
+     inputTokens: {
+       total: inputTotal,
+       noCache: inputTotal - cachedInputTokens,
+       cacheRead: cachedInputTokens,
+       cacheWrite: 0
+     },
+     outputTokens: {
+       total: outputTotal,
+       text: void 0,
+       reasoning: void 0
+     },
+     raw: reported
    };
  }
  getToolName(item) {
@@ -954,8 +996,8 @@ var CodexCliLanguageModel = class {
    `[codex-cli] Executing Codex CLI: ${cmd} with ${args.length} arguments, cwd: ${cwd ?? "default"}`
  );
  let text = "";
- const usage = { inputTokens: 0, outputTokens: 0, totalTokens: 0 };
- const finishReason = "stop";
+ let usage = createEmptyCodexUsage();
+ let finishReason = mapCodexCliFinishReason(void 0);
  const startTime = Date.now();
  const child = spawn(cmd, args, { env, cwd, stdio: ["ignore", "pipe", "pipe"] });
  let onAbort;
@@ -1000,9 +1042,7 @@ var CodexCliLanguageModel = class {
  if (event.type === "turn.completed") {
    const usageEvent = this.extractUsage(event);
    if (usageEvent) {
-     usage.inputTokens = usageEvent.inputTokens;
-     usage.outputTokens = usageEvent.outputTokens;
-     usage.totalTokens = usageEvent.totalTokens;
+     usage = usageEvent;
    }
  }
  if (event.type === "item.completed" && this.getItemType(event.item) === "assistant_message" && typeof event.item?.text === "string") {
@@ -1037,11 +1077,12 @@ var CodexCliLanguageModel = class {
    );
    return;
  }
+ const totalTokens = (usage.inputTokens.total ?? 0) + (usage.outputTokens.total ?? 0);
  this.logger.info(
-   `[codex-cli] Request completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${usage.totalTokens}`
+   `[codex-cli] Request completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${totalTokens}`
  );
  this.logger.debug(
-   `[codex-cli] Token usage - Input: ${usage.inputTokens}, Output: ${usage.outputTokens}, Total: ${usage.totalTokens}`
+   `[codex-cli] Token usage - Input: ${usage.inputTokens.total ?? 0}, Output: ${usage.outputTokens.total ?? 0}, Total: ${totalTokens}`
  );
  resolve();
  } else {
@@ -1269,16 +1310,17 @@ var CodexCliLanguageModel = class {
    controller.enqueue({ type: "text-delta", id: textId, delta: finalText });
    controller.enqueue({ type: "text-end", id: textId });
  }
- const usageSummary = lastUsage ?? { inputTokens: 0, outputTokens: 0, totalTokens: 0 };
+ const usageSummary = lastUsage ?? createEmptyCodexUsage();
+ const totalTokens = (usageSummary.inputTokens.total ?? 0) + (usageSummary.outputTokens.total ?? 0);
  this.logger.info(
-   `[codex-cli] Stream completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${usageSummary.totalTokens}`
+   `[codex-cli] Stream completed - Session: ${this.sessionId ?? "N/A"}, Duration: ${duration}ms, Tokens: ${totalTokens}`
  );
  this.logger.debug(
-   `[codex-cli] Token usage - Input: ${usageSummary.inputTokens}, Output: ${usageSummary.outputTokens}, Total: ${usageSummary.totalTokens}`
+   `[codex-cli] Token usage - Input: ${usageSummary.inputTokens.total ?? 0}, Output: ${usageSummary.outputTokens.total ?? 0}, Total: ${totalTokens}`
  );
  controller.enqueue({
    type: "finish",
-   finishReason: "stop",
+   finishReason: mapCodexCliFinishReason(void 0),
    usage: usageSummary
  });
  controller.close();
@@ -1378,8 +1420,8 @@ function createCodexCli(options = {}) {
  };
  provider.languageModel = createModel;
  provider.chat = createModel;
- provider.textEmbeddingModel = ((modelId) => {
-   throw new NoSuchModelError({ modelId, modelType: "textEmbeddingModel" });
+ provider.embeddingModel = ((modelId) => {
+   throw new NoSuchModelError({ modelId, modelType: "embeddingModel" });
  });
  provider.imageModel = ((modelId) => {
    throw new NoSuchModelError({ modelId, modelType: "imageModel" });
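The usage change is the most visible behavioral difference in this release: the flat v5 `{ inputTokens, outputTokens, totalTokens }` object becomes the nested LanguageModelV3 shape, with cached input tokens split out of the non-cached bucket so totals are not double-counted. A standalone sketch of the conversion performed by `extractUsage` in the diff above (the helper name `toV3Usage` is illustrative, not an export of the package):

```typescript
// Shape of the raw usage object reported by Codex CLI's turn.completed event.
interface CodexReportedUsage {
  input_tokens?: number;
  output_tokens?: number;
  cached_input_tokens?: number;
}

// Convert the flat CLI counts into the nested v3 usage shape from the diff.
function toV3Usage(reported: CodexReportedUsage) {
  const inputTotal = reported.input_tokens ?? 0;
  const outputTotal = reported.output_tokens ?? 0;
  const cachedInputTokens = reported.cached_input_tokens ?? 0;
  return {
    inputTokens: {
      total: inputTotal,
      noCache: inputTotal - cachedInputTokens, // cached tokens carved out
      cacheRead: cachedInputTokens,
      cacheWrite: 0, // Codex CLI does not report cache writes
    },
    outputTokens: {
      total: outputTotal,
      text: undefined, // text/reasoning split not reported by the CLI
      reasoning: undefined,
    },
    raw: reported, // original payload preserved for consumers
  };
}

const usage = toV3Usage({ input_tokens: 120, output_tokens: 30, cached_input_tokens: 100 });
console.log(usage.inputTokens.noCache); // 20
console.log(usage.inputTokens.total + usage.outputTokens.total); // 150
```

This is also why both bundles now derive `totalTokens` for logging as `inputTokens.total + outputTokens.total` instead of reading a stored field.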
package/package.json CHANGED
@@ -1,7 +1,7 @@
  {
    "name": "ai-sdk-provider-codex-cli",
-   "version": "0.7.0",
-   "description": "AI SDK v5 provider for OpenAI Codex CLI with native JSON Schema support",
+   "version": "1.0.0",
+   "description": "AI SDK v6 provider for OpenAI Codex CLI (use ChatGPT Plus/Pro subscription)",
    "keywords": [
      "ai-sdk",
      "codex",
@@ -59,8 +59,8 @@
      "test:coverage": "vitest run --coverage"
    },
    "dependencies": {
-     "@ai-sdk/provider": "2.0.0",
-     "@ai-sdk/provider-utils": "3.0.3",
+     "@ai-sdk/provider": "^3.0.0",
+     "@ai-sdk/provider-utils": "^4.0.1",
      "jsonc-parser": "^3.3.1"
    },
    "optionalDependencies": {
@@ -70,7 +70,7 @@
      "@eslint/js": "^9.14.0",
      "@types/node": "20.19.6",
      "@vitest/coverage-v8": "^3.2.4",
-     "ai": "5.0.93",
+     "ai": "^6.0.3",
      "eslint": "^9.14.0",
      "prettier": "^3.3.3",
      "tsup": "8.5.0",