@classicicn/codex-transfer 0.3.0 → 0.3.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)

1. package/README.en.md +21 -0
2. package/README.md +21 -0
3. package/package.json +1 -1
package/README.en.md CHANGED
@@ -1,5 +1,7 @@
 # codex-transfer
 
+**English** | [中文](./README.md)
+
 > Responses API ↔ Chat Completions translation bridge — use DeepSeek, Kimi, Qwen, and other OpenAI-compatible providers with Codex CLI.
 
 ## Overview
@@ -371,6 +373,25 @@ const { app, port } = createTransfer({
 
 ---
 
+## Changelog
+
+### v0.3.0 (2026-05-08)
+
+- **Token usage**: Extract usage from upstream streaming responses — Codex now correctly displays context utilization (fixes 0% display issue)
+- **Usage details**: Auto-map `cached_tokens` and `reasoning_tokens`, compatible with both OpenAI and DeepSeek upstream formats
+- **Reasoning effort**: Map `reasoning.effort` to DeepSeek `thinking`/`reasoning_effort` and MiMo/Kimi/GLM `thinking` toggle
+- **Config-driven control**: New `--no-reasoning-effort` / `reasoningEffort` option to strip reasoning effort parameters on demand
+
+### v0.2.0 (2026-05-07)
+
+- First npm release
+- Responses API ↔ Chat Completions bidirectional protocol translation
+- Streaming SSE event generation, session management, reasoning model support
+- Model name mapping, daemon mode, log rotation
+- TLS certificate bypass, config file support
+
+---
+
 ## License
 
 MIT
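
The v0.3.0 "usage details" entry above describes normalizing the Chat Completions `usage` object from either an OpenAI-style or a DeepSeek-style upstream into the Responses API shape that Codex reads. A minimal sketch of what such a mapping might look like — the field names follow the providers' public API docs (`prompt_tokens_details.cached_tokens` for OpenAI, top-level `prompt_cache_hit_tokens` for DeepSeek), but the package's actual implementation is not shown in this diff:

```javascript
// Hypothetical sketch, not the package's real code: normalize a Chat
// Completions `usage` object (OpenAI or DeepSeek style) into the
// Responses API usage shape that Codex displays.
function mapUsage(usage) {
  if (!usage) return null;
  const promptDetails = usage.prompt_tokens_details ?? {};
  const completionDetails = usage.completion_tokens_details ?? {};
  return {
    input_tokens: usage.prompt_tokens ?? 0,
    output_tokens: usage.completion_tokens ?? 0,
    total_tokens: usage.total_tokens ?? 0,
    input_tokens_details: {
      // OpenAI nests cached_tokens under prompt_tokens_details;
      // DeepSeek reports prompt_cache_hit_tokens at the top level.
      cached_tokens:
        promptDetails.cached_tokens ?? usage.prompt_cache_hit_tokens ?? 0,
    },
    output_tokens_details: {
      reasoning_tokens: completionDetails.reasoning_tokens ?? 0,
    },
  };
}
```

Emitting this normalized object in the final streamed event is what lets Codex show a non-zero context-utilization figure, per the "fixes 0% display issue" note above.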
package/README.md CHANGED
@@ -1,5 +1,7 @@
 # codex-transfer
 
+[English](./README.en.md) | **中文**
+
 > Responses API ↔ Chat Completions protocol translation bridge — seamlessly connect Codex CLI to DeepSeek, Kimi, Qwen, and any other OpenAI-compatible provider.
 
 ## Overview
@@ -371,6 +373,25 @@ const { app, port } = createTransfer({
 
 ---
 
+## Changelog
+
+### v0.3.0 (2026-05-08)
+
+- **Token usage**: Extract usage from upstream streaming responses so Codex correctly displays context utilization (fixes the 0% display issue)
+- **Usage details**: Auto-map `cached_tokens` and `reasoning_tokens`, compatible with both OpenAI and DeepSeek upstream formats
+- **Reasoning effort**: Map `reasoning.effort` to DeepSeek `thinking`/`reasoning_effort` and the MiMo/Kimi/GLM `thinking` toggle
+- **Config-driven control**: New `--no-reasoning-effort` / `reasoningEffort` option to strip reasoning effort parameters on demand
+
+### v0.2.0 (2026-05-07)
+
+- First npm release
+- Responses API ↔ Chat Completions bidirectional protocol translation
+- Streaming SSE event sequence generation, session management, reasoning model support
+- Model name mapping, daemon mode, log rotation
+- TLS certificate bypass, config file support
+
+---
+
 ## License
 
 MIT
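
The "reasoning effort" and "config-driven control" entries in both changelogs can be illustrated together. This is a hypothetical sketch only: the changelog names `reasoning.effort`, `reasoningEffort`, and `--no-reasoning-effort`, but the outgoing `reasoning_effort` field and the exact option handling below are illustrative assumptions, not the package's actual code:

```javascript
// Hypothetical sketch: translate the Responses API `reasoning.effort`
// field on an incoming request into a provider-side parameter, or strip
// it entirely when reasoningEffort is disabled (--no-reasoning-effort).
function translateReasoning(request, { reasoningEffort = true } = {}) {
  const { reasoning, ...body } = request; // drop the Responses-only field
  if (!reasoningEffort || !reasoning?.effort) return body;
  // Illustrative mapping; real field names vary by upstream provider.
  return { ...body, reasoning_effort: reasoning.effort };
}
```

Stripping on demand matters for upstreams that reject unknown parameters, which is presumably why the toggle is exposed both as a CLI flag and as a config key.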
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@classicicn/codex-transfer",
-  "version": "0.3.0",
+  "version": "0.3.1",
   "description": "Responses API ↔ Chat Completions translation bridge for Codex — use DeepSeek, Kimi, Qwen, and other providers with Codex",
   "type": "module",
   "main": "dist/codex-transfer.mjs",