opencode-aicodewith-auth 0.1.0 → 0.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.ai.md ADDED
@@ -0,0 +1,262 @@
1
+ # Install & Configure OpenCode with AICodeWith Auth
2
+
3
+ You are an LLM agent.
4
+ Your task is to set up **OpenCode** with the **opencode-aicodewith-auth** plugin, using **AICodeWith** as the authentication provider.
5
+
6
+ You MUST follow the steps **in order**. DO NOT skip any step.
7
+
8
+ ---
9
+
10
+ ## Step 0 — Ask for AICodeWith API Key (MANDATORY, DO NOT SKIP)
11
+
12
+ Ask the user **exactly** this (keep the wording natural, but the meaning must be preserved):
13
+
14
+ > **"你现在有没有 AICodeWith 的 API key?"**
15
+ >
16
+ > - 如果已经有了,请直接发给我
17
+ > - 如果还没有,请去 **https://aicodewith.com/** 注册(按住 Command 点击链接可直接跳转),注册后会有免费体验额度
18
+ >
19
+ > 请告诉我你的情况,我会继续帮你配置。
20
+
21
+ ### Branching rules (IMPORTANT)
22
+
23
+ #### Case A — User HAS a key
24
+
25
+ - User provides the key (format: `sk-...` or similar).
26
+ - Store it temporarily as: `AICODEWITH_API_KEY`.
27
+ - Continue to **Step 1**.
28
+
29
+ #### Case B — User DOES NOT have a key
30
+
31
+ The link **https://aicodewith.com/** is already provided above. Just wait for the user to register and come back with their key.
32
+
33
+ **Rules:**
34
+ - DO NOT proceed with installation until the user confirms they have obtained a key.
35
+ - DO NOT fabricate or guess a key.
36
+ - Wait patiently for the user to respond.
37
+
38
+ Only continue when the user explicitly provides an AICodeWith API key.
39
+
40
+ ---
41
+
42
+ ## Step 1 — Check if OpenCode is installed
43
+
44
+ Run:
45
+
46
+ ```bash
47
+ which opencode || command -v opencode
48
+ ```
49
+
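+ As a sketch, the two outcomes can be distinguished like this (branching logic only; the actual install commands follow below):
+
+ ```bash
+ # Sketch: branch on whether the opencode binary is on PATH.
+ if command -v opencode >/dev/null 2>&1; then
+   opencode --version   # already installed: continue to Step 2
+ else
+   echo "OpenCode is not installed; install it first (see below)"
+ fi
+ ```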
50
+ ### If OpenCode is NOT installed
51
+
52
+ Guide the user to install OpenCode first:
53
+
54
+ **macOS (Homebrew):**
55
+ ```bash
56
+ brew install opencode
57
+ ```
58
+
59
+ **npm:**
60
+ ```bash
61
+ npm install -g opencode-ai
62
+ ```
63
+
64
+ **Other methods:** See https://opencode.ai/docs
65
+
66
+ After installation, verify:
67
+ ```bash
68
+ opencode --version
69
+ ```
70
+
71
+ ### If OpenCode IS installed
72
+
73
+ Continue to **Step 2**.
74
+
75
+ ---
76
+
77
+ ## Step 2 — Write auth.json (DO NOT use opencode auth login)
78
+
79
+ **IMPORTANT:** Do NOT run `opencode auth login`; it is an interactive command and will block the session. Instead, edit `auth.json` directly.
80
+
81
+ Target path:
82
+ - **Linux / macOS:** `~/.local/share/opencode/auth.json`
83
+
84
+ ### 2.1 Ensure directory exists
85
+
86
+ ```bash
87
+ mkdir -p ~/.local/share/opencode
88
+ ```
89
+
90
+ ### 2.2 Read existing auth.json (if it exists)
91
+
92
+ If `auth.json` exists:
93
+ - Read and parse it as JSON.
94
+ - Preserve all existing providers (openai / anthropic / google / others).
95
+ - Only modify or insert the `aicodewith` entry.
96
+
97
+ If it does NOT exist:
98
+ - Start from an empty JSON object `{}`.
99
+
100
+ ### 2.3 Write AICodeWith entry
101
+
102
+ Add or update the `aicodewith` entry with the user's API key:
103
+
104
+ ```json
105
+ {
106
+ "aicodewith": {
107
+ "type": "api",
108
+ "key": "<USER_PROVIDED_API_KEY>"
109
+ }
110
+ }
111
+ ```
112
+
113
+ **Merge rules:**
114
+ - If `aicodewith` exists → overwrite its `key` field only
115
+ - If `aicodewith` does not exist → add the entire entry
116
+ - **NEVER delete or modify other providers**
117
+
118
+ After writing, re-read the file to ensure it is valid JSON (one way to script this merge and check is sketched below).
119
+
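+ The following is a minimal sketch of that merge, assuming Node.js is available (it ships with npm) and that the user's key has been exported as `AICODEWITH_API_KEY`; it only adds or updates the `aicodewith` entry and leaves every other provider untouched:
+
+ ```bash
+ # Sketch only: merge the aicodewith entry into auth.json, preserving other providers.
+ AUTH_FILE="$HOME/.local/share/opencode/auth.json"
+ mkdir -p "$(dirname "$AUTH_FILE")"
+ node -e '
+ const fs = require("fs");
+ const file = process.argv[1];
+ let data = {};
+ try { data = JSON.parse(fs.readFileSync(file, "utf8")); } catch { /* no file yet: start from {} */ }
+ // Keep existing aicodewith fields, add type if missing, and overwrite only the key.
+ data.aicodewith = Object.assign({ type: "api" }, data.aicodewith, { key: process.env.AICODEWITH_API_KEY });
+ fs.writeFileSync(file, JSON.stringify(data, null, 2) + "\n");
+ ' "$AUTH_FILE"
+ # Re-read to confirm the file is still valid JSON.
+ node -e 'JSON.parse(require("fs").readFileSync(process.argv[1], "utf8")); console.log("auth.json OK")' "$AUTH_FILE"
+ ```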
120
+ Continue to **Step 3**.
121
+
122
+ ---
123
+
124
+ ## Step 3 — Configure OpenCode to use the plugin
125
+
126
+ Target path:
127
+ - **Global config:** `~/.config/opencode/opencode.json`
128
+ - **Project config:** `./opencode.json` (in current directory)
129
+
130
+ ### 3.1 Ensure config directory exists
131
+
132
+ ```bash
133
+ mkdir -p ~/.config/opencode
134
+ ```
135
+
136
+ ### 3.2 Read existing opencode.json (if it exists)
137
+
138
+ If `opencode.json` exists:
139
+ - Read and parse it as JSON.
140
+ - Preserve all existing configuration.
141
+
142
+ If it does NOT exist:
143
+ - Start with this base structure:
144
+ ```json
145
+ {
146
+ "$schema": "https://opencode.ai/config.json"
147
+ }
148
+ ```
149
+
150
+ ### 3.3 Add plugin to configuration
151
+
152
+ Ensure the `plugin` array includes `opencode-aicodewith-auth`:
153
+
154
+ ```json
155
+ {
156
+ "$schema": "https://opencode.ai/config.json",
157
+ "plugin": ["opencode-aicodewith-auth"]
158
+ }
159
+ ```
160
+
161
+ **Merge rules** (a scripted sketch follows this list):
162
+ - If `plugin` array exists → append `"opencode-aicodewith-auth"` if not already present
163
+ - If `plugin` does not exist → create the array with `["opencode-aicodewith-auth"]`
164
+ - **NEVER remove existing plugins**
165
+
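+ A minimal sketch of that merge, again assuming Node.js is available; it appends the plugin name only when it is missing and leaves the rest of the configuration untouched:
+
+ ```bash
+ # Sketch only: register opencode-aicodewith-auth in the plugin array, keeping existing config.
+ CONFIG_FILE="$HOME/.config/opencode/opencode.json"
+ mkdir -p "$(dirname "$CONFIG_FILE")"
+ node -e '
+ const fs = require("fs");
+ const file = process.argv[1];
+ let cfg = { "$schema": "https://opencode.ai/config.json" };
+ try { cfg = JSON.parse(fs.readFileSync(file, "utf8")); } catch { /* no file yet: use the base structure */ }
+ cfg.plugin = Array.isArray(cfg.plugin) ? cfg.plugin : [];
+ if (!cfg.plugin.includes("opencode-aicodewith-auth")) cfg.plugin.push("opencode-aicodewith-auth");
+ fs.writeFileSync(file, JSON.stringify(cfg, null, 2) + "\n");
+ ' "$CONFIG_FILE"
+ ```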
166
+ ---
167
+
168
+ ## Step 4 — Verify installation and check npm registry
169
+
170
+ ### 4.1 Check whether the user is using a non-official npm registry
171
+
172
+ Run this command to detect the npm registry:
173
+
174
+ ```bash
175
+ npm config get registry
176
+ ```
177
+
178
+ If the output is NOT `https://registry.npmjs.org/` (e.g., shows taobao, cnpm, or other mirrors), **remember to include a warning in your final message**.
179
+
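+ A minimal check, as a sketch (any registry other than the official URL should trigger the warning):
+
+ ```bash
+ # Sketch: flag a non-official npm registry so the warning is included later.
+ REGISTRY="$(npm config get registry)"
+ if [ "$REGISTRY" != "https://registry.npmjs.org/" ]; then
+   echo "Non-official npm registry detected: $REGISTRY"
+   echo "Include the mirror-sync warning in the final message (Step 4.2)."
+ fi
+ ```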
180
+ ### 4.2 Tell the user installation is complete
181
+
182
+ **If using official npm registry** (https://registry.npmjs.org/):
183
+
184
+ > "安装完成!请重启 OpenCode 以加载插件。
185
+ > 你现在可以使用以下模型:
186
+ > - `gpt-5.2-codex`
187
+ > - `gpt-5.2`
188
+ > - `claude-sonnet-4-5-20250929`
189
+ > - `claude-opus-4-5-20251101`
190
+ > - `gemini-3-pro-high`
191
+ >
192
+ > Test command: `opencode --model gpt-5.2-codex`"
193
+
194
+ **If using a mirror registry** (taobao, cnpm, etc.), add this warning:
195
+
196
+ > "安装完成!请重启 OpenCode 以加载插件。
197
+ > 你现在可以使用以下模型:
198
+ > - `gpt-5.2-codex`
199
+ > - `gpt-5.2`
200
+ > - `claude-sonnet-4-5-20250929`
201
+ > - `claude-opus-4-5-20251101`
202
+ > - `gemini-3-pro-high`
203
+ >
204
+ > Test command: `opencode --model gpt-5.2-codex`
205
+ >
206
+ > ⚠️ **Note**: A non-official npm registry was detected. If the plugin fails to load when OpenCode starts (e.g. `BunInstallFailedError`), the mirror may not have synced the latest version yet. To fix it:
207
+ > ```bash
208
+ > # Temporarily reinstall using the official registry
209
+ > npm config set registry https://registry.npmjs.org/
210
+ > # After restarting OpenCode, you can switch back to your original registry
211
+ > ```"
212
+
213
+ ---
214
+
215
+ ## Summary of changes
216
+
217
+ | File | Description |
218
+ |------|-------------|
219
+ | `~/.local/share/opencode/auth.json` | Stores AICodeWith API key |
220
+ | `~/.config/opencode/opencode.json` | Registers the plugin |
221
+
222
+ ---
223
+
224
+ ## Troubleshooting
225
+
226
+ If the user reports issues:
227
+
228
+ ### "BunInstallFailedError" or plugin fails to load
229
+ This is usually caused by npm mirror sync delays. The user's npm registry may not have synced the latest version yet.
230
+
231
+ Tell the user:
232
+ > "这个问题通常是因为你的 npm 镜像源还没有同步最新版本。解决方法:
233
+ > ```bash
234
+ > # 临时切换到官方源
235
+ > npm config set registry https://registry.npmjs.org/
236
+ > # 重启 OpenCode
237
+ > # 之后可以改回原来的源
238
+ > ```"
239
+
240
+ ### "Provider not found: aicodewith"
241
+ - Verify `opencode.json` contains `"plugin": ["opencode-aicodewith-auth"]` (a quick check is sketched below)
242
+ - Ask user to restart OpenCode
243
+
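+ A quick check, as a sketch:
+
+ ```bash
+ # Sketch: confirm the plugin is registered in the global OpenCode config.
+ grep -n "opencode-aicodewith-auth" ~/.config/opencode/opencode.json || echo "plugin entry missing from opencode.json"
+ ```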
244
+ ### "Authentication failed"
245
+ - Verify that `auth.json` contains the key in the correct format (see the sketch below)
246
+ - Ask user to double-check their API key at https://aicodewith.com/
247
+
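+ One way to confirm a key is present without echoing it (sketch, assumes Node.js):
+
+ ```bash
+ # Sketch: check that auth.json has an aicodewith entry with a non-empty key.
+ node -e '
+ const os = require("os");
+ const auth = require(os.homedir() + "/.local/share/opencode/auth.json");
+ const key = auth.aicodewith && auth.aicodewith.key;
+ console.log(key ? "aicodewith key present (starts with " + key.slice(0, 3) + ")" : "aicodewith key missing");
+ '
+ ```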
248
+ ### "Model not found"
249
+ - The plugin auto-injects models on first load
250
+ - Ask user to restart OpenCode and try again
251
+
252
+ ---
253
+
254
+ ## Important rules for the AI agent
255
+
256
+ 1. **NEVER skip Step 0** — Always ask for the API key first
257
+ 2. **NEVER fabricate API keys** — Only use what the user provides
258
+ 3. **NEVER delete existing config** — Always merge, never overwrite entirely
259
+ 4. **NEVER run interactive commands** — Do NOT run `opencode auth login`, it will block. Edit files directly instead.
260
+ 5. **ALWAYS verify file writes** — Re-read after writing to confirm valid JSON
261
+ 6. **ALWAYS wait for user confirmation** — Don't assume, ask when unclear
262
+ 7. **ALWAYS check npm registry and warn if using mirrors** — Run `npm config get registry` and warn user about potential sync issues if not using official registry
package/README.md CHANGED
@@ -1,113 +1,135 @@
1
- # opencode-aicodewith-auth
1
+ <!--
2
+ opencode-aicodewith-auth
3
+ An OpenCode auth plugin for AICodewith
2
4
 
3
- An OpenCode plugin for AICodewith authentication. Access multiple AI models including GPT-5.2, Claude, and Gemini through AICodewith API.
5
+ 📌 MAINTAINER NOTICE:
6
+ Any architecture, feature, or convention changes MUST update:
7
+ 1. This file (if affecting overall structure)
8
+ 2. Relevant subdirectory's ARCHITECTURE.md
9
+ 3. Affected file headers (input/output/pos comments)
10
+ -->
4
11
 
5
- ## Features
12
+ <div align="center">
6
13
 
7
- - Unified authentication for multiple AI providers
8
- - Support for the following models:
9
- - `gpt-5.2-codex` - OpenAI GPT-5.2 Codex
10
- - `gpt-5.2` - OpenAI GPT-5.2
11
- - `claude-sonnet-4-5-20250929` - Anthropic Claude Sonnet 4.5
12
- - `claude-opus-4-5-20251101` - Anthropic Claude Opus 4.5
13
- - `gemini-3-pro-high` - Google Gemini 3 Pro High
14
- - Automatic configuration injection
14
+ # opencode-aicodewith-auth
15
15
 
16
- ## Installation
16
+ **The AICodewith authentication plugin for OpenCode**
17
17
 
18
- ### From npm
18
+ Sign in once, use multiple models (GPT-5.2, Claude, Gemini)
19
19
 
20
- Add the plugin to your OpenCode configuration file:
20
+ [![npm version](https://img.shields.io/npm/v/opencode-aicodewith-auth?label=npm&style=flat-square)](https://www.npmjs.com/package/opencode-aicodewith-auth)
21
+ [![npm downloads](https://img.shields.io/npm/dt/opencode-aicodewith-auth?style=flat-square)](https://www.npmjs.com/package/opencode-aicodewith-auth)
22
+ [![license](https://img.shields.io/badge/license-MIT-black?style=flat-square)](#license)
21
23
 
22
- ```json title="opencode.json"
23
- {
24
- "$schema": "https://opencode.ai/config.json",
25
- "plugin": ["opencode-aicodewith-auth"]
26
- }
27
- ```
24
+ </div>
28
25
 
29
- Then restart OpenCode. The plugin will be automatically installed.
26
+ ---
30
27
 
31
- ### From source
28
+ ## Architecture Overview
32
29
 
33
- 1. Clone the repository:
34
- ```bash
35
- git clone https://github.com/DaneelOlivaw1/opencode-aicodewith-auth.git
36
- cd opencode-aicodewith-auth
37
30
  ```
38
-
39
- 2. Install dependencies:
40
- ```bash
41
- bun install
31
+ ┌─────────────────────────────────────────────────────────────────┐
32
+ │ opencode-aicodewith-auth │
33
+ ├─────────────────────────────────────────────────────────────────┤
34
+ │ index.ts Plugin entry, auth hook, config injection │
35
+ │ provider.ts Multi-provider factory (OpenAI/Claude/Gemini)│
36
+ ├─────────────────────────────────────────────────────────────────┤
37
+ │ lib/ Core library modules │
38
+ │ ├── constants.ts Global constants & header names │
39
+ │ ├── types.ts Shared TypeScript interfaces │
40
+ │ ├── logger.ts Debug/request logging utilities │
41
+ │ ├── prompts/ Codex prompt fetching & bridging │
42
+ │ └── request/ Request transformation & response handling│
43
+ ├─────────────────────────────────────────────────────────────────┤
44
+ │ scripts/ Installation automation │
45
+ │ └── install-opencode-aicodewith.js postinstall config writer │
46
+ └─────────────────────────────────────────────────────────────────┘
42
47
  ```
43
48
 
44
- 3. Build the plugin:
45
- ```bash
46
- bun run build
49
+ ### Data Flow
50
+
51
+ ```
52
+ User Request → OpenCode → Plugin Auth Hook → Route by Model:
53
+ ├── gpt-*/codex-* → Codex API (transform + headers)
54
+ ├── claude-* → Anthropic API (URL rewrite)
55
+ └── gemini-* → Gemini API (headers + URL build)
47
56
  ```
48
57
 
49
- ## Configuration
58
+ ---
50
59
 
51
- After installation, the plugin will automatically configure the `aicodewith` provider in your OpenCode config. You can also manually configure it:
60
+ ## Supported Models
52
61
 
53
- ```json title="~/.config/opencode/opencode.json"
54
- {
55
- "provider": {
56
- "aicodewith": {
57
- "name": "AICodewith",
58
- "api": "https://api.openai.com/v1",
59
- "env": ["AICODEWITH_API_KEY"],
60
- "models": {
61
- "gpt-5.2-codex": {},
62
- "gpt-5.2": {},
63
- "claude-sonnet-4-5-20250929": {},
64
- "claude-opus-4-5-20251101": {},
65
- "gemini-3-pro-high": {}
66
- }
67
- }
68
- }
69
- }
70
- ```
62
+ | Model ID | Display Name | Best For |
63
+ |---------|---------|---------|
64
+ | `aicodewith/gpt-5.2-codex` | GPT-5.2 Codex | Everyday coding, code generation |
65
+ | `aicodewith/gpt-5.2` | GPT-5.2 | Architecture design, logical reasoning |
66
+ | `aicodewith/claude-sonnet-4-5-20250929` | Claude Sonnet 4.5 | Code review, documentation lookup |
67
+ | `aicodewith/claude-opus-4-5-20251101` | Claude Opus 4.5 | Complex tasks, deep thinking |
68
+ | `aicodewith/gemini-3-pro-high` | Gemini 3 Pro | Front-end UI, multimodal tasks |
71
69
 
72
- ## Authentication
70
+ ---
73
71
 
74
- Set your AICodewith API key as an environment variable:
72
+ ## Quick Start
75
73
 
76
- ```bash
77
- export AICODEWITH_API_KEY=sk-your-api-key
74
+ ### 🔥 Option 1: Full install (recommended, includes oh-my-opencode)
75
+
76
+ ```
77
+ Help me install and configure opencode-aicodewith-auth and oh-my-opencode, following these instructions: https://raw.githubusercontent.com/DaneelOlivaw1/opencode-aicodewith-auth/main/INSTALL-WITH-OMO.md
78
78
  ```
79
79
 
80
- Or authenticate via the OpenCode TUI by selecting the "AICodewith API Key" authentication method.
80
+ ### 📦 Option 2: Standalone install
81
81
 
82
- ## Usage
82
+ ```
83
+ Help me install and configure opencode-aicodewith-auth, following these instructions: https://raw.githubusercontent.com/DaneelOlivaw1/opencode-aicodewith-auth/main/README.ai.md
84
+ ```
83
85
 
84
- Once configured, you can select any of the supported models in OpenCode:
86
+ <details>
87
+ <summary><strong>Manual installation</strong></summary>
85
88
 
86
- ```bash
87
- opencode --model gpt-5.2-codex
89
+ 1. Edit `~/.config/opencode/opencode.json`:
90
+ ```json
91
+ {
92
+ "$schema": "https://opencode.ai/config.json",
93
+ "plugin": ["opencode-aicodewith-auth"]
94
+ }
88
95
  ```
89
96
 
90
- Or switch models within the TUI.
97
+ 2. Run `opencode auth login` → select Other → enter `aicodewith` → enter your API Key
98
+
99
+ </details>
91
100
 
92
- ## Development
101
+ ---
93
102
 
94
- ### Build
103
+ ## Usage
95
104
 
96
105
  ```bash
97
- bun run build
106
+ opencode --model aicodewith/gpt-5.2-codex
98
107
  ```
99
108
 
100
- ### Type check
109
+ ---
110
+
111
+ ## Development
101
112
 
102
113
  ```bash
114
+ git clone https://github.com/DaneelOlivaw1/opencode-aicodewith-auth.git
115
+ cd opencode-aicodewith-auth
116
+ bun install
117
+ bun run build
103
118
  bun run typecheck
104
119
  ```
105
120
 
106
- ### Clean
121
+ ---
107
122
 
108
- ```bash
109
- bun run clean
110
- ```
123
+ ## File Index
124
+
125
+ | File | Role | Description |
126
+ |------|------|-------------|
127
+ | `index.ts` | **Entry** | Plugin main, auth hook, config auto-injection |
128
+ | `provider.ts` | **Core** | Multi-provider language model factory |
129
+ | `lib/` | **Library** | See [lib/ARCHITECTURE.md](lib/ARCHITECTURE.md) |
130
+ | `scripts/` | **Tooling** | See [scripts/ARCHITECTURE.md](scripts/ARCHITECTURE.md) |
131
+
132
+ ---
111
133
 
112
134
  ## License
113
135
 
package/dist/index.d.ts CHANGED
@@ -1,3 +1,11 @@
1
+ /**
2
+ * @file index.ts
3
+ * @input OpenCode plugin context, auth credentials
4
+ * @output Auth hook, config injection, fetch interceptor
5
+ * @pos Plugin entry point - orchestrates auth and request routing
6
+ *
7
+ * 📌 On change: Update this header + README.md file index
8
+ */
1
9
  import type { Plugin, PluginInput, Hooks } from "@opencode-ai/plugin";
2
10
  import { type AicodewithProviderSettings } from "./provider";
3
11
  export declare function createAicodewith(input: AicodewithProviderSettings | PluginInput): import("./provider").AicodewithProvider | Hooks;
package/dist/index.js CHANGED
@@ -998,17 +998,19 @@ var CODEX_MODEL_PREFIXES = ["gpt-", "codex"];
998
998
  var PACKAGE_NAME = "opencode-aicodewith-auth";
999
999
  var PROVIDER_NAME = "AICodewith";
1000
1000
  var PLUGIN_ENTRY = import.meta.url;
1001
- var PROVIDER_NPM = new URL("./provider.ts", import.meta.url).href;
1001
+ var PROVIDER_EXT = import.meta.url.endsWith(".ts") ? ".ts" : ".js";
1002
+ var PROVIDER_NPM = new URL(`./provider${PROVIDER_EXT}`, import.meta.url).href;
1002
1003
  var DEFAULT_API = "https://api.openai.com/v1";
1003
1004
  var DEFAULT_ENV = ["AICODEWITH_API_KEY"];
1004
1005
  var DEFAULT_OUTPUT_TOKEN_MAX = 32000;
1005
- var ALLOWED_MODEL_IDS = [
1006
- "gpt-5.2-codex",
1007
- "gpt-5.2",
1008
- "claude-sonnet-4-5-20250929",
1009
- "claude-opus-4-5-20251101",
1010
- "gemini-3-pro-high"
1011
- ];
1006
+ var MODEL_CONFIGS = {
1007
+ "gpt-5.2-codex": { name: "GPT-5.2 Codex" },
1008
+ "gpt-5.2": { name: "GPT-5.2" },
1009
+ "claude-sonnet-4-5-20250929": { name: "Claude Sonnet 4.5" },
1010
+ "claude-opus-4-5-20251101": { name: "Claude Opus 4.5" },
1011
+ "gemini-3-pro-high": { name: "Gemini 3 Pro" }
1012
+ };
1013
+ var ALLOWED_MODEL_IDS = Object.keys(MODEL_CONFIGS);
1012
1014
  var ALLOWED_MODEL_SET = new Set(ALLOWED_MODEL_IDS);
1013
1015
  var homeDir = process.env.OPENCODE_TEST_HOME || os.homedir();
1014
1016
  var configRoot = process.env.XDG_CONFIG_HOME || path.join(homeDir, ".config");
@@ -1016,7 +1018,9 @@ var configDir = path.join(configRoot, "opencode");
1016
1018
  var configPath = path.join(configDir, "opencode.json");
1017
1019
  var ensureConfigPromise;
1018
1020
  var toModelMap = (ids, existing = {}) => ids.reduce((acc, id) => {
1019
- acc[id] = Object.prototype.hasOwnProperty.call(existing, id) ? existing[id] : {};
1021
+ const existingConfig = Object.prototype.hasOwnProperty.call(existing, id) ? existing[id] : {};
1022
+ const defaultConfig = MODEL_CONFIGS[id] ?? {};
1023
+ acc[id] = { ...defaultConfig, ...typeof existingConfig === "object" ? existingConfig : {} };
1020
1024
  return acc;
1021
1025
  }, {});
1022
1026
  var readJson = async (filePath) => {
@@ -1,3 +1,11 @@
1
+ /**
2
+ * @file constants.ts
3
+ * @input -
4
+ * @output Global constants (URLs, header names, provider IDs)
5
+ * @pos Foundation - imported by most other modules
6
+ *
7
+ * 📌 On change: Update this header + lib/ARCHITECTURE.md
8
+ */
1
9
  export declare const PLUGIN_NAME = "opencode-aicodewith-auth";
2
10
  export declare const PROVIDER_ID = "aicodewith";
3
11
  export declare const AUTH_METHOD_LABEL = "AICodewith API Key";
@@ -1,3 +1,11 @@
1
+ /**
2
+ * @file logger.ts
3
+ * @input Log stage, data objects
4
+ * @output File logs (~/.opencode/logs/), console debug output
5
+ * @pos Utility - debug/request logging for development
6
+ *
7
+ * 📌 On change: Update this header + lib/ARCHITECTURE.md
8
+ */
1
9
  export declare const LOGGING_ENABLED: boolean;
2
10
  export declare const DEBUG_ENABLED: boolean;
3
11
  export declare function logRequest(stage: string, data: Record<string, unknown>): void;
@@ -1,3 +1,11 @@
1
+ /**
2
+ * @file codex-opencode-bridge.ts
3
+ * @input -
4
+ * @output CODEX_OPENCODE_BRIDGE constant (tool remapping rules)
5
+ * @pos Bridge layer - maps Codex tools to OpenCode equivalents
6
+ *
7
+ * 📌 On change: Update this header + lib/prompts/ARCHITECTURE.md
8
+ */
1
9
  /**
2
10
  * Codex-OpenCode Bridge Prompt
3
11
  *
@@ -1,3 +1,11 @@
1
+ /**
2
+ * @file codex.ts
3
+ * @input Normalized model name
4
+ * @output Codex system prompt (fetched from GitHub, cached)
5
+ * @pos Prompt provider - fetches model-specific instructions from openai/codex repo
6
+ *
7
+ * 📌 On change: Update this header + lib/prompts/ARCHITECTURE.md
8
+ */
1
9
  export type ModelFamily = "gpt-5.2-codex" | "codex-max" | "codex" | "gpt-5.2" | "gpt-5.1";
2
10
  export declare function getModelFamily(normalizedModel: string): ModelFamily;
3
11
  export declare function getCodexInstructions(normalizedModel?: string): Promise<string>;
@@ -1,8 +1,10 @@
1
1
  /**
2
- * OpenCode Codex Prompt Fetcher
2
+ * @file opencode-codex.ts
3
+ * @input -
4
+ * @output OpenCode's codex.txt prompt (for filtering duplicate system prompts)
5
+ * @pos Prompt fetcher - caches OpenCode's system prompt for comparison
3
6
  *
4
- * Fetches and caches the codex.txt system prompt from OpenCode's GitHub repository.
5
- * Uses ETag-based caching to efficiently track updates.
7
+ * 📌 On change: Update this header + lib/prompts/ARCHITECTURE.md
6
8
  */
7
9
  /**
8
10
  * Fetch OpenCode's codex.txt prompt with ETag-based caching
@@ -1,3 +1,11 @@
1
+ /**
2
+ * @file fetch-helpers.ts
3
+ * @input Raw request, API key, RequestInit
4
+ * @output Transformed headers, URL extraction, response handlers
5
+ * @pos Request layer entry - coordinates transformation and response handling
6
+ *
7
+ * 📌 On change: Update this header + lib/request/ARCHITECTURE.md
8
+ */
1
9
  import type { RequestBody } from "../types";
2
10
  export declare function extractRequestUrl(input: Request | string | URL): string;
3
11
  export declare function transformRequestForCodex(init?: RequestInit): Promise<{
@@ -1,3 +1,11 @@
1
+ /**
2
+ * @file input-utils.ts
3
+ * @input InputItem[], cached OpenCode prompt
4
+ * @output Filtered InputItem[] (OpenCode prompts removed, orphaned outputs fixed)
5
+ * @pos Helper - message filtering and tool output normalization
6
+ *
7
+ * 📌 On change: Update this header + lib/request/helpers/ARCHITECTURE.md
8
+ */
1
9
  import type { InputItem } from "../../types";
2
10
  export declare const getContentText: (item: InputItem) => string;
3
11
  export declare function isOpenCodeSystemPrompt(item: InputItem, cachedPrompt: string | null): boolean;
@@ -1,3 +1,11 @@
1
+ /**
2
+ * @file model-map.ts
3
+ * @input Config model ID (e.g., "gpt-5.2-codex-high")
4
+ * @output Normalized API model name (e.g., "gpt-5.2-codex")
5
+ * @pos Helper - static model ID mapping for all supported variants
6
+ *
7
+ * 📌 On change: Update this header + lib/request/helpers/ARCHITECTURE.md
8
+ */
1
9
  /**
2
10
  * Model Configuration Map
3
11
  *
@@ -1,3 +1,11 @@
1
+ /**
2
+ * @file request-transformer.ts
3
+ * @input RequestBody from OpenCode
4
+ * @output Transformed RequestBody for Codex API
5
+ * @pos Core transformation - model normalization, reasoning config, input filtering
6
+ *
7
+ * 📌 On change: Update this header + lib/request/ARCHITECTURE.md
8
+ */
1
9
  import type { ConfigOptions, InputItem, ReasoningConfig, RequestBody } from "../types";
2
10
  export declare function normalizeModel(model: string | undefined): string;
3
11
  export declare function filterInput(input: InputItem[] | undefined): InputItem[] | undefined;
@@ -1,3 +1,11 @@
1
+ /**
2
+ * @file response-handler.ts
3
+ * @input SSE Response stream
4
+ * @output JSON Response (for non-streaming) or passthrough (for streaming)
5
+ * @pos Response layer - SSE parsing and content-type handling
6
+ *
7
+ * 📌 On change: Update this header + lib/request/ARCHITECTURE.md
8
+ */
1
9
  /**
2
10
  * Convert SSE stream response to JSON for generateText()
3
11
  * @param response - Fetch response with SSE stream
@@ -1,3 +1,11 @@
1
+ /**
2
+ * @file types.ts
3
+ * @input -
4
+ * @output TypeScript interfaces (RequestBody, InputItem, etc.)
5
+ * @pos Foundation - shared type definitions across lib/
6
+ *
7
+ * 📌 On change: Update this header + lib/ARCHITECTURE.md
8
+ */
1
9
  export interface ConfigOptions {
2
10
  reasoningEffort?: "none" | "minimal" | "low" | "medium" | "high" | "xhigh";
3
11
  reasoningSummary?: "auto" | "concise" | "detailed" | "off" | "on";
@@ -1,3 +1,11 @@
1
+ /**
2
+ * @file provider.ts
3
+ * @input Provider settings (apiKey, baseURL, headers)
4
+ * @output Multi-provider language model factory (OpenAI/Anthropic/Google)
5
+ * @pos Core provider - routes model requests to appropriate SDK
6
+ *
7
+ * 📌 On change: Update this header + README.md file index
8
+ */
1
9
  import type { LanguageModelV2 } from "@ai-sdk/provider";
2
10
  import type { FetchFunction } from "@ai-sdk/provider-utils";
3
11
  export type AicodewithProviderSettings = {
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "opencode-aicodewith-auth",
3
- "version": "0.1.0",
3
+ "version": "0.1.2",
4
4
  "description": "OpenCode plugin for AICodewith authentication - Access GPT-5.2, Claude, and Gemini models through AICodewith API",
5
5
  "type": "module",
6
6
  "main": "dist/index.js",