opencode-windsurf-codeium 0.1.0 → 0.1.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +64 -187
  2. package/package.json +1 -1
package/README.md CHANGED
@@ -2,6 +2,13 @@
 
  OpenCode plugin for Windsurf/Codeium authentication - use Windsurf models in OpenCode.
 
+ ## Features
+
+ - OpenAI-compatible `/v1/chat/completions` interface with streaming SSE
+ - Automatic credential discovery (CSRF token, port, API key)
+ - Transparent REST↔gRPC translation over HTTP/2
+ - Zero extra auth prompts when Windsurf is running
+
  ## Overview
 
  This plugin enables OpenCode users to access Windsurf/Codeium models by leveraging their existing Windsurf installation. It communicates directly with the **local Windsurf language server** via gRPC - no network traffic capture or OAuth flows required.
@@ -13,20 +20,27 @@ This plugin enables OpenCode users to access Windsurf/Codeium models by leveragi
  3. **gRPC Communication**: Sends requests to `localhost:{port}` using the gRPC protocol over HTTP/2
  4. **Response Transformation**: Converts gRPC responses to OpenAI-compatible SSE format
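Step 4 above can be made concrete with a small sketch: shaping one streamed text delta into an OpenAI-style SSE event (the `toSseChunk` helper is illustrative, not the plugin's actual code; the field names follow the public `/v1/chat/completions` streaming schema):

```typescript
// Illustrative sketch: wrap one text delta in an OpenAI-style
// chat.completion.chunk object and serialize it as a single SSE event.
function toSseChunk(model: string, delta: string, id = "chatcmpl-local"): string {
  const payload = {
    id,
    object: "chat.completion.chunk",
    created: Math.floor(Date.now() / 1000),
    model,
    choices: [{ index: 0, delta: { content: delta }, finish_reason: null }],
  };
  return `data: ${JSON.stringify(payload)}\n\n`;
}
```

The end of a stream is conventionally signalled with a final `data: [DONE]` event.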
 
- ### Supported Models (90+)
-
- | Category | Models |
- |----------|--------|
- | **SWE** | `swe-1.5`, `swe-1.5-thinking` |
- | **Claude** | `claude-3.5-sonnet`, `claude-3.7-sonnet`, `claude-4-opus`, `claude-4-sonnet`, `claude-4.5-sonnet`, `claude-4.5-opus`, `claude-code` |
- | **GPT** | `gpt-4o`, `gpt-4.5`, `gpt-4.1`, `gpt-5`, `gpt-5.2`, `gpt-5-codex` |
- | **O-Series** | `o1`, `o3`, `o3-mini`, `o3-pro`, `o4-mini` |
- | **Gemini** | `gemini-2.5-flash`, `gemini-2.5-pro`, `gemini-3.0-pro` |
- | **DeepSeek** | `deepseek-v3`, `deepseek-r1`, `deepseek-r1-fast` |
- | **Llama** | `llama-3.1-8b`, `llama-3.1-70b`, `llama-3.1-405b`, `llama-3.3-70b` |
- | **Qwen** | `qwen-2.5-72b`, `qwen-3-235b`, `qwen-3-coder-480b` |
- | **Grok** | `grok-2`, `grok-3`, `grok-code-fast` |
- | **Other** | `mistral-7b`, `kimi-k2`, `glm-4.5`, `minimax-m2` |
+ ### Supported Models (canonical names)
+
+ **Claude**: `claude-3-opus`, `claude-3-sonnet`, `claude-3-haiku`, `claude-3.5-sonnet`, `claude-3.5-haiku`, `claude-3.7-sonnet`, `claude-3.7-sonnet-thinking`, `claude-4-opus`, `claude-4-opus-thinking`, `claude-4-sonnet`, `claude-4-sonnet-thinking`, `claude-4.1-opus`, `claude-4.1-opus-thinking`, `claude-4.5-sonnet`, `claude-4.5-sonnet-thinking`, `claude-4.5-opus`, `claude-4.5-opus-thinking`, `claude-code`.
+
+ **OpenAI GPT**: `gpt-4`, `gpt-4-turbo`, `gpt-4o`, `gpt-4o-mini`, `gpt-4.1`, `gpt-4.1-mini`, `gpt-4.1-nano`, `gpt-5`, `gpt-5-nano`, `gpt-5-low`, `gpt-5-high`, `gpt-5-codex`, `gpt-5.1-codex-mini`, `gpt-5.1-codex`, `gpt-5.1-codex-max`, `gpt-5.2-low`, `gpt-5.2`, `gpt-5.2-high`, `gpt-5.2-xhigh`, `gpt-5.2-priority` (plus the low/high/xhigh priority variants).
+
+ **OpenAI O-series**: `o3`, `o3-mini`, `o3-low`, `o3-high`, `o3-pro`, `o3-pro-low`, `o3-pro-high`, `o4-mini`, `o4-mini-low`, `o4-mini-high`.
+
+ **Gemini**: `gemini-2.0-flash`, `gemini-2.5-pro`, `gemini-2.5-flash`, `gemini-2.5-flash-thinking`, `gemini-2.5-flash-lite`, `gemini-3.0-pro`, `gemini-3.0-pro-low`, `gemini-3.0-pro-high`, `gemini-3.0-flash`, `gemini-3.0-flash-high`.
+
+ **DeepSeek**: `deepseek-v3`, `deepseek-v3-2`, `deepseek-r1`, `deepseek-r1-fast`, `deepseek-r1-slow`.
+
+ **Llama**: `llama-3.1-8b`, `llama-3.1-70b`, `llama-3.1-405b`, `llama-3.3-70b`, `llama-3.3-70b-r1`.
+
+ **Qwen**: `qwen-2.5-7b`, `qwen-2.5-32b`, `qwen-2.5-72b`, `qwen-2.5-32b-r1`, `qwen-3-235b`, `qwen-3-coder-480b`, `qwen-3-coder-480b-fast`.
+
+ **Grok (xAI)**: `grok-2`, `grok-3`, `grok-3-mini`, `grok-code-fast`.
+
+ **Specialty & Proprietary**: `mistral-7b`, `kimi-k2`, `kimi-k2-thinking`, `glm-4.5`, `glm-4.5-fast`, `glm-4.6`, `glm-4.6-fast`, `glm-4.7`, `glm-4.7-fast`, `minimax-m2`, `minimax-m2.1`, `swe-1.5`, `swe-1.5-thinking`, `swe-1.5-slow`.
+
+ Aliases (e.g., `gpt-5.2-low-priority`) are also accepted.
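Alias handling like the line above suggests can be as simple as a lookup table in front of the canonical list. A purely illustrative sketch (the table entries and the `resolveModel` helper are assumptions; the real mapping lives in the plugin's `models.ts`):

```typescript
// Hypothetical alias table; real entries live in src/plugin/models.ts.
const MODEL_ALIASES: Record<string, string> = {
  "gpt-5.2-low-priority": "gpt-5.2-low",
  "gpt-5.2-high-priority": "gpt-5.2-high",
};

// Resolve a user-supplied model name to its canonical form.
function resolveModel(name: string): string {
  const key = name.trim().toLowerCase();
  return MODEL_ALIASES[key] ?? key;
}
```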
 
  ## Prerequisites
 
@@ -38,30 +52,39 @@ This plugin enables OpenCode users to access Windsurf/Codeium models by leveragi
  ## Installation
 
  ```bash
- bun add opencode-windsurf-codeium
+ bun add opencode-windsurf-codeium@beta
  ```
 
- ## Usage
+ ## OpenCode Configuration
 
- Add to your OpenCode configuration (`~/.config/opencode/opencode.json`):
+ Add the following to your OpenCode config (typically `~/.config/opencode/config.json`). Pin the plugin if you want the beta tag:
 
  ```json
  {
- "plugin": ["opencode-windsurf-codeium"],
+ "$schema": "https://opencode.ai/config.json",
+ "plugin": ["opencode-windsurf-codeium@beta"],
  "provider": {
  "windsurf": {
  "models": {
- "claude-4.5-sonnet": {
- "name": "Claude 4.5 Sonnet (Windsurf)",
- "limit": { "context": 200000, "output": 64000 }
+ "claude-4.5-opus": {
+ "name": "Claude 4.5 Opus (Windsurf)",
+ "limit": { "context": 200000, "output": 8192 }
+ },
+ "gpt-5.2-xhigh": {
+ "name": "GPT 5.2 XHigh (Windsurf)",
+ "limit": { "context": 128000, "output": 16384 }
+ },
+ "gemini-3.0-pro-high": {
+ "name": "Gemini 3.0 Pro High (Windsurf)",
+ "limit": { "context": 200000, "output": 8192 }
+ },
+ "deepseek-r1": {
+ "name": "DeepSeek R1 (Windsurf)",
+ "limit": { "context": 64000, "output": 8192 }
  },
  "swe-1.5": {
  "name": "SWE 1.5 (Windsurf)",
  "limit": { "context": 128000, "output": 32000 }
- },
- "gpt-5": {
- "name": "GPT-5 (Windsurf)",
- "limit": { "context": 128000, "output": 32000 }
  }
  }
  }
@@ -69,95 +92,27 @@ Add to your OpenCode configuration (`~/.config/opencode/opencode.json`):
  }
  }
  ```
 
- Then run:
-
- ```bash
- opencode run "Hello" --model=windsurf/claude-4.5-sonnet
- ```
-
- ## Testing Inside OpenCode
-
- 1. **Build and install the plugin locally**
- ```bash
- bun run build
- bun add -g /path/to/opencode-windsurf-codeium
- ```
- Installing globally makes the package resolvable when OpenCode loads npm plugins.
-
- 2. **Update your OpenCode config** (`~/.config/opencode/opencode.json`):
- ```json
- {
- "$schema": "https://opencode.ai/config.json",
- "plugin": ["opencode-windsurf-codeium"],
- "provider": {
- "windsurf": {
- "models": {
- "claude-4.5-sonnet": { "name": "Claude 4.5 Sonnet (Windsurf)" },
- "swe-1.5": { "name": "SWE 1.5 (Windsurf)" },
- "gpt-5": { "name": "GPT-5 (Windsurf)" }
- }
- }
- }
- }
- ```
- Adding the package name to the `plugin` array tells OpenCode to load this plugin at startup, while the `provider.windsurf.models` block exposes friendly names and limits inside the TUI.
-
- 3. **Verify the plugin is active**
- ```bash
- opencode doctor # confirms plugin load + provider wiring
- opencode models list # should show windsurf/* models
- opencode chat --model=windsurf/claude-4.5-sonnet "Test message"
- ```
- Run these commands while Windsurf is running and logged in so the plugin can discover credentials from the local language server.
-
- ## Verification
-
- To verify the plugin can communicate with Windsurf:
+ After saving the config:
 
  ```bash
- # 1. Check Windsurf is running
- ps aux | grep language_server_macos
-
- # 2. Extract credentials manually
- ps aux | grep language_server_macos | grep -oE '\-\-csrf_token\s+[a-f0-9-]+'
- ps aux | grep language_server_macos | grep -oE '\-\-extension_server_port\s+[0-9]+'
- cat ~/.codeium/config.json | grep apiKey
-
- # 3. Test gRPC endpoint (port = extension_server_port + 2)
- curl -X POST http://localhost:{port}/exa.language_server_pb.LanguageServerService/RawGetChatMessage \
- -H "content-type: application/grpc" \
- -H "te: trailers" \
- -H "x-codeium-csrf-token: YOUR_TOKEN" \
- --data-binary ""
+ bun run build && bun add -g opencode-windsurf-codeium@beta # local install during development
+ opencode models list # confirm models appear under windsurf/
+ opencode chat --model=windsurf/claude-4.5-opus "Hello" # quick smoke test
  ```
 
- ## Architecture
+ Keep Windsurf running and signed in; credentials are fetched live from the IDE process.
 
- ```
- ┌─────────────────┐ ┌──────────────────┐ ┌─────────────────────┐
- │ OpenCode │────▶│ Windsurf Plugin │────▶│ language_server │
- │ (requests) │ │ (transform) │ │ (local gRPC) │
- └─────────────────┘ └──────────────────┘ └─────────────────────┘
-
-
- ┌──────────────────┐
- │ ~/.codeium/ │
- │ config.json │
- │ (API key) │
- └──────────────────┘
- ```
-
- ### File Structure
+ ## Project Layout
 
  ```
  src/
- ├── plugin.ts # Main plugin, OpenAI-compatible fetch handler
- ├── constants.ts # Plugin ID, gRPC service names
+ ├── plugin.ts # Fetch interceptor that routes to Windsurf
+ ├── constants.ts # gRPC service metadata
  └── plugin/
- ├── auth.ts # Credential discovery from process args
- ├── grpc-client.ts # HTTP/2 gRPC client with protobuf encoding
- ├── models.ts # Model name → enum mappings
- └── types.ts # TypeScript types, ModelEnum values
+ ├── auth.ts # Credential discovery
+ ├── grpc-client.ts # Streaming chat bridge
+ ├── models.ts # Model lookup tables
+ └── types.ts # Shared enums/types
  ```
 
  ## Development
@@ -176,86 +131,6 @@ bun run typecheck
  bun test
  ```
 
- ## Publishing (beta tag)
-
- To release the plugin as `opencode-windsurf-codeium@beta` using Bun:
-
- 1. **Bump the version** – update `package.json` with the new semantic version (e.g., `0.2.0-beta.1`).
- 2. **Login once** – run `npm login` (Bun reuses the credentials in `~/.npmrc`).
- 3. **Build & verify** – `bun run build && bun test` to ensure `dist/` is fresh.
- 4. **Publish with tag** –
- ```bash
- bun publish --tag beta --access public
- ```
- This creates the `opencode-windsurf-codeium@beta` dist-tag so OpenCode users can install the beta specifically via `bun add opencode-windsurf-codeium@beta`.
-
- For stable releases, rerun `bun publish` without `--tag beta` (or with `--tag latest`).
-
- ## How It Works (Technical Details)
-
- ### 1. Credential Discovery
-
- The plugin discovers credentials from the running Windsurf process:
-
- ```bash
- # Process args contain:
- --csrf_token abc123-def456-...
- --extension_server_port 42100
- --windsurf_version 1.13.104
- ```
-
- The gRPC port is `extension_server_port + 2`.
-
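The discovery flow described above can be sketched in TypeScript. The regex patterns and the `+ 2` port offset follow the notes above; the function shapes and names are illustrative, not the plugin's actual `auth.ts`:

```typescript
import { execSync } from "node:child_process";

// Parse --csrf_token and --extension_server_port out of one ps(1)
// command line. Illustrative sketch only.
function parseCredentials(cmdline: string): { csrfToken: string; grpcPort: number } | null {
  const token = cmdline.match(/--csrf_token[ =]([a-f0-9-]+)/)?.[1];
  const port = cmdline.match(/--extension_server_port[ =](\d+)/)?.[1];
  if (!token || !port) return null;
  // The gRPC endpoint listens two ports above the extension server port.
  return { csrfToken: token, grpcPort: Number(port) + 2 };
}

// Locate the running language server and extract its credentials.
function discoverCredentials(): { csrfToken: string; grpcPort: number } | null {
  const ps = execSync("ps ax -o command", { encoding: "utf8" });
  const line = ps.split("\n").find((l) => l.includes("language_server"));
  return line ? parseCredentials(line) : null;
}
```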
- ### 2. Protobuf Encoding
-
- Requests are manually encoded in protobuf wire format (no protobuf library needed):
-
- ```typescript
- // Encode a string field (wire type 2)
- function encodeString(fieldNum: number, str: string): number[] {
-   const strBytes = Buffer.from(str, 'utf8');
-   return [(fieldNum << 3) | 2, ...encodeVarint(strBytes.length), ...strBytes];
- }
- ```
-
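The `encodeString` snippet above calls an `encodeVarint` helper that the excerpt omits. Standard protobuf base-128 varint encoding looks like this (a sketch, not necessarily the plugin's verbatim implementation):

```typescript
// Encode a non-negative integer as a protobuf base-128 varint:
// 7 payload bits per byte, high bit set on every byte except the last.
function encodeVarint(value: number): number[] {
  const bytes: number[] = [];
  do {
    let byte = value & 0x7f;
    value = Math.floor(value / 128);
    if (value > 0) byte |= 0x80;
    bytes.push(byte);
  } while (value > 0);
  return bytes;
}
```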
- ### 3. Model Enum Values
-
- Model names are mapped to protobuf enum values extracted from Windsurf's `extension.js`:
-
- ```typescript
- const ModelEnum = {
-   CLAUDE_4_5_SONNET: 353,
-   CLAUDE_4_5_OPUS: 391,
-   GPT_5: 340,
-   SWE_1_5: 359,
-   // ... 80+ more
- };
- ```
-
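Enum values like these travel as wire-type-0 (varint) fields, so pairing a model's enum value with its field tag is a one-liner on top of varint encoding. A self-contained illustrative sketch (the field number and helper name are assumptions, not the plugin's actual schema):

```typescript
// Encode a protobuf enum/int32 field: tag byte ((fieldNum << 3) | wire type 0)
// followed by the value as a base-128 varint. Assumes fieldNum <= 15 so the
// tag fits in a single byte.
function encodeEnumField(fieldNum: number, value: number): number[] {
  const out = [(fieldNum << 3) | 0];
  let v = value;
  do {
    let byte = v & 0x7f;
    v = Math.floor(v / 128);
    if (v > 0) byte |= 0x80;
    out.push(byte);
  } while (v > 0);
  return out;
}
```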
- ### 4. gRPC Service
-
- Requests go to the local language server:
-
- ```
- POST http://localhost:{port}/exa.language_server_pb.LanguageServerService/RawGetChatMessage
- Headers:
-   content-type: application/grpc
-   te: trailers
-   x-codeium-csrf-token: {csrf_token}
- ```
-
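On the wire, each request body posted to that endpoint is a standard length-prefixed gRPC frame: one compression-flag byte, then a 4-byte big-endian message length, then the protobuf payload. A sketch of the framing step (the helper name is illustrative):

```typescript
// Wrap an encoded protobuf message in the standard gRPC frame:
// [flag: 1 byte, 0 = uncompressed][length: 4 bytes, big-endian][message].
function frameGrpcMessage(message: Uint8Array): Uint8Array {
  const frame = new Uint8Array(5 + message.length);
  frame[0] = 0; // 0 = uncompressed payload
  new DataView(frame.buffer).setUint32(1, message.length, false); // big-endian length
  frame.set(message, 5);
  return frame;
}
```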
- ## Reverse Engineering Notes
-
- The model enum values were extracted from:
- ```
- /Applications/Windsurf.app/Contents/Resources/app/extensions/windsurf/dist/extension.js
- ```
-
- To discover new models:
- ```bash
- grep -oE '[A-Z0-9_]+\s*=\s*[0-9]+' extension.js | grep -E 'CLAUDE|GPT|GEMINI|DEEPSEEK'
- ```
-
  ## Known Limitations
 
  - **Windsurf must be running** - The plugin communicates with the local language server
@@ -263,9 +138,11 @@ grep -oE '[A-Z0-9_]+\s*=\s*[0-9]+' extension.js | grep -E 'CLAUDE|GPT|GEMINI|DEE
  - **Response parsing** - Uses heuristic text extraction from protobuf (may miss edge cases)
  - **No tool calling yet** - Basic chat completion only
 
- ## Related Projects
+ ## Further Reading
 
- - [opencode-antigravity-auth](https://github.com/NoeFabris/opencode-antigravity-auth) - Similar plugin for Google's Antigravity API
+ - `docs/WINDSURF_API_SPEC.md` – gRPC endpoints & protobuf notes
+ - `docs/REVERSE_ENGINEERING.md` – credential discovery + tooling
+ - [opencode-antigravity-auth](https://github.com/NoeFabris/opencode-antigravity-auth) – related project
 
  ## License
 
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "opencode-windsurf-codeium",
- "version": "0.1.0",
+ "version": "0.1.1",
  "description": "OpenCode plugin for Windsurf/Codeium authentication - use Windsurf models in OpenCode",
  "type": "module",
  "main": "./dist/index.js",