opencode-windsurf-codeium 0.1.0 → 0.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +69 -192
  2. package/package.json +1 -1
package/README.md CHANGED
@@ -2,31 +2,16 @@

 OpenCode plugin for Windsurf/Codeium authentication - use Windsurf models in OpenCode.

- ## Overview
+ ## Features

- This plugin enables OpenCode users to access Windsurf/Codeium models by leveraging their existing Windsurf installation. It communicates directly with the **local Windsurf language server** via gRPC - no network traffic capture or OAuth flows required.
+ - OpenAI-compatible `/v1/chat/completions` interface with streaming SSE
+ - Automatic credential discovery (CSRF token, port, API key)
+ - Transparent REST↔gRPC translation over HTTP/2
+ - Zero extra auth prompts when Windsurf is running
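A quick illustration of the first and third bullets above, in case the streaming contract is unfamiliar: the sketch below shows only the general `/v1/chat/completions` SSE shape; `bridgeToWindsurf` is a hypothetical placeholder, not this plugin's actual code.

```typescript
// Illustrative only: the /v1/chat/completions streaming shape the Features
// bullets refer to. `bridgeToWindsurf` is a hypothetical stand-in for the
// plugin's actual REST↔gRPC translation layer.
interface ChatRequest {
  model: string; // e.g. "claude-4.5-opus"
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  stream?: boolean;
}

async function* bridgeToWindsurf(req: ChatRequest): AsyncGenerator<string> {
  yield "Hello";           // in the real plugin these tokens come back over gRPC
  yield " from Windsurf";
}

// Each token is re-emitted as an OpenAI-compatible SSE chunk, ending with [DONE].
async function streamCompletion(req: ChatRequest): Promise<Response> {
  const encoder = new TextEncoder();
  const body = new ReadableStream<Uint8Array>({
    async start(controller) {
      for await (const token of bridgeToWindsurf(req)) {
        const chunk = {
          object: "chat.completion.chunk",
          model: req.model,
          choices: [{ index: 0, delta: { content: token }, finish_reason: null }],
        };
        controller.enqueue(encoder.encode(`data: ${JSON.stringify(chunk)}\n\n`));
      }
      controller.enqueue(encoder.encode("data: [DONE]\n\n"));
      controller.close();
    },
  });
  return new Response(body, { headers: { "content-type": "text/event-stream" } });
}
```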
 
- ### How It Works
-
- 1. **Credential Discovery**: Extracts CSRF token and port from the running `language_server_macos` process
- 2. **API Key**: Reads from `~/.codeium/config.json`
- 3. **gRPC Communication**: Sends requests to `localhost:{port}` using HTTP/2 gRPC protocol
- 4. **Response Transformation**: Converts gRPC responses to OpenAI-compatible SSE format
+ ## Overview

- ### Supported Models (90+)
-
- | Category | Models |
- |----------|--------|
- | **SWE** | `swe-1.5`, `swe-1.5-thinking` |
- | **Claude** | `claude-3.5-sonnet`, `claude-3.7-sonnet`, `claude-4-opus`, `claude-4-sonnet`, `claude-4.5-sonnet`, `claude-4.5-opus`, `claude-code` |
- | **GPT** | `gpt-4o`, `gpt-4.5`, `gpt-4.1`, `gpt-5`, `gpt-5.2`, `gpt-5-codex` |
- | **O-Series** | `o1`, `o3`, `o3-mini`, `o3-pro`, `o4-mini` |
- | **Gemini** | `gemini-2.5-flash`, `gemini-2.5-pro`, `gemini-3.0-pro` |
- | **DeepSeek** | `deepseek-v3`, `deepseek-r1`, `deepseek-r1-fast` |
- | **Llama** | `llama-3.1-8b`, `llama-3.1-70b`, `llama-3.1-405b`, `llama-3.3-70b` |
- | **Qwen** | `qwen-2.5-72b`, `qwen-3-235b`, `qwen-3-coder-480b` |
- | **Grok** | `grok-2`, `grok-3`, `grok-code-fast` |
- | **Other** | `mistral-7b`, `kimi-k2`, `glm-4.5`, `minimax-m2` |
+ This plugin enables OpenCode users to access Windsurf/Codeium models by leveraging their existing Windsurf installation. It communicates directly with the **local Windsurf language server** via gRPC - no network traffic capture or OAuth flows required.

 ## Prerequisites
 
@@ -38,30 +23,39 @@ This plugin enables OpenCode users to access Windsurf/Codeium models by leveragi
 ## Installation

 ```bash
- bun add opencode-windsurf-codeium
+ bun add opencode-windsurf-codeium@beta
 ```

- ## Usage
+ ## OpenCode Configuration

- Add to your OpenCode configuration (`~/.config/opencode/opencode.json`):
+ Add the following to your OpenCode config (typically `~/.config/opencode/config.json`). Pin the plugin if you want the beta tag:

 ```json
 {
- "plugin": ["opencode-windsurf-codeium"],
+ "$schema": "https://opencode.ai/config.json",
+ "plugin": ["opencode-windsurf-codeium@beta"],
 "provider": {
 "windsurf": {
 "models": {
- "claude-4.5-sonnet": {
- "name": "Claude 4.5 Sonnet (Windsurf)",
- "limit": { "context": 200000, "output": 64000 }
+ "claude-4.5-opus": {
+ "name": "Claude 4.5 Opus (Windsurf)",
+ "limit": { "context": 200000, "output": 8192 }
+ },
+ "gpt-5.2-xhigh": {
+ "name": "GPT 5.2 XHigh (Windsurf)",
+ "limit": { "context": 128000, "output": 16384 }
+ },
+ "gemini-3.0-pro-high": {
+ "name": "Gemini 3.0 Pro High (Windsurf)",
+ "limit": { "context": 200000, "output": 8192 }
+ },
+ "deepseek-r1": {
+ "name": "DeepSeek R1 (Windsurf)",
+ "limit": { "context": 64000, "output": 8192 }
 },
 "swe-1.5": {
 "name": "SWE 1.5 (Windsurf)",
 "limit": { "context": 128000, "output": 32000 }
- },
- "gpt-5": {
- "name": "GPT-5 (Windsurf)",
- "limit": { "context": 128000, "output": 32000 }
 }
 }
 }
@@ -69,191 +63,72 @@ Add to your OpenCode configuration (`~/.config/opencode/opencode.json`):
 }
 ```

- Then run:
+ After saving the config:

 ```bash
- opencode run "Hello" --model=windsurf/claude-4.5-sonnet
+ bun run build && bun add -g opencode-windsurf-codeium@beta # local install during development
+ opencode models list # confirm models appear under windsurf/
+ opencode chat --model=windsurf/claude-4.5-opus "Hello" # quick smoke test
 ```

- ## Testing Inside OpenCode
-
- 1. **Build and install the plugin locally**
- ```bash
- bun run build
- bun add -g /path/to/opencode-windsurf-codeium
- ```
- Installing globally makes the package resolvable when OpenCode loads npm plugins.
-
- 2. **Update your OpenCode config** (`~/.config/opencode/opencode.json`):
- ```json
- {
- "$schema": "https://opencode.ai/config.json",
- "plugin": ["opencode-windsurf-codeium"],
- "provider": {
- "windsurf": {
- "models": {
- "claude-4.5-sonnet": { "name": "Claude 4.5 Sonnet (Windsurf)" },
- "swe-1.5": { "name": "SWE 1.5 (Windsurf)" },
- "gpt-5": { "name": "GPT-5 (Windsurf)" }
- }
- }
- }
- }
- ```
- Adding the package name to the `plugin` array tells OpenCode to load this plugin at startup, while the `provider.windsurf.models` block exposes friendly names and limits inside the TUI.
-
- 3. **Verify the plugin is active**
- ```bash
- opencode doctor # confirms plugin load + provider wiring
- opencode models list # should show windsurf/* models
- opencode chat --model=windsurf/claude-4.5-sonnet "Test message"
- ```
- Run these commands while Windsurf is running and logged-in so the plugin can discover credentials from the local language server.
-
- ## Verification
-
- To verify the plugin can communicate with Windsurf:
+ Keep Windsurf running and signed in—credentials are fetched live from the IDE process.

- ```bash
- # 1. Check Windsurf is running
- ps aux | grep language_server_macos
-
- # 2. Extract credentials manually
- ps aux | grep language_server_macos | grep -oE '\-\-csrf_token\s+[a-f0-9-]+'
- ps aux | grep language_server_macos | grep -oE '\-\-extension_server_port\s+[0-9]+'
- cat ~/.codeium/config.json | grep apiKey
-
- # 3. Test gRPC endpoint (port = extension_server_port + 2)
- curl -X POST http://localhost:{port}/exa.language_server_pb.LanguageServerService/RawGetChatMessage \
- -H "content-type: application/grpc" \
- -H "te: trailers" \
- -H "x-codeium-csrf-token: YOUR_TOKEN" \
- --data-binary ""
- ```
-
- ## Architecture
-
- ```
- ┌─────────────────┐ ┌──────────────────┐ ┌─────────────────────┐
- │ OpenCode │────▶│ Windsurf Plugin │────▶│ language_server │
- │ (requests) │ │ (transform) │ │ (local gRPC) │
- └─────────────────┘ └──────────────────┘ └─────────────────────┘
-
-
- ┌──────────────────┐
- │ ~/.codeium/ │
- │ config.json │
- │ (API key) │
- └──────────────────┘
- ```
-
- ### File Structure
+ ## Project Layout

 ```
 src/
- ├── plugin.ts # Main plugin, OpenAI-compatible fetch handler
- ├── constants.ts # Plugin ID, gRPC service names
+ ├── plugin.ts # Fetch interceptor that routes to Windsurf
+ ├── constants.ts # gRPC service metadata
 └── plugin/
- ├── auth.ts # Credential discovery from process args
- ├── grpc-client.ts # HTTP/2 gRPC client with protobuf encoding
- ├── models.ts # Model name → enum mappings
- └── types.ts # TypeScript types, ModelEnum values
- ```
-
- ## Development
-
- ```bash
- # Install dependencies
- bun install
-
- # Build
- bun run build
-
- # Type check
- bun run typecheck
-
- # Run tests
- bun test
+ ├── auth.ts # Credential discovery
+ ├── grpc-client.ts # Streaming chat bridge
+ ├── models.ts # Model lookup tables
+ └── types.ts # Shared enums/types
 ```

- ## Publishing (beta tag)
-
- To release the plugin as `opencode-windsurf-codeium@beta` using Bun:
-
- 1. **Bump the version** – update `package.json` with the new semantic version (e.g., `0.2.0-beta.1`).
- 2. **Login once** – run `npm login` (Bun reuses the credentials in `~/.npmrc`).
- 3. **Build & verify** – `bun run build && bun test` to ensure `dist/` is fresh.
- 4. **Publish with tag** –
- ```bash
- bun publish --tag beta --access public
- ```
- This creates the `opencode-windsurf-codeium@beta` dist-tag so OpenCode users can install the beta specifically via `bun add opencode-windsurf-codeium@beta`.
+ ### How It Works

- For stable releases, rerun `bun publish` without `--tag beta` (or with `--tag latest`).
+ 1. **Credential Discovery**: Extracts CSRF token and port from the running `language_server_macos` process
+ 2. **API Key**: Reads from `~/.codeium/config.json`
+ 3. **gRPC Communication**: Sends requests to `localhost:{port}` using HTTP/2 gRPC protocol
+ 4. **Response Transformation**: Converts gRPC responses to OpenAI-compatible SSE format
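Taking the four steps above together with the details documented elsewhere in this README (the `--csrf_token` and `--extension_server_port` process arguments, the gRPC port being `extension_server_port + 2`, and the API key in `~/.codeium/config.json`), credential discovery could look roughly like the sketch below. The helper and field names are illustrative, not the plugin's real `auth.ts` API.

```typescript
// Sketch only: credential discovery as described in this README.
// Helper and field names are illustrative, not the plugin's real auth.ts API.
import { execSync } from "node:child_process";
import { readFileSync } from "node:fs";
import { homedir } from "node:os";
import { join } from "node:path";

interface WindsurfCredentials {
  csrfToken: string;
  grpcPort: number; // README: gRPC port = extension_server_port + 2
  apiKey: string;
}

function discoverCredentials(): WindsurfCredentials {
  // Step 1: read --csrf_token and --extension_server_port from the running
  // language server's command line.
  const psLine = execSync("ps -axo command")
    .toString()
    .split("\n")
    .find((line) => line.includes("language_server_macos"));
  if (!psLine) throw new Error("Windsurf language server is not running");

  const csrfToken = psLine.match(/--csrf_token[ =]([a-f0-9-]+)/)?.[1];
  const extPort = psLine.match(/--extension_server_port[ =](\d+)/)?.[1];
  if (!csrfToken || !extPort) throw new Error("Could not parse credentials from process args");

  // Step 2: the API key Windsurf stores in ~/.codeium/config.json.
  const config = JSON.parse(readFileSync(join(homedir(), ".codeium", "config.json"), "utf8"));

  return { csrfToken, grpcPort: Number(extPort) + 2, apiKey: config.apiKey };
}
```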
 
- ## How It Works (Technical Details)
+ ### Supported Models (canonical names)

- ### 1. Credential Discovery
+ **Claude**: `claude-3-opus`, `claude-3-sonnet`, `claude-3-haiku`, `claude-3.5-sonnet`, `claude-3.5-haiku`, `claude-3.7-sonnet`, `claude-3.7-sonnet-thinking`, `claude-4-opus`, `claude-4-opus-thinking`, `claude-4-sonnet`, `claude-4-sonnet-thinking`, `claude-4.1-opus`, `claude-4.1-opus-thinking`, `claude-4.5-sonnet`, `claude-4.5-sonnet-thinking`, `claude-4.5-opus`, `claude-4.5-opus-thinking`, `claude-code`.

- The plugin discovers credentials from the running Windsurf process:
+ **OpenAI GPT**: `gpt-4`, `gpt-4-turbo`, `gpt-4o`, `gpt-4o-mini`, `gpt-4.1`, `gpt-4.1-mini`, `gpt-4.1-nano`, `gpt-5`, `gpt-5-nano`, `gpt-5-low`, `gpt-5-high`, `gpt-5-codex`, `gpt-5.1-codex-mini`, `gpt-5.1-codex`, `gpt-5.1-codex-max`, `gpt-5.2-low`, `gpt-5.2`, `gpt-5.2-high`, `gpt-5.2-xhigh`, `gpt-5.2-priority` (plus the low/high/xhigh priority variants).

- ```bash
- # Process args contain:
- --csrf_token abc123-def456-...
- --extension_server_port 42100
- --windsurf_version 1.13.104
- ```
+ **OpenAI O-series**: `o3`, `o3-mini`, `o3-low`, `o3-high`, `o3-pro`, `o3-pro-low`, `o3-pro-high`, `o4-mini`, `o4-mini-low`, `o4-mini-high`.

- The gRPC port is `extension_server_port + 2`.
+ **Gemini**: `gemini-2.0-flash`, `gemini-2.5-pro`, `gemini-2.5-flash`, `gemini-2.5-flash-thinking`, `gemini-2.5-flash-lite`, `gemini-3.0-pro`, `gemini-3.0-pro-low`, `gemini-3.0-pro-high`, `gemini-3.0-flash`, `gemini-3.0-flash-high`.

- ### 2. Protobuf Encoding
+ **DeepSeek**: `deepseek-v3`, `deepseek-v3-2`, `deepseek-r1`, `deepseek-r1-fast`, `deepseek-r1-slow`.

- Requests are manually encoded to protobuf format (no protobuf library needed):
+ **Llama**: `llama-3.1-8b`, `llama-3.1-70b`, `llama-3.1-405b`, `llama-3.3-70b`, `llama-3.3-70b-r1`.

- ```typescript
- // Encode a string field (wire type 2)
- function encodeString(fieldNum: number, str: string): number[] {
- const strBytes = Buffer.from(str, 'utf8');
- return [(fieldNum << 3) | 2, ...encodeVarint(strBytes.length), ...strBytes];
- }
- ```
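The removed snippet above relies on an `encodeVarint` helper that the old README never showed. For readers following along, here is a minimal sketch of the standard protobuf varint encoding it assumes (my sketch, not code recovered from the package):

```typescript
// Minimal protobuf base-128 varint encoder: emit 7 bits per byte, low bits
// first, setting the continuation bit (0x80) on every byte except the last.
function encodeVarint(value: number): number[] {
  const bytes: number[] = [];
  let v = value >>> 0; // treat as an unsigned 32-bit integer
  do {
    let byte = v & 0x7f;
    v >>>= 7;
    if (v !== 0) byte |= 0x80;
    bytes.push(byte);
  } while (v !== 0);
  return bytes;
}

// Example: encodeVarint(300) → [0xac, 0x02], per the protobuf wire-format docs.
```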
+ **Qwen**: `qwen-2.5-7b`, `qwen-2.5-32b`, `qwen-2.5-72b`, `qwen-2.5-32b-r1`, `qwen-3-235b`, `qwen-3-coder-480b`, `qwen-3-coder-480b-fast`.

- ### 3. Model Enum Values
+ **Grok (xAI)**: `grok-2`, `grok-3`, `grok-3-mini`, `grok-code-fast`.

- Model names are mapped to protobuf enum values extracted from Windsurf's `extension.js`:
+ **Specialty & Proprietary**: `mistral-7b`, `kimi-k2`, `kimi-k2-thinking`, `glm-4.5`, `glm-4.5-fast`, `glm-4.6`, `glm-4.6-fast`, `glm-4.7`, `glm-4.7-fast`, `minimax-m2`, `minimax-m2.1`, `swe-1.5`, `swe-1.5-thinking`, `swe-1.5-slow`.

- ```typescript
- const ModelEnum = {
- CLAUDE_4_5_SONNET: 353,
- CLAUDE_4_5_OPUS: 391,
- GPT_5: 340,
- SWE_1_5: 359,
- // ... 80+ more
- };
- ```
+ Aliases (e.g., `gpt-5.2-low-priority`) are also accepted.

- ### 4. gRPC Service
-
- Requests go to the local language server:
+ ## Development

- ```
- POST http://localhost:{port}/exa.language_server_pb.LanguageServerService/RawGetChatMessage
- Headers:
- content-type: application/grpc
- te: trailers
- x-codeium-csrf-token: {csrf_token}
- ```
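The removed block above is where the endpoint and headers were documented. A hedged sketch of what a request framed that way could look like, assuming the standard 5-byte gRPC message prefix and using a plain `node:http2` session as a stand-in for whatever transport `grpc-client.ts` actually uses:

```typescript
// Sketch only: frame a protobuf payload with the standard gRPC 5-byte prefix
// (1-byte compression flag + 4-byte big-endian length) and POST it to the
// local language server with the headers shown above.
import { connect } from "node:http2";

function frameGrpcMessage(payload: Uint8Array): Buffer {
  const framed = Buffer.alloc(5 + payload.length);
  framed[0] = 0;                           // 0 = uncompressed
  framed.writeUInt32BE(payload.length, 1); // message length, big-endian
  framed.set(payload, 5);
  return framed;
}

function rawGetChatMessage(port: number, csrfToken: string, payload: Uint8Array): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    const session = connect(`http://localhost:${port}`);
    const req = session.request({
      ":method": "POST",
      ":path": "/exa.language_server_pb.LanguageServerService/RawGetChatMessage",
      "content-type": "application/grpc",
      "te": "trailers",
      "x-codeium-csrf-token": csrfToken,
    });
    const chunks: Buffer[] = [];
    req.on("data", (chunk: Buffer) => chunks.push(chunk));
    req.on("end", () => { session.close(); resolve(Buffer.concat(chunks)); });
    req.on("error", (err) => { session.close(); reject(err); });
    req.end(frameGrpcMessage(payload));
  });
}
```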
+ ```bash
+ # Install dependencies
+ bun install

- ## Reverse Engineering Notes
+ # Build
+ bun run build

- The model enum values were extracted from:
- ```
- /Applications/Windsurf.app/Contents/Resources/app/extensions/windsurf/dist/extension.js
- ```
+ # Type check
+ bun run typecheck

- To discover new models:
- ```bash
- grep -oE '[A-Z0-9_]+\s*=\s*[0-9]+' extension.js | grep -E 'CLAUDE|GPT|GEMINI|DEEPSEEK'
+ # Run tests
+ bun test
 ```

  ## Known Limitations
@@ -263,9 +138,11 @@ grep -oE '[A-Z0-9_]+\s*=\s*[0-9]+' extension.js | grep -E 'CLAUDE|GPT|GEMINI|DEE
 - **Response parsing** - Uses heuristic text extraction from protobuf (may miss edge cases)
 - **No tool calling yet** - Basic chat completion only

- ## Related Projects
+ ## Further Reading

- - [opencode-antigravity-auth](https://github.com/NoeFabris/opencode-antigravity-auth) - Similar plugin for Google's Antigravity API
+ - `docs/WINDSURF_API_SPEC.md` gRPC endpoints & protobuf notes
+ - `docs/REVERSE_ENGINEERING.md` – credential discovery + tooling
+ - [opencode-antigravity-auth](https://github.com/NoeFabris/opencode-antigravity-auth) – related project

 ## License

package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
 "name": "opencode-windsurf-codeium",
- "version": "0.1.0",
+ "version": "0.1.2",
 "description": "OpenCode plugin for Windsurf/Codeium authentication - use Windsurf models in OpenCode",
 "type": "module",
 "main": "./dist/index.js",