opencode-semantic-search 0.1.0

package/SETUP.md ADDED
@@ -0,0 +1,541 @@
1
+ # Setup Guide — OpenCode Semantic Search Plugin
2
+
3
+ This guide walks you through installing the plugin, configuring an embedding provider, and verifying everything works.
4
+
5
+ ---
6
+
7
+ ## Table of Contents
8
+
9
+ 1. [Prerequisites](#1-prerequisites)
10
+ 2. [Quick Install (recommended)](#2-quick-install-recommended)
11
+ 3. [Manual Install](#3-manual-install)
12
+ 4. [Embedding Provider Setup](#4-embedding-provider-setup)
13
+ - [Ollama (local, default)](#ollama-local-default)
14
+ - [OpenAI / compatible API](#openai--compatible-api)
15
+ 5. [Project Configuration](#5-project-configuration)
16
+ 6. [Global vs Project-local Installation](#6-global-vs-project-local-installation)
17
+ 7. [Verify the Installation](#7-verify-the-installation)
18
+ 8. [Using the Plugin in OpenCode](#8-using-the-plugin-in-opencode)
19
+ 9. [Advanced Configuration Reference](#9-advanced-configuration-reference)
20
+ 10. [Upgrading](#10-upgrading)
21
+ 11. [Uninstalling](#11-uninstalling)
22
+ 12. [Troubleshooting](#12-troubleshooting)
23
+
24
+ ---
25
+
26
+ ## 1. Prerequisites
27
+
28
+ | Tool | Minimum version | Install |
29
+ |---|---|---|
30
+ | **Bun** | 1.1+ | https://bun.sh |
31
+ | **OpenCode** | latest | https://opencode.ai |
32
+ | **ripgrep** (`rg`) | any | `brew install ripgrep` / `apt install ripgrep` |
33
+ | **Ollama** _(or OpenAI API key)_ | latest | https://ollama.com |
34
+
35
+ ### Check your environment
36
+
37
+ ```bash
38
+ bun --version
39
+ opencode --version
40
+ rg --version
41
+ ollama --version # only if using Ollama
42
+ ```
43
+
44
+ ---
45
+
46
+ ## 2. Quick Install (recommended)
47
+
48
+ **From npm (`bunx`)** — after [opencode-semantic-search](https://www.npmjs.com/package/opencode-semantic-search) is published:
49
+
50
+ ```bash
51
+ bunx opencode-semantic-search@latest
52
+ bunx opencode-semantic-search@latest install --local
53
+ ```
54
+
55
+ **From a clone** of [github.com/jainprashul/opencode-semantic-search](https://github.com/jainprashul/opencode-semantic-search):
56
+
57
+ ```bash
58
+ git clone https://github.com/jainprashul/opencode-semantic-search.git
59
+ cd opencode-semantic-search
60
+
61
+ # Global install (default — ~/.config/opencode/plugins, all projects)
62
+ bash install.sh
63
+
64
+ # OR: project-local only (current directory)
65
+ bash install.sh --local
66
+ ```
67
+
68
+ **Remote `install.sh`** (downloads the script from `main`):
69
+
70
+ ```bash
71
+ curl -fsSL https://raw.githubusercontent.com/jainprashul/opencode-semantic-search/main/install.sh | bash
72
+ ```
73
+
74
+ The script will:
75
+
76
+ 1. Verify all prerequisites
77
+ 2. Run `bun install` to fetch npm dependencies
78
+ 3. Create a plugin shim in `~/.config/opencode/plugins/` by default (or `.opencode/plugins/` with `--local`)
79
+ 4. Pull the `nomic-embed-text` Ollama model (unless `--skip-ollama` is passed)
80
+ 5. Write a default `semantic-search.json` config (`~/.config/opencode/` when global, or project `.opencode/` with `--local`)
81
+ 6. Run the integration tests when installing from a full clone (skipped for minimal npm installs)
82
+
83
+ ### Install script options
84
+
85
+ ```bash
86
+ bash install.sh [OPTIONS]
87
+
88
+ --global Register plugin globally (~/.config/opencode/plugins/) (default)
89
+ --local Register under ./.opencode/plugins/ in the current directory
90
+ --ollama-model MODEL Pull a different embedding model (default: nomic-embed-text)
91
+ --openai-key-env VAR Use OpenAI instead of Ollama (pass the env var name, e.g. OPENAI_API_KEY)
92
+ --skip-ollama Skip all Ollama checks and model pulls
93
+ -h, --help Show help
94
+ ```
95
+
96
+ #### Examples
97
+
98
+ ```bash
99
+ # Use a lighter Ollama model
100
+ bash install.sh --ollama-model all-minilm
101
+
102
+ # Use OpenAI embeddings instead of Ollama
103
+ bash install.sh --openai-key-env OPENAI_API_KEY --skip-ollama
104
+
105
+ # Use a self-hosted OpenAI-compatible endpoint (e.g. LM Studio, LocalAI)
106
+ bash install.sh --openai-key-env MY_LMSTUDIO_KEY --skip-ollama
107
+ # Then edit .opencode/semantic-search.json to set your api_base
108
+ ```
109
+
110
+ ---
111
+
112
+ ## 3. Manual Install
113
+
114
+ If you prefer to install without the script:
115
+
116
+ ```bash
117
+ # 1. Install dependencies
118
+ cd /path/to/opencode-semantic-search-plugin
119
+ bun install
120
+
121
+ # 2. Create the plugin directory (project-local)
122
+ mkdir -p /your/project/.opencode/plugins
123
+
124
+ # 3. Write the plugin shim
125
+ cat > /your/project/.opencode/plugins/opencode-semantic-search.ts << 'EOF'
126
+ export { default } from "/path/to/opencode-semantic-search-plugin/index.ts";
127
+ EOF
128
+
129
+ # 4. (Optional) Create project config
130
+ mkdir -p /your/project/.opencode
131
+ cat > /your/project/.opencode/semantic-search.json << 'EOF'
132
+ {
133
+ "embedding": {
134
+ "provider": "ollama",
135
+ "model": "nomic-embed-text"
136
+ }
137
+ }
138
+ EOF
139
+ ```
140
+
141
+ ---
142
+
143
+ ## 4. Embedding Provider Setup
144
+
145
+ ### Ollama (local, default)
146
+
147
+ Ollama runs 100% locally — no API key required.
148
+
149
+ ```bash
150
+ # Start the Ollama server (keep this running in the background)
151
+ ollama serve
152
+
153
+ # Pull the embedding model
154
+ ollama pull nomic-embed-text
155
+ ```
156
+
157
+ > **Recommended models:**
158
+ > - `nomic-embed-text` — strong default quality, 768 dimensions (~270 MB)
159
+ > - `all-minilm` — faster, smaller, 384 dimensions (~45 MB)
160
+ > - `mxbai-embed-large` — highest quality, 1024 dimensions (~670 MB)
161
+
162
+ To change the model, update `.opencode/semantic-search.json`:
163
+
164
+ ```json
165
+ {
166
+ "embedding": {
167
+ "model": "mxbai-embed-large",
168
+ "dimensions": 1024
169
+ }
170
+ }
171
+ ```
172
+
173
+ > **Important:** changing the model requires a full reindex. Open OpenCode and call `reindex()`.
174
+
175
+ ### OpenAI / compatible API
176
+
177
+ Set your API key in the environment, then configure the plugin:
178
+
179
+ ```bash
180
+ export OPENAI_API_KEY="sk-..."
181
+ ```
182
+
183
+ ```json
184
+ {
185
+ "embedding": {
186
+ "provider": "openai",
187
+ "api_base": "https://api.openai.com/v1",
188
+ "model": "text-embedding-3-small",
189
+ "api_key_env": "OPENAI_API_KEY",
190
+ "dimensions": 1536
191
+ }
192
+ }
193
+ ```
194
+
195
+ #### Self-hosted / local OpenAI-compatible servers
196
+
197
+ Works with **LM Studio**, **LocalAI**, **Jan**, **Ollama's OpenAI-compatible endpoint**, etc.:
198
+
199
+ ```json
200
+ {
201
+ "embedding": {
202
+ "provider": "openai",
203
+ "api_base": "http://localhost:1234/v1",
204
+ "model": "nomic-embed-text-v1.5",
205
+ "api_key_env": "LM_STUDIO_KEY",
206
+ "dimensions": 768
207
+ }
208
+ }
209
+ ```
210
+
211
+ ---
212
+
213
+ ## 5. Project Configuration
214
+
215
+ Create `.opencode/semantic-search.json` in the root of your project:
216
+
217
+ ```json
218
+ {
219
+ "embedding": {
220
+ "provider": "ollama",
221
+ "api_base": "http://localhost:11434/v1",
222
+ "model": "nomic-embed-text",
223
+ "dimensions": 768,
224
+ "batch_size": 10
225
+ },
226
+ "chunking": {
227
+ "max_tokens": 512,
228
+ "overlap_tokens": 50
229
+ },
230
+ "indexing": {
231
+ "include": [
232
+ "**/*.ts", "**/*.tsx",
233
+ "**/*.js", "**/*.jsx",
234
+ "**/*.py", "**/*.go",
235
+ "**/*.rs", "**/*.java"
236
+ ],
237
+ "exclude": [
238
+ "node_modules/**", "dist/**",
239
+ ".git/**", "**/*.min.js"
240
+ ],
241
+ "respect_gitignore": true,
242
+ "max_file_size_kb": 500,
243
+ "concurrency": 4
244
+ },
245
+ "search": {
246
+ "top_k": 10,
247
+ "similarity_threshold": 0.35,
248
+ "context_lines": 3
249
+ },
250
+ "smart_grep": {
251
+ "enabled": true,
252
+ "min_words_for_semantic": 2,
253
+ "fallback_to_grep_on_empty": true
254
+ }
255
+ }
256
+ ```
257
+
258
+ All fields are optional — any omitted key falls back to the defaults above.
259
+
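The fallback behavior can be pictured as a deep merge of your project config over the built-in defaults. A minimal sketch, assuming a recursive merge of nested sections (the `mergeConfig` helper and the trimmed defaults object are illustrative, not the plugin's actual internals):

```typescript
// Illustrative sketch: omitted keys fall back to defaults via a deep merge.
// `mergeConfig` and the trimmed defaults below are hypothetical, not plugin API.
type Json = { [key: string]: unknown };

function mergeConfig(defaults: Json, overrides: Json): Json {
  const out: Json = { ...defaults };
  for (const [key, value] of Object.entries(overrides)) {
    const base = out[key];
    if (
      value && typeof value === "object" && !Array.isArray(value) &&
      base && typeof base === "object" && !Array.isArray(base)
    ) {
      out[key] = mergeConfig(base as Json, value as Json); // recurse into nested sections
    } else {
      out[key] = value; // scalars and arrays replace wholesale
    }
  }
  return out;
}

const defaults: Json = {
  embedding: { provider: "ollama", model: "nomic-embed-text", dimensions: 768 },
  search: { top_k: 10, similarity_threshold: 0.35 },
};

// A project config that only overrides the model keeps every other default.
const merged = mergeConfig(defaults, {
  embedding: { model: "all-minilm", dimensions: 384 },
}) as any;
console.log(merged.embedding); // model/dimensions overridden, provider kept
console.log(merged.search);    // untouched section falls back entirely
```

The practical takeaway: a project file containing only `{"embedding": {"model": "all-minilm"}}` is valid and leaves chunking, indexing, and search at their defaults.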
260
+ ---
261
+
262
+ ## 6. Global vs Project-local Installation
263
+
264
+ ### Global (default)
265
+
266
+ The shim is placed in `~/.config/opencode/plugins/`. Every OpenCode project will automatically load this plugin.
267
+
268
+ ```bash
269
+ bash install.sh
270
+ ```
271
+
272
+ To set defaults for all workspaces, use `~/.config/opencode/semantic-search.json` (written by `bash install.sh` when installing globally). Per-project `.opencode/semantic-search.json` merges on top.
273
+
274
+ ### Project-local
275
+
276
+ Use `--local` to place the shim in `.opencode/plugins/` inside your project directory only.
277
+
278
+ ```bash
279
+ bash install.sh --local
280
+ ```
281
+
282
+ ```
283
+ your-project/
284
+ ├── .opencode/
285
+ │ ├── plugins/
286
+ │ │ └── opencode-semantic-search.ts ← shim (add to .gitignore)
287
+ │ └── semantic-search.json ← per-project config
288
+ └── ...
289
+ ```
290
+
291
+ > Add `.opencode/plugins/opencode-semantic-search.ts` to your `.gitignore` since the shim contains an absolute path.
292
+
293
+ ---
294
+
295
+ ## 7. Verify the Installation
296
+
297
+ ### 1. Run integration tests
298
+
299
+ ```bash
300
+ bun run test:integration
301
+ ```
302
+
303
+ Expected output:
304
+
305
+ ```
306
+ [integration-index] Creating temp repo...
307
+ [integration-index] Running delta sync...
308
+ [integration-index] Files indexed: 3
309
+ [integration-index] ✔ index populated
310
+ [integration-index] ✔ semantic search returns results
311
+ [integration-index] All checks passed.
312
+
313
+ [integration-routing] ✔ conceptual query routed to semantic search
314
+ [integration-routing] ✔ exact query routed to ripgrep
315
+ [integration-routing] ✔ fallback fires when embedder is unhealthy
316
+ [integration-routing] All routing checks passed.
317
+ ```
318
+
319
+ ### 2. Run type checks
320
+
321
+ ```bash
322
+ bun run check
323
+ ```
324
+
325
+ ### 3. Open OpenCode and verify
326
+
327
+ Start OpenCode inside a project where the plugin is registered:
328
+
329
+ ```bash
330
+ opencode
331
+ ```
332
+
333
+ Ask the assistant:
334
+
335
+ ```
336
+ What is the index_status?
337
+ ```
338
+
339
+ You should see a JSON response with `files_indexed`, `chunks_indexed`, and indexing coverage.
340
+
341
+ ---
342
+
343
+ ## 8. Using the Plugin in OpenCode
344
+
345
+ Once loaded, the plugin:
346
+
347
+ - **Auto-indexes your codebase** the moment a session starts (delta sync — only changed files are processed).
348
+ - **Overrides the `grep` tool** with a smart version that routes conceptual queries to semantic search and exact patterns to ripgrep.
349
+ - **Re-indexes changed files** automatically as you edit them.
350
+
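The routing described above can be sketched as a small heuristic: patterns that look like regexes or quoted exact strings go to ripgrep, while natural-language queries of at least `min_words_for_semantic` words go to semantic search. This is an illustrative approximation of the idea, not the plugin's exact logic:

```typescript
// Illustrative routing heuristic — an approximation of smart grep, not the real code.
function routeQuery(pattern: string, minWordsForSemantic = 2): "semantic" | "ripgrep" {
  const looksLikeRegex = /[\\^$.|?*+()\[\]{}]/.test(pattern); // regex metacharacters
  const isQuoted = /^".*"$/.test(pattern);                    // explicit exact string
  const words = pattern.trim().split(/\s+/).filter(Boolean);
  if (looksLikeRegex || isQuoted) return "ripgrep";
  if (words.length >= minWordsForSemantic) return "semantic";
  return "ripgrep"; // short single keywords are cheap and precise with ripgrep
}

console.log(routeQuery("where do we handle authentication errors")); // semantic
console.log(routeQuery("TODO"));                                     // ripgrep
console.log(routeQuery('"import React"'));                           // ripgrep
```

With `fallback_to_grep_on_empty` enabled, a query routed to semantic search that returns nothing is retried through ripgrep, so a bad routing guess costs a little latency rather than a missed result.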
351
+ ### Tools available to the assistant
352
+
353
+ | Tool | What it does |
354
+ |---|---|
355
+ | `grep(pattern, ...)` | Smart grep: semantic-first for concepts, ripgrep for exact matches |
356
+ | `semantic_search(query, top_k?, threshold?, path?)` | Direct semantic search by intent/concept |
357
+ | `index_status()` | Shows files indexed, chunks, provider health, coverage % |
358
+ | `reindex()` | Forces a complete rebuild of the index |
359
+ | `diagnostic_bundle()` | JSON bundle for debugging (same fields as `bun run diagnostic:bundle` in this repo) |
360
+
361
+ ### Slash commands (`/sem-…`)
362
+
363
+ OpenCode slash commands are defined by Markdown files in **`.opencode/commands/`** (project) or **`~/.config/opencode/commands/`** (global). The filename without `.md` becomes the command name. Custom slash commands are **LLM-driven**: when you type `/sem-status`, OpenCode sends the Markdown template content as a prompt to the assistant, which then calls the relevant tool and shows the result.
364
+
365
+ This means the `.md` bodies should tell the assistant which tool to call — not contain data themselves.
366
+
367
+ | Shortcut | Stub file | Template content |
368
+ |---|---|---|
369
+ | `/sem-status` | `sem-status.md` | _Call `index_status()` and show the result._ |
370
+ | `/sem-reindex` | `sem-reindex.md` | _Call `reindex()` and show the stats._ |
371
+ | `/sem-diag` | `sem-diag.md` | _Call `diagnostic_bundle()` and show the JSON._ |
372
+ | `/sem-search` | `sem-search.md` | _Call `semantic_search()` with query: $ARGUMENTS._ |
373
+
374
+ **Example** — `.opencode/commands/sem-status.md`:
375
+
376
+ ```markdown
377
+ ---
378
+ description: Show semantic index and embedding provider health
379
+ ---
380
+
381
+ Call index_status() and show the result.
382
+ ```
383
+
384
+ **Example** — `.opencode/commands/sem-search.md`:
385
+
386
+ ```markdown
387
+ ---
388
+ description: Semantic search — pass a query as argument
389
+ ---
390
+
391
+ Call semantic_search() with query: $ARGUMENTS
392
+ ```
393
+
394
+ After restarting OpenCode, typing `/sem-status` asks the assistant to call `index_status()` exactly as if you had asked directly.
395
+
396
+ ### Example prompts
397
+
398
+ ```
399
+ Find where we handle authentication errors.
400
+ ```
401
+ → Semantic search finds auth error handling, even if the code uses `401`, `unauthorized`, `AuthError`, etc.
402
+
403
+ ```
404
+ Search for all TODO comments.
405
+ ```
406
+ → Routes to ripgrep (exact keyword).
407
+
408
+ ```
409
+ Where is the retry logic implemented?
410
+ ```
411
+ → Semantic search finds retry functions, backoff logic, reconnect handlers.
412
+
413
+ ```
414
+ grep for "import React"
415
+ ```
416
+ → Routes to ripgrep (exact string).
417
+
418
+ ---
419
+
420
+ ## 9. Advanced Configuration Reference
421
+
422
+ | Key | Default | Description |
423
+ |---|---|---|
424
+ | `embedding.provider` | `"ollama"` | `"ollama"` or `"openai"` |
425
+ | `embedding.api_base` | `"http://localhost:11434/v1"` | Base URL of the embedding API |
426
+ | `embedding.model` | `"nomic-embed-text"` | Model name |
427
+ | `embedding.dimensions` | `768` | Must match the model's output dimensions |
428
+ | `embedding.batch_size` | `10` | Chunks per embedding API request |
429
+ | `embedding.api_key_env` | — | Name of env var holding the API key (OpenAI) |
430
+ | `chunking.max_tokens` | `512` | Max tokens per code chunk |
431
+ | `chunking.overlap_tokens` | `50` | Token overlap between adjacent chunks |
432
+ | `indexing.include` | TS/JS/Py/Go/Rust/Java | Glob patterns for files to index |
433
+ | `indexing.exclude` | node_modules, dist, .git | Glob patterns for files to skip |
434
+ | `indexing.respect_gitignore` | `true` | Honor `.gitignore` during indexing |
435
+ | `indexing.max_file_size_kb` | `500` | Skip files larger than this |
436
+ | `indexing.concurrency` | `4` | Parallel file indexing workers |
437
+ | `search.top_k` | `10` | Max results returned |
438
+ | `search.similarity_threshold` | `0.35` | Minimum score to include a result |
439
+ | `search.context_lines` | `3` | Lines of source context around each result |
440
+ | `search.hybrid.weight_vector` | `0.4` | Weight for vector (ANN) score in hybrid ranking |
441
+ | `search.hybrid.weight_bm25` | `0.3` | Weight for BM25 keyword score |
442
+ | `search.hybrid.weight_rrf` | `0.3` | Weight for Reciprocal Rank Fusion |
443
+ | `search.hybrid.identifier_boost` | `1.5` | Score multiplier when query matches identifiers |
444
+ | `smart_grep.enabled` | `true` | Enable the smart grep tool override |
445
+ | `smart_grep.min_words_for_semantic` | `2` | Minimum word count to route to semantic search |
446
+ | `smart_grep.fallback_to_grep_on_empty` | `true` | Fall back to ripgrep if semantic returns nothing |
447
+ | `storage.global_cache_dir` | `~/.cache/opencode/semantic-search` | Where the SQLite database is stored |
448
+
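The `search.hybrid.*` weights combine three per-chunk signals into one ranking score. A hedged sketch of how such a weighted combination typically works (the function, interface, and field names here are illustrative, not the plugin's internals):

```typescript
// Illustrative hybrid ranking — shows how the weight_* knobs could combine,
// not the plugin's exact formula.
interface ChunkScores {
  vector: number;             // cosine similarity from ANN search, 0..1
  bm25: number;               // normalized BM25 keyword score, 0..1
  rrf: number;                // reciprocal-rank-fusion score, 0..1
  matchesIdentifier: boolean; // query matches a code identifier in the chunk
}

function hybridScore(
  s: ChunkScores,
  weights = { vector: 0.4, bm25: 0.3, rrf: 0.3, identifierBoost: 1.5 },
): number {
  const base = weights.vector * s.vector + weights.bm25 * s.bm25 + weights.rrf * s.rrf;
  return s.matchesIdentifier ? base * weights.identifierBoost : base;
}

const plain = hybridScore({ vector: 0.8, bm25: 0.5, rrf: 0.4, matchesIdentifier: false });
console.log(plain.toFixed(2));   // 0.4*0.8 + 0.3*0.5 + 0.3*0.4 = 0.59
const boosted = hybridScore({ vector: 0.8, bm25: 0.5, rrf: 0.4, matchesIdentifier: true });
console.log(boosted.toFixed(3)); // 0.59 * 1.5 = 0.885
```

Raising `weight_vector` favors conceptual matches; raising `weight_bm25` favors literal keyword hits; `identifier_boost` rewards chunks whose function or type names match the query.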
449
+ ---
450
+
451
+ ## 10. Upgrading
452
+
453
+ ```bash
454
+ cd /path/to/opencode-semantic-search-plugin
455
+
456
+ # Pull latest changes
457
+ git pull
458
+
459
+ # Reinstall dependencies
460
+ bun install
461
+
462
+ # If the embedding model config was unchanged, the index is reused.
463
+ # If you changed the model or dimensions, run reindex inside OpenCode:
464
+ # reindex()
465
+ ```
466
+
467
+ ---
468
+
469
+ ## 11. Uninstalling
470
+
471
+ ### Project-local
472
+
473
+ ```bash
474
+ rm .opencode/plugins/opencode-semantic-search.ts
475
+ rm .opencode/semantic-search.json # optional — keeps your config
476
+ ```
477
+
478
+ ### Global
479
+
480
+ ```bash
481
+ rm ~/.config/opencode/plugins/opencode-semantic-search.ts
482
+ ```
483
+
484
+ ### Remove the index database
485
+
486
+ ```bash
487
+ rm -rf ~/.cache/opencode/semantic-search/
488
+ ```
489
+
490
+ ---
491
+
492
+ ## 12. Troubleshooting
493
+
494
+ ### Plugin does not load
495
+
496
+ - Confirm the shim exists in `.opencode/plugins/` or `~/.config/opencode/plugins/`.
497
+ - Confirm the absolute path in the shim points to the correct `index.ts`.
498
+ - Run `bun run check` in the plugin directory — resolve any TypeScript errors.
499
+
500
+ ### Semantic search returns no results
501
+
502
+ 1. Check `index_status()` — if `files_indexed` is 0, indexing hasn't run yet.
503
+ 2. Ensure Ollama is running: `ollama serve`.
504
+ 3. Ensure the model is pulled: `ollama pull nomic-embed-text`.
505
+ 4. Try `reindex()` from within OpenCode.
506
+
507
+ ### Ollama is unreachable
508
+
509
+ The plugin gracefully falls back to keyword-only grep. You will not get semantic results until Ollama is available. No data is lost.
510
+
511
+ ### Wrong embedding dimensions
512
+
513
+ If you switch models, the dimension mismatch guard will warn you. Run `reindex()` to rebuild the index with the new model.
514
+
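Conceptually the guard is simple: compare the length of a freshly produced embedding against the configured `dimensions`. The check below is an illustrative sketch of that idea, not the plugin's actual guard:

```typescript
// Illustrative dimension guard — compares an embedding's length to the
// configured `dimensions`; not the plugin's real implementation.
function checkDimensions(embedding: number[], configured: number): void {
  if (embedding.length !== configured) {
    throw new Error(
      `Embedding has ${embedding.length} dimensions but config expects ${configured}; ` +
        `update "embedding.dimensions" and run reindex()`,
    );
  }
}

checkDimensions(new Array(768).fill(0), 768); // ok: nomic-embed-text default
try {
  checkDimensions(new Array(1024).fill(0), 768); // e.g. after switching to mxbai-embed-large
} catch (err) {
  console.error((err as Error).message);
}
```

This is why the config examples earlier pair every model change with a matching `dimensions` value: vectors of different lengths cannot be compared, so a mismatch always means a rebuild.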
515
+ ### Database corruption
516
+
517
+ ```bash
518
+ rm ~/.cache/opencode/semantic-search/<project-hash>/embeddings.db
519
+ # Then call reindex() from OpenCode
520
+ ```
521
+
522
+ ### High memory / slow indexing
523
+
524
+ - Reduce `indexing.concurrency` to `2`.
525
+ - Reduce `chunking.max_tokens` to `256`.
526
+ - Add large generated directories to `indexing.exclude`.
527
+
528
+ ### `rg` not found at runtime
529
+
530
+ Install ripgrep and make sure it is on `$PATH`:
531
+
532
+ ```bash
533
+ # macOS
534
+ brew install ripgrep
535
+
536
+ # Ubuntu / WSL
537
+ sudo apt install ripgrep
538
+
539
+ # Verify
540
+ rg --version
541
+ ```
@@ -0,0 +1,70 @@
1
+ #!/usr/bin/env node
2
+ /**
3
+ * Published package entry: npx / bunx opencode-semantic-search
4
+ * Resolves the plugin root and runs install.sh so the shim points at the installed package copy.
5
+ */
6
+ import { spawnSync } from "node:child_process";
7
+ import { dirname, join } from "node:path";
8
+ import { fileURLToPath } from "node:url";
9
+
10
+ const __dirname = dirname(fileURLToPath(import.meta.url));
11
+ const pluginRoot = join(__dirname, "..");
12
+ const installSh = join(pluginRoot, "install.sh");
13
+
14
+ function printHelp() {
15
+ console.log(`OpenCode Semantic Search — installer
16
+
17
+ Usage:
18
+ bunx opencode-semantic-search@latest [install] [options]
19
+ npx opencode-semantic-search@latest [install] [options]
20
+
21
+ Options (passed to install.sh):
22
+ --global Install to ~/.config/opencode/plugins/ (default)
23
+ --local Install to ./.opencode/plugins/ in the current directory
24
+ --ollama-model MODEL Embedding model to pull (default: nomic-embed-text)
25
+ --openai-key-env VAR Use OpenAI; pass the env var name (e.g. OPENAI_API_KEY)
26
+ --skip-ollama Skip Ollama checks and model pull
27
+ -h, --help Show install.sh help
28
+
29
+ Examples:
30
+ bunx opencode-semantic-search@latest
31
+ bunx opencode-semantic-search@latest install --local
32
+ bunx opencode-semantic-search@latest --openai-key-env OPENAI_API_KEY --skip-ollama
33
+
34
+ From a git clone:
35
+ bash install.sh
36
+ bash install.sh --local
37
+
38
+ Docs: https://github.com/jainprashul/opencode-semantic-search
39
+ `);
40
+ }
41
+
42
+ const argv = process.argv.slice(2);
43
+
44
+ if (argv[0] === "help" || argv[0] === "-h" || argv[0] === "--help") {
45
+ printHelp();
46
+ process.exit(0);
47
+ }
48
+
49
+ let installArgs;
50
+ if (argv.length === 0) {
51
+ installArgs = [];
52
+ } else if (argv[0] === "install") {
53
+ installArgs = argv.slice(1);
54
+ } else if (argv[0]?.startsWith("-")) {
55
+ installArgs = argv;
56
+ } else {
57
+ console.error(`Unknown command: ${argv[0]}. Try: opencode-semantic-search install [...]`);
58
+ process.exit(1);
59
+ }
60
+
61
+ const result = spawnSync("bash", [installSh, ...installArgs], {
62
+ stdio: "inherit",
63
+ cwd: process.cwd(),
64
+ env: {
65
+ ...process.env,
66
+ PLUGIN_DIR: pluginRoot
67
+ }
68
+ });
69
+
70
+ process.exit(result.status ?? 1);
package/bun.lock ADDED
@@ -0,0 +1,61 @@
1
+ {
2
+ "lockfileVersion": 1,
3
+ "configVersion": 1,
4
+ "workspaces": {
5
+ "": {
6
+ "name": "semantic-search-plugin",
7
+ "dependencies": {
8
+ "@opencode-ai/plugin": "^1.3.2",
9
+ "@opencode-ai/sdk": "1.3.2",
10
+ "@vscode/tree-sitter-wasm": "^0.3.0",
11
+ "ignore": "^7.0.5",
12
+ "picomatch": "^4.0.4",
13
+ "sqlite-vec": "^0.1.7",
14
+ "web-tree-sitter": "^0.26.7",
15
+ },
16
+ "devDependencies": {
17
+ "@types/bun": "latest",
18
+ },
19
+ "peerDependencies": {
20
+ "typescript": "^5",
21
+ },
22
+ },
23
+ },
24
+ "packages": {
25
+ "@opencode-ai/plugin": ["@opencode-ai/plugin@1.3.2", "", { "dependencies": { "@opencode-ai/sdk": "1.3.2", "zod": "4.1.8" } }, "sha512-eT0ZovMCOQlfTdAnfbEWgW343mJ9SHgEVfdiOSX1NMIVXac6hxE2xwUsRVTV3wLvfA6dKZhN800f8wLUEyPlyg=="],
26
+
27
+ "@opencode-ai/sdk": ["@opencode-ai/sdk@1.3.2", "", {}, "sha512-u7sXVKn0kyAA5vVVHuHQfq3+3UGWOU1Sh6d/e+aS4zO8AwriTSWNQ9r8Qy5yxBH+PoeOGl5WIVdp+s2Ea2zuAg=="],
28
+
29
+ "@types/bun": ["@types/bun@1.3.11", "", { "dependencies": { "bun-types": "1.3.11" } }, "sha512-5vPne5QvtpjGpsGYXiFyycfpDF2ECyPcTSsFBMa0fraoxiQyMJ3SmuQIGhzPg2WJuWxVBoxWJ2kClYTcw/4fAg=="],
30
+
31
+ "@types/node": ["@types/node@25.5.0", "", { "dependencies": { "undici-types": "~7.18.0" } }, "sha512-jp2P3tQMSxWugkCUKLRPVUpGaL5MVFwF8RDuSRztfwgN1wmqJeMSbKlnEtQqU8UrhTmzEmZdu2I6v2dpp7XIxw=="],
32
+
33
+ "@vscode/tree-sitter-wasm": ["@vscode/tree-sitter-wasm@0.3.0", "", {}, "sha512-4kjB1jgLyG9VimGfyJb1F8/GFdrx55atsBCH/9r2D/iZHAUDCvZ5zhWXB7sRQ2z2WkkuNYm/0pgQtUm1jhdf7A=="],
34
+
35
+ "bun-types": ["bun-types@1.3.11", "", { "dependencies": { "@types/node": "*" } }, "sha512-1KGPpoxQWl9f6wcZh57LvrPIInQMn2TQ7jsgxqpRzg+l0QPOFvJVH7HmvHo/AiPgwXy+/Thf6Ov3EdVn1vOabg=="],
36
+
37
+ "ignore": ["ignore@7.0.5", "", {}, "sha512-Hs59xBNfUIunMFgWAbGX5cq6893IbWg4KnrjbYwX3tx0ztorVgTDA6B2sxf8ejHJ4wz8BqGUMYlnzNBer5NvGg=="],
38
+
39
+ "picomatch": ["picomatch@4.0.4", "", {}, "sha512-QP88BAKvMam/3NxH6vj2o21R6MjxZUAd6nlwAS/pnGvN9IVLocLHxGYIzFhg6fUQ+5th6P4dv4eW9jX3DSIj7A=="],
40
+
41
+ "sqlite-vec": ["sqlite-vec@0.1.7", "", { "optionalDependencies": { "sqlite-vec-darwin-arm64": "0.1.7", "sqlite-vec-darwin-x64": "0.1.7", "sqlite-vec-linux-arm64": "0.1.7", "sqlite-vec-linux-x64": "0.1.7", "sqlite-vec-windows-x64": "0.1.7" } }, "sha512-1Sge9uRc3B6wDKR4J6sGFi/E2ai9SAU5FenDki3OmhdP/a49PO2Juy1U5yQnx2bZP5t+C3BYJTkG+KkDi3q9Xg=="],
42
+
43
+ "sqlite-vec-darwin-arm64": ["sqlite-vec-darwin-arm64@0.1.7", "", { "os": "darwin", "cpu": "arm64" }, "sha512-dQ7u4GKPdOPi3IfZ44K7HHdYup2JssM6fuKR9zgqRzW137uFOQmRhbYChNu+ZfW+yhJutsPgfNRFsuWKmy627w=="],
44
+
45
+ "sqlite-vec-darwin-x64": ["sqlite-vec-darwin-x64@0.1.7", "", { "os": "darwin", "cpu": "x64" }, "sha512-MDoczft1BriQcGMEz+CqeSCkB0OsAf12ytZOapS6MaB7zgNzLLSLH6Sxe3yzcPWUyDuCWgK7WzyRIo8u1vAIVA=="],
46
+
47
+ "sqlite-vec-linux-arm64": ["sqlite-vec-linux-arm64@0.1.7", "", { "os": "linux", "cpu": "arm64" }, "sha512-V429sYT/gwr9PgtT8rbjQd6ls7CFchFpiS45TKSf7rU7wxt9MBmCVorUcheD4kEZb4VeZ6PnFXXCqPMeaHkaUw=="],
48
+
49
+ "sqlite-vec-linux-x64": ["sqlite-vec-linux-x64@0.1.7", "", { "os": "linux", "cpu": "x64" }, "sha512-wZL+lXeW7y63DLv6FYU6Q4nv2lP5F94cWt7bJCWNiHmZ6NdKIgz/p0QlyuJA/51b8TyoDvsTdusLVlZz9cIh5A=="],
50
+
51
+ "sqlite-vec-windows-x64": ["sqlite-vec-windows-x64@0.1.7", "", { "os": "win32", "cpu": "x64" }, "sha512-FEZMjMT03irJxwqMQg+A+4hHCiFslxISOAkQ0eYn2lP7GdpppkgYveaT5Xnw/2V+GLq2MXOJb0nDGFNethHSkg=="],
52
+
53
+ "typescript": ["typescript@5.9.3", "", { "bin": { "tsc": "bin/tsc", "tsserver": "bin/tsserver" } }, "sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw=="],
54
+
55
+ "undici-types": ["undici-types@7.18.2", "", {}, "sha512-AsuCzffGHJybSaRrmr5eHr81mwJU3kjw6M+uprWvCXiNeN9SOGwQ3Jn8jb8m3Z6izVgknn1R0FTCEAP2QrLY/w=="],
56
+
57
+ "web-tree-sitter": ["web-tree-sitter@0.26.7", "", {}, "sha512-KiZhelTvBA/ziUHEO7Emb75cGVAq8iGZNabYaZm53Zpy50NsXyOW+xSHlwHt5CVg/TRPZBfeVLTTobF0LjFJ1w=="],
58
+
59
+ "zod": ["zod@4.1.8", "", {}, "sha512-5R1P+WwQqmmMIEACyzSvo4JXHY5WiAFHRMg+zBZKgKS+Q1viRa0C1hmUKtHltoIFKtIdki3pRxkmpP74jnNYHQ=="],
60
+ }
61
+ }