lancedb-opencode-pro 0.1.1 → 0.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -10,60 +10,23 @@ LanceDB-backed long-term memory provider for OpenCode.
 
 ## Install
 
-For normal use, install from npm into a fixed local plugin directory:
+### Primary (Recommended): npm Package Name
 
-```bash
-mkdir -p ~/.config/opencode/plugins/lancedb-opencode-pro
-npm install --prefix ~/.config/opencode/plugins/lancedb-opencode-pro lancedb-opencode-pro
-```
-
-Register the plugin as a `file://` path in `~/.config/opencode/opencode.json`:
+1. Register the published package name in `~/.config/opencode/opencode.json`:
 
 ```json
 {
   "$schema": "https://opencode.ai/config.json",
   "plugin": [
     "oh-my-opencode",
-    "file:///home/<user>/.config/opencode/plugins/lancedb-opencode-pro/node_modules/lancedb-opencode-pro/dist/index.js"
+    "lancedb-opencode-pro"
   ]
 }
 ```
 
-If you already use other plugins, keep them and append this `file://` entry.
-
-## Fallback Install From Local `.tgz`
-
-Use this flow when you want to install an unpublished build or test a release candidate on another machine.
-
-1. On the build host, build and pack:
-
-```bash
-npm ci
-npm run typecheck
-npm run build
-npm pack
-```
-
-This generates a file like:
-
-```text
-lancedb-opencode-pro-0.1.1.tgz
-```
-
-2. Copy the `.tgz` to the target host (example):
-
-```bash
-scp lancedb-opencode-pro-0.1.1.tgz <user>@<target-host>:/tmp/
-```
-
-3. On the target host, install into a fixed local plugin directory:
-
-```bash
-mkdir -p ~/.config/opencode/plugins/lancedb-opencode-pro
-npm install --prefix ~/.config/opencode/plugins/lancedb-opencode-pro /tmp/lancedb-opencode-pro-0.1.1.tgz
-```
+If you already use other plugins, keep them and append `"lancedb-opencode-pro"`.
 
-4. For OpenCode `1.2.27+`, create the sidecar config file `~/.config/opencode/lancedb-opencode-pro.json`:
+2. For OpenCode `1.2.27+`, create the sidecar config file `~/.config/opencode/lancedb-opencode-pro.json`:
 
 ```json
 {
@@ -86,14 +49,14 @@ npm install --prefix ~/.config/opencode/plugins/lancedb-opencode-pro /tmp/lanced
 }
 ```
 
-5. Set `embedding.baseUrl` to the Ollama endpoint that is reachable from that host.
+3. Set `embedding.baseUrl` to the Ollama endpoint that is reachable from that host.
 
 - Same machine as OpenCode: `http://127.0.0.1:11434`
 - Another machine on the network: for example `http://192.168.11.206:11434`
 
 You do not need `LANCEDB_OPENCODE_PRO_OLLAMA_BASE_URL` if the sidecar file already contains the correct `embedding.baseUrl`. Use the environment variable only when you want to override the file at runtime.
 
-6. Make sure Ollama is reachable from that host before starting OpenCode:
+4. Make sure Ollama is reachable from that host, then start or restart OpenCode:
 
 ```bash
 curl http://127.0.0.1:11434/api/tags
@@ -105,26 +68,57 @@ or, for a remote Ollama server:
 curl http://192.168.11.206:11434/api/tags
 ```
 
-7. Verify plugin file path and start/restart OpenCode:
+After the first successful memory operation, LanceDB files should appear under:
+
+```text
+~/.opencode/memory/lancedb
+```
+
+You can verify that the directory exists:
 
 ```bash
-ls -la ~/.config/opencode/plugins/lancedb-opencode-pro/node_modules/lancedb-opencode-pro/dist/index.js
+ls -la ~/.opencode/memory/lancedb
 ```
 
-Then start or restart OpenCode, and verify memory store initialization.
+## Fallback Install From Release `.tgz`
 
-After the first successful memory operation, LanceDB files should appear under:
+Use this only when an npm registry install is unavailable (for example, a restricted network, offline staging, or a registry outage).
 
-```text
-~/.opencode/memory/lancedb
+1. Download the latest published release asset:
+
+```bash
+curl -fL "https://github.com/tryweb/lancedb-opencode-pro/releases/latest/download/lancedb-opencode-pro.tgz" -o /tmp/lancedb-opencode-pro.tgz
 ```
 
-You can also verify that the directory exists:
+2. Install into the fixed local plugin directory:
 
 ```bash
-ls -la ~/.opencode/memory/lancedb
+mkdir -p ~/.config/opencode/plugins/lancedb-opencode-pro
+npm install --prefix ~/.config/opencode/plugins/lancedb-opencode-pro /tmp/lancedb-opencode-pro.tgz
+```
+
+3. Register the plugin as a `file://` path in `~/.config/opencode/opencode.json`:
+
+```json
+{
+  "$schema": "https://opencode.ai/config.json",
+  "plugin": [
+    "oh-my-opencode",
+    "file:///home/<user>/.config/opencode/plugins/lancedb-opencode-pro/node_modules/lancedb-opencode-pro/dist/index.js"
+  ]
+}
 ```
 
+4. Reuse the same sidecar config from the primary install flow, then start/restart OpenCode.
+
+5. Verify the plugin file path:
+
+```bash
+ls -la ~/.config/opencode/plugins/lancedb-opencode-pro/node_modules/lancedb-opencode-pro/dist/index.js
+```
+
+Then verify memory store initialization.
+
 ### When to use environment variables
 
 Environment variables are optional. The recommended default is:
@@ -140,6 +134,26 @@ export LANCEDB_OPENCODE_PRO_OLLAMA_BASE_URL="http://192.168.11.206:11434"
 
 This override has higher priority than the sidecar file.
 
+### Secondary Fallback: Build From Source And Pack Locally
+
+Use this when you need an unpublished local build (for example, testing unreleased commits).
+
+```bash
+npm ci
+npm run typecheck
+npm run build
+npm pack
+```
+
+Then install the generated tarball:
+
+```bash
+mkdir -p ~/.config/opencode/plugins/lancedb-opencode-pro
+npm install --prefix ~/.config/opencode/plugins/lancedb-opencode-pro ./lancedb-opencode-pro-<version>.tgz
+```
+
+Use the same `file://` plugin registration shown in the fallback section above.
+
 ## OpenCode Config
 
 Use a sidecar config file. This is the supported configuration model for current OpenCode versions.
@@ -189,9 +203,14 @@ Supported environment variables:
 
 - `LANCEDB_OPENCODE_PRO_CONFIG_PATH`
 - `LANCEDB_OPENCODE_PRO_PROVIDER`
+- `LANCEDB_OPENCODE_PRO_EMBEDDING_PROVIDER` (`ollama` or `openai`, default `ollama`)
 - `LANCEDB_OPENCODE_PRO_DB_PATH`
 - `LANCEDB_OPENCODE_PRO_EMBEDDING_MODEL`
 - `LANCEDB_OPENCODE_PRO_OLLAMA_BASE_URL`
+- `LANCEDB_OPENCODE_PRO_OPENAI_API_KEY`
+- `LANCEDB_OPENCODE_PRO_OPENAI_MODEL`
+- `LANCEDB_OPENCODE_PRO_OPENAI_BASE_URL`
+- `LANCEDB_OPENCODE_PRO_OPENAI_TIMEOUT_MS`
 - `LANCEDB_OPENCODE_PRO_EMBEDDING_TIMEOUT_MS`
 - `LANCEDB_OPENCODE_PRO_RETRIEVAL_MODE`
 - `LANCEDB_OPENCODE_PRO_VECTOR_WEIGHT`
@@ -213,6 +232,51 @@ Supported environment variables:
 - `memory_stats`
 - `memory_port_plan`
 
+## OpenAI Embedding Configuration
+
+The default behavior stays on Ollama. To use OpenAI embeddings, set `embedding.provider` to `openai` and provide an API key and model.
+
+Example sidecar:
+
+```json
+{
+  "provider": "lancedb-opencode-pro",
+  "dbPath": "~/.opencode/memory/lancedb",
+  "embedding": {
+    "provider": "openai",
+    "model": "text-embedding-3-small",
+    "baseUrl": "https://api.openai.com/v1",
+    "apiKey": "sk-your-openai-key"
+  },
+  "retrieval": {
+    "mode": "hybrid",
+    "vectorWeight": 0.7,
+    "bm25Weight": 0.3,
+    "minScore": 0.2
+  },
+  "includeGlobalScope": true,
+  "minCaptureChars": 80,
+  "maxEntriesPerScope": 3000
+}
+```
+
+Recommended env overrides for OpenAI:
+
+```bash
+export LANCEDB_OPENCODE_PRO_EMBEDDING_PROVIDER="openai"
+export LANCEDB_OPENCODE_PRO_OPENAI_API_KEY="$OPENAI_API_KEY"
+export LANCEDB_OPENCODE_PRO_OPENAI_MODEL="text-embedding-3-small"
+export LANCEDB_OPENCODE_PRO_OPENAI_BASE_URL="https://api.openai.com/v1"
+```
+
+`lancedb-opencode-pro.json` is parsed as plain JSON, so `${...}` interpolation is not performed. Prefer environment variables for secrets.
+
+Validation behavior:
+
+- If `embedding.provider=openai` and the API key is missing, initialization fails with an explicit configuration error.
+- If `embedding.provider=openai` and the model is missing, initialization fails with an explicit configuration error.
+- Ollama remains the default provider when `embedding.provider` is omitted.
+
 ## Compose Port Planning (Cross-Project)
 
 Use `memory_port_plan` before writing `docker-compose.yml` to avoid host port collisions across projects on the same machine.
@@ -497,5 +561,5 @@ Treat the feature as verified only when all of these are true:
 ## Notes
 
 - Default storage path: `~/.opencode/memory/lancedb`
-- Embedding backend in v1: `ollama`
+- Embedding provider defaults to `ollama`; `openai` is supported via `embedding.provider=openai`
 - The provider keeps schema metadata (`schemaVersion`, `embeddingModel`, `vectorDim`) to guard against unsafe vector mixing.
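
The precedence the README describes (environment variable over sidecar file over built-in default) can be sketched as follows. This is an illustration only: `firstDefined` is a hypothetical stand-in for the package's internal `firstString` helper, and the objects are mock stand-ins for `process.env` and the parsed sidecar file.

```javascript
// Hypothetical sketch of the documented precedence:
// environment variable > sidecar file value > built-in default.
function firstDefined(...values) {
  for (const value of values) {
    if (typeof value === "string" && value.trim() !== "") {
      return value;
    }
  }
  return undefined;
}

// The sidecar file sets a LAN Ollama endpoint; no env override is present.
const sidecar = { embedding: { baseUrl: "http://192.168.11.206:11434" } };
const env = {}; // stand-in for process.env

const baseUrl =
  firstDefined(env.LANCEDB_OPENCODE_PRO_OLLAMA_BASE_URL, sidecar.embedding.baseUrl) ??
  "http://127.0.0.1:11434"; // default when neither source is set

console.log(baseUrl); // "http://192.168.11.206:11434"
```

Setting `LANCEDB_OPENCODE_PRO_OLLAMA_BASE_URL` in `env` would win over the sidecar value, which matches the note that the env override has higher priority than the file.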
package/dist/config.js CHANGED
@@ -3,6 +3,7 @@ import { join } from "node:path";
 import { clamp, expandHomePath, parseJsonObject, toBoolean, toNumber } from "./utils.js";
 const DEFAULT_DB_PATH = "~/.opencode/memory/lancedb";
 const DEFAULT_OLLAMA_BASE_URL = "http://127.0.0.1:11434";
+const DEFAULT_OPENAI_BASE_URL = "https://api.openai.com/v1";
 const SIDECAR_FILE = "lancedb-opencode-pro.json";
 export function resolveMemoryConfig(config, worktree) {
     const legacyRaw = (config?.memory ?? {});
@@ -19,14 +20,29 @@ export function resolveMemoryConfig(config, worktree) {
     const weightSum = vectorWeight + bm25Weight;
     const normalizedVectorWeight = weightSum > 0 ? vectorWeight / weightSum : 0.7;
     const normalizedBm25Weight = weightSum > 0 ? bm25Weight / weightSum : 0.3;
-    return {
+    const embeddingProvider = resolveEmbeddingProvider(firstString(process.env.LANCEDB_OPENCODE_PRO_EMBEDDING_PROVIDER, embeddingRaw.provider));
+    const embeddingModel = embeddingProvider === "openai"
+        ? firstString(process.env.LANCEDB_OPENCODE_PRO_OPENAI_MODEL, process.env.LANCEDB_OPENCODE_PRO_EMBEDDING_MODEL, embeddingRaw.model)
+        : firstString(process.env.LANCEDB_OPENCODE_PRO_EMBEDDING_MODEL, embeddingRaw.model) ?? "nomic-embed-text";
+    const embeddingBaseUrl = embeddingProvider === "openai"
+        ? firstString(process.env.LANCEDB_OPENCODE_PRO_OPENAI_BASE_URL, embeddingRaw.baseUrl) ?? DEFAULT_OPENAI_BASE_URL
+        : firstString(process.env.LANCEDB_OPENCODE_PRO_OLLAMA_BASE_URL, embeddingRaw.baseUrl) ?? DEFAULT_OLLAMA_BASE_URL;
+    const embeddingApiKey = embeddingProvider === "openai"
+        ? firstString(process.env.LANCEDB_OPENCODE_PRO_OPENAI_API_KEY, embeddingRaw.apiKey)
+        : undefined;
+    const timeoutEnv = embeddingProvider === "openai"
+        ? process.env.LANCEDB_OPENCODE_PRO_OPENAI_TIMEOUT_MS ?? process.env.LANCEDB_OPENCODE_PRO_EMBEDDING_TIMEOUT_MS
+        : process.env.LANCEDB_OPENCODE_PRO_EMBEDDING_TIMEOUT_MS;
+    const timeoutRaw = timeoutEnv ?? embeddingRaw.timeoutMs;
+    const resolvedConfig = {
         provider,
         dbPath,
         embedding: {
-            provider: "ollama",
-            model: firstString(process.env.LANCEDB_OPENCODE_PRO_EMBEDDING_MODEL, embeddingRaw.model) ?? "nomic-embed-text",
-            baseUrl: firstString(process.env.LANCEDB_OPENCODE_PRO_OLLAMA_BASE_URL, embeddingRaw.baseUrl) ?? DEFAULT_OLLAMA_BASE_URL,
-            timeoutMs: Math.max(500, Math.floor(toNumber(process.env.LANCEDB_OPENCODE_PRO_EMBEDDING_TIMEOUT_MS ?? embeddingRaw.timeoutMs, 6000))),
+            provider: embeddingProvider,
+            model: embeddingModel ?? "",
+            baseUrl: embeddingBaseUrl,
+            apiKey: embeddingApiKey,
+            timeoutMs: Math.max(500, Math.floor(toNumber(timeoutRaw, 6000))),
         },
         retrieval: {
             mode,
@@ -38,6 +54,25 @@ export function resolveMemoryConfig(config, worktree) {
         minCaptureChars: Math.max(30, Math.floor(toNumber(process.env.LANCEDB_OPENCODE_PRO_MIN_CAPTURE_CHARS ?? raw.minCaptureChars, 80))),
         maxEntriesPerScope: Math.max(50, Math.floor(toNumber(process.env.LANCEDB_OPENCODE_PRO_MAX_ENTRIES_PER_SCOPE ?? raw.maxEntriesPerScope, 3000))),
     };
+    validateEmbeddingConfig(resolvedConfig.embedding);
+    return resolvedConfig;
+}
+function resolveEmbeddingProvider(raw) {
+    if (!raw || raw === "ollama")
+        return "ollama";
+    if (raw === "openai")
+        return "openai";
+    throw new Error(`[lancedb-opencode-pro] Invalid embedding provider "${raw}". Expected "ollama" or "openai".`);
+}
+function validateEmbeddingConfig(embedding) {
+    if (embedding.provider !== "openai")
+        return;
+    if (!embedding.apiKey) {
+        throw new Error("[lancedb-opencode-pro] OpenAI embedding provider requires apiKey. Set embedding.apiKey or LANCEDB_OPENCODE_PRO_OPENAI_API_KEY.");
+    }
+    if (!embedding.model) {
+        throw new Error("[lancedb-opencode-pro] OpenAI embedding provider requires model. Set embedding.model or LANCEDB_OPENCODE_PRO_OPENAI_MODEL.");
+    }
 }
 function loadSidecarConfig(worktree) {
     const configPath = firstString(process.env.LANCEDB_OPENCODE_PRO_CONFIG_PATH);
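
The two helpers added to `dist/config.js` are small enough to exercise standalone. The following reproduces the same logic from the diff above so the failure modes can be checked directly:

```javascript
// Reproduced from dist/config.js above: provider resolution and validation.
function resolveEmbeddingProvider(raw) {
    if (!raw || raw === "ollama")
        return "ollama"; // unset or "ollama" keeps the default backend
    if (raw === "openai")
        return "openai";
    throw new Error(`[lancedb-opencode-pro] Invalid embedding provider "${raw}". Expected "ollama" or "openai".`);
}

function validateEmbeddingConfig(embedding) {
    if (embedding.provider !== "openai")
        return; // Ollama configs need no API key
    if (!embedding.apiKey) {
        throw new Error("[lancedb-opencode-pro] OpenAI embedding provider requires apiKey.");
    }
    if (!embedding.model) {
        throw new Error("[lancedb-opencode-pro] OpenAI embedding provider requires model.");
    }
}

// Unset provider falls back to "ollama"; "openai" without an apiKey is rejected.
console.log(resolveEmbeddingProvider(undefined)); // "ollama"
try {
    validateEmbeddingConfig({ provider: "openai", model: "text-embedding-3-small" });
} catch (error) {
    console.log(error.message.includes("requires apiKey")); // true
}
```

This is why a misconfigured OpenAI setup fails fast at initialization with an explicit configuration error rather than at the first embedding call.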
package/dist/embedder.d.ts CHANGED
@@ -12,3 +12,12 @@ export declare class OllamaEmbedder implements Embedder {
     embed(text: string): Promise<number[]>;
     dim(): Promise<number>;
 }
+export declare class OpenAIEmbedder implements Embedder {
+    private readonly config;
+    readonly model: string;
+    private cachedDim;
+    constructor(config: EmbeddingConfig);
+    embed(text: string): Promise<number[]>;
+    dim(): Promise<number>;
+}
+export declare function createEmbedder(config: EmbeddingConfig): Embedder;
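
The new `createEmbedder` entry point selects an embedder class from `config.provider`. A minimal sketch of that dispatch, with stand-in classes (the real `OllamaEmbedder`/`OpenAIEmbedder` in `dist/embedder.js` perform HTTP calls):

```javascript
// Stand-in embedder classes for illustration only; the real ones embed text
// via HTTP against Ollama or the OpenAI embeddings endpoint.
class OllamaEmbedder {
    constructor(config) { this.model = config.model; }
}
class OpenAIEmbedder {
    constructor(config) { this.model = config.model; }
}

// Mirrors the factory shown in dist/embedder.js: "openai" gets OpenAIEmbedder,
// everything else falls back to OllamaEmbedder.
function createEmbedder(config) {
    if (config.provider === "openai") {
        return new OpenAIEmbedder(config);
    }
    return new OllamaEmbedder(config);
}

console.log(createEmbedder({ provider: "ollama", model: "nomic-embed-text" }) instanceof OllamaEmbedder); // true
console.log(createEmbedder({ provider: "openai", model: "text-embedding-3-small" }) instanceof OpenAIEmbedder); // true
```

Because callers depend only on the `Embedder` interface (`embed`, `dim`), the rest of the plugin is unchanged by the provider switch.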
package/dist/embedder.js CHANGED
@@ -3,6 +3,9 @@ const KNOWN_MODEL_DIMS = {
     "mxbai-embed-large": 1024,
     "all-minilm": 384,
     "snowflake-arctic-embed": 1024,
+    "text-embedding-3-small": 1536,
+    "text-embedding-3-large": 3072,
+    "text-embedding-ada-002": 1536,
 };
 function fallbackDim(model) {
     const normalized = model.toLowerCase().replace(/:.*$/, "");
@@ -70,3 +73,76 @@ export class OllamaEmbedder {
         }
     }
 }
+export class OpenAIEmbedder {
+    config;
+    model;
+    cachedDim = null;
+    constructor(config) {
+        this.config = config;
+        this.model = config.model;
+    }
+    async embed(text) {
+        if (!this.config.apiKey) {
+            throw new Error("OpenAI embedding request failed: missing apiKey. Set embedding.apiKey or LANCEDB_OPENCODE_PRO_OPENAI_API_KEY.");
+        }
+        const baseUrl = (this.config.baseUrl ?? "https://api.openai.com/v1").replace(/\/+$/, "");
+        const endpoint = `${baseUrl}/embeddings`;
+        const controller = new AbortController();
+        const timeout = setTimeout(() => controller.abort(), this.config.timeoutMs ?? 6000);
+        try {
+            const response = await fetch(endpoint, {
+                method: "POST",
+                headers: {
+                    "content-type": "application/json",
+                    authorization: `Bearer ${this.config.apiKey}`,
+                },
+                body: JSON.stringify({
+                    model: this.config.model,
+                    input: text,
+                    encoding_format: "float",
+                }),
+                signal: controller.signal,
+            });
+            if (!response.ok) {
+                const details = await response.text().catch(() => "");
+                const suffix = details ? ` - ${details.slice(0, 240)}` : "";
+                throw new Error(`OpenAI embedding request failed: HTTP ${response.status}${suffix}`);
+            }
+            const data = (await response.json());
+            const vector = data.data?.[0]?.embedding;
+            if (!Array.isArray(vector) || vector.length === 0) {
+                throw new Error("OpenAI embedding response missing embedding vector");
+            }
+            if (this.cachedDim === null) {
+                this.cachedDim = vector.length;
+            }
+            return vector;
+        }
+        finally {
+            clearTimeout(timeout);
+        }
+    }
+    async dim() {
+        if (this.cachedDim !== null)
+            return this.cachedDim;
+        try {
+            const probe = await this.embed("dimension probe");
+            this.cachedDim = probe.length;
+            return this.cachedDim;
+        }
+        catch {
+            const fb = fallbackDim(this.model);
+            if (fb !== null) {
+                console.warn(`[lancedb-opencode-pro] OpenAI embedding probe failed, using fallback dim ${fb} for model "${this.model}"`);
+                return fb;
+            }
+            throw new Error(`OpenAI embedding probe failed and no known fallback dimension for model "${this.model}"`);
+        }
+    }
+}
+export function createEmbedder(config) {
+    if (config.provider === "openai") {
+        return new OpenAIEmbedder(config);
+    }
+    return new OllamaEmbedder(config);
+}
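
The request `OpenAIEmbedder.embed` issues can be factored into a pure builder for inspection. This is a sketch, not part of the package (the `buildEmbeddingRequest` name is ours); it mirrors the URL normalization and payload shown in the diff above, while the real method additionally applies an `AbortController` timeout and error handling:

```javascript
// Hypothetical pure helper mirroring the request OpenAIEmbedder.embed sends.
function buildEmbeddingRequest(config, text) {
    // Trailing slashes are stripped so "v1/" and "v1" yield the same endpoint.
    const baseUrl = (config.baseUrl ?? "https://api.openai.com/v1").replace(/\/+$/, "");
    return {
        url: `${baseUrl}/embeddings`,
        method: "POST",
        headers: {
            "content-type": "application/json",
            authorization: `Bearer ${config.apiKey}`,
        },
        body: JSON.stringify({ model: config.model, input: text, encoding_format: "float" }),
    };
}

const request = buildEmbeddingRequest(
    { apiKey: "sk-test", model: "text-embedding-3-small", baseUrl: "https://api.openai.com/v1/" },
    "hello",
);
console.log(request.url); // "https://api.openai.com/v1/embeddings"
console.log(JSON.parse(request.body).model); // "text-embedding-3-small"
```

Normalizing the base URL here is what lets `LANCEDB_OPENCODE_PRO_OPENAI_BASE_URL` point at OpenAI-compatible proxies with or without a trailing slash.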
package/dist/index.js CHANGED
@@ -1,6 +1,6 @@
 import { tool } from "@opencode-ai/plugin";
 import { resolveMemoryConfig } from "./config.js";
-import { OllamaEmbedder } from "./embedder.js";
+import { createEmbedder } from "./embedder.js";
 import { extractCaptureCandidate } from "./extract.js";
 import { isTcpPortAvailable, parsePortReservations, planPorts, reservationKey } from "./ports.js";
 import { buildScopeFilter, deriveProjectScope } from "./scope.js";
@@ -11,7 +11,12 @@ const plugin = async (input) => {
     const state = await createRuntimeState(input);
     const hooks = {
         config: async (config) => {
-            state.config = resolveMemoryConfig(config, input.worktree);
+            const nextConfig = resolveMemoryConfig(config, input.worktree);
+            if (hasEmbeddingConfigChanged(state.config.embedding, nextConfig.embedding)) {
+                state.embedder = createEmbedder(nextConfig.embedding);
+                state.initialized = false;
+            }
+            state.config = nextConfig;
         },
         event: async ({ event }) => {
             if (event.type === "session.idle" || event.type === "session.compacted") {
@@ -72,7 +77,7 @@ const plugin = async (input) => {
         execute: async (args, context) => {
             await state.ensureInitialized();
             if (!state.initialized)
-                return "Memory store unavailable (Ollama may be offline). Will retry automatically.";
+                return unavailableMessage(state.config.embedding.provider);
             const activeScope = args.scope ?? deriveProjectScope(context.worktree);
             const scopes = buildScopeFilter(activeScope, state.config.includeGlobalScope);
             let queryVector = [];
@@ -111,7 +116,7 @@ const plugin = async (input) => {
         execute: async (args, context) => {
             await state.ensureInitialized();
             if (!state.initialized)
-                return "Memory store unavailable (Ollama may be offline). Will retry automatically.";
+                return unavailableMessage(state.config.embedding.provider);
             if (!args.confirm) {
                 return "Rejected: memory_delete requires confirm=true.";
             }
@@ -130,7 +135,7 @@ const plugin = async (input) => {
         execute: async (args) => {
             await state.ensureInitialized();
             if (!state.initialized)
-                return "Memory store unavailable (Ollama may be offline). Will retry automatically.";
+                return unavailableMessage(state.config.embedding.provider);
             if (!args.confirm) {
                 return "Rejected: destructive clear requires confirm=true.";
             }
@@ -146,7 +151,7 @@ const plugin = async (input) => {
         execute: async (args, context) => {
             await state.ensureInitialized();
             if (!state.initialized)
-                return "Memory store unavailable (Ollama may be offline). Will retry automatically.";
+                return unavailableMessage(state.config.embedding.provider);
             const scope = args.scope ?? deriveProjectScope(context.worktree);
             const entries = await state.store.list(scope, 20);
             const incompatibleVectors = await state.store.countIncompatibleVectors(buildScopeFilter(scope, state.config.includeGlobalScope), await state.embedder.dim());
@@ -180,7 +185,7 @@ const plugin = async (input) => {
         execute: async (args, context) => {
             await state.ensureInitialized();
             if (!state.initialized)
-                return "Memory store unavailable (Ollama may be offline). Will retry automatically.";
+                return unavailableMessage(state.config.embedding.provider);
             if (args.rangeStart > args.rangeEnd) {
                 return "Invalid range: rangeStart must be <= rangeEnd.";
             }
@@ -261,7 +266,7 @@ const plugin = async (input) => {
 };
 async function createRuntimeState(input) {
     const resolved = resolveMemoryConfig(undefined, input.worktree);
-    const embedder = new OllamaEmbedder(resolved.embedding);
+    const embedder = createEmbedder(resolved.embedding);
     const store = new MemoryStore(resolved.dbPath);
     const state = {
         config: resolved,
@@ -369,4 +374,14 @@ function unwrapData(value) {
     }
     return value;
 }
+function unavailableMessage(provider) {
+    return `Memory store unavailable (${provider} embedding may be offline). Will retry automatically.`;
+}
+function hasEmbeddingConfigChanged(current, next) {
+    return (current.provider !== next.provider
+        || current.model !== next.model
+        || (current.baseUrl ?? "") !== (next.baseUrl ?? "")
+        || (current.apiKey ?? "") !== (next.apiKey ?? "")
+        || (current.timeoutMs ?? 0) !== (next.timeoutMs ?? 0));
+}
 export default plugin;
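
The reload behavior wired into the `config` hook hinges on `hasEmbeddingConfigChanged`. Reproduced standalone from the diff above, it shows that any change to provider, model, baseUrl, apiKey, or timeout forces a fresh embedder and re-initialization:

```javascript
// Reproduced from dist/index.js above: detects embedding config changes
// that require rebuilding the embedder and re-initializing the store.
// Optional fields are compared with defaults ("" / 0) so undefined and
// missing values are treated as equal.
function hasEmbeddingConfigChanged(current, next) {
    return (current.provider !== next.provider
        || current.model !== next.model
        || (current.baseUrl ?? "") !== (next.baseUrl ?? "")
        || (current.apiKey ?? "") !== (next.apiKey ?? "")
        || (current.timeoutMs ?? 0) !== (next.timeoutMs ?? 0));
}

const ollama = { provider: "ollama", model: "nomic-embed-text", baseUrl: "http://127.0.0.1:11434" };
const openai = { provider: "openai", model: "text-embedding-3-small", apiKey: "sk-test" };

console.log(hasEmbeddingConfigChanged(ollama, { ...ollama })); // false: identical config, keep the embedder
console.log(hasEmbeddingConfigChanged(ollama, openai)); // true: provider switch triggers re-init
```

Resetting `state.initialized` on a change means the next tool call re-runs `ensureInitialized`, so a provider switch takes effect without restarting OpenCode.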
package/dist/types.d.ts CHANGED
@@ -1,10 +1,11 @@
-export type EmbeddingProvider = "ollama";
+export type EmbeddingProvider = "ollama" | "openai";
 export type RetrievalMode = "hybrid" | "vector";
 export type MemoryCategory = "preference" | "fact" | "decision" | "entity" | "other";
 export interface EmbeddingConfig {
     provider: EmbeddingProvider;
     model: string;
     baseUrl?: string;
+    apiKey?: string;
     timeoutMs?: number;
 }
 export interface RetrievalConfig {
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "lancedb-opencode-pro",
-  "version": "0.1.1",
+  "version": "0.1.2",
   "description": "LanceDB-backed long-term memory provider for OpenCode",
   "type": "module",
   "main": "dist/index.js",
@@ -25,11 +25,11 @@
   ],
   "repository": {
     "type": "git",
-    "url": "git+https://gitlab-238.ichiayi.com/jonathan/lancedb-opencode-pro.git"
+    "url": "git+https://github.com/tryweb/lancedb-opencode-pro.git"
   },
-  "homepage": "https://gitlab-238.ichiayi.com/jonathan/lancedb-opencode-pro",
+  "homepage": "https://github.com/tryweb/lancedb-opencode-pro",
   "bugs": {
-    "url": "https://gitlab-238.ichiayi.com/jonathan/lancedb-opencode-pro/-/issues"
+    "url": "https://github.com/tryweb/lancedb-opencode-pro/issues"
   },
   "license": "MIT",
   "engines": {
package/LICENSE DELETED
@@ -1,21 +0,0 @@
1
- MIT License
2
-
3
- Copyright (c) 2026 Jonathan Tsai <tryweb@ichiayi.com>
4
-
5
- Permission is hereby granted, free of charge, to any person obtaining a copy
6
- of this software and associated documentation files (the "Software"), to deal
7
- in the Software without restriction, including without limitation the rights
8
- to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9
- copies of the Software, and to permit persons to whom the Software is
10
- furnished to do so, subject to the following conditions:
11
-
12
- The above copyright notice and this permission notice shall be included in all
13
- copies or substantial portions of the Software.
14
-
15
- THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16
- IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17
- FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18
- AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19
- LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20
- OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21
- SOFTWARE.