@jussmor/commit-memory-mcp 0.3.5 → 0.3.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,60 +1,225 @@
  # @jussmor/commit-memory-mcp

- Local commit-aware RAG package powered by sqlite-vec with MCP tool endpoints.
+ A local MCP server for PR intelligence, author tracing, and worktree-aware async planning.

- ## Features
+ ## Purpose

- - Indexes git commits across branches into commit-file chunks
- - Embeds chunks using a local embedding source (Ollama when configured)
- - Stores vectors in sqlite-vec (SQLite)
- - Exposes MCP tools for agent workflows
+ This package helps agents answer:

- ## MCP tools
+ - who changed a file or area
+ - why a change was made
+ - what landed on main recently
+ - what context should be reviewed before planning
+
+ ## Current MCP tools
+
+ - `sync_pr_context`
+ - `build_context_pack`
+ - `promote_context_facts`
+ - `archive_feature_context`
+ - `list_active_worktrees`
+ - `who_changed_this`
+ - `why_was_this_changed`
+ - `get_main_branch_overnight_brief`
+ - `resume_feature_session_brief`
+ - `pre_plan_sync_brief`
+
+ ## Removed tools
+
+ The following legacy tools were removed:

  - `search_related_commits`
  - `explain_commit_match`
  - `get_commit_diff`
+ - `reindex_commits`

  ## Quick start

  ```bash
  npm install @jussmor/commit-memory-mcp
- npx commit-memory-index --repo /path/to/repo --db /path/to/repo/.commit-rag.db --limit 400
  npx commit-memory-mcp
  ```

- ## Publish
+ For local development in this repository:

  ```bash
+ cd packages/commit-rag-mcp
+ npm install
  npm run build
- npm publish --access public
- mcp-publisher login github
- mcp-publisher publish
+ node dist/mcp/server.js
+ ```
+
+ ## Requirements
+
+ - Node.js 20+
+ - `gh` CLI authenticated for GitHub PR sync features
+ - a git repository available through `COMMIT_RAG_REPO`
+
+ ## Environment variables
+
+ - `COMMIT_RAG_REPO` repository path used by the MCP server
+ - `COMMIT_RAG_DB` SQLite database path
+ - `COMMIT_RAG_LIMIT` default sync/query limit
+ - `OLLAMA_BASE_URL` optional Ollama base URL
+ - `OLLAMA_EMBED_MODEL` optional embedding model
+ - `COPILOT_TOKEN` optional Copilot reranking token
+ - `COPILOT_MODEL` optional Copilot model override
+ - `COPILOT_BASE_URL` optional Copilot API base URL
+
+ ## Use cases
+
+ ### Sync GitHub PR context
+
+ ```text
+ sync_pr_context({
+   owner: "MaxwellClinic-Development",
+   repo: "EverBetter-Pro",
+   domain: "billing",
+   feature: "invoice-retry",
+   branch: "feat/invoice-retry",
+   taskType: "planning",
+   limit: 20
+ })
+ ```
+
+ Use this before planning when you need fresh PR descriptions, comments, and reviews.
+
+ ### List active worktrees
+
+ ```text
+ list_active_worktrees({
+   baseBranch: "main"
+ })
+ ```
+
+ Use this when your team works on multiple features in parallel and wants session-aware context.
+
+ ### Build a scoped context pack
+
+ ```text
+ build_context_pack({
+   domain: "billing",
+   feature: "invoice-retry",
+   branch: "feat/invoice-retry",
+   taskType: "coding",
+   includeDraft: false,
+   limit: 12
+ })
+ ```
+
+ Use this before invoking a coding subagent to keep prompts small and focused.
+
+ ### Promote draft facts after review
+
+ ```text
+ promote_context_facts({
+   domain: "billing",
+   feature: "invoice-retry",
+   branch: "feat/invoice-retry"
+ })
+ ```
+
+ Use this when discussion outcomes are approved and should become durable context.
+
+ ### Archive completed feature context
+
+ ```text
+ archive_feature_context({
+   domain: "billing",
+   feature: "invoice-retry"
+ })
+ ```
+
+ Use this after merge/closure to prevent active context bloat.
+
+ ### Find ownership for a file
+
+ ```text
+ who_changed_this({
+   filePath: "src/features/auth/session.ts",
+   limit: 15
+ })
+ ```
+
+ Use this to discover recent authors and commit history for a target file.
+
+ ### Explain intent for a change
+
+ ```text
+ why_was_this_changed({
+   filePath: "src/features/auth/session.ts",
+   owner: "MaxwellClinic-Development",
+   repo: "EverBetter-Pro"
+ })
+ ```
+
+ Use this to combine commit metadata with synced PR context.
+
+ ### Get an overnight main-branch brief
+
+ ```text
+ get_main_branch_overnight_brief({
+   baseBranch: "main",
+   sinceHours: 12,
+   limit: 25
+ })
+ ```
+
+ Use this at the start of the day to review what landed while you were offline.
+
+ ### Resume a feature worktree
+
+ ```text
+ resume_feature_session_brief({
+   worktreePath: "/path/to/worktree",
+   baseBranch: "main"
+ })
+ ```
+
+ Use this before resuming unfinished work in a separate worktree.
+
+ ### Run the full pre-plan sync flow
+
+ ```text
+ pre_plan_sync_brief({
+   owner: "MaxwellClinic-Development",
+   repo: "EverBetter-Pro",
+   baseBranch: "main",
+   worktreePath: "/path/to/worktree",
+   sinceHours: 12,
+   limit: 25
+ })
  ```

- ## VS Code MCP registration
+ Use this as the default entrypoint for async team planning.
+
+ ## Multi-session git worktree flow

- Copy `mcp.config.example.json` entries into your user MCP config and adjust paths/env values.
+ For parallel AI coding sessions:

- For MCP Registry publication, keep `package.json` `mcpName` and `server.json` `name` in sync.
+ 1. Create one git worktree per feature branch.
+ 2. Use `list_active_worktrees` to enumerate active sessions.
+ 3. Use `resume_feature_session_brief` per worktree to check divergence and overlap risks.
+ 4. Generate a worktree-specific `build_context_pack` and hand it to the target subagent.

- ## Environment
+ This pattern avoids one giant shared context and scales better as features grow.

- - `COMMIT_RAG_REPO` default repository path for MCP
- - `COMMIT_RAG_DB` sqlite db path
- - `COMMIT_RAG_LIMIT` max commits to index per run
- - `OLLAMA_BASE_URL` local ollama URL (default `http://127.0.0.1:11434`)
- - `OLLAMA_EMBED_MODEL` local embedding model name
+ ## Data model overview

- If `OLLAMA_EMBED_MODEL` is not set, the package uses deterministic local fallback embeddings.
+ The package stores local context for:

- ### Copilot LLM reranking (optional)
+ - commits
+ - pull requests
+ - PR comments
+ - PR reviews
+ - promoted decision/blocker summaries
+ - worktree session checkpoints

- Set `COPILOT_TOKEN` to a GitHub token with Copilot access to enable LLM-based reranking.
- After initial vector/keyword retrieval, results are sent to Copilot for semantic scoring and re-sorted.
+ ## Publishing

- - `COPILOT_TOKEN` GitHub PAT or token with Copilot access (enables reranking)
- - `COPILOT_MODEL` model slug (default: `gpt-4o-mini`, supports `claude-sonnet-4-5`, `gpt-4o`, etc.)
- - `COPILOT_BASE_URL` API base URL (default: `https://api.githubcopilot.com`)
+ ```bash
+ npm run build
+ npm publish --access public
+ ```

- Reranking works alongside or instead of Ollama no embedding model required.
+ For MCP Registry publication, keep `package.json` `mcpName` and `server.json` `name` aligned.
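The new environment-variable list above no longer states defaults; the previous README documented `http://127.0.0.1:11434` for `OLLAMA_BASE_URL`, `gpt-4o-mini` for `COPILOT_MODEL`, and `https://api.githubcopilot.com` for `COPILOT_BASE_URL`. A minimal sketch of resolving these variables, with the remaining fallbacks (`repo`, `dbPath`, `limit`) invented purely for illustration, not taken from the package:

```javascript
// Hypothetical config resolution for the documented environment variables.
// The Ollama/Copilot defaults come from the previous README; the repo, dbPath,
// and limit fallbacks are illustrative assumptions, not the package's code.
function resolveConfig(env) {
  return {
    repo: env.COMMIT_RAG_REPO ?? process.cwd(),
    dbPath: env.COMMIT_RAG_DB ?? ".commit-rag.db",
    limit: Number(env.COMMIT_RAG_LIMIT ?? 25),
    ollamaBaseUrl: env.OLLAMA_BASE_URL ?? "http://127.0.0.1:11434",
    ollamaEmbedModel: env.OLLAMA_EMBED_MODEL, // unset => fallback embeddings
    copilotToken: env.COPILOT_TOKEN,          // unset => no LLM reranking
    copilotModel: env.COPILOT_MODEL ?? "gpt-4o-mini",
    copilotBaseUrl: env.COPILOT_BASE_URL ?? "https://api.githubcopilot.com",
  };
}
```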
@@ -1,7 +1,32 @@
  import Database from "better-sqlite3";
- import type { CommitChunk } from "../types.js";
+ import type { CommitChunk, ContextFactRecord, ContextPackRecord, PullRequestCommentRecord, PullRequestDecisionRecord, PullRequestRecord, PullRequestReviewRecord, WorktreeSessionRecord } from "../types.js";
  export type RagDatabase = Database.Database;
  export declare function openDatabase(dbPath: string): RagDatabase;
  export declare function hasChunk(db: RagDatabase, chunkId: string): boolean;
  export declare function upsertChunk(db: RagDatabase, chunk: CommitChunk, embedding: number[]): void;
  export declare function touchIndexState(db: RagDatabase): void;
+ export declare function upsertPullRequest(db: RagDatabase, pr: PullRequestRecord): void;
+ export declare function replacePullRequestComments(db: RagDatabase, repoOwner: string, repoName: string, prNumber: number, comments: PullRequestCommentRecord[]): void;
+ export declare function replacePullRequestReviews(db: RagDatabase, repoOwner: string, repoName: string, prNumber: number, reviews: PullRequestReviewRecord[]): void;
+ export declare function replacePullRequestDecisions(db: RagDatabase, repoOwner: string, repoName: string, prNumber: number, decisions: PullRequestDecisionRecord[]): void;
+ export declare function touchPullRequestSyncState(db: RagDatabase, repoOwner: string, repoName: string): void;
+ export declare function upsertWorktreeSession(db: RagDatabase, session: WorktreeSessionRecord): void;
+ export declare function upsertContextFact(db: RagDatabase, fact: ContextFactRecord): void;
+ export declare function promoteContextFacts(db: RagDatabase, options: {
+     domain?: string;
+     feature?: string;
+     branch?: string;
+     sourceType?: string;
+ }): number;
+ export declare function buildContextPack(db: RagDatabase, options: {
+     domain?: string;
+     feature?: string;
+     branch?: string;
+     taskType?: string;
+     includeDraft?: boolean;
+     limit: number;
+ }): ContextPackRecord[];
+ export declare function archiveFeatureContext(db: RagDatabase, options: {
+     domain: string;
+     feature: string;
+ }): number;
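The `promoteContextFacts` declaration takes optional scope filters and returns the number of affected rows. Its filtering semantics can be sketched over a plain array (an illustrative stand-in for the SQL `UPDATE`, not the package's implementation):

```javascript
// Sketch of the scope filtering promoteContextFacts declares: only facts with
// status "draft" matching every provided filter are promoted. Array stand-in
// for illustration; the real function runs a SQL UPDATE.
function promoteDraftFacts(facts, { domain, feature, branch, sourceType } = {}) {
  let promoted = 0;
  for (const fact of facts) {
    if (fact.status !== "draft") continue;
    if (domain && fact.domain !== domain) continue;
    if (feature && fact.feature !== feature) continue;
    if (branch && fact.branch !== branch) continue;
    if (sourceType && fact.sourceType !== sourceType) continue;
    fact.status = "promoted";
    promoted += 1;
  }
  return promoted; // mirrors the declared `number` return: count of rows changed
}
```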
package/dist/db/client.js CHANGED
@@ -39,6 +39,114 @@ export function openDatabase(dbPath) {
      id INTEGER PRIMARY KEY CHECK (id = 1),
      last_indexed_at TEXT NOT NULL
    );
+
+   CREATE TABLE IF NOT EXISTS prs (
+     repo_owner TEXT NOT NULL,
+     repo_name TEXT NOT NULL,
+     pr_number INTEGER NOT NULL,
+     title TEXT NOT NULL,
+     body TEXT NOT NULL,
+     author TEXT NOT NULL,
+     state TEXT NOT NULL,
+     created_at TEXT NOT NULL,
+     updated_at TEXT NOT NULL,
+     merged_at TEXT,
+     url TEXT NOT NULL,
+     PRIMARY KEY (repo_owner, repo_name, pr_number)
+   );
+
+   CREATE TABLE IF NOT EXISTS pr_comments (
+     id TEXT PRIMARY KEY,
+     repo_owner TEXT NOT NULL,
+     repo_name TEXT NOT NULL,
+     pr_number INTEGER NOT NULL,
+     author TEXT NOT NULL,
+     body TEXT NOT NULL,
+     created_at TEXT NOT NULL,
+     updated_at TEXT NOT NULL,
+     url TEXT NOT NULL,
+     FOREIGN KEY (repo_owner, repo_name, pr_number)
+       REFERENCES prs(repo_owner, repo_name, pr_number)
+   );
+
+   CREATE TABLE IF NOT EXISTS pr_reviews (
+     id TEXT PRIMARY KEY,
+     repo_owner TEXT NOT NULL,
+     repo_name TEXT NOT NULL,
+     pr_number INTEGER NOT NULL,
+     author TEXT NOT NULL,
+     state TEXT NOT NULL,
+     body TEXT NOT NULL,
+     submitted_at TEXT NOT NULL,
+     FOREIGN KEY (repo_owner, repo_name, pr_number)
+       REFERENCES prs(repo_owner, repo_name, pr_number)
+   );
+
+   CREATE TABLE IF NOT EXISTS pr_decisions (
+     id TEXT PRIMARY KEY,
+     repo_owner TEXT NOT NULL,
+     repo_name TEXT NOT NULL,
+     pr_number INTEGER NOT NULL,
+     source TEXT NOT NULL,
+     author TEXT NOT NULL,
+     summary TEXT NOT NULL,
+     severity TEXT NOT NULL,
+     created_at TEXT NOT NULL,
+     FOREIGN KEY (repo_owner, repo_name, pr_number)
+       REFERENCES prs(repo_owner, repo_name, pr_number)
+   );
+
+   CREATE TABLE IF NOT EXISTS pr_sync_state (
+     repo_owner TEXT NOT NULL,
+     repo_name TEXT NOT NULL,
+     last_synced_at TEXT NOT NULL,
+     PRIMARY KEY (repo_owner, repo_name)
+   );
+
+   CREATE TABLE IF NOT EXISTS worktree_sessions (
+     path TEXT PRIMARY KEY,
+     branch TEXT NOT NULL,
+     base_branch TEXT NOT NULL,
+     last_synced_at TEXT NOT NULL
+   );
+
+   CREATE TABLE IF NOT EXISTS context_facts (
+     id TEXT PRIMARY KEY,
+     source_type TEXT NOT NULL,
+     source_ref TEXT NOT NULL,
+     scope_domain TEXT NOT NULL,
+     scope_feature TEXT NOT NULL,
+     scope_branch TEXT NOT NULL,
+     scope_task_type TEXT NOT NULL,
+     title TEXT NOT NULL,
+     content TEXT NOT NULL,
+     priority REAL NOT NULL,
+     confidence REAL NOT NULL,
+     status TEXT NOT NULL,
+     created_at TEXT NOT NULL,
+     updated_at TEXT NOT NULL
+   );
+
+   CREATE INDEX IF NOT EXISTS idx_context_scope
+     ON context_facts(scope_domain, scope_feature, scope_branch, scope_task_type, status, updated_at);
+
+   CREATE TABLE IF NOT EXISTS context_fact_archive (
+     id TEXT PRIMARY KEY,
+     source_type TEXT NOT NULL,
+     source_ref TEXT NOT NULL,
+     scope_domain TEXT NOT NULL,
+     scope_feature TEXT NOT NULL,
+     scope_branch TEXT NOT NULL,
+     scope_task_type TEXT NOT NULL,
+     title TEXT NOT NULL,
+     content TEXT NOT NULL,
+     priority REAL NOT NULL,
+     confidence REAL NOT NULL,
+     status TEXT NOT NULL,
+     created_at TEXT NOT NULL,
+     updated_at TEXT NOT NULL,
+     archived_at TEXT NOT NULL
+   );
  `);
  return db;
  }
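The `prs` table keys rows on `(repo_owner, repo_name, pr_number)`, and the upsert path below relies on `ON CONFLICT` over that composite key: insert a new row, or replace every non-key column of the existing one. The semantics can be sketched with a keyed map (an illustration only, not the package's storage code):

```javascript
// Sketch of ON CONFLICT upsert semantics over the prs composite key
// (repo_owner, repo_name, pr_number), using a Map as a stand-in table.
function makePrStore() {
  const rows = new Map();
  const key = (pr) => `${pr.repoOwner}/${pr.repoName}#${pr.number}`;
  return {
    upsert(pr) {
      // Insert, or replace every non-key column of the existing row.
      rows.set(key(pr), { ...pr });
    },
    get(owner, name, number) {
      return rows.get(`${owner}/${name}#${number}`);
    },
    size: () => rows.size,
  };
}
```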
@@ -90,3 +198,296 @@ export function touchIndexState(db) {
      last_indexed_at = excluded.last_indexed_at
  `).run(now);
  }
+ export function upsertPullRequest(db, pr) {
+   db.prepare(`
+     INSERT INTO prs (
+       repo_owner,
+       repo_name,
+       pr_number,
+       title,
+       body,
+       author,
+       state,
+       created_at,
+       updated_at,
+       merged_at,
+       url
+     )
+     VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
+     ON CONFLICT(repo_owner, repo_name, pr_number) DO UPDATE SET
+       title = excluded.title,
+       body = excluded.body,
+       author = excluded.author,
+       state = excluded.state,
+       created_at = excluded.created_at,
+       updated_at = excluded.updated_at,
+       merged_at = excluded.merged_at,
+       url = excluded.url
+   `).run(pr.repoOwner, pr.repoName, pr.number, pr.title, pr.body, pr.author, pr.state, pr.createdAt, pr.updatedAt, pr.mergedAt, pr.url);
+ }
+ export function replacePullRequestComments(db, repoOwner, repoName, prNumber, comments) {
+   db.prepare(`
+     DELETE FROM pr_comments
+     WHERE repo_owner = ? AND repo_name = ? AND pr_number = ?
+   `).run(repoOwner, repoName, prNumber);
+   const insert = db.prepare(`
+     INSERT INTO pr_comments (
+       id,
+       repo_owner,
+       repo_name,
+       pr_number,
+       author,
+       body,
+       created_at,
+       updated_at,
+       url
+     ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)
+   `);
+   for (const comment of comments) {
+     insert.run(comment.id, repoOwner, repoName, prNumber, comment.author, comment.body, comment.createdAt, comment.updatedAt, comment.url);
+   }
+ }
+ export function replacePullRequestReviews(db, repoOwner, repoName, prNumber, reviews) {
+   db.prepare(`
+     DELETE FROM pr_reviews
+     WHERE repo_owner = ? AND repo_name = ? AND pr_number = ?
+   `).run(repoOwner, repoName, prNumber);
+   const insert = db.prepare(`
+     INSERT INTO pr_reviews (
+       id,
+       repo_owner,
+       repo_name,
+       pr_number,
+       author,
+       state,
+       body,
+       submitted_at
+     ) VALUES (?, ?, ?, ?, ?, ?, ?, ?)
+   `);
+   for (const review of reviews) {
+     insert.run(review.id, repoOwner, repoName, prNumber, review.author, review.state, review.body, review.submittedAt);
+   }
+ }
+ export function replacePullRequestDecisions(db, repoOwner, repoName, prNumber, decisions) {
+   db.prepare(`
+     DELETE FROM pr_decisions
+     WHERE repo_owner = ? AND repo_name = ? AND pr_number = ?
+   `).run(repoOwner, repoName, prNumber);
+   const insert = db.prepare(`
+     INSERT INTO pr_decisions (
+       id,
+       repo_owner,
+       repo_name,
+       pr_number,
+       source,
+       author,
+       summary,
+       severity,
+       created_at
+     ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)
+   `);
+   for (const decision of decisions) {
+     insert.run(decision.id, repoOwner, repoName, prNumber, decision.source, decision.author, decision.summary, decision.severity, decision.createdAt);
+   }
+ }
+ export function touchPullRequestSyncState(db, repoOwner, repoName) {
+   db.prepare(`
+     INSERT INTO pr_sync_state (repo_owner, repo_name, last_synced_at)
+     VALUES (?, ?, ?)
+     ON CONFLICT(repo_owner, repo_name) DO UPDATE SET
+       last_synced_at = excluded.last_synced_at
+   `).run(repoOwner, repoName, new Date().toISOString());
+ }
+ export function upsertWorktreeSession(db, session) {
+   db.prepare(`
+     INSERT INTO worktree_sessions (path, branch, base_branch, last_synced_at)
+     VALUES (?, ?, ?, ?)
+     ON CONFLICT(path) DO UPDATE SET
+       branch = excluded.branch,
+       base_branch = excluded.base_branch,
+       last_synced_at = excluded.last_synced_at
+   `).run(session.path, session.branch, session.baseBranch, session.lastSyncedAt);
+ }
+ export function upsertContextFact(db, fact) {
+   db.prepare(`
+     INSERT INTO context_facts (
+       id,
+       source_type,
+       source_ref,
+       scope_domain,
+       scope_feature,
+       scope_branch,
+       scope_task_type,
+       title,
+       content,
+       priority,
+       confidence,
+       status,
+       created_at,
+       updated_at
+     ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
+     ON CONFLICT(id) DO UPDATE SET
+       source_type = excluded.source_type,
+       source_ref = excluded.source_ref,
+       scope_domain = excluded.scope_domain,
+       scope_feature = excluded.scope_feature,
+       scope_branch = excluded.scope_branch,
+       scope_task_type = excluded.scope_task_type,
+       title = excluded.title,
+       content = excluded.content,
+       priority = excluded.priority,
+       confidence = excluded.confidence,
+       status = excluded.status,
+       created_at = excluded.created_at,
+       updated_at = excluded.updated_at
+   `).run(fact.id, fact.sourceType, fact.sourceRef, fact.domain, fact.feature, fact.branch, fact.taskType, fact.title, fact.content, fact.priority, fact.confidence, fact.status, fact.createdAt, fact.updatedAt);
+ }
+ export function promoteContextFacts(db, options) {
+   const clauses = ["status = 'draft'"];
+   const params = [];
+   if (options.domain) {
+     clauses.push("scope_domain = ?");
+     params.push(options.domain);
+   }
+   if (options.feature) {
+     clauses.push("scope_feature = ?");
+     params.push(options.feature);
+   }
+   if (options.branch) {
+     clauses.push("scope_branch = ?");
+     params.push(options.branch);
+   }
+   if (options.sourceType) {
+     clauses.push("source_type = ?");
+     params.push(options.sourceType);
+   }
+   const sql = `
+     UPDATE context_facts
+     SET status = 'promoted', updated_at = ?
+     WHERE ${clauses.join(" AND ")}
+   `;
+   const now = new Date().toISOString();
+   const result = db.prepare(sql).run(now, ...params);
+   return Number(result.changes ?? 0);
+ }
+ export function buildContextPack(db, options) {
+   const clauses = [];
+   const params = [];
+   if (options.domain) {
+     clauses.push("scope_domain = ?");
+     params.push(options.domain);
+   }
+   if (options.feature) {
+     clauses.push("scope_feature = ?");
+     params.push(options.feature);
+   }
+   if (options.branch) {
+     clauses.push("scope_branch = ?");
+     params.push(options.branch);
+   }
+   if (options.taskType) {
+     clauses.push("scope_task_type IN (?, 'general')");
+     params.push(options.taskType);
+   }
+   if (options.includeDraft) {
+     clauses.push("status IN ('promoted', 'draft')");
+   }
+   else {
+     clauses.push("status = 'promoted'");
+   }
+   const where = clauses.length > 0 ? `WHERE ${clauses.join(" AND ")}` : "";
+   const sql = `
+     SELECT
+       id,
+       source_type,
+       source_ref,
+       title,
+       content,
+       scope_domain,
+       scope_feature,
+       scope_branch,
+       scope_task_type,
+       priority,
+       confidence,
+       status,
+       updated_at,
+       ((priority * 0.45) + (confidence * 0.35) +
+         CASE
+           WHEN scope_task_type = ? THEN 0.20
+           WHEN scope_task_type = 'general' THEN 0.10
+           ELSE 0.0
+         END) AS score
+     FROM context_facts
+     ${where}
+     ORDER BY score DESC, updated_at DESC
+     LIMIT ?
+   `;
+   const taskType = options.taskType ?? "general";
+   const rows = db.prepare(sql).all(taskType, ...params, options.limit);
+   return rows.map((row) => ({
+     id: String(row.id ?? ""),
+     sourceType: String(row.source_type ?? ""),
+     sourceRef: String(row.source_ref ?? ""),
+     title: String(row.title ?? ""),
+     content: String(row.content ?? ""),
+     domain: String(row.scope_domain ?? ""),
+     feature: String(row.scope_feature ?? ""),
+     branch: String(row.scope_branch ?? ""),
+     taskType: String(row.scope_task_type ?? ""),
+     priority: Number(row.priority ?? 0),
+     confidence: Number(row.confidence ?? 0),
+     score: Number(row.score ?? 0),
+     status: String(row.status ?? "promoted"),
+     updatedAt: String(row.updated_at ?? ""),
+   }));
+ }
+ export function archiveFeatureContext(db, options) {
+   const now = new Date().toISOString();
+   const tx = db.transaction(() => {
+     db.prepare(`
+       INSERT OR REPLACE INTO context_fact_archive (
+         id,
+         source_type,
+         source_ref,
+         scope_domain,
+         scope_feature,
+         scope_branch,
+         scope_task_type,
+         title,
+         content,
+         priority,
+         confidence,
+         status,
+         created_at,
+         updated_at,
+         archived_at
+       )
+       SELECT
+         id,
+         source_type,
+         source_ref,
+         scope_domain,
+         scope_feature,
+         scope_branch,
+         scope_task_type,
+         title,
+         content,
+         priority,
+         confidence,
+         'archived',
+         created_at,
+         updated_at,
+         ?
+       FROM context_facts
+       WHERE scope_domain = ? AND scope_feature = ?
+     `).run(now, options.domain, options.feature);
+     return db
+       .prepare(`
+         DELETE FROM context_facts
+         WHERE scope_domain = ? AND scope_feature = ?
+       `)
+       .run(options.domain, options.feature);
+   });
+   const result = tx();
+   return Number(result.changes ?? 0);
+ }
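The SQL in `buildContextPack` ranks facts with a blended score: priority weighted 0.45, confidence weighted 0.35, plus a task-type bonus of 0.20 for an exact match with the requested task type or 0.10 when the fact is scoped `'general'`. Restated as a standalone function for clarity (a sketch of the formula, not the package's code path):

```javascript
// The ranking formula from buildContextPack's SQL, restated in JavaScript:
//   score = priority * 0.45 + confidence * 0.35 + task-type bonus
// where the bonus is 0.20 for an exact task-type match, 0.10 for "general",
// and 0 otherwise.
function contextFactScore(fact, taskType) {
  const bonus =
    fact.taskType === taskType ? 0.2 :
    fact.taskType === "general" ? 0.1 : 0.0;
  return fact.priority * 0.45 + fact.confidence * 0.35 + bonus;
}
```

The bonus ordering means exact-task facts outrank `'general'` facts of equal priority and confidence, which matches the `scope_task_type IN (?, 'general')` filter that admits both kinds of fact.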