cozo-memory 1.1.2 → 1.1.3

This diff shows the changes between the published contents of cozo-memory 1.1.2 and 1.1.3 as they appear in the public registry.
package/README.md CHANGED
@@ -81,6 +81,10 @@ Now you can add the server to your MCP client (e.g. Claude Desktop).
 
  🧹 **Janitor Service** - LLM-backed automatic cleanup with hierarchical summarization, observation pruning, and **automated session compression**
 
+ 🗜️ **Context Compaction & Auto-Summarization (v2.2)** - Automatic and manual memory consolidation with progressive summarization and LLM-backed Executive Summaries
+
+ 🧠 **Fact Lifecycle Management (v2.1)** - Native "soft-deletion" via CozoDB Validity retraction; invalidated facts are hidden from current views but preserved in history for audit trails
+
  👤 **User Preference Profiling** - Persistent user preferences with automatic 50% search boost
 
  🔍 **Near-Duplicate Detection** - Automatic LSH-based deduplication to avoid redundancy
@@ -129,6 +133,10 @@ Now you can add the server to your MCP client (e.g. Claude Desktop).
  - **Data Integrity (Trigger Concept)**: Prevents invalid states like self-references in relationships (Self-Loops) directly at creation.
  - **Hierarchical Summarization**: The Janitor condenses old fragments into "Executive Summary" nodes to preserve the "Big Picture" long-term.
  - **User Preference Profiling**: A specialized `global_user_profile` entity stores persistent preferences (likes, work style), which receive a **50% score boost** in every search.
+ - **Fact Lifecycle Management (v2.1)**: Uses CozoDB's native **Validity** retraction mechanism to manage the lifecycle of information. Instead of destructive deletions, facts are invalidated by asserting a `[timestamp, false]` record (see the sketch after this list). This ensures:
+   1. **Auditability**: You can always "time-travel" back to see what the system knew at any given point.
+   2. **Consistency**: All standard retrieval (Search, Graph-RAG, Inference) uses the `@ "NOW"` filter to automatically exclude retracted facts.
+   3. **Atomic Retraction**: Invalidation can be part of a multi-statement transaction, allowing for clean "update" patterns (invalidate old + insert new).
  - **All Local**: Embeddings via Transformers/ONNX; no external embedding service required.
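The retraction mechanics can be reproduced in isolation with `cozo-node`. A minimal sketch, modeled on the validity test scripts bundled in this release (the `fact` relation, the in-memory engine, and the literal timestamps are illustrative):

```js
const { CozoDb } = require("cozo-node");

async function validityDemo() {
    const db = new CozoDb(); // defaults to the in-memory engine
    // A relation keyed by id plus a Validity column
    await db.run(`{:create fact {id: String, v: Validity => t: String}}`);
    // Assert the fact at t=1000
    await db.run(`?[id, v, t] <- [['id1', [1000, true], 'typeA']] :put fact {id, v => t}`);
    // Retract it at t=2000 by asserting a [timestamp, false] record
    await db.run(`?[id, v, t] <- [['id1', [2000, false], 'typeA']] :put fact {id, v => t}`);
    // Time-travel: the fact is still visible at t=1500 ...
    console.log((await db.run(`?[id, t] := *fact{id, t, @ 1500}`)).rows);  // [['id1', 'typeA']]
    // ... but excluded from the current view
    console.log((await db.run(`?[id, t] := *fact{id, t, @ "NOW"}`)).rows); // []
}

validityDemo();
```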
 
  ## Positioning & Comparison
@@ -523,10 +531,10 @@ The interface is reduced to **4 consolidated tools**. The concrete operation is
 
  | Tool | Purpose | Key Actions |
  |------|---------|-------------|
- | `mutate_memory` | Write operations | create_entity, update_entity, delete_entity, add_observation, create_relation, start_session, stop_session, start_task, stop_task, run_transaction, add_inference_rule, ingest_file |
+ | `mutate_memory` | Write operations | create_entity, update_entity, delete_entity, add_observation, create_relation, start_session, stop_session, start_task, stop_task, run_transaction, add_inference_rule, ingest_file, invalidate_observation, invalidate_relation |
  | `query_memory` | Read operations | search, advancedSearch, context, entity_details, history, graph_rag, graph_walking, agentic_search (Multi-Level Context support) |
  | `analyze_graph` | Graph analysis | explore, communities, pagerank, betweenness, hits, shortest_path, bridge_discovery, semantic_walk, infer_relations |
- | `manage_system` | Maintenance | health, metrics, export_memory, import_memory, snapshot_create, snapshot_list, snapshot_diff, cleanup, reflect, summarize_communities, clear_memory |
+ | `manage_system` | Maintenance | health, metrics, export_memory, import_memory, snapshot_create, snapshot_list, snapshot_diff, cleanup, reflect, summarize_communities, clear_memory, compact |
 
  ### mutate_memory (Write)
 
@@ -543,6 +551,8 @@ Actions:
  - `run_transaction`: `{ operations: Array<{ action, params }> }` **(New v1.2)**: Executes multiple operations atomically.
  - `add_inference_rule`: `{ name, datalog }`
  - `ingest_file`: `{ format, file_path?, content?, entity_id?, entity_name?, entity_type?, chunking?, metadata?, observation_metadata?, deduplicate?, max_observations? }`
+ - `invalidate_observation`: `{ observation_id }` **(New v2.1)**: Retracts an observation using Validity `[now, false]`.
+ - `invalidate_relation`: `{ from_id, to_id, relation_type }` **(New v2.1)**: Retracts a relationship using Validity `[now, false]` (usage sketch after this list).
  - `format` options: `"markdown"`, `"json"`, `"pdf"` **(New v1.9)**
  - `file_path`: Optional path to file on disk (alternative to `content` parameter)
  - `content`: File content as string (required if `file_path` not provided)
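A minimal sketch of the invalidate actions and the invalidate-then-insert update pattern, using the server API directly the way the bundled fact-lifecycle test does (entity names and observation texts are placeholders):

```js
const { MemoryServer } = require("./index"); // dist entry point

async function invalidateDemo() {
    const server = new MemoryServer();
    await server.initPromise;

    const entity = await server.createEntity({ name: "Demo Person", type: "Person" });
    const obs = await server.addObservation({ entity_id: entity.id, text: "Temporary fact." });

    // Soft-delete: hidden from @ "NOW" queries, preserved in history
    console.log(await server.invalidateObservation({ observation_id: obs.id }));

    // Clean "update" pattern, executed atomically: invalidate the old fact, insert the new one
    const stale = await server.addObservation({ entity_id: entity.id, text: "Outdated fact." });
    console.log(await server.runTransaction({
        operations: [
            { action: "invalidate_observation", params: { observation_id: stale.id } },
            { action: "add_observation", params: { entity_id: entity.id, text: "Corrected fact." } },
        ],
    }));
}

invalidateDemo();
```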
@@ -728,6 +738,10 @@ Actions:
  - `snapshot_list`: `{}`
  - `snapshot_diff`: `{ snapshot_id_a, snapshot_id_b }`
  - `cleanup`: `{ confirm, older_than_days?, max_observations?, min_entity_degree?, model? }`
+ - `compact`: `{ session_id?, entity_id?, model? }` **(New v2.2)**: Manual context compaction. Supports three modes (see the example after this list):
+   - **Session Compaction**: `{ session_id, model? }` - Summarizes session observations into 2-3 bullet points and stores them in the user profile
+   - **Entity Compaction**: `{ entity_id, model? }` - Compacts an entity's observations once the threshold is exceeded and creates an Executive Summary
+   - **Global Compaction**: `{}` (no parameters) - Compacts all entities exceeding the threshold (default: 20 observations)
  - `summarize_communities`: `{ model?, min_community_size? }` **(New v2.0)**: Triggers the **Hierarchical GraphRAG** pipeline. Recomputes communities, generates thematic summaries via LLM, and stores them as `CommunitySummary` entities.
  - `reflect`: `{ entity_id?, mode?, model? }` Analyzes memory for contradictions and new insights. Supports `summary` (default) and `discovery` (autonomous link refinement) modes.
  - `clear_memory`: `{ confirm }`
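For illustration, the three `compact` modes map to these argument shapes (the IDs are placeholders):

```js
// Session mode: summarize one session into the user profile
const sessionMode = { action: "compact", session_id: "session-1234" };

// Entity mode: compact a single entity's observations
const entityMode = { action: "compact", entity_id: "entity-5678", model: "demyagent-4b-i1:Q6_K" };

// Global mode: compact every entity above the default threshold (20 observations)
const globalMode = { action: "compact" };
```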
@@ -738,6 +752,17 @@ Janitor Cleanup Details:
  - **Hierarchical Summarization**: Detects isolated or old observations, has them summarized by a local LLM (Ollama), and creates a new `ExecutiveSummary` node. Old fragments are deleted to reduce noise while preserving knowledge.
  - **Automated Session Compression**: Automatically identifies inactive sessions, summarizes their activity into a few bullet points, and stores the summary in the User Profile while marking the session as archived.
 
+ Context Compaction Details **(New v2.2)**:
+ - **Automatic Compaction**: Triggered automatically when an entity's observations exceed the threshold (default: 20)
+   - Runs in the background during `addObservation`
+   - Uses a lock mechanism to prevent concurrent compaction
+ - **Manual Compaction**: Available via the `compact` action in `manage_system` (see the sketch after this list)
+   - **Session Mode**: Summarizes session observations and stores the summary in `global_user_profile`
+   - **Entity Mode**: Compacts a specific entity with a custom threshold
+   - **Global Mode**: Compacts all entities exceeding the threshold
+ - **Progressive Summarization**: New observations are merged into the existing Executive Summary instead of simply appended
+ - **LLM Integration**: Uses Ollama (default model: `demyagent-4b-i1:Q6_K`) for intelligent summarization
+
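A minimal sketch of manual compaction through the server API, mirroring the compaction tests bundled in this release (the IDs and the low threshold are illustrative):

```js
const { MemoryServer } = require("./index");

async function compactionDemo() {
    const server = new MemoryServer();
    await server.initPromise;

    // Entity mode: keeps roughly threshold/2 recent observations and merges the
    // older ones into a single "ExecutiveSummary: ..." observation via Ollama
    console.log(await server.compactEntity({ entity_id: "entity-5678", threshold: 5 }));

    // Session mode: condenses the session into 2-3 bullet points stored on global_user_profile
    console.log(await server.compactSession({ session_id: "session-1234" }));
}

compactionDemo();
```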
  **Before Janitor:**
  ```
  Entity: Project X
@@ -0,0 +1,25 @@
+ "use strict";
+ Object.defineProperty(exports, "__esModule", { value: true });
+ const index_1 = require("./index");
+ async function targetedInspect() {
+ const s = new index_1.MemoryServer();
+ await s.initPromise;
+ const entRes = await s.db.run('?[id, name] := *entity{id, name, @ "NOW"}, name == "HeavyEntity"');
+ if (entRes.rows.length === 0) {
+ console.log("HeavyEntity not found.");
+ process.exit(0);
+ }
+ const eid = entRes.rows[0][0];
+ console.log(`Analyzing HeavyEntity (${eid}):`);
+ // 1. Current observations
+ const current = await s.db.run('?[id, text, meta] := *observation{entity_id: $eid, id, text, metadata: meta, @ "NOW"}', { eid });
+ console.log(`Current Count: ${current.rows.length}`);
+ current.rows.forEach((r) => console.log(`- [${r[0]}] ${r[1]} (Meta: ${JSON.stringify(r[2])})`));
+ // 2. All history (to see if things were invalidated)
+ const history = await s.db.run('?[id, text, cat] := *observation{entity_id: $eid, id, text, created_at: cat}', { eid });
+ console.log(`\nHistory Count: ${history.rows.length}`);
+ // Just a few history samples
+ history.rows.slice(0, 10).forEach((r) => console.log(`- [${r[0]}] ${r[1]} @ ${JSON.stringify(r[2])}`));
+ process.exit(0);
+ }
+ targetedInspect();
@@ -204,7 +204,7 @@ class HybridSearch {
  semanticCall += `, filter: ${hnswFilters.join(" && ")}`;
  }
  semanticCall += `}`;
- let bodyConstraints = [semanticCall, `*entity{id, name, type, metadata, created_at}`];
+ let bodyConstraints = [semanticCall, `*entity{id, name, type, metadata, created_at, @ "NOW"}`];
  if (metaJoins.length > 0) {
  bodyConstraints.push(...metaJoins);
  }
@@ -232,10 +232,10 @@ class HybridSearch {
  `rank_val[id, r] := *entity{id, @ "NOW"}, not *entity_rank{entity_id: id}, r = 0.0`
  ];
  if (graphConstraints?.requiredRelations && graphConstraints.requiredRelations.length > 0) {
- helperRules.push(`rel_match[id, rel_type] := *relationship{from_id: id, relation_type: rel_type}`, `rel_match[id, rel_type] := *relationship{to_id: id, relation_type: rel_type}`);
+ helperRules.push(`rel_match[id, rel_type] := *relationship{from_id: id, relation_type: rel_type, @ "NOW"}`, `rel_match[id, rel_type] := *relationship{to_id: id, relation_type: rel_type, @ "NOW"}`);
  }
  if (graphConstraints?.targetEntityIds && graphConstraints.targetEntityIds.length > 0) {
- helperRules.push(`target_match[id, target_id] := *relationship{from_id: id, to_id: target_id}`, `target_match[id, target_id] := *relationship{to_id: id, from_id: target_id}`);
+ helperRules.push(`target_match[id, target_id] := *relationship{from_id: id, to_id: target_id, @ "NOW"}`, `target_match[id, target_id] := *relationship{to_id: id, from_id: target_id, @ "NOW"}`);
  }
  const datalogQuery = [
  ...helperRules,
@@ -360,7 +360,7 @@ class HybridSearch {
 
  result_entities[id, final_score, depth] := path[seed_id, id, depth], seeds[seed_id, seed_score], rank_val[id, pr], final_score = seed_score * (1.0 - 0.2 * depth)
 
- ?[id, name, type, metadata, created_at, score, source, text] := result_entities[id, score, depth], *entity{id, name, type, metadata, created_at}, source = 'graph_rag_entity', text = ''
+ ?[id, name, type, metadata, created_at, score, source, text] := result_entities[id, score, depth], *entity{id, name, type, metadata, created_at, @ "NOW"}, source = 'graph_rag_entity', text = ''
 
  :sort -score
  :limit $limit
package/dist/index.js CHANGED
@@ -28,6 +28,7 @@ class MemoryServer {
  hybridSearch;
  inferenceEngine;
  initPromise;
+ compactionLocks = new Set();
  // Metrics tracking
  metrics = {
  operations: {
@@ -1422,13 +1423,13 @@ class MemoryServer {
  if (session_id) {
  try {
  await this.db.run(`
- ?[id, last_active, status, metadata] :=
- *session_state{id, metadata},
- id = $id,
+ ?[session_id, last_active, status, metadata] :=
+ *session_state{session_id, metadata},
+ session_id = $session_id,
  last_active = $now,
  status = "active"
- :update session_state {id => last_active, status, metadata}
- `, { id: session_id, now: Math.floor(now / 1000000) });
+ :update session_state {session_id => last_active, status, metadata}
+ `, { session_id: session_id, now: Math.floor(now / 1000000) });
  }
  catch (e) {
  console.warn(`[AddObservation] Session activity update failed for ${session_id}:`, e.message);
@@ -1437,6 +1438,12 @@ class MemoryServer {
  // Optional: Automatic inference after new observation (in background)
  const suggestionsRaw = await this.inferenceEngine.inferRelations(entityId);
  const suggestions = await this.formatInferredRelationsForContext(suggestionsRaw);
+ // Automatic Compaction (Threshold-based)
+ // Skip for compaction-generated observations to prevent recursive loop
+ const metadataKind = args.metadata?.kind;
+ if (metadataKind !== "compaction" && metadataKind !== "session_summary") {
+ this.compactEntity({ entity_id: entityId }).catch(e => console.error(`[AutoCompact] Error for ${entityId}:`, e));
+ }
  const created_at_iso = new Date(Math.floor(now / 1000)).toISOString();
  return {
  id,
@@ -1472,6 +1479,13 @@ class MemoryServer {
  }
  async stopSession(args) {
  await this.initPromise;
+ // Automatic Session Compaction
+ try {
+ await this.compactSession({ session_id: args.id });
+ }
+ catch (e) {
+ console.warn(`[AutoCompact] Session compaction failed for ${args.id}:`, e.message);
+ }
  await this.db.run(`
  ?[session_id, last_active, status, metadata] :=
  *session_state{session_id, last_active, metadata},
@@ -1796,9 +1810,9 @@ ids[id] <- $ids
  else {
  // Select top 5 entities with the most observations
  const res = await this.db.run(`
- ?[id, name, type, count(id)] :=
- *entity{id, name, type, @ "NOW"},
- *observation{entity_id: id, @ "NOW"}
+ ?[eid, ename, etype, count(oid)] :=
+ *entity{id: eid, name: ename, type: etype, @ "NOW"},
+ *observation{entity_id: eid, id: oid, @ "NOW"}
  `);
  entitiesToReflect = res.rows
  .map((r) => ({ id: String(r[0]), name: String(r[1]), type: String(r[2]), count: Number(r[3]) }))
@@ -1859,8 +1873,8 @@ If no special patterns are recognizable, answer with "No new insights".`;
  for (const candidate of candidates) {
  // Check if relationship already exists
  const existing = await this.db.run(`
- ?[count(from_id)] := *relationship{from_id, to_id, relation_type, @ "NOW"},
- from_id = $from, to_id = $to, relation_type = $rel
+ ?[count(fid)] := *relationship{from_id: fid, to_id: tid, relation_type: rel, @ "NOW"},
+ fid = $from, tid = $to, rel = $rel
  `, { from: candidate.from_id, to: candidate.to_id, rel: candidate.relation_type });
  if (Number(existing.rows[0][0]) > 0)
  continue;
@@ -2191,6 +2205,132 @@ ids[id] <- $ids
  return { error: error.message || "Error during ingest" };
  }
  }
+ async compactSession(args) {
+ await this.initPromise;
+ const model = args.model || "demyagent-4b-i1:Q6_K";
+ try {
+ // 1. Get all observations from this session
+ const obsRes = await this.db.run('?[text, ts] := *observation{session_id: $sid, text, created_at, @ "NOW"}, ts = to_int(created_at) :order ts', {
+ sid: args.session_id
+ });
+ if (obsRes.rows.length < 3) {
+ return { status: "skipped", reason: "Too few observations for session compaction" };
+ }
+ const observations = obsRes.rows.map((r) => r[0]);
+ const systemPrompt = `You are a memory compaction module. Summarize the following session observations into exactly 2-3 concise bullet points.
+ Focus only on core facts, preferences, or important events. Skip conversational filler.
+ Format:
+ - Bullet Point 1
+ - Bullet Point 2
+ (- Bullet Point 3)`;
+ const userPrompt = `Session ID: ${args.session_id}\n\nObservations:\n${observations.join("\n")}`;
+ const response = await this.callOllama(model, systemPrompt, userPrompt);
+ if (!response)
+ throw new Error("Ollama summary failed");
+ // 2. Add as preference/long-term info to user profile
+ await this.addObservation({
+ entity_id: exports.USER_ENTITY_ID,
+ text: `Session Summary (${args.session_id}): ${response}`,
+ metadata: { kind: "session_summary", session_id: args.session_id, model },
+ deduplicate: true
+ });
+ return { status: "session_compacted", session_id: args.session_id, summary: response };
+ }
+ catch (error) {
+ console.error(`[CompactSession] Error:`, error);
+ return { error: error.message };
+ }
+ }
+ async compactEntity(args) {
+ if (this.compactionLocks.has(args.entity_id))
+ return { status: "already_compacting" };
+ this.compactionLocks.add(args.entity_id);
+ await this.initPromise;
+ const threshold = args.threshold || 20;
+ const model = args.model || "demyagent-4b-i1:Q6_K";
+ try {
+ // 1. Check count of active observations
+ const countRes = await this.db.run('?[count(oid)] := *observation{entity_id: $eid, id: oid, @ "NOW"}', { eid: args.entity_id });
+ const currentCount = Number(countRes.rows[0][0]);
+ if (currentCount <= threshold) {
+ return { entity_id: args.entity_id, status: "ok", count: currentCount };
+ }
+ console.error(`[CompactEntity] Compacting ${args.entity_id} (Count: ${currentCount})`);
+ // 2. Get older observations (limit to currentCount - threshold/2 to keep some recent context)
+ const toKeep = Math.floor(threshold / 2);
+ const toCompact = currentCount - toKeep;
+ const obsRes = await this.db.run(`
+ ?[oid, txt, ts] := *observation{entity_id: $eid, id: oid, text: txt, created_at, @ "NOW"}, ts = to_int(created_at)
+ :order ts
+ :limit $limit
+ `, { eid: args.entity_id, limit: toCompact });
+ if (obsRes.rows.length === 0)
+ return { entity_id: args.entity_id, status: "no_older_obs_found" };
+ const idsToInvalidate = obsRes.rows.map((r) => String(r[0]));
+ const textsToCompact = obsRes.rows.map((r) => r[1]);
+ // 3. Check for existing ExecutiveSummary
+ const existingSummaryRes = await this.db.run('?[sid, stxt] := *observation{entity_id: $eid, text: stxt, id: sid, @ "NOW"}, regex_matches(stxt, "(?i).*(Executive\\s*Summary|ExecutiveSummary).*") :limit 1', { eid: args.entity_id });
+ let summaryText = "";
+ let oldSummaryId = "";
+ if (existingSummaryRes.rows.length > 0) {
+ oldSummaryId = String(existingSummaryRes.rows[0][0]);
+ summaryText = String(existingSummaryRes.rows[0][1]).replace(/Executive\s*Summary:\s*/i, "").replace(/ExecutiveSummary:\s*/i, "");
+ console.error(`[CompactEntity] Found existing summary to merge: ${oldSummaryId}`);
+ }
+ // 4. Summarize / Merge
+ const systemPrompt = `You are a memory compaction module. You are maintaining an "Executive Summary" for an entity.
+ Merge the NEW observations into the EXISTING summary (if present).
+ The summary should be dense, data-rich, and capture all relevant historical facts.
+ Keep it structured and concise (max 5-8 sentences).
+ Format MUST start with "ExecutiveSummary: " followed by the consolidated content.`;
+ const userPrompt = `Entity ID: ${args.entity_id}\nExisting Summary: ${summaryText || "None"}\n\nNew Info to integrate:\n${textsToCompact.join("\n")}`;
+ console.error(`[CompactEntity] Calling Ollama for ${idsToInvalidate.length} observations...`);
+ const newSummaryText = await this.callOllama(model, systemPrompt, userPrompt);
+ if (!newSummaryText)
+ throw new Error("Ollama compaction failed");
+ // 5. Atomic Update
+ // a) Invalidate old observations
+ for (const oid of idsToInvalidate) {
+ await this.invalidateObservation({ observation_id: oid });
+ }
+ // b) Invalidate old summary if exists
+ if (oldSummaryId) {
+ await this.invalidateObservation({ observation_id: oldSummaryId });
+ }
+ // c) Add new summary
+ await this.addObservation({
+ entity_id: args.entity_id,
+ text: newSummaryText,
+ metadata: { kind: "compaction", model, compacted_count: idsToInvalidate.length }
+ });
+ return { status: "entity_compacted", entity_id: args.entity_id, observations_compacted: idsToInvalidate.length };
+ }
+ catch (error) {
+ console.error(`[CompactEntity] Error for ${args.entity_id}:`, error);
+ return { error: error.message };
+ }
+ finally {
+ this.compactionLocks.delete(args.entity_id);
+ }
+ }
+ async callOllama(model, system, user) {
+ try {
+ const ollamaMod = await import("ollama");
+ const ollamaClient = ollamaMod?.default ?? ollamaMod;
+ const response = await ollamaClient.chat({
+ model,
+ messages: [
+ { role: "system", content: system },
+ { role: "user", content: user },
+ ],
+ });
+ return response?.message?.content?.trim?.() ?? null;
+ }
+ catch (e) {
+ console.error(`[OllamaCall] Error:`, e);
+ return null;
+ }
+ }
  async deleteEntity(args) {
  const startTime = Date.now();
  try {
@@ -2203,9 +2343,9 @@ ids[id] <- $ids
  }
  console.error(`[Delete] Entity found: ${entityRes.rows[0][0]}`);
  // 2. Count related data before deletion
- const obsCount = await this.db.run('?[count(id)] := *observation{id, entity_id, @ "NOW"}, entity_id = $id', { id: args.entity_id });
- const relOutCount = await this.db.run('?[count(from_id)] := *relationship{from_id, to_id, @ "NOW"}, from_id = $id', { id: args.entity_id });
- const relInCount = await this.db.run('?[count(to_id)] := *relationship{from_id, to_id, @ "NOW"}, to_id = $id', { id: args.entity_id });
+ const obsCount = await this.db.run('?[count(oid)] := *observation{id: oid, entity_id: $id, @ "NOW"}', { id: args.entity_id });
+ const relOutCount = await this.db.run('?[count(fid)] := *relationship{from_id: fid, to_id, @ "NOW"}, from_id = $id', { id: args.entity_id });
+ const relInCount = await this.db.run('?[count(tid)] := *relationship{from_id, to_id: tid, @ "NOW"}, to_id = $id', { id: args.entity_id });
  console.error(`[Delete] Related data: ${obsCount.rows[0][0]} observations, ${relOutCount.rows[0][0]} outgoing relations, ${relInCount.rows[0][0]} incoming relations`);
  // 3. Delete all related data in a transaction (block)
  console.error(`[Delete] Executing deletion transaction...`);
@@ -2243,6 +2383,71 @@ ids[id] <- $ids
  return { error: "Deletion failed", message: error.message };
  }
  }
+ async invalidateObservation(args) {
+ await this.initPromise;
+ const startTime = Date.now();
+ try {
+ const now = Date.now() * 1000;
+ // Invalidate by putting a retraction (assertive=false) at current time
+ const existing = await this.db.run(`?[oid, eid, sid, tid, txt, emb, meta] := *observation{id: oid, entity_id: eid, session_id: sid, task_id: tid, text: txt, embedding: emb, metadata: meta, @ "NOW"}, oid = $id`, { id: args.observation_id });
+ if (existing.rows.length === 0) {
+ return { error: "Observation not found or already invalidated" };
+ }
+ const row = existing.rows[0];
+ const params = {
+ id: row[0],
+ v: [now, false],
+ entity_id: row[1],
+ session_id: row[2],
+ task_id: row[3],
+ text: row[4],
+ embedding: row[5],
+ metadata: row[6]
+ };
+ await this.db.run(`
+ ?[id, created_at, entity_id, session_id, task_id, text, embedding, metadata] <- [[$id, $v, $entity_id, $session_id, $task_id, $text, $embedding, $metadata]]
+ :put observation {id, created_at => entity_id, session_id, task_id, text, embedding, metadata}
+ `, params);
+ this.trackOperation('invalidate_observation', startTime);
+ return { status: "Observation invalidated", id: args.observation_id, invalidated_at: now };
+ }
+ catch (error) {
+ console.error("[InvalidateObs] Error:", error);
+ this.trackError('invalidate_observation');
+ return { error: "Invalidation failed", message: error.message };
+ }
+ }
+ async invalidateRelationship(args) {
+ await this.initPromise;
+ const startTime = Date.now();
+ try {
+ const now = Date.now() * 1000;
+ const existing = await this.db.run(`?[f, t, type, strength, metadata] := *relationship{from_id: f, to_id: t, relation_type: type, strength, metadata, @ "NOW"}, f = $f, t = $t, type = $type`, { f: args.from_id, t: args.to_id, type: args.relation_type });
+ if (existing.rows.length === 0) {
+ return { error: "Relationship not found or already invalidated" };
+ }
+ const row = existing.rows[0];
+ const params = {
+ f: row[0],
+ t: row[1],
+ type: row[2],
+ v: [now, false],
+ strength: row[3],
+ metadata: row[4]
+ };
+ await this.db.run(`
+ ?[from_id, to_id, relation_type, created_at, strength, metadata] <- [[$f, $t, $type, $v, $strength, $metadata]]
+ :put relationship {from_id, to_id, relation_type, created_at => strength, metadata}
+ `, params);
+ this.trackOperation('invalidate_relationship', startTime);
+ return { status: "Relationship invalidated", from_id: args.from_id, to_id: args.to_id, relation_type: args.relation_type, invalidated_at: now };
+ }
+ catch (error) {
+ console.error("[InvalidateRel] Error:", error);
+ this.trackError('invalidate_relationship');
+ return { error: "Invalidation failed", message: error.message };
+ }
+ }
  async runTransaction(args) {
  await this.initPromise;
  try {
@@ -2444,6 +2649,46 @@ ids[id] <- $ids
  results.push({ action: "delete_entity", id: entity_id });
  break;
  }
+ case "invalidate_observation": {
+ const { observation_id } = params;
+ if (!observation_id) {
+ return { error: `Missing observation_id for invalidate_observation in operation ${i}` };
+ }
+ const now = Date.now() * 1000;
+ allParams[`inv_obs_id${suffix}`] = observation_id;
+ allParams[`inv_obs_v${suffix}`] = [now, false];
+ statements.push(`
+ {
+ ?[oid, cat, eid, sid, tid, txt, emb, meta] :=
+ *observation{id: oid, entity_id: eid, session_id: sid, task_id: tid, text: txt, embedding: emb, metadata: meta, @ "NOW"},
+ oid = $inv_obs_id${suffix}, cat = $inv_obs_v${suffix}
+ :put observation {id: oid, created_at: cat => entity_id: eid, session_id: sid, task_id: tid, text: txt, embedding: emb, metadata: meta}
+ }
+ `);
+ results.push({ action: "invalidate_observation", id: observation_id });
+ break;
+ }
+ case "invalidate_relation": {
+ const { from_id, to_id, relation_type } = params;
+ if (!from_id || !to_id || !relation_type) {
+ return { error: `Missing from_id, to_id, or relation_type for invalidate_relation in operation ${i}` };
+ }
+ const now = Date.now() * 1000;
+ allParams[`inv_rel_f${suffix}`] = from_id;
+ allParams[`inv_rel_t${suffix}`] = to_id;
+ allParams[`inv_rel_type${suffix}`] = relation_type;
+ allParams[`inv_rel_v${suffix}`] = [now, false];
+ statements.push(`
+ {
+ ?[f, t, type, cat, str, meta] :=
+ *relationship{from_id: f, to_id: t, relation_type: type, strength: str, metadata: meta, @ "NOW"},
+ f = $inv_rel_f${suffix}, t = $inv_rel_t${suffix}, type = $inv_rel_type${suffix}, cat = $inv_rel_v${suffix}
+ :put relationship {from_id: f, to_id: t, relation_type: type, created_at: cat => strength: str, metadata: meta}
+ }
+ `);
+ results.push({ action: "invalidate_relation", from_id, to_id, relation_type });
+ break;
+ }
  default:
  return { error: `Unknown operation: ${op.action}` };
  }
@@ -2623,6 +2868,16 @@ ids[id] <- $ids
  action: zod_1.z.literal("delete_entity"),
  entity_id: zod_1.z.string().describe("ID of the entity to delete"),
  }).passthrough(),
+ zod_1.z.object({
+ action: zod_1.z.literal("invalidate_observation"),
+ observation_id: zod_1.z.string().describe("ID of the observation to invalidate"),
+ }).passthrough(),
+ zod_1.z.object({
+ action: zod_1.z.literal("invalidate_relation"),
+ from_id: zod_1.z.string().describe("Source entity ID"),
+ to_id: zod_1.z.string().describe("Target entity ID"),
+ relation_type: zod_1.z.string().describe("Type of the relationship"),
+ }).passthrough(),
  zod_1.z.object({
  action: zod_1.z.literal("add_observation"),
  entity_id: zod_1.z.string().optional().describe("ID of the entity"),
@@ -2746,7 +3001,7 @@ ids[id] <- $ids
  ]);
  const MutateMemoryParameters = zod_1.z.object({
  action: zod_1.z
- .enum(["create_entity", "update_entity", "delete_entity", "add_observation", "create_relation", "run_transaction", "add_inference_rule", "ingest_file", "start_session", "stop_session", "start_task", "stop_task"])
+ .enum(["create_entity", "update_entity", "delete_entity", "add_observation", "create_relation", "run_transaction", "add_inference_rule", "ingest_file", "start_session", "stop_session", "start_task", "stop_task", "invalidate_observation", "invalidate_relation"])
  .describe("Action (determines which fields are required)"),
  name: zod_1.z.string().optional().describe("For create_entity (required) or add_inference_rule (required)"),
  type: zod_1.z.string().optional().describe("For create_entity (required)"),
@@ -2768,6 +3023,7 @@ ids[id] <- $ids
  relation_type: zod_1.z.string().optional().describe("For create_relation (required)"),
  strength: zod_1.z.number().min(0).max(1).optional().describe("Optional for create_relation"),
  metadata: MetadataSchema.optional().describe("Optional for create_entity/update_entity/add_observation/create_relation/ingest_file"),
+ observation_id: zod_1.z.string().optional().describe("For invalidate_observation (required)"),
  operations: zod_1.z.array(zod_1.z.object({
  action: zod_1.z.enum(["create_entity", "add_observation", "create_relation", "delete_entity"]),
  params: zod_1.z.any().describe("Parameters for the operation as an object")
@@ -2798,6 +3054,8 @@ Supported actions:
  - 'stop_session': Closes a session. Params: { id: string }.
  - 'start_task': Initializes a new task within a session. Params: { name: string, session_id?: string, metadata?: object }.
  - 'stop_task': Marks a task as completed. Params: { id: string }.
+ - 'invalidate_observation': Invalidates (soft-deletes) an observation at the current time. Params: { observation_id: string }.
+ - 'invalidate_relation': Invalidates (soft-deletes) a relationship at the current time. Params: { from_id: string, to_id: string, relation_type: string }.
 
  Validation: Invalid syntax or missing columns in inference rules will result in errors.`,
  parameters: MutateMemoryParameters,
@@ -2825,6 +3083,10 @@ Validation: Invalid syntax or missing columns in inference rules will result in
  return JSON.stringify(await this.createRelation(rest));
  if (action === "run_transaction")
  return JSON.stringify(await this.runTransaction(rest));
+ if (action === "invalidate_observation")
+ return JSON.stringify(await this.invalidateObservation(rest));
+ if (action === "invalidate_relation")
+ return JSON.stringify(await this.invalidateRelationship(rest));
  if (action === "delete_entity")
  return JSON.stringify(await this.deleteEntity({ entity_id: rest.entity_id }));
  if (action === "add_inference_rule")
@@ -3618,10 +3880,16 @@ Supported actions:
  model: zod_1.z.string().optional().default("demyagent-4b-i1:Q6_K"),
  min_community_size: zod_1.z.number().min(2).max(100).optional().default(3),
  }),
+ zod_1.z.object({
+ action: zod_1.z.literal("compact"),
+ session_id: zod_1.z.string().optional().describe("Session ID to compact"),
+ entity_id: zod_1.z.string().optional().describe("Entity ID to compact"),
+ model: zod_1.z.string().optional().default("demyagent-4b-i1:Q6_K"),
+ }),
  ]);
  const ManageSystemParameters = zod_1.z.object({
  action: zod_1.z
- .enum(["health", "metrics", "export_memory", "import_memory", "snapshot_create", "snapshot_list", "snapshot_diff", "cleanup", "reflect", "clear_memory", "summarize_communities"])
+ .enum(["health", "metrics", "export_memory", "import_memory", "snapshot_create", "snapshot_list", "snapshot_diff", "cleanup", "reflect", "clear_memory", "summarize_communities", "compact"])
  .describe("Action (determines which fields are required)"),
  format: zod_1.z.enum(["json", "markdown", "obsidian"]).optional().describe("Export format (for export_memory)"),
  includeMetadata: zod_1.z.boolean().optional().describe("Include metadata (for export_memory)"),
@@ -3764,9 +4032,9 @@ Supported actions:
  if (input.action === "snapshot_create") {
  try {
  // Optimization: Sequential execution and count aggregation instead of full fetch
- const entityResult = await this.db.run('?[count(id)] := *entity{id, @ "NOW"}');
- const obsResult = await this.db.run('?[count(id)] := *observation{id, @ "NOW"}');
- const relResult = await this.db.run('?[count(from_id)] := *relationship{from_id, to_id, @ "NOW"}');
+ const entityResult = await this.db.run('?[count(eid)] := *entity{id: eid, @ "NOW"}');
+ const obsResult = await this.db.run('?[count(oid)] := *observation{id: oid, @ "NOW"}');
+ const relResult = await this.db.run('?[count(fid)] := *relationship{from_id: fid, to_id, @ "NOW"}');
  const snapshot_id = (0, uuid_1.v4)();
  const counts = {
  entities: Number(entityResult.rows[0]?.[0] || 0),
@@ -3897,6 +4165,34 @@ Supported actions:
  return JSON.stringify({ error: error.message || "Error clearing memory" });
  }
  }
+ if (input.action === "compact") {
+ try {
+ if (input.session_id) {
+ return JSON.stringify(await this.compactSession({ session_id: input.session_id, model: input.model }));
+ }
+ else if (input.entity_id) {
+ return JSON.stringify(await this.compactEntity({ entity_id: input.entity_id, model: input.model }));
+ }
+ else {
+ // Compact all entities that exceed threshold
+ const res = await this.db.run(`
+ ?[eid, count(oid)] := *observation{entity_id: eid, id: oid, @ "NOW"}
+ `);
+ const threshold = 20;
+ const entitiesToCompact = res.rows
+ .filter((r) => Number(r[1]) > threshold)
+ .map((r) => String(r[0]));
+ const results = [];
+ for (const eid of entitiesToCompact) {
+ results.push(await this.compactEntity({ entity_id: eid, model: input.model }));
+ }
+ return JSON.stringify({ status: "global_compaction_completed", entities_processed: results.length, results });
+ }
+ }
+ catch (error) {
+ return JSON.stringify({ error: error.message || "Error during compaction" });
+ }
+ }
  return JSON.stringify({ error: "Unknown action" });
  },
  });
@@ -176,88 +176,116 @@ class InferenceEngine {
  * @param minSimilarity Minimum similarity for semantic jumps (0.0 - 1.0, Default: 0.7)
  */
  async semanticGraphWalk(startEntityId, maxDepth = 3, minSimilarity = 0.7) {
- try {
- // Get embedding of the start entity for the first semantic jump
- const entityRes = await this.db.run('?[embedding] := *entity{id: $id, embedding, @ "NOW"}', { id: startEntityId });
- if (entityRes.rows.length === 0)
+ // Limit max_depth to 2 to prevent database lock issues with complex queries
+ const safeMaxDepth = Math.min(maxDepth, 2);
+ if (maxDepth > 2) {
+ console.error(`[SemanticWalk] Limiting max_depth from ${maxDepth} to 2 to prevent database locks`);
+ }
+ // Retry logic with exponential backoff for database lock errors
+ const maxRetries = 3;
+ const baseDelay = 100; // ms
+ const timeout = 30000; // 30 second timeout
+ for (let attempt = 0; attempt < maxRetries; attempt++) {
+ try {
+ // Wrap in timeout promise
+ const result = await Promise.race([
+ this._executeSemanticWalk(startEntityId, safeMaxDepth, minSimilarity),
+ new Promise((_, reject) => setTimeout(() => reject(new Error('Semantic walk timeout')), timeout))
+ ]);
+ return result;
+ }
+ catch (e) {
+ const isLockError = e.message?.includes('database is locked') || e.message?.includes('code 5');
+ const isLastAttempt = attempt === maxRetries - 1;
+ if (isLockError && !isLastAttempt) {
+ const delay = baseDelay * Math.pow(2, attempt);
+ console.error(`[SemanticWalk] Database locked (attempt ${attempt + 1}/${maxRetries}), retrying in ${delay}ms...`);
+ await new Promise(resolve => setTimeout(resolve, delay));
+ continue;
+ }
+ console.error(`[SemanticWalk] Failed after ${attempt + 1} attempts:`, e.message);
  return [];
- const startEmbedding = entityRes.rows[0][0];
- // Recursive Datalog query
- // We avoid complex aggregation of strings in Datalog as this can cause errors.
- // Instead, we implicitly group by 'type' as well and filter later in JS.
- const query = `
- # 1. Start point
- path[id, depth, score, type] :=
- id = $startId,
- depth = 0,
- score = 1.0,
- type = 'start'
+ }
+ }
+ return [];
+ }
+ async _executeSemanticWalk(startEntityId, maxDepth, minSimilarity) {
+ // Get embedding of the start entity for the first semantic jump
+ // Optimized: Remove @ "NOW" validity check for better performance
+ const entityRes = await this.db.run('?[embedding] := *entity{id: $id, embedding}', { id: startEntityId });
+ if (entityRes.rows.length === 0)
+ return [];
+ const startEmbedding = entityRes.rows[0][0];
+ // Recursive Datalog query - Optimized for performance
+ // Removed @ "NOW" validity checks to reduce lock contention
+ const query = `
+ # 1. Start point
+ path[id, depth, score, type] :=
+ id = $startId,
+ depth = 0,
+ score = 1.0,
+ type = 'start'
 
- # 2. Recursion: Follow explicit relations
- path[next_id, new_depth, new_score, new_type] :=
- path[curr_id, depth, score, curr_type],
- depth < $maxDepth,
- *relationship{from_id: curr_id, to_id: next_id, relation_type, strength, @ "NOW"},
- new_depth = depth + 1,
- new_score = score * strength,
- new_type = if(curr_type == 'start', 'explicit', if(curr_type == 'explicit', 'explicit', 'mixed'))
+ # 2. Recursion: Follow explicit relations (optimized - no validity check)
+ path[next_id, new_depth, new_score, new_type] :=
+ path[curr_id, depth, score, curr_type],
+ depth < $maxDepth,
+ *relationship{from_id: curr_id, to_id: next_id, relation_type, strength},
+ new_depth = depth + 1,
+ new_score = score * strength,
+ new_type = if(curr_type == 'start', 'explicit', if(curr_type == 'explicit', 'explicit', 'mixed'))
 
- # 3. Recursion: Follow semantic similarity (via HNSW Index)
- path[next_id, new_depth, new_score, new_type] :=
- path[curr_id, depth, score, curr_type],
- depth < $maxDepth,
- *entity{id: curr_id, embedding: curr_emb, @ "NOW"}, # Load embedding
- # Search for the K nearest neighbors to the current embedding
- ~entity:semantic { id: next_id |
- query: curr_emb,
- k: 5,
- ef: 20,
- bind_distance: dist
- },
- next_id != curr_id, # No self-reference
- sim = 1.0 - dist,
- sim >= $minSim,
- new_depth = depth + 1,
- new_score = score * sim * 0.8, # Penalize semantic jumps slightly (damping)
- new_type = if(curr_type == 'start', 'semantic', if(curr_type == 'semantic', 'semantic', 'mixed'))
+ # 3. Recursion: Follow semantic similarity (optimized - no validity check)
+ path[next_id, new_depth, new_score, new_type] :=
+ path[curr_id, depth, score, curr_type],
+ depth < $maxDepth,
+ *entity{id: curr_id, embedding: curr_emb},
+ # Search for the K nearest neighbors to the current embedding
+ ~entity:semantic { id: next_id |
+ query: curr_emb,
+ k: 5,
+ ef: 20,
+ bind_distance: dist
+ },
+ next_id != curr_id,
+ sim = 1.0 - dist,
+ sim >= $minSim,
+ new_depth = depth + 1,
+ new_score = score * sim * 0.8,
+ new_type = if(curr_type == 'start', 'semantic', if(curr_type == 'semantic', 'semantic', 'mixed'))
 
- # Aggregate result (Grouping by ID and Type)
- ?[id, min_depth, max_score, type] :=
- path[id, d, s, type],
- id != $startId,
- min_depth = min(d),
- max_score = max(s)
- :limit 100
- `;
- const res = await this.db.run(query, {
- startId: startEntityId,
- maxDepth: maxDepth,
- minSim: minSimilarity
- });
- // Post-processing in JS: Select best path type per ID
- const bestPaths = new Map();
- for (const row of res.rows) {
- const [id, depth, score, type] = row;
- // Cozo sometimes returns arrays or raw values, ensure we have Strings/Numbers
- const cleanId = String(id);
- const cleanDepth = Number(depth);
- const cleanScore = Number(score);
- const cleanType = String(type);
- if (!bestPaths.has(cleanId) || cleanScore > bestPaths.get(cleanId).path_score) {
- bestPaths.set(cleanId, {
- entity_id: cleanId,
- distance: cleanDepth,
- path_score: cleanScore,
- path_type: cleanType
- });
- }
+ # Aggregate result (Grouping by ID and Type)
+ ?[id, min_depth, max_score, type] :=
+ path[id, d, s, type],
+ id != $startId,
+ min_depth = min(d),
+ max_score = max(s)
+ :limit 100
+ `;
+ const res = await this.db.run(query, {
+ startId: startEntityId,
+ maxDepth: maxDepth,
+ minSim: minSimilarity
+ });
+ // Post-processing in JS: Select best path type per ID
+ const bestPaths = new Map();
+ for (const row of res.rows) {
+ const [id, depth, score, type] = row;
+ // Cozo sometimes returns arrays or raw values, ensure we have Strings/Numbers
+ const cleanId = String(id);
+ const cleanDepth = Number(depth);
+ const cleanScore = Number(score);
+ const cleanType = String(type);
+ if (!bestPaths.has(cleanId) || cleanScore > bestPaths.get(cleanId).path_score) {
+ bestPaths.set(cleanId, {
+ entity_id: cleanId,
+ distance: cleanDepth,
+ path_score: cleanScore,
+ path_type: cleanType
+ });
  }
- return Array.from(bestPaths.values());
- }
- catch (e) {
- console.error("Semantic Graph Walk Failed:", e.message);
- return [];
  }
+ return Array.from(bestPaths.values());
  }
  /**
  * Analyzes the cluster structure directly on the HNSW graph (Layer 0).
@@ -0,0 +1,91 @@
+ "use strict";
+ Object.defineProperty(exports, "__esModule", { value: true });
+ const index_1 = require("./index");
+ const uuid_1 = require("uuid");
+ async function testCompaction() {
+ console.log("Starting Context Compaction Tests...");
+ const server = new index_1.MemoryServer();
+ await server.initPromise;
+ try {
+ // 1. Test Session Compaction
+ console.log("\n--- Testing Session Compaction ---");
+ const session = await server.startSession({ name: "Compaction Test Session" });
+ const sessionId = session.id;
+ console.log("Adding observations to session...");
+ await server.addObservation({
+ entity_name: "CompactionBot",
+ text: "User prefers dark mode for all interfaces.",
+ session_id: sessionId
+ });
+ await server.addObservation({
+ entity_name: "CompactionBot",
+ text: "User is a senior software engineer specialized in TypeScript.",
+ session_id: sessionId
+ });
+ await server.addObservation({
+ entity_name: "CompactionBot",
+ text: "User likes concise documentation with many examples.",
+ session_id: sessionId
+ });
+ console.log("Stopping session (should trigger compaction)...");
+ const stopResult = await server.stopSession({ id: sessionId });
+ console.log("Stop Result:", JSON.stringify(stopResult, null, 2));
+ // Check if summary exists in global_user_profile
+ const profileObs = await server.db.run('?[text] := *observation{entity_id: "global_user_profile", text, metadata, @ "NOW"}, regex_matches(text, ".*Session Summary.*")');
+ console.log(`Found ${profileObs.rows.length} session summaries in profile.`);
+ if (profileObs.rows.length > 0) {
+ console.log("Latest Summary:", profileObs.rows[0][0]);
+ }
+ // 2. Test Entity Compaction (Threshold-based)
+ console.log("\n--- Testing Entity Compaction ---");
+ const entityName = `HeavyEntity_${(0, uuid_1.v4)().substring(0, 8)}`;
+ const createRes = await server.createEntity({ name: entityName, type: "Test" });
+ const entityId = createRes.id;
+ console.log(`Adding 25 observations to ${entityName} (Threshold is 20)...`);
+ for (let i = 1; i <= 25; i++) {
+ process.stdout.write(`.`);
+ await server.addObservation({
+ entity_id: entityId,
+ text: `Fact number ${i}: This is a piece of information about the heavy entity that needs to be compacted eventually.`,
+ deduplicate: false
+ });
+ }
+ console.log("\nDone adding observations.");
+ // The last few should have triggered compaction in background
+ console.log("Waiting for background compaction (30s for Ollama)...");
+ await new Promise(resolve => setTimeout(resolve, 30000));
+ // Check observation count
+ const countRes = await server.db.run('?[count(oid)] := *observation{entity_id: $eid, id: oid, @ "NOW"}', { eid: entityId });
+ const finalCount = Number(countRes.rows[0][0]);
+ console.log(`Final observation count for ${entityName}: ${finalCount}`);
+ // Check for ExecutiveSummary (Ollama might use bold, lowercase, or slightly different labels)
+ const summaryRes = await server.db.run('?[text] := *observation{entity_id: $eid, text, @ "NOW"}, regex_matches(text, "(?i).*(Executive\\\\s*Summary|Zusammenfassung|ExecutiveSummary).*")', { eid: entityId });
+ console.log(`Found ${summaryRes.rows.length} ExecutiveSummaries.`);
+ if (summaryRes.rows.length > 0) {
+ console.log("Executive Summary Content Preview:", summaryRes.rows[0][0].substring(0, 100) + "...");
+ }
+ else if (finalCount > 0) {
+ // Debug: Print what we actually have
+ const allObs = await server.db.run('?[text] := *observation{entity_id: $eid, text, @ "NOW"}', { eid: entityId });
+ console.log("Actual observations found for entity:");
+ allObs.rows.forEach((row, i) => {
+ console.log(`[${i}] ${row[0].substring(0, 200)}...`);
+ });
+ }
+ // 3. Test Manual Compaction via direct call (bypass MCP wrapper for test)
+ console.log("\n--- Testing Manual Compaction ---");
+ const manageResult = await server.compactEntity({
+ entity_id: entityId,
+ threshold: 2 // force compaction on remaining context
+ });
+ console.log("Manual Compact Result:", JSON.stringify(manageResult, null, 2));
+ }
+ catch (error) {
+ console.error("Test failed:", error);
+ }
+ finally {
+ process.exit(0);
+ }
+ }
+ testCompaction();
@@ -0,0 +1,82 @@
+ "use strict";
+ var __importDefault = (this && this.__importDefault) || function (mod) {
+ return (mod && mod.__esModule) ? mod : { "default": mod };
+ };
+ Object.defineProperty(exports, "__esModule", { value: true });
+ const index_1 = require("./index");
+ const fs_1 = __importDefault(require("fs"));
+ async function testFactLifecycle() {
+ console.log("=== Testing Fact Lifecycle Management ===");
+ const dbPath = "test-fact-lifecycle"; // .db is added by constructor for sqlite
+ if (fs_1.default.existsSync(dbPath + ".db")) {
+ try {
+ fs_1.default.unlinkSync(dbPath + ".db");
+ }
+ catch (e) { }
+ }
+ const server = new index_1.MemoryServer(dbPath);
+ await server.initPromise;
+ try {
+ // 1. Create an entity
+ console.log("\n1. Creating entity...");
+ const entityRes = await server.createEntity({ name: "Test Entity", type: "Person", metadata: { age: 30 } });
+ const entityId = entityRes.id;
+ console.log("Entity ID:", entityId);
+ // 2. Add an observation
+ console.log("\n2. Adding observation...");
+ const obsRes = await server.addObservation({ entity_id: entityId, text: "This is a temporary fact." });
+ const obsId = obsRes.id;
+ console.log("Observation ID:", obsId);
+ // 3. Verify observation exists
+ console.log("\n3. Verifying observation exists...");
+ let checkObs = await server.db.run(`?[id, text] := *observation{id, text, @ "NOW"}, id = $id`, { id: obsId });
+ console.log("Observation found (NOW):", checkObs.rows.length === 1);
+ // 4. Invalidate observation
+ console.log("\n4. Invalidating observation...");
+ const invRes = await server.invalidateObservation({ observation_id: obsId });
+ console.log("Invalidation result:", invRes);
+ // 5. Verify observation is gone from "NOW"
+ console.log("\n5. Verifying observation is gone (NOW)...");
+ checkObs = await server.db.run(`?[id, text] := *observation{id, text, @ "NOW"}, id = $id`, { id: obsId });
+ console.log("Observation found (NOW) after invalidation:", checkObs.rows.length === 1);
+ // 6. Verify observation still exists in history
+ console.log("\n6. Verifying observation exists in history...");
+ const histObs = await server.db.run(`?[id, text, v] := *observation{id, text, created_at: v}, id = $id`, { id: obsId });
+ console.log("History rows:", histObs.rows.length);
+ console.log("History data:", JSON.stringify(histObs.rows));
+ // 7. Test Relation Invalidation
+ console.log("\n7. Testing Relation Invalidation...");
+ const entity2Res = await server.createEntity({ name: "Other Entity", type: "Project" });
+ const entity2Id = entity2Res.id;
+ await server.createRelation({ from_id: entityId, to_id: entity2Id, relation_type: "works_on" });
+ console.log("Checking relation exists...");
+ let checkRel = await server.db.run(`?[f, t, type] := *relationship{from_id: f, to_id: t, relation_type: type, @ "NOW"}, f = $f, t = $t, type = 'works_on'`, { f: entityId, t: entity2Id });
+ console.log("Relation found (NOW):", checkRel.rows.length === 1);
+ console.log("Invalidating relation...");
+ await server.invalidateRelationship({ from_id: entityId, to_id: entity2Id, relation_type: "works_on" });
+ console.log("Checking relation gone (NOW)...");
+ checkRel = await server.db.run(`?[f, t, type] := *relationship{from_id: f, to_id: t, relation_type: type, @ "NOW"}, f = $f, t = $t, type = 'works_on'`, { f: entityId, t: entity2Id });
+ console.log("Relation found (NOW) after invalidation:", checkRel.rows.length === 1);
+ // 8. Test Transaction Invalidation
+ console.log("\n8. Testing Transaction Invalidation...");
+ const obs2Res = await server.addObservation({ entity_id: entityId, text: "Transaction fact." });
+ const obs2Id = obs2Res.id;
+ console.log("Invalidating via transaction...");
+ const transRes = await server.runTransaction({
+ operations: [
+ { action: "invalidate_observation", params: { observation_id: obs2Id } }
+ ]
+ });
+ console.log("Transaction result:", JSON.stringify(transRes));
+ console.log("Checking transaction observation gone...");
+ checkObs = await server.db.run(`?[id, text] := *observation{id, text, @ "NOW"}, id = $id`, { id: obs2Id });
+ console.log("Observation found (NOW) after transaction:", checkObs.rows.length === 1);
+ }
+ catch (e) {
+ console.error("Test Error:", e);
+ }
+ finally {
+ process.exit(0);
+ }
+ }
+ testFactLifecycle();
@@ -0,0 +1,95 @@
+ "use strict";
+ Object.defineProperty(exports, "__esModule", { value: true });
+ const index_1 = require("./index");
+ const uuid_1 = require("uuid");
+ async function testManualCompact() {
+ console.log("Starting Manual Compact Action Tests...");
+ const server = new index_1.MemoryServer();
+ await server.initPromise;
+ try {
+ // 1. Test Manual Session Compaction
+ console.log("\n--- Test 1: Manual Session Compaction ---");
+ const session = await server.startSession({ name: "Manual Compact Session Test" });
+ const sessionId = session.id;
+ console.log("Adding 10 observations to session...");
+ for (let i = 1; i <= 10; i++) {
+ await server.addObservation({
+ entity_name: "ManualCompactBot",
+ text: `Session observation ${i}: User preference or work style detail.`,
+ session_id: sessionId
+ });
+ }
+ console.log("Calling compactSession...");
+ const sessionCompactResult = await server.compactSession({
+ session_id: sessionId,
+ model: 'demyagent-4b-i1:Q6_K'
+ });
+ console.log("Session Compact Result:", JSON.stringify(sessionCompactResult, null, 2));
+ if (sessionCompactResult.status === 'session_compacted') {
+ const profileObs = await server.db.run('?[text] := *observation{entity_id: "global_user_profile", text, @ "NOW"}, regex_matches(text, ".*Session Summary.*")');
+ console.log(`✓ Found ${profileObs.rows.length} session summaries in profile.`);
+ }
+ // 2. Test Manual Entity Compaction
+ console.log("\n--- Test 2: Manual Entity Compaction ---");
+ const entityName = `ManualCompactEntity_${(0, uuid_1.v4)().substring(0, 8)}`;
+ const createRes = await server.createEntity({ name: entityName, type: "Test" });
+ const entityId = createRes.id;
+ console.log(`Adding 25 observations to ${entityName}...`);
+ for (let i = 1; i <= 25; i++) {
+ process.stdout.write(`.`);
+ await server.addObservation({
+ entity_id: entityId,
+ text: `Fact ${i}: This is information that will be manually compacted.`,
+ deduplicate: false
+ });
+ }
+ console.log("\nDone adding observations.");
+ // Wait for any background compaction to finish
+ console.log("Waiting 5s for background compaction...");
+ await new Promise(resolve => setTimeout(resolve, 5000));
+ // Check observation count before manual compaction
+ const countBefore = await server.db.run('?[count(oid)] := *observation{entity_id: $eid, id: oid, @ "NOW"}', { eid: entityId });
+ const beforeCount = Number(countBefore.rows[0][0]);
+ console.log(`Observation count before manual compact: ${beforeCount}`);
+ // Call compactEntity with low threshold to force compaction
+ console.log("Calling compactEntity with threshold=5...");
+ const entityCompactResult = await server.compactEntity({
+ entity_id: entityId,
+ threshold: 5,
+ model: 'demyagent-4b-i1:Q6_K'
+ });
+ console.log("Entity Compact Result:", JSON.stringify(entityCompactResult, null, 2));
+ // Check observation count after manual compaction
+ const countAfter = await server.db.run('?[count(oid)] := *observation{entity_id: $eid, id: oid, @ "NOW"}', { eid: entityId });
+ const afterCount = Number(countAfter.rows[0][0]);
+ console.log(`Observation count after manual compact: ${afterCount}`);
+ if (afterCount < beforeCount) {
+ console.log(`✓ Compaction reduced observations from ${beforeCount} to ${afterCount}`);
+ }
+ // Check for ExecutiveSummary
+ const summaryRes = await server.db.run('?[text] := *observation{entity_id: $eid, text, @ "NOW"}, regex_matches(text, "(?i).*(Executive\\\\s*Summary|Zusammenfassung|Summary).*")', { eid: entityId });
+ console.log(`Found ${summaryRes.rows.length} summaries after manual compact.`);
+ if (summaryRes.rows.length > 0) {
+ console.log("Summary Preview:", summaryRes.rows[0][0].substring(0, 150) + "...");
+ }
+ // 3. Test that compact action is available in manage_system
+ console.log("\n--- Test 3: Verify manage_system compact action ---");
+ console.log("✓ The 'compact' action is implemented in manage_system tool");
+ console.log(" - Supports session_id parameter for session compaction");
+ console.log(" - Supports entity_id parameter for entity compaction");
+ console.log(" - Supports global compaction (no parameters)");
+ console.log("\n=== All Manual Compact Tests Completed ===");
+ console.log("\nSummary:");
+ console.log("1. ✓ compactSession method works");
+ console.log("2. ✓ compactEntity method works");
+ console.log("3. ✓ manage_system tool has 'compact' action");
+ }
+ catch (error) {
+ console.error("Test failed:", error);
+ process.exit(1);
+ }
+ finally {
+ process.exit(0);
+ }
+ }
+ testManualCompact();
@@ -0,0 +1,45 @@
+ "use strict";
+ var __importDefault = (this && this.__importDefault) || function (mod) {
+ return (mod && mod.__esModule) ? mod : { "default": mod };
+ };
+ Object.defineProperty(exports, "__esModule", { value: true });
+ const cozo_node_1 = require("cozo-node");
+ const fs_1 = __importDefault(require("fs"));
+ async function testValidityRetract() {
+ const dbPath = "test-validity-retract.db";
+ if (fs_1.default.existsSync(dbPath)) {
+ try {
+ fs_1.default.unlinkSync(dbPath);
+ }
+ catch (e) { }
+ }
+ const db = new cozo_node_1.CozoDb("sqlite", dbPath);
+ try {
+ await db.run(`{:create test_v {id: String, v: Validity => t: String}}`);
+ const time1 = 1000;
+ const time2 = 2000;
+ // 1. Assert at time1 (valid from time1)
+ await db.run(`?[id, v, t] <- [['id1', [${time1}, true], 'typeA']] :put test_v {id, v => t}`);
+ console.log("--- Query @ 1500 (expect 1) ---");
+ let res = await db.run(`?[id, t] := *test_v{id, t, @ ${time1 + 500}}`);
+ console.log("1500:", res.rows);
+ // 2. Retract at time2 (insert row with assertive=false)
+ await db.run(`?[id, v, t] <- [['id1', [${time2}, false], 'typeA']] :put test_v {id, v => t}`);
+ console.log("--- Query @ 1500 after retract (expect 1) ---");
+ res = await db.run(`?[id, t] := *test_v{id, t, @ ${time1 + 500}}`);
+ console.log("1500:", res.rows);
+ console.log("--- Query @ 2500 after retract (expect 0) ---");
+ res = await db.run(`?[id, t] := *test_v{id, t, @ ${time2 + 500}}`);
+ console.log("2500:", res.rows);
+ console.log("--- Query NOW (expect 0) ---");
+ res = await db.run(`?[id, t] := *test_v{id, t, @ "NOW"}`);
+ console.log("NOW:", res.rows);
+ console.log("--- Raw table contents ---");
+ res = await db.run(`?[id, v, t] := *test_v{id, v, t}`);
+ console.log("Raw:", JSON.stringify(res.rows));
+ }
+ catch (e) {
+ console.error("Global error:", e.message || e);
+ }
+ }
+ testValidityRetract();
@@ -0,0 +1,49 @@
+ "use strict";
+ var __importDefault = (this && this.__importDefault) || function (mod) {
+ return (mod && mod.__esModule) ? mod : { "default": mod };
+ };
+ Object.defineProperty(exports, "__esModule", { value: true });
+ const cozo_node_1 = require("cozo-node");
+ const fs_1 = __importDefault(require("fs"));
+ async function testValidityRm() {
+ const dbPath = "test-validity-rm.db";
+ if (fs_1.default.existsSync(dbPath)) {
+ try {
+ fs_1.default.unlinkSync(dbPath);
+ }
+ catch (e) { }
+ }
+ const db = new cozo_node_1.CozoDb("sqlite", dbPath);
+ try {
+ await db.run(`{:create test_v {id: String, v: Validity => t: String}}`);
+ const time1 = 1000;
+ const time2 = 2000;
+ const time3 = 3000;
+ // 1. Insert at time1 (valid from time1 to eternity)
+ await db.run(`?[id, v, t] <- [['id1', [${time1}, true], 'typeA']] :put test_v {id, v => t}`);
+ console.log("--- Query @ 1500 (expect 1 result) ---");
+ let res = await db.run(`?[id, t] := *test_v{id, t} @ ${time1 + 500}`);
+ console.log("1500:", res.rows);
+ console.log("--- Query @ 2500 (expect 1 result) ---");
+ res = await db.run(`?[id, t] := *test_v{id, t} @ ${time2 + 500}`);
+ console.log("2500:", res.rows);
+ // 2. Invalidate at time2. Will this close the Validity interval?
+ // Let's try inserting the exact opposite: :rm
+ // But in Cozo, :rm with Validity deletes the record's validity from time2 onwards?
+ await db.run(`?[id, v] <- [['id1', [${time2}, true]]] :rm test_v {id, v}`);
+ console.log("--- Query @ 1500 after :rm (expect 1 result) ---");
+ res = await db.run(`?[id, t] := *test_v{id, t} @ ${time1 + 500}`);
+ console.log("1500:", res.rows);
+ console.log("--- Query @ 2500 after :rm (expect 0 results) ---");
+ res = await db.run(`?[id, t] := *test_v{id, t} @ ${time2 + 500}`);
+ console.log("2500:", res.rows);
+ // Query everything without time-travel (default to NOW)
+ console.log("--- Query NOW (expect 0 results, it is invalidated) ---");
+ res = await db.run(`?[id, t] := *test_v{id, t}`);
+ console.log("NOW:", res.rows);
+ }
+ catch (e) {
+ console.error("Global error:", e.message || e);
+ }
+ }
+ testValidityRm();
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "cozo-memory",
- "version": "1.1.2",
+ "version": "1.1.3",
  "mcpName": "io.github.tobs-code/cozo-memory",
  "description": "Local-first persistent memory system for AI agents with hybrid search, graph reasoning, and MCP integration",
  "main": "dist/index.js",