cozo-memory 1.1.0 → 1.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -63,6 +63,8 @@ Now you can add the server to your MCP client (e.g. Claude Desktop).
 
  🧠 **Agentic Retrieval Layer (v2.0)** - Auto-routing engine that analyzes query intent via local LLM to select the optimal search strategy (Vector, Graph, or Community)
 
+ 🧠 **Multi-Level Memory (v2.0)** - Context-aware memory system with built-in session and task management
+
  🎯 **Tiny Learned Reranker (v2.0)** - Integrated Cross-Encoder model (`ms-marco-MiniLM-L-6-v2`) for ultra-precise re-ranking of top search results
 
  🎯 **Multi-Vector Support (since v1.7)** - Dual embeddings per entity: content-embedding for context, name-embedding for identification
@@ -77,7 +79,7 @@ Now you can add the server to your MCP client (e.g. Claude Desktop).
 
  🏗️ **Hierarchical GraphRAG (v2.0)** - Automatic generation of thematic "Community Summaries" using local LLMs to enable global "Big Picture" reasoning
 
- 🧹 **Janitor Service** - LLM-backed automatic cleanup with hierarchical summarization and observation pruning
+ 🧹 **Janitor Service** - LLM-backed automatic cleanup with hierarchical summarization, observation pruning, and **automated session compression**
 
  👤 **User Preference Profiling** - Persistent user preferences with automatic 50% search boost
 
@@ -219,10 +221,10 @@ graph TB
      Services --> Inference
      Services --> DB
 
-     style Client fill:#e1f5ff
-     style Server fill:#fff4e1
-     style Services fill:#f0e1ff
-     style DB fill:#e1ffe1
+     style Client fill:#e1f5ff,color:#000
+     style Server fill:#fff4e1,color:#000
+     style Services fill:#f0e1ff,color:#000
+     style DB fill:#e1ffe1,color:#000
  ```
 
  ### Graph-Walking Visualization
@@ -243,12 +245,12 @@ graph LR
      E1 -->|colleague_of| E4
      E4 -.semantic: also relevant.-> E2
 
-     style Start fill:#e1f5ff
-     style V1 fill:#fff4e1
-     style E1 fill:#ffe1e1
-     style E2 fill:#e1ffe1
-     style E3 fill:#f0e1ff
-     style E4 fill:#ffe1e1
+     style Start fill:#e1f5ff,color:#000
+     style V1 fill:#fff4e1,color:#000
+     style E1 fill:#ffe1e1,color:#000
+     style E2 fill:#e1ffe1,color:#000
+     style E3 fill:#f0e1ff,color:#000
+     style E4 fill:#ffe1e1,color:#000
  ```
 
  ## Installation
@@ -368,6 +370,7 @@ CozoDB Memory includes a full-featured CLI for all operations:
 # System operations
 cozo-memory system health
 cozo-memory system metrics
+ cozo-memory system reflect
 
 # Entity operations
 cozo-memory entity create -n "MyEntity" -t "person" -m '{"age": 30}'
@@ -383,11 +386,19 @@ cozo-memory relation create --from <id1> --to <id2> --type "knows" -s 0.8
 # Search
 cozo-memory search query -q "search term" -l 10
 cozo-memory search context -q "context query"
+ cozo-memory search agentic -q "agentic query"
 
 # Graph operations
 cozo-memory graph explore -s <entity-id> -h 3
 cozo-memory graph pagerank
 cozo-memory graph communities
+ cozo-memory graph summarize
+
+ # Session & Task management
+ cozo-memory session start -n "My Session"
+ cozo-memory session stop -i <session-id>
+ cozo-memory task start -n "My Task" -s <session-id>
+ cozo-memory task stop -i <task-id>
 
 # Export/Import
 cozo-memory export json -o backup.json --include-metadata --include-relationships --include-observations
@@ -512,8 +523,8 @@ The interface is reduced to **4 consolidated tools**. The concrete operation is
 
 | Tool | Purpose | Key Actions |
 |------|---------|-------------|
- | `mutate_memory` | Write operations | create_entity, update_entity, delete_entity, add_observation, create_relation, run_transaction, add_inference_rule, ingest_file |
- | `query_memory` | Read operations | search, advancedSearch, context, entity_details, history, graph_rag, graph_walking, agentic_search |
+ | `mutate_memory` | Write operations | create_entity, update_entity, delete_entity, add_observation, create_relation, start_session, stop_session, start_task, stop_task, run_transaction, add_inference_rule, ingest_file |
+ | `query_memory` | Read operations | search, advancedSearch, context, entity_details, history, graph_rag, graph_walking, agentic_search (Multi-Level Context support) |
 | `analyze_graph` | Graph analysis | explore, communities, pagerank, betweenness, hits, shortest_path, bridge_discovery, semantic_walk, infer_relations |
 | `manage_system` | Maintenance | health, metrics, export_memory, import_memory, snapshot_create, snapshot_list, snapshot_diff, cleanup, reflect, summarize_communities, clear_memory |
 
@@ -525,6 +536,10 @@ Actions:
 - `delete_entity`: `{ entity_id }`
 - `add_observation`: `{ entity_id?, entity_name?, entity_type?, text, metadata? }`
 - `create_relation`: `{ from_id, to_id, relation_type, strength?, metadata? }`
+ - `start_session`: `{ name?, metadata? }` **(New v2.0)**: Starts a new session context (metadata can include `user_id`, `project`, etc.)
+ - `stop_session`: `{ session_id }` **(New v2.0)**: Closes/archives an active session.
+ - `start_task`: `{ name, session_id?, metadata? }` **(New v2.0)**: Starts a specific task within a session.
+ - `stop_task`: `{ task_id }` **(New v2.0)**: Marks a task as completed.
 - `run_transaction`: `{ operations: Array<{ action, params }> }` **(New v1.2)**: Executes multiple operations atomically.
 - `add_inference_rule`: `{ name, datalog }`
 - `ingest_file`: `{ format, file_path?, content?, entity_id?, entity_name?, entity_type?, chunking?, metadata?, observation_metadata?, deduplicate?, max_observations? }`
@@ -721,6 +736,7 @@ Janitor Cleanup Details:
 - `cleanup` supports `dry_run`: with `confirm: false` only candidates are listed.
 - With `confirm: true`, the Janitor becomes active:
   - **Hierarchical Summarization**: Detects isolated or old observations, has them summarized by a local LLM (Ollama), and creates a new `ExecutiveSummary` node. Old fragments are deleted to reduce noise while preserving knowledge.
+   - **Automated Session Compression**: Automatically identifies inactive sessions, summarizes their activity into a few bullet points, and stores the summary in the User Profile while marking the session as archived.
 
 **Before Janitor:**
 ```
@@ -110,6 +110,12 @@ class CLICommands {
     async advancedSearch(params) {
         return await this.server.advancedSearch(params);
     }
+     async agenticSearch(query, limit) {
+         return await this.server.hybridSearch.agenticRetrieve({
+             query,
+             limit: limit || 10
+         });
+     }
     async context(query, contextWindow, timeRangeHours) {
         // Use advancedSearch with appropriate parameters
        return await this.server.advancedSearch({
@@ -146,6 +152,12 @@ class CLICommands {
     async communities() {
         return await this.server.recomputeCommunities();
     }
+     async summarizeCommunities(model, minCommunitySize) {
+         return await this.server.summarizeCommunities({
+             model,
+             min_community_size: minCommunitySize
+         });
+     }
     // System operations
     async health() {
         const entityCount = await this.server.db.run('?[count(id)] := *entity{id, @ "NOW"}');
@@ -162,6 +174,13 @@ class CLICommands {
         // Access private metrics via type assertion
         return this.server.metrics;
     }
+     async reflect(entityId, model, mode) {
+         return await this.server.reflectMemory({
+             entity_id: entityId,
+             model,
+             mode
+         });
+     }
     async exportMemory(format, options) {
         const { ExportImportService } = await import('./export-import-service.js');
         // Create a simple wrapper that implements DbService interface
@@ -206,5 +225,18 @@ class CLICommands {
     async getUserProfile() {
         return await this.server.editUserProfile({});
     }
+     // Session and Task management
+     async startSession(name, metadata) {
+         return await this.server.startSession({ name, metadata });
+     }
+     async stopSession(id) {
+         return await this.server.stopSession({ id });
+     }
+     async startTask(name, sessionId, metadata) {
+         return await this.server.startTask({ name, session_id: sessionId, metadata });
+     }
+     async stopTask(id) {
+         return await this.server.stopTask({ id });
+     }
 }
 exports.CLICommands = CLICommands;
package/dist/cli.js CHANGED
@@ -203,6 +203,23 @@ search
         handleError(error);
     }
 });
+ search
+     .command('agentic')
+     .description('Perform agentic retrieval with automatic routing')
+     .requiredOption('-q, --query <query>', 'Search query')
+     .option('-l, --limit <number>', 'Result limit', parseInt, 10)
+     .option('-f, --format <format>', 'Output format (json|pretty)', 'pretty')
+     .action(async (options) => {
+         try {
+             await cli.init();
+             const result = await cli.agenticSearch(options.query, options.limit);
+             formatOutput(result, options.format);
+             await cli.close();
+         }
+         catch (error) {
+             handleError(error);
+         }
+     });
 // Graph commands
 const graph = program.command('graph').description('Graph operations');
 graph
@@ -255,6 +272,23 @@ graph
         handleError(error);
     }
 });
+ graph
+     .command('summarize')
+     .description('Generate hierarchical community summaries (GraphRAG)')
+     .option('-m, --model <model>', 'LLM model to use for summaries')
+     .option('--min-size <number>', 'Minimum community size', parseInt)
+     .option('-f, --format <format>', 'Output format (json|pretty)', 'pretty')
+     .action(async (options) => {
+         try {
+             await cli.init();
+             const result = await cli.summarizeCommunities(options.model, options.minSize);
+             formatOutput(result, options.format);
+             await cli.close();
+         }
+         catch (error) {
+             handleError(error);
+         }
+     });
 // System commands
 const system = program.command('system').alias('sys').description('System operations');
 system
@@ -287,6 +321,24 @@ system
         handleError(error);
     }
 });
+ system
+     .command('reflect')
+     .description('Perform self-reflection to discover implicit relations')
+     .option('-i, --id <id>', 'Entity ID to reflect on (optional)')
+     .option('-m, --model <model>', 'LLM model to use')
+     .option('--mode <mode>', 'Reflection mode (summary|discovery)', 'discovery')
+     .option('-f, --format <format>', 'Output format (json|pretty)', 'pretty')
+     .action(async (options) => {
+         try {
+             await cli.init();
+             const result = await cli.reflect(options.id, options.model, options.mode);
+             formatOutput(result, options.format);
+             await cli.close();
+         }
+         catch (error) {
+             handleError(error);
+         }
+     });
 // Export/Import commands
 const exportCmd = program.command('export').description('Export memory');
 exportCmd
@@ -487,4 +539,77 @@ profile
         handleError(error);
     }
 });
+ // Session commands
+ const session = program.command('session').description('Session management');
+ session
+     .command('start')
+     .description('Start a new session')
+     .option('-n, --name <name>', 'Session name')
+     .option('-m, --metadata <json>', 'Metadata as JSON string')
+     .option('-f, --format <format>', 'Output format (json|pretty)', 'pretty')
+     .action(async (options) => {
+         try {
+             await cli.init();
+             const metadata = options.metadata ? JSON.parse(options.metadata) : undefined;
+             const result = await cli.startSession(options.name, metadata);
+             formatOutput(result, options.format);
+             await cli.close();
+         }
+         catch (error) {
+             handleError(error);
+         }
+     });
+ session
+     .command('stop')
+     .description('Stop a session')
+     .requiredOption('-i, --id <id>', 'Session ID')
+     .option('-f, --format <format>', 'Output format (json|pretty)', 'pretty')
+     .action(async (options) => {
+         try {
+             await cli.init();
+             const result = await cli.stopSession(options.id);
+             formatOutput(result, options.format);
+             await cli.close();
+         }
+         catch (error) {
+             handleError(error);
+         }
+     });
+ // Task commands
+ const task = program.command('task').description('Task management');
+ task
+     .command('start')
+     .description('Start a new task')
+     .requiredOption('-n, --name <name>', 'Task name')
+     .option('-s, --session-id <id>', 'Session ID')
+     .option('-m, --metadata <json>', 'Metadata as JSON string')
+     .option('-f, --format <format>', 'Output format (json|pretty)', 'pretty')
+     .action(async (options) => {
+         try {
+             await cli.init();
+             const metadata = options.metadata ? JSON.parse(options.metadata) : undefined;
+             const result = await cli.startTask(options.name, options.sessionId, metadata);
+             formatOutput(result, options.format);
+             await cli.close();
+         }
+         catch (error) {
+             handleError(error);
+         }
+     });
+ task
+     .command('stop')
+     .description('Stop a task')
+     .requiredOption('-i, --id <id>', 'Task ID')
+     .option('-f, --format <format>', 'Output format (json|pretty)', 'pretty')
+     .action(async (options) => {
+         try {
+             await cli.init();
+             const result = await cli.stopTask(options.id);
+             formatOutput(result, options.format);
+             await cli.close();
+         }
+         catch (error) {
+             handleError(error);
+         }
+     });
 program.parse();
@@ -78,6 +78,35 @@ class HybridSearch {
         return { ...r, score };
     });
 }
+ applyContextBoost(results, options) {
+     const { session_id, task_id } = options;
+     if (!session_id && !task_id)
+         return results;
+     return results.map(result => {
+         let boost = 1.0;
+         let reasons = [];
+         const metadata = result.metadata || {};
+         if (task_id && metadata.task_id === task_id) {
+             boost += 0.5;
+             reasons.push("Task Match");
+         }
+         if (session_id && metadata.session_id === session_id) {
+             boost += 0.3;
+             reasons.push("Session Match");
+         }
+         if (boost > 1.0) {
+             // Cap the score at 1.0 to stay within standard search score range
+             const newScore = Math.min(1.0, result.score * boost);
+             return {
+                 ...result,
+                 score: newScore,
+                 explanation: (typeof result.explanation === 'string' ? result.explanation : JSON.stringify(result.explanation)) +
+                     ` | Context Boost (x${boost.toFixed(1)}): ${reasons.join(', ')}`
+             };
+         }
+         return result;
+     });
+ }
 async applyReranking(query, results) {
     if (results.length <= 1)
         return results;
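The boost arithmetic in `applyContextBoost` can be sketched in isolation (a minimal model, assuming plain result objects with `score` and `metadata` fields): a task match adds 0.5, a session match adds 0.3, and the boosted score is capped at 1.0.

```javascript
// Minimal sketch of the context-boost scoring above: task matches (+0.5)
// weigh more than session matches (+0.3); boosted scores are capped at 1.0.
function contextBoostScore(score, metadata, { session_id, task_id } = {}) {
  let boost = 1.0;
  if (task_id && metadata.task_id === task_id) boost += 0.5;
  if (session_id && metadata.session_id === session_id) boost += 0.3;
  // Unboosted results keep their original score untouched.
  return boost > 1.0 ? Math.min(1.0, score * boost) : score;
}
```

A result matching both the active task and session is boosted by x1.8, so any base score above roughly 0.56 saturates at 1.0.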
@@ -214,10 +243,6 @@ class HybridSearch {
     `:sort -score`,
     `:limit $limit`
 ].join('\n').trim();
- console.error('--- DEBUG: Cozo Datalog Query ---');
- console.error(datalogQuery);
- console.error('--- DEBUG: Params ---');
- console.error(JSON.stringify(params, null, 2));
 try {
     const results = await this.db.run(datalogQuery, params);
     let searchResults = results.rows.map((r) => ({
@@ -244,7 +269,8 @@ class HybridSearch {
         return Object.entries(filters.metadata).every(([key, val]) => r.metadata[key] === val);
     });
 }
- const finalResults = this.applyTimeDecay(searchResults);
+ const timeDecayedResults = this.applyTimeDecay(searchResults);
+ const finalResults = this.applyContextBoost(timeDecayedResults, options);
 // Phase 3: Reranking
 if (options.rerank) {
     const rerankedResults = await this.applyReranking(options.query, finalResults);
@@ -339,7 +365,6 @@ class HybridSearch {
 :sort -score
 :limit $limit
 `.trim();
- console.error("[HybridSearch] Graph-RAG Datalog Query:\n", datalogQuery);
 try {
     const results = await this.db.run(datalogQuery, params);
     let searchResults = results.rows.map((r) => ({
package/dist/index.js CHANGED
@@ -88,6 +88,8 @@ class MemoryServer {
     }
     async janitorCleanup(args) {
         await this.initPromise;
+         console.error(`[Janitor] Starting cleanup (auto-compressing sessions first)...`);
+         await this.compressSessions({ model: args.model });
         const olderThanDays = Math.max(1, Math.floor(args.older_than_days ?? 30));
         const maxObservations = Math.max(1, Math.floor(args.max_observations ?? 20));
         const minEntityDegree = Math.max(0, Math.floor(args.min_entity_degree ?? 2));
@@ -333,6 +335,69 @@
             results,
         };
     }
+     async compressSessions(args) {
+         const model = args.model ?? "demyagent-4b-i1:Q6_K";
+         const inactiveThreshold = Date.now() - 30 * 60 * 1000; // 30 minutes
+         try {
+             // 1. Find inactive open sessions
+             const inactiveRes = await this.db.run(`?[session_id, last_active, metadata] := *session_state{session_id, last_active, status, metadata}, status == 'open', last_active < $threshold`, { threshold: inactiveThreshold });
+             if (inactiveRes.rows.length === 0) {
+                 console.error(`[Janitor] No inactive sessions found for compression.`);
+                 return;
+             }
+             console.error(`[Janitor] Found ${inactiveRes.rows.length} inactive sessions to compress.`);
+             for (const row of inactiveRes.rows) {
+                 const sessionId = row[0];
+                 const lastActive = row[1];
+                 const metadata = row[2];
+                 // 2. Fetch observations for this session
+                 const obsRes = await this.db.run(`?[text] := *observation{session_id: $session_id, text, @ "NOW"}`, { session_id: sessionId });
+                 if (obsRes.rows.length > 0) {
+                     const sessionLogs = obsRes.rows.map((r) => r[0]);
+                     // 3. Summarize using LLM
+                     const systemPrompt = "You are a memory consolidation service. Summarize the following session logs into exactly 2-3 concise bullet points for long-term storage. Respond ONLY with the bullet points.";
+                     const userPrompt = `Session Logs for ${sessionId}:\n\n` + sessionLogs.join('\n');
+                     let summaryText;
+                     try {
+                         const ollamaMod = await import("ollama");
+                         const ollamaClient = ollamaMod?.default ?? ollamaMod;
+                         const response = await ollamaClient.chat({
+                             model,
+                             messages: [
+                                 { role: "system", content: systemPrompt },
+                                 { role: "user", content: userPrompt },
+                             ],
+                         });
+                         summaryText = response?.message?.content?.trim?.() ?? "";
+                     }
+                     catch (e) {
+                         console.warn(`[Janitor] LLM failed for session ${sessionId}: ${e.message}. Using basic concatenation.`);
+                         summaryText = sessionLogs.join('\n').slice(0, 500);
+                     }
+                     if (summaryText) {
+                         // 4. Write compression summary to global user profile
+                         await this.addObservation({
+                             entity_id: "global_user_profile",
+                             text: `Session Summary (${sessionId}):\n${summaryText}`,
+                             metadata: {
+                                 kind: "session_compression",
+                                 original_session_id: sessionId,
+                                 summarized_at: new Date().toISOString()
+                             }
+                         });
+                     }
+                 }
+                 // 5. Mark session as compressed
+                 await this.db.run(`?[session_id, last_active, status, metadata] <- $data
+ :put session_state {session_id, last_active, status, metadata}`, {
+                     data: [[sessionId, lastActive, "compressed", metadata]]
+                 });
+             }
+         }
+         catch (e) {
+             console.error(`[Janitor] Session compression error:`, e.message);
+         }
+     }
     async advancedSearch(args) {
         await this.initPromise;
         return this.hybridSearch.advancedSearch(args);
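The candidate selection in `compressSessions` reduces to a simple predicate, sketched here under the same assumptions as the Datalog query above (a session row with a `status` string and a millisecond `last_active` timestamp):

```javascript
// Sketch of the inactivity predicate used by compressSessions: a session
// is a compression candidate when it is still 'open' and its last
// activity is older than the 30-minute inactivity window.
const INACTIVE_WINDOW_MS = 30 * 60 * 1000;
function isCompressible(session, nowMs = Date.now()) {
  return session.status === "open"
    && session.last_active < nowMs - INACTIVE_WINDOW_MS;
}
```

Note that the threshold is computed once per cleanup run, so a long-running compression pass does not move the cutoff mid-run.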
@@ -699,7 +764,7 @@
         // Observation Table
         if (!relations.includes("observation")) {
             try {
-                 await this.db.run(`{:create observation {id: String, created_at: Validity => entity_id: String, text: String, embedding: <F32; ${EMBEDDING_DIM}>, metadata: Json}}`);
+                 await this.db.run(`{:create observation {id: String, created_at: Validity => entity_id: String, session_id: String, task_id: String, text: String, embedding: <F32; ${EMBEDDING_DIM}>, metadata: Json}}`);
                 console.error("[Schema] Observation table created.");
             }
             catch (e) {
@@ -707,8 +772,11 @@
             }
         }
         else {
+             const columnsRes = await this.db.run(`::columns observation`);
+             const columns = columnsRes.rows.map((r) => r[0]);
             const timeTravelReady = await this.isTimeTravelReady("observation");
-             if (!timeTravelReady) {
+             if (!columns.includes("session_id") || !columns.includes("task_id") || !timeTravelReady) {
+                 console.error("[Schema] Migrating observation table for Session/Task support...");
                 // Drop indices before migration
                 try {
                     await this.db.run("::hnsw drop observation:semantic");
@@ -722,13 +790,26 @@
                     await this.db.run("::lsh drop observation:lsh");
                 }
                 catch (e) { }
-                 await this.db.run(`
-                     ?[id, created_at, entity_id, text, embedding, metadata] :=
-                         *observation{id, entity_id, text, embedding, metadata, created_at: created_at_raw},
-                         created_at = [created_at_raw, true]
-                     :replace observation {id: String, created_at: Validity => entity_id: String, text: String, embedding: <F32; ${EMBEDDING_DIM}>, metadata: Json}
-                 `);
-                 console.error("[Schema] Observation table migrated (Validity).");
+                 if (!timeTravelReady) {
+                     await this.db.run(`
+                         ?[id, created_at, entity_id, session_id, task_id, text, embedding, metadata] :=
+                             *observation{id, entity_id, text, embedding, metadata, created_at: created_at_raw},
+                             created_at = [created_at_raw, true],
+                             session_id = "",
+                             task_id = ""
+                         :replace observation {id: String, created_at: Validity => entity_id: String, session_id: String, task_id: String, text: String, embedding: <F32; ${EMBEDDING_DIM}>, metadata: Json}
+                     `);
+                 }
+                 else {
+                     await this.db.run(`
+                         ?[id, created_at, entity_id, session_id, task_id, text, embedding, metadata] :=
+                             *observation{id, entity_id, text, embedding, metadata, created_at},
+                             session_id = "",
+                             task_id = ""
+                         :replace observation {id: String, created_at: Validity => entity_id: String, session_id: String, task_id: String, text: String, embedding: <F32; ${EMBEDDING_DIM}>, metadata: Json}
+                     `);
+                 }
+                 console.error("[Schema] Observation table migrated (Session/Task/Validity).");
             }
         }
         try {
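The migration trigger in the hunk above combines two independent checks; as a standalone sketch (column names as in the schema above):

```javascript
// The observation table is rebuilt when the session/task columns are
// missing, or when Validity (time-travel) support has not yet been applied.
function needsObservationMigration(columns, timeTravelReady) {
  return !columns.includes("session_id")
    || !columns.includes("task_id")
    || !timeTravelReady;
}
```

Both migration paths backfill `session_id` and `task_id` with empty strings, so observations written before this release stay queryable after the rebuild.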
@@ -867,6 +948,26 @@ class MemoryServer {
867
948
  console.error("[Schema] Snapshot table error:", e.message);
868
949
  }
869
950
  }
951
+ // Session State Table
952
+ if (!relations.includes("session_state")) {
953
+ try {
954
+ await this.db.run('{:create session_state {session_id: String => last_active: Int, status: String, metadata: Json}}');
955
+ console.error("[Schema] Session State table created.");
956
+ }
957
+ catch (e) {
958
+ console.error("[Schema] Session State table error:", e.message);
959
+ }
960
+ }
961
+ // Task State Table
962
+ if (!relations.includes("task_state")) {
963
+ try {
964
+ await this.db.run('{:create task_state {task_id: String => status: String, session_id: String, metadata: Json}}');
965
+ console.error("[Schema] Task State table created.");
966
+ }
967
+ catch (e) {
968
+ console.error("[Schema] Task State table error:", e.message);
969
+ }
970
+ }
870
971
  if (!relations.includes("inference_rule")) {
871
972
  try {
872
973
  await this.db.run('{:create inference_rule {id: String => name: String, datalog: String, created_at: Int}}');
@@ -1302,11 +1403,37 @@ class MemoryServer {
1302
1403
  }
1303
1404
  }
1304
1405
  const now = Date.now() * 1000;
1406
+ const session_id = args.session_id ?? "";
1407
+ const task_id = args.task_id ?? "";
1305
1408
  await this.db.run(`
1306
- ?[id, created_at, entity_id, text, embedding, metadata] <- [
1307
- [$id, [${now}, true], $entity_id, $text, $embedding, $metadata]
1308
- ] :insert observation {id, created_at => entity_id, text, embedding, metadata}
1309
- `, { id, entity_id: entityId, text: args.text, embedding, metadata: args.metadata || {} });
1409
+ ?[id, created_at, entity_id, session_id, task_id, text, embedding, metadata] <- [
1410
+ [$id, [${now}, true], $entity_id, $session, $task, $text, $embedding, $metadata]
1411
+ ] :insert observation {id, created_at => entity_id, session_id, task_id, text, embedding, metadata}
1412
+ `, {
1413
+ id,
1414
+ entity_id: entityId,
1415
+ session: session_id,
1416
+ task: task_id,
1417
+ text: args.text,
1418
+ embedding,
1419
+ metadata: args.metadata || {}
1420
+ });
1421
+ // Update session activity if session_id is provided
1422
+ if (session_id) {
1423
+ try {
1424
+ await this.db.run(`
1425
+ ?[id, last_active, status, metadata] :=
1426
+ *session_state{id, metadata},
1427
+ id = $id,
1428
+ last_active = $now,
1429
+ status = "active"
1430
+ :update session_state {id => last_active, status, metadata}
1431
+ `, { id: session_id, now: Math.floor(now / 1000000) });
1432
+ }
1433
+ catch (e) {
1434
+ console.warn(`[AddObservation] Session activity update failed for ${session_id}:`, e.message);
1435
+ }
1436
+ }
1310
1437
  // Optional: Automatic inference after new observation (in background)
1311
1438
  const suggestionsRaw = await this.inferenceEngine.inferRelations(entityId);
1312
1439
  const suggestions = await this.formatInferredRelationsForContext(suggestionsRaw);
@@ -1314,6 +1441,8 @@ class MemoryServer {
1314
1441
  return {
1315
1442
  id,
1316
1443
  entity_id: entityId,
1444
+ session_id,
1445
+ task_id,
1317
1446
  created_at: now,
1318
1447
  created_at_iso,
1319
1448
  status: "Observation saved",
@@ -1324,6 +1453,61 @@ class MemoryServer {
1324
1453
  return { error: error.message || "Unknown error" };
1325
1454
  }
1326
1455
  }
1456
+ async startSession(args) {
1457
+ await this.initPromise;
1458
+ const id = (0, uuid_1.v4)();
1459
+ const now = Math.floor(Date.now() / 1000);
1460
+ const name = args.name || `Session ${new Date().toISOString()}`;
1461
+ // Create a Session entity first
1462
+ await this.createEntity({
1463
+ name,
1464
+ type: "Session",
1465
+ metadata: { session_id: id, ...args.metadata }
1466
+ });
1467
+ await this.db.run(`
1468
+ ?[session_id, last_active, status, metadata] <- [[$id, $now, "active", $metadata]]
1469
+ :insert session_state {session_id => last_active, status, metadata}
1470
+ `, { id, now, metadata: args.metadata || {} });
1471
+ return { id, name, status: "Session started" };
1472
+ }
1473
+ async stopSession(args) {
1474
+ await this.initPromise;
1475
+ await this.db.run(`
1476
+ ?[session_id, last_active, status, metadata] :=
1477
+ *session_state{session_id, last_active, metadata},
1478
+ session_id = $id,
1479
+ status = "closed"
1480
+ :update session_state {session_id => last_active, status, metadata}
1481
+ `, { id: args.id });
1482
+ return { id: args.id, status: "Session closed" };
1483
+ }
1484
+ async startTask(args) {
1485
+ await this.initPromise;
1486
+ const id = (0, uuid_1.v4)();
1487
+ const sessionId = args.session_id || "";
1488
+ // Create a Task entity
1489
+ await this.createEntity({
1490
+ name: args.name,
1491
+ type: "Task",
1492
+ metadata: { task_id: id, session_id: sessionId, ...args.metadata }
1493
+ });
1494
+ await this.db.run(`
1495
+ ?[task_id, status, session_id, metadata] <- [[$id, "active", $session, $metadata]]
1496
+ :insert task_state {task_id => status, session_id, metadata}
1497
+ `, { id, session: sessionId, metadata: args.metadata || {} });
1498
+ return { id, name: args.name, session_id: sessionId, status: "Task started" };
1499
+ }
1500
+ async stopTask(args) {
1501
+ await this.initPromise;
1502
+ await this.db.run(`
1503
+ ?[task_id, status, session_id, metadata] :=
1504
+ *task_state{task_id, session_id, metadata},
1505
+ task_id = $id,
1506
+ status = "completed"
1507
+ :update task_state {task_id => status, session_id, metadata}
1508
+ `, { id: args.id });
1509
+ return { id: args.id, status: "Task stopped" };
1510
+ }
1327
1511
  async createRelation(args) {
1328
1512
  if (args.from_id === args.to_id) {
1329
1513
  return { error: "Self-references in relationships are not allowed" };
@@ -2162,14 +2346,16 @@ ids[id] <- $ids
2162
2346
  const now = Date.now() * 1000;
2163
2347
  allParams[`obs_id${suffix}`] = id;
2164
2348
  allParams[`obs_entity_id${suffix}`] = entity_id;
2349
+ allParams[`obs_session_id${suffix}`] = params.session_id || "";
2350
+ allParams[`obs_task_id${suffix}`] = params.task_id || "";
2165
2351
  allParams[`obs_text${suffix}`] = text || "";
2166
2352
  allParams[`obs_embedding${suffix}`] = embedding;
2167
2353
  allParams[`obs_metadata${suffix}`] = metadata || {};
2168
2354
  statements.push(`
2169
2355
  {
2170
- ?[id, created_at, entity_id, text, embedding, metadata] <- [
2171
- [$obs_id${suffix}, [${now}, true], $obs_entity_id${suffix}, $obs_text${suffix}, $obs_embedding${suffix}, $obs_metadata${suffix}]
2172
- ] :insert observation {id, created_at => entity_id, text, embedding, metadata}
2356
+ ?[id, created_at, entity_id, session_id, task_id, text, embedding, metadata] <- [
2357
+ [$obs_id${suffix}, [${now}, true], $obs_entity_id${suffix}, $obs_session_id${suffix}, $obs_task_id${suffix}, $obs_text${suffix}, $obs_embedding${suffix}, $obs_metadata${suffix}]
2358
+ ] :insert observation {id, created_at => entity_id, session_id, task_id, text, embedding, metadata}
2173
2359
  }
2174
2360
  `);
2175
2361
  results.push({ action: "add_observation", id, entity_id });
@@ -2445,6 +2631,8 @@ ids[id] <- $ids
2445
2631
  text: zod_1.z.string().describe("The fact or observation"),
2446
2632
  metadata: MetadataSchema.optional().describe("Additional metadata"),
2447
2633
  deduplicate: zod_1.z.boolean().optional().default(true).describe("Skip exact duplicates"),
2634
+ session_id: zod_1.z.string().optional().describe("Associated session ID"),
2635
+ task_id: zod_1.z.string().optional().describe("Associated task ID"),
2448
2636
  }).passthrough().refine((v) => Boolean(v.entity_id) || Boolean(v.entity_name), {
2449
2637
  message: "entity_id or entity_name is required",
2450
2638
  path: ["entity_id"],
@@ -2536,10 +2724,29 @@ ids[id] <- $ids
  message: "file_path or content is required for ingest_file",
  path: ["file_path"],
  }),
+ zod_1.z.object({
+ action: zod_1.z.literal("start_session"),
+ name: zod_1.z.string().optional().describe("Optional name for the session"),
+ metadata: MetadataSchema.optional().describe("Additional metadata"),
+ }).passthrough(),
+ zod_1.z.object({
+ action: zod_1.z.literal("stop_session"),
+ id: zod_1.z.string().describe("ID of the session to stop"),
+ }).passthrough(),
+ zod_1.z.object({
+ action: zod_1.z.literal("start_task"),
+ name: zod_1.z.string().describe("Name of the task"),
+ session_id: zod_1.z.string().optional().describe("Optional session ID this task belongs to"),
+ metadata: MetadataSchema.optional().describe("Additional metadata"),
+ }).passthrough(),
+ zod_1.z.object({
+ action: zod_1.z.literal("stop_task"),
+ id: zod_1.z.string().describe("ID of the task to stop"),
+ }).passthrough(),
  ]);
  const MutateMemoryParameters = zod_1.z.object({
  action: zod_1.z
- .enum(["create_entity", "update_entity", "delete_entity", "add_observation", "create_relation", "run_transaction", "add_inference_rule", "ingest_file"])
+ .enum(["create_entity", "update_entity", "delete_entity", "add_observation", "create_relation", "run_transaction", "add_inference_rule", "ingest_file", "start_session", "stop_session", "start_task", "stop_task"])
  .describe("Action (determines which fields are required)"),
  name: zod_1.z.string().optional().describe("For create_entity (required) or add_inference_rule (required)"),
  type: zod_1.z.string().optional().describe("For create_entity (required)"),
@@ -2587,6 +2794,10 @@ Supported actions:
  Example (Manager Transitivity):
  '?[from_id, to_id, relation_type, confidence, reason] := *relationship{from_id: $id, to_id: mid, relation_type: "manager_of", @ "NOW"}, *relationship{from_id: mid, to_id: target, relation_type: "manager_of", @ "NOW"}, from_id = $id, to_id = target, relation_type = "ober_manager_von", confidence = 0.6, reason = "Transitive Manager Path"'
  - 'ingest_file': Bulk import of documents (Markdown/JSON). Supports chunking (paragraphs) and automatic entity creation. Params: { entity_id | entity_name (required), format, content, ... }. Ideal for quickly populating memory from existing notes.
+ - 'start_session': Initializes a new session for context tracking. Params: { name?: string, metadata?: object }.
+ - 'stop_session': Closes a session. Params: { id: string }.
+ - 'start_task': Initializes a new task within a session. Params: { name: string, session_id?: string, metadata?: object }.
+ - 'stop_task': Marks a task as completed. Params: { id: string }.

  Validation: Invalid syntax or missing columns in inference rules will result in errors.`,
  parameters: MutateMemoryParameters,
@@ -2620,6 +2831,14 @@ Validation: Invalid syntax or missing columns in inference rules will result in
  return JSON.stringify(await this.addInferenceRule(rest));
  if (action === "ingest_file")
  return JSON.stringify(await this.ingestFile(rest));
+ if (action === "start_session")
+ return JSON.stringify(await this.startSession(rest));
+ if (action === "stop_session")
+ return JSON.stringify(await this.stopSession(rest));
+ if (action === "start_task")
+ return JSON.stringify(await this.startTask(rest));
+ if (action === "stop_task")
+ return JSON.stringify(await this.stopTask(rest));
  return JSON.stringify({ error: "Unknown action" });
  },
  });
@@ -2632,6 +2851,8 @@ Validation: Invalid syntax or missing columns in inference rules will result in
  include_entities: zod_1.z.boolean().optional().default(true).describe("Include entities in search"),
  include_observations: zod_1.z.boolean().optional().default(true).describe("Include observations in search"),
  rerank: zod_1.z.boolean().optional().default(false).describe("Use Cross-Encoder reranking for higher precision"),
+ session_id: zod_1.z.string().optional().describe("Prioritize results from this session"),
+ task_id: zod_1.z.string().optional().describe("Prioritize results from this task"),
  }),
  zod_1.z.object({
  action: zod_1.z.literal("advancedSearch"),
@@ -2660,6 +2881,8 @@ Validation: Invalid syntax or missing columns in inference rules will result in
  efSearch: zod_1.z.number().optional().describe("HNSW search precision"),
  }).optional().describe("Vector parameters"),
  rerank: zod_1.z.boolean().optional().default(false).describe("Use Cross-Encoder reranking for higher precision"),
+ session_id: zod_1.z.string().optional().describe("Prioritize results from this session"),
+ task_id: zod_1.z.string().optional().describe("Prioritize results from this task"),
  }),
  zod_1.z.object({
  action: zod_1.z.literal("context"),
@@ -2695,6 +2918,8 @@ Validation: Invalid syntax or missing columns in inference rules will result in
  query: zod_1.z.string().describe("Context query for agentic routing"),
  limit: zod_1.z.number().optional().default(10).describe("Maximum number of results"),
  rerank: zod_1.z.boolean().optional().default(false).describe("Use Cross-Encoder reranking for higher precision"),
+ session_id: zod_1.z.string().optional().describe("Prioritize results from this session"),
+ task_id: zod_1.z.string().optional().describe("Prioritize results from this task"),
  }),
  ]);
  const QueryMemoryParameters = zod_1.z.object({
@@ -2703,6 +2928,8 @@ Validation: Invalid syntax or missing columns in inference rules will result in
  .describe("Action (determines which fields are required)"),
  query: zod_1.z.string().optional().describe("Required for search/advancedSearch/context/graph_rag/graph_walking/agentic_search"),
  limit: zod_1.z.number().optional().describe("Only for search/advancedSearch/graph_rag/graph_walking"),
+ session_id: zod_1.z.string().optional().describe("Optional session ID for context boosting"),
+ task_id: zod_1.z.string().optional().describe("Optional task ID for context boosting"),
  filters: zod_1.z.any().optional().describe("Only for advancedSearch"),
  graphConstraints: zod_1.z.any().optional().describe("Only for advancedSearch"),
  vectorOptions: zod_1.z.any().optional().describe("Only for advancedSearch"),
2751
2978
  includeEntities: input.include_entities,
2752
2979
  includeObservations: input.include_observations,
2753
2980
  rerank: input.rerank,
2981
+ session_id: input.session_id,
2982
+ task_id: input.task_id,
2754
2983
  });
2755
2984
  const conflictEntityIds = Array.from(new Set(results
2756
2985
  .map((r) => (r.name ? r.id : r.entity_id))
@@ -2783,6 +3012,8 @@ Notes: 'agentic_search' is the most powerful and adaptable, 'context' is ideal f
  graphConstraints: input.graphConstraints,
  vectorParams: input.vectorParams,
  rerank: input.rerank,
+ session_id: input.session_id,
+ task_id: input.task_id,
  });
  const conflictEntityIds = Array.from(new Set(results
  .map((r) => (r.name ? r.id : r.entity_id))
@@ -3707,6 +3938,7 @@ Note: Use 'mutate_memory' with action='add_observation' and entity_id='global_us
  });
  }
  async start() {
+ await this.initPromise;
  await this.mcp.start({ transportType: "stdio" });
  console.error("Cozo Memory MCP Server running on stdio");
  }
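The single added line in `start()` gates the stdio transport on a stored initialization promise. The general pattern (class and member names below are illustrative, not the package's actual internals) is to launch async setup in the constructor, keep the promise, and await it before serving, so the server never accepts requests half-initialized:

```javascript
// Minimal sketch of the constructor-init / start-gate pattern.
class Server {
  constructor() {
    this.ready = false;
    // Begin async setup immediately, but keep the promise for later gating.
    this.initPromise = this.init();
  }
  async init() {
    // Stand-in for slow setup work, e.g. opening a database.
    await new Promise((resolve) => setTimeout(resolve, 10));
    this.ready = true;
  }
  async start() {
    await this.initPromise; // guarantee init finished before serving
    if (!this.ready) throw new Error("not initialized");
    return "listening";
  }
}

(async () => {
  const s = new Server();
  console.log(await s.start()); // "listening"
})();
```

Storing the promise (rather than awaiting in the constructor, which JavaScript does not allow) also means repeated `start()` calls all wait on the same one-time setup.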
@@ -0,0 +1,80 @@
+ "use strict";
+ Object.defineProperty(exports, "__esModule", { value: true });
+ const index_1 = require("./index");
+ async function testMultiLevelMemory() {
+ console.log("=== Testing Multi-Level Memory (v2.0) ===\n");
+ const server = new index_1.MemoryServer();
+ await server.start();
+ try {
+ // 1. Session Management
+ console.log("--- 1. Testing Session Management ---");
+ const sessionRes = await server.startSession({ name: "Research Session", metadata: { user: "test_user" } });
+ const sessionId = sessionRes.id;
+ console.log(`Started session: ${sessionId}`);
+ // 2. Task Management
+ console.log("\n--- 2. Testing Task Management ---");
+ const taskRes = await server.startTask({ name: "Analyze Multi-Level Memory", session_id: sessionId });
+ const taskId = taskRes.id;
+ console.log(`Started task: ${taskId}`);
+ // 3. Observations with Context
+ console.log("\n--- 3. Adding observations with session and task IDs ---");
+ await server.addObservation({
+ entity_name: "Multi-Level Memory",
+ entity_type: "Concept",
+ text: "Multi-Level Memory is implemented in v2.0.",
+ session_id: sessionId,
+ task_id: taskId,
+ metadata: { importance: "high" }
+ });
+ console.log("Added observation for Multi-Level Memory with session and task context.");
+ await server.addObservation({
+ entity_name: "Janitor",
+ entity_type: "Service",
+ text: "Janitor now handles session compression.",
+ session_id: sessionId,
+ metadata: { importance: "medium" }
+ });
+ console.log("Added observation for Janitor with session context.");
+ // 4. Search with Context Boost
+ console.log("\n--- 4. Testing Context-Aware Search (Boosting) ---");
+ console.log("Case A: Search with session and task boost");
+ const searchResA = await server.advancedSearch({
+ query: "Multi-Level Memory",
+ session_id: sessionId,
+ task_id: taskId,
+ limit: 5
+ });
+ console.log("Search results (A):");
+ searchResA.forEach((r, i) => {
+ console.log(` [${i + 1}] Score: ${r.score.toFixed(4)}, Text: ${r.text || r.name}, Explanation: ${r.explanation}`);
+ });
+ // 5. Session Compression
+ console.log("\n--- 5. Testing Session Compression (Janitor) ---");
+ console.log("Stopping session to prepare for compression...");
+ await server.stopSession({ id: sessionId });
+ console.log("Manually aging session activity in session_state table...");
+ const oldTs = Date.now() - 40 * 60 * 1000; // 40 minutes ago
+ await server['db'].run(`?[session_id, last_active, status, metadata] <- [[$id, $ts, 'open', $meta]]
+ :put session_state {session_id, last_active, status, metadata}`, { id: sessionId, ts: oldTs, meta: { user: "test_user" } });
+ console.log("Running Janitor cleanup with confirm=true...");
+ const janitorRes = await server.janitorCleanup({ confirm: true });
+ console.log("Janitor result:", JSON.stringify(janitorRes, null, 2));
+ // 6. Verify Compression Result
+ console.log("\n--- 6. Verifying compression summary in User Profile ---");
+ const profileRes = await server.advancedSearch({ query: "Session Summary", entityTypes: ["Observation"] });
+ console.log("Profile search results:");
+ profileRes.forEach((r, i) => {
+ if (r.text?.includes(sessionId)) {
+ console.log(` [MATCH] ${r.text}`);
+ }
+ });
+ console.log("\n=== Multi-Level Memory Test completed successfully ===");
+ }
+ catch (error) {
+ console.error("Test failed:", error);
+ }
+ finally {
+ process.exit(0);
+ }
+ }
+ testMultiLevelMemory();
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "cozo-memory",
- "version": "1.1.0",
+ "version": "1.1.2",
  "mcpName": "io.github.tobs-code/cozo-memory",
  "description": "Local-first persistent memory system for AI agents with hybrid search, graph reasoning, and MCP integration",
  "main": "dist/index.js",