@psiclawops/hypermem 0.8.5 → 0.9.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +26 -0
- package/INSTALL.md +132 -9
- package/README.md +119 -272
- package/bench/README.md +42 -0
- package/bench/data-access-bench.mjs +380 -0
- package/bin/hypermem-bench.mjs +2 -0
- package/bin/hypermem-doctor.mjs +412 -0
- package/bin/hypermem-model-audit.mjs +339 -0
- package/bin/hypermem-status.mjs +491 -70
- package/dist/adaptive-lifecycle.d.ts +81 -0
- package/dist/adaptive-lifecycle.d.ts.map +1 -0
- package/dist/adaptive-lifecycle.js +190 -0
- package/dist/budget-policy.d.ts +1 -1
- package/dist/budget-policy.d.ts.map +1 -1
- package/dist/budget-policy.js +10 -5
- package/dist/cache.d.ts +1 -0
- package/dist/cache.d.ts.map +1 -1
- package/dist/cache.js +2 -0
- package/dist/composition-snapshot-integrity.d.ts +36 -0
- package/dist/composition-snapshot-integrity.d.ts.map +1 -0
- package/dist/composition-snapshot-integrity.js +131 -0
- package/dist/composition-snapshot-runtime.d.ts +59 -0
- package/dist/composition-snapshot-runtime.d.ts.map +1 -0
- package/dist/composition-snapshot-runtime.js +250 -0
- package/dist/composition-snapshot-store.d.ts +44 -0
- package/dist/composition-snapshot-store.d.ts.map +1 -0
- package/dist/composition-snapshot-store.js +117 -0
- package/dist/compositor.d.ts +125 -1
- package/dist/compositor.d.ts.map +1 -1
- package/dist/compositor.js +692 -44
- package/dist/doc-chunk-store.d.ts +19 -0
- package/dist/doc-chunk-store.d.ts.map +1 -1
- package/dist/doc-chunk-store.js +56 -6
- package/dist/hybrid-retrieval.d.ts +38 -0
- package/dist/hybrid-retrieval.d.ts.map +1 -1
- package/dist/hybrid-retrieval.js +86 -1
- package/dist/index.d.ts +12 -3
- package/dist/index.d.ts.map +1 -1
- package/dist/index.js +28 -2
- package/dist/knowledge-store.d.ts +4 -1
- package/dist/knowledge-store.d.ts.map +1 -1
- package/dist/knowledge-store.js +27 -4
- package/dist/library-schema.d.ts +12 -8
- package/dist/library-schema.d.ts.map +1 -1
- package/dist/library-schema.js +22 -8
- package/dist/message-store.d.ts.map +1 -1
- package/dist/message-store.js +7 -3
- package/dist/metrics-dashboard.d.ts +18 -1
- package/dist/metrics-dashboard.d.ts.map +1 -1
- package/dist/metrics-dashboard.js +52 -14
- package/dist/reranker.d.ts +1 -1
- package/dist/reranker.js +2 -2
- package/dist/schema.d.ts +1 -1
- package/dist/schema.d.ts.map +1 -1
- package/dist/schema.js +28 -1
- package/dist/seed.d.ts.map +1 -1
- package/dist/seed.js +2 -0
- package/dist/topic-synthesizer.d.ts +20 -0
- package/dist/topic-synthesizer.d.ts.map +1 -1
- package/dist/topic-synthesizer.js +113 -3
- package/dist/trigger-registry.d.ts.map +1 -1
- package/dist/trigger-registry.js +10 -2
- package/dist/types.d.ts +271 -1
- package/dist/types.d.ts.map +1 -1
- package/dist/version.d.ts +7 -7
- package/dist/version.d.ts.map +1 -1
- package/dist/version.js +17 -7
- package/docs/DIAGNOSTICS.md +205 -0
- package/docs/INTEGRATION_VALIDATION.md +186 -0
- package/docs/MIGRATION.md +9 -6
- package/docs/MIGRATION_GUIDE.md +125 -101
- package/docs/ROADMAP.md +238 -20
- package/docs/TUNING.md +19 -5
- package/install.sh +152 -401
- package/memory-plugin/LICENSE +190 -0
- package/memory-plugin/README.md +20 -0
- package/memory-plugin/dist/index.js +50 -0
- package/memory-plugin/package.json +2 -2
- package/package.json +18 -4
- package/plugin/LICENSE +190 -0
- package/plugin/README.md +20 -0
- package/plugin/dist/index.d.ts +29 -0
- package/plugin/dist/index.d.ts.map +1 -1
- package/plugin/dist/index.js +288 -23
- package/plugin/dist/index.js.map +1 -1
- package/plugin/package.json +2 -2
- package/scripts/install-runtime.mjs +12 -1
package/docs/MIGRATION_GUIDE.md
CHANGED
@@ -45,13 +45,15 @@ After any migration, the background indexer picks up imported content and builds
 
 Run this before any migration path.
 
+> **Script status:** this guide contains source-specific adapter snippets. The repo does not ship a unified migration dispatcher yet. If a section says "save as", create that file in a temporary migration directory or repo checkout, run the dry-run first, then run with `--apply`.
+
 **1. Confirm hypermem is installed and has initialized:**
 ```bash
 openclaw plugins list | grep hypermem
 ls ~/.openclaw/hypermem/library.db # must exist — send one message first if not
 ```
 
-If `library.db` doesn't exist yet, start the gateway with hypermem enabled, send one message to any agent, then come back. See [INSTALL.md](../INSTALL.md) for the
+If `library.db` doesn't exist yet, start the gateway with hypermem enabled, send one message to any agent, then come back. See [INSTALL.md](../INSTALL.md) for the npm-first staging, wiring, restart, and verification path, then continue.
 
 **2. Back up your existing data:**
 ```bash
@@ -74,7 +76,7 @@ cp -r ~/.cognee ~/.cognee.pre-hypermem 2>/dev/null || true
 
 ## Fresh install
 
-Nothing to migrate. Follow the [INSTALL.md](../INSTALL.md) guide (
+Nothing to migrate. Follow the [INSTALL.md](../INSTALL.md) npm-first guide (stage runtime, wire plugins, restart, verify runtime active). hypermem begins building context from your first conversation. The background indexer starts automatically.
 
 ---
 
@@ -90,32 +92,39 @@ OpenClaw's built-in memory system stores facts, preferences, and context entries
 | Preferences | `facts` table with `domain: preference` |
 | Context entries | `facts` table with `domain: general` |
 
-**Step 1:
+**Step 1: Inspect the source schema**
 ```bash
-
+sqlite3 ~/.openclaw/memory.db '.tables'
+sqlite3 ~/.openclaw/memory.db 'SELECT name, sql FROM sqlite_master WHERE type = "table";'
 ```
 
-
+OpenClaw's legacy memory schema has varied across releases and plugins. Do not run a blind importer until you know which table names and content columns your deployment uses.
 
-**Step 2:
-
-
+**Step 2: Export facts to normalized JSON**
+
+Create a JSON array shaped like this:
+
+```json
+[
+  {
+    "agentId": "main",
+    "content": "User prefers concise release status updates",
+    "domain": "preference",
+    "confidence": 0.85,
+    "source": "openclaw-memory-db"
+  }
+]
 ```
 
-**Step 3:
+**Step 3: Import through the custom importer pattern**
+
+Use the [custom system](#from-a-custom-system) importer below. Start with a dry run and import only after the counts and sample content look right.
+
+**Step 4: Restart**
 ```bash
 openclaw gateway restart
 ```
 
-**Options:**
-```
---agent <id>           Agent to import facts for (default: main)
---memory-db <path>     Path to memory.db (default: ~/.openclaw/memory.db)
---hypermem-dir <path>  hypermem data directory (default: ~/.openclaw/hypermem)
---limit <n>            Import only first N facts (useful for testing)
---apply                Actually write data (default is dry-run)
-```
-
 > **Note:** The built-in memory.db is not agent-scoped. All entries go to the agent you specify with `--agent`. If multiple agents share the same memory.db, run the script once per agent.
 
 ---
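Editor's note: the normalized JSON export added in the hunk above leaves the row-to-fact mapping to the reader. A minimal sketch of that mapping, assuming hypothetical legacy column names (`content`, `category`) that you must first verify with the `sqlite3` inspection commands the guide adds:

```javascript
// Hypothetical adapter: shape legacy memory.db rows into the normalized
// fact array the guide expects. The input column names (`content`,
// `category`) are assumptions — confirm them against your actual schema.
function normalizeLegacyRows(rows, agentId = 'main') {
  return rows
    // Drop rows with no usable text content.
    .filter((row) => typeof row.content === 'string' && row.content.trim().length > 0)
    .map((row) => ({
      agentId,
      content: row.content.trim(),
      // Map legacy categories onto hypermem domains; default to general.
      domain: row.category === 'preference' ? 'preference' : 'general',
      confidence: 0.85,
      source: 'openclaw-memory-db',
    }));
}
```

Feed the resulting array to the custom-system importer; the dry-run counts should match the row counts you saw during schema inspection.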
@@ -135,7 +144,7 @@ QMD is the OpenClaw local-first memory sidecar — it runs behind `plugins.slots
 | BM25 + vector hybrid search | FTS5 + nomic-embed-text hybrid search |
 | Extra indexed paths (`memory.qmd.paths`) | Not yet supported — see capability gaps below |
 | Session transcript indexing | Covered natively — all message history is indexed |
-| Reranking (QMD cross-encoder) |
+| Reranking (QMD cross-encoder) | Supported as optional post-fusion reranking when configured; otherwise hypermem falls back to RRF/MMR ordering |
 
 **Pre-flight:**
 
@@ -184,7 +193,7 @@ if (!extraDir) {
   process.exit(1);
 }
 
-const hm = await HyperMem.create({
+const hm = await HyperMem.create({ dataDir: join(homedir(), '.openclaw/hypermem') });
 let imported = 0;
 const files = [];
 for await (const f of glob('**/*.md', { cwd: extraDir })) files.push(f);
@@ -198,7 +207,8 @@ for (const file of files) {
   } else {
     await hm.addFact(agentId, chunk.trim(), {
       domain: 'general',
-
+      sourceType: 'migration',
+      sourceRef: `qmd-extra:${file}`,
       confidence: 0.8,
     });
   }
@@ -214,8 +224,8 @@ if (dryRun) console.log('Run with --apply to write data.');
 
 | QMD feature | Status in hypermem |
 |---|---|
-| Reranking (cross-encoder) |
-| Extra path indexing |
+| Reranking (cross-encoder) | Supported as optional reranker. Validate with compose diagnostics; fallback is RRF/MMR ordering. |
+| Extra path indexing | No automatic QMD path watcher. Use the manual fact import workaround. |
 | Session transcript search | Covered natively. |
 | BM25 hybrid search | Covered — FTS5 + vector hybrid with configurable weights. |
 | Automatic fallback to builtin | Not applicable — hypermem does not fall back. |
@@ -234,31 +244,30 @@ ClawText stores full conversation history in `session-intelligence.db`. This scr
 | Identity anchors | Used to route messages to correct agent DB |
 | Optimization `.jsonl` logs | Not imported (operational data, not conversation history) |
 
-**Step 1:
+**Step 1: Inspect the source DB**
 ```bash
-
+sqlite3 ~/.openclaw/workspace/.clawtext/session-intelligence.db '.tables'
+sqlite3 ~/.openclaw/workspace/.clawtext/session-intelligence.db 'SELECT name, sql FROM sqlite_master WHERE type = "table";'
 ```
 
-**Step 2:
-
-
+**Step 2: Export conversations to normalized JSONL**
+
+Recommended shape, one record per line:
+
+```json
+{"agentId":"main","sessionKey":"clawtext:session-1","role":"user","content":"...","createdAt":"2026-04-24T00:00:00Z"}
 ```
 
-**Step 3:
+**Step 3: Import through the custom importer pattern**
+
+Use `recordUserMessage()` and `recordAssistantMessage()` from the [custom system](#from-a-custom-system) section. Dry-run by printing counts per agent/session before writing.
+
+**Step 4: Restart**
 ```bash
 openclaw gateway restart
 ```
 
-
-```
---apply               Actually write data (default is dry-run)
---limit <n>           Import only first N conversations
---clawtext-db <path>  Path to session-intelligence.db
-                      (default: ~/.openclaw/workspace/.clawtext/session-intelligence.db)
---hypermem-dir <path> hypermem data directory (default: ~/.openclaw/hypermem)
-```
-
-Conversations without a detectable agent identity are routed to `main`.
+Conversations without a detectable agent identity should be routed to `main` or reviewed manually before import.
 
 ---
 
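Editor's note: the JSONL record shape introduced in the hunk above can be produced with a small shaping helper. A sketch, assuming hypothetical export row fields (`session_id`, `sender`, `body`, `ts`) that you must check against the actual `session-intelligence.db` schema:

```javascript
// Hypothetical adapter: convert one exported ClawText message row into the
// JSONL record shape the guide recommends. Input field names are
// assumptions — verify them with the sqlite3 inspection step first.
function toJsonlRecord(row, agentId = 'main') {
  return JSON.stringify({
    agentId,
    sessionKey: `clawtext:${row.session_id}`,
    // Anything not explicitly a user turn is treated as assistant output.
    role: row.sender === 'user' ? 'user' : 'assistant',
    content: row.body,
    createdAt: new Date(row.ts).toISOString(),
  });
}
```

Write one record per line; the importer can then stream the file and call `recordUserMessage()` / `recordAssistantMessage()` per record.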
@@ -320,7 +329,7 @@ const exportPath = process.argv[3] ?? 'cognee_export.json';
 const dryRun = !process.argv.includes('--apply');
 
 const entries = JSON.parse(readFileSync(exportPath, 'utf-8'));
-const hm = await HyperMem.create({
+const hm = await HyperMem.create({ dataDir: join(homedir(), '.openclaw/hypermem') });
 
 let imported = 0;
 let skipped = 0;
@@ -338,7 +347,8 @@ for (const entry of entries) {
 
   await hm.addFact(agentId, text, {
     domain: entry.type ?? 'general',
-
+    sourceType: 'migration',
+    sourceRef: entry.source ?? 'cognee-export',
     confidence: 0.85,
   });
   imported++;
@@ -450,12 +460,12 @@ print(f"Exported {result['count']} memories")
 
 **Step 2: Dry run**
 ```bash
-node
+node ./migrate-mem0.mjs main mem0_export.json
 ```
 
-Save this as `
+Save this as `migrate-mem0.mjs` in your migration working directory:
 ```js
-//
+// migrate-mem0.mjs
 import { HyperMem } from '@psiclawops/hypermem';
 import { readFileSync } from 'node:fs';
 import { homedir } from 'node:os';
@@ -469,7 +479,7 @@ const raw = JSON.parse(readFileSync(exportPath, 'utf-8'));
 // handle both export job format {memories: [...]} and get_all format {results: [...]}
 const entries = raw.memories ?? raw.results ?? raw;
 
-const hm = await HyperMem.create({
+const hm = await HyperMem.create({ dataDir: join(homedir(), '.openclaw/hypermem') });
 let imported = 0, skipped = 0;
 
 for (const entry of entries) {
@@ -487,9 +497,9 @@ for (const entry of entries) {
 
   await hm.addFact(agentId, text, {
     domain,
-
+    sourceType: 'migration',
+    sourceRef: entry.id ? `mem0:${entry.id}` : 'mem0-export',
     confidence: 0.9,
-    createdAt: entry.created_at,
   });
   imported++;
 }
@@ -500,7 +510,7 @@ if (dryRun) console.log('Run with --apply to write data.');
 
 **Step 3: Import**
 ```bash
-node
+node ./migrate-mem0.mjs main mem0_export.json --apply
 ```
 
 **Step 4: Restart**
@@ -585,9 +595,9 @@ print(f"Exported {len(export['sessions'])} sessions, {len(export['facts'])} fact
 
 **Step 2: Import**
 
-Save as `
+Save as `migrate-zep.mjs` in your migration working directory:
 ```js
-//
+// migrate-zep.mjs
 import { HyperMem } from '@psiclawops/hypermem';
 import { readFileSync } from 'node:fs';
 import { homedir } from 'node:os';
@@ -598,7 +608,7 @@ const exportPath = process.argv[3] ?? 'zep_export.json';
 const dryRun = !process.argv.includes('--apply');
 
 const { sessions = [], facts = [] } = JSON.parse(readFileSync(exportPath, 'utf-8'));
-const hm = await HyperMem.create({
+const hm = await HyperMem.create({ dataDir: join(homedir(), '.openclaw/hypermem') });
 
 let msgCount = 0, factCount = 0;
 
@@ -626,9 +636,9 @@ for (const fact of facts) {
   if (dryRun) { console.log(`[dry-run] fact: ${fact.text.slice(0, 80)}`); factCount++; continue; }
   await hm.addFact(agentId, fact.text, {
     domain: 'general',
-
+    sourceType: 'migration',
+    sourceRef: fact.uuid ? `zep:${fact.uuid}` : 'zep-export',
     confidence: 0.85,
-    createdAt: fact.created_at,
   });
   factCount++;
 }
@@ -639,10 +649,10 @@ if (dryRun) console.log('Run with --apply to write data.');
 
 ```bash
 # Dry run
-node
+node ./migrate-zep.mjs main zep_export.json
 
 # Apply
-node
+node ./migrate-zep.mjs main zep_export.json --apply
 
 openclaw gateway restart
 ```
@@ -702,9 +712,9 @@ print(f"Exported {len(export['conclusions'])} conclusions")
 
 **Step 2: Import conclusions as facts**
 
-Save as `
+Save as `migrate-honcho.mjs` in your migration working directory:
 ```js
-//
+// migrate-honcho.mjs
 import { HyperMem } from '@psiclawops/hypermem';
 import { readFileSync } from 'node:fs';
 import { homedir } from 'node:os';
@@ -715,7 +725,7 @@ const exportPath = process.argv[3] ?? 'honcho_export.json';
 const dryRun = !process.argv.includes('--apply');
 
 const { conclusions = [] } = JSON.parse(readFileSync(exportPath, 'utf-8'));
-const hm = await HyperMem.create({
+const hm = await HyperMem.create({ dataDir: join(homedir(), '.openclaw/hypermem') });
 let imported = 0, skipped = 0;
 
 for (const c of conclusions) {
@@ -724,9 +734,9 @@ for (const c of conclusions) {
   if (dryRun) { console.log(`[dry-run] conclusion: ${text.slice(0, 80)}`); imported++; continue; }
   await hm.addFact(agentId, text, {
     domain: 'general',
-
+    sourceType: 'migration',
+    sourceRef: c.id ? `honcho:${c.id}` : 'honcho-export',
     confidence: 0.9,
-    createdAt: c.created_at,
   });
   imported++;
 }
@@ -737,10 +747,10 @@ if (dryRun) console.log('Run with --apply to write data.');
 
 ```bash
 # Dry run
-node
+node ./migrate-honcho.mjs main honcho_export.json
 
 # Apply
-node
+node ./migrate-honcho.mjs main honcho_export.json --apply
 ```
 
 **Step 3: Uninstall Honcho plugin and enable hypermem**
@@ -811,9 +821,9 @@ Install lancedb if needed: `pip install lancedb`
 
 **Step 3: Import**
 
-Save as `
+Save as `migrate-lancedb.mjs` in your migration working directory:
 ```js
-//
+// migrate-lancedb.mjs
 import { HyperMem } from '@psiclawops/hypermem';
 import { readFileSync } from 'node:fs';
 import { homedir } from 'node:os';
@@ -825,7 +835,7 @@ const exportPath = process.argv.find(a => a.endsWith('.json')) ?? 'lancedb_expor
 const dryRun = !process.argv.includes('--apply');
 
 const entries = JSON.parse(readFileSync(exportPath, 'utf-8'));
-const hm = await HyperMem.create({
+const hm = await HyperMem.create({ dataDir: join(homedir(), '.openclaw/hypermem') });
 let imported = 0, skipped = 0;
 
 for (const entry of entries) {
@@ -835,7 +845,8 @@ for (const entry of entries) {
   if (dryRun) { console.log(`[dry-run] [${agentId}] ${text.slice(0, 80)}`); imported++; continue; }
   await hm.addFact(agentId, text, {
     domain: 'general',
-
+    sourceType: 'migration',
+    sourceRef: entry.id ? `lancedb:${entry.id}` : 'lancedb-export',
     confidence: 0.85,
   });
   imported++;
@@ -847,13 +858,13 @@ if (dryRun) console.log('Run with --apply to write data.');
 
 ```bash
 # Dry run (all agents from export)
-node
+node ./migrate-lancedb.mjs main lancedb_export.json
 
 # Or target a specific agent
-node
+node ./migrate-lancedb.mjs <agent-id> lancedb_export.json
 
 # Apply
-node
+node ./migrate-lancedb.mjs main lancedb_export.json --apply
 ```
 
 **Step 4: Disable memory-lancedb and enable hypermem**
@@ -875,38 +886,29 @@ If your agents use the standard OpenClaw MEMORY.md + daily checkpoint pattern (`
 
 > **If you are coming from QMD**, use the [QMD path](#from-qmd) instead — it covers MEMORY.md files and handles the slot change correctly.
 
-**Step 1:
-```bash
-node scripts/migrate-memory-md.mjs
-```
+**Step 1: Decide whether you need an import**
 
-
+Current HyperMem bootstraps workspace `MEMORY.md` and daily memory files during normal context assembly. If your files are still present in the agent workspace, you usually do **not** need a data import. Enable HyperMem and verify recall first.
 
-**Step 2:
-```bash
-node scripts/migrate-memory-md.mjs --apply
-```
+**Step 2: If you need durable fact rows, export daily entries**
 
-
-
-
+Create a normalized JSON array from substantive daily-memory bullets only. Skip pointer-only `MEMORY.md` index lines and `memory_search(...)` references.
+
+```json
+[
+  {"agentId":"forge","content":"HyperMem 0.9.0 release path requires npm-first staging validation","domain":"operations","confidence":0.8}
+]
 ```
 
-**Step 3:
+**Step 3: Import through the custom importer pattern**
+
+Use the [custom system](#from-a-custom-system) fact importer. Start with one agent and a small sample before bulk import.
+
+**Step 4: Restart**
 ```bash
 openclaw gateway restart
 ```
 
-**Options:**
-```
---agent <id>             Only import for this agent (default: all detected)
---workspace-root <path>  Scan workspace directories under this path
-                         (default: ~/.openclaw)
---hypermem-dir <path>    hypermem data directory (default: ~/.openclaw/hypermem)
---limit <n>              Import only first N facts
---apply                  Actually write data (default is dry-run)
-```
-
 **Parsing rules:** Imports bullet list items from daily files (`memory/YYYY-MM-DD.md`) only. `MEMORY.md` index files are intentionally skipped — they're pointers, not content. Lines under 40 characters, `→ memory_search(...)` pointers, and code-like lines are also skipped.
 
 ---
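Editor's note: the parsing rules stated at the end of the hunk above can be sketched as a line filter. The 40-character floor and the `memory_search(...)` pointer skip come straight from the guide; the code-like-line heuristic is an assumption:

```javascript
// Sketch of the stated parsing rules for daily memory files
// (memory/YYYY-MM-DD.md). Bullets only, short lines skipped, pointer
// lines skipped; the code-like regex is an editor's assumption.
function extractDailyFacts(markdown) {
  return markdown
    .split('\n')
    .filter((line) => /^\s*[-*]\s+/.test(line))          // bullet list items only
    .map((line) => line.replace(/^\s*[-*]\s+/, '').trim())
    .filter((text) => text.length >= 40)                  // skip lines under 40 chars
    .filter((text) => !text.includes('memory_search('))   // skip search-pointer lines
    .filter((text) => !/[{};=]/.test(text));              // skip code-like lines (heuristic)
}
```

Running this over a daily file before export gives you exactly the candidate facts the importer would see, which makes the dry-run counts easy to sanity-check.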
@@ -919,11 +921,11 @@ Use hypermem's programmatic API to import directly.
 ```js
 import { HyperMem } from '@psiclawops/hypermem';
 
-const hm = await HyperMem.create({
+const hm = await HyperMem.create({ dataDir: `${process.env.HOME}/.openclaw/hypermem` });
 
 await hm.addFact('your-agent-id', 'User prefers dark mode in all UIs', {
   domain: 'preference',
-
+  sourceType: 'migration',
   confidence: 0.9,
 });
 ```
@@ -940,7 +942,7 @@ await hm.recordAssistantMessage('your-agent-id', 'session-key:your-session', {
 });
 ```
 
-For bulk imports, write a
+For bulk imports, write a one-off importer using the API examples above. If you need to preserve original timestamps exactly, use a direct SQLite importer against the current schema after taking a backup; `addFact()` stamps imported facts with current `created_at` and tracks source through `sourceType`/`sourceRef`.
 
 After import:
 ```bash
@@ -951,15 +953,22 @@ openclaw gateway restart
 
 ## Enabling hypermem
 
-Once your data is imported, enable both plugins
+Once your data is imported, enable both plugins. Merge existing config arrays instead of overwriting them.
 
 ```bash
+openclaw config get plugins.load.paths
+openclaw config get plugins.allow
+
+HYPERMEM_PATHS="[\"${HOME}/.openclaw/plugins/hypermem/plugin\",\"${HOME}/.openclaw/plugins/hypermem/memory-plugin\"]"
+openclaw config set plugins.load.paths "$HYPERMEM_PATHS" --strict-json
 openclaw config set plugins.slots.contextEngine hypercompositor
 openclaw config set plugins.slots.memory hypermem
 openclaw gateway restart
 ```
 
-If
+If `plugins.allow` is already a non-empty array, append `hypercompositor` and `hypermem`. If it is unset or null, do not create a restrictive allowlist just for HyperMem.
+
+If you were on memory-core, QMD, or memory-lancedb, the `slots.memory hypermem` line above replaces the old memory provider. No separate disable step is needed.
 
 ---
 
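Editor's note: the "merge existing config arrays instead of overwriting them" instruction in the hunk above is easy to get wrong. A sketch of the merge, as a pure function over the JSON array that `openclaw config get plugins.load.paths` would print (the command names come from the guide; the merge logic itself is an editor's sketch):

```javascript
// Sketch: merge hypermem plugin paths into an existing plugins.load.paths
// array rather than replacing it. `existing` is the parsed JSON array from
// the current config (may be null/undefined if unset).
function mergePluginPaths(existing, home) {
  const hypermemPaths = [
    `${home}/.openclaw/plugins/hypermem/plugin`,
    `${home}/.openclaw/plugins/hypermem/memory-plugin`,
  ];
  // Deduplicate while preserving the order of existing entries.
  return [...new Set([...(existing ?? []), ...hypermemPaths])];
}
```

`JSON.stringify` the result and pass it to `config set plugins.load.paths ... --strict-json` so other plugins' load paths survive the change.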
@@ -994,6 +1003,21 @@ for (const agent of fs.readdirSync(agentsDir)) {
 "
 ```
 
+**Verify runtime wiring and diagnostics:**
+```bash
+openclaw plugins list | grep -E 'hypercompositor|hypermem'
+hypermem-status --health
+hypermem-model-audit --strict
+openclaw logs --limit 100 | grep -E 'hypermem:compose|falling back to default engine'
+```
+
+Pass criteria:
+- both `hypercompositor` and `hypermem` are loaded
+- `hypermem-status --health` can read the data directory
+- model audit reports the active provider/model/dimensions without silent defaults
+- logs show `[hypermem:compose]` after a test message
+- logs do **not** show fallback to `legacy` context engine
+
 **Ask your agent to recall something** from the imported history. If recall seems patchy in the first session, the background indexer is still building embeddings — send one message and wait one turn. It runs automatically after ingest.
 
 ---
@@ -1024,11 +1048,11 @@ Original data (memory.db, ClawText database, QMD collections, Cognee data direct
 
 **"library.db not found" during migration**
 
-hypermem hasn't initialized yet. Start the gateway with the plugin enabled, send one message, then re-run:
+hypermem hasn't initialized yet. Start the gateway with the plugin enabled, send one message, then re-run your dry-run importer:
 ```bash
 openclaw gateway restart
 # wait a few seconds, send one message
-node
+node ./your-migration-importer.mjs main export.json
 ```
 
 **Facts imported but agent doesn't recall them**
@@ -1039,9 +1063,9 @@ openclaw gateway restart
 ```
 Send one message and wait one turn — the indexer runs after the first ingest.
 
-**Duplicate facts after re-running
+**Duplicate facts after re-running an importer**
 
-
+Example snippets are intentionally simple and may not deduplicate across changed source IDs. Keep dry-run output, import logs, and source export IDs. If you need repeatable bulk migration, add a stable source reference such as `sourceRef` or include an external-ID marker in the fact content before re-running with `--apply`.
 
 **Agent routed to wrong database**
 
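Editor's note: the duplicate-facts advice in the last hunk (track a stable `sourceRef` per imported entry) can be sketched as a pre-import filter. The seen-set would normally be loaded from a previous run's import log, which is an assumption about your workflow:

```javascript
// Sketch: skip entries whose sourceRef was already imported, so re-running
// an importer with --apply does not create duplicate facts. `seenRefs`
// is assumed to come from a prior run's import log.
function dedupeBySourceRef(entries, seenRefs = new Set()) {
  const fresh = [];
  for (const entry of entries) {
    // Fall back to the content itself when an export has no stable ID.
    const ref = entry.sourceRef ?? JSON.stringify(entry.content);
    if (seenRefs.has(ref)) continue; // already imported in a previous run
    seenRefs.add(ref);
    fresh.push(entry);
  }
  return fresh;
}
```

Run your export through this before calling `addFact()` in the `--apply` pass, and persist the updated seen-set alongside your dry-run output.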