openclaw-memory-decay 0.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md ADDED
@@ -0,0 +1,323 @@
1
+ ![Memory Decay Banner](assets/banner.png)
2
+
3
+ # openclaw-memory-decay
4
+
5
+ **Give your OpenClaw a memory — one that forgets.**
6
+
7
+ An OpenClaw plugin that replaces flat file-based memory with a human-like decay system. Important things stick. Noise fades. Your agent remembers what matters without drowning in everything else.
8
+
9
+ Built on [memory-decay-core](https://github.com/memory-decay/memory-decay-core) — a mathematical memory model where activation decays over time, stability grows through recall, and retrieval consolidation reinforces what you actually use.
10
+
11
+ ## Why
12
+
13
+ AI agents forget everything between sessions. The usual fix: dump everything into files and load it all back.
14
+
15
+ That works. Until it doesn't.
16
+
17
+ - 50 lines of memory → great. 500 lines → context pollution. 5000 lines → the agent stops attending to what matters.
18
+ - Everything is stored equally. A one-off joke about coffee has the same weight as your API architecture decisions.
19
+ - Retrieval is binary: either it's in the file or it isn't. No notion of "I think I remember this, but I'm not sure."
20
+
21
+ `openclaw-memory-decay` solves this with **decay**:
22
+
23
+ - Memories have an activation score that decreases over time — like human forgetting
24
+ - Important memories decay slower; trivial ones fade fast
25
+ - When your agent recalls a memory, it gets reinforced — the testing effect
26
+ - Search results come with freshness indicators: `fresh`, `normal`, `stale`
27
+ - The result: your agent naturally retains what matters and loses what doesn't
28
+
29
+ ```
30
+ Activation
31
+ 1.0 ┤●
32
+ │ ●● ● (reinforced — recalled)
33
+ 0.8 ┤ ●● ●●●
34
+ │ ●●●● ●●
35
+ 0.6 ┤ ●●●●●● ●●●●
36
+ │ ●●●●●● ●●●●
37
+ 0.4 ┤ ●●●●●●●●●●●
38
+ │ ▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴▴
39
+ 0.2 ┤ (unreinforced — fading)
40
+
41
+ 0.0 └─────────────────────────────────────────── Time
42
+ ● saved at importance 0.9 ▴ saved at importance 0.3
43
+ ```
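The curves above can be sketched in code. The formulas below are illustrative only (the `Memory` shape and the `decay` / `reinforce` functions are hypothetical, not memory-decay-core's actual model), but they capture the mechanic: importance and stability slow decay, and each recall resets activation and hardens stability.

```typescript
// Hypothetical sketch of the decay idea (not memory-decay-core's actual math).
interface Memory {
  activation: number; // 0..1, how retrievable the memory currently is
  importance: number; // 0..1, set at store time
  stability: number;  // grows with each recall, slows future decay
}

function decay(m: Memory, elapsedTicks: number): Memory {
  // Higher importance and stability mean a smaller decay rate.
  const rate = (1 - m.importance) / (1 + m.stability);
  return { ...m, activation: m.activation * Math.exp(-rate * elapsedTicks) };
}

function reinforce(m: Memory): Memory {
  // The "testing effect": recall restores activation and increases stability.
  return { ...m, activation: 1.0, stability: m.stability + 1 };
}

const trivial = decay({ activation: 1, importance: 0.3, stability: 0 }, 10);
const crucial = decay({ activation: 1, importance: 0.9, stability: 0 }, 10);
// After the same elapsed time, crucial retains far more activation than trivial.
```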
44
+
45
+ ## Features
46
+
47
+ - **Decay-aware search** — retrieval scores blend semantic similarity with activation and reinforcement history
48
+ - **Automatic noise cleanup** — low-importance memories decay naturally; no manual pruning
49
+ - **Retrieval consolidation** — memories get stronger every time they're recalled, modeling the testing effect
50
+ - **Category-aware storing** — the agent picks the right category (`preference`, `decision`, `fact`, `episode`) with calibrated importance (0.3–1.0), not a flat 0.8 for everything
51
+ - **Proactive agent saves** — the agent stores preferences, decisions, facts, and episodes without being asked
52
+ - **Freshness indicators** — search results include `fresh` / `normal` / `stale` so the agent can judge reliability
53
+ - **Dual-score model** — storage score (can it be found?) and retrieval score (how easily?) are tracked separately
54
+ - **`/remember` skill** — users can explicitly ask the agent to remember something
55
+ - **Markdown migration** — imports existing `~/.openclaw/workspace/memory/` files on first run
56
+
57
+ ## Quick Start
58
+
59
+ ```bash
60
+ # 1. Install memory-decay-core (the backend engine)
61
+ pip install memory-decay
62
+
63
+ # 2. Install this plugin from npm
64
+ openclaw plugins install openclaw-memory-decay
65
+
66
+ # 3. (Optional but recommended) Restrict auto-load to trusted plugins only
67
+ openclaw config set plugins.allow '["memory-decay"]'
68
+
69
+ # 4. Restart the gateway
70
+ openclaw gateway restart
71
+ ```
72
+
73
+ > **Note:** The install and allow-list steps succeed silently if the plugin is already loaded; use `openclaw plugins list` to confirm that the status is `loaded` and the origin is `config`.
74
+
75
+ ### Prerequisites
76
+
77
+ - [OpenClaw](https://openclaw.ai) installed globally
78
+ - Python 3.10+
79
+ - `memory-decay` Python package (`pip install memory-decay`)
80
+
81
+ ## Configuration
82
+
83
+ Add to `~/.openclaw/openclaw.json` under `plugins.entries.memory-decay.config`:
84
+
85
+ ```json
86
+ {
87
+ "plugins": {
88
+ "entries": {
89
+ "memory-decay": {
90
+ "enabled": true,
91
+ "config": {
92
+ "dbPath": "~/.openclaw/memory-decay-data/memories.db",
93
+ "serverPort": 8100,
94
+ "autoSave": false
95
+ }
96
+ }
97
+ }
98
+ }
99
+ }
100
+ ```
101
+
102
+ | Option | Default | Description |
103
+ |--------|---------|-------------|
104
+ | `serverPort` | `8100` | Port for the memory-decay HTTP server |
105
+ | `memoryDecayPath` | (auto) | Path to memory-decay-core. Auto-detected from the Python environment at install time if not set |
106
+ | `pythonPath` | `python3` | Path to Python interpreter (use your venv) |
107
+ | `dbPath` | `~/.openclaw/memory-decay-data/memories.db` | SQLite database location |
108
+ | `autoSave` | `true` | Auto-save every conversation turn at low importance. Set `false` to let the agent decide what to save |
109
+ | `embeddingProvider` | `local` | Embedding provider: `local`, `openai`, or `gemini` |
110
+ | `embeddingApiKey` | (auto) | API key for embedding provider. Falls back to env vars (see below) |
111
+ | `embeddingModel` | (auto) | Specific embedding model name |
112
+ | `embeddingDim` | (auto) | Embedding dimension (auto-detected from provider) |
113
+
114
+ ### API Key Configuration
115
+
116
+ The API key for embedding providers is resolved in this order:
117
+
118
+ 1. **Plugin config** — `embeddingApiKey` in `openclaw.json`
119
+ 2. **Generic env var** — `MD_EMBEDDING_API_KEY`
120
+ 3. **Provider-specific env var** — `OPENAI_API_KEY` (openai) or `GEMINI_API_KEY` / `GOOGLE_API_KEY` (gemini)
121
+
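The same resolution order, as a hedged sketch (the `resolveEmbeddingApiKey` helper is hypothetical; it just mirrors the documented precedence):

```typescript
// Illustrative only: config beats the generic env var, which beats the
// provider-specific env vars. "local" never needs a key.
function resolveEmbeddingApiKey(
  cfg: { embeddingApiKey?: string },
  provider: "local" | "openai" | "gemini",
  env: Record<string, string | undefined>,
): string | undefined {
  return (
    cfg.embeddingApiKey ??        // 1. plugin config
    env.MD_EMBEDDING_API_KEY ??   // 2. generic env var
    (provider === "openai"
      ? env.OPENAI_API_KEY        // 3. provider-specific env var
      : provider === "gemini"
        ? env.GEMINI_API_KEY ?? env.GOOGLE_API_KEY
        : undefined)
  );
}
```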
122
+ ```bash
123
+ # Example: use OpenAI embeddings with API key from environment
124
+ export OPENAI_API_KEY="sk-..."
125
+
126
+ # Or use the generic variable (works with any provider)
127
+ export MD_EMBEDDING_API_KEY="sk-..."
128
+ ```
129
+
130
+ When using the `local` provider (default), no API key is needed — embeddings are computed locally using sentence-transformers.
131
+
132
+ ### autoSave: true vs false
133
+
134
+ | Mode | Who stores | When | Importance |
135
+ |------|-----------|------|------------|
136
+ | `autoSave: true` | Plugin automatically | Every conversation turn | 0.3 (low) |
137
+ | `autoSave: false` | Agent decides | When something is worth remembering | 0.8 (high, set by agent) |
138
+
139
+ With `autoSave: false`, the agent uses `memory_store` proactively — storing facts, decisions, preferences, and important context. Noise stays out of the memory system entirely.
140
+
141
+ ## Memory Categories
142
+
143
+ The bootstrap prompt and skills guide the agent to pick the right category and importance:
144
+
145
+ | Category | When | Importance | Example |
146
+ |----------|------|------------|---------|
147
+ | `preference` | User's role, style, habits, likes/dislikes | 0.8–1.0 | "User prefers Korean for conversation, English for code" |
148
+ | `decision` | Why X was chosen, tradeoffs, rejected alternatives | 0.8–0.9 | "Chose SQLite over Postgres — single-node, no ops overhead" |
149
+ | `fact` | Technical facts, API behaviors, architecture | 0.7–0.9 | "Auth service returns inconsistent 4xx on token expiry" |
150
+ | `episode` | What was worked on, session context | 0.3–0.6 | "Finished migrating auth middleware" |
151
+
152
+ The agent stores proactively based on conversation triggers — it doesn't wait for `/remember`.
153
+
154
+ ## How It Works
155
+
156
+ ```
157
+ ┌──────────────┐ ┌──────────────────┐ ┌────────────────────┐
158
+ │ OpenClaw │ ◄─────► │ Plugin (TS) │ ◄─────► │ memory-decay-core │
159
+ │ Agent │ tools │ Hook handler │ HTTP │ (Python/FastAPI) │
160
+ │ │ │ │ :8100 │ │
161
+ └──────────────┘ └──────────────────┘ └────────────────────┘
162
+ │ │
163
+ │ session_end │ POST /auto-tick
164
+ │ ──────────────► │
165
+ │ │
166
+ │ message_received │ POST /store
167
+ │ (if autoSave) │
168
+ │ ──────────────► │
169
+ │ │
170
+ │ before_compaction │ POST /store
171
+ │ │
172
+ ```
173
+
174
+ The plugin manages the Python server lifecycle — starts with the gateway, stops on shutdown.
175
+
176
+ ### Memory Lifecycle
177
+
178
+ ```
179
+ Store ──► Activate ──► Decay ──► Search ──► Reinforce ──► Decay (slower)
180
+ │ │ │
181
+ │ │ └──► Stability increases
182
+ │ │
183
+ │ └──► Low importance fades fast
184
+ │ High importance fades slow
185
+
186
+ └──► Importance set by agent (0.8) or auto-save (0.3)
187
+ ```
188
+
189
+ 1. **Store** — memory enters with an activation of 1.0 and a set importance
190
+ 2. **Decay** — each tick, activation decreases based on importance and stability
191
+ 3. **Search** — semantic similarity × activation weighting × BM25 re-ranking
192
+ 4. **Reinforce** — recalled memories get boosted (testing effect), stability grows
193
+ 5. **Forget** — memories with very low activation become practically unretrievable
194
+
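Steps 3 and 4 can be sketched as follows. The function names and thresholds here are hypothetical, chosen for illustration rather than taken from the engine:

```typescript
// Illustrative blend of step 3: faded memories rank lower even when
// semantically similar (both inputs in 0..1).
function retrievalScore(similarity: number, activation: number): number {
  return similarity * activation;
}

// Illustrative mapping from storage score to the freshness indicator the
// search results expose; the real cutoffs live in memory-decay-core.
function toFreshness(storageScore: number): "fresh" | "normal" | "stale" {
  if (storageScore >= 0.7) return "fresh";
  if (storageScore >= 0.3) return "normal";
  return "stale";
}
```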
195
+ ## Skills
196
+
197
+ The plugin registers these skills:
198
+
199
+ | Skill | Trigger | Description |
200
+ |-------|---------|-------------|
201
+ | `/remember` | `/remember I prefer dark mode` | Save with correct category and calibrated importance |
202
+ | `/recall` | `/recall what did we decide about the API?` | Search memories with freshness-aware action guidance |
203
+ | `/forget` | `/forget the temp password` | Delete a specific memory |
204
+ | `/memory-status` | `/memory-status` | Show memory count, tick, and decay stats |
205
+ | `/migrate` | `/migrate` | Import Markdown files from `memory/` directory |
206
+
207
+ ## Troubleshooting
208
+
209
+ ### `plugins.allow is empty; discovered non-bundled plugins may auto-load`
210
+
211
+ This warning appears when `plugins.allow` is not set. While the plugin still loads (since it is explicitly configured in `plugins.entries`), it is good practice to restrict auto-load to trusted plugins only:
212
+
213
+ ```bash
214
+ openclaw config set plugins.allow '["memory-decay"]'
215
+ openclaw gateway restart
216
+ ```
217
+
218
+ ### `Cannot find module 'openclaw/plugin-sdk'`
219
+
220
+ ```bash
221
+ openclaw plugins install openclaw-memory-decay
222
+ ```
223
+
224
+ ### `Memory service not running`
225
+
226
+ ```bash
227
+ # Check if memory-decay is installed
228
+ pip show memory-decay
229
+
230
+ # Check server health
231
+ curl http://127.0.0.1:8100/health
232
+ ```
233
+
234
+ If `pip show memory-decay` returns nothing, install it:
235
+ ```bash
236
+ pip install memory-decay
237
+ ```
238
+
239
+ ### Plugin shows `error` status
240
+
241
+ ```bash
242
+ openclaw plugins doctor
243
+ ```
244
+
245
+ ## For AI Agents: Self-Installation Guide
246
+
247
+ > This section is written for LLM agents to follow autonomously. If you're a human, use the [Quick Start](#quick-start) section above.
248
+
249
+ ### Quick Install (one-shot)
250
+
251
+ ```bash
252
+ # 1. Install the backend engine (requires Python 3.10+)
253
+ pip install memory-decay
254
+
255
+ # 2. Verify
256
+ pip show memory-decay | head -3
257
+
258
+ # 3. Install plugin from npm
259
+ openclaw plugins install openclaw-memory-decay
260
+
261
+ # 4. Restart gateway
262
+ openclaw gateway restart
263
+
264
+ # 5. Verify plugin is loaded
265
+ openclaw plugins list
266
+ # Look for: memory-decay | loaded
267
+
268
+ # 6. Check server health
269
+ curl -s http://127.0.0.1:8100/health
270
+ # Expected: {"status":"ok","current_tick":0}
271
+ ```
272
+
273
+ ### Diagnosing Issues
274
+
275
+ **Plugin shows `error` status:**
276
+ ```bash
277
+ openclaw plugins doctor
278
+ # Check the error message for missing dependencies or path issues
279
+ ```
280
+
281
+ **`Cannot find module 'openclaw/plugin-sdk'`:**
282
+ ```bash
283
+ openclaw plugins install openclaw-memory-decay
284
+ ```
285
+
286
+ **`Memory service not running`:**
287
+ ```bash
288
+ # Check backend is installed
289
+ pip show memory-decay
290
+
291
+ # Check if gateway started it
292
+ curl -s http://127.0.0.1:8100/health
293
+
294
+ # If pip show returns nothing:
295
+ pip install memory-decay
296
+ openclaw gateway restart
297
+ ```
298
+
299
+ **Port conflict (8100 in use):**
300
+ ```jsonc
301
+ // In ~/.openclaw/openclaw.json, change:
302
+ "serverPort": 8101
303
+ ```
304
+
305
+ **Embedding errors:**
306
+ ```bash
307
+ # Switch to OpenAI if local embeddings fail
308
+ export OPENAI_API_KEY=your-key
309
+ export MD_EMBEDDING_PROVIDER=openai
310
+ openclaw gateway restart
311
+ ```
312
+
313
+ ### Key Files
314
+
315
+ | Path | Purpose |
316
+ |------|---------|
317
+ | `~/.openclaw/openclaw.json` | Plugin configuration |
318
+ | `~/.openclaw/memory-decay-data/memories.db` | Memory database |
319
+ | `~/.openclaw/workspace/memory/` | Legacy memory files (can migrate) |
320
+
321
+ ## License
322
+
323
+ MIT
package/package.json ADDED
@@ -0,0 +1,44 @@
1
+ {
2
+ "name": "openclaw-memory-decay",
3
+ "version": "0.1.2",
4
+ "type": "module",
5
+ "description": "OpenClaw memory plugin backed by memory-decay engine",
6
+ "main": "./src/index.js",
7
+ "types": "./src/index.d.ts",
8
+ "openclaw": {
9
+ "extensions": [
10
+ "./src/index.ts"
11
+ ]
12
+ },
13
+ "files": [
14
+ "src/",
15
+ "scripts/"
16
+ ],
17
+ "scripts": {
18
+ "build": "tsc",
19
+ "prepublishOnly": "npm run build",
20
+ "postinstall": "node scripts/detect-python.mjs",
21
+ "setup": "node scripts/link-sdk.mjs"
22
+ },
23
+ "dependencies": {
24
+ "@sinclair/typebox": "^0.31.0"
25
+ },
26
+ "peerDependencies": {
27
+ "openclaw": "*"
28
+ },
29
+ "devDependencies": {
30
+ "typescript": "^5.0.0"
31
+ },
32
+ "keywords": [
33
+ "openclaw",
34
+ "openclaw-plugin",
35
+ "memory",
36
+ "ai-agent",
37
+ "memory-decay"
38
+ ],
39
+ "license": "MIT",
40
+ "repository": {
41
+ "type": "git",
42
+ "url": "https://github.com/memory-decay/openclaw-memory-decay"
43
+ }
44
+ }
package/scripts/detect-python.mjs ADDED
@@ -0,0 +1,35 @@
1
+ #!/usr/bin/env node
2
+ import { execSync } from "node:child_process";
3
+ import { writeFileSync } from "node:fs";
4
+ import { resolve, dirname } from "node:path";
5
+ import { fileURLToPath } from "node:url";
6
+
7
+ const root = dirname(fileURLToPath(import.meta.url));
8
+ const isWin = process.platform === "win32";
9
+ const candidates = isWin ? ["python"] : ["python3", "python"];
10
+
11
+ const python = candidates.find((py) => {
12
+ try {
13
+ execSync(
14
+ `${py} -c "import memory_decay; import sqlite3; sqlite3.connect(':memory:').enable_load_extension(True)"`,
15
+ { stdio: "ignore" },
16
+ );
17
+ return true;
18
+ } catch {
19
+ return false;
20
+ }
21
+ });
22
+
23
+ if (!python) {
24
+ console.warn("[memory-decay] memory_decay not found — run: pip install memory-decay");
25
+ process.exit(0);
26
+ }
27
+
28
+ const pythonPath = execSync(isWin ? `where ${python}` : `which ${python}`, { encoding: "utf8" }).trim().split("\n")[0];
29
+ const memoryDecayPath = execSync(
30
+ `${python} -c "import memory_decay,os; print(os.path.dirname(os.path.dirname(memory_decay.__file__)))"`,
31
+ { encoding: "utf8" }
32
+ ).trim();
33
+
34
+ writeFileSync(resolve(root, "../.python-env.json"), JSON.stringify({ pythonPath, memoryDecayPath }, null, 2));
35
+ console.log(`[memory-decay] Detected: ${pythonPath}`);
package/scripts/link-sdk.mjs ADDED
@@ -0,0 +1,40 @@
1
+ #!/usr/bin/env node
2
+ /**
3
+ * Links the global openclaw package into this plugin's node_modules
4
+ * so that `import ... from "openclaw/plugin-sdk"` resolves correctly.
5
+ */
6
+ import { existsSync, symlinkSync, mkdirSync } from "node:fs";
7
+ import { join, dirname } from "node:path";
8
+ import { execFileSync } from "node:child_process";
9
+ import { fileURLToPath } from "node:url";
10
+
11
+ const __dirname = dirname(fileURLToPath(import.meta.url));
12
+ const projectRoot = join(__dirname, "..");
13
+ const target = join(projectRoot, "node_modules", "openclaw");
14
+
15
+ if (existsSync(target)) {
16
+ console.log("openclaw SDK already linked.");
17
+ process.exit(0);
18
+ }
19
+
20
+ // Find global openclaw installation
21
+ let globalRoot;
22
+ try {
23
+ globalRoot = execFileSync("npm", ["root", "-g"], { encoding: "utf-8" }).trim();
24
+ } catch {
25
+ console.error("Failed to find global npm root. Is npm installed?");
26
+ process.exit(1);
27
+ }
28
+
29
+ const globalOpenclaw = join(globalRoot, "openclaw");
30
+ if (!existsSync(globalOpenclaw)) {
31
+ console.error(
32
+ `openclaw not found at ${globalOpenclaw}\n` +
33
+ "Install it globally first: npm i -g openclaw"
34
+ );
35
+ process.exit(1);
36
+ }
37
+
38
+ mkdirSync(join(projectRoot, "node_modules"), { recursive: true });
39
+ symlinkSync(globalOpenclaw, target, "junction");
40
+ console.log(`Linked openclaw SDK: ${target} -> ${globalOpenclaw}`);
package/src/client.ts ADDED
@@ -0,0 +1,60 @@
1
+ import type {
2
+ StoreRequest,
3
+ StoreResponse,
4
+ SearchRequest,
5
+ SearchResponse,
6
+ HealthResponse,
7
+ AutoTickResponse,
8
+ } from "./types.js";
9
+
10
+ export class MemoryDecayClient {
11
+ private baseUrl: string;
12
+
13
+ constructor(port: number = 8100) {
14
+ this.baseUrl = `http://127.0.0.1:${port}`;
15
+ }
16
+
17
+ async health(): Promise<HealthResponse> {
18
+ const res = await fetch(`${this.baseUrl}/health`);
19
+ if (!res.ok) throw new Error(`Health check failed: ${res.status}`);
20
+ return res.json() as Promise<HealthResponse>;
21
+ }
22
+
23
+ async store(req: StoreRequest): Promise<StoreResponse> {
24
+ const res = await fetch(`${this.baseUrl}/store`, {
25
+ method: "POST",
26
+ headers: { "Content-Type": "application/json" },
27
+ body: JSON.stringify(req),
28
+ });
29
+ if (!res.ok) throw new Error(`Store failed: ${res.status}`);
30
+ return res.json() as Promise<StoreResponse>;
31
+ }
32
+
33
+ async storeBatch(items: StoreRequest[]): Promise<{ ids: string[]; count: number }> {
34
+ const res = await fetch(`${this.baseUrl}/store-batch`, {
35
+ method: "POST",
36
+ headers: { "Content-Type": "application/json" },
37
+ body: JSON.stringify(items),
38
+ });
39
+ if (!res.ok) throw new Error(`Store-batch failed: ${res.status}`);
40
+ return res.json() as Promise<{ ids: string[]; count: number }>;
41
+ }
42
+
43
+ async search(req: SearchRequest): Promise<SearchResponse> {
44
+ const res = await fetch(`${this.baseUrl}/search`, {
45
+ method: "POST",
46
+ headers: { "Content-Type": "application/json" },
47
+ body: JSON.stringify(req),
48
+ });
49
+ if (!res.ok) throw new Error(`Search failed: ${res.status}`);
50
+ return res.json() as Promise<SearchResponse>;
51
+ }
52
+
53
+ async autoTick(): Promise<AutoTickResponse> {
54
+ const res = await fetch(`${this.baseUrl}/auto-tick`, {
55
+ method: "POST",
56
+ });
57
+ if (!res.ok) throw new Error(`Auto-tick failed: ${res.status}`);
58
+ return res.json() as Promise<AutoTickResponse>;
59
+ }
60
+ }
package/src/index.ts ADDED
@@ -0,0 +1,315 @@
1
+ import type { OpenClawPluginApi } from "openclaw/plugin-sdk";
2
+ import { emptyPluginConfigSchema } from "openclaw/plugin-sdk";
3
+ import { MemoryDecayClient } from "./client.js";
4
+ import { MemoryDecayService, type ServiceConfig } from "./service.js";
5
+ import { shouldMigrate, migrateMarkdownMemories } from "./migrator.js";
6
+ import { toFreshness } from "./types.js";
7
+
8
+ const BOOTSTRAP_PROMPT = `## Memory System (memory-decay)
9
+
10
+ You have access to a decay-aware memory system. Use it actively:
11
+
12
+ - **memory_search**: Search your memories before responding to recall relevant context.
13
+ Results include a "freshness" indicator (fresh/normal/stale) — treat stale memories with caution, they may be outdated.
14
+ - **memory_store**: Save important memories proactively — don't wait to be asked.
15
+ - **memory_store_batch**: Save multiple memories at once. More efficient than repeated memory_store calls.
16
+
17
+ ### Category & Importance Guide
18
+
19
+ ALWAYS set the correct category and importance. Do NOT default everything to "fact" at 0.8.
20
+
21
+ | Category | When | Importance |
22
+ |----------|------|------------|
23
+ | preference | User's role, likes, style, workflow habits | 0.8-1.0 |
24
+ | decision | Why X was chosen, tradeoffs, rejected alternatives | 0.8-0.9 |
25
+ | fact | Technical facts, API behaviors, architecture | 0.7-0.9 |
26
+ | episode | What was worked on, session context | 0.3-0.6 |
27
+
28
+ ### Store Proactively When:
29
+ - User reveals preferences, expertise, or communication style → preference (0.9)
30
+ - A technical choice is made with tradeoffs → decision (0.8)
31
+ - Non-obvious system behavior is discovered → fact (0.8)
32
+ - A feature or fix is completed → episode (0.5)
33
+
34
+ **IMPORTANT**: Do NOT write memory files to workspace/memory/ or any file path. Always use memory_store / memory_store_batch tools. They handle persistence, decay, and retrieval automatically.
35
+
36
+ Your memories naturally decay over time. Frequently recalled memories grow stronger; forgotten ones fade. This is by design.`;
37
+
38
+ const MIN_MESSAGE_LENGTH = 20;
39
+
40
+ const memoryDecayPlugin = {
41
+ id: "memory-decay",
42
+ name: "Memory Decay",
43
+ description: "Human-like memory with decay and reinforcement",
44
+ kind: "memory" as const,
45
+ configSchema: emptyPluginConfigSchema(),
46
+
47
+ register(api: OpenClawPluginApi) {
48
+ const cfg = api.pluginConfig ?? {};
49
+ const port = (cfg.serverPort as number) ?? 8100;
50
+ const autoSave = cfg.autoSave !== false; // default true
51
+
52
+ // Shared client — works as long as the server process is listening
53
+ const client = new MemoryDecayClient(port);
54
+ let service: MemoryDecayService | null = null;
55
+
56
+ // --- Service ---
57
+ api.registerService({
58
+ id: "memory-decay-server",
59
+ async start(ctx) {
60
+ // Resolve embedding provider: config > env var > default
61
+ const embeddingProvider =
62
+ (cfg.embeddingProvider as string) ||
63
+ process.env.MD_EMBEDDING_PROVIDER ||
64
+ "local";
65
+
66
+ // Resolve embedding model: config > env var
67
+ const embeddingModel =
68
+ (cfg.embeddingModel as string) ||
69
+ process.env.MD_EMBEDDING_MODEL;
70
+
71
+ // Resolve embedding API key: config > generic env var > provider-specific env var
72
+ const embeddingApiKey =
73
+ (cfg.embeddingApiKey as string) ||
74
+ process.env.MD_EMBEDDING_API_KEY ||
75
+ (embeddingProvider === "openai"
76
+ ? process.env.OPENAI_API_KEY
77
+ : embeddingProvider === "gemini"
78
+ ? process.env.GEMINI_API_KEY ?? process.env.GOOGLE_API_KEY
79
+ : undefined);
80
+
81
+ if (!embeddingApiKey && embeddingProvider !== "local") {
82
+ ctx.logger.warn(
83
+ `No API key found for embedding provider "${embeddingProvider}". ` +
84
+ `Set MD_EMBEDDING_API_KEY or ${embeddingProvider === "openai" ? "OPENAI_API_KEY" : "GEMINI_API_KEY"} env var.`
85
+ );
86
+ }
87
+
88
+ // Resolve pythonPath + memoryDecayPath: config > .python-env.json (set at install time) > error
89
+ let memoryDecayPath = (cfg.memoryDecayPath as string) ?? "";
90
+ let pythonPath = (cfg.pythonPath as string) ?? "";
91
+ if (!memoryDecayPath) {
92
+ try {
93
+ const { readFileSync } = await import("node:fs");
94
+ const { resolve, dirname } = await import("node:path");
95
+ const { fileURLToPath } = await import("node:url");
96
+ const pluginRoot = dirname(fileURLToPath(import.meta.url));
97
+ const detected = JSON.parse(readFileSync(resolve(pluginRoot, "../.python-env.json"), "utf8"));
98
+ if (!pythonPath && detected.pythonPath) pythonPath = detected.pythonPath;
99
+ if (detected.memoryDecayPath) memoryDecayPath = detected.memoryDecayPath;
100
+ } catch {}
101
+ }
102
+ if (!memoryDecayPath) {
103
+ ctx.logger.error(
104
+ "Could not auto-detect memory-decay installation. " +
105
+ "Run `pip install memory-decay` or set memoryDecayPath in plugin config."
106
+ );
107
+ return;
108
+ }
109
+
110
+ const config: ServiceConfig = {
111
+ pythonPath: pythonPath || "python3",
112
+ memoryDecayPath,
113
+ port,
114
+ dbPath: (cfg.dbPath as string) ?? "~/.openclaw/memory-decay-data/memories.db",
115
+ embeddingProvider,
116
+ embeddingModel,
117
+ embeddingApiKey,
118
+ embeddingDim: (cfg.embeddingDim as number) ??
119
+ (process.env.MD_EMBEDDING_DIM ? parseInt(process.env.MD_EMBEDDING_DIM, 10) : undefined),
120
+ experimentDir: cfg.experimentDir as string | undefined,
121
+ };
122
+
123
+ service = new MemoryDecayService(config);
124
+ await service.start();
125
+ ctx.logger.info("Server started");
126
+ },
127
+ async stop(ctx) {
128
+ if (service) {
129
+ await service.stop();
130
+ ctx.logger.info("Server stopped");
131
+ }
132
+ },
133
+ });
134
+
135
+ // --- Tools ---
136
+ // eslint-disable-next-line @typescript-eslint/no-explicit-any
137
+ const anyApi = api as any;
138
+
139
+ anyApi.registerTool(
140
+ (toolCtx: any) => ({
141
+ label: "memory_search",
142
+ name: "memory_search",
143
+ description: "Search memories with decay-aware ranking. Returns results with freshness indicators.",
144
+ parameters: {
145
+ type: "object",
146
+ properties: {
147
+ query: { type: "string", description: "Search query text" },
148
+ top_k: { type: "number", description: "Max results (default 5)" },
149
+ },
150
+ required: ["query"],
151
+ },
152
+ async execute(toolCallId: string, params: Record<string, unknown>) {
153
+ const res = await client.search({
154
+ query: params.query as string,
155
+ top_k: (params.top_k as number) ?? 5,
156
+ });
157
+
158
+ const enriched = res.results.map((r) => ({
159
+ ...r,
160
+ freshness: toFreshness(r.storage_score),
161
+ }));
162
+
163
+ return {
164
+ content: [{ type: "text" as const, text: JSON.stringify(enriched, null, 2) }],
165
+ };
166
+ },
167
+ }),
168
+ { names: ["memory_search"] },
169
+ );
170
+
171
+ anyApi.registerTool(
172
+ (toolCtx: any) => ({
173
+ label: "memory_store",
174
+ name: "memory_store",
175
+ description: "Save an important memory. Use proactively for facts, preferences, decisions.",
176
+ parameters: {
177
+ type: "object",
178
+ properties: {
179
+ text: { type: "string", description: "The memory content to store" },
180
+ importance: { type: "number", description: "0.0-1.0, default 0.8 for explicit saves" },
181
+ category: { type: "string", description: "fact, episode, preference, decision" },
182
+ },
183
+ required: ["text"],
184
+ },
185
+ async execute(toolCallId: string, params: Record<string, unknown>) {
186
+ const res = await client.store({
187
+ text: params.text as string,
188
+ importance: (params.importance as number) ?? 0.8,
189
+ category: (params.category as string) ?? "fact",
190
+ mtype: "fact",
191
+ });
192
+
193
+ return {
194
+ content: [{ type: "text" as const, text: `Stored memory ${res.id}` }],
195
+ };
196
+ },
197
+ }),
198
+ { names: ["memory_store"] },
199
+ );
200
+
201
+ anyApi.registerTool(
202
+ (toolCtx: any) => ({
203
+ label: "memory_store_batch",
204
+ name: "memory_store_batch",
205
+ description:
206
+ "Save multiple memories at once in a single request. More efficient than calling memory_store repeatedly. Each item can have its own importance and category.",
207
+ parameters: {
208
+ type: "object",
209
+ properties: {
210
+ items: {
211
+ type: "array",
212
+ description: "Array of memories to store",
213
+ items: {
214
+ type: "object",
215
+ properties: {
216
+ text: { type: "string", description: "The memory content to store" },
217
+ importance: { type: "number", description: "0.0-1.0, default 0.8" },
218
+ category: { type: "string", description: "fact, episode, preference, decision" },
219
+ mtype: { type: "string", description: "fact or episode, default fact" },
220
+ },
221
+ required: ["text"],
222
+ },
223
+ },
224
+ },
225
+ required: ["items"],
226
+ },
227
+ async execute(toolCallId: string, params: Record<string, unknown>) {
228
+ const items = params.items as Array<Record<string, unknown>>;
229
+ if (!items || items.length === 0) {
230
+ return {
231
+ content: [{ type: "text" as const, text: "No items provided" }],
232
+ };
233
+ }
234
+
235
+ const storeItems = items.map((item) => ({
236
+ text: item.text as string,
237
+ importance: (item.importance as number) ?? 0.8,
238
+ category: (item.category as string) ?? "fact",
239
+ mtype: (item.mtype as string) ?? "fact",
240
+ }));
241
+
242
+ const res = await client.storeBatch(storeItems);
243
+
244
+ return {
245
+ content: [
246
+ {
247
+ type: "text" as const,
248
+ text: `Stored ${res.count} memories: ${res.ids.join(", ")}`,
249
+ },
250
+ ],
251
+ };
252
+ },
253
+ }),
254
+ { names: ["memory_store_batch"] },
255
+ );
256
+
257
+ // --- Hooks ---
258
+
259
+ // Bootstrap: inject memory instructions + apply time-based decay
260
+ api.on("before_prompt_build", async (event, ctx) => {
261
+ try {
262
+ await client.autoTick();
263
+ } catch {
264
+ // Server might still be starting
265
+ }
266
+
267
+ return { prependSystemContext: BOOTSTRAP_PROMPT };
268
+ });
269
+
270
+ // Auto-save: store every conversation turn at low importance
271
+ // Only active when autoSave is true (default); when disabled, the agent
272
+ // is expected to use memory_store proactively instead.
273
+ if (autoSave) {
274
+ api.on("message_received", async (event, ctx) => {
275
+ const content = event.content;
276
+ if (!content || content.length < MIN_MESSAGE_LENGTH) return;
277
+
278
+ try {
279
+ await client.store({
280
+ text: content,
281
+ importance: 0.3,
282
+ mtype: "episode",
283
+ speaker: event.from ?? "user",
284
+ });
285
+ } catch (err) {
286
+ api.logger.error(`Auto-save failed: ${err}`);
287
+ }
288
+ });
289
+ }
290
+
291
+ // Compaction: save session summary before context is compressed
292
+ api.on("before_compaction", async (event, ctx) => {
293
+ try {
294
+ await client.store({
295
+ text: `[Session summary] Compaction triggered. Topics discussed in this session.`,
296
+ importance: 0.7,
297
+ mtype: "episode",
298
+ });
299
+ } catch (err) {
300
+ api.logger.error(`Compaction save failed: ${err}`);
301
+ }
302
+ });
303
+
304
+ // Session end: apply time-based decay for elapsed time between sessions
305
+ api.on("session_end", async (event, ctx) => {
306
+ try {
307
+ await client.autoTick();
308
+ } catch (err) {
309
+ api.logger.error(`Session-end tick failed: ${err}`);
310
+ }
311
+ });
312
+ },
313
+ };
314
+
315
+ export default memoryDecayPlugin;
package/src/migrator.ts ADDED
@@ -0,0 +1,75 @@
+ import { readdir, readFile, writeFile, access } from "node:fs/promises";
+ import { join } from "node:path";
+ import { MemoryDecayClient } from "./client.js";
+ import type { StoreRequest } from "./types.js";
+
+ const MIGRATION_MARKER = ".migration-done";
+
+ export async function shouldMigrate(persistenceDir: string): Promise<boolean> {
+   try {
+     await access(join(persistenceDir, MIGRATION_MARKER));
+     return false;
+   } catch {
+     return true;
+   }
+ }
+
+ export async function migrateMarkdownMemories(
+   memoryDir: string,
+   client: MemoryDecayClient,
+   persistenceDir: string,
+ ): Promise<{ migrated: number; files: number }> {
+   const files = await readdir(memoryDir);
+   const mdFiles = files.filter((f) => f.endsWith(".md"));
+
+   const allItems: StoreRequest[] = [];
+
+   for (const file of mdFiles) {
+     const content = await readFile(join(memoryDir, file), "utf-8");
+     const sections = splitIntoSections(content);
+     const isDateFile = /^\d{4}-\d{2}-\d{2}\.md$/.test(file);
+     const isCurated = file === "MEMORY.md" || file.startsWith("MEMORY");
+
+     const importance = isCurated ? 0.7 : isDateFile ? 0.4 : 0.5;
+
+     for (const section of sections) {
+       if (section.trim().length < 10) continue;
+       allItems.push({
+         text: section.trim(),
+         importance,
+         mtype: isDateFile ? "episode" : "fact",
+         category: isCurated ? "curated" : "migrated",
+       });
+     }
+   }
+
+   if (allItems.length > 0) {
+     for (let i = 0; i < allItems.length; i += 50) {
+       const chunk = allItems.slice(i, i + 50);
+       await client.storeBatch(chunk);
+     }
+   }
+
+   await writeFile(
+     join(persistenceDir, MIGRATION_MARKER),
+     JSON.stringify({ migratedAt: new Date().toISOString(), count: allItems.length, files: mdFiles.length }),
+   );
+
+   return { migrated: allItems.length, files: mdFiles.length };
+ }
+
+ function splitIntoSections(markdown: string): string[] {
+   const sections = markdown.split(/\n(?=#{1,3} )/);
+   const result: string[] = [];
+
+   for (const section of sections) {
+     if (section.length > 500) {
+       const paragraphs = section.split(/\n\n+/);
+       result.push(...paragraphs);
+     } else {
+       result.push(section);
+     }
+   }
+
+   return result;
+ }
package/src/service.ts ADDED
@@ -0,0 +1,82 @@
+ import { spawn, type ChildProcess } from "node:child_process";
+ import { MemoryDecayClient } from "./client.js";
+
+ export interface ServiceConfig {
+   pythonPath: string;
+   memoryDecayPath: string;
+   port: number;
+   dbPath: string;
+   embeddingProvider?: string;
+   embeddingModel?: string;
+   embeddingApiKey?: string;
+   embeddingDim?: number;
+   experimentDir?: string;
+ }
+
+ export class MemoryDecayService {
+   private process: ChildProcess | null = null;
+   private client: MemoryDecayClient;
+   private config: ServiceConfig;
+   private restartCount = 0;
+   private maxRestarts = 3;
+
+   constructor(config: ServiceConfig) {
+     this.config = config;
+     this.client = new MemoryDecayClient(config.port);
+   }
+
+   async start(): Promise<void> {
+     const args = [
+       "-m", "memory_decay.server",
+       "--host", "127.0.0.1",
+       "--port", String(this.config.port),
+       "--db-path", this.config.dbPath,
+     ];
+     if (this.config.embeddingProvider) args.push("--embedding-provider", this.config.embeddingProvider);
+     if (this.config.embeddingModel) args.push("--embedding-model", this.config.embeddingModel);
+     if (this.config.embeddingApiKey) args.push("--embedding-api-key", this.config.embeddingApiKey);
+     if (this.config.embeddingDim) args.push("--embedding-dim", String(this.config.embeddingDim));
+     if (this.config.experimentDir) args.push("--experiment-dir", this.config.experimentDir);
+
+     this.process = spawn(this.config.pythonPath, args, {
+       cwd: this.config.memoryDecayPath,
+       env: { ...process.env, PYTHONPATH: `${this.config.memoryDecayPath}/src` },
+       stdio: ["ignore", "pipe", "pipe"],
+     });
+
+     this.process.on("exit", (code) => {
+       if (code !== 0 && this.restartCount < this.maxRestarts) {
+         this.restartCount++;
+         console.error(`[memory-decay] Server exited with ${code}, restarting (${this.restartCount}/${this.maxRestarts})`);
+         this.start().catch((err) => console.error(`[memory-decay] Restart failed: ${err}`));
+       }
+     });
+
+     await this.waitForHealth();
+   }
+
+   async stop(): Promise<void> {
+     if (this.process) {
+       this.maxRestarts = 0;
+       this.process.kill("SIGTERM");
+       this.process = null;
+     }
+   }
+
+   getClient(): MemoryDecayClient {
+     return this.client;
+   }
+
+   private async waitForHealth(timeoutMs = 15000): Promise<void> {
+     const start = Date.now();
+     while (Date.now() - start < timeoutMs) {
+       try {
+         await this.client.health();
+         return;
+       } catch {
+         await new Promise((r) => setTimeout(r, 500));
+       }
+     }
+     throw new Error(`[memory-decay] Server failed to start within ${timeoutMs}ms`);
+   }
+ }
package/src/types.ts ADDED
@@ -0,0 +1,53 @@
+ export interface StoreRequest {
+   text: string;
+   importance?: number;
+   category?: string;
+   mtype?: string;
+   associations?: string[];
+   speaker?: string;
+ }
+
+ export interface StoreResponse {
+   id: string;
+   text: string;
+   tick: number;
+ }
+
+ export interface SearchRequest {
+   query: string;
+   top_k?: number;
+ }
+
+ export interface SearchResult {
+   id: string;
+   text: string;
+   score: number;
+   storage_score: number;
+   retrieval_score: number;
+   category: string;
+   created_tick: number;
+   speaker?: string;
+ }
+
+ export interface SearchResponse {
+   results: SearchResult[];
+ }
+
+ export interface HealthResponse {
+   status: string;
+   current_tick: number;
+ }
+
+ export interface AutoTickResponse {
+   ticks_applied: number;
+   current_tick: number;
+   elapsed_seconds: number;
+ }
+
+ export type Freshness = "fresh" | "normal" | "stale";
+
+ export function toFreshness(storageScore: number): Freshness {
+   if (storageScore > 0.7) return "fresh";
+   if (storageScore > 0.3) return "normal";
+   return "stale";
+ }
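The `toFreshness` helper in `src/types.ts` is the mapping behind the README's `fresh` / `normal` / `stale` indicators. A minimal standalone sketch of its threshold behavior (the function body copied from the diff above; not itself part of the published package):

```typescript
// Self-contained copy of the thresholding from src/types.ts, for illustration.
type Freshness = "fresh" | "normal" | "stale";

function toFreshness(storageScore: number): Freshness {
  if (storageScore > 0.7) return "fresh";   // strongly activated memory
  if (storageScore > 0.3) return "normal";  // decayed, but still reliable
  return "stale";                           // near-forgotten
}

// Thresholds are exclusive: a score exactly at 0.7 or 0.3 falls to the lower band.
console.log(toFreshness(0.9)); // fresh
console.log(toFreshness(0.7)); // normal
console.log(toFreshness(0.3)); // stale
```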