@xdarkicex/openclaw-memory-libravdb 1.3.11 → 1.3.13

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/docs/gating.md CHANGED
@@ -3,327 +3,125 @@
  This document describes the ingestion gate used to decide whether a user turn should be promoted into durable `user:` memory. It is the most novel scoring component in the repository.
 
  Implemented in:
-
- - [`sidecar/compact/gate.go`](../sidecar/compact/gate.go)
- - [`sidecar/compact/tokens.go`](../sidecar/compact/tokens.go)
- - [`sidecar/compact/summarize.go`](../sidecar/compact/summarize.go) for the
-   downstream abstractive-routing threshold
+ - `sidecar/compact/gate.go`
+ - `sidecar/compact/tokens.go`
+ - `sidecar/compact/summarize.go` for the downstream abstractive-routing threshold
 
  ## 1. Why the Original Scalar Failed
 
  The original scalar assumed conversational memory semantics:
-
  - low novelty meant "already known"
  - repetition meant "probably redundant"
  - low natural-language structure meant "probably noise"
 
- That logic breaks for technical sessions.
-
- Repeated workflow context is often exactly what should be remembered:
-
- - file paths
- - APIs
- - failure signatures
- - configuration changes
- - architectural decisions
-
- In technical work, repetition can indicate persistent work context rather than low value.
+ That logic breaks for technical sessions. Repeated workflow context is often exactly what should be remembered: file paths, APIs, failure signatures, configuration changes, and architectural decisions. In technical work, repetition can indicate persistent work context rather than low value.
 
  ## 2. The Convex Mixture
 
  The corrected gate is:
-
- $$
- G(t) = (1 - T(t)) \cdot G_{\mathrm{conv}}(t) + T(t) \cdot G_{\mathrm{tech}}(t)
- $$
+ \[ G(t) = (1 - T(t)) \cdot G_{\mathrm{conv}}(t) + T(t) \cdot G_{\mathrm{tech}}(t) \]
 
  where:
+ \[ G_{\mathrm{conv}}(t) = w_1^c H(t) + w_2^c R(t) + w_3^c D_{nl}(t) \]
+ \[ G_{\mathrm{tech}}(t) = w_1^t P(t) + w_2^t A(t) + w_3^t D_{\mathrm{tech}}(t) \]
 
- $$
- G_{\mathrm{conv}}(t) = w_1^c H(t) + w_2^c R(t) + w_3^c D_{nl}(t)
- $$
-
- $$
- G_{\mathrm{tech}}(t) = w_1^t P(t) + w_2^t A(t) + w_3^t D_{\mathrm{tech}}(t)
- $$
-
- and:
-
- $$
- T(t) \in [0,1]
- $$
+ and the domain indicator is bounded:
+ \[ T(t) \in [0,1] \]
 
- is the technical-density signal.
-
- Current default weights from
- [`DefaultGatingConfig()`](../sidecar/compact/gate.go):
+ ### Weight Invariants
+ To guarantee that the sub-branch scores remain in $[0,1]$, the nonnegative weights must satisfy:
+ \[ \sum_{i=1}^3 w_i^c = 1 \quad \text{and} \quad \sum_{i=1}^3 w_i^t = 1 \]
 
+ Current default weights from `DefaultGatingConfig()`:
  - conversational branch: $w_1^c = 0.35$, $w_2^c = 0.40$, $w_3^c = 0.25$
  - technical branch: $w_1^t = 0.40$, $w_2^t = 0.35$, $w_3^t = 0.25$
 
- ### Boundedness
-
- If:
-
- - $T(t) \in [0,1]$
- - $G_{\mathrm{conv}}(t) \in [0,1]$
- - $G_{\mathrm{tech}}(t) \in [0,1]$
-
- then:
-
- $$
- G \in [0,1]
- $$
-
- because $G$ is a convex combination of two values in $[0,1]$.
-
- ### Continuity
+ ### Boundedness and Continuity
+ Because $T(t) \in [0,1]$, $G_{\mathrm{conv}}(t) \in [0,1]$, and $G_{\mathrm{tech}}(t) \in [0,1]$, $G(t)$ is a true convex combination and stays in $[0,1]$.
 
  The gate is continuous in $T$:
-
- $$
- \frac{\partial G}{\partial T} = G_{\mathrm{tech}} - G_{\mathrm{conv}}
- $$
-
- There is no discontinuous jump at a domain boundary. A mixed technical/conversational turn interpolates smoothly between the two sub-formulas.
+ \[ \frac{\partial G}{\partial T} = G_{\mathrm{tech}} - G_{\mathrm{conv}} \]
+ There is no discontinuous jump at a domain boundary. A mixed technical/conversational turn interpolates smoothly.
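
As a sanity check on the algebra, the blend can be modeled in a few lines. This is a Python sketch, not the Go implementation; it hard-codes the default weights above and treats every component signal as already computed in $[0,1]$:

```python
def gate(T, H, R, Dnl, P, A, Dtech):
    """Convex blend of the two branch scores; all inputs assumed in [0, 1]."""
    g_conv = 0.35 * H + 0.40 * R + 0.25 * Dnl    # conversational branch weights
    g_tech = 0.40 * P + 0.35 * A + 0.25 * Dtech  # technical branch weights
    return (1 - T) * g_conv + T * g_tech

# A mixed turn (T = 0.5) interpolates smoothly between the two branches,
# and the result is always bounded by the two branch scores.
score = gate(T=0.5, H=0.8, R=0.2, Dnl=0.1, P=0.9, A=0.6, Dtech=0.3)
assert 0.0 <= score <= 1.0
```

At $T=0$ the result collapses to the conversational branch; at $T=1$, to the technical branch, matching the invariants in section 7.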
 
  ## 3. Domain Detection $T(t)$
 
  Technical density is a weighted sum of technical patterns with saturation:
+ \[ T(t) = \min\left(\frac{\sum_i s_i \cdot \mathbf{1}[\mathrm{pattern}_i(t)]}{\theta_{\mathrm{norm}}}, 1\right) \]
 
- $$
- T(t) = \min\left(\frac{\sum_i s_i \cdot \mathbf{1}[\mathrm{pattern}_i(t)]}{\theta_{\mathrm{norm}}}, 1\right)
- $$
-
- The shipped patterns include:
-
- - code fences
- - file paths
- - function definitions
- - shell commands
- - URLs or endpoints
- - stack traces
- - hashes or hex identifiers
-
- Default normalization:
-
- $$
- \theta_{\mathrm{norm}} = 1.5
- $$
+ The shipped patterns include code fences, file paths, function definitions, shell commands, URLs, stack traces, and hashes.
 
- This means two strong technical signals are enough to saturate the branch weight.
-
- Saturation at `1.0` is correct because the gate does not need "how technical beyond fully technical"; it only needs the branch mixture weight.
+ Default normalization is $\theta_{\mathrm{norm}} = 1.5$. This means two strong technical signals are enough to saturate the branch weight. Saturation at `1.0` is correct because the gate only needs the branch mixture weight, not an unbounded "technical magnitude."
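
A minimal sketch of the saturating sum. The regexes and per-pattern scores below are made up for illustration; the shipped pattern table lives in `sidecar/compact/gate.go`:

```python
import re

# Illustrative pattern weights only; not the shipped pattern set.
PATTERNS = {
    r"```": 0.8,            # code fence
    r"\S+/\S+\.\w+": 0.7,   # file path like cmd/app/main.go
    r"\bfunc \w+\(": 0.7,   # Go-style function definition
}
THETA_NORM = 1.5

def technical_density(turn: str) -> float:
    raw = sum(score for pat, score in PATTERNS.items() if re.search(pat, turn))
    return min(raw / THETA_NORM, 1.0)  # saturate at 1.0

# Multiple strong signals exceed the 1.5 normalizer and saturate:
assert technical_density("```\nfunc main() {}\n``` in cmd/app/main.go") == 1.0
```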
 
  ## 4. Conversational Branch
 
  ### Novelty $H(t)$
 
- Novelty is:
+ In the live implementation (`sidecar/compact/gate.go`), retrieval scores reaching the gate use the public higher-is-better cosine-style relevance contract from the retrieval layer, spanning $[-1, 1]$ for cosine collections. To ensure the novelty term remains in $[0,1]$ for the convex mixture, the mathematical model applies a zero-clamp:
 
- $$
- H(t) = 1 - \frac{1}{|K|} \sum_{k \in K} \cos(\vec{v}_t, \vec{v}_k)
- $$
+ \[ H(t) = \begin{cases}
+ 1.0 & \text{if } |K| = 0 \\
+ 1 - \frac{1}{|K|} \sum_{k \in K} \max(0, \cos(\vec{v}_t, \vec{v}_k)) & \text{otherwise}
+ \end{cases} \]
 
  where $K$ is the retrieved nearest-neighbor set from durable `user:` memory.
 
  Properties:
-
- - empty memory gives $H=1.0$
- - highly similar existing memories drive $H$ toward `0`
-
- The implementation deliberately uses top-k mean similarity rather than centroid distance because user memory is often multimodal.
+ - An empty memory (cold start) safely returns $H=1.0$ instead of dividing by zero.
+ - Highly similar existing memories ($\cos \to 1$) drive $H \to 0$.
+ - Negative-similarity neighbors are clamped to zero so that $H(t)$ cannot exceed $1$.
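
The piecewise definition above can be sketched directly. This Python model assumes the input list holds the cosine similarities returned by the nearest-neighbor query:

```python
def novelty(neighbor_sims):
    """H(t): 1 minus mean zero-clamped cosine similarity to retrieved neighbors.

    neighbor_sims: similarities between the turn and its nearest neighbors
    in durable user: memory; cosine scores may span [-1, 1].
    """
    if not neighbor_sims:                             # cold start: |K| = 0
        return 1.0
    clamped = [max(0.0, s) for s in neighbor_sims]    # zero-clamp negatives
    return 1.0 - sum(clamped) / len(clamped)

assert novelty([]) == 1.0                             # empty memory: full novelty
assert abs(novelty([0.95, 0.9, 0.85]) - 0.1) < 1e-9   # near-duplicates: H -> 0
assert novelty([-0.4, -0.2]) == 1.0                   # clamp keeps H within [0, 1]
```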
 
  ### Repetition Gate $R(t)$
 
- The repetition term is:
-
- $$
- R(t) = F(t) \cdot (1 - S(t))
- $$
+ The repetition term is a product, not a sum:
+ \[ R(t) = F(t) \cdot (1 - S(t)) \]
 
  with:
+ \[ F(t) = \min\left(\frac{\mathrm{hitsAbove}(\mathrm{turns:userId}, 0.80, k=10)}{5}, 1\right) \]
+ \[ S(t) = \min\left(\frac{\mathrm{hitsAbove}(\mathrm{user:userId}, 0.85, k=5)}{3}, 1\right) \]
 
- $$
- F(t) = \min\left(\frac{\mathrm{hitsAbove}(\mathrm{turns:userId}, 0.80, k=10)}{5}, 1\right)
- $$
-
- $$
- S(t) = \min\left(\frac{\mathrm{hitsAbove}(\mathrm{user:userId}, 0.85, k=5)}{3}, 1\right)
- $$
-
- This is intentionally a product, not a sum.
-
- Why:
-
- - high input frequency should help only if durable memory is not already saturated
- - high saturation should veto the repetition term regardless of frequency
-
- The veto property is structural:
-
- $$
- S(t) = 1 \Rightarrow R(t) = 0
- $$
+ Why a product? High input frequency should help only if durable memory is not already saturated. High saturation must veto the repetition term regardless of frequency. The veto property is structural: $S(t) = 1 \Rightarrow R(t) = 0$.
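
A sketch of the product form, with the two `hitsAbove` vector queries replaced by plain hit counts (the ANN lookups themselves are out of scope here):

```python
def repetition(freq_hits: int, sat_hits: int) -> float:
    """R(t) = F(t) * (1 - S(t)).

    freq_hits: neighbors above 0.80 in turns:userId (k=10)
    sat_hits:  neighbors above 0.85 in durable user:userId (k=5)
    """
    F = min(freq_hits / 5.0, 1.0)   # input-frequency term
    S = min(sat_hits / 3.0, 1.0)    # durable-memory saturation term
    return F * (1.0 - S)

assert repetition(10, 0) == 1.0  # frequent but not yet persisted: full signal
assert repetition(10, 3) == 0.0  # saturation vetoes repetition: S = 1 => R = 0
```

The multiplicative structure is the point: no amount of input frequency can overcome a saturated durable memory.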
 
  ### Natural-Language Structural Load $D_{nl}(t)$
-
- The conversational branch adds heuristic structure for turns that look like:
-
- - preferences
- - human-name references
- - dates
- - quantities
- - fact assertions
-
- This is intentionally narrow. It excludes general proper-noun detection so technical identifiers do not inflate the conversational signal.
+ Uses narrow heuristics to detect preferences, human-name references, dates, and fact assertions.
 
  ## 5. Technical Branch
 
  ### Specificity $P(t)$
 
- Specificity measures concrete artifact density:
-
- $$
- P(t) = \min\left(
- \frac{
- \sum_j p_j \cdot \mathrm{count}_j(t)
- }{
- \max(\mathrm{EstimateTokens}(t)/100, 1)
- },
- 1
- \right)
- $$
-
- The numerator counts things like:
-
- - file paths
- - function references
- - error codes
- - git references
- - API endpoints
-
- The normalization denominator is implemented in
- [`sidecar/compact/tokens.go`](../sidecar/compact/tokens.go):
+ Specificity measures concrete artifact density normalized by turn length:
 
- $$
- L(t)=\max\left(\left\lfloor \frac{\mathrm{len}(t)}{4} \right\rfloor, 1\right)
- $$
+ \[ P(t) = \min\left( \frac{\sum_j p_j \cdot \mathrm{count}_j(t)}{\max(L(t)/100.0, 1.0)}, 1 \right) \]
 
- This bytes-per-token heuristic is the token estimator used by the gating
- subsystem. It is intentionally cheap and deterministic. It is not the same as
- the separate host-side prompt-budget estimator in [`src/tokens.ts`](../src/tokens.ts).
+ The numerator counts things like file paths, error codes, and API endpoints.
+ The normalization denominator is the token estimator used by the gating subsystem (`sidecar/compact/tokens.go`):
+ \[ L(t) = \max\left(\left\lfloor \frac{\mathrm{RuneCount}(t)}{4} \right\rfloor, 1\right) \]
 
- Length normalization matters. Without it, any long technical turn would score
- high simply because it contains more surface area, not because it is more
- memory-worthy.
+ Length normalization matters. Without it, any long technical turn would score high simply because it contains more surface area, not because it is more memory-worthy.
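
A sketch of the length-normalized density, assuming the weighted artifact count has already been extracted. Python's `len()` on a `str` counts characters, standing in for the Go rune count:

```python
def estimate_tokens(turn: str) -> int:
    """Cheap, deterministic runes/4 heuristic, as in sidecar/compact/tokens.go."""
    return max(len(turn) // 4, 1)

def specificity(artifact_score: float, turn: str) -> float:
    """P(t): artifact_score is the weighted count of paths, error codes, etc."""
    denom = max(estimate_tokens(turn) / 100.0, 1.0)
    return min(artifact_score / denom, 1.0)

short = "fix src/gate.go"              # one artifact in a short, dense turn
long_ = "word " * 200 + "src/gate.go"  # same artifact diluted by padding

# The same artifact scores higher in the dense turn than in the padded one:
assert specificity(0.7, short) > specificity(0.7, long_)
```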
 
  ### Actionability $A(t)$
-
- Actionability captures decision and outcome content:
-
- - architectural decisions
- - fixes or resolutions
- - deployment or merge milestones
- - configuration changes
-
- These are the kinds of technical turns that are expensive to reconstruct later and therefore worth persisting.
+ Captures architectural decisions, fixes, merge milestones, and configuration changes.
 
  ### Technical Structural Load $D_{\mathrm{tech}}(t)$
-
- This branch detects structural technical content such as:
-
- - function definitions
- - data structures
- - dependencies
- - tests
- - documentation comments
-
- It is the technical analogue to $D_{nl}$, not a replacement for it.
+ Detects function definitions, dependencies, and tests. It is the technical analogue to $D_{nl}$.
 
  ## 6. Calibration
 
- Stored metadata includes:
-
- - `gating_score`
- - `gating_t`
- - `gating_h`
- - `gating_r`
- - `gating_d`
- - `gating_p`
- - `gating_a`
- - `gating_dtech`
- - `gating_gconv`
- - `gating_gtech`
-
- The first calibration pass should inspect the empirical score distribution after real traffic arrives.
-
- What to look for:
-
- - bimodality in `gating_score`
- - sensible spread in `gating_t`
- - non-degenerate contributions from both `gconv` and `gtech`
-
  For threshold tuning, isotonic regression is the correct calibration method once usefulness labels exist:
-
- $$
- P(\mathrm{useful} \mid G) = \mathrm{IsotonicRegression}(G, y)
- $$
-
- It preserves the monotonic design of the gate without assuming a sigmoid link function.
+ \[ P(\mathrm{useful} \mid G) = \mathrm{IsotonicRegression}(G, y) \]
 
  Current thresholds implemented in code:
-
- - durable promotion threshold:
- [`DefaultGatingConfig().Threshold = 0.35`](../sidecar/compact/gate.go)
- - abstractive compaction routing threshold:
- [`AbstractiveRoutingThreshold = 0.60`](../sidecar/compact/summarize.go)
+ - durable promotion: `DefaultGatingConfig().Threshold = 0.35`
+ - abstractive routing: `AbstractiveRoutingThreshold = 0.60`
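
A hypothetical routing helper shows how the two thresholds compose into a three-way decision. The comparison direction (`>=`) and the tier names here are assumptions for illustration, not taken from the code:

```python
# The two shipped constants; the routing function itself is hypothetical.
PROMOTION_THRESHOLD = 0.35    # DefaultGatingConfig().Threshold
ABSTRACTIVE_THRESHOLD = 0.60  # AbstractiveRoutingThreshold

def route(score: float) -> str:
    """Map a gate score G(t) to an illustrative three-tier outcome."""
    if score >= ABSTRACTIVE_THRESHOLD:
        return "promote+abstractive"  # high-value turn: worth abstractive summary
    if score >= PROMOTION_THRESHOLD:
        return "promote"              # durable user: memory via extractive path
    return "drop"                     # below the promotion gate

assert route(0.20) == "drop"
assert route(0.50) == "promote"
assert route(0.75) == "promote+abstractive"
```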
 
  ## 7. Invariants
 
- The gate has six mathematical invariants in `gate_test.go`.
-
- ### 1. Empty memory implies full novelty
-
- $$
- \mathrm{memHits} = \emptyset \Rightarrow H = 1.0
- $$
-
- This prevents a cold start from suppressing every first durable insertion.
-
- ### 2. Saturation vetoes repetition
-
- $$
- \mathrm{MemSaturation} = 1 \Rightarrow R = 0
- $$
-
- This is what makes the repetition term a true gate instead of an accumulation bonus.
-
- ### 3. The convex blend stays in bounds
-
- $$
- G \in [0,1]
- $$
-
- and:
-
- $$
- G \in [\min(G_{\mathrm{conv}}, G_{\mathrm{tech}}), \max(G_{\mathrm{conv}}, G_{\mathrm{tech}})]
- $$
-
- ### 4. Purely conversational turns collapse to the conversational branch
-
- $$
- T = 0 \Rightarrow G = G_{\mathrm{conv}}
- $$
-
- ### 5. Purely technical turns collapse to the technical branch
-
- $$
- T = 1 \Rightarrow G = G_{\mathrm{tech}}
- $$
-
- ### 6. Conversational structure should not overfire on pure code
+ The gate preserves six mathematical invariants mapped to `gate_test.go`:
 
- This guards against a common failure mode where technical identifiers masquerade as conversational entities.
+ 1. **Empty memory implies full novelty:** $\mathrm{memHits} = \emptyset \Rightarrow H = 1.0$
+ 2. **Saturation vetoes repetition:** $\mathrm{MemSaturation} = 1 \Rightarrow R = 0$
+ 3. **The convex blend stays in bounds:** $G \in [0,1]$
+ 4. **Monotonic Interpolation:** $G \in [\min(G_{\mathrm{conv}}, G_{\mathrm{tech}}), \max(G_{\mathrm{conv}}, G_{\mathrm{tech}})]$
+ 5. **Purely conversational turns collapse:** $T = 0 \Rightarrow G = G_{\mathrm{conv}}$
+ 6. **Purely technical turns collapse:** $T = 1 \Rightarrow G = G_{\mathrm{tech}}$
 
+ In addition, conversational structure must not overfire on pure code; this guards against technical identifiers masquerading as conversational entities. Together these invariants make the scalar interpretable, stable, and safe to tune.
@@ -0,0 +1,179 @@
+ # Install Guide
+
+ LibraVDB Memory is a connect-only OpenClaw plugin. Install the plugin as a
+ normal package, install `libravdbd` separately, and point the plugin at the
+ daemon endpoint when you need a non-default location.
+
+ For deeper operational detail, use the full
+ [installation reference](./installation.md).
+
+ ## Recommended Path: Homebrew + OpenClaw Plugin
+
+ On macOS, the shortest supported path is:
+
+ ```bash
+ brew tap xDarkicex/openclaw-libravdb-memory
+ brew install libravdbd
+ brew services start libravdbd
+ openclaw plugins install @xdarkicex/openclaw-memory-libravdb
+ ```
+
+ This gives you:
+
+ - a managed `libravdbd` service
+ - a scanner-clean plugin install
+ - a clean separation between plugin lifecycle and daemon lifecycle
+
+ ## Plugin Install
+
+ Install the plugin package with the OpenClaw CLI:
+
+ ```bash
+ openclaw plugins install @xdarkicex/openclaw-memory-libravdb
+ ```
+
+ If you use the OpenClaw.ai plugin UI instead of the CLI, install the same
+ package and then assign the plugin id `libravdb-memory` to the `memory` slot or
+ the `contextEngine` slot.
+
+ Activate the plugin in `~/.openclaw/openclaw.json`:
+
+ ```json
+ {
+   "plugins": {
+     "slots": {
+       "memory": "libravdb-memory"
+     }
+   }
+ }
+ ```
+
+ If you run the daemon on a non-default endpoint, add a plugin config:
+
+ ```json
+ {
+   "plugins": {
+     "slots": {
+       "memory": "libravdb-memory"
+     },
+     "configs": {
+       "libravdb-memory": {
+         "sidecarPath": "unix:/Users/<you>/.clawdb/run/libravdb.sock"
+       }
+     }
+   }
+ }
+ ```
+
+ ## Sidecar Daemon Install
+
+ The daemon owns the local database, embeddings, and JSON-RPC endpoint.
+
+ Default endpoints:
+
+ - macOS/Linux: `unix:$HOME/.clawdb/run/libravdb.sock`
+ - Windows: `tcp:127.0.0.1:37421`
+
+ Default data path:
+
+ - macOS/Linux/Windows user installs: `$HOME/.clawdb/data.libravdb`
+
+ ### Homebrew
+
+ Homebrew is the preferred daemon lifecycle on macOS:
+
+ ```bash
+ brew tap xDarkicex/openclaw-libravdb-memory
+ brew install libravdbd
+ brew services start libravdbd
+ ```
+
+ Useful lifecycle commands:
+
+ ```bash
+ brew services restart libravdbd
+ brew services stop libravdbd
+ brew info libravdbd
+ ```
+
+ ### Manual Service Management
+
+ If you are not using Homebrew, manage the daemon explicitly.
+
+ Linux user service from the repo template:
+
+ ```bash
+ # Replace vX.Y.Z with the latest tag from:
+ # https://github.com/xDarkicex/openclaw-memory-libravdb/releases
+ mkdir -p ~/.local/bin ~/.config/systemd/user
+ curl -L -o ~/.local/bin/libravdbd https://github.com/xDarkicex/openclaw-memory-libravdb/releases/download/vX.Y.Z/libravdbd-linux-amd64
+ chmod +x ~/.local/bin/libravdbd
+ curl -L -o ~/.config/systemd/user/libravdbd.service \
+   https://raw.githubusercontent.com/xDarkicex/openclaw-memory-libravdb/main/packaging/systemd/libravdbd.service
+ systemctl --user enable --now libravdbd.service
+ ```
+
+ macOS LaunchAgent from the repo template:
+
+ 1. Download `com.xdarkicex.libravdbd.plist` from:
+    `https://raw.githubusercontent.com/xDarkicex/openclaw-memory-libravdb/main/packaging/launchd/com.xdarkicex.libravdbd.plist`
+ 2. Replace `__HOME__` with your home directory.
+ 3. Save it to `~/Library/LaunchAgents/com.xdarkicex.libravdbd.plist`.
+ 4. Load it with `launchctl bootstrap gui/$(id -u) ~/Library/LaunchAgents/com.xdarkicex.libravdbd.plist`.
+
+ ### Windows
+
+ Windows uses a loopback TCP endpoint by default:
+
+ - `tcp:127.0.0.1:37421`
+
+ This guide does not yet include a full Windows service-install walkthrough.
+ For now, use the published Windows daemon asset from the GitHub releases page
+ and run it under your preferred process supervisor or a manual terminal session.
+
+ Foreground manual run:
+
+ ```bash
+ libravdbd serve
+ ```
+
+ That mode is useful for debugging or validating a local release asset before
+ you wrap it in `brew services`, `systemd`, or `launchd`.
+
+ ## Lifecycle Management
+
+ ### Plugin Lifecycle
+
+ - Install the package with `openclaw plugins install`.
+ - Activate it by assigning `libravdb-memory` to `memory` or `contextEngine`.
+ - Update it with your normal OpenClaw plugin update flow.
+ - Disable it by removing the slot assignment from `~/.openclaw/openclaw.json`.
+
+ The plugin does not manage the daemon process. Treat plugin activation and
+ daemon supervision as separate lifecycle decisions.
+
+ ### Daemon Lifecycle
+
+ - Start it with `brew services`, `systemd --user`, `launchctl bootstrap`, or a manual `libravdbd serve`.
+ - Restart it when you change daemon-level environment variables or replace the binary.
+ - Stop it before uninstalling or deleting on-disk data.
+ - Point the plugin at the correct endpoint with `sidecarPath` if you do not use the default location.
+
+ ## Verification
+
+ After the plugin and daemon are both in place, run:
+
+ ```bash
+ openclaw memory status
+ ```
+
+ Healthy output should show that:
+
+ - the daemon answered the local health check
+ - the memory slot is active
+ - the plugin can read stored counts and runtime settings
+
+ If OpenClaw cannot reach the daemon, verify the endpoint first:
+
+ - macOS/Linux default: `unix:$HOME/.clawdb/run/libravdb.sock`
+ - Windows default: `tcp:127.0.0.1:37421`
@@ -117,6 +117,19 @@ extractive compaction. The only optional runtime network path is:
 
  ## Standard Install
 
+ ### Fastest Path on macOS
+
+ ```bash
+ brew tap xDarkicex/openclaw-libravdb-memory
+ brew install libravdbd
+ brew services start libravdbd
+ openclaw plugins install @xdarkicex/openclaw-memory-libravdb
+ ```
+
+ This is the preferred install flow for macOS users. It gives you a managed `libravdbd` service and a scanner-clean OpenClaw plugin package.
+
+ ### Plugin Package
+
  ```bash
  openclaw plugins install @xdarkicex/openclaw-memory-libravdb
  ```
@@ -155,7 +168,15 @@ openclaw memory status
 
  ### Homebrew / macOS
 
- The release workflow now generates a publish-ready `libravdbd.rb` formula asset from [`packaging/homebrew/libravdbd.rb.tmpl`](../packaging/homebrew/libravdbd.rb.tmpl). It is designed for GitHub release assets named:
+ Homebrew users should normally install from the published tap:
+
+ ```bash
+ brew tap xDarkicex/openclaw-libravdb-memory
+ brew install libravdbd
+ brew services start libravdbd
+ ```
+
+ The release workflow generates a publish-ready `libravdbd.rb` formula asset from [`packaging/homebrew/libravdbd.rb.tmpl`](../packaging/homebrew/libravdbd.rb.tmpl). It is designed for GitHub release assets named:
 
  - `libravdbd-darwin-arm64`
  - `libravdbd-darwin-amd64`
@@ -169,7 +190,7 @@ If your GitHub Actions configuration includes:
 
  then tagged releases also push the generated formula into `Formula/libravdbd.rb` in that tap repository automatically.
 
- Example:
+ Example plugin config:
 
  ```json
  {
@@ -196,7 +217,7 @@ Installed plugin: libravdb-memory
 
  ## Activation
 
- The plugin declares `kind: "memory"` and is intended to occupy the `memory` slot. If your OpenClaw build also exposes legacy context-engine slotting, keep the memory slot authoritative and use the context-engine slot only for compatibility testing.
+ The plugin declares `kind: ["memory", "context-engine"]` and registers for both the `memory` and `context-engine` slots. Either slot assignment activates the plugin.
 
  Add this to `~/.openclaw/openclaw.json`:
 
@@ -204,8 +225,19 @@ Add this to `~/.openclaw/openclaw.json`:
  {
    "plugins": {
      "slots": {
-       "memory": "libravdb-memory",
-       "contextEngine": "legacy"
+       "memory": "libravdb-memory"
+     }
+   }
+ }
+ ```
+
+ If your OpenClaw build uses the `contextEngine` slot instead, you can assign it there:
+
+ ```json
+ {
+   "plugins": {
+     "slots": {
+       "contextEngine": "libravdb-memory"
      }
    }
  }
@@ -213,12 +245,10 @@ Add this to `~/.openclaw/openclaw.json`:
 
  Notes:
 
- - `memory: "libravdb-memory"` is the actual activation step.
- - `contextEngine: "legacy"` keeps the legacy engine explicit when the host still exposes that slot.
- - If you instead point `contextEngine` at another plugin, you are changing a separate slot from the memory replacement.
+ - Either `memory` or `contextEngine` slot assignment activates the plugin. You do not need both.
  - The plugin id is `libravdb-memory`. The npm package name used at install time is `@xdarkicex/openclaw-memory-libravdb`.
 
- Without the `memory` slot entry, OpenClaw's default memory can continue to run in parallel.
+ Without a slot entry, OpenClaw's default memory can continue to run in parallel.
 
  ## Verification
 
@@ -301,6 +331,12 @@ openclaw memory status
 
  If the daemon is down, start it and verify the configured endpoint:
 
+ ```bash
+ brew services start libravdbd
+ ```
+
+ Or, without Homebrew:
+
  ```bash
  libravdbd serve
  ```