@xdarkicex/openclaw-memory-libravdb 1.3.13 → 1.3.17

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -4,30 +4,84 @@
  [![TypeScript](https://img.shields.io/badge/TypeScript-5.x-3178C6?logo=typescript&logoColor=white)](./package.json)
  [![OpenClaw](https://img.shields.io/badge/OpenClaw-memory%20plugin-111827)](./openclaw.plugin.json)
 
- Local-first memory for OpenClaw that pairs a TypeScript plugin with a Go
- daemon, keeps recent work intact, and promotes durable memory only when the
- signal is strong enough to matter.
+ `@xdarkicex/openclaw-memory-libravdb` is a local-first OpenClaw memory system
+ for people who want more than "top-k vectors plus a prompt footer."
 
- ## Install and Lifecycle
+ It replaces the default lightweight memory path with a full context lifecycle:
 
- - [Install guide](./docs/install.md) for Homebrew, OpenClaw / OpenClaw.ai plugin setup, and manual daemon lifecycle.
- - [Uninstall guide](./docs/uninstall.md) for clean plugin removal, daemon shutdown, and optional data cleanup.
- - [Full installation reference](./docs/installation.md) for deeper operational detail, troubleshooting, and packaging notes.
+ - active session memory
+ - durable per-user memory
+ - shared global memory
+ - continuity-aware compaction
+ - authored context partitioning
+ - hybrid scoring across scope, recency, and similarity
 
- Start with the [install guide](./docs/install.md) for the supported daemon
- setup paths and activation flow. The short version is:
+ This repository pairs a TypeScript OpenClaw plugin with a Go daemon backed by
+ `libraVDB`. The plugin owns both the `memory` and `contextEngine` slots, while
+ the daemon handles embeddings, retrieval, storage, and compaction.
+ On newer OpenClaw builds, it also bridges the built-in `memory_search` runtime
+ to the same libraVDB sidecar instead of leaving that tool inert.
 
- - install the plugin with `openclaw plugins install @xdarkicex/openclaw-memory-libravdb`
- - install and start `libravdbd` separately
- - assign `libravdb-memory` to the OpenClaw `memory` slot
+ ## Why This Exists
 
- Then activate the plugin in `~/.openclaw/openclaw.json`:
+ The stock "single memory bucket" pattern is good for simple persistence, but it
+ starts to break down when you care about:
+
+ - keeping the newest working context raw and intact
+ - separating ephemeral session state from durable memory
+ - avoiding long-session prompt collapse
+ - preserving authored instructions differently from recalled user content
+ - treating memory retrieval as a ranked assembly problem instead of plain
+   nearest-neighbor lookup
+
+ LibraVDB Memory exists for that harder class of memory problem.
+
+ ## What Makes It Different
+
+ These are the core differentiators the project is built around:
+
+ - Dual slot ownership: the plugin owns both memory prompt injection and the
+   full context lifecycle.
+ - Built-in `memory_search` bridge: newer OpenClaw memory runtime calls are
+   routed into the same sidecar-backed retrieval path.
+ - Lifecycle hint adoption: `before_reset` and `session_end` are used as
+   advisory signals to the sidecar without giving OpenClaw control of ingest
+   or compaction.
+ - Sidecar-owned lifecycle journal: reset/end hints are recorded internally for
+   debugging and auditing without entering normal memory retrieval.
+   The journal is bounded by a sidecar retention cap so it does not grow
+   forever.
+ - Local-first runtime: the core path does not depend on external embedding
+   services.
+ - Three-tier memory: session, durable user, and global memory stay distinct.
+ - Hybrid scoring: retrieval is ranked by semantic similarity, recency, scope,
+   and summary quality instead of cosine alone.
+ - Automatic compaction: long sessions compact behind a protected recent tail.
+ - Crash-resilient IPC: the host talks to a sidecar over a stable local socket
+   or loopback TCP endpoint with degraded-mode fallback.
+
+ ## Quick Start
+
+ The supported install flow is:
+
+ ```bash
+ brew tap xDarkicex/openclaw-libravdb-memory
+ brew install libravdbd
+ brew services start libravdbd
+ openclaw plugins install @xdarkicex/openclaw-memory-libravdb
+ ```
+
+ The Homebrew formula installs the daemon plus the bundled ONNX Runtime, embedding assets, and T5 summarizer assets it needs to boot cleanly on supported platforms.
+
+ Then assign the plugin to both required OpenClaw slots in
+ `~/.openclaw/openclaw.json`:
 
  ```json
  {
    "plugins": {
      "slots": {
-       "memory": "libravdb-memory"
+       "memory": "libravdb-memory",
+       "contextEngine": "libravdb-memory"
      },
      "configs": {
        "libravdb-memory": {
@@ -38,45 +92,192 @@ Then activate the plugin in `~/.openclaw/openclaw.json`:
  }
  ```
 
- The published plugin is connect-only. It does not compile or spawn a local Go
- binary during install. The `libravdbd` daemon is managed separately and the
- plugin connects to an endpoint such as `unix:$HOME/.clawdb/run/libravdb.sock`
- or `tcp:127.0.0.1:37421`.
+ Verify the setup:
 
- Use `sidecarPath: "auto"` or omit the field to use the platform default
- endpoint. If your daemon listens elsewhere, set an explicit endpoint such as
- `unix:/custom/path/libravdb.sock` or `tcp:127.0.0.1:9999`.
+ ```bash
+ openclaw memory status
+ ```
 
- ## How It Works
+ Expected healthy state:
 
- - [Hybrid retrieval and prompt assembly](./docs/mathematics-v2.md): combines semantic similarity, recency, memory scope, and budget-aware packing so the prompt keeps the most useful memory instead of only the nearest vectors.
- - [Authored context partitioning](./docs/ast-v2.md): splits authored Markdown into hard directives, soft directives, and searchable lore so critical instructions are always preserved while narrative context still competes through retrieval.
- - [Domain-Adaptive Gating](./docs/gating.md): decides which turns deserve promotion into durable memory by blending conversational and technical signals rather than treating all chats like generic prose.
- - [Continuity preservation](./docs/continuity.md): protects a recent raw session tail and lets older history compact behind it, preventing summaries from erasing the newest working context.
+ - the daemon is reachable
+ - the plugin is active as the memory provider
+ - the runtime can report stored counts and model readiness
 
- Three practical ideas shape the runtime:
+ ## Install Model
 
- - Hybrid ranking keeps session turns, durable user memory, and global memory on the same scoreboard while still respecting recency.
- - Two-pass, in-place compaction preserves continuity by refusing destructive rewrites of the newest working tail.
- - Domain-adaptive ingestion avoids over-saving noisy chatter while still retaining technical decisions, file paths, error signatures, and workflow milestones.
+ This plugin is intentionally **connect-only** at install time.
 
- ## Runtime Model
+ It does not compile Go code during plugin installation, and it does not manage
+ daemon lifecycle automatically from the npm package. That is deliberate: some
+ OpenClaw environments are strict about postinstall behavior, daemon spawning,
+ and anything that looks like binary bootstrap or process management.
 
- - Plugin package: `@xdarkicex/openclaw-memory-libravdb`
- - OpenClaw plugin id: `libravdb-memory`
- - Minimum host version: `openclaw >= 2026.3.22`
- - Default daemon endpoint on macOS/Linux: `unix:$HOME/.clawdb/run/libravdb.sock`
- - Default daemon endpoint on Windows: `tcp:127.0.0.1:37421`
- - Default daemon data path: `$HOME/.clawdb/data.libravdb`
+ Current model:
 
- ## Verify
+ - npm/OpenClaw package: plugin code and docs
+ - `libravdbd`: installed and managed separately
+ - default daemon endpoint on macOS/Linux:
+   `unix:$HOME/.clawdb/run/libravdb.sock`
+ - default daemon endpoint on Windows:
+   `tcp:127.0.0.1:37421`
 
- Run:
+ If your daemon runs elsewhere, set an explicit `sidecarPath`, for example:
 
- ```bash
- openclaw memory status
+ - `unix:/custom/path/libravdb.sock`
+ - `tcp:127.0.0.1:9999`
+
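+ As a sketch, an explicit endpoint would go in the plugin's config block. This
+ example assumes `sidecarPath` lives under `configs."libravdb-memory"` as the
+ snippets above suggest, and the port is purely illustrative:

```json
{
  "plugins": {
    "configs": {
      "libravdb-memory": {
        "sidecarPath": "tcp:127.0.0.1:9999"
      }
    }
  }
}
```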
+ ## Architecture At A Glance
+
+ ```text
+ OpenClaw host
+   -> memoryPromptSection (durable user/global recall)
+   -> memory runtime bridge (built-in memory_search)
+   -> context engine (bootstrap / ingest / assemble / compact)
+     -> plugin runtime
+       -> JSON-RPC
+         -> libravdbd
+           -> libraVDB + local embedding/summarization stack
  ```
 
- Expected output includes a readable status table showing whether the daemon is
- reachable, how much memory is stored, and whether the local summarization path
- is provisioned.
+ The main runtime split is:
+
+ - TypeScript host layer:
+   - OpenClaw plugin registration
+   - prompt assembly
+   - hybrid ranking
+   - continuity-aware token budgeting
+   - degraded-mode behavior
+ - Go daemon layer:
+   - vector storage
+   - embeddings
+   - search RPCs
+   - compaction and summarization
+   - stable local IPC endpoint
+
+ For the implemented architecture map, read
+ [docs/architecture.md](./docs/architecture.md).
+
+ ## Retrieval Model
+
+ The assembly path is not "just search some vectors and paste the top hits."
+
+ It combines:
+
+ - session search for current-work relevance
+ - durable user recall for long-lived personal context
+ - global recall for shared facts
+ - authored invariant and variant context
+ - continuity-preserving recent-tail injection
+ - token-budgeted fitting
+
+ The ranking model currently blends:
+
+ - semantic similarity
+ - scope weighting
+ - recency decay
+ - summary quality attenuation
+
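The blend above can be sketched as a single scoring function. This is an illustrative sketch only: the weights, the recency half-life, and all names here are assumptions, not the plugin's shipped values.

```typescript
// Illustrative hybrid scoring sketch mirroring the four factors listed above.
// Weights, the half-life, and the attenuation shape are hypothetical.
interface Candidate {
  similarity: number;                   // semantic similarity in [0, 1]
  scope: "session" | "user" | "global"; // memory tier
  ageSeconds: number;                   // time since the memory was written
  summaryQuality: number;               // 1.0 for raw turns, lower for lossy summaries
}

const SCOPE_WEIGHT: Record<Candidate["scope"], number> = {
  session: 1.0, // current work ranks highest at equal similarity
  user: 0.8,
  global: 0.6,
};

const HALF_LIFE_SECONDS = 6 * 60 * 60; // hypothetical six-hour recency half-life

function hybridScore(c: Candidate): number {
  const recency = Math.pow(0.5, c.ageSeconds / HALF_LIFE_SECONDS);
  // Similarity is the base signal; scope, recency decay, and summary-quality
  // attenuation scale it rather than replace it.
  return c.similarity * SCOPE_WEIGHT[c.scope] * (0.5 + 0.5 * recency) * c.summaryQuality;
}

const candidates: Candidate[] = [
  { similarity: 0.9, scope: "global", ageSeconds: 86_400, summaryQuality: 0.7 },
  { similarity: 0.8, scope: "session", ageSeconds: 60, summaryQuality: 1.0 },
];
// A fresh session turn can outrank a slightly more similar but stale global summary.
const ranked = [...candidates].sort((a, b) => hybridScore(b) - hybridScore(a));
```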
+ The formal math lives in:
+
+ - [docs/mathematics-v2.md](./docs/mathematics-v2.md)
+ - [docs/continuity.md](./docs/continuity.md)
+ - [docs/ast-v2.md](./docs/ast-v2.md)
+ - [docs/elevated-guidance.md](./docs/elevated-guidance.md)
+
+ ## Compaction Model
+
+ This system does not treat long chats as append-only forever.
+
+ Older session turns compact behind a protected recent tail, so the plugin can:
+
+ - keep the newest working context raw
+ - preserve adjacency-sensitive continuity near the boundary
+ - promote older material into summaries
+ - avoid letting long sessions drown their own prompt budget
+
+ Compaction is designed as part of the memory system itself, not as a separate
+ maintenance convenience.
+
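The tail-protection rule can be sketched as a backward scan that reserves a token budget for the newest turns. A minimal sketch under assumed names; the token estimator, budget, and types are illustrative, not the shipped compactor:

```typescript
// Illustrative sketch: split a transcript into a protected recent tail that
// stays raw, and an older head that is eligible for compaction.
interface Turn {
  text: string;
  tokens: number; // from whatever token estimator the host uses
}

function splitForCompaction(
  turns: Turn[],
  tailBudgetTokens: number
): { compactable: Turn[]; protectedTail: Turn[] } {
  let used = 0;
  let cut = turns.length;
  // Walk backwards from the newest turn, reserving the tail budget.
  for (let i = turns.length - 1; i >= 0; i--) {
    if (used + turns[i].tokens > tailBudgetTokens) break;
    used += turns[i].tokens;
    cut = i;
  }
  return {
    compactable: turns.slice(0, cut), // older turns: may be summarized
    protectedTail: turns.slice(cut),  // newest turns: never rewritten
  };
}
```

Everything before the cut point may be summarized; everything after it is injected verbatim.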
+ ## For Power Users
+
+ If you are evaluating this as an operator or advanced OpenClaw user, the key
+ practical points are:
+
+ - This plugin should own both `memory` and `contextEngine`. Partial slot
+   assignment is a misconfiguration.
+ - On hosts that expose `registerMemoryRuntime`, the built-in `memory_search`
+   tool now searches the same libraVDB-backed memory stores.
+ - The daemon is a separate operational unit. Treat plugin lifecycle and daemon
+   lifecycle as different concerns.
+ - The system is local-first by design. The critical retrieval path does not
+   require a remote embedding service.
+ - The sidecar transport is stable and explicit, which makes it service-manager
+   friendly on macOS, Linux, and Windows.
+
+ Good entry points:
+
+ - [docs/install.md](./docs/install.md)
+ - [docs/installation.md](./docs/installation.md)
+ - [docs/uninstall.md](./docs/uninstall.md)
+ - [docs/implementation.md](./docs/implementation.md)
+
+ ## For Researchers And Builders
+
+ If you are studying retrieval, memory systems, or agent architecture, the
+ interesting parts of this repo are:
+
+ - continuity-aware assembly:
+   `C_total(q) = I union T_recent union Proj(V_rest, q)`
+ - hybrid ranking instead of pure cosine retrieval
+ - separation of authored invariants from searchable authored lore
+ - durable-memory admission via domain-adaptive gating
+ - local daemon architecture rather than in-process TS vector plumbing
+ - compaction that preserves recent working context instead of flattening the
+   whole transcript
+
+ Start here:
+
+ - [docs/problem.md](./docs/problem.md)
+ - [docs/architecture.md](./docs/architecture.md)
+ - [docs/mathematics-v2.md](./docs/mathematics-v2.md)
+ - [docs/gating.md](./docs/gating.md)
+ - [docs/continuity.md](./docs/continuity.md)
+
+ ## Runtime Facts
+
+ - npm package: `@xdarkicex/openclaw-memory-libravdb`
+ - OpenClaw plugin id: `libravdb-memory`
+ - minimum host version: `openclaw >= 2026.3.22`
+ - default daemon data path: `$HOME/.clawdb/data.libravdb`
+ - default daemon endpoint on macOS/Linux:
+   `unix:$HOME/.clawdb/run/libravdb.sock`
+ - default daemon endpoint on Windows:
+   `tcp:127.0.0.1:37421`
+
+ ## Repository Guide
+
+ - [docs/install.md](./docs/install.md): quick install and lifecycle guide
+ - [docs/installation.md](./docs/installation.md): full installation and
+   packaging reference
+ - [docs/uninstall.md](./docs/uninstall.md): clean shutdown and removal
+ - [docs/architecture.md](./docs/architecture.md): current implemented system
+   architecture
+ - [docs/implementation.md](./docs/implementation.md): important implementation
+   contracts
+ - [docs/mathematics-v2.md](./docs/mathematics-v2.md): formal scoring and
+   optimization reference
+
+ ## Current Constraint
+
+ Because OpenClaw environments can be strict about postinstall downloads,
+ daemon spawning, and scanner-visible binary bootstrap behavior, the cleanest
+ supported user path today is:
+
+ - install plugin
+ - install daemon
+ - assign both slots
+ - let the plugin connect to a stable local endpoint
+
+ That tradeoff is intentional. It keeps the plugin installation surface simple
+ and auditable while preserving the full local memory engine at runtime.
package/docs/README.md CHANGED
@@ -13,6 +13,7 @@ to preserve project history and design evolution.
  - [compaction-evaluation.md](./compaction-evaluation.md) - Real-model benchmark notes for T5 summary confidence, Nomic-space preservation, and the hard preservation gate.
  - [continuity.md](./continuity.md) - Continuity model for invariant context, preserved recent raw session tail, and retrieved older memory.
  - [ast-v2.md](./ast-v2.md) - Reviewed authoritative AST partitioning reference for authored Markdown hard invariants, soft invariants, and variant lore.
+ - [elevated-guidance.md](./elevated-guidance.md) - Tier 1.5 protected-shard and elevated-guidance model for preserving shadow rules through compaction.
  - [ast.md](./ast.md) - Historical predecessor to `ast-v2.md`, kept to show design evolution and earlier bugs.
  - [gating.md](./gating.md) - Full derivation and calibration guide for the domain-adaptive gating scalar.
  - [implementation.md](./implementation.md) - Non-obvious implementation decisions and their rationale.
package/docs/ast-v2.md CHANGED
@@ -33,6 +33,27 @@ We formalize this as a binary promotion scalar \(\sigma: N_d \to \{0,1\}\). This
  \end{cases}
  \]
 
+ To reason about tuning noise in the bigram set \(W_{\mathrm{deontic}}\), we
+ also define the paragraph classifier error rates:
+ \[
+ P_{\mathrm{fp}} = P(\sigma(n) = 1 \mid n \text{ is narrative lore})
+ \]
+ \[
+ P_{\mathrm{fn}} = P(\sigma(n) = 0 \mid n \text{ is behavioral rule})
+ \]
+
+ For authored documents whose lore paragraphs would otherwise remain in
+ \(\mathcal{V}_d\), the expected Tier-2 waste introduced by false positives is:
+ \[
+ \mathbb{E}[\mathrm{wasted\ toks\ in\ }\mathcal{I}_2]
+ =
+ P_{\mathrm{fp}} \cdot |\mathcal{V}_{d,\mathrm{paragraphs}}| \cdot \mathbb{E}[\mathrm{toks}(n)]
+ \]
+
+ This gives the parser a concrete quantity to minimize when adjusting
+ \(W_{\mathrm{deontic}}\), while \(P_{\mathrm{fn}}\) measures the risk of leaving
+ true behavioral rules behind in \(\mathcal{V}_d\).
+
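+ For intuition, substituting purely illustrative values
+ \(P_{\mathrm{fp}} = 0.05\), \(|\mathcal{V}_{d,\mathrm{paragraphs}}| = 40\), and
+ \(\mathbb{E}[\mathrm{toks}(n)] = 120\) gives
+ \[
+ \mathbb{E}[\mathrm{wasted\ toks\ in\ }\mathcal{I}_2] = 0.05 \cdot 40 \cdot 120 = 240,
+ \]
+ so even a 5% false-positive rate on a lore-heavy document leaks a few hundred
+ tokens of lore into the Tier-2 budget.
+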
  *Implemented via `NewDeonticFrame` and `EvaluateText` in the zero-allocation byte lexer.*
 
  ## 3. The Three-Tier Structural Indicator Function \(\iota\)
  ## 3. The Three-Tier Structural Indicator Function \(\iota\)
@@ -59,12 +80,27 @@ We define the structural indicator function \(\iota: N_d \to \{0,1,2\}\) mapping
  ## 4. Corpus Decomposition and Set Integration
 
  For any document \(d \in \mathbf{D}_{\text{agents}} \cup \mathbf{D}_{\text{souls}}\), the node set \(N_d\) is partitioned cleanly into three sets:
- - **Hard Directives:** \(\mathcal{I}_{1d} = \{ n \in N_d \mid \iota(n) = 1 \}\)
+ - **Hard Directives:** \(\mathcal{I}_{1d} = \langle n \in N_d \mid \iota(n) = 1 \rangle\), ordered by \(\mathrm{position}(n)\) ascending, where \(\mathrm{position}(n)\) is the byte offset of node \(n\) in \(d_{\mathrm{raw}}\)
  - **Soft Directives:** \(\mathcal{I}_{2d} = \{ n \in N_d \mid \iota(n) = 2 \}\)
  - **Contextual Lore:** \(\mathcal{V}_d = \{ n \in N_d \mid \iota(n) = 0 \}\)
 
  *Partition Completeness:* Because \(\iota(n)\) maps every node to exactly one integer in \(\{0, 1, 2\}\), the resulting sets are mutually exclusive and collectively exhaustive:
- \[ \mathcal{I}_{1d} \cup \mathcal{I}_{2d} \cup \mathcal{V}_d = N_d \quad \text{and} \quad \mathcal{I}_{1d} \cap \mathcal{I}_{2d} \cap \mathcal{V}_d = \emptyset \]
+ \[
+ \mathcal{I}_{1d} \cup \mathcal{I}_{2d} \cup \mathcal{V}_d = N_d
+ \]
+ \[
+ \mathcal{I}_{1d} \cap \mathcal{I}_{2d} = \emptyset
+ \]
+ \[
+ \mathcal{I}_{1d} \cap \mathcal{V}_d = \emptyset
+ \]
+ \[
+ \mathcal{I}_{2d} \cap \mathcal{V}_d = \emptyset
+ \]
+
+ These pairwise disjointness statements follow directly from \(\iota\) being a
+ single-valued total function into \(\{0,1,2\}\): no node can be assigned to
+ more than one tier simultaneously.
 
  These sets integrate into the global corpus. Let \(\mathbf{D}_{\text{standard}}\) be the set of standard memory documents (non-core files). We formally define the standard variant node set as \(\mathcal{V}_{\text{standard}} = \bigcup_{d \in \mathbf{D}_{\text{standard}}} E(d)\). The global corpus is then:
  \[ \mathcal{I}_1 = \bigcup_{d} \mathcal{I}_{1d} \qquad \mathcal{I}_2 = \bigcup_{d} \mathcal{I}_{2d} \qquad \mathcal{V} = \mathcal{V}_{\text{standard}} \cup \left( \bigcup_{d} \mathcal{V}_d \right) \]
@@ -86,7 +122,7 @@ For Hard Invariants (\(\alpha_1\)):
  \[ \sum_{n \in \mathcal{I}_{1d}} \mathrm{toks}(n) \le \alpha_1 \tau \implies \text{fast-fail and reject agent load if exceeded} \]
 
  For Soft Invariants (\(\alpha_2\)):
- \[ \sum_{n \in \mathcal{I}_{2d}} \mathrm{toks}(n) \le \alpha_2 \tau \implies \text{truncate by position if exceeded} \]
+ \[ \sum_{n \in \mathcal{I}_{2d}} \mathrm{toks}(n) \le \alpha_2 \tau \implies \text{truncate by source position if exceeded} \]
 
  *Cumulative Verification Proof:* Let the total reserved invariant budget fraction be \(\alpha\), where \(\alpha_1 + \alpha_2 \le \alpha\). If both independent enforcement bounds are satisfied, then:
  \[ \sum_{n \in \mathcal{I}_{1d}} \mathrm{toks}(n) + \sum_{n \in \mathcal{I}_{2d}} \mathrm{toks}(n) \le \alpha_1 \tau + \alpha_2 \tau = (\alpha_1 + \alpha_2)\tau \le \alpha \tau \]
@@ -102,14 +138,20 @@ therefore treats the tiers with the following precedence:
  3. **Tier 2 / Soft invariants** are injected by longest-prefix truncation under the effective budget
  \[
  \tau_{\mathcal{I}_2}^{\mathrm{eff}}=
+ \max\!\left(0,\,
  \min\!\left(\alpha_2\tau,\,
- \tau-\tau_{\mathcal{I}_1}-\mathrm{toks}(T_{\mathrm{base}})\right)
+ \tau-\tau_{\mathcal{I}_1}-\mathrm{toks}(T_{\mathrm{base}})\right)\right)
  \]
  4. **Variant lore** competes only for the final residual budget after Tier 1,
  the admitted Tier 2 prefix, and the exact recent tail are accounted for.
 
  This makes \(\mathcal{I}_1\) and the minimum continuity suffix hard
  constraints, while keeping \(\mathcal{I}_2\) order-preserving but elastic.
+ Equivalently, the runtime safety invariant is
+ \[
+ \tau_{\mathcal{I}_1} + \mathrm{toks}(T_{\mathrm{base}}) \le \tau;
+ \]
+ if it is violated, \(\tau_{\mathcal{I}_2}^{\mathrm{eff}} = 0\) and Tier 2 is
+ fully evicted.
 
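The clamped effective budget can be sketched directly in host-side code. A minimal sketch: the function and parameter names are illustrative stand-ins for the symbols above, not the implementation.

```typescript
// Sketch of the clamped Tier-2 effective budget:
// tauEff = max(0, min(alpha2 * tau, tau - tauI1 - toksBase))
function tier2EffectiveBudget(
  tau: number,      // total token budget (tau)
  alpha2: number,   // Tier-2 reserved budget fraction (alpha_2)
  tauI1: number,    // tokens consumed by Tier-1 hard invariants (tau_I1)
  toksBase: number  // tokens consumed by the base text / recent tail (toks(T_base))
): number {
  return Math.max(0, Math.min(alpha2 * tau, tau - tauI1 - toksBase));
}
```

When Tier 1 plus the base text already exceed \(\tau\), the inner term goes negative and the outer clamp evicts Tier 2 entirely.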
  ## 7. The Document-Addressed Cache (\(\Psi\)) and Runtime Implications
 
@@ -121,5 +163,5 @@ Because the token estimator function \(\lceil \frac{|t|}{\chi(t)} \rceil\) depen
 
  At runtime:
  1. **Tier 1 (\(\mathcal{I}_{1d}\))** is injected via an \(O(1)\) memory copy.
- 2. **Tier 2 (\(\mathcal{I}_{2d}\))** is evaluated via an \(O(|\mathcal{I}_{2d}|)\) prefix sum to enforce position truncation under \(\tau_{\mathcal{I}_2}^{\mathrm{eff}}\).
+ 2. **Tier 2 (\(\mathcal{I}_{2d}\))** is evaluated via an \(O(|\mathcal{I}_{2d}|)\) prefix sum to enforce source-order truncation under \(\tau_{\mathcal{I}_2}^{\mathrm{eff}}\).
  3. **Tier 0 (\(\mathcal{V}_d\))** bypasses re-parsing and feeds into the semantic Pass 1 vector retrieval only after the continuity layer removes the exact recent tail into \(T_{\mathrm{recent}}\), leaving \(\mathcal{V}_{\mathrm{rest}}\).