kairos-chain 3.6.2 → 3.7.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 70e5ff9eff0a25ceba4e38cf6bdb7f0c7fb45bd1f11b9345148625fe86b48be2
-  data.tar.gz: 7de3d8c8b819cc86ab3847a60250ecc98fe1bf66077a10a9f43521910f51480b
+  metadata.gz: cef8ee3fd2052bb3d553e0c7ee0569d50f88fcfadea13dec6722122e2e71adb7
+  data.tar.gz: de6425c1fa691e86ad4e3597f36733b9ca592cb017ad58b69d54aecb64b63dd6
 SHA512:
-  metadata.gz: 4d8385510f0309d0705ac2a311da86f05c62332b77cf0f4cc7a0de7d271863f73e9137320d5d7470d5552c0b3da441500c178b7f3d5d8abd6e0b253ce7d562ed
-  data.tar.gz: f298792554a96d5da897523d9b6342d3fef2bc06e4d9a86d54a829c611fe5885fe6896168f7fc1853e3565bc2f4e71ed86d3740b39b898aaa4b0afc3c5337f4b
+  metadata.gz: bc9dec7fd29edf94dce3de1a18fc90eb006695cb22475195aeb4422f6aa843a0efc1b2fa2e5796891768934c7307fc0687d0420f2883a099b700a80077ae4b7a
+  data.tar.gz: db1e322c0b191ab12d949c403b6b04a5c6d99fccd9dd2e6528754a066a4bf3db26e4fb3e0cba4898826da13887d52f91c3018377a47aa5dd1ee25316f850f7df
data/CHANGELOG.md CHANGED
@@ -4,6 +4,35 @@ All notable changes to the `kairos-chain` gem will be documented in this file.
 
 This project follows [Semantic Versioning](https://semver.org/).
 
+## [3.7.0] - 2026-03-29
+
+### Added
+
+- **Dream SkillSet** — L2 memory consolidation and lifecycle management
+  - `dream_scan`: Pattern detection across L2 sessions — tag co-occurrence,
+    L2/L1 staleness (mtime-based), name overlap (Jaccard), archive candidate detection.
+    Filters soft-archived stubs from promotion candidates.
+  - `dream_archive`: L2 soft-archive — gzip-compress the `.md`, move the full context
+    directory to the archive, and leave a searchable stub (tags + summary). SHA256
+    verified inline. Per-context flock. `dry_run: true` by default.
+  - `dream_recall`: Restore archived contexts with SHA256 integrity check.
+    Preview and verify-only modes (read-only, no permission required).
+  - `dream_propose`: Package L1 promotion proposals with ready-to-execute
+    `knowledge_update` commands. Optional Persona Assembly templates.
+  - L2 lifecycle model: Active → Candidate → Soft-Archived → Recalled
+  - `dream_trigger_policy` L1 knowledge for Kairotic trigger heuristics
+  - 119 tests across 27 test sections
+
+- **Agent SkillSet — permission advisory**
+  - `agent_start` now includes a `permission_advisory` in its response,
+    recommending that users configure a permission mode (Normal / Auto-allow /
+    Auto-accept) for smoother autonomous operation
+
+### Fixed
+
+- **L1 staleness detection** — use tag overlap and name-token matching instead of
+  the exact L1-name-in-L2-tags check. Reduces false positives from 48/48 to 7/48.
+
 ## [3.6.0] - 2026-03-28
 
 ### Added
@@ -1,4 +1,4 @@
 module KairosMcp
-  VERSION = "3.6.2"
+  VERSION = "3.7.0"
   CHANGELOG_URL = "https://github.com/masaomi/KairosChain_2026/blob/main/CHANGELOG.md"
 end
@@ -95,13 +95,18 @@ module KairosMcp
       session.update_state('observed')
       session.save
 
-      text_content(JSON.generate({
+      result = {
         'status' => 'ok',
         'session_id' => session_id,
         'mandate_id' => mandate[:mandate_id],
         'state' => 'observed',
         'observation' => observation
-      }))
+      }
+
+      # Advisory: suggest permission mode for autonomous operation
+      result['permission_advisory'] = permission_advisory_message
+
+      text_content(JSON.generate(result))
     rescue ArgumentError => e
       text_content(JSON.generate({ 'status' => 'error', 'error' => e.message }))
     rescue StandardError => e
@@ -126,6 +131,22 @@ module KairosMcp
       )
     end
 
+    def permission_advisory_message
+      <<~MSG.strip
+        This agent session will execute tools autonomously.
+        For smoother operation, consider adjusting your permission mode:
+
+        1. Normal (default) — ask for each command. Safest, but interrupts flow.
+        2. Auto-allow — pre-approved commands only. Balanced.
+           Configure in .claude/settings.local.json permissions.allow array.
+        3. Auto-accept — allow everything. Fastest for trusted tasks.
+           Run /permissions and select auto mode, or start with --dangerously-skip-permissions.
+
+        Recommendation: For implementation + multi-LLM review workflows, auto-allow
+        with ruby/codex/agent commands pre-approved provides the best balance.
+      MSG
+    end
+
     def run_observe(goal_name)
       # Gather observation data without LLM
       observation = { 'goal_name' => goal_name, 'timestamp' => Time.now.iso8601 }
@@ -0,0 +1,26 @@
+scan:
+  default_scope: "l2"
+  min_recurrence: 3
+  max_candidates: 5
+  skip_archived: true  # scan filters skip soft-archived stubs
+
+archive:
+  staleness_threshold_days: 90
+  dry_run_default: true  # safe by default
+  preserve_gzip: true
+  archive_dir: "dream/archive"
+
+bisociation:
+  enabled: false  # off by default
+  min_pmi: 1.5
+  min_pair_count: 2
+  min_tag_types: 15
+  max_results: 5
+
+staleness:
+  method: "mtime"  # explicit definition
+  # future options: "referenced" (requires reference tracking)
+
+recording:
+  scan_findings_only: true
+  archive_events: true
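A consumer of this file can be sketched with stdlib YAML plus per-section default merging that mirrors the values above. This is an illustration, not the gem's API: `load_dream_config` and the `DREAM_DEFAULTS` subset are our names.

```ruby
require 'yaml'

# Defaults mirroring the shipped dream config (illustrative subset).
DREAM_DEFAULTS = {
  'scan'        => { 'default_scope' => 'l2', 'min_recurrence' => 3,
                     'max_candidates' => 5, 'skip_archived' => true },
  'archive'     => { 'staleness_threshold_days' => 90, 'dry_run_default' => true,
                     'preserve_gzip' => true, 'archive_dir' => 'dream/archive' },
  'bisociation' => { 'enabled' => false, 'min_pmi' => 1.5 }
}.freeze

# Parse YAML text and fill in defaults section by section.
def load_dream_config(yaml_text)
  loaded = YAML.safe_load(yaml_text) || {}
  DREAM_DEFAULTS.each_with_object({}) do |(section, defaults), merged|
    merged[section] = defaults.merge(loaded.fetch(section, {}))
  end
end

cfg = load_dream_config("scan:\n  min_recurrence: 5\n")
cfg['scan']['min_recurrence']     # => 5 (user override)
cfg['archive']['dry_run_default'] # => true (default preserved)
```

Section-level merging means a user file can override one key (here `min_recurrence`) without restating the rest of the section.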
@@ -0,0 +1,96 @@
+---
+name: dream_trigger_policy
+version: "0.2.1"
+tags:
+  - dream
+  - trigger
+  - heuristics
+  - consolidation
+  - archival
+---
+
+# Dream Trigger Policy
+
+## Overview
+
+This knowledge defines the heuristics that determine when Dream SkillSet operations
+should be triggered. Dream is a memory consolidation system that scans L2 contexts
+for recurring patterns, manages soft-archival of stale contexts, and packages
+promotion proposals for L2-to-L1 elevation.
+
+## Trigger Heuristics
+
+### 1. Promotion Candidate Detection (Tag Co-occurrence)
+
+**Signal**: A tag appears in `min_recurrence` or more distinct sessions.
+
+- Default threshold: 3 sessions
+- Only live contexts are considered (soft-archived stubs are excluded)
+- Tags are extracted from the YAML frontmatter of context `.md` files
+- Co-occurrence is counted per session, not per context (a tag appearing 5 times
+  in one session counts as 1 occurrence)
+
+**Action**: Include in the `promotion_candidates` list for LLM evaluation.
+
+### 2. Consolidation Candidate Detection (Name Overlap)
+
+**Signal**: Two or more contexts share high name-token overlap (Jaccard similarity).
+
+- Default threshold: Jaccard >= 0.5
+- Comparison uses underscore-separated tokens from context names
+- Example: `deployment_notes` and `deployment_config_notes` share the tokens
+  `deployment` and `notes`, yielding Jaccard = 2/3 = 0.67
+
+**Action**: Include in the `consolidation_candidates` list for LLM evaluation.
+
+### 3. L2 Staleness Detection (Archive Candidates)
+
+**Signal**: A context's file modification time (mtime) exceeds the staleness threshold.
+
+- Default threshold: 90 days
+- Uses `File.mtime` on the context `.md` file
+- Only live contexts are candidates (already-archived stubs are excluded)
+- "Semantic reference" tracking is a future enhancement (v2.2+)
+
+**Action**: Include in the `archive_candidates` list. Archiving requires an explicit
+`dry_run: false` to execute.
+
+### 4. L1 Staleness Detection
+
+**Signal**: An L1 knowledge skill's name does not appear in any recent L2 context tags.
+
+- Compares L1 knowledge names against the union of all L2 tags
+- L1 skills that are never referenced in L2 may indicate orphaned knowledge
+
+**Action**: Include in `health_summary.stale_l1` for informational purposes.
+No automatic action is taken on L1 knowledge.
+
+### 5. Bisociation Detection (Experimental)
+
+**Signal**: Two tags that rarely co-occur in the same session but each appear
+frequently across separate sessions.
+
+- Measured by Pointwise Mutual Information (PMI)
+- Off by default (`bisociation.enabled: false`)
+- Guarded by `min_pair_count >= 2` and `min_tag_types >= 15`
+- Brittle on small corpora; intended for mature knowledge bases
+
+**Action**: Advisory only. The LLM evaluates substance; no auto-promotion.
+
+## When to Run dream_scan
+
+Recommended triggers:
+
+1. **Session boundary**: At the end of a work session, scan for new patterns
+2. **Periodic**: Weekly or bi-weekly for knowledge health monitoring
+3. **Before major work**: Scan to surface relevant prior knowledge
+4. **Autonomos integration**: As part of the autonomos reflect cycle
+
+## Interpretation Guidelines
+
+- Promotion candidates are suggestions, not mandates. The LLM should evaluate
+  whether the recurring pattern represents genuine reusable knowledge.
+- Archive candidates should be reviewed before archiving. The 90-day threshold
+  is a heuristic; some contexts may be intentionally long-lived.
+- Consolidation candidates may represent genuinely distinct concepts that happen
+  to share naming. Always verify semantic overlap before merging.
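The two statistical signals above, name-token Jaccard (heuristic 2) and tag-pair PMI (heuristic 5), are compact enough to sketch. The `jaccard` and `pmi` functions below are illustrative stand-ins for the gem's internals, not its actual methods:

```ruby
require 'set'

# Heuristic 2: Jaccard similarity over underscore-separated name tokens.
def jaccard(name_a, name_b)
  a = Set.new(name_a.split('_'))
  b = Set.new(name_b.split('_'))
  (a & b).size.to_f / (a | b).size
end

jaccard('deployment_notes', 'deployment_config_notes').round(2) # => 0.67

# Heuristic 5: Pointwise Mutual Information over per-session tag sets.
# pmi = log2( p(a,b) / (p(a) * p(b)) ); a pair that is common separately
# but rare together scores low, flagging potential bisociations.
def pmi(session_tag_sets, tag_a, tag_b)
  n   = session_tag_sets.size.to_f
  pa  = session_tag_sets.count { |s| s.include?(tag_a) } / n
  pb  = session_tag_sets.count { |s| s.include?(tag_b) } / n
  pab = session_tag_sets.count { |s| s.include?(tag_a) && s.include?(tag_b) } / n
  return 0.0 if pab.zero?
  Math.log2(pab / (pa * pb))
end
```

The worked `deployment_notes` example in heuristic 2 falls out directly: intersection {deployment, notes} over union {deployment, config, notes} gives 2/3.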
@@ -0,0 +1,417 @@
+# frozen_string_literal: true
+
+require 'digest'
+require 'fileutils'
+require 'yaml'
+require 'zlib'
+require 'time'
+
+module KairosMcp
+  module SkillSets
+    module Dream
+      class Archiver
+        LOCK_FILE = '.dream_lock'
+
+        def initialize(config: {})
+          archive_config = config.fetch('archive', {})
+          @archive_dir_name = archive_config.fetch('archive_dir', 'dream/archive')
+          @preserve_gzip = archive_config.fetch('preserve_gzip', true)
+        end
+
+        # Soft-archive a single L2 context.
+        # Moves full directory contents to archive, leaves stub-only dir.
+        #
+        # @param session_id [String] Session ID
+        # @param context_name [String] Context name
+        # @param summary [String] Caller-provided summary for the stub
+        # @return [Hash] Result with content_hash, sizes, moved_subdirs
+        def archive_context!(session_id:, context_name:, summary:)
+          src_dir = context_dir_path(session_id, context_name)
+          arch_dir = archive_dir_path(session_id, context_name)
+          md_file = File.join(src_dir, "#{context_name}.md")
+
+          with_lock(src_dir) do
+            # 1. Validate source exists and is not already archived
+            raise "Context not found: #{md_file}" unless File.exist?(md_file)
+            raise "Already archived: #{context_name}" if archived?(session_id: session_id, context_name: context_name)
+
+            # 2. Read and compress the markdown
+            original_content = File.read(md_file)
+            original_size = original_content.bytesize
+            content_hash = Digest::SHA256.hexdigest(original_content)
+
+            FileUtils.mkdir_p(arch_dir)
+            gz_path = File.join(arch_dir, "#{context_name}.md.gz")
+            Zlib::GzipWriter.open(gz_path) { |gz| gz.write(original_content) }
+
+            # 3. Verify gzip integrity immediately after write
+            verify_hash = Digest::SHA256.hexdigest(Zlib::GzipReader.open(gz_path, &:read))
+            unless verify_hash == content_hash
+              FileUtils.rm_f(gz_path)
+              raise "Gzip verification failed — archive aborted, original intact"
+            end
+
+            # 4. Move ALL subdirs/files (except .md and .dream_lock) to archive
+            moved_subdirs = []
+            Dir.children(src_dir).each do |child|
+              next if child == "#{context_name}.md"
+              next if child == LOCK_FILE
+
+              child_src = File.join(src_dir, child)
+              FileUtils.mv(child_src, File.join(arch_dir, child))
+              moved_subdirs << child if File.directory?(File.join(arch_dir, child))
+            end
+
+            # 5. Extract original frontmatter for stub metadata
+            original_meta = extract_frontmatter(original_content)
+
+            # 6. Detect what was moved for stub flags
+            has_scripts = moved_subdirs.include?('scripts')
+            has_assets = moved_subdirs.include?('assets')
+            has_references = moved_subdirs.include?('references')
+
+            # 7. Write stub atomically (tempfile + rename)
+            stub_content = generate_stub(
+              context_name: context_name,
+              summary: summary,
+              content_hash: content_hash,
+              original_size: original_size,
+              original_meta: original_meta,
+              archive_ref: archive_ref_path(session_id, context_name),
+              has_scripts: has_scripts,
+              has_assets: has_assets,
+              has_references: has_references
+            )
+            tmp_path = "#{md_file}.tmp"
+            File.write(tmp_path, stub_content)
+            File.rename(tmp_path, md_file) # POSIX atomic
+
+            {
+              success: true,
+              context_name: context_name,
+              session_id: session_id,
+              content_hash: content_hash,
+              original_size: original_size,
+              stub_size: stub_content.bytesize,
+              moved_subdirs: moved_subdirs,
+              verified: true
+            }
+          end
+        end
+
+        # Restore a soft-archived context.
+        #
+        # @param session_id [String] Session ID
+        # @param context_name [String] Context name
+        # @return [Hash] Result with restored_hash, verified
+        def recall_context!(session_id:, context_name:)
+          src_dir = context_dir_path(session_id, context_name)
+          arch_dir = archive_dir_path(session_id, context_name)
+          md_file = File.join(src_dir, "#{context_name}.md")
+          gz_path = File.join(arch_dir, "#{context_name}.md.gz")
+
+          with_lock(src_dir) do
+            # 1. Verify archive exists and integrity
+            raise "Archive not found: #{gz_path}" unless File.exist?(gz_path)
+            raise "Stub not found: #{md_file}" unless File.exist?(md_file)
+
+            stub_meta = parse_stub(md_file)
+            restored_content = Zlib::GzipReader.open(gz_path, &:read)
+            restored_hash = Digest::SHA256.hexdigest(restored_content)
+
+            unless restored_hash == stub_meta[:content_hash]
+              raise "Archive integrity check failed. Expected #{stub_meta[:content_hash]}, " \
+                    "got #{restored_hash}. Archive may be corrupted."
+            end
+
+            # 2. Restore markdown atomically
+            tmp_path = "#{md_file}.tmp"
+            File.write(tmp_path, restored_content)
+            File.rename(tmp_path, md_file) # POSIX atomic
+
+            # 3. Move subdirectories and files back from archive
+            moved_back = []
+            Dir.children(arch_dir).each do |child|
+              next if child == "#{context_name}.md.gz"
+
+              child_arch = File.join(arch_dir, child)
+              FileUtils.mv(child_arch, File.join(src_dir, child))
+              moved_back << child
+            end
+
+            # 4. Clean up archive (configurable)
+            if @preserve_gzip
+              # Keep gzip for safety — no cleanup
+            else
+              FileUtils.rm_rf(arch_dir)
+            end
+
+            {
+              success: true,
+              context_name: context_name,
+              session_id: session_id,
+              restored_hash: restored_hash,
+              restored_size: restored_content.bytesize,
+              moved_back: moved_back,
+              verified: true,
+              archive_preserved: @preserve_gzip
+            }
+          end
+        end
+
+        # Preview archived content without restoring.
+        #
+        # @param session_id [String] Session ID
+        # @param context_name [String] Context name
+        # @return [Hash] Preview result with content and metadata
+        def preview(session_id:, context_name:)
+          arch_dir = archive_dir_path(session_id, context_name)
+          gz_path = File.join(arch_dir, "#{context_name}.md.gz")
+          md_file = File.join(context_dir_path(session_id, context_name), "#{context_name}.md")
+
+          raise "Archive not found: #{gz_path}" unless File.exist?(gz_path)
+
+          content = Zlib::GzipReader.open(gz_path, &:read)
+          stub_meta = File.exist?(md_file) ? parse_stub(md_file) : {}
+
+          {
+            success: true,
+            context_name: context_name,
+            session_id: session_id,
+            content: content,
+            content_size: content.bytesize,
+            content_hash: Digest::SHA256.hexdigest(content),
+            stub_meta: stub_meta
+          }
+        end
+
+        # Check if a context is soft-archived.
+        #
+        # @param session_id [String] Session ID
+        # @param context_name [String] Context name
+        # @return [Boolean]
+        def archived?(session_id:, context_name:)
+          md_file = File.join(context_dir_path(session_id, context_name), "#{context_name}.md")
+          return false unless File.exist?(md_file)
+
+          content = File.read(md_file)
+          status = extract_status(content)
+          status == 'soft-archived'
+        end
+
+        # Verify archive integrity without modifying anything.
+        #
+        # @param session_id [String] Session ID
+        # @param context_name [String] Context name
+        # @return [Hash] Verification result
+        def verify(session_id:, context_name:)
+          arch_dir = archive_dir_path(session_id, context_name)
+          md_file = File.join(context_dir_path(session_id, context_name), "#{context_name}.md")
+          gz_path = File.join(arch_dir, "#{context_name}.md.gz")
+
+          issues = []
+
+          # Check stub exists and is archived
+          unless File.exist?(md_file)
+            issues << "Stub file not found: #{md_file}"
+          end
+
+          stub_meta = {}
+          if issues.empty?
+            content = File.read(md_file)
+            status = extract_status(content)
+            unless status == 'soft-archived'
+              issues << "Context is not archived (status: #{status || 'nil'})"
+            end
+            stub_meta = parse_stub(md_file)
+          end
+
+          # Check gzip exists
+          unless File.exist?(gz_path)
+            issues << "Gzip archive not found: #{gz_path}"
+          end
+
+          # Verify SHA256 if both stub and gzip exist
+          if issues.empty? && stub_meta[:content_hash]
+            actual_hash = Digest::SHA256.hexdigest(Zlib::GzipReader.open(gz_path, &:read))
+            unless actual_hash == stub_meta[:content_hash]
+              issues << "SHA256 mismatch. Stub: #{stub_meta[:content_hash]}, Actual: #{actual_hash}"
+            end
+          end
+
+          # Check archived subdirs exist
+          if stub_meta[:archive_ref] && File.directory?(arch_dir)
+            archived_children = Dir.children(arch_dir).reject { |c| c.end_with?('.md.gz') }
+            has_expected = []
+            has_expected << 'scripts' if stub_meta[:has_scripts]
+            has_expected << 'assets' if stub_meta[:has_assets]
+            has_expected << 'references' if stub_meta[:has_references]
+
+            has_expected.each do |subdir|
+              unless archived_children.include?(subdir)
+                issues << "Expected archived subdir '#{subdir}' not found in archive"
+              end
+            end
+          end
+
+          {
+            success: issues.empty?,
+            context_name: context_name,
+            session_id: session_id,
+            issues: issues,
+            stub_meta: stub_meta,
+            gzip_exists: File.exist?(gz_path),
+            archive_dir_exists: File.directory?(arch_dir)
+          }
+        end
+
+        private
+
+        # ---------------------------------------------------------------
+        # Path helpers
+        # ---------------------------------------------------------------
+
+        def context_dir_path(session_id, context_name)
+          dir = context_dir
+          File.join(dir, session_id, context_name)
+        end
+
+        def archive_dir_path(session_id, context_name)
+          base = storage_dir
+          File.join(base, @archive_dir_name, session_id, context_name)
+        end
+
+        def archive_ref_path(session_id, context_name)
+          File.join(@archive_dir_name, session_id, context_name)
+        end
+
+        def context_dir
+          if defined?(KairosMcp) && KairosMcp.respond_to?(:context_dir)
+            KairosMcp.context_dir
+          else
+            File.join(Dir.pwd, '.kairos', 'context')
+          end
+        end
+
+        def storage_dir
+          if defined?(KairosMcp) && KairosMcp.respond_to?(:storage_dir)
+            KairosMcp.storage_dir
+          elsif defined?(KairosMcp) && KairosMcp.respond_to?(:kairos_dir)
+            KairosMcp.kairos_dir
+          else
+            File.join(Dir.pwd, '.kairos')
+          end
+        end
+
+        # ---------------------------------------------------------------
+        # Lock
+        # ---------------------------------------------------------------
+
+        def with_lock(dir)
+          FileUtils.mkdir_p(dir)
+          lock_path = File.join(dir, LOCK_FILE)
+          File.open(lock_path, File::CREAT | File::RDWR) do |f|
+            unless f.flock(File::LOCK_EX | File::LOCK_NB)
+              raise "Context is locked by another operation. Try again later."
+            end
+            begin
+              yield
+            ensure
+              f.flock(File::LOCK_UN)
+            end
+          end
+        ensure
+          FileUtils.rm_f(File.join(dir, LOCK_FILE)) if File.exist?(File.join(dir, LOCK_FILE))
+        end
+
+        # ---------------------------------------------------------------
+        # Stub generation and parsing
+        # ---------------------------------------------------------------
+
+        def generate_stub(context_name:, summary:, content_hash:, original_size:,
+                          original_meta:, archive_ref:, has_scripts:, has_assets:, has_references:)
+          title = original_meta['title'] || context_name
+          tags = Array(original_meta['tags'])
+          description = original_meta['description'] || ''
+
+          frontmatter = {
+            'title' => title,
+            'tags' => tags,
+            'description' => description,
+            'status' => 'soft-archived',
+            'archived_at' => Time.now.utc.iso8601,
+            'archived_by' => 'dream_archive',
+            'archive_ref' => archive_ref,
+            'content_hash' => content_hash,
+            'original_size' => original_size,
+            'has_scripts' => has_scripts,
+            'has_assets' => has_assets,
+            'has_references' => has_references,
+            'summary' => summary
+          }
+
+          includes = []
+          includes << 'scripts/' if has_scripts
+          includes << 'assets/' if has_assets
+          includes << 'references/' if has_references
+
+          body_lines = []
+          body_lines << "# #{context_name} [ARCHIVED]"
+          body_lines << ""
+          body_lines << "This context has been soft-archived. Use `dream_recall` to restore full text."
+          body_lines << ""
+          body_lines << "**Tags**: #{tags.join(', ')}" unless tags.empty?
+          body_lines << "**Original size**: #{format_bytes(original_size)}"
+          body_lines << "**Includes**: #{includes.join(', ')}" unless includes.empty?
+
+          "---\n#{YAML.dump(frontmatter).sub(/\A---\n/, '')}---\n\n#{body_lines.join("\n")}\n"
+        end
+
+        def parse_stub(md_file)
+          content = File.read(md_file)
+          meta = extract_frontmatter(content)
+          {
+            content_hash: meta['content_hash'],
+            original_size: meta['original_size'],
+            archive_ref: meta['archive_ref'],
+            archived_at: meta['archived_at'],
+            has_scripts: meta['has_scripts'] == true,
+            has_assets: meta['has_assets'] == true,
+            has_references: meta['has_references'] == true,
+            summary: meta['summary'],
+            status: meta['status']
+          }
+        end
+
+        # ---------------------------------------------------------------
+        # Frontmatter helpers
+        # ---------------------------------------------------------------
+
+        def extract_frontmatter(content)
+          if content =~ /\A---\n(.*?)\n---/m
+            YAML.safe_load($1, permitted_classes: [Symbol]) || {}
+          else
+            {}
+          end
+        rescue StandardError
+          {}
+        end
+
+        def extract_status(content)
+          meta = extract_frontmatter(content)
+          meta['status'] || meta[:status]
+        end
+
+        def format_bytes(bytes)
+          if bytes >= 1_048_576
+            "#{(bytes / 1_048_576.0).round(1)} MB"
+          elsif bytes >= 1024
+            "#{(bytes / 1024.0).round(1)} KB"
+          else
+            "#{bytes} bytes"
+          end
+        end
+      end
+    end
+  end
+end
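The safety core of `archive_context!` and `recall_context!` is a reusable pattern: compress, re-read and SHA256-verify the archive before touching the original, then swap in the replacement via write-temp-then-rename. A standalone stdlib sketch (the `soft_archive!` helper is ours, not the gem's):

```ruby
require 'digest'
require 'zlib'

# Compress md_file to gz_path, verify the archive by round-tripping it,
# then atomically replace the original with stub_text. Returns the SHA256
# of the original content, as the Archiver records in its stub frontmatter.
def soft_archive!(md_file, gz_path, stub_text)
  original = File.read(md_file)
  hash = Digest::SHA256.hexdigest(original)

  Zlib::GzipWriter.open(gz_path) { |gz| gz.write(original) }

  # Verify before modifying anything: a failed check leaves the original intact.
  restored = Zlib::GzipReader.open(gz_path, &:read)
  raise 'gzip verification failed' unless Digest::SHA256.hexdigest(restored) == hash

  tmp = "#{md_file}.tmp"
  File.write(tmp, stub_text)
  File.rename(tmp, md_file) # POSIX rename is atomic on the same filesystem
  hash
end
```

Recall is the mirror image: read the gzip, verify it against the recorded hash, then rename the restored text over the stub.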