@c-d-cc/reap 0.3.4 → 0.4.0

This diff shows the changes between publicly released versions of this package, as published to a supported registry. It is provided for informational purposes only.
package/README.ja.md CHANGED
@@ -130,7 +130,7 @@ Objective → Planning → Implementation ⟷ Validation → Completion
  各項目は`status`フィールドも持ちます:
 
  - `status: pending` — 未処理項目(デフォルト)
- - `status: consumed` — 現在の世代で処理完了(`consumedBy: gen-XXX`必須)
+ - `status: consumed` — 現在の世代で処理完了(`consumedBy: gen-XXX-{hash}`必須)
 
  アーカイブ時点(`/reap.next` from Completion)で`consumed`項目はlineageに移動し、`pending`項目は次の世代のbacklogに繰り越されます。
 
@@ -287,7 +287,7 @@ my-project/
 
  | レベル | 入力 | 出力 | 最大行数 | トリガー |
  |--------|------|------|----------|----------|
- | **Level 1** | 世代フォルダ(5つの成果物) | `gen-XXX.md` | 40行 | lineage > 5,000行 + 5世代以上 |
+ | **Level 1** | 世代フォルダ(5つの成果物) | `gen-XXX-{hash}.md` | 40行 | lineage > 5,000行 + 5世代以上 |
  | **Level 2** | Level 1ファイル5つ | `epoch-XXX.md` | 60行 | Level 1が5つ以上 |
 
  圧縮は世代完了時に自動実行されます。直近3世代は常に圧縮から保護されます。圧縮されたファイルは目標(Objective)と結果(Completion)を中心に保存し、中間過程は特記事項のみを残します。
package/README.ko.md CHANGED
@@ -130,7 +130,7 @@ Application의 유전 정보 — 아키텍처 원칙, 비즈니스 규칙, 개
  각 항목은 `status` 필드도 가집니다:
 
  - `status: pending` — 미처리 항목 (기본값)
- - `status: consumed` — 현재 세대에서 처리 완료 (`consumedBy: gen-XXX` 필수)
+ - `status: consumed` — 현재 세대에서 처리 완료 (`consumedBy: gen-XXX-{hash}` 필수)
 
  아카이빙 시점(`/reap.next` from Completion)에 `consumed` 항목은 lineage로 이동하고, `pending` 항목은 다음 세대 backlog로 이월됩니다.
 
@@ -287,7 +287,7 @@ my-project/
 
  | 레벨 | 입력 | 출력 | 최대 줄 수 | 트리거 |
  |------|------|------|-----------|--------|
- | **Level 1** | 세대 폴더 (5개 산출물) | `gen-XXX.md` | 40줄 | lineage > 5,000줄 + 5세대 이상 |
+ | **Level 1** | 세대 폴더 (5개 산출물) | `gen-XXX-{hash}.md` | 40줄 | lineage > 5,000줄 + 5세대 이상 |
  | **Level 2** | Level 1 파일 5개 | `epoch-XXX.md` | 60줄 | Level 1이 5개 이상 |
 
  압축은 세대 완료 시 자동 실행됩니다. 가장 최근 3개 세대는 항상 압축에서 보호됩니다. 압축된 파일은 목표(Objective)와 결과(Completion)를 중심으로 보존하고, 중간 과정은 특이사항만 남깁니다.
package/README.md CHANGED
@@ -129,7 +129,7 @@ All items to be addressed next are stored in `.reap/life/backlog/`. Each item us
  Each item also carries a `status` field:
 
  - `status: pending` — Not yet processed (default)
- - `status: consumed` — Processed in the current generation (requires `consumedBy: gen-XXX`)
+ - `status: consumed` — Processed in the current generation (requires `consumedBy: gen-XXX-{hash}`)
 
  At archiving time (`/reap.next` from Completion), `consumed` items move to lineage while `pending` items are carried forward to the next generation's backlog.
 
@@ -286,7 +286,7 @@ As generations accumulate, the lineage directory grows. REAP manages this with a
 
  | Level | Input | Output | Max lines | Trigger |
  |-------|-------|--------|-----------|---------|
- | **Level 1** | Generation folder (5 artifacts) | `gen-XXX.md` | 40 lines | lineage > 5,000 lines + 5+ generations |
+ | **Level 1** | Generation folder (5 artifacts) | `gen-XXX-{hash}.md` | 40 lines | lineage > 5,000 lines + 5+ generations |
  | **Level 2** | 5 Level 1 files | `epoch-XXX.md` | 60 lines | 5+ Level 1 files |
 
  Compression runs automatically when a generation completes. The most recent 3 generations are always protected from compression. Compressed files preserve objectives and completion results while retaining only notable findings from intermediate stages.
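The new `gen-XXX-{hash}` identifier format referenced above can be sketched as follows. This is an illustrative reconstruction of the `formatGenId`, `parseGenSeq`, and `isLegacyId` helpers that appear in the bundled `dist/cli.js` later in this diff, not a public API of the package.

```javascript
// Reconstructed sketch of the bundled gen-ID helpers (not a public API).
function formatGenId(seq, hash) {
  // e.g. seq 7, hash "a1b2c3" -> "gen-007-a1b2c3"
  return `gen-${String(seq).padStart(3, "0")}-${hash}`;
}

function parseGenSeq(id) {
  // Extracts the 3-digit sequence number; returns 0 when the ID does not match.
  const match = id.match(/^gen-(\d{3})/);
  return match ? parseInt(match[1], 10) : 0;
}

function isLegacyId(id) {
  // Pre-0.4.0 IDs were plain gen-XXX with no hash suffix.
  return /^gen-\d{3}$/.test(id);
}
```

Legacy `gen-XXX` names still parse to the same sequence number, which is how old and new lineage entries can coexist.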
package/README.zh-CN.md CHANGED
@@ -130,7 +130,7 @@ Objective → Planning → Implementation ⟷ Validation → Completion
  每个项目还有`status`字段:
 
  - `status: pending` — 未处理项目(默认)
- - `status: consumed` — 在当前世代中已处理(需要`consumedBy: gen-XXX`)
+ - `status: consumed` — 在当前世代中已处理(需要`consumedBy: gen-XXX-{hash}`)
 
  归档时(`/reap.next` from Completion),`consumed`项目移至lineage,`pending`项目结转到下一个世代的backlog。
 
@@ -287,7 +287,7 @@ my-project/
 
  | 级别 | 输入 | 输出 | 最大行数 | 触发条件 |
  |------|------|------|----------|----------|
- | **Level 1** | 世代文件夹(5个产出物) | `gen-XXX.md` | 40行 | lineage > 5,000行 + 5个以上世代 |
+ | **Level 1** | 世代文件夹(5个产出物) | `gen-XXX-{hash}.md` | 40行 | lineage > 5,000行 + 5个以上世代 |
  | **Level 2** | 5个Level 1文件 | `epoch-XXX.md` | 60行 | Level 1达到5个以上 |
 
  压缩在世代完成时自动执行。最近3个世代始终受到保护,不会被压缩。压缩后的文件以目标(Objective)和结果(Completion)为中心保存,中间过程仅保留特别事项。
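The `dist/cli.js` diff below also adds a lineage-graph check to the compression pass (`findLeafNodes`): a generation is only compressed when it is neither among the recent protected generations nor a leaf of the parent graph. Assuming each generation's `meta` carries an `id` and a `parents` list, the leaf selection can be sketched as:

```javascript
// Sketch of the leaf-node selection added to the compression pass:
// an ID is a leaf when no other generation's `parents` list references it.
function findLeafIds(metas) {
  const referencedAsParent = new Set();
  for (const meta of metas) {
    for (const parent of meta.parents) referencedAsParent.add(parent);
  }
  return new Set(metas.map((m) => m.id).filter((id) => !referencedAsParent.has(id)));
}
```

Protecting leaves keeps the tips of every lineage branch uncompressed, not just the three most recent generations.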
package/dist/cli.js CHANGED
@@ -9497,8 +9497,8 @@ async function initProject(projectRoot, projectName, entryMode, preset, onProgre
  }
 
  // src/cli/commands/update.ts
- import { readdir as readdir4, unlink as unlink3, rm, mkdir as mkdir4 } from "fs/promises";
- import { join as join5 } from "path";
+ import { readdir as readdir7, unlink as unlink3, rm as rm2, mkdir as mkdir6 } from "fs/promises";
+ import { join as join8 } from "path";
 
  // src/core/hooks.ts
  async function migrateHooks(dryRun = false) {
@@ -9513,148 +9513,17 @@ async function migrateHooks(dryRun = false) {
  return { results };
  }
 
- // src/cli/commands/update.ts
- async function updateProject(projectRoot, dryRun = false) {
- const paths = new ReapPaths(projectRoot);
- if (!await paths.isReapProject()) {
- throw new Error("Not a REAP project. Run 'reap init' first.");
- }
- const result = { updated: [], skipped: [], removed: [] };
- const config = await ConfigManager.read(paths);
- const adapters = await AgentRegistry.getActiveAdapters(config ?? undefined);
- const commandsDir = ReapPaths.packageCommandsDir;
- const commandFiles = await readdir4(commandsDir);
- for (const adapter of adapters) {
- const agentCmdDir = adapter.getCommandsDir();
- const label = `${adapter.displayName}`;
- for (const file of commandFiles) {
- if (!file.endsWith(".md"))
- continue;
- const src = await readTextFileOrThrow(join5(commandsDir, file));
- const dest = join5(agentCmdDir, file);
- const existingContent = await readTextFile(dest);
- if (existingContent !== null && existingContent === src) {
- result.skipped.push(`[${label}] commands/${file}`);
- } else {
- if (!dryRun) {
- await mkdir4(agentCmdDir, { recursive: true });
- await writeTextFile(dest, src);
- }
- result.updated.push(`[${label}] commands/${file}`);
- }
- }
- const validCommandFiles = new Set(commandFiles);
- if (!dryRun) {
- await adapter.removeStaleCommands(validCommandFiles);
- }
- }
- await mkdir4(ReapPaths.userReapTemplates, { recursive: true });
- const artifactFiles = ["01-objective.md", "02-planning.md", "03-implementation.md", "04-validation.md", "05-completion.md"];
- for (const file of artifactFiles) {
- const src = await readTextFileOrThrow(join5(ReapPaths.packageArtifactsDir, file));
- const dest = join5(ReapPaths.userReapTemplates, file);
- const existingContent = await readTextFile(dest);
- if (existingContent !== null && existingContent === src) {
- result.skipped.push(`~/.reap/templates/${file}`);
- } else {
- if (!dryRun)
- await writeTextFile(dest, src);
- result.updated.push(`~/.reap/templates/${file}`);
- }
- }
- const domainGuideSrc = await readTextFileOrThrow(join5(ReapPaths.packageGenomeDir, "domain/README.md"));
- const domainGuideDest = join5(ReapPaths.userReapTemplates, "domain-guide.md");
- const domainExistingContent = await readTextFile(domainGuideDest);
- if (domainExistingContent !== null && domainExistingContent === domainGuideSrc) {
- result.skipped.push(`~/.reap/templates/domain-guide.md`);
- } else {
- if (!dryRun)
- await writeTextFile(domainGuideDest, domainGuideSrc);
- result.updated.push(`~/.reap/templates/domain-guide.md`);
- }
- const migrations = await migrateHooks(dryRun);
- for (const m of migrations.results) {
- if (m.action === "migrated") {
- result.updated.push(`[${m.agent}] hooks (migrated)`);
- }
- }
- for (const adapter of adapters) {
- const hookResult = await adapter.syncSessionHook(dryRun);
- if (hookResult.action === "updated") {
- result.updated.push(`[${adapter.displayName}] session hook`);
- } else {
- result.skipped.push(`[${adapter.displayName}] session hook`);
- }
- }
- await migrateLegacyFiles(paths, dryRun, result);
- return result;
- }
- async function migrateLegacyFiles(paths, dryRun, result) {
- await removeDirIfExists(paths.legacyCommands, ".reap/commands/", dryRun, result);
- await removeDirIfExists(paths.legacyTemplates, ".reap/templates/", dryRun, result);
- try {
- const claudeCmdDir = paths.legacyClaudeCommands;
- const files = await readdir4(claudeCmdDir);
- for (const file of files) {
- if (file.startsWith("reap.") && file.endsWith(".md")) {
- if (!dryRun)
- await unlink3(join5(claudeCmdDir, file));
- result.removed.push(`.claude/commands/${file}`);
- }
- }
- } catch {}
- try {
- const legacyHooksJson = paths.legacyClaudeHooksJson;
- const fileContent = await readTextFile(legacyHooksJson);
- if (fileContent !== null) {
- const content = JSON.parse(fileContent);
- const sessionStart = content["SessionStart"];
- if (Array.isArray(sessionStart)) {
- const filtered = sessionStart.filter((entry) => {
- if (typeof entry !== "object" || entry === null)
- return true;
- const hooks = entry["hooks"];
- if (!Array.isArray(hooks))
- return true;
- return !hooks.some((h) => {
- if (typeof h !== "object" || h === null)
- return false;
- const cmd = h["command"];
- return typeof cmd === "string" && cmd.includes(".reap/hooks/");
- });
- });
- if (filtered.length !== sessionStart.length) {
- if (!dryRun) {
- if (filtered.length === 0 && Object.keys(content).length === 1) {
- await unlink3(legacyHooksJson);
- result.removed.push(`.claude/hooks.json (legacy)`);
- } else {
- content["SessionStart"] = filtered;
- await writeTextFile(legacyHooksJson, JSON.stringify(content, null, 2) + `
- `);
- result.removed.push(`.claude/hooks.json (legacy REAP hook entry)`);
- }
- }
- }
- }
- }
- } catch {}
- }
- async function removeDirIfExists(dirPath, label, dryRun, result) {
- try {
- const entries = await readdir4(dirPath);
- if (entries.length > 0 || true) {
- if (!dryRun)
- await rm(dirPath, { recursive: true });
- result.removed.push(label);
- }
- } catch {}
- }
+ // src/core/migration.ts
+ var import_yaml4 = __toESM(require_dist(), 1);
+ import { readdir as readdir6, rename as rename2 } from "fs/promises";
+ import { join as join7 } from "path";
 
  // src/core/generation.ts
- var import_yaml2 = __toESM(require_dist(), 1);
- import { readdir as readdir6, mkdir as mkdir5, rename } from "fs/promises";
- import { join as join7 } from "path";
+ var import_yaml3 = __toESM(require_dist(), 1);
+ import { createHash } from "crypto";
+ import { hostname } from "os";
+ import { readdir as readdir5, mkdir as mkdir4, rename } from "fs/promises";
+ import { join as join6 } from "path";
 
  // src/types/index.ts
  var LIFECYCLE_ORDER = [
@@ -9708,14 +9577,35 @@ class LifeCycle {
  }
 
  // src/core/compression.ts
- import { readdir as readdir5, rm as rm2 } from "fs/promises";
- import { join as join6 } from "path";
+ var import_yaml2 = __toESM(require_dist(), 1);
+ import { readdir as readdir4, rm } from "fs/promises";
+ import { join as join5 } from "path";
  var LINEAGE_MAX_LINES = 5000;
  var MIN_GENERATIONS_FOR_COMPRESSION = 5;
  var LEVEL1_MAX_LINES = 40;
  var LEVEL2_MAX_LINES = 60;
  var LEVEL2_BATCH_SIZE = 5;
  var RECENT_PROTECTED_COUNT = 3;
+ function extractGenNum(name) {
+ const match = name.match(/^gen-(\d{3})/);
+ return match ? parseInt(match[1], 10) : 0;
+ }
+ function parseFrontmatter(content) {
+ const match = content.match(/^---\n([\s\S]*?)\n---/);
+ if (!match)
+ return null;
+ try {
+ return import_yaml2.default.parse(match[1]);
+ } catch {
+ return null;
+ }
+ }
+ function buildFrontmatter(meta) {
+ return `---
+ ${import_yaml2.default.stringify(meta).trim()}
+ ---
+ `;
+ }
  async function countLines(filePath) {
  const content = await readTextFile(filePath);
  if (content === null)
@@ -9726,9 +9616,9 @@ async function countLines(filePath) {
  async function countDirLines(dirPath) {
  let total = 0;
  try {
- const entries = await readdir5(dirPath, { withFileTypes: true });
+ const entries = await readdir4(dirPath, { withFileTypes: true });
  for (const entry of entries) {
- const fullPath = join6(dirPath, entry.name);
+ const fullPath = join5(dirPath, entry.name);
  if (entry.isFile() && entry.name.endsWith(".md")) {
  total += await countLines(fullPath);
  } else if (entry.isDirectory()) {
@@ -9738,33 +9628,113 @@ async function countDirLines(dirPath) {
  } catch {}
  return total;
  }
+ async function readDirMeta(dirPath) {
+ const content = await readTextFile(join5(dirPath, "meta.yml"));
+ if (content === null)
+ return null;
+ try {
+ return import_yaml2.default.parse(content);
+ } catch {
+ return null;
+ }
+ }
+ async function readFileMeta(filePath) {
+ const content = await readTextFile(filePath);
+ if (content === null)
+ return null;
+ return parseFrontmatter(content);
+ }
  async function scanLineage(paths) {
  const entries = [];
  try {
- const items = await readdir5(paths.lineage, { withFileTypes: true });
+ const items = await readdir4(paths.lineage, { withFileTypes: true });
  for (const item of items) {
- const fullPath = join6(paths.lineage, item.name);
+ const fullPath = join5(paths.lineage, item.name);
  if (item.isDirectory() && item.name.startsWith("gen-")) {
- const genNum = parseInt(item.name.replace("gen-", ""), 10);
+ const genNum = extractGenNum(item.name);
  const lines = await countDirLines(fullPath);
- entries.push({ name: item.name, type: "dir", lines, genNum });
+ const meta = await readDirMeta(fullPath);
+ const genId = meta?.id ?? item.name.match(/^gen-\d{3}(?:-[a-f0-9]{6})?/)?.[0] ?? item.name;
+ entries.push({
+ name: item.name,
+ type: "dir",
+ lines,
+ genNum,
+ completedAt: meta?.completedAt ?? "",
+ genId
+ });
  } else if (item.isFile() && item.name.startsWith("gen-") && item.name.endsWith(".md")) {
- const genNum = parseInt(item.name.replace("gen-", ""), 10);
+ const genNum = extractGenNum(item.name);
  const lines = await countLines(fullPath);
- entries.push({ name: item.name, type: "level1", lines, genNum });
+ const meta = await readFileMeta(fullPath);
+ const genId = meta?.id ?? item.name.replace(".md", "").match(/^gen-\d{3}(?:-[a-f0-9]{6})?/)?.[0] ?? item.name;
+ entries.push({
+ name: item.name,
+ type: "level1",
+ lines,
+ genNum,
+ completedAt: meta?.completedAt ?? "",
+ genId
+ });
  } else if (item.isFile() && item.name.startsWith("epoch-") && item.name.endsWith(".md")) {
  const lines = await countLines(fullPath);
- entries.push({ name: item.name, type: "level2", lines, genNum: 0 });
+ entries.push({
+ name: item.name,
+ type: "level2",
+ lines,
+ genNum: 0,
+ completedAt: "",
+ genId: ""
+ });
  }
  }
  } catch {}
- return entries.sort((a, b) => a.genNum - b.genNum);
+ return entries.sort((a, b) => {
+ if (a.completedAt && b.completedAt) {
+ return a.completedAt.localeCompare(b.completedAt);
+ }
+ return a.genNum - b.genNum;
+ });
+ }
+ async function findLeafNodes(paths, entries) {
+ const allIds = new Set;
+ const referencedAsParent = new Set;
+ for (const entry of entries) {
+ if (entry.type === "level2")
+ continue;
+ let meta = null;
+ const fullPath = join5(paths.lineage, entry.name);
+ if (entry.type === "dir") {
+ meta = await readDirMeta(fullPath);
+ } else {
+ meta = await readFileMeta(fullPath);
+ }
+ if (meta) {
+ allIds.add(meta.id);
+ for (const parent of meta.parents) {
+ referencedAsParent.add(parent);
+ }
+ } else {
+ allIds.add(entry.genId);
+ }
+ }
+ const leaves = new Set;
+ for (const id of allIds) {
+ if (!referencedAsParent.has(id)) {
+ leaves.add(id);
+ }
+ }
+ return leaves;
  }
  async function compressLevel1(genDir, genName) {
  const lines = [];
+ const meta = await readDirMeta(genDir);
+ if (meta) {
+ lines.push(buildFrontmatter(meta));
+ }
  let goal = "", completionConditions = "";
  {
- const objective = await readTextFile(join6(genDir, "01-objective.md"));
+ const objective = await readTextFile(join5(genDir, "01-objective.md"));
  if (objective) {
  const goalMatch = objective.match(/## Goal\n([\s\S]*?)(?=\n##)/);
  if (goalMatch)
@@ -9776,7 +9746,7 @@ async function compressLevel1(genDir, genName) {
  }
  let lessons = "", genomeChanges = "", nextBacklog = "";
  {
- const completion = await readTextFile(join6(genDir, "05-completion.md"));
+ const completion = await readTextFile(join5(genDir, "05-completion.md"));
  if (completion) {
  const lessonsMatch = completion.match(/### Lessons Learned\n([\s\S]*?)(?=\n###)/);
  if (lessonsMatch)
@@ -9789,18 +9759,18 @@ async function compressLevel1(genDir, genName) {
  nextBacklog = backlogMatch[1].trim();
  }
  }
- let metadata = "";
+ let summaryText = "";
  {
- const completion = await readTextFile(join6(genDir, "05-completion.md"));
+ const completion = await readTextFile(join5(genDir, "05-completion.md"));
  if (completion) {
  const summaryMatch = completion.match(/## Summary\n([\s\S]*?)(?=\n##)/);
  if (summaryMatch)
- metadata = summaryMatch[1].trim();
+ summaryText = summaryMatch[1].trim();
  }
  }
  let validationResult = "";
  {
- const validation = await readTextFile(join6(genDir, "04-validation.md"));
+ const validation = await readTextFile(join5(genDir, "04-validation.md"));
  if (validation) {
  const resultMatch = validation.match(/## Result: (.+)/);
  if (resultMatch)
@@ -9809,7 +9779,7 @@ async function compressLevel1(genDir, genName) {
  }
  let deferred = "";
  {
- const impl = await readTextFile(join6(genDir, "03-implementation.md"));
+ const impl = await readTextFile(join5(genDir, "03-implementation.md"));
  if (impl) {
  const deferredMatch = impl.match(/## Deferred Tasks\n([\s\S]*?)(?=\n##)/);
  if (deferredMatch) {
@@ -9820,10 +9790,10 @@ async function compressLevel1(genDir, genName) {
  }
  }
  }
- const genId = genName.match(/^gen-\d+/)?.[0] ?? genName;
+ const genId = genName.match(/^gen-\d{3}(?:-[a-f0-9]{6})?/)?.[0] ?? genName;
  lines.push(`# ${genId}`);
- if (metadata) {
- lines.push(metadata.replace(/^# .+\n/, "").trim());
+ if (summaryText) {
+ lines.push(summaryText.replace(/^# .+\n/, "").trim());
  }
  lines.push("");
  if (goal) {
@@ -9873,18 +9843,19 @@ async function compressLevel1(genDir, genName) {
  }
  async function compressLevel2(level1Files, epochNum) {
  const lines = [];
- const genIds = level1Files.map((f) => f.name.replace(".md", "").match(/^gen-\d+/)?.[0] ?? f.name);
+ const genIds = level1Files.map((f) => f.name.replace(".md", "").match(/^gen-\d{3}(?:-[a-f0-9]{6})?/)?.[0] ?? f.name);
  const first = genIds[0];
  const last = genIds[genIds.length - 1];
  lines.push(`# Epoch ${String(epochNum).padStart(3, "0")} (${first} ~ ${last})`);
  lines.push("");
  for (const file of level1Files) {
  const content = await readTextFileOrThrow(file.path);
- const headerMatch = content.match(/^# (gen-\d+)/m);
- const goalMatch = content.match(/- Goal: (.+)/);
- const periodMatch = content.match(/- (?:Started|Period): (.+)/);
- const genomeMatch = content.match(/- Genome.*: (.+)/);
- const resultMatch = content.match(/## Result: (.+)/);
+ const bodyContent = content.replace(/^---\n[\s\S]*?\n---\n?/, "");
+ const headerMatch = bodyContent.match(/^# (gen-\d{3}(?:-[a-f0-9]{6})?)/m);
+ const goalMatch = bodyContent.match(/- Goal: (.+)/);
+ const periodMatch = bodyContent.match(/- (?:Started|Period): (.+)/);
+ const genomeMatch = bodyContent.match(/- Genome.*: (.+)/);
+ const resultMatch = bodyContent.match(/## Result: (.+)/);
  const genId = headerMatch?.[1] ?? "unknown";
  const goal = goalMatch?.[1] ?? "";
  const result2 = resultMatch?.[1] ?? "";
@@ -9895,7 +9866,7 @@ async function compressLevel2(level1Files, epochNum) {
  lines.push(`- ${genomeMatch[0].trim()}`);
  if (result2)
  lines.push(`- Result: ${result2}`);
- const changeSection = content.match(/## Genome Changes\n([\s\S]*?)(?=\n##|$)/);
+ const changeSection = bodyContent.match(/## Genome Changes\n([\s\S]*?)(?=\n##|$)/);
  if (changeSection && !changeSection[1].match(/^\|\s*\|\s*\|\s*\|\s*\|$/)) {
  lines.push(`- Genome Changes: ${changeSection[1].trim().split(`
  `)[0]}`);
@@ -9924,21 +9895,23 @@ async function compressLineageIfNeeded(paths) {
  if (totalLines <= LINEAGE_MAX_LINES) {
  return result;
  }
- const allDirs = entries.filter((e) => e.type === "dir").sort((a, b) => a.genNum - b.genNum);
- const dirs = allDirs.slice(0, Math.max(0, allDirs.length - RECENT_PROTECTED_COUNT));
- for (const dir of dirs) {
+ const leafNodes = await findLeafNodes(paths, entries);
+ const allDirs = entries.filter((e) => e.type === "dir");
+ const recentIds = new Set(allDirs.slice(Math.max(0, allDirs.length - RECENT_PROTECTED_COUNT)).map((e) => e.genId));
+ const compressibleDirs = allDirs.filter((dir) => !recentIds.has(dir.genId) && !leafNodes.has(dir.genId));
+ for (const dir of compressibleDirs) {
  const currentTotal = await countDirLines(paths.lineage);
  if (currentTotal <= LINEAGE_MAX_LINES)
  break;
- const dirPath = join6(paths.lineage, dir.name);
+ const dirPath = join5(paths.lineage, dir.name);
  const compressed = await compressLevel1(dirPath, dir.name);
- const genId = dir.name.match(/^gen-\d+/)?.[0] ?? dir.name;
- const outPath = join6(paths.lineage, `${genId}.md`);
+ const genId = dir.name.match(/^gen-\d{3}(?:-[a-f0-9]{6})?/)?.[0] ?? dir.name;
+ const outPath = join5(paths.lineage, `${genId}.md`);
  await writeTextFile(outPath, compressed);
- await rm2(dirPath, { recursive: true });
+ await rm(dirPath, { recursive: true });
  result.level1.push(genId);
  }
- const level1s = (await scanLineage(paths)).filter((e) => e.type === "level1").sort((a, b) => a.genNum - b.genNum);
+ const level1s = (await scanLineage(paths)).filter((e) => e.type === "level1");
  if (level1s.length >= LEVEL2_BATCH_SIZE) {
  const existingEpochs = (await scanLineage(paths)).filter((e) => e.type === "level2");
  let epochNum = existingEpochs.length + 1;
@@ -9947,13 +9920,13 @@ async function compressLineageIfNeeded(paths) {
  const batch = level1s.slice(i * LEVEL2_BATCH_SIZE, (i + 1) * LEVEL2_BATCH_SIZE);
  const files = batch.map((e) => ({
  name: e.name,
- path: join6(paths.lineage, e.name)
+ path: join5(paths.lineage, e.name)
  }));
  const compressed = await compressLevel2(files, epochNum);
- const outPath = join6(paths.lineage, `epoch-${String(epochNum).padStart(3, "0")}.md`);
+ const outPath = join5(paths.lineage, `epoch-${String(epochNum).padStart(3, "0")}.md`);
  await writeTextFile(outPath, compressed);
  for (const file of files) {
- await rm2(file.path);
+ await rm(file.path);
  }
  result.level2.push(`epoch-${String(epochNum).padStart(3, "0")}`);
  epochNum++;
@@ -9963,6 +9936,46 @@ async function compressLineageIfNeeded(paths) {
  }
 
  // src/core/generation.ts
+ function generateGenHash(parents, goal, genomeHash, machineId, startedAt) {
+ const input = JSON.stringify({ parents, goal, genomeHash, machineId, startedAt });
+ return createHash("sha256").update(input).digest("hex").slice(0, 6);
+ }
+ function getMachineId() {
+ return hostname();
+ }
+ async function computeGenomeHash(genomePath) {
+ const hash = createHash("sha256");
+ try {
+ const entries = (await readdir5(genomePath, { recursive: true, withFileTypes: true })).filter((e) => e.isFile()).sort((a, b) => {
+ const pathA = join6(e2path(a), a.name);
+ const pathB = join6(e2path(b), b.name);
+ return pathA.localeCompare(pathB);
+ });
+ for (const entry of entries) {
+ const filePath = join6(e2path(entry), entry.name);
+ const content = await readTextFile(filePath);
+ if (content !== null) {
+ hash.update(filePath.replace(genomePath, ""));
+ hash.update(content);
+ }
+ }
+ } catch {}
+ return hash.digest("hex").slice(0, 8);
+ }
+ function e2path(entry) {
+ return entry.parentPath ?? entry.path ?? "";
+ }
+ function formatGenId(seq, hash) {
+ return `gen-${String(seq).padStart(3, "0")}-${hash}`;
+ }
+ function parseGenSeq(id) {
+ const match = id.match(/^gen-(\d{3})/);
+ return match ? parseInt(match[1], 10) : 0;
+ }
+ function isLegacyId(id) {
+ return /^gen-\d{3}$/.test(id);
+ }
+
  class GenerationManager {
  paths;
  constructor(paths) {
@@ -9972,20 +9985,33 @@ class GenerationManager {
  const content = await readTextFile(this.paths.currentYml);
  if (content === null || !content.trim())
  return null;
- return import_yaml2.default.parse(content);
+ const state = import_yaml3.default.parse(content);
+ if (!state.type)
+ state.type = "normal";
+ if (!state.parents)
+ state.parents = [];
+ return state;
  }
  async create(goal, genomeVersion) {
- const id = await this.nextGenId();
+ const seq = await this.nextSeq();
  const now = new Date().toISOString();
+ const genomeHash = await computeGenomeHash(this.paths.genome);
+ const machineId = getMachineId();
+ const parents = await this.resolveParents();
+ const hash = generateGenHash(parents, goal, genomeHash, machineId, now);
+ const id = formatGenId(seq, hash);
  const state = {
  id,
  goal,
  stage: "objective",
  genomeVersion,
  startedAt: now,
- timeline: [{ stage: "objective", at: now }]
+ timeline: [{ stage: "objective", at: now }],
+ type: "normal",
+ parents,
+ genomeHash
  };
- await writeTextFile(this.paths.currentYml, import_yaml2.default.stringify(state));
+ await writeTextFile(this.paths.currentYml, import_yaml3.default.stringify(state));
  return state;
  }
  async advance() {
@@ -9999,7 +10025,7 @@ class GenerationManager {
  if (!state.timeline)
  state.timeline = [];
  state.timeline.push({ stage: next, at: new Date().toISOString() });
- await writeTextFile(this.paths.currentYml, import_yaml2.default.stringify(state));
+ await writeTextFile(this.paths.currentYml, import_yaml3.default.stringify(state));
  return state;
  }
  async complete() {
@@ -10008,31 +10034,43 @@ class GenerationManager {
  throw new Error("No active generation");
  if (state.stage !== "completion")
  throw new Error("Generation must be in completion stage to complete");
+ const now = new Date().toISOString();
+ state.completedAt = now;
  const goalSlug = state.goal.toLowerCase().replace(/[^a-z0-9가-힣]+/g, "-").replace(/^-|-$/g, "").slice(0, 30);
  const genDirName = `${state.id}-${goalSlug}`;
  const genDir = this.paths.generationDir(genDirName);
- await mkdir5(genDir, { recursive: true });
- const lifeEntries = await readdir6(this.paths.life);
+ await mkdir4(genDir, { recursive: true });
+ const meta = {
+ id: state.id,
+ type: state.type ?? "normal",
+ parents: state.parents ?? [],
+ goal: state.goal,
+ genomeHash: state.genomeHash ?? "unknown",
+ startedAt: state.startedAt,
+ completedAt: now
+ };
+ await writeTextFile(join6(genDir, "meta.yml"), import_yaml3.default.stringify(meta));
+ const lifeEntries = await readdir5(this.paths.life);
  for (const entry of lifeEntries) {
  if (/^\d{2}-[a-z]+(?:-[a-z]+)*\.md$/.test(entry)) {
- await rename(join7(this.paths.life, entry), join7(genDir, entry));
+ await rename(join6(this.paths.life, entry), join6(genDir, entry));
  }
  }
- const backlogDir = join7(genDir, "backlog");
- await mkdir5(backlogDir, { recursive: true });
+ const backlogDir = join6(genDir, "backlog");
+ await mkdir4(backlogDir, { recursive: true });
  try {
- const backlogEntries = await readdir6(this.paths.backlog);
+ const backlogEntries = await readdir5(this.paths.backlog);
  for (const entry of backlogEntries) {
- await rename(join7(this.paths.backlog, entry), join7(backlogDir, entry));
+ await rename(join6(this.paths.backlog, entry), join6(backlogDir, entry));
  }
  } catch {}
  try {
- const mutEntries = await readdir6(this.paths.mutations);
+ const mutEntries = await readdir5(this.paths.mutations);
  if (mutEntries.length > 0) {
- const mutDir = join7(genDir, "mutations");
- await mkdir5(mutDir, { recursive: true });
+ const mutDir = join6(genDir, "mutations");
+ await mkdir4(mutDir, { recursive: true });
  for (const entry of mutEntries) {
- await rename(join7(this.paths.mutations, entry), join7(mutDir, entry));
+ await rename(join6(this.paths.mutations, entry), join6(mutDir, entry));
  }
  }
  } catch {}
@@ -10041,32 +10079,333 @@ class GenerationManager {
  return compression;
  }
  async save(state) {
- await writeTextFile(this.paths.currentYml, import_yaml2.default.stringify(state));
+ await writeTextFile(this.paths.currentYml, import_yaml3.default.stringify(state));
  }
  async listCompleted() {
  try {
- const entries = await readdir6(this.paths.lineage);
+ const entries = await readdir5(this.paths.lineage);
  return entries.filter((e) => e.startsWith("gen-")).sort();
  } catch {
  return [];
  }
  }
- async nextGenId() {
+ async readMeta(lineageDirName) {
+ const metaPath = join6(this.paths.lineage, lineageDirName, "meta.yml");
+ const content = await readTextFile(metaPath);
+ if (content === null)
+ return null;
+ return import_yaml3.default.parse(content);
+ }
+ async listMeta() {
+ const metas = [];
+ try {
+ const entries = await readdir5(this.paths.lineage, { withFileTypes: true });
+ for (const entry of entries) {
+ if (entry.isDirectory() && entry.name.startsWith("gen-")) {
+ const meta = await this.readMeta(entry.name);
+ if (meta)
+ metas.push(meta);
+ } else if (entry.isFile() && entry.name.startsWith("gen-") && entry.name.endsWith(".md")) {
+ const content = await readTextFile(join6(this.paths.lineage, entry.name));
+ if (content) {
+ const meta = parseFrontmatter(content);
+ if (meta)
+ metas.push(meta);
+ }
+ }
+ }
+ } catch {}
+ return metas;
+ }
+ async resolveParents() {
+ const metas = await this.listMeta();
+ if (metas.length > 0) {
+ const sorted = metas.sort((a, b) => new Date(b.completedAt).getTime() - new Date(a.completedAt).getTime());
+ return [sorted[0].id];
+ }
+ const dirs = await this.listCompleted();
+ if (dirs.length > 0) {
+ const lastDir = dirs[dirs.length - 1];
+ const legacyId = lastDir.match(/^(gen-\d{3}(?:-[a-f0-9]{6})?)/)?.[1];
+ if (legacyId)
+ return [legacyId];
+ }
+ return [];
+ }
+ async nextSeq() {
  const genDirs = await this.listCompleted();
  if (genDirs.length === 0) {
  const current = await this.current();
  if (current) {
- const num2 = parseInt(current.id.replace("gen-", ""), 10);
- return `gen-${String(num2 + 1).padStart(3, "0")}`;
+ return parseGenSeq(current.id) + 1;
  }
- return "gen-001";
+ return 1;
+ }
+ let maxSeq = 0;
+ for (const dir of genDirs) {
+ const seq = parseGenSeq(dir);
+ if (seq > maxSeq)
+ maxSeq = seq;
  }
- const last = genDirs[genDirs.length - 1];
- const num = parseInt(last.replace("gen-", ""), 10);
- return `gen-${String(num + 1).padStart(3, "0")}`;
+ return maxSeq + 1;
+ }
+ async nextGenId() {
+ const seq = await this.nextSeq();
+ return `gen-${String(seq).padStart(3, "0")}`;
  }
  }
 
+ // src/core/migration.ts
+ async function needsMigration(paths) {
+ try {
+ const entries = await readdir6(paths.lineage, { withFileTypes: true });
+ for (const entry of entries) {
+ if (!entry.isDirectory() || !entry.name.startsWith("gen-"))
+ continue;
+ const metaPath = join7(paths.lineage, entry.name, "meta.yml");
+ const content = await readTextFile(metaPath);
+ if (content === null)
+ return true;
+ }
+ } catch {
+ return false;
+ }
+ return false;
+ }
+ async function migrateLineage(paths) {
+ const result = { migrated: [], skipped: [], errors: [] };
+ let entries;
+ try {
+ const dirEntries = await readdir6(paths.lineage, { withFileTypes: true });
+ entries = dirEntries.filter((e) => e.isDirectory() && e.name.startsWith("gen-")).map((e) => e.name).sort();
+ } catch {
+ return result;
+ }
+ const plan = [];
+ for (const dirName of entries) {
+ const metaPath = join7(paths.lineage, dirName, "meta.yml");
+ const metaContent = await readTextFile(metaPath);
+ const seq = parseGenSeq(dirName);
+ let goal = "";
+ const objContent = await readTextFile(join7(paths.lineage, dirName, "01-objective.md"));
+ if (objContent) {
+ const goalMatch = objContent.match(/## Goal\n+([\s\S]*?)(?=\n##)/);
+ if (goalMatch)
+ goal = goalMatch[1].trim();
+ }
+ if (!goal) {
+ const slugMatch = dirName.match(/^gen-\d{3}(?:-[a-f0-9]{6})?-(.+)$/);
+ goal = slugMatch ? slugMatch[1].replace(/-/g, " ") : `Generation ${seq}`;
+ }
+ plan.push({ dirName, seq, goal, hasMeta: metaContent !== null });
+ }
+ let prevId = null;
+ for (const entry of plan) {
+ if (entry.hasMeta) {
+ const metaContent = await readTextFile(join7(paths.lineage, entry.dirName, "meta.yml"));
+ if (metaContent) {
+ const meta = import_yaml4.default.parse(metaContent);
+ prevId = meta.id;
+ }
+ result.skipped.push(entry.dirName);
+ continue;
+ }
+ try {
+ const parents = prevId ? [prevId] : [];
+ const hash = generateGenHash(parents, entry.goal, "legacy", "migration", `legacy-${entry.seq}`);
+ const newId = formatGenId(entry.seq, hash);
+ const meta = {
+ id: newId,
+ type: "normal",
+ parents,
+ goal: entry.goal,
+ genomeHash: "legacy",
+ startedAt: `legacy-${entry.seq}`,
+ completedAt: `legacy-${entry.seq}`
+ };
+ await writeTextFile(join7(paths.lineage, entry.dirName, "meta.yml"), import_yaml4.default.stringify(meta));
+ const oldSlug = entry.dirName.replace(/^gen-\d{3}/, "");
+ const newDirName = `${newId}${oldSlug}`;
+ if (newDirName !== entry.dirName) {
+ await rename2(join7(paths.lineage, entry.dirName), join7(paths.lineage, newDirName));
+ }
+ prevId = newId;
+ result.migrated.push(`${entry.dirName} → ${newDirName}`);
+ } catch (err) {
+ result.errors.push(`${entry.dirName}: ${err instanceof Error ? err.message : String(err)}`);
+ }
+ }
+ try {
+ const currentContent = await readTextFile(paths.currentYml);
+ if (currentContent && currentContent.trim()) {
+ const state = import_yaml4.default.parse(currentContent);
+ if (isLegacyId(state.id)) {
+ const parents = prevId ? [prevId] : [];
+ const genomeHash = "legacy";
+ const hash = generateGenHash(parents, state.goal, genomeHash, "migration", state.startedAt);
+ state.id = formatGenId(parseGenSeq(state.id), hash);
+ state.type = state.type ?? "normal";
+ state.parents = parents;
+ state.genomeHash = genomeHash;
+ await writeTextFile(paths.currentYml, import_yaml4.default.stringify(state));
+ result.migrated.push(`current.yml: ${state.id}`);
+ }
+ }
+ } catch {}
+ return result;
+ }
+
+ // src/cli/commands/update.ts
+ async function updateProject(projectRoot, dryRun = false) {
+ const paths = new ReapPaths(projectRoot);
+ if (!await paths.isReapProject()) {
+ throw new Error("Not a REAP project. Run 'reap init' first.");
+ }
+ const result = { updated: [], skipped: [], removed: [] };
+ const config = await ConfigManager.read(paths);
+ const adapters = await AgentRegistry.getActiveAdapters(config ?? undefined);
+ const commandsDir = ReapPaths.packageCommandsDir;
+ const commandFiles = await readdir7(commandsDir);
+ for (const adapter of adapters) {
+ const agentCmdDir = adapter.getCommandsDir();
+ const label = `${adapter.displayName}`;
+ for (const file of commandFiles) {
+ if (!file.endsWith(".md"))
+ continue;
+ const src = await readTextFileOrThrow(join8(commandsDir, file));
+ const dest = join8(agentCmdDir, file);
+ const existingContent = await readTextFile(dest);
+ if (existingContent !== null && existingContent === src) {
+ result.skipped.push(`[${label}] commands/${file}`);
+ } else {
+ if (!dryRun) {
+ await mkdir6(agentCmdDir, { recursive: true });
+ await writeTextFile(dest, src);
+ }
+ result.updated.push(`[${label}] commands/${file}`);
+ }
+ }
+ const validCommandFiles = new Set(commandFiles);
+ if (!dryRun) {
+ await adapter.removeStaleCommands(validCommandFiles);
+ }
+ }
+ await mkdir6(ReapPaths.userReapTemplates, { recursive: true });
+ const artifactFiles = ["01-objective.md", "02-planning.md", "03-implementation.md", "04-validation.md", "05-completion.md"];
+ for (const file of artifactFiles) {
+ const src = await readTextFileOrThrow(join8(ReapPaths.packageArtifactsDir, file));
+ const dest = join8(ReapPaths.userReapTemplates, file);
+ const existingContent = await readTextFile(dest);
+ if (existingContent !== null && existingContent === src) {
+ result.skipped.push(`~/.reap/templates/${file}`);
+ } else {
+ if (!dryRun)
+ await writeTextFile(dest, src);
+ result.updated.push(`~/.reap/templates/${file}`);
+ }
+ }
+ const domainGuideSrc = await readTextFileOrThrow(join8(ReapPaths.packageGenomeDir, "domain/README.md"));
+ const domainGuideDest = join8(ReapPaths.userReapTemplates, "domain-guide.md");
+ const domainExistingContent = await readTextFile(domainGuideDest);
+ if (domainExistingContent !== null && domainExistingContent === domainGuideSrc) {
+ result.skipped.push(`~/.reap/templates/domain-guide.md`);
+ } else {
+ if (!dryRun)
+ await writeTextFile(domainGuideDest, domainGuideSrc);
+ result.updated.push(`~/.reap/templates/domain-guide.md`);
+ }
+ const migrations = await migrateHooks(dryRun);
+ for (const m of migrations.results) {
+ if (m.action === "migrated") {
+ result.updated.push(`[${m.agent}] hooks (migrated)`);
+ }
+ }
+ for (const adapter of adapters) {
+ const hookResult = await adapter.syncSessionHook(dryRun);
+ if (hookResult.action === "updated") {
+ result.updated.push(`[${adapter.displayName}] session hook`);
+ } else {
+ result.skipped.push(`[${adapter.displayName}] session hook`);
+ }
+ }
+ await migrateLegacyFiles(paths, dryRun, result);
+ if (await needsMigration(paths)) {
+ if (!dryRun) {
+ const migrationResult = await migrateLineage(paths);
+ for (const m of migrationResult.migrated) {
+ result.updated.push(`[lineage] ${m}`);
+ }
+ for (const e of migrationResult.errors) {
+ result.removed.push(`[lineage error] ${e}`);
+ }
+ } else {
+ result.updated.push("[lineage] DAG migration pending (dry-run)");
+ }
+ }
+ return result;
+ }
+ async function migrateLegacyFiles(paths, dryRun, result) {
+ await removeDirIfExists(paths.legacyCommands, ".reap/commands/", dryRun, result);
+ await removeDirIfExists(paths.legacyTemplates, ".reap/templates/", dryRun, result);
+ try {
+ const claudeCmdDir = paths.legacyClaudeCommands;
+ const files = await readdir7(claudeCmdDir);
+ for (const file of files) {
+ if (file.startsWith("reap.") && file.endsWith(".md")) {
+ if (!dryRun)
+ await unlink3(join8(claudeCmdDir, file));
+ result.removed.push(`.claude/commands/${file}`);
+ }
+ }
+ } catch {}
+ try {
+ const legacyHooksJson = paths.legacyClaudeHooksJson;
+ const fileContent = await readTextFile(legacyHooksJson);
+ if (fileContent !== null) {
+ const content = JSON.parse(fileContent);
+ const sessionStart = content["SessionStart"];
+ if (Array.isArray(sessionStart)) {
+ const filtered = sessionStart.filter((entry) => {
+ if (typeof entry !== "object" || entry === null)
+ return true;
+ const hooks = entry["hooks"];
+ if (!Array.isArray(hooks))
+ return true;
+ return !hooks.some((h) => {
+ if (typeof h !== "object" || h === null)
+ return false;
+ const cmd = h["command"];
+ return typeof cmd === "string" && cmd.includes(".reap/hooks/");
+ });
+ });
+ if (filtered.length !== sessionStart.length) {
+ if (!dryRun) {
+ if (filtered.length === 0 && Object.keys(content).length === 1) {
+ await unlink3(legacyHooksJson);
+ result.removed.push(`.claude/hooks.json (legacy)`);
+ } else {
+ content["SessionStart"] = filtered;
+ await writeTextFile(legacyHooksJson, JSON.stringify(content, null, 2) + `
+ `);
+ result.removed.push(`.claude/hooks.json (legacy REAP hook entry)`);
+ }
+ }
+ }
+ }
+ }
+ } catch {}
+ }
+ async function removeDirIfExists(dirPath, label, dryRun, result) {
+ try {
+ const entries = await readdir7(dirPath);
+ if (entries.length > 0 || true) {
+ if (!dryRun)
+ await rm2(dirPath, { recursive: true });
+ result.removed.push(label);
+ }
+ } catch {}
+ }
+
  // src/cli/commands/status.ts
  async function getStatus(projectRoot) {
  const paths = new ReapPaths(projectRoot);
@@ -10082,15 +10421,18 @@ async function getStatus(projectRoot) {
  goal: current.goal,
  stage: current.stage,
  genomeVersion: current.genomeVersion,
- startedAt: current.startedAt
+ startedAt: current.startedAt,
+ type: current.type,
+ parents: current.parents,
+ genomeHash: current.genomeHash
  } : null,
  totalGenerations: completedGens.length
  };
  }
 
  // src/cli/commands/fix.ts
- var import_yaml3 = __toESM(require_dist(), 1);
- import { mkdir as mkdir6, stat as stat2 } from "fs/promises";
+ var import_yaml5 = __toESM(require_dist(), 1);
+ import { mkdir as mkdir7, stat as stat2 } from "fs/promises";
  async function dirExists(path) {
  try {
  const s = await stat2(path);
@@ -10113,7 +10455,7 @@ async function fixProject(projectRoot) {
  ];
  for (const dir of requiredDirs) {
  if (!await dirExists(dir.path)) {
- await mkdir6(dir.path, { recursive: true });
+ await mkdir7(dir.path, { recursive: true });
  fixed.push(`Recreated missing directory: ${dir.name}/`);
  }
  }
@@ -10124,7 +10466,7 @@ async function fixProject(projectRoot) {
  if (currentContent !== null) {
  if (currentContent.trim()) {
  try {
- const state = import_yaml3.default.parse(currentContent);
+ const state = import_yaml5.default.parse(currentContent);
  if (!state.stage || !LifeCycle.isValid(state.stage)) {
  issues.push(`Invalid stage "${state.stage}" in current.yml. Valid stages: ${LifeCycle.stages().join(", ")}. Manual correction required.`);
  }
@@ -10133,7 +10475,7 @@ async function fixProject(projectRoot) {
  if (!state.goal)
  issues.push("current.yml is missing 'goal' field. Manual correction required.");
  if (!await dirExists(paths.backlog)) {
- await mkdir6(paths.backlog, { recursive: true });
+ await mkdir7(paths.backlog, { recursive: true });
  fixed.push("Recreated missing backlog/ directory for active generation");
  }
  } catch {
@@ -10146,8 +10488,8 @@ async function fixProject(projectRoot) {
  }
 
  // src/cli/index.ts
- import { join as join8 } from "path";
- program.name("reap").description("REAP — Recursive Evolutionary Autonomous Pipeline").version("0.3.4");
+ import { join as join9 } from "path";
+ program.name("reap").description("REAP — Recursive Evolutionary Autonomous Pipeline").version("0.4.0");
  program.command("init").description("Initialize a new REAP project (Genesis)").argument("[project-name]", "Project name (defaults to current directory name)").option("-m, --mode <mode>", "Entry mode: greenfield, migration, adoption", "greenfield").option("-p, --preset <preset>", "Bootstrap with a genome preset (e.g., bun-hono-react)").action(async (projectName, options) => {
  try {
  const cwd = process.cwd();
@@ -10269,10 +10611,10 @@ program.command("help").description("Show REAP commands, slash commands, and wor
  if (l === "korean" || l === "ko")
  lang = "ko";
  }
- const helpDir = join8(ReapPaths.packageTemplatesDir, "help");
- let helpText = await readTextFile(join8(helpDir, `${lang}.txt`));
+ const helpDir = join9(ReapPaths.packageTemplatesDir, "help");
+ let helpText = await readTextFile(join9(helpDir, `${lang}.txt`));
  if (!helpText)
- helpText = await readTextFile(join8(helpDir, "en.txt"));
+ helpText = await readTextFile(join9(helpDir, "en.txt"));
  if (!helpText) {
  console.log("Help file not found. Run 'reap update' to install templates.");
  return;
@@ -69,7 +69,7 @@ Do NOT finalize Genome changes without running Validation Commands.
  15. **Human confirmation**:
  - **If called from `/reap.evolve`** (Autonomous Override active): Apply genome changes automatically after Validation Commands pass. Do NOT pause for human confirmation.
  - **If called standalone**: Show the modified genome/environment content to the human and get approval. Do NOT finalize changes until the human approves.
- 16. For each applied `type: genome-change` and `type: environment-change` backlog item, update its frontmatter to `status: consumed` and add `consumedBy: gen-XXX`
+ 16. For each applied `type: genome-change` and `type: environment-change` backlog item, update its frontmatter to `status: consumed` and add `consumedBy: gen-XXX-{hash}`
 
  ### Phase 5: Hook Suggestion
 
@@ -15,16 +15,16 @@ description: "REAP Start — Start a new Generation"
  - If backlog items exist:
  - Present the list with title and priority for each item
  - Ask: "Would you like to select one of these, or enter a new goal?"
- - If the human selects a backlog item: use its title/content as the goal, then update the selected item's frontmatter to `status: consumed` and add `consumedBy: gen-XXX`
+ - If the human selects a backlog item: use its title/content as the goal, then update the selected item's frontmatter to `status: consumed` and add `consumedBy: gen-XXX-{hash}`
  - If the human wants a new goal: proceed to Step 1
  - If no backlog items exist: proceed to Step 1
 
  1. Ask the human for the goal of this generation
  2. Count existing generations in `.reap/lineage/` to determine the genomeVersion
- 3. Generate the next generation ID (existing count + 1, in `gen-XXX` format)
+ 3. Generate the next generation ID (existing count + 1, in `gen-XXX-{hash}` format where `{hash}` is a short content hash)
  4. Write the following to `current.yml`:
  ```yaml
- id: gen-XXX
+ id: gen-XXX-{hash}
  goal: [goal provided by the human]
  stage: objective
  genomeVersion: [generation count + 1]
@@ -49,4 +49,4 @@ description: "REAP Start — Start a new Generation"
  - `.sh`: run as shell script in the project root directory
 
  ## Completion
- - "Generation gen-XXX started. Proceed with `/reap.objective` to define the goal, or `/reap.evolve` to run the full lifecycle."
+ - "Generation gen-XXX-{hash} started. Proceed with `/reap.objective` to define the goal, or `/reap.evolve` to run the full lifecycle."
@@ -66,7 +66,7 @@ All items to be carried forward to the next generation are stored in `.reap/life
 
  Each item also carries a `status` field:
  - `status: pending` — Not yet processed (default; absent field treated as pending)
- - `status: consumed` — Processed in the current generation (requires `consumedBy: gen-XXX`)
+ - `status: consumed` — Processed in the current generation (requires `consumedBy: gen-XXX-{hash}`)
 
  Marking rules:
  - `/reap.start`: backlog items chosen as the generation's goal → mark `consumed`
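For concreteness, a backlog item consumed under the new ID scheme would carry frontmatter like the following sketch — the title, priority, and hash value are illustrative, not taken from the package:

```yaml
---
title: Add rate limiting to API
priority: high
type: genome-change
status: consumed
consumedBy: gen-007-a3f2c1
---
```

The `{hash}` suffix ties the consumption record to one specific generation even after legacy `gen-XXX` directories are renamed by the lineage migration.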
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@c-d-cc/reap",
- "version": "0.3.4",
+ "version": "0.4.0",
  "description": "Recursive Evolutionary Autonomous Pipeline — AI and humans evolve software across generations",
  "type": "module",
  "license": "MIT",
@@ -23,7 +23,8 @@
  "reap": "dist/cli.js"
  },
  "files": [
- "dist/"
+ "dist/",
+ "scripts/postinstall.cjs"
  ],
  "engines": {
  "node": ">=18"
@@ -31,6 +32,7 @@
  "scripts": {
  "dev": "bun run src/cli/index.ts",
  "build": "node scripts/build.js",
+ "postinstall": "node scripts/postinstall.cjs",
  "prepublishOnly": "npm run build",
  "test": "bun test"
  },
@@ -0,0 +1,56 @@
+ #!/usr/bin/env node
+ /**
+ * postinstall — install REAP slash commands to detected AI agents.
+ * Runs after `npm install -g @c-d-cc/reap`.
+ * Graceful: never fails npm install (always exits 0).
+ */
+ const { execSync } = require("child_process");
+ const { readdirSync, readFileSync, writeFileSync, mkdirSync, existsSync } = require("fs");
+ const { join, dirname } = require("path");
+ const { homedir } = require("os");
+
+ const AGENTS = [
+ { name: "Claude Code", bin: "claude", commandsDir: join(homedir(), ".claude", "commands") },
+ { name: "OpenCode", bin: "opencode", commandsDir: join(homedir(), ".config", "opencode", "commands") },
+ ];
+
+ function isInstalled(bin) {
+ try {
+ execSync(`which ${bin}`, { stdio: "ignore" });
+ return true;
+ } catch {
+ return false;
+ }
+ }
+
+ try {
+ // Resolve commands source: dist/templates/commands/ relative to this script
+ const commandsSource = join(dirname(__dirname), "dist", "templates", "commands");
+ if (!existsSync(commandsSource)) {
+ // During development or if dist not built yet, skip silently
+ process.exit(0);
+ }
+
+ const commandFiles = readdirSync(commandsSource).filter(f => f.endsWith(".md"));
+ if (commandFiles.length === 0) process.exit(0);
+
+ let installed = 0;
+ for (const agent of AGENTS) {
+ if (!isInstalled(agent.bin)) continue;
+
+ mkdirSync(agent.commandsDir, { recursive: true });
+ for (const file of commandFiles) {
+ const src = readFileSync(join(commandsSource, file), "utf-8");
+ writeFileSync(join(agent.commandsDir, file), src);
+ }
+ installed++;
+ console.log(` reap: ${agent.name} — ${commandFiles.length} slash commands installed`);
+ }
+
+ if (installed === 0) {
+ console.log(" reap: no supported AI agents detected (claude, opencode). Run 'reap update' after installing one.");
+ }
+ } catch (err) {
+ // Graceful failure — never break npm install
+ console.warn(" reap: postinstall warning —", err.message);
+ }