mustard-claude 3.1.31 → 3.1.32

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "mustard-claude",
3
- "version": "3.1.31",
3
+ "version": "3.1.32",
4
4
  "description": "Framework-agnostic CLI for Claude Code project setup",
5
5
  "type": "module",
6
6
  "bin": {
@@ -23,15 +23,31 @@ Approves the active spec and prepares the implementation phase.
23
23
  - Do NOT proceed to step 2 without running this command
24
24
  2. **Read** `.claude/pipeline-config.md` — agents, model selection
25
25
  3. Locate active spec in `.claude/spec/active/`
26
+
27
+ ### Step 3b: Wave Plan Detection
28
+
29
+ Check if the located spec is a wave plan: look for `.claude/spec/active/{specName}/wave-plan.md`.
30
+
31
+ **If `wave-plan.md` exists:**
32
+
33
+ 1. Read `.claude/.pipeline-states/{specName}.json` — expect `isWavePlan: true`, `totalWaves: N`, `currentWave: 1`, `completedWaves: []`.
34
+ 2. Read `wave-plan.md` and print its ENTIRE contents verbatim inside a fenced markdown block (```` ```markdown ... ``` ````). List each wave spec file path below the block (one line each).
35
+ 3. `AskUserQuestion`:
36
+ - **"Approve wave plan — start with wave 1"** → proceed to step 4 (update header + state for wave 1 dispatch)
37
+ - **"Reject decomposition — use single spec"** → merge all wave specs back into a single spec at `.claude/spec/active/{specName}/spec.md` (concatenate `## Files`, `## Tasks`, `## Boundaries` from each wave), delete `wave-plan.md` and `wave-N-*/` subdirectories, set `scopeOverride: "user-rejected-waves"` and `isWavePlan: false` in pipeline state, proceed to step 4 on the single spec
38
+ - **"Stop — re-plan with guidance"** → stop. Instruct user: `Delete .claude/spec/active/{specName}/ and re-run /feature {name} with explicit guidance (e.g., "keep wave 2 and wave 3 together").`
39
+ 4. If user approved wave plan, for step 4 and onward, operate on the **wave 1 spec** (`.claude/spec/active/{specName}/wave-1-{role}/spec.md`) — update its header, not the wave-plan.md header.
40
+
41
+ **If `wave-plan.md` does NOT exist:** proceed as a single spec (original behavior below).
42
+
26
43
  4. **Spec Checkpoint — update spec header:**
27
44
  - `### Status: approved`
28
45
  - `### Phase: PLAN`
29
46
  - `### Checkpoint: {ISO timestamp now}`
30
- 5. **Pipeline State — create `.claude/.pipeline-states/{spec-name}.json`:**
47
+ 5. **Pipeline State — create or update `.claude/.pipeline-states/{spec-name}.json`:**
31
48
  - Extract `spec-name` from the spec directory (e.g. basename of path → `2026-02-26-linked-services-card`)
32
- - Parse Tasks from spec to extract tasks per agent (DB, Backend, Frontend, etc.)
33
- - Create `.claude/.pipeline-states/` directory if it doesn't exist
34
- - Write state file with `specName`, `status: "approved"`, `phaseName: "PLAN"`, `tasks` with names and agents, `model`, `updatedAt`
49
+ - **If wave plan (from Step 3b):** state already exists. Update fields: `status: "approved"`, `currentWave: 1`, `updatedAt`. Parse tasks from **wave-1** spec only (not all waves). Preserve `isWavePlan`, `totalWaves`, `completedWaves`, `failedWaves`.
50
+ - **If single spec:** Parse Tasks from spec to extract tasks per agent (DB, Backend, Frontend, etc.). Create `.claude/.pipeline-states/` directory if it doesn't exist. Write state file with `specName`, `status: "approved"`, `phaseName: "PLAN"`, `tasks` with names and agents, `model`, `updatedAt`.
35
51
  5b. **Memory Persist — record architectural decisions:**
36
52
  - For each significant decision in the spec (technology choices, design patterns, trade-offs):
37
53
  ```bash
@@ -58,6 +58,32 @@ If the diff file is empty or missing, skip the Git State header entirely. Never
58
58
  - Trace callers/callees via Grep in relevant directories (prefer Grep over Read)
59
59
  - Return as soon as root cause is clear — don't exhaustively scan
60
60
  - Return: root cause file(s), line(s), explanation
61
+
62
+ 2b. **Cache root-cause for retry reuse:**
63
+
64
+ After DIAGNOSE returns, compute a cache signature so fix-loop retries can skip re-DIAGNOSE when the affected surface hasn't changed:
65
+
66
+ ```javascript
67
+ // in-memory during bugfix session (also persisted to pipeline-state for Full Path)
68
+ const affectedFiles = [...root-cause file(s) from Explore return, sorted];
69
+ const bugDescription = {user's error description, canonical — trimmed and lowercased};
70
+ const rootCauseHash = sha256(bugDescription + '|' + affectedFiles.join(','));
71
+ const rootCauseSummary = {1-line root cause from Explore, ≤500 chars};
72
+ const affectedFilesHash = sha256(concatenated contents of affectedFiles right now);
73
+ ```
74
+
75
+ Write to pipeline-state if Full Path (`.claude/.pipeline-states/{specName}.json`):
76
+ ```json
77
+ {
78
+ "rootCauseHash": "sha256...",
79
+ "rootCauseSummary": "...",
80
+ "affectedFilesHash": "sha256...",
81
+ "affectedFiles": ["path/a.ts", "path/b.ts"],
82
+ "cachedAt": "{ISO}"
83
+ }
84
+ ```
85
+
86
+ For Fast Path (no spec yet), keep the cache in-memory only — it lives for the duration of the bugfix session, which is sufficient for the retry loop.
61
87
  3. **ASSESS — Decision point:**
62
88
  - Explore returns clear root cause in 1-2 files → **Fast Path** (skip PLAN)
63
89
  - 3+ files, unclear impact, cross-layer → **Full Path** (brief spec via PLAN)
@@ -125,9 +151,19 @@ Before retrying a failed fix attempt, classify the failure:
125
151
 
126
152
  1. **Transient?** — Would re-running succeed without any change? (flaky test, cache, env) → Retry once immediately.
127
153
  2. **Resolvable?** — Is the fix clear and patchable in ≤3 lines without new reads? → Apply patch, retry (counts as retry 1).
128
- 3. **Structural?** — Did the original ANALYZE misidentify the root cause? → Re-analyze: dispatch a focused Explore on the actual failure point, update root cause, re-dispatch bugfix agent. Does NOT count against the 2-retry cap.
129
-
130
- Max 2 retries for Transient + Resolvable. Structural failures trigger a targeted re-ANALYZE, not a blind retry.
154
+ 3. **Structural?** — Did the original ANALYZE misidentify the root cause? → **Before re-Exploring, consult the root-cause cache from Step 2b:**
155
+ - Recompute `affectedFilesHash` for the cached `affectedFiles`.
156
+ - **Cache hit (hash matches) AND failure signal does NOT suggest a different cause** (no keyword in the failure pointing to files outside `affectedFiles`, no REVIEW rationale explicitly naming a different root) → skip re-Explore, inject `rootCauseSummary` verbatim into the retry prompt. Log: `root-cause cached (retry {N}/2), skipping diagnose`.
157
+ - **Cache miss (files changed) OR failure rationale points elsewhere** → invalidate cache, run targeted Explore on the actual failure point, update root cause (including new cache entry via Step 2b), re-dispatch bugfix agent.
158
+ - Re-ANALYZE (with or without cache) does NOT count against the 2-retry cap.
159
+
160
+ Max 2 retries for Transient + Resolvable. Structural failures trigger a targeted re-ANALYZE (cache-gated), not a blind retry.
161
+
162
+ **Cache invalidation signals:**
163
+ - Affected files changed on disk → hash mismatch invalidates
164
+ - Review/build failure rationale mentions files outside `affectedFiles` → invalidate
165
+ - User explicitly overrides (rare) → invalidate
166
+ - After 2 retries exhausted, the cache is naturally flushed when the pipeline aborts or advances
131
167
 
132
168
  ### CLOSE
133
169
 
@@ -121,6 +121,101 @@ Continue to PLAN regardless.
121
121
 
122
122
  ### PLAN Phase
123
123
 
124
+ #### Wave Decomposition Pre-Check (Full scope only)
125
+
126
+ **Skip for Light/Extended Light** — decomposition only makes sense when scope is genuinely large.
127
+
128
+ Before writing the single spec in Full scope, check whether the work should be decomposed into waves:
129
+
130
+ 1. **Compute signals from ANALYZE output:**
131
+ - `fileCount` — files that will go into `## Files`
132
+ - `layerCount` — distinct layers (use role detection derived from paths: schema/api/ui/lib)
133
+ - `newEntityCount` — new entities created by this spec
134
+ - `estimatedTouchPoints` — count of imports/refs from Grep on affected directories (optional)
135
+
136
+ 2. **Read knowledge matches:** Read `.claude/knowledge.json` (if it exists). Extract entries whose `id` starts with `heavy-pipeline` or `high-hook-retry`. Each entry's scope signals describe a historical pipeline run that proved expensive (high token or retry cost).
137
+
138
+ 3. **Run decomposition decision:**
139
+ ```bash
140
+ echo '{"fileCount":{N},"layerCount":{L},"newEntityCount":{E},"knowledgeMatches":[...]}' | node .claude/scripts/scope-decompose.js
141
+ ```
142
+ Output JSON: `{decompose: bool, reason: string, signals: {...}}`
143
+
144
+ 4. **If `decompose: false`** → proceed to `#### Full Scope` below as usual (single spec).
145
+
146
+ 5. **If `decompose: true`** → build wave plan:
147
+ ```bash
148
+ echo '{"files":[...all paths from ANALYZE...],"projectRoot":"."}' | node .claude/scripts/wave-dependency.js
149
+ ```
150
+ Output cases:
151
+ - `{error: "cyclic-dependency", cycle: [...]}` → warn user about cyclic imports (pre-existing architecture issue), fall back to single spec with note in `## Concerns`. Proceed to `#### Full Scope`.
152
+ - `{error: ...}` → fail-open: fall back to single spec.
153
+ - `{waves: [...]}` with only 1 wave → no real DAG depth, fall back to single spec.
154
+ - `{waves: [...]}` with 2+ waves → write **Wave Plan** (step 6).
155
+
156
+ 6. **Write Wave Plan structure:**
157
+ ```
158
+ .claude/spec/active/{date}-{name}/
159
+ ├── wave-plan.md
160
+ ├── wave-1-{role}/spec.md
161
+ ├── wave-2-{role}/spec.md
162
+ └── wave-N-{role}/spec.md
163
+ ```
164
+
165
+ `wave-plan.md` contains:
166
+ ```markdown
167
+ # Wave Plan: {name}
168
+ ### Status: draft | Phase: PLAN | Scope: full | Decomposed: yes
169
+ ### Checkpoint: {ISO now}
170
+ ### Reason: {decompose.reason}
171
+
172
+ ## Summary
173
+ {1-2 lines: what + why}
174
+
175
+ ## Waves
176
+ ### Wave 1 — {roles of wave 1}
177
+ Depends on: none
178
+ Files ({count}): {file1}, {file2}, ...
179
+
180
+ ### Wave 2 — {roles of wave 2}
181
+ Depends on: wave 1
182
+ Files ({count}): {file3}, ...
183
+
184
+ {... for each wave ...}
185
+
186
+ ## Rationale
187
+ {which knowledge entry matched or which threshold triggered; signals from scope-decompose}
188
+ ```
189
+
190
+ Each `wave-N-{role}/spec.md` is a **complete atomic spec** scoped to just that wave's files. Use the same template as Full scope single spec (Summary, Entity Info, Files, Tasks, Dependencies, Boundaries). Reference `../wave-plan.md` at the top as context.
191
+
192
+ 7. **Write pipeline state for wave plan:**
193
+ ```json
194
+ {
195
+ "specName": "{date}-{name}",
196
+ "status": "draft",
197
+ "phase": 2,
198
+ "phaseName": "PLAN",
199
+ "scope": "full",
200
+ "isWavePlan": true,
201
+ "currentWave": 1,
202
+ "totalWaves": N,
203
+ "completedWaves": [],
204
+ "failedWaves": []
205
+ }
206
+ ```
207
+
208
+ 8. **Present wave plan to user:**
209
+ - Read `wave-plan.md` and print its ENTIRE contents verbatim inside a fenced markdown block.
210
+ - Also list each wave's spec file paths (one line each) so the user can open individual wave specs if desired.
211
+ - Then `AskUserQuestion`:
212
+ - **"Approve wave plan and implement now"** → goes to EXECUTE wave 1 inline (same rules as Light inline)
213
+ - **"Approve wave plan for later"** → stop, user runs `/approve` + `/resume`
214
+ - **"Edit decomposition (hint PLAN)"** → user provides a hint (e.g., "merge waves 2 and 3"), and PLAN re-executes with the hint applied as manual grouping guidance alongside the computed signals. Re-decompose once.
215
+ - **"Reject decomposition — use single spec"** → discard wave plan files, set `scopeOverride: "user-rejected-waves"` in pipeline state, proceed to `#### Full Scope` as if `decompose: false`.
216
+
217
+ 9. **If user approves the wave plan**, the single-spec `#### Full Scope` flow below is **skipped** — wave-1 becomes the first thing to execute (via `/approve --resume` or `/resume`).
218
+
124
219
  #### Full Scope
125
220
 
126
221
  1. Create `.claude/spec/active/{date}-{name}/spec.md` with:
@@ -300,5 +395,7 @@ Scope tag: `[LIGHT]` or `[FULL]` after progress line.
300
395
  - ALWAYS go straight to PLAN once you understand the change — more reads ≠ better spec
301
396
  - Light scope inline implement follows same dispatch rules as `/resume` (template, waves, retries)
302
397
  - Context budget: Grep entity-registry (not full read), Grep recipes (not full read), line-by-line checkbox updates
398
+ - Wave decomposition is opt-in via signals (knowledge matches, layer/file/entity counts) — never force waves on small scopes
399
+ - If wave decomposition is approved, single-spec Full Scope flow is skipped — waves execute sequentially via `/resume`
303
400
 
304
401
  ULTRATHINK
@@ -29,7 +29,28 @@ Before the normal detect-and-confirm flow, scan the newest pipeline state for a
29
29
  3. After the re-dispatch returns, clear the flag: remove `lastDispatchFailure` from the state object and rewrite the pipeline-state JSON.
30
30
  4. Fall through to Step 1 (normal resume flow continues from the updated state).
31
31
  - **If ageMs > 10 * 60 * 1000** (stale): silently remove `lastDispatchFailure` from the state and rewrite the file, then continue to Step 1.
32
- 4. If `lastDispatchFailure` is absent, skip Step 0 entirely and proceed to Step 1.
32
+ 4. If `lastDispatchFailure` is absent, skip Step 0 entirely and proceed to Step 0.5.
33
+
34
+ ### Step 0.5: Resume Mode (continue vs reanalyze)
35
+
36
+ Before loading heavy context (sync-registry, diff-context, Explore Gate), ask the user which mode to use. This saves roughly 2-5k tokens of context per resume.
37
+
38
+ 1. **Skip conditions** — enter `reanalyzed` mode automatically without prompting:
39
+ - Step 0 just re-dispatched a failed agent (recovery path → always reanalyze next step)
40
+ - `pipeline-state.lastDispatchFailure` was present and <10min old (already handled in Step 0)
41
+ - Wave plan with `failedWaves.length > 0` (handled in the wave failure section below — forces `reanalyzed`)
42
+
43
+ 2. **Otherwise, AskUserQuestion:**
44
+ - **"Continue from where it stopped (light mode)"** → `mode = "continued"`: skip sync-registry (Step 2 #6), skip diff-context (unless wave transition forces), skip Pre-EXECUTE Existence Gate (Step 12b). Trust pipeline-state as source of truth.
45
+ - **"Reanalyze context (full mode)"** → `mode = "reanalyzed"`: run Step 2 fully (default behavior, re-reads everything).
46
+
47
+ 3. **Record mode in pipeline state:** write `resumeMode: "continued" | "reanalyzed"` and `resumeModeAt: {ISO now}` so downstream steps know which path they are in.
48
+
49
+ 4. **Stale-context fallback (safety net):** if a dispatched agent in `continued` mode returns an error indicating stale context (e.g., references a missing file, fails boundary check, or returns `BLOCKED` with reason citing out-of-date registry), escalate automatically:
50
+ - Update pipeline state: `resumeMode: "escalated-to-reanalyze"`, append to `resumeEscalations` array with `{at, reason}`
51
+ - Re-run Step 2 in full (sync-registry + diff-context)
52
+ - Re-dispatch the failed agent with fresh context
53
+ - Fail-open: escalation never blocks, just upgrades to the heavier path
33
54
 
34
55
  ### Step 1: Detect & Confirm
35
56
 
@@ -83,10 +104,14 @@ Before the normal detect-and-confirm flow, scan the newest pipeline state for a
83
104
  ### Step 2: Bootstrap (after confirmation)
84
105
 
85
106
  6. **AUTO-SYNC:** `node .claude/scripts/sync-registry.js`
107
+ - **Skip if `resumeMode === "continued"`** (Step 0.5): registry is reused from prior session.
108
+ - Always run if `resumeMode === "reanalyzed"` or `"escalated-to-reanalyze"`.
86
109
 
87
110
  ### Diff Context (automatic)
88
111
  Run `node .claude/scripts/diff-context.js --subproject {subproject_path}` per subproject to capture the current git state scoped to each subproject. Include the subproject-specific output in the agent prompt as `{diff_context}` so agents see only changes relevant to their scope.
89
112
 
113
+ **Skip if `resumeMode === "continued"`** unless a wave just completed (wave transitions always refresh diff). The prior diff snapshot is reused from `.claude/.pipeline-states/{specName}.diff-{subproject}.md`.
114
+
90
115
  7. **Read** `.claude/pipeline-config.md`. For `entity-registry.json`: use Grep to extract ONLY the relevant entity block (e.g. `"Contract":`), NEVER read the full JSON
91
116
  9. **Update spec header:** `Status: implementing`, `Phase: EXECUTE`, `Checkpoint: {ISO now}`
92
117
  10. **Update/create pipeline state:** `status: "implementing"`, `phaseName: "EXECUTE"`, `specName`
@@ -99,9 +124,26 @@ Run `node .claude/scripts/diff-context.js --subproject {subproject_path}` per su
99
124
  12. **Match recipe by name only:** Grep `{subproject}/.claude/commands/recipes.md` for recipe title matching the task type — do NOT read the full recipes file. Extract only: recipe number, pattern refs, reference modules
100
125
  12b. **Pre-EXECUTE Existence Gate**: Same gate as `feature/SKILL.md § Pre-EXECUTE Existence Gate`. Invoke identically (Full scope only, `## Files` ≤ 8). On retry/resume, the gate naturally handles idempotence: tasks already `[x]` from a prior run are treated as Mixed — the Haiku confirms they stay done and the orchestrator only re-dispatches what remains `[ ]`.
101
126
 
127
+ **Skip entirely if `resumeMode === "continued"`** (Step 0.5). The `continued` mode trusts pipeline-state checkboxes as-is. If the stale-context fallback escalates to `reanalyze`, the gate runs on the re-dispatch.
128
+
102
129
  **Pre-check (same as `feature/SKILL.md § Pre-EXECUTE Existence Gate`):** Before dispatching Haiku, run `rtk git diff --stat HEAD -- <files listed in spec's ## Files>`. Skip gate entirely if output is empty (no changes) or total insertions/deletions <10. Only proceed with Haiku dispatch if ≥10 lines changed.
103
130
 
131
+ 12c. **Wave Plan Scope (conditional — only if `pipeline-state.isWavePlan === true`):**
132
+
133
+ When the pipeline state indicates a wave plan, the orchestrator dispatches only the **current wave**, not the full spec:
134
+
135
+ 1. Read `pipeline-state.currentWave` and `pipeline-state.totalWaves`.
136
+ 2. The spec to work from for this invocation is `.claude/spec/active/{specName}/wave-{currentWave}-*/spec.md`. Replace any prior reference to `spec.md` at the root of the spec dir with the current wave's spec.
137
+ 3. **Between waves** (see Step 17 post-dispatch):
138
+ - On wave completion: run `/mustard:git commit` style commit with message `feat(wave-{N}/{role}): {summary}`. If `/mustard:git commit` is not appropriate for the project, fall back to `git add {files} && git commit -m "..."`.
139
+ - Update state: `completedWaves.push(currentWave)`, `currentWave += 1`, `updatedAt`.
140
+ - Force `resumeMode = "reanalyzed"` for the next wave transition so diff-context refreshes with the just-committed changes.
141
+ - If `currentWave > totalWaves` → skip remaining wave dispatch, go to Step 19 REVIEW + Step 20 CLOSE on the overall wave plan.
142
+ 4. **If a wave fails (REJECTED after 2 fix-loops, or BLOCKED)** — see § Wave Failure Handling below.
143
+
104
144
  13. **Plan waves:** `Depends on: none` → Wave 1; dependencies → later. DB+Backend parallel. Frontend after Backend UNLESS all parallel override conditions met (see `.claude/pipeline-config.md` Parallel Rules). Review agents: ALWAYS dispatch in single parallel message. Skip completed tasks.
145
+
146
+ **Note on wave plans:** when `isWavePlan === true`, this step plans the agent wave structure **within** the current wave's spec only — agents internal to the current wave-spec may still split across DB/Backend/Frontend sub-waves. The outer wave (1..N) is the cross-spec sequence managed by Step 12c.
105
147
  14. **Build agent prompts using template** (`.claude/commands/mustard/templates/agent-prompt/SKILL.md`):
106
148
  - Read template once, then fill placeholders per agent using `.claude/pipeline-config.md` data:
107
149
  - `{subproject}` → from Agents table (Subproject column)
@@ -183,11 +225,46 @@ When REVIEW returns REJECTED (any CRITICAL):
183
225
  8. If review still REJECTED after 2 fix-loops: STOP + report exhausted retries.
184
226
 
185
227
  20. **CLOSE:**
228
+ - **Wave plan gate:** if `pipeline-state.isWavePlan === true`, only CLOSE when `completedWaves.length === totalWaves`. If waves remain (`currentWave <= totalWaves` and wave N-1 just finished), **do not** run CLOSE — instead update state (`currentWave++`, `completedWaves.push`), output `═══ WAVE {N-1} COMPLETE — {role} ═══`, and stop. Next `/mustard:resume` picks up wave N.
186
229
  - `node .claude/scripts/sync-registry.js`
187
- - Spec: `Status: completed`, `Phase: CLOSE`, all `[ ]` → `[x]`
188
- - Move spec to `.claude/spec/completed/`
230
+ - Spec: `Status: completed`, `Phase: CLOSE`, all `[ ]` → `[x]`. For wave plans: mark `wave-plan.md` status `completed`, and mark each `wave-N-{role}/spec.md` completed too.
231
+ - Move spec to `.claude/spec/completed/` (the entire `{specName}/` directory, including wave subdirs if any)
189
232
  - **Delete** `.claude/.pipeline-states/{spec-name}.json`
190
- - Output with agent colors: `═══ PIPELINE COMPLETE — {name} | Agents: {n} ok | Files: {c} created, {m} modified ═══`
233
+ - Output with agent colors: `═══ PIPELINE COMPLETE — {name} | Agents: {n} ok | Files: {c} created, {m} modified ═══` (for wave plans: append `| Waves: {totalWaves}`).
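The wave-plan CLOSE gate above reduces to a small predicate plus a state advance. A hypothetical sketch:

```javascript
// Sketch of the CLOSE wave gate (hypothetical helpers mirroring Step 20).
function shouldClose(state) {
  if (!state.isWavePlan) return true; // single specs CLOSE normally
  return state.completedWaves.length === state.totalWaves;
}

// Called when a wave finishes but waves remain: record it and point at the next wave.
function advanceWave(state) {
  state.completedWaves.push(state.currentWave);
  state.currentWave += 1;
  return state;
}
```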
234
+
235
+ ### Wave Failure Handling
236
+
237
+ Applies only when `pipeline-state.isWavePlan === true`.
238
+
239
+ A wave is considered **failed** when:
240
+ - REVIEW returns REJECTED after 2 fix-loops exhausted (see Step 19b), OR
241
+ - An implementation agent returns `BLOCKED` and the user cannot resolve inline, OR
242
+ - Build/type-check fails repeatedly (max 2 retries) after Granular Retry Protocol is exhausted.
243
+
244
+ **On wave failure:**
245
+
246
+ 1. Update pipeline state:
247
+ - `failedWaves.push(currentWave)`
248
+ - `status = "failed"`
249
+ - `updatedAt = {ISO now}`
250
+ 2. Write failure log to `.claude/spec/active/{specName}/wave-{currentWave}-{role}/failure.md`:
251
+ ```markdown
252
+ # Wave {N} Failure — {role}
253
+ ## When: {ISO}
254
+ ## Phase: {EXECUTE | REVIEW | CLOSE}
255
+ ## Reason: {short cause — e.g., "REVIEW REJECTED after 2 fix-loops"}
256
+ ## Findings (verbatim)
257
+ {last review findings OR BLOCKED rationale OR build error}
258
+ ## Files touched
259
+ {list from agent memory}
260
+ ```
261
+ 3. **Do NOT** attempt further automatic recovery. Wave N-1 commits remain in place — they are real progress.
262
+ 4. **Prompt the user via AskUserQuestion:**
263
+ - **"Fix wave {N} manually and resume"** → user fixes by hand; the next `/mustard:resume` clears the `failedWaves` entry and restarts wave N from EXECUTE.
264
+ - **"Rewrite wave {N} (re-PLAN this wave)"** → delete `wave-{N}-{role}/spec.md`, re-enter PLAN for wave N only (run the PLAN sub-flow scoped to wave N's files). The user then re-approves wave N via `/mustard:approve`.
265
+ - **"Abort pipeline"** → set `status: "aborted"`, move spec to `.claude/spec/aborted/{specName}/` (create dir if needed), keep waves 1..N-1 commits. Inform user: `Pipeline aborted. Waves 1..{N-1} commits preserved. Waves {N}..{totalWaves} discarded.`
266
+
267
+ **Documented residual risk:** wave N-1 commits may be semantically incomplete without wave N (e.g., schema created but no API). The user was warned of this when approving the wave plan. The `failure.md` log states explicitly which surface was left exposed.
191
268
 
192
269
  ### Granular Retry Protocol
193
270
 
@@ -0,0 +1,105 @@
1
+ #!/usr/bin/env node
2
+
3
+ /**
4
+ * scope-decompose.js
5
+ *
6
+ * Decides whether a feature spec should be decomposed into multiple waves.
7
+ * Reads signals from stdin (JSON), emits decision to stdout (JSON).
8
+ *
9
+ * Input (stdin):
10
+ * {
11
+ * fileCount: number, // files listed in the spec's "## Files"
12
+ * layerCount: number, // distinct layers touched
13
+ * newEntityCount: number, // new entities
14
+ * estimatedTouchPoints: number, // cross-cutting imports/refs (optional)
15
+ * knowledgeMatches: [ // heavy-pipeline/high-hook-retry entries from knowledge.json
16
+ * { id: string, type: string, scope: object }
17
+ * ]
18
+ * }
19
+ *
20
+ * Output (stdout):
21
+ * { decompose: boolean, reason: string, signals: {...} }
22
+ *
23
+ * Fail-open: on any error, emits { decompose: false, reason: "error-fallback" }
24
+ * and exits 0.
25
+ */
26
+
27
+ "use strict";
28
+
29
+ function readStdin() {
30
+ return new Promise((resolve) => {
31
+ let data = "";
32
+ process.stdin.setEncoding("utf8");
33
+ process.stdin.on("data", (chunk) => {
34
+ data += chunk;
35
+ });
36
+ process.stdin.on("end", () => resolve(data));
37
+ process.stdin.on("error", () => resolve(""));
38
+ });
39
+ }
40
+
41
+ function decide(signals) {
42
+ const {
43
+ fileCount = 0,
44
+ layerCount = 0,
45
+ newEntityCount = 0,
46
+ estimatedTouchPoints = 0,
47
+ knowledgeMatches = [],
48
+ } = signals;
49
+
50
+ const hasHistoricalMatch = Array.isArray(knowledgeMatches) && knowledgeMatches.length > 0;
51
+
52
+ if (!hasHistoricalMatch && fileCount <= 5) {
53
+ return {
54
+ decompose: false,
55
+ reason: "small-scope-no-history",
56
+ signals: { fileCount, layerCount, newEntityCount, estimatedTouchPoints, historicalMatches: 0 },
57
+ };
58
+ }
59
+
60
+ if (hasHistoricalMatch) {
61
+ return {
62
+ decompose: true,
63
+ reason: `history-match:${knowledgeMatches[0].id || "unknown"}`,
64
+ signals: { fileCount, layerCount, newEntityCount, estimatedTouchPoints, historicalMatches: knowledgeMatches.length },
65
+ };
66
+ }
67
+
68
+ if (layerCount >= 3) {
69
+ return {
70
+ decompose: true,
71
+ reason: "deep-layers",
72
+ signals: { fileCount, layerCount, newEntityCount, estimatedTouchPoints, historicalMatches: 0 },
73
+ };
74
+ }
75
+
76
+ if (fileCount > 10 && newEntityCount >= 2) {
77
+ return {
78
+ decompose: true,
79
+ reason: "wide-and-new-entities",
80
+ signals: { fileCount, layerCount, newEntityCount, estimatedTouchPoints, historicalMatches: 0 },
81
+ };
82
+ }
83
+
84
+ return {
85
+ decompose: false,
86
+ reason: "below-thresholds",
87
+ signals: { fileCount, layerCount, newEntityCount, estimatedTouchPoints, historicalMatches: 0 },
88
+ };
89
+ }
90
+
91
+ async function main() {
92
+ try {
93
+ const raw = await readStdin();
94
+ let signals = {};
95
+ if (raw.trim()) {
96
+ signals = JSON.parse(raw);
97
+ }
98
+ const decision = decide(signals);
99
+ process.stdout.write(JSON.stringify(decision));
100
+ } catch (_err) {
101
+ process.stdout.write(JSON.stringify({ decompose: false, reason: "error-fallback" }));
102
+ }
103
+ }
104
+
105
+ main();
@@ -0,0 +1,239 @@
1
+ #!/usr/bin/env node
2
+
3
+ /**
4
+ * wave-dependency.js
5
+ *
6
+ * Builds a dependency DAG from a list of files (via imports/require/from parsing)
7
+ * and groups files into waves using topological level assignment.
8
+ *
9
+ * Input (stdin):
10
+ * {
11
+ * files: ["src/schema/user.ts", "src/api/users.ts", ...],
12
+ * projectRoot: "." // absolute or relative to cwd (optional, defaults to cwd)
13
+ * }
14
+ *
15
+ * Output (stdout):
16
+ * Success:
17
+ * {
18
+ * waves: [
19
+ * { wave: 1, files: [...], roles: [...], dependsOn: [] },
20
+ * { wave: 2, files: [...], roles: [...], dependsOn: [1] },
21
+ * ...
22
+ * ],
23
+ * metadata: { totalWaves: N, totalFiles: M, widestWave: W }
24
+ * }
25
+ *
26
+ * Error:
27
+ * { error: "cyclic-dependency" | "empty-input" | "error-fallback", cycle?: [...] }
28
+ *
29
+ * Fail-open: on unrecoverable error, emits { error: "error-fallback" } and exits 0.
30
+ */
31
+
32
+ "use strict";
33
+
34
+ const fs = require("fs");
35
+ const path = require("path");
36
+
37
+ const RESOLVABLE_EXTENSIONS = [".ts", ".tsx", ".js", ".jsx", ".mjs", ".cjs", ".vue", ".svelte", ".py", ".go", ".cs"];
38
+ const INDEX_BASENAMES = ["index.ts", "index.tsx", "index.js", "index.jsx", "index.mjs", "__init__.py"];
39
+
40
+ function readStdin() {
41
+ return new Promise((resolve) => {
42
+ let data = "";
43
+ process.stdin.setEncoding("utf8");
44
+ process.stdin.on("data", (chunk) => {
45
+ data += chunk;
46
+ });
47
+ process.stdin.on("end", () => resolve(data));
48
+ process.stdin.on("error", () => resolve(""));
49
+ });
50
+ }
51
+
52
+ function extractImports(content) {
53
+ const imports = new Set();
54
+ const patterns = [
55
+ /import\s+[\s\S]*?from\s+['"]([^'"]+)['"]/g, // ES `import ... from`
56
+ /import\s+['"]([^'"]+)['"]/g, // side-effect `import "x"`
57
+ /require\s*\(\s*['"]([^'"]+)['"]\s*\)/g, // CommonJS require
58
+ /from\s+([.\w]+)\s+import\s+/g, // Python from-import (dotted relative modules resolve only approximately)
59
+ ];
60
+ for (const re of patterns) {
61
+ let m;
62
+ while ((m = re.exec(content)) !== null) {
63
+ imports.add(m[1]);
64
+ }
65
+ }
66
+ return [...imports];
67
+ }
68
+
69
+ function resolveImport(importPath, currentFileAbs, projectRoot, candidateSet) {
70
+ if (!importPath.startsWith(".") && !importPath.startsWith("/")) {
71
+ return null;
72
+ }
73
+ const baseDir = path.dirname(currentFileAbs);
74
+ const absTarget = path.resolve(baseDir, importPath);
75
+
76
+ if (candidateSet.has(absTarget)) return absTarget;
77
+
78
+ for (const ext of RESOLVABLE_EXTENSIONS) {
79
+ const withExt = absTarget + ext;
80
+ if (candidateSet.has(withExt)) return withExt;
81
+ }
82
+
83
+ const lastDot = absTarget.lastIndexOf(".");
84
+ if (lastDot > absTarget.lastIndexOf(path.sep)) {
85
+ const stripped = absTarget.slice(0, lastDot);
86
+ if (candidateSet.has(stripped)) return stripped;
87
+ for (const ext of RESOLVABLE_EXTENSIONS) {
88
+ const swapped = stripped + ext;
89
+ if (candidateSet.has(swapped)) return swapped;
90
+ }
91
+ }
92
+
93
+ for (const basename of INDEX_BASENAMES) {
94
+ const indexPath = path.join(absTarget, basename);
95
+ if (candidateSet.has(indexPath)) return indexPath;
96
+ }
97
+
98
+ return null;
99
+ }
100
+
101
+ function detectRole(filePath) {
102
+ const lower = filePath.toLowerCase();
103
+ if (/(schema|migration|entity|model|drizzle|prisma)/.test(lower)) return "schema";
104
+ if (/(api|controller|route|endpoint|handler|service)/.test(lower)) return "api";
105
+ if (/(ui|component|view|page|screen|widget)/.test(lower)) return "ui";
106
+ if (/(test|spec|__tests__)/.test(lower)) return "test";
107
+ return "lib";
108
+ }
109
+
110
+ function buildGraph(files, projectRoot) {
111
+ const absFiles = files.map((f) => path.isAbsolute(f) ? f : path.resolve(projectRoot, f));
112
+ const candidateSet = new Set(absFiles);
113
+ const graph = new Map();
114
+
115
+ for (const absFile of absFiles) {
116
+ graph.set(absFile, new Set());
117
+ let content;
118
+ try {
119
+ content = fs.readFileSync(absFile, "utf8");
120
+ } catch {
121
+ continue;
122
+ }
123
+ const imports = extractImports(content);
124
+ for (const imp of imports) {
125
+ const resolved = resolveImport(imp, absFile, projectRoot, candidateSet);
126
+ if (resolved && resolved !== absFile) {
127
+ graph.get(absFile).add(resolved);
128
+ }
129
+ }
130
+ }
131
+
132
+ return { graph, absFiles };
133
+ }
134
+
135
+ function topologicalWaves(graph) {
136
+ const indegree = new Map();
137
+ const dependents = new Map();
138
+
139
+ for (const node of graph.keys()) {
140
+ indegree.set(node, 0);
141
+ dependents.set(node, new Set());
142
+ }
143
+
144
+ for (const [node, deps] of graph.entries()) {
145
+ for (const dep of deps) {
146
+ if (graph.has(dep)) {
147
+ indegree.set(node, (indegree.get(node) || 0) + 1);
148
+ dependents.get(dep).add(node);
149
+ }
150
+ }
151
+ }
152
+
153
+ const waves = [];
154
+ const visited = new Set();
155
+ let currentWave = [...indegree.entries()].filter(([, deg]) => deg === 0).map(([node]) => node);
156
+
157
+ while (currentWave.length > 0) {
158
+ waves.push(currentWave);
159
+ for (const node of currentWave) visited.add(node);
160
+ const nextWave = [];
161
+ for (const node of currentWave) {
162
+ for (const dependent of dependents.get(node) || []) {
163
+ const newDeg = indegree.get(dependent) - 1;
164
+ indegree.set(dependent, newDeg);
165
+ if (newDeg === 0 && !visited.has(dependent)) {
166
+ nextWave.push(dependent);
167
+ }
168
+ }
169
+ }
170
+ currentWave = nextWave;
171
+ }
172
+
173
+ if (visited.size < graph.size) {
174
+ const stuck = [...graph.keys()].filter((n) => !visited.has(n));
175
+ return { error: "cyclic-dependency", cycle: stuck };
176
+ }
177
+
178
+ return { waves };
179
+ }
180
+
181
+ function toRelative(abs, projectRoot) {
182
+ try {
183
+ return path.relative(projectRoot, abs).replace(/\\/g, "/");
184
+ } catch {
185
+ return abs;
186
+ }
187
+ }
188
+
189
+ async function main() {
190
+ try {
191
+ const raw = await readStdin();
192
+ if (!raw.trim()) {
193
+ process.stdout.write(JSON.stringify({ error: "empty-input" }));
194
+ return;
195
+ }
196
+ const { files, projectRoot = process.cwd() } = JSON.parse(raw);
197
+
198
+ if (!Array.isArray(files) || files.length === 0) {
199
+ process.stdout.write(JSON.stringify({ error: "empty-input" }));
200
+ return;
201
+ }
202
+
203
+ const rootAbs = path.isAbsolute(projectRoot) ? projectRoot : path.resolve(projectRoot);
204
+ const { graph } = buildGraph(files, rootAbs);
205
+ const result = topologicalWaves(graph);
206
+
207
+ if (result.error) {
208
+ const cycle = (result.cycle || []).map((f) => toRelative(f, rootAbs));
209
+ process.stdout.write(JSON.stringify({ error: result.error, cycle }));
210
+ return;
211
+ }
212
+
213
+ const waves = result.waves.map((waveFiles, idx) => {
214
+ const relFiles = waveFiles.map((f) => toRelative(f, rootAbs));
215
+ const roles = [...new Set(relFiles.map(detectRole))];
216
+ return {
217
+ wave: idx + 1,
218
+ files: relFiles,
219
+ roles,
220
+ dependsOn: idx === 0 ? [] : [idx],
221
+ };
222
+ });
223
+
224
+ const widestWave = waves.reduce((max, w) => Math.max(max, w.files.length), 0);
225
+
226
+ process.stdout.write(JSON.stringify({
227
+ waves,
228
+ metadata: {
229
+ totalWaves: waves.length,
230
+ totalFiles: files.length,
231
+ widestWave,
232
+ },
233
+ }));
234
+ } catch (_err) {
235
+ process.stdout.write(JSON.stringify({ error: "error-fallback" }));
236
+ }
237
+ }
238
+
239
+ main();