claude-code-workflow 7.0.0 → 7.0.2
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/.codex/skills/csv-wave-pipeline/SKILL.md +834 -0
- package/ccw/dist/core/server.d.ts.map +1 -1
- package/ccw/dist/core/server.js +65 -1
- package/ccw/dist/core/server.js.map +1 -1
- package/ccw/dist/utils/react-frontend.d.ts +4 -0
- package/ccw/dist/utils/react-frontend.d.ts.map +1 -1
- package/ccw/dist/utils/react-frontend.js +45 -3
- package/ccw/dist/utils/react-frontend.js.map +1 -1
- package/ccw/frontend/dist/assets/index-B0X7v-kT.js +1604 -0
- package/ccw/frontend/dist/assets/index-B0X7v-kT.js.map +1 -0
- package/ccw/frontend/dist/assets/index-C-5AfpJE.css +48 -0
- package/ccw/frontend/dist/assets/index-D73mpSPr.js +16 -0
- package/ccw/frontend/dist/assets/index-D73mpSPr.js.map +1 -0
- package/ccw/frontend/dist/assets/index-Dxu1sw9t.js +16 -0
- package/ccw/frontend/dist/assets/index-Dxu1sw9t.js.map +1 -0
- package/ccw/frontend/dist/assets/monitor.worker-CVRNUIeJ.js +2 -0
- package/ccw/frontend/dist/assets/monitor.worker-CVRNUIeJ.js.map +1 -0
- package/ccw/frontend/dist/index.html +17 -0
- package/codex-lens/src/codexlens/__pycache__/__init__.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/__pycache__/__main__.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/__pycache__/config.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/__pycache__/entities.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/__pycache__/errors.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/cli/__pycache__/__init__.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/cli/__pycache__/commands.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/cli/__pycache__/output.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/hybrid_search/__pycache__/__init__.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/hybrid_search/__pycache__/data_structures.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/lsp/__pycache__/__init__.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/lsp/__pycache__/lsp_bridge.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/lsp/__pycache__/lsp_graph_builder.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/parsers/__pycache__/__init__.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/parsers/__pycache__/astgrep_binding.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/parsers/__pycache__/factory.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/parsers/__pycache__/tokenizer.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/parsers/__pycache__/treesitter_parser.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/parsers/patterns/__pycache__/__init__.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/parsers/patterns/python/__pycache__/__init__.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/search/__pycache__/__init__.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/search/__pycache__/chain_search.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/search/__pycache__/global_graph_expander.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/search/__pycache__/hybrid_search.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/search/__pycache__/ranking.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/search/clustering/__pycache__/__init__.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/search/clustering/__pycache__/base.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/search/clustering/__pycache__/factory.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/search/clustering/__pycache__/frequency_strategy.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/search/clustering/__pycache__/noop_strategy.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/storage/__pycache__/__init__.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/storage/__pycache__/dir_index.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/storage/__pycache__/global_index.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/storage/__pycache__/index_tree.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/storage/__pycache__/path_mapper.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/storage/__pycache__/registry.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/storage/__pycache__/sqlite_store.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/storage/__pycache__/vector_meta_store.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/watcher/__pycache__/__init__.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/watcher/__pycache__/events.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/watcher/__pycache__/file_watcher.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/watcher/__pycache__/incremental_indexer.cpython-312.pyc +0 -0
- package/codex-lens/src/codexlens/watcher/__pycache__/manager.cpython-312.pyc +0 -0
- package/package.json +3 -2
@@ -0,0 +1,834 @@
---
name: csv-wave-pipeline
description: Requirement planning to wave-based CSV execution pipeline. Decomposes requirement into dependency-sorted CSV tasks, computes execution waves, runs wave-by-wave via spawn_agents_on_csv with cross-wave context propagation.
argument-hint: "[-y|--yes] [-c|--concurrency N] [--continue] \"requirement description\""
allowed-tools: spawn_agents_on_csv, Read, Write, Edit, Bash, Glob, Grep, AskUserQuestion
---

## Auto Mode

When `--yes` or `-y`: Auto-confirm task decomposition, skip interactive validation, use defaults.

# CSV Wave Pipeline

## Usage

```bash
$csv-wave-pipeline "Implement user authentication with OAuth, JWT, and 2FA"
$csv-wave-pipeline -c 4 "Refactor payment module with Stripe and PayPal"
$csv-wave-pipeline -y "Build notification system with email and SMS"
$csv-wave-pipeline --continue "auth-20260228"
```

**Flags**:
- `-y, --yes`: Skip all confirmations (auto mode)
- `-c, --concurrency N`: Max concurrent agents within each wave (default: 4)
- `--continue`: Resume existing session

**Output Directory**: `.workflow/.csv-wave/{session-id}/`
**Core Output**: `tasks.csv` (master state) + `results.csv` (final) + `discoveries.ndjson` (shared exploration) + `context.md` (human-readable report)

---

## Overview

Wave-based batch execution using `spawn_agents_on_csv` with **cross-wave context propagation**. Tasks are grouped into dependency waves; each wave executes concurrently, and its results feed into the next wave.

**Core workflow**: Decompose → Compute Waves → Execute Wave-by-Wave → Aggregate

```
┌─────────────────────────────────────────────────────────────────────────┐
│                      CSV BATCH EXECUTION WORKFLOW                       │
├─────────────────────────────────────────────────────────────────────────┤
│                                                                         │
│  Phase 1: Requirement → CSV                                             │
│  ├─ Parse requirement into subtasks (3-10 tasks)                        │
│  ├─ Identify dependencies (deps column)                                 │
│  ├─ Compute dependency waves (topological sort → depth grouping)        │
│  ├─ Generate tasks.csv with wave column                                 │
│  └─ User validates task breakdown (skip if -y)                          │
│                                                                         │
│  Phase 2: Wave Execution Engine                                         │
│  ├─ For each wave (1..N):                                               │
│  │   ├─ Build wave CSV (filter rows for this wave)                      │
│  │   ├─ Inject previous wave findings into prev_context column          │
│  │   ├─ spawn_agents_on_csv(wave CSV)                                   │
│  │   ├─ Collect results, merge into master tasks.csv                    │
│  │   └─ Check: any failed? → skip dependents or retry                   │
│  └─ discoveries.ndjson shared across all waves (append-only)            │
│                                                                         │
│  Phase 3: Results Aggregation                                           │
│  ├─ Export final results.csv                                            │
│  ├─ Generate context.md with all findings                               │
│  ├─ Display summary: completed/failed/skipped per wave                  │
│  └─ Offer: view results | retry failed | done                           │
│                                                                         │
└─────────────────────────────────────────────────────────────────────────┘
```

---

## CSV Schema

### tasks.csv (Master State)

```csv
id,title,description,deps,context_from,wave,status,findings,files_modified,error
1,Setup auth module,Create auth directory structure and base files,,,1,,,,
2,Implement OAuth,Add OAuth provider integration with Google and GitHub,1,1,2,,,,
3,Add JWT tokens,Implement JWT generation and validation,1,1,2,,,,
4,Setup 2FA,Add TOTP-based 2FA with QR code generation,2;3,1;2;3,3,,,,
```

**Columns**:

| Column | Phase | Description |
|--------|-------|-------------|
| `id` | Input | Unique task identifier (string) |
| `title` | Input | Short task title |
| `description` | Input | Detailed task description |
| `deps` | Input | Semicolon-separated dependency task IDs (empty = no deps) |
| `context_from` | Input | Semicolon-separated task IDs whose findings this task needs |
| `wave` | Computed | Wave number (computed by topological sort, 1-based) |
| `status` | Output | `pending` → `completed` / `failed` / `skipped` |
| `findings` | Output | Key discoveries or implementation notes (max 500 chars) |
| `files_modified` | Output | Semicolon-separated file paths |
| `error` | Output | Error message if failed (empty if success) |

### Per-Wave CSV (Temporary)

Each wave generates a temporary `wave-{N}.csv` with an extra `prev_context` column:

```csv
id,title,description,deps,context_from,wave,prev_context
2,Implement OAuth,Add OAuth integration,1,1,2,"[Task 1] Created auth/ with index.ts and types.ts"
3,Add JWT tokens,Implement JWT,1,1,2,"[Task 1] Created auth/ with index.ts and types.ts"
```

The `prev_context` column is built from `context_from` by looking up completed tasks' `findings` in the master CSV.
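As a minimal sketch of that lookup (hypothetical helper, not part of the skill; assumes rows have already been parsed into plain objects keyed by the tasks.csv header):

```javascript
// Sketch: build the prev_context cell for one wave row from the master rows.
// Only completed tasks with non-empty findings contribute context.
function buildPrevContext(row, masterRows) {
  return row.context_from.split(';').filter(Boolean)
    .map(id => masterRows.find(r => r.id === id))
    .filter(r => r && r.status === 'completed' && r.findings)
    .map(r => `[Task ${r.id}] ${r.findings}`)
    .join('\n') || 'No previous context available'
}
```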

---

## Output Artifacts

| File | Purpose | Lifecycle |
|------|---------|-----------|
| `tasks.csv` | Master state — all tasks with status/findings | Updated after each wave |
| `wave-{N}.csv` | Per-wave input (temporary) | Created before wave, deleted after |
| `results.csv` | Final export of all task results | Created in Phase 3 |
| `discoveries.ndjson` | Shared exploration board across all agents | Append-only, carries across waves |
| `context.md` | Human-readable execution report | Created in Phase 3 |

---

## Session Structure

```
.workflow/.csv-wave/{session-id}/
├── tasks.csv            # Master state (updated per wave)
├── results.csv          # Final results export
├── discoveries.ndjson   # Shared discovery board (all agents)
├── context.md           # Human-readable report
└── wave-{N}.csv         # Temporary per-wave input (cleaned up)
```

---

## Implementation

### Session Initialization

```javascript
const getUtc8ISOString = () => new Date(Date.now() + 8 * 60 * 60 * 1000).toISOString()

// Parse flags
const AUTO_YES = $ARGUMENTS.includes('--yes') || $ARGUMENTS.includes('-y')
const continueMode = $ARGUMENTS.includes('--continue')
const concurrencyMatch = $ARGUMENTS.match(/(?:--concurrency|-c)\s+(\d+)/)
const maxConcurrency = concurrencyMatch ? parseInt(concurrencyMatch[1], 10) : 4

// Clean requirement text (remove flags)
const requirement = $ARGUMENTS
  .replace(/--yes|-y|--continue|--concurrency\s+\d+|-c\s+\d+/g, '')
  .trim()

const slug = requirement.toLowerCase()
  .replace(/[^a-z0-9\u4e00-\u9fa5]+/g, '-')
  .substring(0, 40)
const dateStr = getUtc8ISOString().substring(0, 10).replace(/-/g, '')
// let (not const): continue mode may point these at an existing session
let sessionId = `cwp-${slug}-${dateStr}`
let sessionFolder = `.workflow/.csv-wave/${sessionId}`

// Continue mode: find existing session
if (continueMode) {
  const existing = Bash(`ls -t .workflow/.csv-wave/ 2>/dev/null | head -1`).trim()
  if (existing) {
    sessionId = existing
    sessionFolder = `.workflow/.csv-wave/${sessionId}`
    // Read existing tasks.csv, find incomplete waves, resume from there
    const existingCsv = Read(`${sessionFolder}/tasks.csv`)
    // → jump to Phase 2 with remaining waves
  }
}

Bash(`mkdir -p ${sessionFolder}`)
```

---

### Phase 1: Requirement → CSV

**Objective**: Decompose requirement into tasks, compute dependency waves, generate tasks.csv.

**Steps**:

1. **Decompose Requirement**

```javascript
// Use ccw cli to decompose requirement into subtasks
Bash({
  command: `ccw cli -p "PURPOSE: Decompose requirement into 3-10 atomic tasks for batch agent execution.
TASK:
• Parse requirement into independent subtasks
• Identify dependencies between tasks (which must complete before others)
• Identify context flow (which tasks need previous tasks' findings)
• Each task must be executable by a single agent with file read/write access
MODE: analysis
CONTEXT: @**/*
EXPECTED: JSON object with tasks array. Each task: {id: string, title: string, description: string, deps: string[], context_from: string[]}. deps = task IDs that must complete first. context_from = task IDs whose findings are needed.
CONSTRAINTS: 3-10 tasks | Each task is atomic | No circular deps | description must be specific enough for an agent to execute independently

REQUIREMENT: ${requirement}" --tool gemini --mode analysis --rule planning-breakdown-task-steps`,
  run_in_background: true
})
// Wait for CLI completion via hook callback
// Parse JSON from CLI output → decomposedTasks[]
```

2. **Compute Waves** (Topological Sort → Depth Grouping)

```javascript
function computeWaves(tasks) {
  // Build adjacency: task.deps → predecessors
  const taskMap = new Map(tasks.map(t => [t.id, t]))
  const inDegree = new Map(tasks.map(t => [t.id, 0]))
  const adjList = new Map(tasks.map(t => [t.id, []]))

  for (const task of tasks) {
    for (const dep of task.deps) {
      if (taskMap.has(dep)) {
        adjList.get(dep).push(task.id)
        inDegree.set(task.id, inDegree.get(task.id) + 1)
      }
    }
  }

  // BFS-based topological sort with depth tracking
  const queue = [] // [taskId, depth]
  const waveAssignment = new Map()

  for (const [id, deg] of inDegree) {
    if (deg === 0) {
      queue.push([id, 1])
      waveAssignment.set(id, 1)
    }
  }

  let maxWave = 1
  let idx = 0
  while (idx < queue.length) {
    const [current, depth] = queue[idx++]
    for (const next of adjList.get(current)) {
      const newDeg = inDegree.get(next) - 1
      inDegree.set(next, newDeg)
      const nextDepth = Math.max(waveAssignment.get(next) || 0, depth + 1)
      waveAssignment.set(next, nextDepth)
      if (newDeg === 0) {
        queue.push([next, nextDepth])
        maxWave = Math.max(maxWave, nextDepth)
      }
    }
  }

  // Detect cycles: any task without wave assignment
  for (const task of tasks) {
    if (!waveAssignment.has(task.id)) {
      throw new Error(`Circular dependency detected involving task ${task.id}`)
    }
  }

  return { waveAssignment, maxWave }
}

const { waveAssignment, maxWave } = computeWaves(decomposedTasks)
```

3. **Generate tasks.csv**

```javascript
const header = 'id,title,description,deps,context_from,wave,status,findings,files_modified,error'
const rows = decomposedTasks.map(task => {
  const wave = waveAssignment.get(task.id)
  return [
    task.id,
    task.title,
    task.description,
    task.deps.join(';'),
    task.context_from.join(';'),
    wave,
    'pending', // status
    '',        // findings
    '',        // files_modified
    ''         // error
  // Quote every cell and double embedded quotes — this is the single
  // point of CSV escaping (no separate pre-escape pass needed)
  ].map(cell => `"${String(cell).replace(/"/g, '""')}"`).join(',')
})

Write(`${sessionFolder}/tasks.csv`, [header, ...rows].join('\n'))
```

4. **User Validation** (skip if AUTO_YES)

```javascript
if (!AUTO_YES) {
  // Display task breakdown with wave assignment
  console.log(`\n## Task Breakdown (${decomposedTasks.length} tasks, ${maxWave} waves)\n`)
  for (let w = 1; w <= maxWave; w++) {
    const waveTasks = decomposedTasks.filter(t => waveAssignment.get(t.id) === w)
    console.log(`### Wave ${w} (${waveTasks.length} tasks, concurrent)`)
    waveTasks.forEach(t => console.log(`  - [${t.id}] ${t.title}`))
  }

  const answer = AskUserQuestion({
    questions: [{
      question: "Approve task breakdown?",
      header: "Validation",
      multiSelect: false,
      options: [
        { label: "Approve", description: "Proceed with wave execution" },
        { label: "Modify", description: `Edit ${sessionFolder}/tasks.csv manually, then --continue` },
        { label: "Cancel", description: "Abort" }
      ]
    }]
  }) // BLOCKS

  if (answer.Validation === "Modify") {
    console.log(`Edit: ${sessionFolder}/tasks.csv\nResume: $csv-wave-pipeline --continue`)
    return
  } else if (answer.Validation === "Cancel") {
    return
  }
}
```

**Success Criteria**:
- tasks.csv created with valid schema and wave assignments
- No circular dependencies
- User approved (or AUTO_YES)

---
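The wave computation above is equivalent to assigning each task 1 + the length of its longest dependency chain. A compact standalone check in plain Node.js, using the sample tasks from the CSV Schema section (a depth-first variant for brevity, not the skill's exact code):

```javascript
// Depth-first wave assignment: wave(t) = 1 + max wave over t's deps.
// Equivalent to the BFS depth-grouping above on any DAG.
function computeWaves(tasks) {
  const byId = new Map(tasks.map(t => [t.id, t]))
  const memo = new Map()
  const waveOf = (id, seen = new Set()) => {
    if (memo.has(id)) return memo.get(id)
    if (seen.has(id)) throw new Error(`Circular dependency involving task ${id}`)
    seen.add(id)
    const deps = byId.get(id).deps
    const wave = deps.length === 0 ? 1 : 1 + Math.max(...deps.map(d => waveOf(d, seen)))
    seen.delete(id)
    memo.set(id, wave)
    return wave
  }
  const waveAssignment = new Map(tasks.map(t => [t.id, waveOf(t.id)]))
  return { waveAssignment, maxWave: Math.max(...waveAssignment.values()) }
}

// Sample tasks from the schema section: 1 ← {2,3} ← 4
const sample = [
  { id: '1', deps: [] },
  { id: '2', deps: ['1'] },
  { id: '3', deps: ['1'] },
  { id: '4', deps: ['2', '3'] },
]
const { waveAssignment, maxWave } = computeWaves(sample)
// waves: 1→1, 2→2, 3→2, 4→3; maxWave 3 — matching the wave column in tasks.csv
```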

### Phase 2: Wave Execution Engine

**Objective**: Execute tasks wave-by-wave via `spawn_agents_on_csv`. Each wave sees previous waves' results.

**Steps**:

1. **Wave Loop**

```javascript
const failedIds = new Set()
const skippedIds = new Set()

for (let wave = 1; wave <= maxWave; wave++) {
  console.log(`\n## Wave ${wave}/${maxWave}\n`)

  // 1. Read current master CSV
  const masterCsv = parseCsv(Read(`${sessionFolder}/tasks.csv`))

  // 2. Filter tasks for this wave
  const waveTasks = masterCsv.filter(row => parseInt(row.wave) === wave)

  // 3. Skip tasks whose deps failed
  const executableTasks = []
  for (const task of waveTasks) {
    const deps = task.deps.split(';').filter(Boolean)
    if (deps.some(d => failedIds.has(d) || skippedIds.has(d))) {
      skippedIds.add(task.id)
      // Update master CSV: mark as skipped
      updateMasterCsvRow(sessionFolder, task.id, {
        status: 'skipped',
        error: 'Dependency failed or skipped'
      })
      console.log(`  [${task.id}] ${task.title} → SKIPPED (dependency failed)`)
      continue
    }
    executableTasks.push(task)
  }

  if (executableTasks.length === 0) {
    console.log(`  No executable tasks in wave ${wave}`)
    continue
  }

  // 4. Build prev_context for each task
  for (const task of executableTasks) {
    const contextIds = task.context_from.split(';').filter(Boolean)
    const prevFindings = contextIds
      .map(id => {
        const prevRow = masterCsv.find(r => r.id === id)
        if (prevRow && prevRow.status === 'completed' && prevRow.findings) {
          return `[Task ${id}: ${prevRow.title}] ${prevRow.findings}`
        }
        return null
      })
      .filter(Boolean)
      .join('\n')
    task.prev_context = prevFindings || 'No previous context available'
  }

  // 5. Write wave CSV
  const waveHeader = 'id,title,description,deps,context_from,wave,prev_context'
  const waveRows = executableTasks.map(t =>
    [t.id, t.title, t.description, t.deps, t.context_from, t.wave, t.prev_context]
      .map(cell => `"${String(cell).replace(/"/g, '""')}"`)
      .join(',')
  )
  Write(`${sessionFolder}/wave-${wave}.csv`, [waveHeader, ...waveRows].join('\n'))

  // 6. Execute wave
  console.log(`  Executing ${executableTasks.length} tasks (concurrency: ${maxConcurrency})...`)

  const waveResult = spawn_agents_on_csv({
    csv_path: `${sessionFolder}/wave-${wave}.csv`,
    id_column: "id",
    instruction: buildInstructionTemplate(sessionFolder, wave),
    max_concurrency: maxConcurrency,
    max_runtime_seconds: 600,
    output_csv_path: `${sessionFolder}/wave-${wave}-results.csv`,
    output_schema: {
      type: "object",
      properties: {
        id: { type: "string" },
        status: { type: "string", enum: ["completed", "failed"] },
        findings: { type: "string" },
        files_modified: { type: "array", items: { type: "string" } },
        error: { type: "string" }
      },
      required: ["id", "status", "findings"]
    }
  })
  // ↑ Blocks until all agents in this wave complete

  // 7. Merge results into master CSV
  const waveResults = parseCsv(Read(`${sessionFolder}/wave-${wave}-results.csv`))
  for (const result of waveResults) {
    updateMasterCsvRow(sessionFolder, result.id, {
      status: result.status,
      findings: result.findings || '',
      files_modified: (result.files_modified || []).join(';'),
      error: result.error || ''
    })

    if (result.status === 'failed') {
      failedIds.add(result.id)
      console.log(`  [${result.id}] ${result.title} → FAILED: ${result.error}`)
    } else {
      console.log(`  [${result.id}] ${result.title} → COMPLETED`)
    }
  }

  // 8. Cleanup temporary wave CSV
  Bash(`rm -f "${sessionFolder}/wave-${wave}.csv"`)

  console.log(`  Wave ${wave} done: ${waveResults.filter(r => r.status === 'completed').length} completed, ${waveResults.filter(r => r.status === 'failed').length} failed`)
}
```

2. **Instruction Template Builder**

```javascript
function buildInstructionTemplate(sessionFolder, wave) {
  return `
## TASK ASSIGNMENT

### MANDATORY FIRST STEPS
1. Read shared discoveries: ${sessionFolder}/discoveries.ndjson (if exists, skip if not)
2. Read project context: .workflow/project-tech.json (if exists)

---

## Your Task

**Task ID**: {id}
**Title**: {title}
**Description**: {description}

### Previous Tasks' Findings (Context)
{prev_context}

---

## Execution Protocol

1. **Read discoveries**: Load ${sessionFolder}/discoveries.ndjson for shared exploration findings
2. **Use context**: Apply previous tasks' findings from prev_context above
3. **Execute**: Implement the task as described
4. **Share discoveries**: Append exploration findings to shared board:
   \`\`\`bash
   echo '{"ts":"<ISO8601>","worker":"{id}","type":"<type>","data":{...}}' >> ${sessionFolder}/discoveries.ndjson
   \`\`\`
5. **Report result**: Return JSON via report_agent_job_result

### Discovery Types to Share
- \`code_pattern\`: {name, file, description} — reusable patterns found
- \`integration_point\`: {file, description, exports[]} — module connection points
- \`convention\`: {naming, imports, formatting} — code style conventions
- \`blocker\`: {issue, severity, impact} — blocking issues encountered

---

## Output (report_agent_job_result)

Return JSON:
{
  "id": "{id}",
  "status": "completed" | "failed",
  "findings": "Key discoveries and implementation notes (max 500 chars)",
  "files_modified": ["path1", "path2"],
  "error": ""
}
`
}
```

3. **Master CSV Update Helper**

```javascript
function updateMasterCsvRow(sessionFolder, taskId, updates) {
  const csvPath = `${sessionFolder}/tasks.csv`
  const content = Read(csvPath)
  const lines = content.split('\n')
  const header = lines[0].split(',')

  for (let i = 1; i < lines.length; i++) {
    const cells = parseCsvLine(lines[i])
    if (cells[0] === taskId || cells[0] === `"${taskId}"`) {
      // Update specified columns
      for (const [col, val] of Object.entries(updates)) {
        const colIdx = header.indexOf(col)
        if (colIdx >= 0) {
          cells[colIdx] = `"${String(val).replace(/"/g, '""')}"`
        }
      }
      lines[i] = cells.join(',')
      break
    }
  }

  Write(csvPath, lines.join('\n'))
}
```

**Success Criteria**:
- All waves executed in order
- Each wave's results merged into master CSV before next wave starts
- Dependent tasks skipped when predecessor failed
- discoveries.ndjson accumulated across all waves

---
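The wave loop and update helper lean on `parseCsv` and `parseCsvLine`, which the skill references without defining. A minimal quote-aware sketch of what they might look like (hypothetical; a production skill could use a real CSV library instead):

```javascript
// Split one CSV line into cells, honoring quoted fields and "" escapes.
function parseCsvLine(line) {
  const cells = []
  let cur = '', inQuotes = false
  for (let i = 0; i < line.length; i++) {
    const ch = line[i]
    if (inQuotes) {
      if (ch === '"' && line[i + 1] === '"') { cur += '"'; i++ } // escaped quote
      else if (ch === '"') inQuotes = false                      // closing quote
      else cur += ch
    } else if (ch === '"') inQuotes = true
    else if (ch === ',') { cells.push(cur); cur = '' }
    else cur += ch
  }
  cells.push(cur)
  return cells
}

// Parse a whole CSV string into row objects keyed by the header names.
function parseCsv(content) {
  const lines = content.split('\n').filter(l => l.trim() !== '')
  const header = parseCsvLine(lines[0])
  return lines.slice(1).map(line => {
    const cells = parseCsvLine(line)
    return Object.fromEntries(header.map((h, i) => [h, cells[i] ?? '']))
  })
}
```

Note this sketch does not handle newlines inside quoted fields, which is one reason `findings` is capped at a short single-line string.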

### Phase 3: Results Aggregation

**Objective**: Generate final results and human-readable report.

**Steps**:

1. **Export results.csv**

```javascript
const masterCsv = Read(`${sessionFolder}/tasks.csv`)
// results.csv = master CSV (already has all results populated)
Write(`${sessionFolder}/results.csv`, masterCsv)
```

2. **Generate context.md**

```javascript
const tasks = parseCsv(masterCsv)
const completed = tasks.filter(t => t.status === 'completed')
const failed = tasks.filter(t => t.status === 'failed')
const skipped = tasks.filter(t => t.status === 'skipped')

const contextContent = `# CSV Batch Execution Report

**Session**: ${sessionId}
**Requirement**: ${requirement}
**Completed**: ${getUtc8ISOString()}
**Waves**: ${maxWave} | **Concurrency**: ${maxConcurrency}

---

## Summary

| Metric | Count |
|--------|-------|
| Total Tasks | ${tasks.length} |
| Completed | ${completed.length} |
| Failed | ${failed.length} |
| Skipped | ${skipped.length} |

---

## Wave Execution

${Array.from({ length: maxWave }, (_, i) => i + 1).map(w => {
  const waveTasks = tasks.filter(t => parseInt(t.wave) === w)
  return `### Wave ${w}
${waveTasks.map(t => `- **[${t.id}] ${t.title}**: ${t.status}${t.error ? ' — ' + t.error : ''}
  ${t.findings ? 'Findings: ' + t.findings : ''}`).join('\n')}`
}).join('\n\n')}

---

## Task Details

${tasks.map(t => `### ${t.id}: ${t.title}

| Field | Value |
|-------|-------|
| Status | ${t.status} |
| Wave | ${t.wave} |
| Dependencies | ${t.deps || 'none'} |
| Context From | ${t.context_from || 'none'} |
| Error | ${t.error || 'none'} |

**Findings**: ${t.findings || 'N/A'}

**Files Modified**: ${t.files_modified || 'none'}
`).join('\n---\n')}

---

## All Modified Files

${[...new Set(tasks.flatMap(t => (t.files_modified || '').split(';')).filter(Boolean))].map(f => '- ' + f).join('\n') || 'None'}
`

Write(`${sessionFolder}/context.md`, contextContent)
```

3. **Display Summary**

```javascript
console.log(`
## Execution Complete

- **Session**: ${sessionId}
- **Waves**: ${maxWave}
- **Completed**: ${completed.length}/${tasks.length}
- **Failed**: ${failed.length}
- **Skipped**: ${skipped.length}

**Results**: ${sessionFolder}/results.csv
**Report**: ${sessionFolder}/context.md
**Discoveries**: ${sessionFolder}/discoveries.ndjson
`)
```
|
|
638
|
+
4. **Offer Next Steps** (skip if AUTO_YES)
|
|
639
|
+
|
|
640
|
+
```javascript
|
|
641
|
+
if (!AUTO_YES && failed.length > 0) {
|
|
642
|
+
const answer = AskUserQuestion({
|
|
643
|
+
questions: [{
|
|
644
|
+
question: `${failed.length} tasks failed. Next action?`,
|
|
645
|
+
header: "Next Step",
|
|
646
|
+
multiSelect: false,
|
|
647
|
+
options: [
|
|
648
|
+
{ label: "Retry Failed", description: `Re-execute ${failed.length} failed tasks with updated context` },
|
|
649
|
+
{ label: "View Report", description: "Display context.md" },
|
|
650
|
+
{ label: "Done", description: "Complete session" }
|
|
651
|
+
]
|
|
652
|
+
}]
|
|
653
|
+
}) // BLOCKS
|
|
654
|
+
|
|
655
|
+
if (answer['Next Step'] === "Retry Failed") {
|
|
656
|
+
// Reset failed tasks to pending, re-run Phase 2 for their waves
|
|
657
|
+
for (const task of failed) {
|
|
658
|
+
updateMasterCsvRow(sessionFolder, task.id, { status: 'pending', error: '' })
|
|
659
|
+
}
|
|
660
|
+
// Also reset skipped tasks whose deps are now retrying
|
|
661
|
+
for (const task of skipped) {
|
|
662
|
+
updateMasterCsvRow(sessionFolder, task.id, { status: 'pending', error: '' })
|
|
663
|
+
}
|
|
664
|
+
// Re-execute Phase 2 (loop will skip already-completed tasks)
|
|
665
|
+
// → goto Phase 2
|
|
666
|
+
} else if (answer['Next Step'] === "View Report") {
|
|
667
|
+
console.log(Read(`${sessionFolder}/context.md`))
|
|
668
|
+
}
|
|
669
|
+
}
|
|
670
|
+
```
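
`updateMasterCsvRow` is the skill's CSV helper; its effect during a retry can be sketched in memory with hypothetical row objects:

```javascript
// In-memory sketch of the reset step above. In the skill, updateMasterCsvRow
// persists the same change to the master tasks.csv (helper assumed, not shown).
function resetForRetry(rows) {
  for (const row of rows) {
    if (row.status === 'failed' || row.status === 'skipped') {
      row.status = 'pending'
      row.error = ''
    }
  }
}

const rows = [
  { id: 'T1', status: 'completed', error: '' },
  { id: 'T2', status: 'failed', error: 'tests red' },
  { id: 'T3', status: 'skipped', error: 'dep T2 failed' },
]
resetForRetry(rows) // T2 and T3 become pending; T1 is untouched
```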

**Success Criteria**:
- results.csv exported
- context.md generated
- Summary displayed to user

---

## Shared Discovery Board Protocol

All agents across all waves share `discoveries.ndjson`. This eliminates redundant codebase exploration.

**Lifecycle**:
- Created by the first agent to write a discovery
- Carries over across waves — never cleared
- Agents append via `echo '...' >> discoveries.ndjson`

**Format**: NDJSON; each line is a self-contained JSON object:

```jsonl
{"ts":"2026-02-28T10:00:00+08:00","worker":"1","type":"code_pattern","data":{"name":"repository-pattern","file":"src/repos/Base.ts","description":"Abstract CRUD repository"}}
{"ts":"2026-02-28T10:01:00+08:00","worker":"2","type":"integration_point","data":{"file":"src/auth/index.ts","description":"Auth module entry","exports":["authenticate","authorize"]}}
```

**Discovery Types**:

| type | Dedup Key | Description |
|------|-----------|-------------|
| `code_pattern` | `data.name` | Reusable code pattern found |
| `integration_point` | `data.file` | Module connection point |
| `convention` | singleton | Code style conventions |
| `blocker` | `data.issue` | Blocking issue encountered |
| `tech_stack` | singleton | Project technology stack |
| `test_command` | singleton | Test commands discovered |

**Protocol Rules**:
1. Read board before own exploration → skip covered areas
2. Write discoveries immediately via `echo >>` → don't batch
3. Deduplicate — check existing entries; skip if same type + dedup key exists
4. Append-only — never modify or delete existing lines
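
Rules 1 and 3 (plus the corrupt-file tolerance described under Error Handling) can be sketched as follows; the file contents are modeled as a string here, and the dedup-key map mirrors the table above:

```javascript
// Dedup keys per discovery type; types absent here are singletons.
const DEDUP_KEY = { code_pattern: 'name', integration_point: 'file', blocker: 'issue' }

// Parse the board tolerantly: malformed lines are ignored, not fatal.
function readBoard(ndjson) {
  return ndjson.split('\n').flatMap(line => {
    try { return line.trim() ? [JSON.parse(line)] : [] } catch { return [] }
  })
}

// Rule 3: skip a candidate if an entry with the same type + dedup key exists.
function isDuplicate(entries, candidate) {
  const key = DEDUP_KEY[candidate.type]
  return entries.some(e => e.type === candidate.type &&
    (key ? e.data[key] === candidate.data[key] : true)) // singleton: any same-type entry wins
}

const board = readBoard(
  '{"type":"code_pattern","data":{"name":"repository-pattern"}}\n' +
  'not json - corrupt line\n' +
  '{"type":"tech_stack","data":{"stack":"node"}}\n'
)
```

An agent would run this check before its `echo '...' >> discoveries.ndjson` append.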

---

## Wave Computation Details

### Algorithm

Kahn's BFS topological sort with depth tracking:

```
Input: tasks[] with deps[]
Output: waveAssignment (taskId → wave number)

1. Build in-degree map and adjacency list from deps
2. Enqueue all tasks with in-degree 0 at wave 1
3. BFS: for each dequeued task at wave W:
   - For each dependent task D:
     - Decrement D's in-degree
     - D.wave = max(D.wave, W + 1)
     - If D's in-degree reaches 0, enqueue D
4. Any task without a wave assignment → circular dependency error
```

### Wave Properties

- **Wave 1**: No dependencies — all tasks in wave 1 are fully independent
- **Wave N**: All dependencies are in waves 1..(N-1) — guaranteed completed before wave N starts
- **Within a wave**: Tasks are independent of each other → safe for concurrent execution

### Example

```
Task A (no deps)   → Wave 1
Task B (no deps)   → Wave 1
Task C (deps: A)   → Wave 2
Task D (deps: A, B) → Wave 2
Task E (deps: C, D) → Wave 3

Execution:
Wave 1: [A, B]  ← concurrent
Wave 2: [C, D]  ← concurrent, sees A+B findings
Wave 3: [E]     ← sees A+B+C+D findings
```
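
The algorithm and example above can be reproduced with a short runnable sketch (task shape is hypothetical; the skill derives `deps` from the CSV):

```javascript
// Kahn's BFS topological sort with depth (wave) tracking, as described above.
function computeWaves(tasks) {
  const indeg = {}, dependents = {}, wave = {}
  for (const t of tasks) { indeg[t.id] = t.deps.length; dependents[t.id] = [] }
  for (const t of tasks) for (const d of t.deps) dependents[d].push(t.id)

  // Tasks with no dependencies start at wave 1.
  const queue = tasks.filter(t => indeg[t.id] === 0).map(t => t.id)
  queue.forEach(id => { wave[id] = 1 })

  while (queue.length) {
    const id = queue.shift()
    for (const dep of dependents[id]) {
      wave[dep] = Math.max(wave[dep] || 0, wave[id] + 1)
      if (--indeg[dep] === 0) queue.push(dep)
    }
  }

  // Anything left unassigned sits on a cycle.
  const unassigned = tasks.filter(t => !(t.id in wave)).map(t => t.id)
  if (unassigned.length) throw new Error('Circular dependency: ' + unassigned.join(', '))
  return wave
}

const waves = computeWaves([
  { id: 'A', deps: [] }, { id: 'B', deps: [] },
  { id: 'C', deps: ['A'] }, { id: 'D', deps: ['A', 'B'] },
  { id: 'E', deps: ['C', 'D'] },
]) // A,B → 1; C,D → 2; E → 3
```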

---

## Context Propagation Flow

```
Wave 1 agents:
├─ Execute tasks (no prev_context)
├─ Write findings to report_agent_job_result
└─ Append discoveries to discoveries.ndjson

  ↓ merge results into master CSV

Wave 2 agents:
├─ Read discoveries.ndjson (exploration sharing)
├─ Read prev_context column (wave 1 findings from context_from)
├─ Execute tasks with full upstream context
├─ Write findings to report_agent_job_result
└─ Append new discoveries to discoveries.ndjson

  ↓ merge results into master CSV

Wave 3 agents:
├─ Read discoveries.ndjson (accumulated from waves 1+2)
├─ Read prev_context column (wave 1+2 findings from context_from)
├─ Execute tasks
└─ ...
```

**Two context channels**:
1. **CSV findings** (structured): `context_from` column → `prev_context` injection — task-specific directed context
2. **NDJSON discoveries** (broadcast): `discoveries.ndjson` — general exploration findings available to all
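
Channel 1 can be sketched as a small helper that collects findings for a task from the master rows. The `;`-separated `context_from` format is an assumption here, matching the `files_modified` convention used elsewhere in this skill:

```javascript
// Build prev_context for one task from the master CSV rows' findings,
// following its context_from list (hypothetical row shapes).
function buildPrevContext(allTasks, task) {
  const ids = (task.context_from || '').split(';').filter(Boolean)
  return ids
    .map(id => allTasks.find(t => t.id === id))
    .filter(t => t && t.findings) // skip unknown ids and empty findings
    .map(t => `[${t.id}] ${t.findings}`)
    .join('\n')
}

const rows = [
  { id: 'T1', findings: 'API uses repository pattern' },
  { id: 'T2', findings: '' },
  { id: 'T3', context_from: 'T1;T2' },
]
const prev = buildPrevContext(rows, rows[2]) // only T1 contributes
```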

---

## Error Handling

| Error | Resolution |
|-------|------------|
| Circular dependency | Detected during wave computation; abort with an error message |
| Agent timeout | Mark as failed in results, continue with the wave |
| Agent failed | Mark as failed, skip dependent tasks in later waves |
| All agents in wave failed | Log error, offer retry or abort |
| CSV parse error | Validate CSV format before execution, show line number |
| discoveries.ndjson corrupt | Ignore malformed lines, continue with valid entries |
| Continue mode: no session found | List available sessions, prompt user to select |

---

## Core Rules

1. **Start Immediately**: First action is session initialization, then Phase 1
2. **Wave Order Is Sacred**: Never execute wave N before wave N-1 completes and its results are merged
3. **CSV Is Source of Truth**: The master tasks.csv holds all state — always read before a wave, always write after
4. **Context Propagation**: prev_context is built from the master CSV, not from memory
5. **Discovery Board Is Append-Only**: Never clear, modify, or recreate discoveries.ndjson
6. **Skip on Failure**: If a dependency failed, skip the dependent task (don't attempt it)
7. **Clean Up Temp Files**: Remove wave-{N}.csv after results are merged
8. **DO NOT STOP**: Continue execution until all waves complete or all remaining tasks are skipped
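
Rule 6 is transitive: a task is skipped if any dependency failed or was itself skipped. A minimal sketch, assuming tasks already carry their computed wave numbers:

```javascript
// Propagate skip status through the dependency graph, wave by wave.
// Row shape is hypothetical; in the skill this state lives in tasks.csv.
function markSkipped(tasks) {
  const status = Object.fromEntries(tasks.map(t => [t.id, t.status]))
  for (const t of [...tasks].sort((a, b) => a.wave - b.wave)) {
    const blocked = t.deps.some(d => status[d] === 'failed' || status[d] === 'skipped')
    if (blocked && status[t.id] === 'pending') status[t.id] = 'skipped'
  }
  return status
}

const out = markSkipped([
  { id: 'A', wave: 1, deps: [], status: 'failed' },
  { id: 'B', wave: 2, deps: ['A'], status: 'pending' },
  { id: 'C', wave: 3, deps: ['B'], status: 'pending' },
]) // B is skipped because A failed; C is skipped because B was skipped
```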

---

## Best Practices

1. **Task Granularity**: 3-10 tasks is optimal; too many adds overhead, too few loses the parallelism benefit
2. **Minimize Cross-Wave Deps**: The more tasks in wave 1, the more parallelism
3. **Specific Descriptions**: An agent sees only its CSV row plus prev_context — make each description self-contained
4. **Context From ≠ Deps**: `deps` is an execution-order constraint; `context_from` is information flow. A task can have `context_from` without `deps` (it just reads previous findings but doesn't require them to be done first in its wave)
5. **Concurrency Tuning**: `-c 1` for serial execution (maximum context sharing); `-c 8` for I/O-bound tasks

---

## Usage Recommendations

| Scenario | Recommended Approach |
|----------|----------------------|
| Independent parallel tasks (no deps) | `$csv-wave-pipeline -c 8` — single wave, max parallelism |
| Linear pipeline (A→B→C) | `$csv-wave-pipeline -c 1` — 3 waves, serial, full context |
| Diamond dependency (A→B,C→D) | `$csv-wave-pipeline` — 3 waves, B+C concurrent in wave 2 |
| Complex requirement, unclear tasks | Use `$roadmap-with-file` first for planning, then feed the issues here |
| Single complex task | Use `$lite-execute` instead |
|