@leclabs/agent-flow-navigator-mcp 1.5.1 → 1.6.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +81 -8
- package/catalog/workflows/bug-hunt.json +242 -0
- package/copier.js +4 -4
- package/diagram.js +12 -2
- package/engine.js +110 -5
- package/package.json +3 -3
- package/store.js +67 -0
- package/types.d.ts +27 -1
package/README.md
CHANGED
@@ -2,7 +2,7 @@
 
 [](https://npmjs.com/package/@leclabs/agent-flow-navigator-mcp)
 
-A workflow state machine MCP server that navigates agents through
+A workflow state machine MCP server that navigates agents through graph-based workflows.
 
 Navigator tracks task state and evaluates graph edges -- it tells the orchestrator _where to go next_, but doesn't drive. Think of it like a GPS: you tell it where you are and what happened, it tells you where to go.
 
@@ -145,13 +145,13 @@ Lists workflows available in the built-in catalog. No parameters.
 
 ### Key Concepts
 
-| Concept | Description
-| ----------------------- |
-| **Workflow Definition** | A
-| **Navigate 3-Mode API** | Start a workflow, get current state, or advance -- one tool, three calling patterns
-| **Write-Through** | State transitions are persisted to the task file atomically on every advance
-| **Conditional Edges** | Edges with `on` condition (passed/failed) -- retry logic is on nodes via `maxRetries`
-| **HITL Escalation** | When retries are exhausted, tasks route to end nodes with `escalation: "hitl"`
+| Concept                 | Description                                                                            |
+| ----------------------- | -------------------------------------------------------------------------------------- |
+| **Workflow Definition** | A graph blueprint describing how to execute a type of work (nodes + conditional edges) |
+| **Navigate 3-Mode API** | Start a workflow, get current state, or advance -- one tool, three calling patterns    |
+| **Write-Through**       | State transitions are persisted to the task file atomically on every advance           |
+| **Conditional Edges**   | Edges with `on` condition (passed/failed) -- retry logic is on nodes via `maxRetries`  |
+| **HITL Escalation**     | When retries are exhausted, tasks route to end nodes with `escalation: "hitl"`         |
 
 ## Workflow Definition Schema
 
@@ -163,6 +163,8 @@ Lists workflows available in the built-in catalog. No parameters.
 | `end` | Exit point with `result` (success/failure/blocked/cancelled) |
 | `task` | Executable work unit |
 | `gate` | Quality gate / review checkpoint |
+| `fork` | Fan out into parallel branches (paired with a `join`) |
+| `join` | Collect parallel branches back together |
 | `subflow` | Connector to another workflow |
 
 ### End Node Properties
@@ -193,6 +195,77 @@ Lists workflows available in the built-in catalog. No parameters.
 | `label` | Human-readable edge description |
 | `condition` | Expression for future conditional routing (informational) |
 
+### Fork/Join (Parallel Branches)
+
+Fork/join lets a workflow fan out into parallel investigation tracks that converge at a join point. Each fork must be paired 1:1 with a join, and nested forks are not supported.
+
+#### Fork Node
+
+```json
+{
+  "type": "fork",
+  "name": "Fork Investigation",
+  "join": "join_investigate",
+  "branches": {
+    "reproduce": {
+      "entryStep": "reproduce",
+      "description": "Try to trigger the bug"
+    },
+    "code_archaeology": {
+      "entryStep": "code_archaeology",
+      "description": "Trace code paths related to symptoms"
+    }
+  }
+}
+```
+
+| Property   | Description                                                    |
+| ---------- | -------------------------------------------------------------- |
+| `join`     | ID of the paired join node (required)                          |
+| `branches` | Map of branch name to `{ entryStep, description? }` (required) |
+
+Each branch's `entryStep` must reference an existing task/gate node in the workflow. The orchestrator creates a child task per branch and runs them from their entry step.
+
+#### Join Node
+
+```json
+{
+  "type": "join",
+  "name": "Join Investigation",
+  "fork": "fork_investigate",
+  "strategy": "all-pass"
+}
+```
+
+| Property   | Description                                                                                  |
+| ---------- | -------------------------------------------------------------------------------------------- |
+| `fork`     | ID of the paired fork node (required)                                                        |
+| `strategy` | `"all-pass"` (all branches must pass) or `"any-pass"` (one is enough). Default: `"all-pass"` |
+
+#### Edges
+
+Wire the fork and join like any other nodes. Branches run from their `entryStep` until they reach the join node via normal edges:
+
+```json
+{ "from": "triage", "to": "fork_investigate" },
+{ "from": "fork_investigate", "to": "reproduce" },
+{ "from": "fork_investigate", "to": "code_archaeology" },
+{ "from": "reproduce", "to": "join_investigate" },
+{ "from": "code_archaeology", "to": "join_investigate" },
+{ "from": "join_investigate", "to": "synthesize", "on": "passed" },
+{ "from": "join_investigate", "to": "hitl_inconclusive", "on": "failed" }
+```
+
+The join node supports conditional edges (`on: "passed"` / `on: "failed"`) so the workflow can route to different paths depending on whether the branches succeeded.
+
+#### Validation Rules
+
+- Fork and join must reference each other (1:1 pairing)
+- All branch `entryStep` values must reference existing nodes
+- Branch entry steps cannot be fork nodes (no nesting)
+
+See `catalog/workflows/bug-hunt.json` for a complete working example.
+
 ## Testing
 
 ```bash
package/catalog/workflows/bug-hunt.json
ADDED

@@ -0,0 +1,242 @@
+{
+  "id": "bug-hunt",
+  "name": "Bug Hunt",
+  "description": "Parallel investigation workflow for vague bug reports. Fans out into reproduction, code archaeology, and git forensics tracks, then synthesizes findings into a root cause analysis and fix.",
+  "nodes": {
+    "start": {
+      "type": "start",
+      "name": "Start",
+      "description": "Bug hunt begins"
+    },
+    "triage": {
+      "type": "task",
+      "name": "Triage Report",
+      "description": "Parse the vague report. Extract symptoms, affected area, timing, severity. Form 2-3 hypotheses to test.",
+      "agent": "Investigator",
+      "stage": "planning"
+    },
+    "fork_investigate": {
+      "type": "fork",
+      "name": "Fork Investigation",
+      "join": "join_investigate",
+      "branches": {
+        "reproduce": {
+          "entryStep": "reproduce",
+          "description": "Try to trigger the bug"
+        },
+        "code_archaeology": {
+          "entryStep": "code_archaeology",
+          "description": "Trace code paths related to symptoms"
+        },
+        "git_forensics": {
+          "entryStep": "git_forensics",
+          "description": "Check recent commits and git blame"
+        }
+      }
+    },
+    "reproduce": {
+      "type": "task",
+      "name": "Reproduce Bug",
+      "description": "Try to trigger the bug. Document exact reproduction steps, environment, and observed vs expected behavior.",
+      "agent": "Tester",
+      "stage": "investigation"
+    },
+    "code_archaeology": {
+      "type": "task",
+      "name": "Code Archaeology",
+      "description": "Trace the code paths related to the reported symptoms. Map data flow, identify suspect modules, check edge cases.",
+      "agent": "Investigator",
+      "stage": "investigation"
+    },
+    "git_forensics": {
+      "type": "task",
+      "name": "Git Forensics",
+      "description": "Check recent commits touching affected areas. Run git blame on suspect files. Look for correlated changes or regressions.",
+      "agent": "Investigator",
+      "stage": "investigation"
+    },
+    "join_investigate": {
+      "type": "join",
+      "name": "Join Investigation",
+      "fork": "fork_investigate",
+      "strategy": "all-pass"
+    },
+    "synthesize": {
+      "type": "task",
+      "name": "Synthesize Findings",
+      "description": "Combine findings from all investigation tracks into a root cause analysis. Identify the most likely cause, supporting evidence, and a fix strategy.",
+      "agent": "Architect",
+      "stage": "planning"
+    },
+    "write_fix": {
+      "type": "task",
+      "name": "Write Fix",
+      "description": "Implement the fix with minimal changes",
+      "agent": "Developer",
+      "stage": "development"
+    },
+    "add_regression_test": {
+      "type": "task",
+      "name": "Add Regression Test",
+      "description": "Write a test that would have caught this bug",
+      "agent": "Tester",
+      "stage": "development"
+    },
+    "verify_fix": {
+      "type": "gate",
+      "name": "Verify Fix",
+      "description": "Run tests, verify fix addresses root cause",
+      "agent": "Tester",
+      "stage": "verification",
+      "maxRetries": 3
+    },
+    "lint_format": {
+      "type": "gate",
+      "name": "Lint & Format",
+      "description": "Run lint and format checks. Auto-fix issues where possible.",
+      "agent": "Developer",
+      "stage": "delivery",
+      "maxRetries": 3
+    },
+    "commit": {
+      "type": "task",
+      "name": "Commit Changes",
+      "description": "Commit the fix and regression test with a descriptive message",
+      "agent": "Developer",
+      "stage": "delivery"
+    },
+    "end_success": {
+      "type": "end",
+      "result": "success",
+      "name": "Bug Fixed",
+      "description": "Bug hunted down and fixed"
+    },
+    "hitl_inconclusive": {
+      "type": "end",
+      "result": "blocked",
+      "escalation": "hitl",
+      "name": "Inconclusive",
+      "description": "Investigation tracks couldn't determine root cause - needs human context"
+    },
+    "hitl_fix_failed": {
+      "type": "end",
+      "result": "blocked",
+      "escalation": "hitl",
+      "name": "Fix Failed",
+      "description": "Unable to fix the bug - needs human help"
+    }
+  },
+  "edges": [
+    {
+      "from": "start",
+      "to": "triage"
+    },
+    {
+      "from": "triage",
+      "to": "fork_investigate"
+    },
+    {
+      "from": "fork_investigate",
+      "to": "reproduce",
+      "label": "Branch: reproduce"
+    },
+    {
+      "from": "fork_investigate",
+      "to": "code_archaeology",
+      "label": "Branch: code archaeology"
+    },
+    {
+      "from": "fork_investigate",
+      "to": "git_forensics",
+      "label": "Branch: git forensics"
+    },
+    {
+      "from": "reproduce",
+      "to": "join_investigate"
+    },
+    {
+      "from": "code_archaeology",
+      "to": "join_investigate"
+    },
+    {
+      "from": "git_forensics",
+      "to": "join_investigate"
+    },
+    {
+      "from": "join_investigate",
+      "to": "synthesize",
+      "on": "passed",
+      "label": "All tracks complete"
+    },
+    {
+      "from": "join_investigate",
+      "to": "hitl_inconclusive",
+      "on": "failed",
+      "label": "Investigation couldn't determine root cause"
+    },
+    {
+      "from": "synthesize",
+      "to": "write_fix"
+    },
+    {
+      "from": "write_fix",
+      "to": "add_regression_test"
+    },
+    {
+      "from": "add_regression_test",
+      "to": "verify_fix"
+    },
+    {
+      "from": "verify_fix",
+      "to": "lint_format",
+      "on": "passed",
+      "label": "Fix verified, run lint checks"
+    },
+    {
+      "from": "verify_fix",
+      "to": "write_fix",
+      "on": "failed",
+      "label": "Fix didn't work, try again"
+    },
+    {
+      "from": "verify_fix",
+      "to": "hitl_fix_failed",
+      "on": "failed",
+      "label": "Cannot fix the bug"
+    },
+    {
+      "from": "lint_format",
+      "to": "commit",
+      "on": "passed",
+      "label": "Lint passes, commit changes"
+    },
+    {
+      "from": "lint_format",
+      "to": "write_fix",
+      "on": "failed",
+      "label": "Fix lint/format issues"
+    },
+    {
+      "from": "lint_format",
+      "to": "hitl_fix_failed",
+      "on": "failed",
+      "label": "Lint issues persist"
+    },
+    {
+      "from": "commit",
+      "to": "end_success"
+    },
+    {
+      "from": "hitl_inconclusive",
+      "to": "triage",
+      "on": "passed",
+      "label": "Human provided context, re-triage"
+    },
+    {
+      "from": "hitl_fix_failed",
+      "to": "write_fix",
+      "on": "passed",
+      "label": "Human resolved issue, resume fix"
+    }
+  ]
+}
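The validation rules the README section above lists (1:1 fork/join pairing, existing entry steps, no nested forks) can be checked by a small standalone script. This is an illustrative sketch applied to a fragment shaped like bug-hunt.json; the package's real checks live in `store.js`:

```javascript
// Minimal structural check mirroring the documented fork/join rules.
// Illustrative only; not the package's validateWorkflow implementation.
function checkForkJoin(nodes) {
  const forks = Object.entries(nodes).filter(([, n]) => n.type === "fork");
  const joins = Object.entries(nodes).filter(([, n]) => n.type === "join");
  for (const [forkId, fork] of forks) {
    if (nodes[fork.join]?.type !== "join") return false; // paired join must exist
    if (nodes[fork.join].fork !== forkId) return false;  // 1:1 pairing
    for (const branch of Object.values(fork.branches || {})) {
      const entry = nodes[branch.entryStep];
      if (!entry) return false;            // entry step must reference a real node
      if (entry.type === "fork") return false; // nested forks are not supported
    }
  }
  return joins.every(([, j]) => nodes[j.fork]?.type === "fork");
}

// Tiny workflow fragment shaped like catalog/workflows/bug-hunt.json
const nodes = {
  fork_investigate: {
    type: "fork",
    join: "join_investigate",
    branches: { reproduce: { entryStep: "reproduce" } },
  },
  reproduce: { type: "task" },
  join_investigate: { type: "join", fork: "fork_investigate" },
};

console.log(checkForkJoin(nodes)); // true
```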
package/copier.js
CHANGED
@@ -12,7 +12,7 @@
 export function generateFlowReadme() {
   return `# Flow Plugin
 
-
+Graph-based workflow orchestration for AI agents.
 
 ## Quick Start
 
@@ -65,7 +65,7 @@ This directory contains workflow definitions for the flow plugin.
 .flow/workflows/
 ├── README.md              # This file
 └── {workflow}/
-    └── workflow.json      #
+    └── workflow.json      # Workflow definition (steps + edges)
 \`\`\`
 
 ## How Step Instructions Work
 
@@ -105,11 +105,11 @@ To add custom instructions for a step, add an \`instructions\` field to the node
 
 ### workflow.json
 
-Defines the workflow
+Defines the workflow graph:
 - \`nodes\`: Map of node definitions (type, name, description, agent, maxRetries)
 - \`edges\`: Array of transitions between nodes (from, to, on, label)
 
-Do NOT edit edge logic unless you understand
+Do NOT edit edge logic unless you understand workflow flow control.
 
 ## Adding Custom Workflows
 
package/diagram.js
CHANGED
@@ -58,6 +58,8 @@ export function generateDiagram(workflowDef, currentStep = null) {
       lines.push(`  ${stepId}[["${label}"]]`);
     } else if (termType === "hitl" || termType === "failure") {
       lines.push(`  ${stepId}{{"${label}"}}`);
+    } else if (step.type === "fork" || step.type === "join") {
+      lines.push(`  ${stepId}{{{"${label}"}}}`);
     } else if (step.type === "gate") {
       lines.push(`  ${stepId}{"${label}"}`);
     } else {
@@ -82,6 +84,7 @@ export function generateDiagram(workflowDef, currentStep = null) {
   lines.push("  classDef successStep fill:#87CEEB,stroke:#4169E1");
   lines.push("  classDef hitlStep fill:#FFB6C1,stroke:#DC143C");
   lines.push("  classDef gateStep fill:#E6E6FA,stroke:#9370DB");
+  lines.push("  classDef forkJoinStep fill:#FFEAA7,stroke:#FDCB6E");
   lines.push("  classDef currentStep fill:#FFD700,stroke:#FF8C00,stroke-width:3px");
 
   const startSteps = Object.entries(nodes)
@@ -97,10 +100,15 @@ export function generateDiagram(workflowDef, currentStep = null) {
     .filter(([, s]) => s.type === "gate")
     .map(([id]) => id);
 
+  const forkJoinSteps = Object.entries(nodes)
+    .filter(([, s]) => s.type === "fork" || s.type === "join")
+    .map(([id]) => id);
+
   if (startSteps.length) lines.push(`  class ${startSteps.join(",")} startStep`);
   if (successSteps.length) lines.push(`  class ${successSteps.join(",")} successStep`);
   if (hitlSteps.length) lines.push(`  class ${hitlSteps.join(",")} hitlStep`);
   if (gateSteps.length) lines.push(`  class ${gateSteps.join(",")} gateStep`);
+  if (forkJoinSteps.length) lines.push(`  class ${forkJoinSteps.join(",")} forkJoinStep`);
 
   if (currentStep && nodes[currentStep]) {
     lines.push(`  class ${currentStep} currentStep`);
@@ -111,8 +119,10 @@ export function generateDiagram(workflowDef, currentStep = null) {
   tableRows.push("| Stage | Step | Name | Agent | Instructions |");
   tableRows.push("|-------|------|------|-------|--------------|");
 
-  // Group steps by stage for organized display - filter out terminal/start/end nodes
-  const stepEntries = Object.entries(nodes).filter(
+  // Group steps by stage for organized display - filter out terminal/start/end and fork/join nodes
+  const stepEntries = Object.entries(nodes).filter(
+    ([, step]) => !isTerminalNode(step) && step.type !== "fork" && step.type !== "join"
+  );
 
   for (const [stepId, step] of stepEntries) {
     const stage = step.stage || "-";
package/engine.js
CHANGED
@@ -1,7 +1,7 @@
 /**
  * Workflow Engine
  *
- * Evaluates
+ * Evaluates graph transitions based on node outputs and retry state.
  *
  * Schema:
  * - nodes: { [id]: { type, name, maxRetries?, ... } }
@@ -14,7 +14,7 @@
  */
 
 import { existsSync, readFileSync, writeFileSync } from "fs";
-import { join } from "path";
+import { basename, join } from "path";
 
 /**
  * Read and parse a task file
@@ -32,12 +32,21 @@ export function readTaskFile(taskFilePath) {
 
 /**
  * Check if a node is a terminal node (start or end)
+ * Fork/join are control-flow nodes, not terminal.
 */
 export function isTerminalNode(node) {
   if (!node) return false;
   return node.type === "start" || node.type === "end";
 }
 
+/**
+ * Check if a node is a fork or join control-flow node
+ */
+export function isForkJoinNode(node) {
+  if (!node) return false;
+  return node.type === "fork" || node.type === "join";
+}
+
 /**
  * Get terminal type for a node
  * Returns: "start" | "success" | "hitl" | "failure" | null
@@ -67,6 +76,7 @@ export function toSubagentRef(agentId) {
 const WORKFLOW_EMOJIS = {
   "feature-development": "✨",
   "bug-fix": "🐛",
+  "bug-hunt": "🔍",
   "agile-task": "📋",
   "context-optimization": "🔧",
   "quick-task": "⚡",
@@ -253,6 +263,47 @@ ${description || "{task description}"}`;
   return result;
 }
 
+/**
+ * Build response for fork/join control-flow nodes.
+ * These nodes are declarative — the engine returns metadata for the orchestrator to act on.
+ */
+function buildForkJoinResponse(workflowType, stepId, stepDef, action, retryCount = 0, sourceRoot = null) {
+  const base = {
+    currentStep: stepId,
+    stage: null,
+    subagent: null,
+    stepInstructions: null,
+    terminal: null,
+    action,
+    retriesIncremented: false,
+    autonomyContinued: false,
+    maxRetries: 0,
+    orchestratorInstructions: null,
+    metadata: {
+      workflowType,
+      currentStep: stepId,
+      retryCount,
+    },
+    sourceRoot,
+  };
+
+  if (stepDef.type === "fork") {
+    base.action = "fork";
+    base.fork = {
+      branches: stepDef.branches,
+      joinStep: stepDef.join,
+    };
+  } else if (stepDef.type === "join") {
+    base.action = "join";
+    base.join = {
+      forkStep: stepDef.fork,
+      strategy: stepDef.strategy || "all-pass",
+    };
+  }
+
+  return base;
+}
+
 /**
  * Build unified response shape for Navigate
  * Minimal output: only what Orchestrator needs for control flow and delegation
@@ -551,6 +602,11 @@ export class WorkflowEngine {
       throw new Error(`First step '${firstEdge.to}' not found in workflow`);
     }
 
+    // Fork/join at start position — return control-flow response
+    if (isForkJoinNode(firstStepDef)) {
+      return buildForkJoinResponse(workflowType, firstEdge.to, firstStepDef, "start", 0, sourceRoot);
+    }
+
    return buildNavigateResponse(
      workflowType,
      firstEdge.to,
@@ -572,6 +628,11 @@ export class WorkflowEngine {
      throw new Error(`Step '${currentStep}' not found in workflow '${workflowType}'`);
    }
 
+    // Fork/join — return control-flow response
+    if (isForkJoinNode(stepDef)) {
+      return buildForkJoinResponse(workflowType, currentStep, stepDef, "current", retryCount, sourceRoot);
+    }
+
    return buildNavigateResponse(
      workflowType,
      currentStep,
@@ -602,6 +663,42 @@ export class WorkflowEngine {
      throw new Error(`Next step '${evaluation.nextStep}' not found in workflow`);
    }
 
+    // Fork/join on advance — return control-flow response
+    if (isForkJoinNode(nextStepDef)) {
+      const forkJoinResponse = buildForkJoinResponse(
+        workflowType,
+        evaluation.nextStep,
+        nextStepDef,
+        nextStepDef.type, // "fork" or "join"
+        0,
+        sourceRoot
+      );
+
+      // Write-through for fork/join advance
+      if (taskFilePath) {
+        const task = readTaskFile(taskFilePath);
+        if (task) {
+          const filenameId = basename(taskFilePath, ".json");
+          if (task.id !== filenameId) {
+            console.error(
+              `[Navigator] WARNING: task.id "${task.id}" does not match filename "${filenameId}.json" — auto-correcting`
+            );
+          }
+          const canonicalId = filenameId;
+          task.metadata = { ...task.metadata, ...forkJoinResponse.metadata };
+          task.id = canonicalId;
+          writeFileSync(taskFilePath, JSON.stringify(task, null, 2));
+        }
+      }
+
+      // Persist autonomy in metadata
+      if (autonomy) {
+        forkJoinResponse.metadata.autonomy = true;
+      }
+
+      return forkJoinResponse;
+    }
+
    // Determine action and whether retries incremented
    const currentStepDef = nodes[currentStep];
    const isHitlResume = getTerminalType(currentStepDef) === "hitl";
@@ -672,11 +769,19 @@ export class WorkflowEngine {
    if (taskFilePath) {
      const task = readTaskFile(taskFilePath);
      if (task) {
-
+        // Derive canonical ID from filename — this is the source of truth.
+        // Auto-corrects if task.id was corrupted by an external write.
+        const filenameId = basename(taskFilePath, ".json");
+        if (task.id !== filenameId) {
+          console.error(
+            `[Navigator] WARNING: task.id "${task.id}" does not match filename "${filenameId}.json" — auto-correcting`
+          );
+        }
+        const canonicalId = filenameId;
        const userDesc = task.metadata?.userDescription || "";
        task.metadata = { ...task.metadata, ...response.metadata };
        task.subject = buildTaskSubject(
-
+          canonicalId,
          userDesc,
          response.metadata.workflowType,
          response.currentStep,
@@ -693,7 +798,7 @@ export class WorkflowEngine {
        if (response.orchestratorInstructions) {
          task.description = response.orchestratorInstructions;
        }
-        task.id =
+        task.id = canonicalId;
        writeFileSync(taskFilePath, JSON.stringify(task, null, 2));
      }
    }
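The engine change above is declarative: on hitting a fork or join it returns an `action` plus a `fork`/`join` payload instead of delegating a step. A hedged sketch of how an orchestrator might consume that response; the dispatch function below is hypothetical, and only the response fields (`action`, `fork.branches`, `fork.joinStep`, `join.forkStep`, `join.strategy`) come from the diff:

```javascript
// Hypothetical orchestrator-side dispatch over the Navigator response.
// Only the response shape is taken from engine.js; planNextMoves is a sketch.
function planNextMoves(response) {
  if (response.action === "fork") {
    // One child task per branch, each starting at its entry step.
    return Object.entries(response.fork.branches).map(([name, b]) => ({
      branch: name,
      startAt: b.entryStep,
    }));
  }
  if (response.action === "join") {
    // Wait for the paired fork's branches, then evaluate per strategy.
    return [{ waitFor: response.join.forkStep, strategy: response.join.strategy }];
  }
  // Ordinary task/gate: delegate the current step as before.
  return [{ delegate: response.currentStep }];
}

const forkResponse = {
  action: "fork",
  currentStep: "fork_investigate",
  fork: {
    joinStep: "join_investigate",
    branches: {
      reproduce: { entryStep: "reproduce" },
      git_forensics: { entryStep: "git_forensics" },
    },
  },
};

console.log(planNextMoves(forkResponse).length); // 2
```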
package/package.json
CHANGED
@@ -1,7 +1,7 @@
 {
   "name": "@leclabs/agent-flow-navigator-mcp",
-  "version": "1.
-  "description": "MCP server that navigates agents through
+  "version": "1.6.0",
+  "description": "MCP server that navigates agents through graph-based workflows",
   "license": "MIT",
   "author": "leclabs",
   "homepage": "https://github.com/leclabs/agent-toolkit",
@@ -15,7 +15,7 @@
   "type": "module",
   "scripts": {
     "start": "node index.js",
-    "test": "node --test engine.test.js diagram.test.js store.test.js dialog.test.js copier.test.js catalog.test.js refactor-workflow.test.js build-review-workflow.test.js"
+    "test": "node --test engine.test.js diagram.test.js store.test.js dialog.test.js copier.test.js catalog.test.js refactor-workflow.test.js build-review-workflow.test.js bug-hunt-workflow.test.js"
   },
   "keywords": [
     "mcp",
package/store.js
CHANGED
@@ -20,6 +20,73 @@ export function validateWorkflow(id, content) {
     console.error(`Invalid workflow ${id}: missing 'edges' array`);
     return false;
   }
+
+  // Validate fork/join nodes
+  const nodes = content.nodes;
+  const forkNodes = Object.entries(nodes).filter(([, n]) => n.type === "fork");
+  const joinNodes = Object.entries(nodes).filter(([, n]) => n.type === "join");
+
+  for (const [forkId, forkDef] of forkNodes) {
+    // Fork's join field must reference an existing join node
+    if (!forkDef.join || !nodes[forkDef.join]) {
+      console.error(`Invalid workflow ${id}: fork '${forkId}' references missing join node '${forkDef.join}'`);
+      return false;
+    }
+    if (nodes[forkDef.join].type !== "join") {
+      console.error(`Invalid workflow ${id}: fork '${forkId}' join field '${forkDef.join}' is not a join node`);
+      return false;
+    }
+
+    // All branch entryStep values must reference existing nodes
+    if (!forkDef.branches || typeof forkDef.branches !== "object") {
+      console.error(`Invalid workflow ${id}: fork '${forkId}' missing branches`);
+      return false;
+    }
+    for (const [branchName, branchDef] of Object.entries(forkDef.branches)) {
+      if (!branchDef.entryStep || !nodes[branchDef.entryStep]) {
+        console.error(
+          `Invalid workflow ${id}: fork '${forkId}' branch '${branchName}' references missing node '${branchDef.entryStep}'`
+        );
+        return false;
+      }
+    }
+
+    // No nested forks in v1: branch paths must not contain fork nodes
+    for (const [branchName, branchDef] of Object.entries(forkDef.branches)) {
+      if (nodes[branchDef.entryStep]?.type === "fork") {
+        console.error(
+          `Invalid workflow ${id}: fork '${forkId}' branch '${branchName}' entry step '${branchDef.entryStep}' is a nested fork (not supported in v1)`
+        );
+        return false;
+      }
+    }
+  }
+
+  for (const [joinId, joinDef] of joinNodes) {
+    // Join's fork field must reference an existing fork node
+    if (!joinDef.fork || !nodes[joinDef.fork]) {
+      console.error(`Invalid workflow ${id}: join '${joinId}' references missing fork node '${joinDef.fork}'`);
+      return false;
+    }
+    if (nodes[joinDef.fork].type !== "fork") {
+      console.error(`Invalid workflow ${id}: join '${joinId}' fork field '${joinDef.fork}' is not a fork node`);
+      return false;
+    }
+  }
+
+  // Fork-join pairs must be matched 1:1
+  const forkToJoin = new Map(forkNodes.map(([id, def]) => [id, def.join]));
+  const joinToFork = new Map(joinNodes.map(([id, def]) => [id, def.fork]));
+
+  for (const [forkId, joinId] of forkToJoin) {
+    if (joinToFork.get(joinId) !== forkId) {
+      console.error(
+        `Invalid workflow ${id}: fork '${forkId}' → join '${joinId}' pair mismatch (join points to '${joinToFork.get(joinId)}')`
+      );
+      return false;
+    }
+  }
+
   return true;
 }
 
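The join `strategy` values that store.js validates and types.d.ts declares can be reduced to a one-line decision over branch outcomes. The reducer below is a sketch of that semantics, not code from the package; only the strategy names and the passed/failed outcomes come from the diff:

```javascript
// Sketch of join-strategy evaluation over branch results.
// "all-pass" (default): every branch must report "passed".
// "any-pass": a single "passed" branch is enough.
function evaluateJoin(strategy, branchResults) {
  const outcomes = Object.values(branchResults); // e.g. ["passed", "failed"]
  return strategy === "any-pass"
    ? outcomes.some((o) => o === "passed")
    : outcomes.every((o) => o === "passed");
}

console.log(evaluateJoin("all-pass", { reproduce: "passed", git_forensics: "failed" })); // false
console.log(evaluateJoin("any-pass", { reproduce: "passed", git_forensics: "failed" })); // true
```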
package/types.d.ts
CHANGED
@@ -13,7 +13,7 @@
 // Node Types (Discriminated Union)
 // =============================================================================
 
-export type Node = StartNode | EndNode | TaskNode | GateNode | SubflowNode;
+export type Node = StartNode | EndNode | TaskNode | GateNode | SubflowNode | ForkNode | JoinNode;
 
 export interface StartNode {
   type: "start";
@@ -81,6 +81,30 @@ export interface SubflowNode {
   inputs?: Record<string, string>;
 }
 
+/**
+ * Fork node - declares parallel branches within the same workflow.
+ * Each branch names an entry step. Links to a join node that collects results.
+ */
+export interface ForkNode {
+  type: "fork";
+  name: string;
+  description?: string;
+  branches: Record<string, { entryStep: string; description?: string }>;
+  join: string; // join node ID
+}
+
+/**
+ * Join node - collects results from parallel branches.
+ * Strategy determines how branch results are evaluated.
+ */
+export interface JoinNode {
+  type: "join";
+  name: string;
+  description?: string;
+  fork: string; // fork node ID
+  strategy?: "all-pass" | "any-pass"; // default: "all-pass"
+}
+
 export type Stage = "planning" | "development" | "verification" | "delivery";
 
 // =============================================================================
@@ -158,6 +182,8 @@ export interface NavigateResponse {
   autonomyContinued: boolean;
   maxRetries: number;
   orchestratorInstructions: string | null;
+  fork?: { branches: Record<string, { entryStep: string; description?: string }>; joinStep: string };
+  join?: { forkStep: string; strategy: string };
   metadata: {
     workflowType: string;
     currentStep: string;