@yemi33/minions 0.1.8 → 0.1.9
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +27 -0
- package/README.md +58 -0
- package/dashboard.html +106 -3
- package/dashboard.js +101 -0
- package/engine/ado.js +1 -0
- package/engine/cli.js +43 -1
- package/engine/github.js +1 -0
- package/engine/lifecycle.js +86 -0
- package/engine/scheduler.js +159 -0
- package/engine/shared.js +3 -0
- package/engine.js +152 -11
- package/package.json +1 -1
- package/playbooks/decompose.md +60 -0
- package/routing.md +1 -0
package/CHANGELOG.md
CHANGED
@@ -1,5 +1,32 @@
 # Changelog
 
+## 0.1.9 (2026-03-26)
+
+### Engine
+- engine.js
+- engine/ado.js
+- engine/cli.js
+- engine/github.js
+- engine/lifecycle.js
+- engine/scheduler.js
+- engine/shared.js
+
+### Dashboard
+- dashboard.html
+- dashboard.js
+
+### Playbooks
+- decompose.md
+
+### Documentation
+- README.md
+
+### Other
+- CLAUDE.md
+- TODO.md
+- routing.md
+- test/playwright/dashboard.spec.js
+
 ## 0.1.8 (2026-03-25)
 
 ### Engine
package/README.md
CHANGED
@@ -497,6 +497,60 @@ Engine behavior is controlled via `config.json`. Key settings:
 | `worktreeRoot` | `../worktrees` | Where git worktrees are created |
 | `idleAlertMinutes` | 15 | Alert after no dispatch for this many minutes |
 | `restartGracePeriod` | 1200000 (20min) | Grace period for agent re-attachment after engine restart |
+| `shutdownTimeout` | 300000 (5min) | Max wait for active agents during graceful shutdown (SIGTERM/SIGINT) |
+| `allowTempAgents` | false | Spawn ephemeral agents when all permanent agents are busy |
+| `autoDecompose` | true | Auto-decompose `implement:large` items into sub-tasks before dispatch |
+
+### Scheduled Tasks
+
+Add recurring work via `config.schedules`:
+
+```json
+{
+  "schedules": [
+    {
+      "id": "nightly-tests",
+      "cron": "0 2 *",
+      "type": "test",
+      "title": "Nightly test suite",
+      "project": "MyProject",
+      "agent": "dallas",
+      "enabled": true
+    }
+  ]
+}
+```
+
+Cron format is simplified 3-field: `minute hour dayOfWeek` (0=Sun..6=Sat). Supports `*`, `*/N`, and specific values. Examples:
+- `0 2 *` — 2am daily
+- `0 9 1` — 9am every Monday
+- `*/30 * *` — every 30 minutes
+- `0 9 1,3,5` — 9am Mon/Wed/Fri
+
+### Graceful Shutdown
+
+The engine handles `SIGTERM` and `SIGINT` (Ctrl+C) gracefully:
+1. Stops accepting new work (enters `stopping` state)
+2. Waits for active agents to finish (up to `shutdownTimeout`, default 5 minutes)
+3. Exits cleanly
+
+Active agents continue running as independent processes and will be re-attached on next engine start.
+
+### Task Decomposition
+
+Work items with `complexity: "large"` or `estimated_complexity: "large"` are auto-decomposed before dispatch (controlled by `engine.autoDecompose`, default `true`). The engine dispatches a `decompose` agent that breaks the item into 2-5 smaller sub-tasks, each becoming an independent work item with dependency tracking.
+
+### Temporary Agents
+
+Set `engine.allowTempAgents: true` to let the engine spawn ephemeral agents when all 5 permanent agents are busy. Temp agents:
+- Get a `temp-{id}` identifier
+- Use a minimal system prompt (no charter)
+- Are auto-cleaned up after task completion
+- Count toward `maxConcurrent` slots
+
+### Live Output Streaming
+
+The dashboard streams agent output in real-time via Server-Sent Events (SSE) instead of polling. The `GET /api/agent/:id/live-stream` endpoint pushes output chunks as they're written. Falls back to 3-second polling if SSE is unavailable.
 
 ## Node.js Upgrade Caution
 
@@ -535,6 +589,8 @@ To move to a new machine: `npm install -g @yemi33/minions && minions init --forc
 ado.js             <- ADO token management, PR polling, PR reconciliation
 llm.js             <- callLLM() with session resume, trackEngineUsage()
 spawn-agent.js     <- Agent spawn wrapper (resolves claude cli.js)
+preflight.js       <- Prerequisite checks (Node, Git, Claude CLI, API key)
+scheduler.js       <- Cron-style scheduled task discovery
 ado-mcp-wrapper.js <- ADO MCP authentication wrapper
 check-status.js    <- Quick status check without full engine load
 control.json       <- running/paused/stopped (runtime, generated)
@@ -542,6 +598,7 @@ To move to a new machine: `npm install -g @yemi33/minions && minions init --forc
 log.json           <- Audit trail, capped at 500 (runtime, generated)
 metrics.json       <- Per-agent quality metrics (runtime, generated)
 cooldowns.json     <- Dispatch cooldown tracking (runtime, generated)
+schedule-runs.json <- Last-run timestamps for scheduled tasks (runtime, generated)
 dashboard.js       <- Web dashboard server
 dashboard.html     <- Dashboard UI (single-file)
 config.json        <- projects[], agents, engine, claude settings (generated by minions init)
@@ -566,6 +623,7 @@ To move to a new machine: `npm install -g @yemi33/minions && minions init --forc
 implement-shared.md <- Implement on a shared branch
 ask.md              <- Answer a question about the codebase
 verify.md           <- Plan verification: build, test, start webapp, testing guide
+decompose.md        <- Break large work items into 2-5 sub-tasks
 skills/             <- Agent-created reusable workflows (generated)
 agents/
   {name}/
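The simplified 3-field cron format documented above can be sketched as a small matcher. This is an illustration of the documented `minute hour dayOfWeek` semantics only; the package's real parser lives in `engine/scheduler.js`, and the function names here (`fieldMatches`, `cronMatches`) are hypothetical.

```javascript
// Sketch of a matcher for the documented 3-field cron format
// ("minute hour dayOfWeek", 0=Sun..6=Sat). Illustrative only.
function fieldMatches(field, value) {
  if (field === '*') return true;                         // every value
  if (field.startsWith('*/')) {                           // step: */N
    const step = parseInt(field.slice(2), 10);
    return step > 0 && value % step === 0;
  }
  // exact value or comma list: N or N,M,O
  return field.split(',').some(v => parseInt(v, 10) === value);
}

function cronMatches(expr, date) {
  const [minute, hour, dow] = expr.trim().split(/\s+/);
  return fieldMatches(minute, date.getMinutes()) &&
         fieldMatches(hour, date.getHours()) &&
         fieldMatches(dow, date.getDay());
}
```

With this reading, `0 2 *` matches any date whose local time is exactly 02:00, and `0 9 1,3,5` matches 09:00 on Mondays, Wednesdays, and Fridays.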
package/dashboard.html
CHANGED
@@ -917,7 +917,7 @@ function closeDetail() {
   document.getElementById('detail-overlay').classList.remove('open');
   document.getElementById('detail-panel').classList.remove('open');
   currentAgentId = null;
-
+  stopLiveStream();
 }
 
 function renderDetailTabs(detail) {
@@ -978,10 +978,10 @@ function renderDetailContent(detail, tab) {
   } else if (tab === 'live') {
     el.innerHTML = '<div class="section" id="live-output" style="max-height:60vh;overflow-y:auto;font-size:11px;line-height:1.6">Loading live output...</div>' +
       '<div style="margin-top:8px;display:flex;gap:8px;align-items:center">' +
-      '<span class="pulse"></span><span style="font-size:11px;color:var(--green)">
+      '<span class="pulse"></span><span id="live-status-label" style="font-size:11px;color:var(--green)">Streaming live</span>' +
       '<button class="pr-pager-btn" onclick="refreshLiveOutput()" style="font-size:10px">Refresh now</button>' +
       '</div>';
-
+    startLiveStream(currentAgentId);
   } else if (tab === 'charter') {
     el.innerHTML = '<div class="section">' + escHtml(detail.charter || 'No charter found.') + '</div>';
   } else if (tab === 'history') {
@@ -1011,6 +1011,49 @@ function renderDetailContent(detail, tab) {
 }
 
 let livePollingInterval = null;
+let liveEventSource = null;
+
+function startLiveStream(agentId) {
+  stopLiveStream();
+  if (!agentId) return;
+
+  const outputEl = document.getElementById('live-output');
+  if (outputEl) outputEl.textContent = '';
+
+  liveEventSource = new EventSource('/api/agent/' + agentId + '/live-stream');
+
+  liveEventSource.onmessage = function(e) {
+    try {
+      const chunk = JSON.parse(e.data);
+      const el = document.getElementById('live-output');
+      if (el) {
+        const wasAtBottom = el.scrollHeight - el.scrollTop - el.clientHeight < 100;
+        el.textContent += chunk;
+        if (wasAtBottom) el.scrollTop = el.scrollHeight;
+      }
+    } catch {}
+  };
+
+  liveEventSource.addEventListener('done', function() {
+    stopLiveStream();
+  });
+
+  liveEventSource.onerror = function() {
+    // Fall back to polling on SSE error
+    stopLiveStream();
+    startLivePolling();
+    const label = document.getElementById('live-status-label');
+    if (label) label.textContent = 'Auto-refreshing every 3s';
+  };
+}
+
+function stopLiveStream() {
+  if (liveEventSource) {
+    liveEventSource.close();
+    liveEventSource = null;
+  }
+  stopLivePolling();
+}
 
 function startLivePolling() {
   stopLivePolling();
@@ -2209,6 +2252,7 @@ function renderDispatch(dispatch) {
     '<span class="dispatch-type ' + (d.type || '') + '">' + escHtml(d.type || '') + '</span>' +
     '<span class="dispatch-agent">' + escHtml(d.agentName || d.agent || '') + '</span>' +
     '<span class="dispatch-task" title="' + escHtml(d.task || '') + '">' + escHtml(d.task || '') + '</span>' +
+    (d.skipReason ? '<span style="font-size:9px;color:var(--muted);margin-left:6px" title="' + escHtml(d.skipReason) + '">' + escHtml(d.skipReason.replace(/_/g, ' ')) + '</span>' : '') +
     '</div>'
   ).join('') + '</div>';
 } else {
@@ -2516,6 +2560,7 @@ function wiRow(item) {
   '<td>' + typeBadge(item.type) + '</td>' +
   '<td>' + priBadge(item.priority) + '</td>' +
   '<td>' + statusBadge(item.status || 'pending') +
+  (item._pendingReason ? ' <span style="font-size:9px;color:var(--muted);margin-left:4px" title="Pending reason: ' + escHtml(item._pendingReason) + '">' + escHtml(item._pendingReason.replace(/_/g, ' ')) + '</span>' : '') +
   (item.status === 'failed' ? ' <button class="pr-pager-btn" style="font-size:9px;padding:1px 6px;color:var(--yellow);border-color:var(--yellow);margin-left:4px" onclick="event.stopPropagation();retryWorkItem(\'' + escHtml(item.id) + '\',\'' + escHtml(item._source || '') + '\')">Retry</button>' : '') +
   '</td>' +
   '<td>' +
@@ -2527,6 +2572,7 @@ function wiRow(item) {
   '<td>' + prLink + '</td>' +
   '<td><span class="pr-date">' + shortTime(item.created) + '</span></td>' +
   '<td style="white-space:nowrap">' +
+  ((item.status === 'pending' || item.status === 'failed') ? '<button class="pr-pager-btn" style="font-size:9px;padding:1px 6px;color:var(--blue);border-color:var(--blue);margin-right:4px" onclick="event.stopPropagation();editWorkItem(\'' + escHtml(item.id) + '\',\'' + escHtml(item._source || '') + '\')" title="Edit work item">✎</button>' : '') +
   ((item.status === 'done' || item.status === 'failed') ? '<button class="pr-pager-btn" style="font-size:9px;padding:1px 6px;color:var(--muted);border-color:var(--border);margin-right:4px" onclick="event.stopPropagation();archiveWorkItem(\'' + escHtml(item.id) + '\',\'' + escHtml(item._source || '') + '\')" title="Archive work item">📦</button>' : '') +
   '<button class="pr-pager-btn" style="font-size:9px;padding:1px 6px;color:var(--red);border-color:var(--red)" onclick="event.stopPropagation();deleteWorkItem(\'' + escHtml(item.id) + '\',\'' + escHtml(item._source || '') + '\')" title="Delete work item and kill agent">✕</button>' +
   '</td>' +
@@ -2588,6 +2634,63 @@ async function retryWorkItem(id, source) {
   } catch (e) { alert('Retry error: ' + e.message); }
 }
 
+function editWorkItem(id, source) {
+  const item = allWorkItems.find(i => i.id === id);
+  if (!item) return;
+  const types = ['implement', 'fix', 'review', 'plan', 'verify', 'investigate', 'refactor', 'test', 'docs'];
+  const priorities = ['critical', 'high', 'medium', 'low'];
+  const agentOpts = cmdAgents.map(a => '<option value="' + escHtml(a.id) + '"' + (item.agent === a.id ? ' selected' : '') + '>' + escHtml(a.name) + '</option>').join('');
+  const typeOpts = types.map(t => '<option value="' + t + '"' + ((item.type || 'implement') === t ? ' selected' : '') + '>' + t + '</option>').join('');
+  const priOpts = priorities.map(p => '<option value="' + p + '"' + ((item.priority || 'medium') === p ? ' selected' : '') + '>' + p + '</option>').join('');
+
+  document.getElementById('modal-title').textContent = 'Edit Work Item ' + id;
+  document.getElementById('modal-body').style.whiteSpace = 'normal';
+  document.getElementById('modal-body').innerHTML =
+    '<div style="display:flex;flex-direction:column;gap:12px;font-family:inherit">' +
+    '<label style="color:var(--text);font-size:var(--text-md)">Title' +
+    '<input id="wi-edit-title" value="' + escHtml(item.title || '') + '" style="display:block;width:100%;margin-top:4px;padding:6px 8px;background:var(--bg);border:1px solid var(--border);border-radius:var(--radius-sm);color:var(--text);font-size:var(--text-md);font-family:inherit">' +
+    '</label>' +
+    '<label style="color:var(--text);font-size:var(--text-md)">Description' +
+    '<textarea id="wi-edit-desc" rows="3" style="display:block;width:100%;margin-top:4px;padding:6px 8px;background:var(--bg);border:1px solid var(--border);border-radius:var(--radius-sm);color:var(--text);font-size:var(--text-md);font-family:inherit;resize:vertical">' + escHtml(item.description || '') + '</textarea>' +
+    '</label>' +
+    '<div style="display:flex;gap:12px">' +
+    '<label style="color:var(--text);font-size:var(--text-md);flex:1">Type' +
+    '<select id="wi-edit-type" style="display:block;width:100%;margin-top:4px;padding:6px 8px;background:var(--bg);border:1px solid var(--border);border-radius:var(--radius-sm);color:var(--text);font-size:var(--text-md)">' + typeOpts + '</select>' +
+    '</label>' +
+    '<label style="color:var(--text);font-size:var(--text-md);flex:1">Priority' +
+    '<select id="wi-edit-priority" style="display:block;width:100%;margin-top:4px;padding:6px 8px;background:var(--bg);border:1px solid var(--border);border-radius:var(--radius-sm);color:var(--text);font-size:var(--text-md)">' + priOpts + '</select>' +
+    '</label>' +
+    '<label style="color:var(--text);font-size:var(--text-md);flex:1">Agent' +
+    '<select id="wi-edit-agent" style="display:block;width:100%;margin-top:4px;padding:6px 8px;background:var(--bg);border:1px solid var(--border);border-radius:var(--radius-sm);color:var(--text);font-size:var(--text-md)"><option value="">Auto</option>' + agentOpts + '</select>' +
+    '</label>' +
+    '</div>' +
+    '<div style="display:flex;justify-content:flex-end;gap:8px;margin-top:8px">' +
+    '<button onclick="closeModal()" class="pr-pager-btn" style="padding:6px 16px;font-size:var(--text-md)">Cancel</button>' +
+    '<button onclick="submitWorkItemEdit(\'' + escHtml(id) + '\',\'' + escHtml(source || '') + '\')" style="padding:6px 16px;font-size:var(--text-md);background:var(--blue);color:#fff;border:none;border-radius:var(--radius-sm);cursor:pointer">Save</button>' +
+    '</div>' +
+    '</div>';
+  document.getElementById('modal').classList.add('open');
+}
+
+async function submitWorkItemEdit(id, source) {
+  const title = document.getElementById('wi-edit-title').value.trim();
+  const description = document.getElementById('wi-edit-desc').value;
+  const type = document.getElementById('wi-edit-type').value;
+  const priority = document.getElementById('wi-edit-priority').value;
+  const agent = document.getElementById('wi-edit-agent').value;
+  if (!title) { alert('Title is required'); return; }
+  try {
+    const res = await fetch('/api/work-items/update', {
+      method: 'POST', headers: { 'Content-Type': 'application/json' },
+      body: JSON.stringify({ id, source: source || undefined, title, description, type, priority, agent })
+    });
+    if (res.ok) { closeModal(); refresh(); showToast('cmd-toast', 'Work item updated', true); } else {
+      const d = await res.json();
+      alert('Update failed: ' + (d.error || 'unknown'));
+    }
+  } catch (e) { alert('Update error: ' + e.message); }
+}
+
 async function deleteWorkItem(id, source) {
   if (!confirm('Delete work item ' + id + '? This will kill any running agent and remove all dispatch history.')) return;
   try {
package/dashboard.js
CHANGED
@@ -852,6 +852,44 @@ const server = http.createServer(async (req, res) => {
   } catch (e) { return jsonReply(res, 400, { error: e.message }); }
 }
 
+// POST /api/work-items/update — edit a pending/failed work item
+if (req.method === 'POST' && req.url === '/api/work-items/update') {
+  try {
+    const body = await readBody(req);
+    const { id, source, title, description, type, priority, agent } = body;
+    if (!id) return jsonReply(res, 400, { error: 'id required' });
+
+    let wiPath;
+    if (!source || source === 'central') {
+      wiPath = path.join(MINIONS_DIR, 'work-items.json');
+    } else {
+      const proj = PROJECTS.find(p => p.name === source);
+      if (proj) {
+        wiPath = shared.projectWorkItemsPath(proj);
+      }
+    }
+    if (!wiPath) return jsonReply(res, 404, { error: 'source not found' });
+
+    const items = JSON.parse(safeRead(wiPath) || '[]');
+    const item = items.find(i => i.id === id);
+    if (!item) return jsonReply(res, 404, { error: 'item not found' });
+
+    if (item.status === 'dispatched') {
+      return jsonReply(res, 400, { error: 'Cannot edit dispatched items' });
+    }
+
+    if (title !== undefined) item.title = title;
+    if (description !== undefined) item.description = description;
+    if (type !== undefined) item.type = type;
+    if (priority !== undefined) item.priority = priority;
+    if (agent !== undefined) item.agent = agent || null;
+    item.updatedAt = new Date().toISOString();
+
+    safeWrite(wiPath, items);
+    return jsonReply(res, 200, { ok: true, item });
+  } catch (e) { return jsonReply(res, 400, { error: e.message }); }
+}
+
 // POST /api/notes — write to inbox so it flows through normal consolidation
 if (req.method === 'POST' && req.url === '/api/notes') {
   try {
@@ -1076,6 +1114,69 @@ const server = http.createServer(async (req, res) => {
   } catch (e) { return jsonReply(res, 400, { error: e.message }); }
 }
 
+// GET /api/agent/:id/live-stream — SSE real-time live output streaming
+const liveStreamMatch = req.url.match(/^\/api\/agent\/([\w-]+)\/live-stream(?:\?.*)?$/);
+if (liveStreamMatch && req.method === 'GET') {
+  const agentId = liveStreamMatch[1];
+  const liveLogPath = path.join(MINIONS_DIR, 'agents', agentId, 'live-output.log');
+
+  res.writeHead(200, {
+    'Content-Type': 'text/event-stream',
+    'Cache-Control': 'no-cache',
+    'Connection': 'keep-alive',
+    'Access-Control-Allow-Origin': '*',
+  });
+
+  // Send initial content
+  let offset = 0;
+  try {
+    const content = fs.readFileSync(liveLogPath, 'utf8');
+    if (content.length > 0) {
+      res.write(`data: ${JSON.stringify(content)}\n\n`);
+      offset = Buffer.byteLength(content, 'utf8');
+    }
+  } catch {}
+
+  // Watch for changes using fs.watchFile (cross-platform, works on Windows)
+  const watcher = () => {
+    try {
+      const stat = fs.statSync(liveLogPath);
+      if (stat.size > offset) {
+        const fd = fs.openSync(liveLogPath, 'r');
+        const buf = Buffer.alloc(stat.size - offset);
+        fs.readSync(fd, buf, 0, buf.length, offset);
+        fs.closeSync(fd);
+        offset = stat.size;
+        const chunk = buf.toString('utf8');
+        if (chunk) res.write(`data: ${JSON.stringify(chunk)}\n\n`);
+      }
+    } catch {}
+  };
+
+  fs.watchFile(liveLogPath, { interval: 500 }, watcher);
+
+  // Check if agent is still active (poll every 5s)
+  const doneCheck = setInterval(() => {
+    const dispatch = getDispatchQueue();
+    const isActive = (dispatch.active || []).some(d => d.agent === agentId);
+    if (!isActive) {
+      watcher(); // flush final content
+      res.write(`event: done\ndata: complete\n\n`);
+      clearInterval(doneCheck);
+      fs.unwatchFile(liveLogPath, watcher);
+      res.end();
+    }
+  }, 5000);
+
+  // Cleanup on client disconnect
+  req.on('close', () => {
+    clearInterval(doneCheck);
+    fs.unwatchFile(liveLogPath, watcher);
+  });
+
+  return;
+}
+
 // GET /api/agent/:id/live — tail live output for a working agent
 const liveMatch = req.url.match(/^\/api\/agent\/([\w-]+)\/live(?:\?.*)?$/);
 if (liveMatch && req.method === 'GET') {
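The live-stream handler above emits standard SSE frames: each output chunk is sent as `data: <JSON-encoded string>` followed by a blank line, with a final `event: done` frame on completion. A no-dependency consumer only needs to split frames and JSON-decode the payloads. The sketch below shows just that frame decoding; `parseSseChunks` is a hypothetical helper for illustration, not part of the package.

```javascript
// Sketch: decoding the SSE frames produced by /api/agent/:id/live-stream.
// Each frame is "data: <JSON string>" terminated by a blank line; a final
// "event: done" frame signals completion.
function parseSseChunks(raw) {
  const chunks = [];
  for (const frame of raw.split('\n\n')) {
    if (frame.startsWith('event: done')) break;        // stream finished
    const data = frame.split('\n').find(l => l.startsWith('data: '));
    if (data) chunks.push(JSON.parse(data.slice(6)));  // undo JSON.stringify
  }
  return chunks;
}
```

Concatenating the decoded chunks in order reproduces the agent's live output, matching what the dashboard's `EventSource` handler appends to the `live-output` element.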
package/engine/ado.js
CHANGED
@@ -192,6 +192,7 @@ async function pollPrStatus(config) {
   pr.buildStatus = buildStatus;
   if (buildFailReason) pr.buildFailReason = buildFailReason;
   else delete pr.buildFailReason;
+  if (buildStatus !== 'failing') delete pr._buildFailNotified;
   updated = true;
 }
 
package/engine/cli.js
CHANGED
@@ -305,9 +305,51 @@ const commands = {
   e.tick();
 
   // Start tick loop
-  setInterval(() => e.tick(), interval);
+  const tickTimer = setInterval(() => e.tick(), interval);
   console.log(`Tick interval: ${interval / 1000}s | Max concurrent: ${config.engine?.maxConcurrent || 5}`);
   console.log('Press Ctrl+C to stop');
+
+  // Graceful shutdown — wait for active agents before exiting
+  let shuttingDown = false;
+  function gracefulShutdown(signal) {
+    if (shuttingDown) return;
+    shuttingDown = true;
+    console.log(`\n${signal} received — initiating graceful shutdown...`);
+    clearInterval(tickTimer);
+    safeWrite(CONTROL_PATH, { state: 'stopping', pid: process.pid, stopping_at: e.ts() });
+    e.log('info', `Graceful shutdown initiated (${signal})`);
+
+    if (e.activeProcesses.size === 0) {
+      safeWrite(CONTROL_PATH, { state: 'stopped', stopped_at: e.ts() });
+      e.log('info', 'Graceful shutdown complete (no active agents)');
+      console.log('No active agents — stopped.');
+      process.exit(0);
+    }
+
+    console.log(`Waiting for ${e.activeProcesses.size} active agent(s) to finish...`);
+    const timeout = config.engine?.shutdownTimeout || shared.ENGINE_DEFAULTS.shutdownTimeout;
+    const deadline = Date.now() + timeout;
+
+    const poll = setInterval(() => {
+      if (e.activeProcesses.size === 0) {
+        clearInterval(poll);
+        safeWrite(CONTROL_PATH, { state: 'stopped', stopped_at: e.ts() });
+        e.log('info', 'Graceful shutdown complete (all agents finished)');
+        console.log('All agents finished — stopped.');
+        process.exit(0);
+      }
+      if (Date.now() >= deadline) {
+        clearInterval(poll);
+        safeWrite(CONTROL_PATH, { state: 'stopped', stopped_at: e.ts() });
+        e.log('warn', `Graceful shutdown timed out after ${timeout / 1000}s with ${e.activeProcesses.size} agent(s) still active`);
+        console.log(`Shutdown timeout (${timeout / 1000}s) — force exiting with ${e.activeProcesses.size} agent(s) still running.`);
+        process.exit(1);
+      }
+    }, 2000);
+  }
+
+  process.on('SIGTERM', () => gracefulShutdown('SIGTERM'));
+  process.on('SIGINT', () => gracefulShutdown('SIGINT'));
 },
 
 stop() {
package/engine/github.js
CHANGED
@@ -159,6 +159,7 @@ async function pollPrStatus(config) {
   pr.buildStatus = buildStatus;
   if (buildFailReason) pr.buildFailReason = buildFailReason;
   else delete pr.buildFailReason;
+  if (buildStatus !== 'failing') delete pr._buildFailNotified;
   updated = true;
 }
 }
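`ado.js` and `github.js` get the same one-line change: clearing `_buildFailNotified` whenever a PR's build is no longer failing. The effect is a notify-once-per-failure-streak pattern. The flag suppresses repeat alerts while a build stays red, and resetting it on recovery re-arms the alert for the next failure. A standalone sketch of that pattern, where `reconcileBuildStatus` and `notify` are hypothetical stand-ins for the real polling code in `pollPrStatus`:

```javascript
// Sketch of the notify-once pattern behind the one-line change:
// _buildFailNotified suppresses duplicate build-failure alerts while a
// build stays failing; clearing it on recovery re-arms the alert.
function reconcileBuildStatus(pr, buildStatus, notify) {
  pr.buildStatus = buildStatus;
  if (buildStatus !== 'failing') delete pr._buildFailNotified;
  if (buildStatus === 'failing' && !pr._buildFailNotified) {
    pr._buildFailNotified = true;
    notify(pr);
  }
}
```

Without the reset, a PR that failed once, recovered, and failed again would never alert a second time.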
package/engine/lifecycle.js
CHANGED
@@ -936,6 +936,82 @@ function parseAgentOutput(stdout) {
   return { resultSummary: parsed.text, taskUsage: parsed.usage };
 }
 
+/**
+ * Handle decomposition result — parse sub-items from agent output and create child work items.
+ * Called from runPostCompletionHooks when type === 'decompose'.
+ */
+function handleDecompositionResult(stdout, meta, config) {
+  const e = engine();
+  const parentId = meta?.item?.id;
+  if (!parentId) return 0;
+
+  // Parse sub-items JSON from agent output
+  const { text } = shared.parseStreamJsonOutput(stdout);
+  const jsonMatch = text.match(/```json\s*\n([\s\S]*?)```/);
+  if (!jsonMatch) {
+    e.log('warn', `Decomposition for ${parentId}: no JSON block found in output`);
+    return 0;
+  }
+
+  let decomposition;
+  try {
+    decomposition = JSON.parse(jsonMatch[1]);
+  } catch (err) {
+    e.log('warn', `Decomposition for ${parentId}: invalid JSON — ${err.message}`);
+    return 0;
+  }
+
+  const subItems = decomposition.sub_items || decomposition.subItems || [];
+  if (subItems.length === 0) {
+    e.log('warn', `Decomposition for ${parentId}: no sub-items produced`);
+    return 0;
+  }
+
+  // Find and update the parent work item
+  const projects = shared.getProjects(config);
+  const allPaths = [path.join(MINIONS_DIR, 'work-items.json')];
+  for (const p of projects) allPaths.push(shared.projectWorkItemsPath(p));
+
+  for (const wiPath of allPaths) {
+    const items = safeJson(wiPath) || [];
+    const parent = items.find(i => i.id === parentId);
+    if (!parent) continue;
+
+    // Mark parent as decomposed
+    parent.status = 'decomposed';
+    parent._decomposed = true;
+    delete parent._decomposing;
+    parent._subItemIds = subItems.map(s => s.id);
+
+    // Create child work items
+    for (const sub of subItems) {
+      if (items.some(i => i.id === sub.id)) continue; // dedupe
+      items.push({
+        id: sub.id,
+        title: sub.name || sub.title || `Sub-task of ${parentId}`,
+        type: (sub.estimated_complexity === 'large') ? 'implement:large' : 'implement',
+        priority: sub.priority || parent.priority || 'medium',
+        description: sub.description || '',
+        status: 'pending',
+        complexity: sub.estimated_complexity || 'medium',
+        depends_on: sub.depends_on || [],
+        parent_id: parentId,
+        sourcePlan: parent.sourcePlan,
+        branchStrategy: parent.branchStrategy,
+        featureBranch: parent.featureBranch,
+        created: new Date().toISOString(),
+        createdBy: 'decomposition',
+      });
+    }
+
+    safeWrite(wiPath, items);
+    e.log('info', `Decomposition: ${parentId} → ${subItems.length} sub-items: ${subItems.map(s => s.id).join(', ')}`);
+    return subItems.length;
+  }
+
+  return 0;
+}
+
 function runPostCompletionHooks(dispatchItem, agentId, code, stdout, config) {
   const e = engine();
   const type = dispatchItem.type;
@@ -944,6 +1020,16 @@ function runPostCompletionHooks(dispatchItem, agentId, code, stdout, config) {
   const result = isSuccess ? 'success' : 'error';
   const { resultSummary, taskUsage } = parseAgentOutput(stdout);
 
+  // Handle decomposition results — create sub-items from decompose agent output
+  if (type === 'decompose' && isSuccess && meta?.item?.id) {
+    const subCount = handleDecompositionResult(stdout, meta, config);
+    if (subCount > 0) {
+      // Parent is marked 'decomposed' by handler — don't overwrite with 'done'
+      return { resultSummary: `Decomposed into ${subCount} sub-items`, taskUsage };
+    }
+    // Fallback: if decomposition produced nothing, mark parent as done to avoid stuck state
+  }
+
   if (isSuccess && meta?.item?.id) updateWorkItemStatus(meta, 'done', '');
   if (!isSuccess && meta?.item?.id) {
     // Auto-retry: read fresh _retryCount from file (not stale dispatch-time snapshot)
package/engine/scheduler.js
ADDED
@@ -0,0 +1,159 @@
+/**
+ * engine/scheduler.js — Cron-style scheduled task discovery.
+ * Zero dependencies — uses only Node.js built-ins.
+ *
+ * Config schema:
+ * config.schedules: Array<{
+ *   id: string,           — unique schedule ID
+ *   cron: string,         — simplified cron: "minute hour dayOfWeek" (0=Sun..6=Sat)
+ *   type: string,         — work item type (implement, test, explore, ask, etc.)
+ *   title: string,        — work item title
+ *   description?: string,
+ *   project?: string,     — target project name
+ *   agent?: string,       — preferred agent ID
+ *   enabled?: boolean     — default true
+ * }>
+ *
+ * Cron field syntax:
+ *   *    — every value
+ *   N    — exact value (e.g., "0" = minute 0, "2" = 2am, "1" = Monday)
+ *   N,M  — multiple values (e.g., "1,3,5" = Mon/Wed/Fri)
+ *   * /N — every Nth value (e.g., "* /15" = every 15 minutes) [no space — formatting only]
+ */
+
+const fs = require('fs');
+const path = require('path');
+const shared = require('./shared');
+const { safeJson, safeWrite, mutateJsonFileLocked } = shared;
+
+const SCHEDULE_RUNS_PATH = path.join(__dirname, 'schedule-runs.json');
+
+/**
+ * Parse a single cron field into a matcher function.
+ * @param {string} field — e.g., "*", "5", "1,3,5", "* /15" [no space — formatting only]
+ * @param {number} min — minimum valid value (0 for minute/dow, 0 for hour)
+ * @param {number} max — maximum valid value (59 for minute, 23 for hour, 6 for dow)
+ * @returns {function(number): boolean}
+ */
+function parseCronField(field, min, max) {
+  field = field.trim();
+  if (field === '*') return () => true;
+
+  // Step: */N
+  if (field.startsWith('*/')) {
+    const step = parseInt(field.slice(2), 10);
+    if (isNaN(step) || step <= 0) return () => false;
+    return (val) => val % step === 0;
+  }
+
+  // List: N,M,O
+  if (field.includes(',')) {
+    const values = new Set(field.split(',').map(v => parseInt(v.trim(), 10)).filter(v => !isNaN(v)));
+    return (val) => values.has(val);
+  }
+
+  // Single value: N
+  const exact = parseInt(field, 10);
+  if (!isNaN(exact)) return (val) => val === exact;
+
+  return () => false;
+}
+
+/**
+ * Parse a 3-field cron expression: "minute hour dayOfWeek"
+ * @param {string} expr — e.g., "0 2 *" (2am daily), "0 9 1" (9am Monday), "* /30 * *" (every 30 min) [no space — formatting only]
+ * @returns {{ matches: function(Date): boolean }} or null if invalid
+ */
+function parseCronExpr(expr) {
+  if (!expr || typeof expr !== 'string') return null;
+  const parts = expr.trim().split(/\s+/);
+  if (parts.length < 2 || parts.length > 3) return null;
+
+  const minuteMatcher = parseCronField(parts[0], 0, 59);
+  const hourMatcher = parseCronField(parts[1], 0, 23);
+  const dowMatcher = parts[2] ? parseCronField(parts[2], 0, 6) : () => true;
+
+  return {
+    matches(date) {
+      return minuteMatcher(date.getMinutes()) &&
+             hourMatcher(date.getHours()) &&
+             dowMatcher(date.getDay());
+    }
+  };
+}
+
+/**
+ * Check if a schedule should fire now, given its last run time.
+ * Prevents double-firing within the same minute window.
+ * @param {{ cron: string }} schedule
+ * @param {string|null} lastRunAt — ISO timestamp of last run
+ * @returns {boolean}
+ */
+function shouldRunNow(schedule, lastRunAt) {
+  const cron = parseCronExpr(schedule.cron);
+  if (!cron) return false;
+
+  const now = new Date();
+  if (!cron.matches(now)) return false;
+
+  // Don't fire again if already ran in this minute window
+  if (lastRunAt) {
+    const last = new Date(lastRunAt);
+    if (last.getFullYear() === now.getFullYear() &&
+        last.getMonth() === now.getMonth() &&
+        last.getDate() === now.getDate() &&
+        last.getHours() === now.getHours() &&
+        last.getMinutes() === now.getMinutes()) {
+      return false; // already fired this minute
+    }
+  }
+
+  return true;
+}
+
+/**
+ * Discover work items from configured schedules.
+ * @param {object} config — full config object
+ * @returns {Array<object>} — work items to create
+ */
+function discoverScheduledWork(config) {
+  const schedules = config.schedules;
+  if (!Array.isArray(schedules) || schedules.length === 0) return [];
+
+  const runs = safeJson(SCHEDULE_RUNS_PATH) || {};
+  const work = [];
+
+  for (const sched of schedules) {
+    if (!sched.id || !sched.cron || !sched.title) continue;
+    if (sched.enabled === false) continue;
+
+    const lastRun = runs[sched.id] || null;
+    if (!shouldRunNow(sched, lastRun)) continue;
+
+    work.push({
+      id: `sched-${sched.id}-${Date.now()}`,
+      title: sched.title,
+      type: sched.type || 'implement',
+      priority: sched.priority || 'medium',
+      description: sched.description || sched.title,
+      status: 'pending',
+      created: new Date().toISOString(),
+      createdBy: 'scheduler',
+      agent: sched.agent || null,
+      project: sched.project || null,
+      _scheduleId: sched.id,
+    });

+    // Record run time
+    runs[sched.id] = new Date().toISOString();
+  }
+
+  // Persist run times if any schedules fired
+  if (work.length > 0) {
+    safeWrite(SCHEDULE_RUNS_PATH, runs);
+  }
+
+  return work;
+}
+
+module.exports = { parseCronExpr, parseCronField, shouldRunNow, discoverScheduledWork, SCHEDULE_RUNS_PATH };
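The matcher composes one predicate per cron field and ANDs them against the current time. A condensed standalone sketch of the same logic (re-implemented here for illustration rather than imported from the package) shows how the three field forms behave:

```javascript
// Sketch of the simplified 3-field cron matcher ("minute hour dayOfWeek"),
// mirroring parseCronField's three forms: *, */N, N,M and a single N.
function parseCronField(field) {
  field = field.trim();
  if (field === '*') return () => true;
  if (field.startsWith('*/')) {                 // step: every Nth value
    const step = parseInt(field.slice(2), 10);
    if (isNaN(step) || step <= 0) return () => false;
    return (val) => val % step === 0;
  }
  if (field.includes(',')) {                    // list: N,M,O
    const values = new Set(field.split(',').map(v => parseInt(v.trim(), 10)));
    return (val) => values.has(val);
  }
  const exact = parseInt(field, 10);            // single exact value
  return isNaN(exact) ? () => false : (val) => val === exact;
}

function cronMatches(expr, date) {
  const parts = expr.trim().split(/\s+/);
  if (parts.length < 2 || parts.length > 3) return false;
  const [minute, hour, dow = '*'] = parts;
  return parseCronField(minute)(date.getMinutes()) &&
         parseCronField(hour)(date.getHours()) &&
         parseCronField(dow)(date.getDay());
}
```

For example, `"0 2 *"` matches any date whose local time is exactly 02:00, and `"*/15 * *"` matches whenever the minute is divisible by 15; `shouldRunNow` then suppresses a second firing within the same minute window.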
package/engine/shared.js
CHANGED
@@ -242,6 +242,9 @@ const ENGINE_DEFAULTS = {
   idleAlertMinutes: 15,
   fanOutTimeout: null, // falls back to agentTimeout
   restartGracePeriod: 1200000, // 20min
+  shutdownTimeout: 300000, // 5min — max wait for active agents during graceful shutdown
+  allowTempAgents: false, // opt-in: spawn ephemeral agents when all permanent agents are busy
+  autoDecompose: true, // auto-decompose implement:large items into sub-tasks
 };

 const DEFAULT_AGENTS = {
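The three new defaults above live in `ENGINE_DEFAULTS`; elsewhere in this diff they are read as `config.engine?.allowTempAgents` and `config.engine?.autoDecompose`, and `config.schedules` feeds the new scheduler. A sketch of how the new keys might be set in `config.json` (the `nightly-tests` schedule is invented for illustration, and the exact placement of engine overrides is an assumption based on those reads):

```json
{
  "engine": {
    "shutdownTimeout": 300000,
    "allowTempAgents": false,
    "autoDecompose": true
  },
  "schedules": [
    {
      "id": "nightly-tests",
      "cron": "0 2 *",
      "type": "test",
      "title": "Run nightly regression tests",
      "enabled": true
    }
  ]
}
```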
package/engine.js
CHANGED
@@ -207,6 +207,15 @@ function resolveAgent(workType, config, authorAgent = null) {

   if (idle[0]) { _claimedAgents.add(idle[0]); return idle[0]; }

+  // No idle configured agent — try temp agent if enabled
+  if (config.engine?.allowTempAgents) {
+    const tempId = `temp-${shared.uid()}`;
+    _claimedAgents.add(tempId);
+    tempAgents.set(tempId, { name: `Temp-${tempId.slice(5, 9)}`, role: 'Temporary Agent', createdAt: ts() });
+    log('info', `Spawning temp agent ${tempId} — all permanent agents busy`);
+    return tempId;
+  }
+
   // No idle agent available — return null, item stays pending until next tick
   return null;
 }
@@ -389,7 +398,15 @@ function getPrCreateInstructions(project) {
   const host = getRepoHost(project);
   const repoId = project?.repositoryId || '';
   if (host === 'github') {
-
+    const org = project?.adoOrg || '';
+    const repo = project?.repoName || '';
+    const mainBranch = project?.mainBranch || 'main';
+    return `Use \`gh pr create\` to create a pull request:\n` +
+      `- \`gh pr create --base ${mainBranch} --head <your-branch> --title "PR title" --body "PR description" --repo ${org}/${repo}\`\n` +
+      `- Always set --base to \`${mainBranch}\` (the main branch)\n` +
+      `- Always set --repo to \`${org}/${repo}\` to target the correct repository\n` +
+      `- Use --head to specify your feature branch name\n` +
+      `- Include a meaningful --title and --body describing the changes`;
   }
   // Default: Azure DevOps
   return `Use \`mcp__azure-ado__repo_create_pull_request\`:\n- repositoryId: \`${repoId}\``;
@@ -399,7 +416,13 @@ function getPrCommentInstructions(project) {
   const host = getRepoHost(project);
   const repoId = project?.repositoryId || '';
   if (host === 'github') {
-
+    const org = project?.adoOrg || '';
+    const repo = project?.repoName || '';
+    return `Use \`gh pr comment\` to post a comment on the PR:\n` +
+      `- \`gh pr comment <number> --body "Your comment text" --repo ${org}/${repo}\`\n` +
+      `- Replace <number> with the PR number\n` +
+      `- Always set --repo to \`${org}/${repo}\` to target the correct repository\n` +
+      `- Use --body to provide the comment text (supports Markdown)`;
   }
   return `Use \`mcp__azure-ado__repo_create_pull_request_thread\`:\n- repositoryId: \`${repoId}\``;
 }
@@ -407,7 +430,17 @@ function getPrCommentInstructions(project) {
 function getPrFetchInstructions(project) {
   const host = getRepoHost(project);
   if (host === 'github') {
-
+    const org = project?.adoOrg || '';
+    const repo = project?.repoName || '';
+    const mainBranch = project?.mainBranch || 'main';
+    return `Use \`gh pr view\` to fetch PR status:\n` +
+      `- \`gh pr view <number> --json number,title,state,mergeable,reviewDecision,headRefName,baseRefName,statusCheckRollup --repo ${org}/${repo}\`\n` +
+      `- This returns JSON with PR state, mergeability, review decision, and check statuses\n` +
+      `- To fetch the PR branch locally:\n` +
+      `  1. \`git fetch origin <branch-name>\`\n` +
+      `  2. \`git checkout <branch-name>\`\n` +
+      `- Or use \`gh pr checkout <number> --repo ${org}/${repo}\` to fetch and checkout in one step\n` +
+      `- The base branch is \`${mainBranch}\``;
   }
   return `Use \`mcp__azure-ado__repo_get_pull_request_by_id\` to fetch PR status.`;
 }
@@ -416,7 +449,15 @@ function getPrVoteInstructions(project) {
   const host = getRepoHost(project);
   const repoId = project?.repositoryId || '';
   if (host === 'github') {
-
+    const org = project?.adoOrg || '';
+    const repo = project?.repoName || '';
+    return `Use \`gh pr review\` to submit a review on the PR:\n` +
+      `- Approve: \`gh pr review <number> --approve --body "Approval comment" --repo ${org}/${repo}\`\n` +
+      `- Request changes: \`gh pr review <number> --request-changes --body "What needs to change" --repo ${org}/${repo}\`\n` +
+      `- Comment only: \`gh pr review <number> --comment --body "Review comment" --repo ${org}/${repo}\`\n` +
+      `- Replace <number> with the PR number\n` +
+      `- Always set --repo to \`${org}/${repo}\` to target the correct repository\n` +
+      `- Use --body to provide a review summary (supports Markdown)`;
   }
   return `Use \`mcp__azure-ado__repo_update_pull_request_reviewers\`:\n- repositoryId: \`${repoId}\`\n- Set your reviewer vote on the PR (10=approve, 5=approve-with-suggestions, -10=reject)`;
 }
@@ -437,8 +478,8 @@ function getRepoHostToolRule(project) {

 // Lean system prompt: agent identity + rules only (~2-4KB, never grows)
 function buildSystemPrompt(agentId, config, project) {
-  const agent = config.agents[agentId];
-  const charter = getAgentCharter(agentId);
+  const agent = config.agents[agentId] || tempAgents.get(agentId) || { name: agentId, role: 'Temporary Agent', skills: [] };
+  const charter = getAgentCharter(agentId); // returns '' for temp agents (no charter file)
   project = project || getProjects(config)[0] || {};

   let prompt = '';
@@ -567,6 +608,7 @@ const { runPostCompletionHooks, updateWorkItemStatus, syncPrdItemStatus, handleP
 // ─── Agent Spawner ──────────────────────────────────────────────────────────

 const activeProcesses = new Map(); // dispatchId → { proc, agentId, startedAt }
+const tempAgents = new Map(); // tempAgentId → { name, role, createdAt }
 let engineRestartGraceUntil = 0; // timestamp — suppress orphan detection until this time

 // Resolve dependency plan item IDs to their PR branches
@@ -965,6 +1007,17 @@ function spawnAgent(dispatchItem, config) {
     try { fs.unlinkSync(promptPath.replace(/prompt-/, 'pid-').replace(/\.md$/, '.pid')); } catch {}

     log('info', `Agent ${agentId} completed. Output saved to ${archivePath}`);
+
+    // Clean up temp agent directory
+    if (tempAgents.has(agentId)) {
+      tempAgents.delete(agentId);
+      try {
+        const agentDir = path.join(AGENTS_DIR, agentId);
+        // Keep output archive but remove temp agent directory (live-output.log etc.)
+        fs.rmSync(agentDir, { recursive: true, force: true });
+        log('info', `Temp agent ${agentId} cleaned up`);
+      } catch {}
+    }
   });

   proc.on('error', (err) => {
@@ -1014,6 +1067,7 @@ function spawnAgent(dispatchItem, config) {
     if (idx < 0) return;
     const item = dispatch.pending.splice(idx, 1)[0];
     item.started_at = startedAt;
+    delete item.skipReason;
     if (!dispatch.active.some(d => d.id === id)) {
       dispatch.active.push(item);
     }
@@ -2280,7 +2334,7 @@ function selectPlaybook(workType, item) {
   if (workType === 'review' && !item?._pr && !item?.pr_id) {
     return 'work-item';
   }
-  const typeSpecificPlaybooks = ['explore', 'review', 'test', 'plan-to-prd', 'plan', 'ask', 'verify'];
+  const typeSpecificPlaybooks = ['explore', 'review', 'test', 'plan-to-prd', 'plan', 'ask', 'verify', 'decompose'];
   return typeSpecificPlaybooks.includes(workType) ? workType : 'work-item';
 }
@@ -2399,6 +2453,26 @@ function discoverFromPrs(config, project) {
         review_note: `Build is failing: ${pr.buildFailReason || 'Check CI pipeline for details'}. Fix the build errors and push.`,
       }, `Fix build failure on PR ${pr.id}`, { dispatchKey: key, source: 'pr', pr, branch: pr.branch, project: projMeta });
       if (item) { newWork.push(item); setCooldown(key); }
+
+      // Notify the author agent about the build failure
+      if (pr.agent && !pr._buildFailNotified) {
+        writeInboxAlert(`build-fail-${pr.agent}-${pr.id}`,
+          `# Build Failure Notification\n\n` +
+          `**Your PR ${pr.id}** on branch \`${pr.branch || 'unknown'}\` has a failing build.\n` +
+          `**Reason:** ${pr.buildFailReason || 'Check CI pipeline for details'}\n\n` +
+          `A fix agent has been dispatched to address this. Review the fix when complete.\n`
+        );
+        // Mark notified to prevent duplicate alerts
+        try {
+          const prPath = projectPrPath(project);
+          const prs = safeJson(prPath) || [];
+          const target = prs.find(p => p.id === pr.id);
+          if (target) {
+            target._buildFailNotified = true;
+            safeWrite(prPath, prs);
+          }
+        } catch {}
+      }
     }

   }
@@ -2442,10 +2516,15 @@ function discoverFromWorkItems(config, project) {
       if (depStatus === 'failed') {
         item.status = 'failed';
        item.failReason = 'Dependency failed — cannot proceed';
+        delete item._pendingReason;
         log('warn', `Marking ${item.id} as failed: dependency failed (plan: ${item.sourcePlan})`);
+        needsWrite = true;
+        continue;
+      }
+      if (!depStatus) {
+        if (item._pendingReason !== 'dependency_unmet') { item._pendingReason = 'dependency_unmet'; needsWrite = true; }
         continue;
       }
-      if (!depStatus) continue;
     }

     const key = `work-${project?.name || 'default'}-${item.id}`;
@@ -2465,14 +2544,30 @@ function discoverFromWorkItems(config, project) {
       delete item._resumedAt;
       safeWrite(projectWorkItemsPath(project), items);
     }
-    if (isAlreadyDispatched(key)
+    if (isAlreadyDispatched(key)) {
+      if (item._pendingReason !== 'already_dispatched') { item._pendingReason = 'already_dispatched'; needsWrite = true; }
+      skipped.gated++; continue;
+    }
+    if (isOnCooldown(key, cooldownMs)) {
+      if (item._pendingReason !== 'cooldown') { item._pendingReason = 'cooldown'; needsWrite = true; }
+      skipped.gated++; continue;
+    }

     let workType = item.type || 'implement';
     if (workType === 'implement' && (item.complexity === 'large' || item.estimated_complexity === 'large')) {
       workType = 'implement:large';
     }
+    // Auto-decompose large items before implementation
+    if (workType === 'implement:large' && !item._decomposed && !item._decomposing && config.engine?.autoDecompose !== false) {
+      workType = 'decompose';
+      item._decomposing = true;
+      needsWrite = true;
+    }
     const agentId = item.agent || resolveAgent(workType, config);
-    if (!agentId) {
+    if (!agentId) {
+      if (item._pendingReason !== 'no_agent') { item._pendingReason = 'no_agent'; needsWrite = true; }
+      skipped.noAgent++; continue;
+    }

     const isShared = item.branchStrategy === 'shared-branch' && item.featureBranch;
     const branchName = isShared ? item.featureBranch : (item.branch || `work/${item.id}`);
@@ -2525,6 +2620,7 @@ function discoverFromWorkItems(config, project) {
     item.status = 'dispatched';
     item.dispatched_at = ts();
     item.dispatched_to = agentId;
+    delete item._pendingReason;
     prdSyncQueue.push({ id: item.id, sourcePlan: item.sourcePlan });

     newWork.push({
@@ -2964,6 +3060,23 @@ function discoverWork(config) {
   // Central work items (project-agnostic — agent decides where to work)
   const centralWork = discoverCentralWorkItems(config);

+  // Scheduled tasks (cron-style recurring work)
+  try {
+    const { discoverScheduledWork } = require('./engine/scheduler');
+    const scheduledWork = discoverScheduledWork(config);
+    for (const item of scheduledWork) {
+      // Write scheduled items to central work-items.json so they persist across ticks
+      const centralPath = path.join(MINIONS_DIR, 'work-items.json');
+      const items = safeJson(centralPath) || [];
+      // Dedupe: don't re-create if same schedule already has a pending/dispatched item
+      if (!items.some(i => i._scheduleId === item._scheduleId && i.status !== 'done' && i.status !== 'failed')) {
+        items.push(item);
+        safeWrite(centralPath, items);
+        log('info', `Scheduled task fired: ${item._scheduleId} → ${item.title}`);
+      }
+    }
+  } catch {}
+
   // Gate reviews and fixes: do not dispatch until all implement items are complete
   const hasIncompleteImplements = projects.some(project => {
     const items = safeJson(projectWorkItemsPath(project)) || [];
@@ -3016,7 +3129,7 @@ async function tick() {

 async function tickInner() {
   const control = getControl();
-  if (control.state !== 'running') {
+  if (control.state !== 'running' && control.state !== 'stopping') {
     log('info', `Engine state is "${control.state}" — exiting process`);
     process.exit(0);
   }
@@ -3031,6 +3144,12 @@ async function tickInner() {
   checkTimeouts(config);
   checkIdleThreshold(config);

+  // In stopping state, only track agent completions — skip discovery and dispatch
+  if (control.state === 'stopping') {
+    log('info', `Engine stopping — ${activeProcesses.size} agent(s) still active, skipping discovery/dispatch`);
+    return;
+  }
+
   // 2. Consolidate inbox
   consolidateInbox(config);

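The stopping state above pairs with the new `shutdownTimeout` default in engine/shared.js (max wait for active agents during graceful shutdown). The engine's actual signal handling is not shown in this diff; purely as an illustration of the pattern that setting implies, a wait-with-deadline loop over the active-process set could look like:

```javascript
// Illustrative sketch only (not the package's shutdown code): poll the
// active-process map until it drains or shutdownTimeout ms have elapsed.
async function waitForAgents(activeProcesses, shutdownTimeout, pollMs = 50) {
  const deadline = Date.now() + shutdownTimeout;
  while (activeProcesses.size > 0 && Date.now() < deadline) {
    await new Promise(resolve => setTimeout(resolve, pollMs));
  }
  return activeProcesses.size === 0; // true means every agent finished in time
}
```

With the 300000 ms default, a SIGTERM would give in-flight agents up to five minutes to complete before the engine gives up and exits anyway.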
@@ -3212,6 +3331,28 @@ async function tickInner() {
       dispatched.add(item.id);
     }
   }
+
+  // Annotate remaining pending items with skipReason so dashboard can show why they're waiting.
+  // Re-read dispatch after spawns (spawnAgent moves items from pending→active).
+  const postDispatch = getDispatch();
+  const postBusyAgents = new Set((postDispatch.active || []).map(d => d.agent));
+  const postActiveCount = (postDispatch.active || []).length;
+  let skipReasonChanged = false;
+  for (const item of (postDispatch.pending || [])) {
+    let reason = null;
+    if (postActiveCount >= maxConcurrent) {
+      reason = 'max_concurrency';
+    } else if (postBusyAgents.has(item.agent)) {
+      reason = 'agent_busy';
+    }
+    if (item.skipReason !== reason) {
+      item.skipReason = reason;
+      skipReasonChanged = true;
+    }
+  }
+  if (skipReasonChanged) {
+    mutateDispatch((dp) => { dp.pending = postDispatch.pending; });
+  }
 }

 // ─── Exports (for engine/cli.js and other modules) ──────────────────────────
package/package.json
CHANGED

package/playbooks/decompose.md
ADDED
@@ -0,0 +1,60 @@
+# Playbook: Task Decomposition
+
+You are {{agent_name}}, the {{agent_role}} on the {{project_name}} project.
+TEAM ROOT: {{team_root}}
+
+## Your Task
+
+A work item has been flagged as too large for a single agent dispatch. Analyze the item and break it into 2-5 smaller, independently implementable sub-tasks.
+
+## Work Item
+
+- **ID:** {{item_id}}
+- **Title:** {{item_title}}
+- **Description:** {{item_description}}
+- **Complexity:** {{item_complexity}}
+- **Project:** {{project_name}} (`{{project_path}}`)
+
+{{#acceptance_criteria}}
+## Acceptance Criteria
+
+{{acceptance_criteria}}
+{{/acceptance_criteria}}
+
+## Instructions
+
+1. **Explore the codebase** at `{{project_path}}` — understand the existing structure, patterns, and dependencies
+2. **Analyze the work item** — identify distinct units of work that can be implemented as separate PRs
+3. **Break into 2-5 sub-tasks** — each should be:
+   - Small or medium complexity (not large)
+   - A single PR's worth of work
+   - Independently testable
+   - Clear enough for another agent to implement without ambiguity
+4. **Order by dependency** — if sub-task B needs sub-task A's code, declare `depends_on`
+5. **Generate unique IDs** — use format `{{item_id}}-a`, `{{item_id}}-b`, etc.
+
+## Output
+
+Write the decomposition result as a JSON code block in your response:
+
+```json
+{
+  "parent_id": "{{item_id}}",
+  "sub_items": [
+    {
+      "id": "{{item_id}}-a",
+      "name": "Short descriptive name",
+      "description": "What to build, where, and how. Be specific enough that an engineer can implement without further exploration.",
+      "estimated_complexity": "small|medium",
+      "depends_on": [],
+      "acceptance_criteria": ["Criterion 1", "Criterion 2"]
+    }
+  ]
+}
+```
+
+Keep the total number of sub-items between 2 and 5. If the task genuinely cannot be broken down further, output a single sub-item that matches the original.
+
+{{pr_create_instructions}}
+
+{{pr_comment_instructions}}
package/routing.md
CHANGED