@desplega.ai/agent-swarm 1.2.1 → 1.9.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/.claude/settings.local.json +20 -1
- package/.env.docker.example +22 -1
- package/.env.example +17 -0
- package/.github/workflows/docker-publish.yml +92 -0
- package/CONTRIBUTING.md +270 -0
- package/DEPLOYMENT.md +391 -0
- package/Dockerfile.worker +29 -1
- package/FAQ.md +19 -0
- package/LICENSE +21 -0
- package/MCP.md +249 -0
- package/README.md +103 -207
- package/assets/agent-swarm-logo-orange.png +0 -0
- package/assets/agent-swarm-logo.png +0 -0
- package/docker-compose.example.yml +137 -0
- package/docker-entrypoint.sh +223 -7
- package/package.json +8 -3
- package/{cc-plugin → plugin}/.claude-plugin/plugin.json +1 -1
- package/plugin/README.md +1 -0
- package/plugin/agents/.gitkeep +0 -0
- package/plugin/agents/codebase-analyzer.md +143 -0
- package/plugin/agents/codebase-locator.md +122 -0
- package/plugin/agents/codebase-pattern-finder.md +227 -0
- package/plugin/agents/web-search-researcher.md +109 -0
- package/plugin/commands/create-plan.md +415 -0
- package/plugin/commands/implement-plan.md +89 -0
- package/plugin/commands/research.md +200 -0
- package/plugin/commands/start-leader.md +101 -0
- package/plugin/commands/start-worker.md +56 -0
- package/plugin/commands/swarm-chat.md +78 -0
- package/plugin/commands/todos.md +66 -0
- package/plugin/commands/work-on-task.md +44 -0
- package/plugin/skills/.gitkeep +0 -0
- package/scripts/generate-mcp-docs.ts +415 -0
- package/slack-manifest.json +69 -0
- package/src/be/db.ts +1431 -25
- package/src/cli.tsx +135 -11
- package/src/commands/lead.ts +13 -0
- package/src/commands/runner.ts +255 -0
- package/src/commands/worker.ts +8 -220
- package/src/hooks/hook.ts +102 -14
- package/src/http.ts +361 -5
- package/src/prompts/base-prompt.ts +131 -0
- package/src/server.ts +56 -0
- package/src/slack/app.ts +73 -0
- package/src/slack/commands.ts +88 -0
- package/src/slack/handlers.ts +281 -0
- package/src/slack/index.ts +3 -0
- package/src/slack/responses.ts +175 -0
- package/src/slack/router.ts +170 -0
- package/src/slack/types.ts +20 -0
- package/src/slack/watcher.ts +119 -0
- package/src/tools/create-channel.ts +80 -0
- package/src/tools/get-tasks.ts +54 -21
- package/src/tools/join-swarm.ts +28 -4
- package/src/tools/list-channels.ts +37 -0
- package/src/tools/list-services.ts +110 -0
- package/src/tools/poll-task.ts +46 -3
- package/src/tools/post-message.ts +87 -0
- package/src/tools/read-messages.ts +192 -0
- package/src/tools/register-service.ts +118 -0
- package/src/tools/send-task.ts +80 -7
- package/src/tools/store-progress.ts +9 -3
- package/src/tools/task-action.ts +211 -0
- package/src/tools/unregister-service.ts +110 -0
- package/src/tools/update-profile.ts +105 -0
- package/src/tools/update-service-status.ts +118 -0
- package/src/types.ts +110 -3
- package/src/utils/pretty-print.ts +224 -0
- package/thoughts/shared/plans/.gitkeep +0 -0
- package/thoughts/shared/plans/2025-12-18-inverse-teleport.md +1142 -0
- package/thoughts/shared/plans/2025-12-18-slack-integration.md +1195 -0
- package/thoughts/shared/plans/2025-12-19-agent-log-streaming.md +732 -0
- package/thoughts/shared/plans/2025-12-19-role-based-swarm-plugin.md +361 -0
- package/thoughts/shared/plans/2025-12-20-mobile-responsive-ui.md +501 -0
- package/thoughts/shared/plans/2025-12-20-startup-team-swarm.md +560 -0
- package/thoughts/shared/research/.gitkeep +0 -0
- package/thoughts/shared/research/2025-12-18-slack-integration.md +442 -0
- package/thoughts/shared/research/2025-12-19-agent-log-streaming.md +339 -0
- package/thoughts/shared/research/2025-12-19-agent-secrets-cli-research.md +390 -0
- package/thoughts/shared/research/2025-12-21-gemini-cli-integration.md +376 -0
- package/thoughts/shared/research/2025-12-22-setup-experience-improvements.md +264 -0
- package/tsconfig.json +3 -1
- package/ui/bun.lock +692 -0
- package/ui/index.html +22 -0
- package/ui/package.json +32 -0
- package/ui/pnpm-lock.yaml +3034 -0
- package/ui/postcss.config.js +6 -0
- package/ui/public/logo.png +0 -0
- package/ui/src/App.tsx +43 -0
- package/ui/src/components/ActivityFeed.tsx +415 -0
- package/ui/src/components/AgentDetailPanel.tsx +534 -0
- package/ui/src/components/AgentsPanel.tsx +549 -0
- package/ui/src/components/ChatPanel.tsx +1820 -0
- package/ui/src/components/ConfigModal.tsx +232 -0
- package/ui/src/components/Dashboard.tsx +534 -0
- package/ui/src/components/Header.tsx +168 -0
- package/ui/src/components/ServicesPanel.tsx +612 -0
- package/ui/src/components/StatsBar.tsx +288 -0
- package/ui/src/components/StatusBadge.tsx +124 -0
- package/ui/src/components/TaskDetailPanel.tsx +807 -0
- package/ui/src/components/TasksPanel.tsx +575 -0
- package/ui/src/hooks/queries.ts +170 -0
- package/ui/src/index.css +235 -0
- package/ui/src/lib/api.ts +161 -0
- package/ui/src/lib/config.ts +35 -0
- package/ui/src/lib/theme.ts +214 -0
- package/ui/src/lib/utils.ts +48 -0
- package/ui/src/main.tsx +32 -0
- package/ui/src/types/api.ts +164 -0
- package/ui/src/vite-env.d.ts +1 -0
- package/ui/tailwind.config.js +35 -0
- package/ui/tsconfig.json +31 -0
- package/ui/vite.config.ts +22 -0
- package/cc-plugin/README.md +0 -49
- package/cc-plugin/commands/setup-leader.md +0 -73
- package/cc-plugin/commands/start-worker.md +0 -64
- package/docker-compose.worker.yml +0 -35
- package/example-req-meta.json +0 -24
- /package/{cc-plugin → plugin}/hooks/hooks.json +0 -0
package/thoughts/shared/plans/2025-12-19-agent-log-streaming.md
@@ -0,0 +1,732 @@
# Agent Log Streaming and SSE Implementation Plan

## Overview

Implement real-time log streaming from worker agents to the frontend dashboard. Worker agents will stream their Claude CLI output to the API server, which stores logs on disk and broadcasts them via Server-Sent Events (SSE) to connected frontend clients viewing task details.

## Current State Analysis

### Existing Infrastructure
- **Runner** (`src/commands/runner.ts`): Spawns Claude CLI with `--output-format stream-json`, captures stdout/stderr to local JSONL files organized by session ID
- **Hooks** (`src/hooks/hook.ts`): Intercepts Claude Code events, communicates with MCP server for agent status tracking
- **HTTP Server** (`src/http.ts`): REST API built on Node.js `createHttpServer`; no custom SSE endpoints yet (only MCP SDK streaming)
- **Frontend** (`ui/src/components/TaskDetailPanel.tsx`): Polls every 5 seconds via React Query, displays database logs only

### Key Discoveries
- Runner uses `sessionId` for log organization, not `taskId`; there is currently no link between logs and tasks
- Hook already handles `PostToolUse` events and can intercept `poll-task` and `store-progress` responses
- Bun file APIs available: `Bun.file()`, `Bun.write()`, `file.exists()`, `file.text()`
- Poll-task response structure at `src/tools/poll-task.ts:101-108` includes `task.id` in `structuredContent`
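
For reference, the slice of the poll-task response that the hook relies on is roughly the shape cast in the Phase 1 handler; this sketch covers only the fields the hook reads, not the full `structuredContent` schema, and the type name is illustrative:

```typescript
// Minimal view of the poll-task tool response consumed by the hook (see Phase 1).
// Only `success` and `task.id` are relied upon; other fields are omitted here.
type PollTaskToolResponse = {
  success?: boolean;
  task?: { id: string };
};
```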

## Desired End State

After implementation:
1. Worker agents stream their Claude CLI output to the API in real-time
2. Logs are persisted to `/logs/{taskId}.jsonl` files on the API server
3. Frontend receives live log updates via SSE when viewing in-progress tasks
4. TaskDetailPanel shows streaming logs without polling delay

### Verification
- Start API server and worker
- Assign a task to the worker
- Open TaskDetailPanel for that task
- Observe logs appearing in real-time (< 1 second delay)
- Verify logs persist after task completion

## What We're NOT Doing

- Streaming Claude Code native JSONL files (`~/.claude/projects/...`)
- Log pagination or search
- Log level filtering
- Log retention policies or cleanup
- WebSocket implementation (SSE is simpler and sufficient)
- Authentication for SSE endpoints (inherits existing API key auth)

## Implementation Approach

Use file-based task ID tracking via `/tmp/.task.json` to link runner logs to tasks. The hook writes the current task ID when `poll-task` succeeds and clears it on task completion. The runner reads this file and streams log chunks to the API. The API stores logs on disk and broadcasts to SSE subscribers.
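
For concreteness, the contents of `/tmp/.task.json` written by the hook (Phase 1) and read by the runner (Phase 2) amount to this small shape; the interface name and example values are illustrative only:

```typescript
// Shape of /tmp/.task.json shared between the hook (writer) and runner (reader).
// Field names follow the Phase 1 write call; the example values are made up.
interface TaskFileContents {
  taskId: string;     // e.g. "task-123"
  assignedAt: string; // ISO timestamp, e.g. "2025-12-19T10:00:00.000Z"
}
```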

---

## Phase 1: Hook - Track Current Task ID

### Overview
Modify the hook to write the current task ID to a temp file when a task is assigned, and clear it when the task completes or fails.

### Changes Required

#### 1. Add Task File Tracking
**File**: `src/hooks/hook.ts`
**Changes**: Add task file write/clear logic in PostToolUse handler

```typescript
// Add at top of file after imports
const TASK_FILE = "/tmp/.task.json";

// In the PostToolUse case (around line 172), replace the existing handler:
case "PostToolUse":
  // Track task assignment from poll-task
  if (msg.tool_name?.endsWith("poll-task")) {
    const response = msg.tool_response as { success?: boolean; task?: { id: string } };
    if (response?.success && response?.task?.id) {
      await Bun.write(TASK_FILE, JSON.stringify({
        taskId: response.task.id,
        assignedAt: new Date().toISOString()
      }));
    }
  }

  // Clear on task completion/failure
  if (msg.tool_name?.endsWith("store-progress")) {
    const input = msg.tool_input as { status?: string };
    if (input?.status === "completed" || input?.status === "failed") {
      try {
        const file = Bun.file(TASK_FILE);
        if (await file.exists()) {
          await Bun.write(TASK_FILE, "");
        }
      } catch {
        // Ignore errors clearing task file
      }
    }
  }

  // Keep existing agent info output
  if (agentInfo) {
    if (agentInfo.isLead) {
      if (msg.tool_name?.endsWith("send-task")) {
        const maybeTaskId = (msg.tool_response as { task?: { id?: string } })?.task?.id;
        console.log(
          `Task sent successfully.${maybeTaskId ? ` Task ID: ${maybeTaskId}.` : ""} Monitor progress using the get-task-details tool periodically.`,
        );
      }
    } else {
      console.log(
        `Remember to call store-progress periodically to update the lead agent on your progress.`,
      );
    }
  }
  break;
```

### Success Criteria

#### Automated Verification:
- [ ] TypeScript compiles: `bun run tsc`
- [ ] Linting passes: `bun run lint`

#### Manual Verification:
- [ ] Start a worker agent and assign it a task
- [ ] Verify `/tmp/.task.json` is created with correct `taskId` after poll-task succeeds
- [ ] Verify `/tmp/.task.json` is cleared (empty) after task completes or fails

**Implementation Note**: After completing this phase and all automated verification passes, pause here for manual confirmation that the task file tracking works correctly before proceeding to Phase 2.

---

## Phase 2: Runner - Stream Logs to API

### Overview
Modify the runner to read the current task ID and stream log entries to the API server as they're captured from Claude CLI.

### Changes Required

#### 1. Add Task ID Reading and API Streaming
**File**: `src/commands/runner.ts`
**Changes**: Add helper functions and modify stdout/stderr capture loops

```typescript
// Add after existing imports (around line 5)
const TASK_FILE = "/tmp/.task.json";
const API_BASE_URL = process.env.MCP_BASE_URL || "http://localhost:3013";
const API_KEY = process.env.API_KEY || "";

// Add helper functions before runClaudeIteration (around line 35)
async function getCurrentTaskId(): Promise<string | null> {
  try {
    const file = Bun.file(TASK_FILE);
    if (await file.exists()) {
      const content = await file.text();
      if (!content.trim()) return null;
      const data = JSON.parse(content);
      return data.taskId || null;
    }
  } catch {
    // Ignore errors reading task file
  }
  return null;
}

async function streamLogToApi(taskId: string, logEntry: object): Promise<void> {
  try {
    await fetch(`${API_BASE_URL}/api/tasks/${taskId}/logs`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        ...(API_KEY ? { Authorization: `Bearer ${API_KEY}` } : {}),
      },
      body: JSON.stringify(logEntry),
    });
  } catch {
    // Fire and forget - don't block on API errors
  }
}
```

#### 2. Modify Stdout Processing Loop
**File**: `src/commands/runner.ts`
**Changes**: Add API streaming in the stdout loop (around line 77-126)

```typescript
// Inside stdoutPromise, after logFileHandle.write(text):
const taskId = await getCurrentTaskId();
if (taskId) {
  // Stream each parsed JSON line to API
  for (const line of lines) {
    if (line.trim() === "") continue;
    try {
      const json = JSON.parse(line.trim());
      // Don't await - fire and forget
      streamLogToApi(taskId, {
        type: json.type || "unknown",
        content: json,
        timestamp: new Date().toISOString(),
      });
    } catch {
      // Non-JSON lines also streamed
      if (line.trim()) {
        streamLogToApi(taskId, {
          type: "raw",
          content: line.trim(),
          timestamp: new Date().toISOString(),
        });
      }
    }
  }
}
```

#### 3. Modify Stderr Processing Loop
**File**: `src/commands/runner.ts`
**Changes**: Add API streaming in the stderr loop (around line 128-142)

```typescript
// Inside stderrPromise, after logFileHandle.write():
const taskId = await getCurrentTaskId();
if (taskId) {
  streamLogToApi(taskId, {
    type: "stderr",
    content: text,
    timestamp: new Date().toISOString(),
  });
}
```

### Success Criteria

#### Automated Verification:
- [ ] TypeScript compiles: `bun run tsc`
- [ ] Linting passes: `bun run lint`

#### Manual Verification:
- [ ] Start API server with logging enabled
- [ ] Run a worker and assign a task
- [ ] Observe POST requests to `/api/tasks/:id/logs` in API logs (will 404 until Phase 3)
- [ ] Verify runner doesn't crash or slow down due to API streaming

**Implementation Note**: After completing this phase and all automated verification passes, pause here for manual confirmation that the runner is attempting to stream logs before proceeding to Phase 3.

---

## Phase 3: API Server - Log Ingestion + SSE Broadcast

### Overview
Add three new endpoints to the HTTP server: POST for receiving logs, GET for retrieving stored logs, and GET with SSE for streaming new logs to subscribers.

### Changes Required

#### 1. Add SSE Subscriber Management
**File**: `src/http.ts`
**Changes**: Add subscriber tracking and broadcast helper near the top

```typescript
// Add after existing imports and before globalState definition (around line 25)
import { appendFile, mkdir } from "node:fs/promises";
import { createReadStream, existsSync } from "node:fs";
import { createInterface } from "node:readline";
import type { ServerResponse } from "node:http"; // if not already imported in http.ts

const LOG_DIR = process.env.LOG_DIR || "./logs";

// SSE subscribers per task
const taskLogSubscribers: Map<string, Set<ServerResponse>> = new Map();

function broadcastToTaskSubscribers(taskId: string, data: object): void {
  const subscribers = taskLogSubscribers.get(taskId);
  if (!subscribers || subscribers.size === 0) return;

  const message = `data: ${JSON.stringify(data)}\n\n`;
  for (const res of subscribers) {
    try {
      res.write(message);
    } catch {
      // Remove dead connections
      subscribers.delete(res);
    }
  }
}
```

#### 2. Add POST /api/tasks/:id/logs Endpoint
**File**: `src/http.ts`
**Changes**: Add log ingestion endpoint (add before the MCP endpoint section, around line 320)

```typescript
// POST /api/tasks/:id/logs - Receive log chunks from runner
if (
  req.method === "POST" &&
  pathSegments[0] === "api" &&
  pathSegments[1] === "tasks" &&
  pathSegments[2] &&
  pathSegments[3] === "logs" &&
  !pathSegments[4]
) {
  const taskId = pathSegments[2];

  // Read request body
  let body = "";
  for await (const chunk of req) {
    body += chunk;
  }

  let logEntry: object;
  try {
    logEntry = JSON.parse(body);
  } catch {
    res.writeHead(400, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ error: "Invalid JSON" }));
    return;
  }

  // Add receivedAt timestamp
  const enrichedEntry = {
    ...logEntry,
    receivedAt: new Date().toISOString(),
  };

  // Ensure log directory exists
  await mkdir(LOG_DIR, { recursive: true });

  // Append to task log file
  const logFile = `${LOG_DIR}/${taskId}.jsonl`;
  await appendFile(logFile, JSON.stringify(enrichedEntry) + "\n");

  // Broadcast to SSE subscribers
  broadcastToTaskSubscribers(taskId, enrichedEntry);

  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify({ success: true }));
  return;
}
```
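
For orientation, each appended line in `./logs/{taskId}.jsonl` is just the runner payload plus the server-side `receivedAt` field; the sketch below is illustrative only, and the nested `content` value depends on Claude CLI's stream-json output:

```typescript
// Illustrative only: roughly what one stored JSONL line contains after ingestion.
// `type`, `content`, and `timestamp` come from the runner (Phase 2); `receivedAt`
// is added by this endpoint. The `content` object here is a made-up placeholder.
const exampleStoredLine = JSON.stringify({
  type: "assistant",
  content: { type: "assistant", message: "…" },
  timestamp: "2025-12-19T10:00:00.000Z",
  receivedAt: "2025-12-19T10:00:00.150Z",
});
console.log(exampleStoredLine);
```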

#### 3. Add GET /api/tasks/:id/logs Endpoint
**File**: `src/http.ts`
**Changes**: Add log retrieval endpoint

```typescript
// GET /api/tasks/:id/logs - Retrieve stored logs
if (
  req.method === "GET" &&
  pathSegments[0] === "api" &&
  pathSegments[1] === "tasks" &&
  pathSegments[2] &&
  pathSegments[3] === "logs" &&
  !pathSegments[4]
) {
  const taskId = pathSegments[2];
  const logFile = `${LOG_DIR}/${taskId}.jsonl`;

  if (!existsSync(logFile)) {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ logs: [] }));
    return;
  }

  // Read and parse JSONL file
  const logs: object[] = [];
  const fileStream = createReadStream(logFile);
  const rl = createInterface({ input: fileStream, crlfDelay: Infinity });

  for await (const line of rl) {
    if (line.trim()) {
      try {
        logs.push(JSON.parse(line));
      } catch {
        // Skip malformed lines
      }
    }
  }

  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify({ logs }));
  return;
}
```

#### 4. Add GET /api/tasks/:id/logs/stream SSE Endpoint
**File**: `src/http.ts`
**Changes**: Add SSE streaming endpoint

```typescript
// GET /api/tasks/:id/logs/stream - SSE subscription for new logs
if (
  req.method === "GET" &&
  pathSegments[0] === "api" &&
  pathSegments[1] === "tasks" &&
  pathSegments[2] &&
  pathSegments[3] === "logs" &&
  pathSegments[4] === "stream"
) {
  const taskId = pathSegments[2];

  // Set SSE headers
  res.writeHead(200, {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    Connection: "keep-alive",
    "Access-Control-Allow-Origin": "*",
  });

  // Send initial connection message
  res.write(`data: ${JSON.stringify({ type: "connected", taskId })}\n\n`);

  // Add to subscribers
  if (!taskLogSubscribers.has(taskId)) {
    taskLogSubscribers.set(taskId, new Set());
  }
  taskLogSubscribers.get(taskId)!.add(res);

  // Cleanup on close
  req.on("close", () => {
    const subscribers = taskLogSubscribers.get(taskId);
    if (subscribers) {
      subscribers.delete(res);
      if (subscribers.size === 0) {
        taskLogSubscribers.delete(taskId);
      }
    }
  });

  // Keep connection alive with periodic heartbeat
  const heartbeat = setInterval(() => {
    try {
      res.write(`: heartbeat\n\n`);
    } catch {
      clearInterval(heartbeat);
    }
  }, 30000);

  req.on("close", () => clearInterval(heartbeat));

  return;
}
```

### Success Criteria

#### Automated Verification:
- [ ] TypeScript compiles: `bun run tsc`
- [ ] Linting passes: `bun run lint`

#### Manual Verification:
- [ ] Start API server
- [ ] POST a test log: `curl -X POST http://localhost:3013/api/tasks/test-id/logs -H "Content-Type: application/json" -d '{"type":"test","content":"hello"}'`
- [ ] GET the logs: `curl http://localhost:3013/api/tasks/test-id/logs`
- [ ] Subscribe to SSE: `curl -N http://localhost:3013/api/tasks/test-id/logs/stream`
- [ ] POST another log and see it appear in the SSE stream
- [ ] Verify log file exists at `./logs/test-id.jsonl`

**Implementation Note**: After completing this phase and all automated verification passes, pause here for manual confirmation that the API endpoints work correctly before proceeding to Phase 4.

---

## Phase 4: Frontend - SSE Subscription

### Overview
Add SSE subscription helper and modify TaskDetailPanel to use real-time log streaming for in-progress tasks.

### Changes Required

#### 1. Add StreamingLogEntry Type
**File**: `ui/src/types/api.ts`
**Changes**: Add type definition for streaming logs

```typescript
// Add after AgentLog interface (around line 48)
export interface StreamingLogEntry {
  type: string;
  content: unknown;
  timestamp: string;
  receivedAt?: string;
}
```

#### 2. Add SSE Subscription Helper
**File**: `ui/src/lib/api.ts`
**Changes**: Add SSE subscription function

```typescript
// Add after existing methods in ApiClient class (before the closing brace)
subscribeToTaskLogs(
  taskId: string,
  onLog: (log: StreamingLogEntry) => void,
  onError?: (error: Event) => void,
  onConnected?: () => void
): () => void {
  const url = `${this.getBaseUrl()}/api/tasks/${taskId}/logs/stream`;
  const eventSource = new EventSource(url);

  eventSource.onmessage = (event) => {
    try {
      const data = JSON.parse(event.data);
      if (data.type === "connected") {
        onConnected?.();
      } else {
        onLog(data);
      }
    } catch {
      // Ignore parse errors
    }
  };

  eventSource.onerror = (error) => {
    onError?.(error);
  };

  // Return cleanup function
  return () => {
    eventSource.close();
  };
}
```

Also add the import at the top:
```typescript
import type { StreamingLogEntry } from "@/types/api";
```

#### 3. Add Streaming Logs Hook
**File**: `ui/src/hooks/useStreamingLogs.ts` (new file)
**Changes**: Create custom hook for SSE subscription

```typescript
import { useEffect, useState, useCallback } from "react";
import { api } from "@/lib/api";
import type { StreamingLogEntry } from "@/types/api";

export function useStreamingLogs(taskId: string, enabled: boolean) {
  const [logs, setLogs] = useState<StreamingLogEntry[]>([]);
  const [isConnected, setIsConnected] = useState(false);
  const [error, setError] = useState<Event | null>(null);

  const clearLogs = useCallback(() => {
    setLogs([]);
  }, []);

  useEffect(() => {
    if (!enabled || !taskId) {
      setIsConnected(false);
      return;
    }

    const unsubscribe = api.subscribeToTaskLogs(
      taskId,
      (log) => {
        setLogs((prev) => [...prev, log]);
      },
      (err) => {
        setError(err);
        setIsConnected(false);
      },
      () => {
        setIsConnected(true);
        setError(null);
      }
    );

    return () => {
      unsubscribe();
      setIsConnected(false);
    };
  }, [taskId, enabled]);

  return { logs, isConnected, error, clearLogs };
}
```

#### 4. Update TaskDetailPanel
**File**: `ui/src/components/TaskDetailPanel.tsx`
**Changes**: Add streaming logs display for in-progress tasks

```typescript
// Add import at top
import { useStreamingLogs } from "@/hooks/useStreamingLogs";

// Inside TaskDetailPanel component, after existing hooks (around line 30)
const isInProgress = task?.status === "in_progress";
const { logs: streamingLogs, isConnected } = useStreamingLogs(
  taskId,
  isInProgress
);

// Add a new section in the UI to display streaming logs
// This should be added in the progress section area (around line 260-324)
// Add after the existing progress logs display:

{isInProgress && (
  <Box sx={{ mt: 2 }}>
    <Typography level="title-sm" sx={{ mb: 1 }}>
      Live Output {isConnected && <Chip size="sm" color="success">Connected</Chip>}
    </Typography>
    <Box
      sx={{
        maxHeight: 300,
        overflow: "auto",
        bgcolor: "background.level1",
        borderRadius: "sm",
        p: 1,
        fontFamily: "monospace",
        fontSize: "xs",
      }}
    >
      {streamingLogs.length === 0 ? (
        <Typography level="body-sm" sx={{ color: "text.tertiary" }}>
          Waiting for logs...
        </Typography>
      ) : (
        streamingLogs.map((log, i) => (
          <Box key={i} sx={{ py: 0.5, borderBottom: "1px solid", borderColor: "divider" }}>
            <Typography
              level="body-xs"
              sx={{
                color: log.type === "stderr" ? "danger.500" : "text.primary",
                whiteSpace: "pre-wrap",
                wordBreak: "break-word",
              }}
            >
              {typeof log.content === "string"
                ? log.content
                : JSON.stringify(log.content, null, 2)}
            </Typography>
            <Typography level="body-xs" sx={{ color: "text.tertiary", fontSize: "10px" }}>
              {log.timestamp}
            </Typography>
          </Box>
        ))
      )}
    </Box>
  </Box>
)}
```

Also add the Chip import if not already present:
```typescript
import { Chip } from "@mui/joy";
```

### Success Criteria

#### Automated Verification:
- [ ] TypeScript compiles: `bun run tsc`
- [ ] Linting passes: `bun run lint`
- [ ] Frontend builds: `cd ui && bun run build`

#### Manual Verification:
- [ ] Start API server and frontend
- [ ] Create a task and assign it to a worker
- [ ] Open TaskDetailPanel for the in-progress task
- [ ] Verify "Live Output" section appears with "Connected" indicator
- [ ] Observe logs appearing in real-time as worker executes
- [ ] Verify logs stop streaming when task completes
- [ ] Refresh page and verify historical logs are still visible

**Implementation Note**: After completing this phase and all automated verification passes, pause here for manual confirmation that the full end-to-end streaming works correctly before proceeding to Phase 5.

---

## Phase 5: Backend Types

### Overview
Add the StreamingLogEntry type to the backend for consistency.

### Changes Required

#### 1. Add StreamingLogEntry Type
**File**: `src/types.ts`
**Changes**: Add type definition

```typescript
// Add after AgentLogSchema (around line 138)
export const StreamingLogEntrySchema = z.object({
  type: z.string(),
  content: z.unknown(),
  timestamp: z.iso.datetime(),
  receivedAt: z.iso.datetime().optional(),
});

export type StreamingLogEntry = z.infer<typeof StreamingLogEntrySchema>;
```

### Success Criteria

#### Automated Verification:
- [ ] TypeScript compiles: `bun run tsc`
- [ ] Linting passes: `bun run lint`

#### Manual Verification:
- [ ] Types are consistent between frontend and backend

---

## Testing Strategy

### Unit Tests
- Hook task file write/clear operations
- Runner task ID reading (see the sketch below)
- API log file parsing
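
As a starting point for the unit tests above, a minimal sketch using Bun's built-in test runner; it exercises the `/tmp/.task.json` contract directly rather than importing the hook or runner, and the inlined `readTaskId` mirrors the Phase 2 helper for illustration:

```typescript
import { describe, expect, test } from "bun:test";

const TASK_FILE = "/tmp/.task.json";

// Mirrors getCurrentTaskId from Phase 2 so the test stays self-contained.
async function readTaskId(): Promise<string | null> {
  const file = Bun.file(TASK_FILE);
  if (!(await file.exists())) return null;
  const content = await file.text();
  if (!content.trim()) return null;
  try {
    return JSON.parse(content).taskId ?? null;
  } catch {
    return null;
  }
}

describe("task file contract", () => {
  test("taskId written by the hook round-trips through the file", async () => {
    await Bun.write(
      TASK_FILE,
      JSON.stringify({ taskId: "task-123", assignedAt: new Date().toISOString() }),
    );
    expect(await readTaskId()).toBe("task-123");
  });

  test("a cleared file yields null", async () => {
    await Bun.write(TASK_FILE, "");
    expect(await readTaskId()).toBeNull();
  });
});
```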

### Integration Tests
- Full flow: hook writes task ID → runner streams → API stores → SSE broadcasts
- Multiple concurrent task subscriptions
- Connection cleanup on client disconnect

### Manual Testing Steps
1. Start API server: `bun run dev:http`
2. Start a worker agent
3. Create and assign a task via Slack or API
4. Open frontend dashboard and select the task
5. Verify "Live Output" section shows real-time logs
6. Wait for task completion
7. Refresh page and verify logs persist
8. Check `./logs/{taskId}.jsonl` file exists with correct content

## Performance Considerations

- **Fire-and-forget streaming**: Runner doesn't await API responses to avoid blocking Claude CLI processing
- **SSE heartbeat**: 30-second interval prevents connection timeout
- **Subscriber cleanup**: Dead connections are removed on write failure
- **File append**: Logs are appended, not rewritten, for efficiency

## Migration Notes

- No database migration required (logs stored on filesystem)
- Existing session-based logs in `./logs/{sessionId}/` are unaffected
- New task-based logs stored in `./logs/{taskId}.jsonl`

## References

- Related research: `thoughts/shared/research/2025-12-19-agent-log-streaming.md`
- Hook event handling: `src/hooks/hook.ts:172-188`
- Runner stdout capture: `src/commands/runner.ts:77-126`
- Existing SSE pattern: `src/http.ts:339-347` (MCP transport)
- Frontend polling: `ui/src/main.tsx:13`