@chuckssmith/agentloom 0.8.0 → 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -14,9 +14,9 @@ loom setup
 Claude Code is the execution engine. agentloom adds:
 
 - **`$grind`** — persistence loop that keeps working until a task is verified complete
- - **`$crew`** — parallel workers that decompose and execute simultaneously
+ - **`$crew`** — parallel workers that decompose and execute simultaneously
 - **`$architect`** — deep analysis mode before major decisions
- - **`loom crew`** — CLI to spawn and monitor a crew from your terminal
+ - **`loom crew`** — CLI to spawn, monitor, and collect results from a crew
 
 It does not replace Claude Code. It wraps it.
 
@@ -27,19 +27,24 @@ It does not replace Claude Code. It wraps it.
 ```bash
 npm install -g @chuckssmith/agentloom
 loom setup # installs $grind, $crew, $architect skills + validates deps
+ loom init # create .loomrc config in current project (optional)
 
- # Spawn workers from your terminal:
+ # Spawn workers:
 loom crew "audit every API endpoint for security issues"
+ loom crew 3 "refactor the auth module"
 loom crew 2:explore+1:code-reviewer "review the payment flow"
- loom crew --dry-run 3 "migrate the database schema" # preview before launching
+
+ # Useful flags:
+ loom crew --dry-run 3 "migrate the schema" # preview decomposed subtasks first
+ loom crew --watch "audit the codebase" # launch + immediately tail logs
+ loom crew --serial 3 "build the pipeline" # run workers sequentially
 
 # Monitor:
- loom watch # live tail all worker logs
- loom status # session overview + stale worker detection
- loom logs w00 # full output for one worker
+ loom watch # live color-coded tail of all worker logs
+ loom status # session overview + per-worker liveness (PID-aware)
 
 # After workers finish:
- loom collect # synthesize results with Claude
+ loom collect # synthesize results with Claude into summary.md
 loom reset --force # clear state for next run
 
 # Or use inside any Claude Code session:
@@ -53,46 +58,58 @@ loom reset --force # clear state for next run
 
 Install with `loom setup`. Use inside any Claude Code session:
 
- | Skill | Trigger | What it does |
- |---|---|---|
- | `$grind` | `$grind "<task>"` | Persistence loop — plans, executes in parallel, verifies. Won't stop until a code-reviewer subagent returns PASS |
- | `$crew` | `$crew "<task>"` | Decomposes task into independent streams, runs workers simultaneously, verifies result |
- | `$architect` | `$architect "<task>"` | Deep analysis — maps the system, finds real problems, recommends approach before you write code |
+ | Skill | What it does |
+ |---|---|
+ | `$grind` | Persistence loop — plans, executes in parallel, verifies. Won't stop until a code-reviewer returns PASS |
+ | `$crew` | Decomposes task into independent streams, runs workers simultaneously, verifies result |
+ | `$architect` | Deep analysis — maps the system, finds real problems, recommends approach |
 
 ---
 
 ## CLI reference
 
+ ### Setup
+
+ ```
+ loom init Create .loomrc in current directory (see Configuration below)
+ loom setup Install skills to ~/.claude/skills/, validate claude + tmux
+ ```
+
 ### Spawning workers
 
 ```
- loom crew "<task>" 2 general-purpose workers (default)
- loom crew 3 "<task>" 3 workers
- loom crew 2:explore "<task>" 2 explore-type workers
- loom crew 2:explore+1:code-reviewer "<task>" typed crew
- loom crew --dry-run 3 "<task>" preview decomposed subtasks, no launch
+ loom crew "<task>" Use defaults from .loomrc (or 2 general-purpose)
+ loom crew 3 "<task>" 3 workers
+ loom crew 2:explore "<task>" 2 explore-type workers
+ loom crew 2:explore+1:code-reviewer "<task>" Typed crew
+
+ Flags (combinable):
+ --dry-run Preview AI-decomposed subtasks without launching
+ --watch Launch then immediately tail all worker logs
+ --serial Run workers sequentially — each worker reads prior results from context file
 ```
 
 ### Monitoring
 
 ```
- loom watch Live tail all worker logs with color-coded output
- loom status Session overview, task counts, stale worker detection
- loom logs Summary of all workers (status + last line)
+ loom watch Live color-coded tail (auto-exits when all workers done)
+ loom status Session overview, task counts, per-worker liveness
+ loom logs Summary: all workers, status, last log line
 loom logs <workerId> Full log + result for one worker (e.g. loom logs w00)
 ```
 
 ### After workers finish
 
 ```
- loom collect Read worker results + synthesize summary with Claude
+ loom collect Read worker results + synthesize with Claude into .claude-team/summary.md
 loom collect --no-ai Concatenate results without Claude synthesis
 ```
 
 ### Housekeeping
 
 ```
- loom setup Install skills to ~/.claude/skills/, validate deps
+ loom stop Kill all background workers (SIGTERM)
+ loom stop <workerId> Kill one worker
 loom reset --force Wipe .claude-team/ state
 ```
 
@@ -100,7 +117,7 @@ loom reset --force Wipe .claude-team/ state
 
 ## Worker types
 
- Each type gets a role-specific system prompt that shapes its behavior:
+ Each type gets a role-specific system prompt. Read-only roles do **not** receive `--dangerously-skip-permissions`.
 
 | Type | Role | Modifies files? |
 |---|---|---|
@@ -112,20 +129,45 @@ Each type gets a role-specific system prompt that shapes its behavior:
 
 ---
 
+ ## Configuration
+
+ Run `loom init` to create a `.loomrc` in your project directory:
+
+ ```json
+ {
+ "workers": 2,
+ "agentType": "general-purpose",
+ "claimTtlMinutes": 30,
+ "staleMinutes": 10
+ }
+ ```
+
+ | Key | Default | Description |
+ |---|---|---|
+ | `workers` | 2 | Default worker count when none specified |
+ | `agentType` | `general-purpose` | Default agent type when none specified |
+ | `claimTtlMinutes` | 30 | Minutes before a crashed worker's claimed task is re-queued |
+ | `staleMinutes` | 10 | Minutes of dead-pid + log silence before worker is flagged STALE |
+
+ ---
+
 ## State directory
 
 Session state lives in `.claude-team/` (gitignored):
 
 ```
 .claude-team/
- session.json Active session metadata
- context/ Shared context snapshots (workers read + append)
- tasks/ Task queue — workers claim atomically via file rename
+ session.json Active session metadata
+ context/ Shared context snapshots (workers read + append)
+ tasks/ Task queue — workers claim atomically via file rename
+ Stale claimed tasks (>claimTtlMinutes) auto re-queued
 workers/
- w00.log Live stdout from worker 00
- w00-prompt.md Prompt sent to worker 00
- w00-result.md Result summary written by worker 00 on completion
- summary.md Final synthesis from loom collect
+ w00.log Live stdout from worker 00
+ w00.pid PID of worker 00 process
+ w00-prompt.md Prompt sent to worker 00
+ w00-result.md Result summary written by worker 00 on completion
+ w00-run.mjs Node.js runner script (tmux mode)
+ summary.md Final synthesis from loom collect
 ```
 
 ---
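Editor's note: the `tasks/` entry above says workers claim tasks "atomically via file rename". As a minimal sketch of that pattern (the file names, task fields, and helper below are illustrative, not the package's actual implementation): `rename()` is atomic on POSIX filesystems, so when two workers race for the same pending task file, exactly one rename succeeds and the loser gets an error.

```javascript
// Sketch of claim-by-rename, assuming tasks are JSON files named
// `<id>-pending.json` in a shared directory. Illustrative only.
import { rename, readFile } from 'fs/promises';
import { join } from 'path';

async function claimTask(tasksDir, taskId, workerId) {
  const pending = join(tasksDir, `${taskId}-pending.json`);
  const claimed = join(tasksDir, `${taskId}-claimed.json`);
  try {
    // Atomic: at most one racing worker's rename succeeds.
    await rename(pending, claimed);
  } catch {
    return null; // another worker already claimed this task
  }
  const task = JSON.parse(await readFile(claimed, 'utf8'));
  return { ...task, claimedBy: workerId, claimedAt: new Date().toISOString() };
}
```

A second worker calling `claimTask` for the same task id gets `null` back, which is what makes a plain filesystem safe to use as a queue here.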
@@ -134,7 +176,7 @@ Session state lives in `.claude-team/` (gitignored):
 
 - Node.js 20+
 - Claude Code CLI (`claude`) — authenticated
- - tmux (optional — used on Mac/Linux; falls back to background processes on Windows/WSL)
+ - tmux (optional — used on Mac/Linux interactive terminals; background processes used on Windows/WSL/CI)
 
 ---
 
package/dist/cli.js CHANGED
@@ -7,16 +7,19 @@ import { collect } from './commands/collect.js';
 import { reset } from './commands/reset.js';
 import { watch } from './commands/watch.js';
 import { stop } from './commands/stop.js';
+ import { init } from './commands/init.js';
 const [, , command, ...args] = process.argv;
 const usage = `
 agentloom (loom) — workflow layer for Claude Code
 
 Usage:
+ loom init Create .loomrc config in current directory
 loom setup Install skills and initialize state dir
 loom crew [N] "<task>" Spawn N parallel workers on a task
 loom crew 2:explore "<task>" Spawn typed workers (explore/plan/code-reviewer)
 loom crew --dry-run [N] "<task>" Preview decomposed subtasks without launching
 loom crew --serial [N] "<task>" Run workers sequentially (each sees prior results)
+ loom crew --watch [N] "<task>" Launch workers and immediately tail logs
 loom watch Live tail all worker logs (Ctrl+C to stop)
 loom stop Kill all background workers (SIGTERM)
 loom stop <workerId> Kill one worker
@@ -48,6 +51,9 @@ Examples:
 loom collect
 `;
 switch (command) {
+ case 'init':
+ await init(args);
+ break;
 case 'setup':
 await setup();
 break;
@@ -2,7 +2,7 @@ import { readFile, readdir, writeFile } from 'fs/promises';
 import { join } from 'path';
 import { existsSync } from 'fs';
 import { spawnSync } from 'child_process';
- import { STATE_DIR, readSession, readTasks } from '../state/session.js';
+ import { STATE_DIR, readSession } from '../state/session.js';
 const WORKERS_DIR = join(STATE_DIR, 'workers');
 export async function collect(args) {
 if (!existsSync(WORKERS_DIR)) {
@@ -10,7 +10,6 @@ export async function collect(args) {
 return;
 }
 const session = await readSession();
- const tasks = await readTasks();
 const files = await readdir(WORKERS_DIR);
 const resultFiles = files.filter(f => f.endsWith('-result.md')).sort();
 if (resultFiles.length === 0) {
@@ -34,7 +33,7 @@ export async function collect(args) {
 const synthesize = !args.includes('--no-ai');
 let synthesis = '';
 if (synthesize) {
- console.log('\nSynthesizing with Claude...');
+ console.log('\nSynthesizing with Claude... (may take up to 60s)');
 const prompt = `You are summarizing the results of a multi-agent crew that worked on this task:
 
 "${taskDesc}"
@@ -4,6 +4,8 @@ import { join } from 'path';
 import { existsSync } from 'fs';
 import { parseWorkerSpec, initSession, writeContextSnapshot, decomposeTasks, } from '../team/orchestrator.js';
 import { STATE_DIR } from '../state/session.js';
+ import { watch } from './watch.js';
+ import { loadConfig } from '../config.js';
 const hasTmux = () => {
 try {
 execSync('tmux -V', { stdio: 'ignore' });
@@ -69,8 +71,11 @@ export async function crew(args) {
 }
 const dryRun = args.includes('--dry-run');
 const serial = args.includes('--serial');
- const filteredArgs = args.filter(a => a !== '--dry-run' && a !== '--serial');
- const { specs, task } = parseWorkerSpec(filteredArgs);
+ const watchAfter = args.includes('--watch');
+ const filteredArgs = args.filter(a => !['--dry-run', '--serial', '--watch'].includes(a));
+ const config = await loadConfig();
+ const forcePermissions = config.dangerouslySkipPermissions === true;
+ const { specs, task } = parseWorkerSpec(filteredArgs, config.workers, config.agentType);
 const totalWorkers = specs.reduce((sum, s) => sum + s.count, 0);
 const slug = task.slice(0, 30).toLowerCase().replace(/\s+/g, '-').replace(/[^a-z0-9-]/g, '');
 console.log(`\nagentloom crew`);
@@ -95,23 +100,28 @@ export async function crew(args) {
 const mode = serial ? 'serial' : useTmux ? 'tmux' : 'background processes';
 console.log(`Mode: ${mode}\n`);
 const session = await initSession(task, totalWorkers);
- const contextPath = await writeContextSnapshot(slug, task);
+ const contextPath = await writeContextSnapshot(slug, session.id, task);
 const tasks = await decomposeTasks(task, specs);
 console.log(`Session: ${session.id}`);
 console.log(`Tasks: ${tasks.length} created`);
 console.log(`Context: ${contextPath}\n`);
 if (serial) {
- await launchSerial(session.id, specs, tasks, contextPath);
+ await launchSerial(session.id, specs, tasks, contextPath, forcePermissions);
 console.log(`\nAll workers finished. Run: loom collect`);
 }
 else if (useTmux) {
- await launchTmux(session.id, specs, tasks, contextPath);
+ await launchTmux(session.id, specs, tasks, contextPath, forcePermissions);
 console.log(`\nWorkers launched. Monitor with:`);
 console.log(` loom status`);
 console.log(` loom stop (kill all workers)`);
 }
 else {
- await launchBackground(session.id, specs, tasks, contextPath);
+ await launchBackground(session.id, specs, tasks, contextPath, forcePermissions);
+ if (watchAfter) {
+ console.log();
+ await watch([]);
+ return;
+ }
 console.log(`\nWorkers launched. Monitor with:`);
 console.log(` loom status`);
 console.log(` loom watch`);
@@ -119,7 +129,7 @@ export async function crew(args) {
 }
 console.log(`State dir: ${STATE_DIR}/`);
 }
- async function launchSerial(sessionId, specs, tasks, contextPath) {
+ async function launchSerial(sessionId, specs, tasks, contextPath, forcePermissions = false) {
 await mkdir(join(STATE_DIR, 'workers'), { recursive: true });
 let workerIdx = 0;
 for (const spec of specs) {
@@ -135,7 +145,7 @@ async function launchSerial(sessionId, specs, tasks, contextPath) {
 console.log(` → Worker ${workerId} (${agentType}) starting...`);
 const claudeArgs = [
 '--print',
- ...(!READ_ONLY_ROLES.has(agentType) ? ['--dangerously-skip-permissions'] : []),
+ ...(forcePermissions || !READ_ONLY_ROLES.has(agentType) ? ['--dangerously-skip-permissions'] : []),
 '-p',
 prompt,
 ];
@@ -164,7 +174,7 @@ async function launchSerial(sessionId, specs, tasks, contextPath) {
 }
 }
 }
- async function launchBackground(sessionId, specs, tasks, contextPath) {
+ async function launchBackground(sessionId, specs, tasks, contextPath, forcePermissions = false) {
 await mkdir(join(STATE_DIR, 'workers'), { recursive: true });
 let workerIdx = 0;
 for (const spec of specs) {
@@ -180,7 +190,7 @@ async function launchBackground(sessionId, specs, tasks, contextPath) {
 // Build args declaratively — no positional splicing
 const claudeArgs = [
 '--print',
- ...(!READ_ONLY_ROLES.has(agentType) ? ['--dangerously-skip-permissions'] : []),
+ ...(forcePermissions || !READ_ONLY_ROLES.has(agentType) ? ['--dangerously-skip-permissions'] : []),
 '-p',
 prompt,
 ];
@@ -203,7 +213,7 @@ async function launchBackground(sessionId, specs, tasks, contextPath) {
 }
 }
 }
- async function launchTmux(sessionId, specs, tasks, contextPath) {
+ async function launchTmux(sessionId, specs, tasks, contextPath, forcePermissions = false) {
 const tmuxSession = `loom-${sessionId}`;
 // Check for session name collision
 const existing = spawnSync('tmux', ['has-session', '-t', tmuxSession], { stdio: 'ignore' });
@@ -237,7 +247,7 @@ async function launchTmux(sessionId, specs, tasks, contextPath) {
 `process.env.AGENTLOOM_WORKER_ID = ${JSON.stringify(workerId)}`,
 `process.env.AGENTLOOM_SESSION = ${JSON.stringify(sessionId)}`,
 `const prompt = readFileSync(${JSON.stringify(promptFile)}, 'utf8')`,
- `const args = ['--print', ${!READ_ONLY_ROLES.has(agentType) ? `'--dangerously-skip-permissions', ` : ``}'${'-p'}', prompt]`,
+ `const args = ['--print', ${(forcePermissions || !READ_ONLY_ROLES.has(agentType)) ? `'--dangerously-skip-permissions', ` : ``}'${'-p'}', prompt]`,
 `const r = spawnSync('claude', args, { stdio: 'inherit' })`,
 `console.log('[worker done]')`,
 `process.exit(r.status ?? 0)`,
@@ -0,0 +1 @@
+ export declare function init(_args: string[]): Promise<void>;
@@ -0,0 +1,4 @@
+ import { initConfig } from '../config.js';
+ export async function init(_args) {
+ await initConfig();
+ }
@@ -1,5 +1,5 @@
 import { readFile, readdir } from 'fs/promises';
- import { join } from 'path';
+ import { join, basename } from 'path';
 import { existsSync } from 'fs';
 import { STATE_DIR } from '../state/session.js';
 const WORKERS_DIR = join(STATE_DIR, 'workers');
@@ -8,7 +8,9 @@ export async function logs(args) {
 console.log('No worker logs found. Run: loom crew "<task>"');
 return;
 }
- const workerId = args[0];
+ // Sanitize workerId to prevent path traversal
+ const rawId = args[0];
+ const workerId = rawId ? basename(rawId) : undefined;
 if (workerId) {
 // Show log for a specific worker
 const logFile = join(WORKERS_DIR, `${workerId}.log`);
@@ -1,6 +1,8 @@
- import { rm } from 'fs/promises';
+ import { rm, readdir, readFile } from 'fs/promises';
 import { existsSync } from 'fs';
+ import { join } from 'path';
 import { STATE_DIR } from '../state/session.js';
+ const WORKERS_DIR = join(STATE_DIR, 'workers');
 export async function reset(args) {
 if (!existsSync(STATE_DIR)) {
 console.log('Nothing to reset.');
@@ -8,10 +10,33 @@ export async function reset(args) {
 }
 const force = args.includes('--force') || args.includes('-f');
 if (!force) {
- console.log(`This will delete all session state in ${STATE_DIR}/`);
+ console.log(`This will kill running workers and delete all session state in ${STATE_DIR}/`);
 console.log('Run with --force to confirm: loom reset --force');
 return;
 }
+ // Kill any live workers before deleting their PID files
+ if (existsSync(WORKERS_DIR)) {
+ try {
+ const files = await readdir(WORKERS_DIR);
+ const pidFiles = files.filter(f => f.endsWith('.pid'));
+ for (const pidFile of pidFiles) {
+ const pid = parseInt(await readFile(join(WORKERS_DIR, pidFile), 'utf8').catch(() => ''), 10);
+ if (!isNaN(pid)) {
+ try {
+ process.kill(pid, 'SIGTERM');
+ const workerId = pidFile.replace('.pid', '');
+ console.log(` killed worker ${workerId} (pid ${pid})`);
+ }
+ catch {
+ // Process already dead — ignore
+ }
+ }
+ }
+ }
+ catch {
+ // Workers dir unreadable — proceed with delete anyway
+ }
+ }
 await rm(STATE_DIR, { recursive: true, force: true });
 console.log(`✓ Session state cleared (${STATE_DIR}/)`);
 }
@@ -19,7 +19,14 @@ export async function setup() {
 }
 // 2. Install skills
 await mkdir(SKILLS_DEST, { recursive: true });
- const skills = await readdir(SKILLS_SRC);
+ let skills;
+ try {
+ skills = await readdir(SKILLS_SRC);
+ }
+ catch {
+ console.error('✗ Could not find skills directory — package may be misconfigured');
+ process.exit(1);
+ }
 for (const skill of skills.filter(f => f.endsWith('.md'))) {
 const dest = join(SKILLS_DEST, skill);
 await copyFile(join(SKILLS_SRC, skill), dest);
@@ -2,7 +2,7 @@ import { readSession, readTasks, STATE_DIR } from '../state/session.js';
 import { existsSync, statSync } from 'fs';
 import { join } from 'path';
 import { readdir, readFile } from 'fs/promises';
- const STALE_THRESHOLD_MS = 10 * 60 * 1000; // 10 min with no log growth AND no live PID = stale
+ import { loadConfig } from '../config.js';
 function isProcessAlive(pid) {
 try {
 process.kill(pid, 0);
@@ -40,6 +40,8 @@ export async function status() {
 const logFiles = files.filter(f => f.endsWith('.log')).sort();
 if (logFiles.length === 0)
 return;
+ const config = await loadConfig();
+ const staleThresholdMs = config.staleMinutes * 60 * 1000;
 console.log(`\nWorkers: ${logFiles.length}`);
 const now = Date.now();
 for (const logFile of logFiles) {
@@ -72,7 +74,7 @@ export async function status() {
 const secs = Math.round(msSinceWrite / 1000);
 console.log(` [${workerId}] running (pid alive, last log ${secs}s ago)`);
 }
- else if (msSinceWrite > STALE_THRESHOLD_MS) {
+ else if (msSinceWrite > staleThresholdMs) {
 const mins = Math.round(msSinceWrite / 60000);
 console.log(` [${workerId}] STALE — pid dead, no log activity for ${mins}m`);
 }
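Editor's note: the liveness check above rests on `process.kill(pid, 0)`, which delivers no signal but still performs the existence and permission checks. A standalone sketch of the idea; the EPERM handling here is my own addition (treating "exists but not signalable" as alive), not the package's behavior:

```javascript
// Sketch of a PID liveness probe like the one `loom status` uses.
// Signal 0 throws ESRCH when no such process exists.
function probeAlive(pid) {
  try {
    process.kill(pid, 0);
    return true;
  } catch (err) {
    // EPERM: the process exists but belongs to another user.
    // Counting that as alive is a judgment call, not package behavior.
    return err.code === 'EPERM';
  }
}
```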
@@ -45,6 +45,9 @@ export async function stop(args) {
 }
 console.log(`\n${killed} killed, ${notFound} already stopped.`);
 if (killed > 0) {
+ if (process.platform === 'win32') {
+ console.log(' note: SIGTERM on Windows is a force kill (TerminateProcess)');
+ }
 console.log('State preserved. Run: loom reset --force to clear it.');
 }
 }
@@ -4,24 +4,39 @@ import { existsSync } from 'fs';
 import { STATE_DIR } from '../state/session.js';
 const WORKERS_DIR = join(STATE_DIR, 'workers');
 const POLL_MS = 800;
- // A rotating set of ANSI colors for worker prefixes
+ const STALE_TIMEOUT_MS = 15 * 60 * 1000; // 15 min no log growth + dead PID = give up
 const COLORS = ['\x1b[36m', '\x1b[33m', '\x1b[35m', '\x1b[32m', '\x1b[34m', '\x1b[31m'];
 const RESET = '\x1b[0m';
 const DIM = '\x1b[2m';
+ const YELLOW = '\x1b[33m';
+ function isProcessAlive(pid) {
+ try {
+ process.kill(pid, 0);
+ return true;
+ }
+ catch {
+ return false;
+ }
+ }
 export async function watch(_args) {
 if (!existsSync(WORKERS_DIR)) {
 console.log('No active session. Run: loom crew "<task>"');
 process.exit(1);
 }
 console.log(`${DIM}Watching worker logs. Ctrl+C to stop.${RESET}\n`);
- // Track how many bytes we've read from each log file
 const offsets = {};
+ const lastGrowth = {};
 const seen = new Set();
- // eslint-disable-next-line no-constant-condition
 while (true) {
 if (!existsSync(WORKERS_DIR))
 break;
- const files = await readdir(WORKERS_DIR);
+ let files;
+ try {
+ files = await readdir(WORKERS_DIR);
+ }
+ catch {
+ break;
+ }
 const logFiles = files.filter(f => f.endsWith('.log')).sort();
 for (const logFile of logFiles) {
 const workerId = logFile.replace('.log', '');
@@ -32,20 +47,27 @@ export async function watch(_args) {
 const resultExists = existsSync(join(WORKERS_DIR, `${workerId}-result.md`));
 console.log(`${color}[${workerId}]${RESET} ${DIM}started${resultExists ? ' (already done)' : ''}${RESET}`);
 }
- const currentSize = (await stat(filePath)).size;
+ // Guard stat — file may be deleted mid-poll (e.g. loom reset)
+ let currentSize;
+ try {
+ currentSize = (await stat(filePath)).size;
+ }
+ catch {
+ continue;
+ }
 const offset = offsets[workerId] ?? 0;
 if (currentSize > offset) {
- const buf = await readFile(filePath);
- const newContent = buf.slice(offset).toString('utf8');
- offsets[workerId] = currentSize;
- const lines = newContent.split('\n');
- for (const line of lines) {
- if (line.trim()) {
- process.stdout.write(`${color}[${workerId}]${RESET} ${line}\n`);
+ lastGrowth[workerId] = Date.now();
+ const buf = await readFile(filePath).catch(() => null);
+ if (buf) {
+ const newContent = buf.slice(offset).toString('utf8');
+ offsets[workerId] = currentSize;
+ for (const line of newContent.split('\n')) {
+ if (line.trim())
+ process.stdout.write(`${color}[${workerId}]${RESET} ${line}\n`);
 }
 }
 }
- // Check if worker just finished (result file appeared)
 const resultPath = join(WORKERS_DIR, `${workerId}-result.md`);
 const doneKey = `${workerId}-done`;
 if (existsSync(resultPath) && !seen.has(doneKey)) {
@@ -55,14 +77,35 @@ export async function watch(_args) {
 }
 // Exit when all known workers have results
 if (logFiles.length > 0) {
- const allDone = logFiles.every(f => {
- const id = f.replace('.log', '');
- return existsSync(join(WORKERS_DIR, `${id}-result.md`));
- });
- if (allDone) {
+ const workersDone = logFiles.map(f => f.replace('.log', '')).filter(id => existsSync(join(WORKERS_DIR, `${id}-result.md`)));
+ if (workersDone.length === logFiles.length) {
 console.log(`\n${DIM}All workers done. Run: loom collect${RESET}`);
 break;
 }
+ // Stale detection: workers with no result, dead PID, and log silent for >15min
+ const now = Date.now();
+ const staleWorkers = [];
+ for (const logFile of logFiles) {
+ const id = logFile.replace('.log', '');
+ if (existsSync(join(WORKERS_DIR, `${id}-result.md`)))
+ continue;
+ const pidPath = join(WORKERS_DIR, `${id}.pid`);
+ let pidAlive = false;
+ if (existsSync(pidPath)) {
+ const pid = parseInt(await readFile(pidPath, 'utf8').catch(() => ''), 10);
+ if (!isNaN(pid))
+ pidAlive = isProcessAlive(pid);
+ }
+ const sinceGrowth = now - (lastGrowth[id] ?? now);
+ if (!pidAlive && sinceGrowth > STALE_TIMEOUT_MS) {
+ staleWorkers.push(id);
+ }
+ }
+ if (staleWorkers.length > 0 && staleWorkers.length + workersDone.length === logFiles.length) {
+ console.log(`\n${YELLOW}Workers stalled (dead PID, no output for 15min): ${staleWorkers.join(', ')}${RESET}`);
+ console.log(`${DIM}Run: loom logs <workerId> to inspect. loom collect to gather what's available.${RESET}`);
+ break;
+ }
 }
 await new Promise(resolve => setTimeout(resolve, POLL_MS));
 }
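Editor's note: the watch loop above tails logs by remembering a per-file byte offset and printing only the suffix that appeared since the last poll. A minimal standalone sketch of that technique, with the same deleted-mid-poll guard (names are illustrative, not the package's API):

```javascript
// Sketch of offset-based log tailing as used by `loom watch`:
// track bytes already emitted per file; each call returns only
// the newly appended suffix.
import { stat, readFile } from 'fs/promises';

const offsets = new Map();

async function tailNew(filePath) {
  let size;
  try {
    size = (await stat(filePath)).size; // file may vanish mid-poll
  } catch {
    return '';
  }
  const offset = offsets.get(filePath) ?? 0;
  if (size <= offset) return ''; // nothing new since last call
  const buf = await readFile(filePath);
  offsets.set(filePath, size);
  return buf.slice(offset).toString('utf8');
}
```

Re-reading the whole file and slicing is simple and fine for small worker logs; a long-running tail over large files would want a ranged read instead.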
@@ -0,0 +1,10 @@
+ export declare const LOOMRC = ".loomrc";
+ export type LoomConfig = {
+ workers?: number;
+ agentType?: string;
+ claimTtlMinutes?: number;
+ staleMinutes?: number;
+ dangerouslySkipPermissions?: boolean;
+ };
+ export declare function loadConfig(): Promise<Required<LoomConfig>>;
+ export declare function initConfig(): Promise<void>;
package/dist/config.js ADDED
@@ -0,0 +1,44 @@
+ import { readFile, writeFile } from 'fs/promises';
+ import { existsSync } from 'fs';
+ export const LOOMRC = '.loomrc';
+ const DEFAULTS = {
+ workers: 2,
+ agentType: 'general-purpose',
+ claimTtlMinutes: 30,
+ staleMinutes: 10,
+ dangerouslySkipPermissions: false,
+ };
+ export async function loadConfig() {
+ if (!existsSync(LOOMRC))
+ return { ...DEFAULTS };
+ try {
+ const raw = await readFile(LOOMRC, 'utf8');
+ const parsed = JSON.parse(raw);
+ if (typeof parsed !== 'object' || parsed === null)
+ return { ...DEFAULTS };
+ return { ...DEFAULTS, ...parsed };
+ }
+ catch {
+ console.error(`[agentloom] Warning: could not parse ${LOOMRC} — using defaults`);
+ return { ...DEFAULTS };
+ }
+ }
+ export async function initConfig() {
+ if (existsSync(LOOMRC)) {
+ console.log(`${LOOMRC} already exists.`);
+ return;
+ }
+ const config = {
+ workers: 2,
+ agentType: 'general-purpose',
+ claimTtlMinutes: 30,
+ staleMinutes: 10,
+ };
+ await writeFile(LOOMRC, JSON.stringify(config, null, 2) + '\n');
+ console.log(`Created ${LOOMRC}`);
+ console.log(`\nOptions:`);
+ console.log(` workers Default number of workers (default: 2)`);
+ console.log(` agentType Default agent type (default: general-purpose)`);
+ console.log(` claimTtlMinutes Minutes before crashed worker's task is re-queued (default: 30)`);
+ console.log(` staleMinutes Minutes before dead-pid worker is flagged STALE (default: 10)`);
+ }
@@ -12,14 +12,6 @@ export type Task = {
 claimedAt?: string;
 completedAt?: string;
 };
- export type Worker = {
- id: string;
- agentType: string;
- status: 'idle' | 'working' | 'done';
- currentTaskId?: string;
- startedAt: string;
- completedAt?: string;
- };
 export type Session = {
 id: string;
 description: string;
@@ -34,6 +26,4 @@ export declare function ensureStateDir(): Promise<void>;
 export declare function writeSession(session: Session): Promise<void>;
 export declare function readSession(): Promise<Session | null>;
 export declare function writeTask(task: Task): Promise<void>;
- export declare function writeWorker(worker: Worker): Promise<void>;
- export declare function readWorkers(): Promise<Worker[]>;
 export declare function readTasks(): Promise<Task[]>;
@@ -1,6 +1,5 @@
- import { readFile, writeFile, mkdir } from 'fs/promises';
+ import { readFile, writeFile, mkdir, readdir } from 'fs/promises';
 import { join } from 'path';
- import { existsSync } from 'fs';
 export const STATE_DIR = '.claude-team';
 export async function ensureStateDir() {
 await mkdir(join(STATE_DIR, 'tasks'), { recursive: true });
@@ -11,30 +10,23 @@ export async function writeSession(session) {
 await writeFile(join(STATE_DIR, 'session.json'), JSON.stringify(session, null, 2));
 }
 export async function readSession() {
- const path = join(STATE_DIR, 'session.json');
- if (!existsSync(path))
+ try {
+ return JSON.parse(await readFile(join(STATE_DIR, 'session.json'), 'utf8'));
+ }
+ catch {
 return null;
- return JSON.parse(await readFile(path, 'utf8'));
+ }
 }
 export async function writeTask(task) {
 await writeFile(join(STATE_DIR, 'tasks', `${task.id}-${task.status}.json`), JSON.stringify(task, null, 2));
 }
- export async function writeWorker(worker) {
- await writeFile(join(STATE_DIR, 'workers', `${worker.id}.json`), JSON.stringify(worker, null, 2));
- }
- export async function readWorkers() {
- const { readdir } = await import('fs/promises');
- const dir = join(STATE_DIR, 'workers');
- if (!existsSync(dir))
- return [];
- const files = await readdir(dir);
- return Promise.all(files.filter(f => f.endsWith('.json')).map(async (f) => JSON.parse(await readFile(join(dir, f), 'utf8'))));
- }
 export async function readTasks() {
- const { readdir } = await import('fs/promises');
 const dir = join(STATE_DIR, 'tasks');
- if (!existsSync(dir))
+ try {
+ const files = await readdir(dir);
+ return Promise.all(files.filter(f => f.endsWith('.json')).map(async (f) => JSON.parse(await readFile(join(dir, f), 'utf8'))));
+ }
+ catch {
 return [];
- const files = await readdir(dir);
- return Promise.all(files.filter(f => f.endsWith('.json')).map(async (f) => JSON.parse(await readFile(join(dir, f), 'utf8'))));
+ }
 }
@@ -3,10 +3,10 @@ export type WorkerSpec = {
 count: number;
 agentType: string;
 };
- export declare function parseWorkerSpec(args: string[]): {
+ export declare function parseWorkerSpec(args: string[], defaultWorkers?: number, defaultAgentType?: string): {
 specs: WorkerSpec[];
 task: string;
 };
 export declare function initSession(description: string, workerCount: number): Promise<Session>;
- export declare function writeContextSnapshot(slug: string, task: string): Promise<string>;
+ export declare function writeContextSnapshot(slug: string, sessionId: string, task: string): Promise<string>;
 export declare function decomposeTasks(task: string, specs: WorkerSpec[], dryRun?: boolean): Promise<Task[]>;
@@ -3,29 +3,25 @@ import { join } from 'path';
  import { randomUUID } from 'crypto';
  import { spawnSync } from 'child_process';
  import { STATE_DIR, ensureStateDir, writeSession, writeTask } from '../state/session.js';
- export function parseWorkerSpec(args) {
- // Formats:
- // omc team "task description"
- // omc team 3 "task description"
- // omc team 2:explore "task description"
- // omc team 2:explore+1:code-reviewer "task description"
+ export function parseWorkerSpec(args, defaultWorkers = 2, defaultAgentType = 'general-purpose') {
  const task = args[args.length - 1] ?? '';
  const specArg = args.length > 1 ? args[0] ?? '' : '';
  if (!specArg) {
- return { specs: [{ count: 2, agentType: 'general-purpose' }], task };
+ return { specs: [{ count: defaultWorkers, agentType: defaultAgentType }], task };
  }
  // Plain number: "3"
  if (/^\d+$/.test(specArg)) {
- return { specs: [{ count: parseInt(specArg), agentType: 'general-purpose' }], task };
+ return { specs: [{ count: parseInt(specArg), agentType: defaultAgentType }], task };
  }
  // Typed specs: "2:explore+1:code-reviewer"
  const parts = specArg.split('+');
  const specs = parts.map(part => {
  const [countStr, agentType] = part.split(':');
- return {
- count: parseInt(countStr ?? '1'),
- agentType: agentType ?? 'general-purpose',
- };
+ const count = parseInt(countStr ?? '1');
+ if (isNaN(count) || count < 1) {
+ throw new Error(`Invalid worker spec "${part}" — count must be a positive integer`);
+ }
+ return { count, agentType: agentType ?? defaultAgentType };
  });
  return { specs, task };
  }
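The `count:agentType` grammar and the new validation from the hunk above can be exercised with a small standalone re-implementation (`parseSpec` is a hypothetical name used here for illustration, not the package's export):

```javascript
// Standalone sketch of the worker-spec grammar shown in the hunk above.
// parseSpec is a hypothetical name, not the package's export.
function parseSpec(specArg, defaultAgentType = 'general-purpose') {
  return specArg.split('+').map(part => {
    const [countStr, agentType] = part.split(':');
    const count = parseInt(countStr ?? '1', 10);
    if (Number.isNaN(count) || count < 1) {
      throw new Error(`Invalid worker spec "${part}"`);
    }
    return { count, agentType: agentType ?? defaultAgentType };
  });
}

console.log(parseSpec('2:explore+1:code-reviewer'));
// [ { count: 2, agentType: 'explore' }, { count: 1, agentType: 'code-reviewer' } ]
console.log(parseSpec('3')); // [ { count: 3, agentType: 'general-purpose' } ]
```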
@@ -41,9 +37,10 @@ export async function initSession(description, workerCount) {
  await writeSession(session);
  return session;
  }
- export async function writeContextSnapshot(slug, task) {
- const path = join(STATE_DIR, 'context', `${slug}.md`);
- const content = `# Task Context\n\n**Task:** ${task}\n\n**Started:** ${new Date().toISOString()}\n\n## Notes\n\n_Workers will append findings here._\n`;
+ export async function writeContextSnapshot(slug, sessionId, task) {
+ // Include session ID so concurrent runs with similar task names don't clobber each other
+ const path = join(STATE_DIR, 'context', `${slug}-${sessionId.slice(0, 6)}.md`);
+ const content = `# Task Context\n\n**Task:** ${task}\n\n**Session:** ${sessionId}\n\n**Started:** ${new Date().toISOString()}\n\n## Notes\n\n_Workers will append findings here._\n`;
  await writeFile(path, content);
  return path;
  }
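The collision-avoiding filename scheme from the hunk above can be sketched in isolation (`snapshotName` is a hypothetical helper name, not the package's export): appending the first six characters of the session UUID keeps two concurrent sessions with the same slug in separate snapshot files.

```javascript
// Sketch of the snapshot filename scheme the hunk above introduces.
// snapshotName is a hypothetical name, not the package's export.
import { randomUUID } from 'crypto';

function snapshotName(slug, sessionId) {
  return `${slug}-${sessionId.slice(0, 6)}.md`;
}

console.log(snapshotName('audit-endpoints', 'f47ac10b-58cc-4372-a567-0e02b2c3d479'));
// audit-endpoints-f47ac1.md

// Two concurrent sessions with the same slug get distinct files
// (up to the astronomically unlikely six-character prefix collision):
console.log(snapshotName('audit-endpoints', randomUUID()) !==
            snapshotName('audit-endpoints', randomUUID()));
```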
@@ -80,9 +77,9 @@ Task: "${task}"`;
  encoding: 'utf8',
  timeout: 30_000,
  });
- if (result.status !== 0 || !result.stdout)
+ if (result.status !== 0 || !result.stdout) {
  throw new Error(result.stderr ?? 'no output');
- // Extract JSON array from the response (strip any surrounding prose)
+ }
  const match = result.stdout.match(/\[[\s\S]*\]/);
  if (!match)
  throw new Error('No JSON array in response');
@@ -90,13 +87,12 @@ Task: "${task}"`;
  if (!Array.isArray(parsed) || parsed.length === 0)
  throw new Error('Empty array');
  const subtasks = parsed.map(String);
- // Pad or trim to exactly n
  while (subtasks.length < n)
  subtasks.push(task);
  return subtasks.slice(0, n);
  }
- catch {
- // Fallback: every worker gets the same task description
+ catch (err) {
+ process.stderr.write(`[agentloom] Task decomposition failed (${err instanceof Error ? err.message : err}); all workers will receive the same task description\n`);
  return Array(n).fill(task);
  }
  }
@@ -1,11 +1,13 @@
  import { readdir, readFile, rename, writeFile, stat, unlink } from 'fs/promises';
  import { join } from 'path';
  import { STATE_DIR } from '../state/session.js';
+ import { loadConfig } from '../config.js';
  const TASKS_DIR = join(STATE_DIR, 'tasks');
- const CLAIM_TTL_MS = 30 * 60 * 1000; // 30 minutes
  // Recover tasks whose worker crashed before completing.
  // Finds -claimed- files older than CLAIM_TTL_MS and re-queues them as -pending.
  export async function recoverStaleClaims() {
+ const config = await loadConfig();
+ const claimTtlMs = config.claimTtlMinutes * 60 * 1000;
  let recovered = 0;
  let files;
  try {
@@ -20,7 +22,7 @@ export async function recoverStaleClaims() {
  const filePath = join(TASKS_DIR, file);
  try {
  const { mtimeMs } = await stat(filePath);
- if (now - mtimeMs < claimTtlMs)
+ if (now - mtimeMs < claimTtlMs)
  continue;
  const taskId = file.split('-claimed-')[0];
  if (!taskId)
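The staleness check in the hunk above (now driven by the configurable `claimTtlMinutes` instead of the hard-coded constant) reduces to a pure predicate, sketched here for illustration (`isStaleClaim` is a hypothetical name, not the package's export):

```javascript
// The mtime-vs-TTL test inside recoverStaleClaims, isolated as a pure
// predicate. isStaleClaim is a hypothetical name, not the package's export.
// A claim file is considered abandoned once its mtime is at least
// claimTtlMs in the past, making the task eligible for re-queueing.
function isStaleClaim(mtimeMs, nowMs, claimTtlMs) {
  return nowMs - mtimeMs >= claimTtlMs;
}

const TTL_MS = 30 * 60 * 1000; // the previously hard-coded 30-minute default
console.log(isStaleClaim(0, TTL_MS - 1, TTL_MS)); // false: still within the TTL
console.log(isStaleClaim(0, TTL_MS, TTL_MS));     // true: eligible for recovery
```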
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@chuckssmith/agentloom",
- "version": "0.8.0",
+ "version": "1.0.0",
  "description": "A workflow layer for Claude Code — reusable roles, persistence loops, and multi-agent crew coordination",
  "keywords": [
  "ai",