gsd-opencode 1.22.1 → 1.30.0
This diff shows the changes between publicly available package versions as they appear in their respective public registries, and is provided for informational purposes only.
- package/agents/gsd-advisor-researcher.md +112 -0
- package/agents/gsd-assumptions-analyzer.md +110 -0
- package/agents/gsd-codebase-mapper.md +0 -2
- package/agents/gsd-debugger.md +118 -2
- package/agents/gsd-executor.md +24 -4
- package/agents/gsd-integration-checker.md +0 -2
- package/agents/gsd-nyquist-auditor.md +0 -2
- package/agents/gsd-phase-researcher.md +150 -5
- package/agents/gsd-plan-checker.md +70 -5
- package/agents/gsd-planner.md +49 -4
- package/agents/gsd-project-researcher.md +28 -3
- package/agents/gsd-research-synthesizer.md +0 -2
- package/agents/gsd-roadmapper.md +29 -2
- package/agents/gsd-ui-auditor.md +445 -0
- package/agents/gsd-ui-checker.md +305 -0
- package/agents/gsd-ui-researcher.md +368 -0
- package/agents/gsd-user-profiler.md +173 -0
- package/agents/gsd-verifier.md +123 -4
- package/commands/gsd/gsd-add-backlog.md +76 -0
- package/commands/gsd/gsd-audit-uat.md +24 -0
- package/commands/gsd/gsd-autonomous.md +41 -0
- package/commands/gsd/gsd-debug.md +5 -0
- package/commands/gsd/gsd-discuss-phase.md +10 -36
- package/commands/gsd/gsd-do.md +30 -0
- package/commands/gsd/gsd-execute-phase.md +20 -2
- package/commands/gsd/gsd-fast.md +30 -0
- package/commands/gsd/gsd-forensics.md +56 -0
- package/commands/gsd/gsd-list-workspaces.md +19 -0
- package/commands/gsd/gsd-manager.md +39 -0
- package/commands/gsd/gsd-milestone-summary.md +51 -0
- package/commands/gsd/gsd-new-workspace.md +44 -0
- package/commands/gsd/gsd-next.md +24 -0
- package/commands/gsd/gsd-note.md +34 -0
- package/commands/gsd/gsd-plan-phase.md +3 -1
- package/commands/gsd/gsd-plant-seed.md +28 -0
- package/commands/gsd/gsd-pr-branch.md +25 -0
- package/commands/gsd/gsd-profile-user.md +46 -0
- package/commands/gsd/gsd-quick.md +4 -2
- package/commands/gsd/gsd-reapply-patches.md +9 -8
- package/commands/gsd/gsd-remove-workspace.md +26 -0
- package/commands/gsd/gsd-research-phase.md +5 -0
- package/commands/gsd/gsd-review-backlog.md +61 -0
- package/commands/gsd/gsd-review.md +37 -0
- package/commands/gsd/gsd-session-report.md +19 -0
- package/commands/gsd/gsd-set-profile.md +24 -23
- package/commands/gsd/gsd-ship.md +23 -0
- package/commands/gsd/gsd-stats.md +18 -0
- package/commands/gsd/gsd-thread.md +127 -0
- package/commands/gsd/gsd-ui-phase.md +34 -0
- package/commands/gsd/gsd-ui-review.md +32 -0
- package/commands/gsd/gsd-workstreams.md +66 -0
- package/get-shit-done/bin/gsd-tools.cjs +410 -84
- package/get-shit-done/bin/lib/commands.cjs +429 -18
- package/get-shit-done/bin/lib/config.cjs +318 -45
- package/get-shit-done/bin/lib/core.cjs +822 -84
- package/get-shit-done/bin/lib/frontmatter.cjs +78 -41
- package/get-shit-done/bin/lib/init.cjs +836 -104
- package/get-shit-done/bin/lib/milestone.cjs +44 -33
- package/get-shit-done/bin/lib/model-profiles.cjs +68 -0
- package/get-shit-done/bin/lib/phase.cjs +293 -306
- package/get-shit-done/bin/lib/profile-output.cjs +952 -0
- package/get-shit-done/bin/lib/profile-pipeline.cjs +539 -0
- package/get-shit-done/bin/lib/roadmap.cjs +55 -24
- package/get-shit-done/bin/lib/security.cjs +382 -0
- package/get-shit-done/bin/lib/state.cjs +363 -53
- package/get-shit-done/bin/lib/template.cjs +2 -2
- package/get-shit-done/bin/lib/uat.cjs +282 -0
- package/get-shit-done/bin/lib/verify.cjs +104 -36
- package/get-shit-done/bin/lib/workstream.cjs +491 -0
- package/get-shit-done/references/checkpoints.md +12 -10
- package/get-shit-done/references/decimal-phase-calculation.md +2 -3
- package/get-shit-done/references/git-integration.md +47 -0
- package/get-shit-done/references/model-profile-resolution.md +2 -0
- package/get-shit-done/references/model-profiles.md +62 -16
- package/get-shit-done/references/phase-argument-parsing.md +2 -2
- package/get-shit-done/references/planning-config.md +3 -1
- package/get-shit-done/references/user-profiling.md +681 -0
- package/get-shit-done/references/workstream-flag.md +58 -0
- package/get-shit-done/templates/UAT.md +21 -3
- package/get-shit-done/templates/UI-SPEC.md +100 -0
- package/get-shit-done/templates/claude-md.md +122 -0
- package/get-shit-done/templates/config.json +10 -3
- package/get-shit-done/templates/context.md +61 -6
- package/get-shit-done/templates/dev-preferences.md +21 -0
- package/get-shit-done/templates/discussion-log.md +63 -0
- package/get-shit-done/templates/phase-prompt.md +46 -5
- package/get-shit-done/templates/project.md +2 -0
- package/get-shit-done/templates/state.md +2 -2
- package/get-shit-done/templates/user-profile.md +146 -0
- package/get-shit-done/workflows/add-phase.md +2 -2
- package/get-shit-done/workflows/add-tests.md +4 -4
- package/get-shit-done/workflows/add-todo.md +3 -3
- package/get-shit-done/workflows/audit-milestone.md +13 -5
- package/get-shit-done/workflows/audit-uat.md +109 -0
- package/get-shit-done/workflows/autonomous.md +891 -0
- package/get-shit-done/workflows/check-todos.md +2 -2
- package/get-shit-done/workflows/cleanup.md +4 -4
- package/get-shit-done/workflows/complete-milestone.md +9 -6
- package/get-shit-done/workflows/diagnose-issues.md +15 -3
- package/get-shit-done/workflows/discovery-phase.md +3 -3
- package/get-shit-done/workflows/discuss-phase-assumptions.md +653 -0
- package/get-shit-done/workflows/discuss-phase.md +411 -38
- package/get-shit-done/workflows/do.md +104 -0
- package/get-shit-done/workflows/execute-phase.md +405 -18
- package/get-shit-done/workflows/execute-plan.md +77 -12
- package/get-shit-done/workflows/fast.md +105 -0
- package/get-shit-done/workflows/forensics.md +265 -0
- package/get-shit-done/workflows/health.md +28 -6
- package/get-shit-done/workflows/help.md +124 -7
- package/get-shit-done/workflows/insert-phase.md +2 -2
- package/get-shit-done/workflows/list-phase-assumptions.md +2 -2
- package/get-shit-done/workflows/list-workspaces.md +56 -0
- package/get-shit-done/workflows/manager.md +362 -0
- package/get-shit-done/workflows/map-codebase.md +74 -13
- package/get-shit-done/workflows/milestone-summary.md +223 -0
- package/get-shit-done/workflows/new-milestone.md +120 -18
- package/get-shit-done/workflows/new-project.md +178 -39
- package/get-shit-done/workflows/new-workspace.md +237 -0
- package/get-shit-done/workflows/next.md +97 -0
- package/get-shit-done/workflows/node-repair.md +92 -0
- package/get-shit-done/workflows/note.md +156 -0
- package/get-shit-done/workflows/pause-work.md +62 -8
- package/get-shit-done/workflows/plan-milestone-gaps.md +4 -5
- package/get-shit-done/workflows/plan-phase.md +332 -33
- package/get-shit-done/workflows/plant-seed.md +169 -0
- package/get-shit-done/workflows/pr-branch.md +129 -0
- package/get-shit-done/workflows/profile-user.md +450 -0
- package/get-shit-done/workflows/progress.md +145 -20
- package/get-shit-done/workflows/quick.md +205 -49
- package/get-shit-done/workflows/remove-phase.md +2 -2
- package/get-shit-done/workflows/remove-workspace.md +90 -0
- package/get-shit-done/workflows/research-phase.md +11 -3
- package/get-shit-done/workflows/resume-project.md +35 -16
- package/get-shit-done/workflows/review.md +228 -0
- package/get-shit-done/workflows/session-report.md +146 -0
- package/get-shit-done/workflows/set-profile.md +2 -2
- package/get-shit-done/workflows/settings.md +79 -10
- package/get-shit-done/workflows/ship.md +228 -0
- package/get-shit-done/workflows/stats.md +60 -0
- package/get-shit-done/workflows/transition.md +147 -20
- package/get-shit-done/workflows/ui-phase.md +302 -0
- package/get-shit-done/workflows/ui-review.md +165 -0
- package/get-shit-done/workflows/update.md +108 -25
- package/get-shit-done/workflows/validate-phase.md +15 -8
- package/get-shit-done/workflows/verify-phase.md +16 -5
- package/get-shit-done/workflows/verify-work.md +72 -18
- package/package.json +1 -1
- package/skills/gsd-audit-milestone/SKILL.md +29 -0
- package/skills/gsd-cleanup/SKILL.md +19 -0
- package/skills/gsd-complete-milestone/SKILL.md +131 -0
- package/skills/gsd-discuss-phase/SKILL.md +54 -0
- package/skills/gsd-execute-phase/SKILL.md +49 -0
- package/skills/gsd-plan-phase/SKILL.md +37 -0
- package/skills/gsd-ui-phase/SKILL.md +24 -0
- package/skills/gsd-ui-review/SKILL.md +24 -0
- package/skills/gsd-verify-work/SKILL.md +30 -0
@@ -0,0 +1,491 @@
+/**
+ * Workstream — CRUD operations for workstream namespacing
+ *
+ * Workstreams enable parallel milestones by scoping ROADMAP.md, STATE.md,
+ * REQUIREMENTS.md, and phases/ into .planning/workstreams/{name}/ directories.
+ *
+ * When no workstreams/ directory exists, GSD operates in "flat mode" with
+ * everything at .planning/ — backward compatible with pre-workstream installs.
+ */
+
+const fs = require('fs');
+const path = require('path');
+const { output, error, planningPaths, planningRoot, toPosixPath, getMilestoneInfo, generateSlugInternal, setActiveWorkstream, getActiveWorkstream, filterPlanFiles, filterSummaryFiles, readSubdirectories } = require('./core.cjs');
+const { stateExtractField } = require('./state.cjs');
+
+// ─── Migration ──────────────────────────────────────────────────────────────
+
+/**
+ * Migrate flat .planning/ layout to workstream mode.
+ * Moves per-workstream files (ROADMAP.md, STATE.md, REQUIREMENTS.md, phases/)
+ * into .planning/workstreams/{name}/. Shared files (PROJECT.md, config.json,
+ * milestones/, research/, codebase/, todos/) stay in place.
+ */
+function migrateToWorkstreams(cwd, workstreamName) {
+  if (!workstreamName || /[/\\]/.test(workstreamName) || workstreamName === '.' || workstreamName === '..') {
+    throw new Error('Invalid workstream name for migration');
+  }
+
+  const baseDir = planningRoot(cwd);
+  const wsDir = path.join(baseDir, 'workstreams', workstreamName);
+
+  if (fs.existsSync(path.join(baseDir, 'workstreams'))) {
+    throw new Error('Already in workstream mode — .planning/workstreams/ exists');
+  }
+
+  const toMove = [
+    { name: 'ROADMAP.md', type: 'file' },
+    { name: 'STATE.md', type: 'file' },
+    { name: 'REQUIREMENTS.md', type: 'file' },
+    { name: 'phases', type: 'dir' },
+  ];
+
+  fs.mkdirSync(wsDir, { recursive: true });
+
+  const filesMoved = [];
+  try {
+    for (const item of toMove) {
+      const src = path.join(baseDir, item.name);
+      if (fs.existsSync(src)) {
+        const dest = path.join(wsDir, item.name);
+        fs.renameSync(src, dest);
+        filesMoved.push(item.name);
+      }
+    }
+  } catch (err) {
+    for (const name of filesMoved) {
+      try { fs.renameSync(path.join(wsDir, name), path.join(baseDir, name)); } catch {}
+    }
+    try { fs.rmSync(wsDir, { recursive: true }); } catch {}
+    try { fs.rmdirSync(path.join(baseDir, 'workstreams')); } catch {}
+    throw err;
+  }
+
+  return { migrated: true, workstream: workstreamName, files_moved: filesMoved };
+}
+
+// ─── CRUD Commands ──────────────────────────────────────────────────────────
+
+function cmdWorkstreamCreate(cwd, name, options, raw) {
+  if (!name) {
+    error('workstream name required. Usage: workstream create <name>');
+  }
+
+  const slug = name.toLowerCase().replace(/[^a-z0-9]+/g, '-').replace(/^-+|-+$/g, '');
+  if (!slug) {
+    error('Invalid workstream name — must contain at least one alphanumeric character');
+  }
+
+  const baseDir = planningRoot(cwd);
+  if (!fs.existsSync(baseDir)) {
+    error('.planning/ directory not found — run /gsd-new-project first');
+  }
+
+  const wsRoot = path.join(baseDir, 'workstreams');
+  const wsDir = path.join(wsRoot, slug);
+
+  if (fs.existsSync(wsDir) && fs.existsSync(path.join(wsDir, 'STATE.md'))) {
+    output({ created: false, error: 'already_exists', workstream: slug, path: toPosixPath(path.relative(cwd, wsDir)) }, raw);
+    return;
+  }
+
+  const isFlatMode = !fs.existsSync(wsRoot);
+  let migration = null;
+  if (isFlatMode && options.migrate !== false) {
+    const hasExistingWork = fs.existsSync(path.join(baseDir, 'ROADMAP.md')) ||
+      fs.existsSync(path.join(baseDir, 'STATE.md')) ||
+      fs.existsSync(path.join(baseDir, 'phases'));
+
+    if (hasExistingWork) {
+      const migrateName = options.migrateName || null;
+      let existingWsName;
+      if (migrateName) {
+        existingWsName = migrateName;
+      } else {
+        try {
+          const milestone = getMilestoneInfo(cwd);
+          existingWsName = generateSlugInternal(milestone.name) || 'default';
+        } catch {
+          existingWsName = 'default';
+        }
+      }
+
+      try {
+        migration = migrateToWorkstreams(cwd, existingWsName);
+      } catch (e) {
+        output({ created: false, error: 'migration_failed', message: e.message }, raw);
+        return;
+      }
+    } else {
+      fs.mkdirSync(wsRoot, { recursive: true });
+    }
+  }
+
+  fs.mkdirSync(wsDir, { recursive: true });
+  fs.mkdirSync(path.join(wsDir, 'phases'), { recursive: true });
+
+  const today = new Date().toISOString().split('T')[0];
+  const stateContent = [
+    '---',
+    `workstream: ${slug}`,
+    `created: ${today}`,
+    '---',
+    '',
+    '# Project State',
+    '',
+    '## Current Position',
+    '**Status:** Not started',
+    '**Current Phase:** None',
+    `**Last Activity:** ${today}`,
+    '**Last Activity Description:** Workstream created',
+    '',
+    '## Progress',
+    '**Phases Complete:** 0',
+    '**Current Plan:** N/A',
+    '',
+    '## Session Continuity',
+    '**Stopped At:** N/A',
+    '**Resume File:** None',
+    '',
+  ].join('\n');
+
+  const statePath = path.join(wsDir, 'STATE.md');
+  if (!fs.existsSync(statePath)) {
+    fs.writeFileSync(statePath, stateContent, 'utf-8');
+  }
+
+  setActiveWorkstream(cwd, slug);
+
+  const relPath = toPosixPath(path.relative(cwd, wsDir));
+  output({
+    created: true,
+    workstream: slug,
+    path: relPath,
+    state_path: relPath + '/STATE.md',
+    phases_path: relPath + '/phases',
+    migration: migration || null,
+    active: true,
+  }, raw);
+}
+
+function cmdWorkstreamList(cwd, raw) {
+  const wsRoot = path.join(planningRoot(cwd), 'workstreams');
+
+  if (!fs.existsSync(wsRoot)) {
+    output({ mode: 'flat', workstreams: [], message: 'No workstreams — operating in flat mode' }, raw);
+    return;
+  }
+
+  const entries = fs.readdirSync(wsRoot, { withFileTypes: true });
+  const workstreams = [];
+
+  for (const entry of entries) {
+    if (!entry.isDirectory()) continue;
+
+    const wsDir = path.join(wsRoot, entry.name);
+    const phasesDir = path.join(wsDir, 'phases');
+
+    const phaseDirs = readSubdirectories(phasesDir);
+    const phaseCount = phaseDirs.length;
+    let completedCount = 0;
+    for (const d of phaseDirs) {
+      try {
+        const phaseFiles = fs.readdirSync(path.join(phasesDir, d));
+        const plans = filterPlanFiles(phaseFiles);
+        const summaries = filterSummaryFiles(phaseFiles);
+        if (plans.length > 0 && summaries.length >= plans.length) completedCount++;
+      } catch {}
+    }
+
+    let status = 'unknown', currentPhase = null;
+    try {
+      const stateContent = fs.readFileSync(path.join(wsDir, 'STATE.md'), 'utf-8');
+      status = stateExtractField(stateContent, 'Status') || 'unknown';
+      currentPhase = stateExtractField(stateContent, 'Current Phase');
+    } catch {}
+
+    workstreams.push({
+      name: entry.name,
+      path: toPosixPath(path.relative(cwd, wsDir)),
+      has_roadmap: fs.existsSync(path.join(wsDir, 'ROADMAP.md')),
+      has_state: fs.existsSync(path.join(wsDir, 'STATE.md')),
+      status,
+      current_phase: currentPhase,
+      phase_count: phaseCount,
+      completed_phases: completedCount,
+    });
+  }
+
+  output({ mode: 'workstream', workstreams, count: workstreams.length }, raw);
+}
+
+function cmdWorkstreamStatus(cwd, name, raw) {
+  if (!name) error('workstream name required. Usage: workstream status <name>');
+  if (/[/\\]/.test(name) || name === '.' || name === '..') error('Invalid workstream name');
+
+  const wsDir = path.join(planningRoot(cwd), 'workstreams', name);
+  if (!fs.existsSync(wsDir)) {
+    output({ found: false, workstream: name }, raw);
+    return;
+  }
+
+  const p = planningPaths(cwd, name);
+  const relPath = toPosixPath(path.relative(cwd, wsDir));
+
+  const files = {
+    roadmap: fs.existsSync(p.roadmap),
+    state: fs.existsSync(p.state),
+    requirements: fs.existsSync(p.requirements),
+  };
+
+  const phases = [];
+  for (const dir of readSubdirectories(p.phases).sort()) {
+    try {
+      const phaseFiles = fs.readdirSync(path.join(p.phases, dir));
+      const plans = filterPlanFiles(phaseFiles);
+      const summaries = filterSummaryFiles(phaseFiles);
+      phases.push({
+        directory: dir,
+        status: summaries.length >= plans.length && plans.length > 0 ? 'complete' :
+          plans.length > 0 ? 'in_progress' : 'pending',
+        plan_count: plans.length,
+        summary_count: summaries.length,
+      });
+    } catch {}
+  }
+
+  let stateInfo = {};
+  try {
+    const stateContent = fs.readFileSync(p.state, 'utf-8');
+    stateInfo = {
+      status: stateExtractField(stateContent, 'Status') || 'unknown',
+      current_phase: stateExtractField(stateContent, 'Current Phase'),
+      last_activity: stateExtractField(stateContent, 'Last Activity'),
+    };
+  } catch {}
+
+  output({
+    found: true,
+    workstream: name,
+    path: relPath,
+    files,
+    phases,
+    phase_count: phases.length,
+    completed_phases: phases.filter(ph => ph.status === 'complete').length,
+    ...stateInfo,
+  }, raw);
+}
+
+function cmdWorkstreamComplete(cwd, name, options, raw) {
+  if (!name) error('workstream name required. Usage: workstream complete <name>');
+  if (/[/\\]/.test(name) || name === '.' || name === '..') error('Invalid workstream name');
+
+  const root = planningRoot(cwd);
+  const wsRoot = path.join(root, 'workstreams');
+  const wsDir = path.join(wsRoot, name);
+
+  if (!fs.existsSync(wsDir)) {
+    output({ completed: false, error: 'not_found', workstream: name }, raw);
+    return;
+  }
+
+  const active = getActiveWorkstream(cwd);
+  if (active === name) setActiveWorkstream(cwd, null);
+
+  const archiveDir = path.join(root, 'milestones');
+  const today = new Date().toISOString().split('T')[0];
+  let archivePath = path.join(archiveDir, `ws-${name}-${today}`);
+  let suffix = 1;
+  while (fs.existsSync(archivePath)) {
+    archivePath = path.join(archiveDir, `ws-${name}-${today}-${suffix++}`);
+  }
+
+  fs.mkdirSync(archivePath, { recursive: true });
+
+  const filesMoved = [];
+  try {
+    const entries = fs.readdirSync(wsDir, { withFileTypes: true });
+    for (const entry of entries) {
+      fs.renameSync(path.join(wsDir, entry.name), path.join(archivePath, entry.name));
+      filesMoved.push(entry.name);
+    }
+  } catch (err) {
+    for (const fname of filesMoved) {
+      try { fs.renameSync(path.join(archivePath, fname), path.join(wsDir, fname)); } catch {}
+    }
+    try { fs.rmSync(archivePath, { recursive: true }); } catch {}
+    if (active === name) setActiveWorkstream(cwd, name);
+    output({ completed: false, error: 'archive_failed', message: err.message, workstream: name }, raw);
+    return;
+  }
+
+  try { fs.rmdirSync(wsDir); } catch {}
+
+  let remainingWs = 0;
+  try {
+    remainingWs = fs.readdirSync(wsRoot, { withFileTypes: true }).filter(e => e.isDirectory()).length;
+    if (remainingWs === 0) fs.rmdirSync(wsRoot);
+  } catch {}
+
+  output({
+    completed: true,
+    workstream: name,
+    archived_to: toPosixPath(path.relative(cwd, archivePath)),
+    remaining_workstreams: remainingWs,
+    reverted_to_flat: remainingWs === 0,
+  }, raw);
+}
+
+// ─── Active Workstream Commands ──────────────────────────────────────────────
+
+function cmdWorkstreamSet(cwd, name, raw) {
+  if (!name) {
+    setActiveWorkstream(cwd, null);
+    output({ active: null, cleared: true }, raw);
+    return;
+  }
+
+  if (!/^[a-zA-Z0-9_-]+$/.test(name)) {
+    output({ active: null, error: 'invalid_name', message: 'Workstream name must be alphanumeric, hyphens, and underscores only' }, raw);
+    return;
+  }
+
+  const wsDir = path.join(planningRoot(cwd), 'workstreams', name);
+  if (!fs.existsSync(wsDir)) {
+    output({ active: null, error: 'not_found', workstream: name }, raw);
+    return;
+  }
+
+  setActiveWorkstream(cwd, name);
+  output({ active: name, set: true }, raw, name);
+}
+
+function cmdWorkstreamGet(cwd, raw) {
+  const active = getActiveWorkstream(cwd);
+  const wsRoot = path.join(planningRoot(cwd), 'workstreams');
+  output({ active, mode: fs.existsSync(wsRoot) ? 'workstream' : 'flat' }, raw, active || 'none');
+}
+
+function cmdWorkstreamProgress(cwd, raw) {
+  const root = planningRoot(cwd);
+  const wsRoot = path.join(root, 'workstreams');
+
+  if (!fs.existsSync(wsRoot)) {
+    output({ mode: 'flat', workstreams: [], message: 'No workstreams — operating in flat mode' }, raw);
+    return;
+  }
+
+  const active = getActiveWorkstream(cwd);
+  const entries = fs.readdirSync(wsRoot, { withFileTypes: true });
+  const workstreams = [];
+
+  for (const entry of entries) {
+    if (!entry.isDirectory()) continue;
+
+    const wsDir = path.join(wsRoot, entry.name);
+    const phasesDir = path.join(wsDir, 'phases');
+
+    const phaseDirsProgress = readSubdirectories(phasesDir);
+    const phaseCount = phaseDirsProgress.length;
+    let completedCount = 0, totalPlans = 0, completedPlans = 0;
+    for (const d of phaseDirsProgress) {
+      try {
+        const phaseFiles = fs.readdirSync(path.join(phasesDir, d));
+        const plans = filterPlanFiles(phaseFiles);
+        const summaries = filterSummaryFiles(phaseFiles);
+        totalPlans += plans.length;
+        completedPlans += Math.min(summaries.length, plans.length);
+        if (plans.length > 0 && summaries.length >= plans.length) completedCount++;
+      } catch {}
+    }
+
+    let roadmapPhaseCount = phaseCount;
+    try {
+      const roadmapContent = fs.readFileSync(path.join(wsDir, 'ROADMAP.md'), 'utf-8');
+      const phaseMatches = roadmapContent.match(/^###?\s+Phase\s+\d/gm);
+      if (phaseMatches) roadmapPhaseCount = phaseMatches.length;
+    } catch {}
+
+    let status = 'unknown', currentPhase = null;
+    try {
+      const stateContent = fs.readFileSync(path.join(wsDir, 'STATE.md'), 'utf-8');
+      status = stateExtractField(stateContent, 'Status') || 'unknown';
+      currentPhase = stateExtractField(stateContent, 'Current Phase');
+    } catch {}
+
+    workstreams.push({
+      name: entry.name,
+      active: entry.name === active,
+      status,
+      current_phase: currentPhase,
+      phases: `${completedCount}/${roadmapPhaseCount}`,
+      plans: `${completedPlans}/${totalPlans}`,
+      progress_percent: roadmapPhaseCount > 0 ? Math.round((completedCount / roadmapPhaseCount) * 100) : 0,
+    });
+  }
+
+  output({ mode: 'workstream', active, workstreams, count: workstreams.length }, raw);
+}
+
+// ─── Collision Detection ────────────────────────────────────────────────────
+
+/**
+ * Return other workstreams that are NOT complete.
+ * Used to detect whether the milestone has active parallel work
+ * when a workstream finishes its last phase.
+ */
+function getOtherActiveWorkstreams(cwd, excludeWs) {
+  const wsRoot = path.join(planningRoot(cwd), 'workstreams');
+  if (!fs.existsSync(wsRoot)) return [];
+
+  const entries = fs.readdirSync(wsRoot, { withFileTypes: true });
+  const others = [];
+
+  for (const entry of entries) {
+    if (!entry.isDirectory() || entry.name === excludeWs) continue;
+
+    const wsDir = path.join(wsRoot, entry.name);
+    const statePath = path.join(wsDir, 'STATE.md');
+
+    let status = 'unknown', currentPhase = null;
+    try {
+      const content = fs.readFileSync(statePath, 'utf-8');
+      status = stateExtractField(content, 'Status') || 'unknown';
+      currentPhase = stateExtractField(content, 'Current Phase');
+    } catch {}
+
+    if (status.toLowerCase().includes('milestone complete') ||
+      status.toLowerCase().includes('archived')) {
+      continue;
+    }
+
+    const phasesDir = path.join(wsDir, 'phases');
+    const phaseDirsOther = readSubdirectories(phasesDir);
+    const phaseCount = phaseDirsOther.length;
+    let completedCount = 0;
+    for (const d of phaseDirsOther) {
+      try {
+        const phaseFiles = fs.readdirSync(path.join(phasesDir, d));
+        const plans = filterPlanFiles(phaseFiles);
+        const summaries = filterSummaryFiles(phaseFiles);
+        if (plans.length > 0 && summaries.length >= plans.length) completedCount++;
+      } catch {}
+    }
+
+    others.push({ name: entry.name, status, current_phase: currentPhase, phases: `${completedCount}/${phaseCount}` });
+  }
+
+  return others;
+}
+
+module.exports = {
+  migrateToWorkstreams,
+  cmdWorkstreamCreate,
+  cmdWorkstreamList,
+  cmdWorkstreamStatus,
+  cmdWorkstreamComplete,
+  cmdWorkstreamSet,
+  cmdWorkstreamGet,
+  cmdWorkstreamProgress,
+  getOtherActiveWorkstreams,
+};
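The slug rule in `cmdWorkstreamCreate` above (lowercase, collapse non-alphanumeric runs into hyphens, trim edge hyphens) can be exercised on its own; a minimal sketch — the `slugify` helper name is ours, not part of the package:

```javascript
// Mirrors the normalization in cmdWorkstreamCreate: lowercase, collapse
// every run of non-alphanumerics to a single hyphen, strip edge hyphens.
function slugify(name) {
  return name.toLowerCase().replace(/[^a-z0-9]+/g, '-').replace(/^-+|-+$/g, '');
}

console.log(slugify('Auth & Billing v2')); // auth-billing-v2
console.log(slugify('--API Cleanup--'));   // api-cleanup
console.log(slugify('!!!'));               // '' — rejected as invalid upstream
```

An empty result is what triggers the "must contain at least one alphanumeric character" error path in the command.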
@@ -50,7 +50,7 @@ Plans execute autonomously. Checkpoints formalize interaction points where human
 <task type="auto">
 <name>Start dev server for verification</name>
 <action>Run `npm run dev` in background, wait for "ready" message, capture port</action>
-<verify>
+<verify>fetch http://localhost:3000 returns 200</verify>
 <done>Dev server running at http://localhost:3000</done>
 </task>
 
@@ -240,7 +240,7 @@ Plans execute autonomously. Checkpoints formalize interaction points where human
 <name>Deploy to Vercel</name>
 <files>.vercel/, vercel.json</files>
 <action>Run `vercel --yes` to deploy</action>
-<verify>vercel ls shows deployment,
+<verify>vercel ls shows deployment, fetch returns 200</verify>
 </task>
 
 <!-- If vercel returns "Error: Not authenticated", OpenCode creates checkpoint on the fly -->
@@ -261,7 +261,7 @@ Plans execute autonomously. Checkpoints formalize interaction points where human
 <task type="auto">
 <name>Retry Vercel deployment</name>
 <action>Run `vercel --yes` (now authenticated)</action>
-<verify>vercel ls shows deployment,
+<verify>vercel ls shows deployment, fetch returns 200</verify>
 </task>
 ```
 
@@ -455,8 +455,8 @@ I'll verify: vercel whoami returns your account
 npm run dev &
 DEV_SERVER_PID=$!
 
-# Wait for ready (max 30s)
-timeout 30 bash -c 'until
+# Wait for ready (max 30s) — uses fetch() for cross-platform compatibility
+timeout 30 bash -c 'until node -e "fetch(\"http://localhost:3000\").then(r=>{process.exit(r.ok?0:1)}).catch(()=>process.exit(1))" 2>/dev/null; do sleep 1; done'
 ```
 
 **Port conflicts:** Kill stale process (`lsof -ti:3000 | xargs kill`) or use alternate port (`--port 3001`).
@@ -489,7 +489,9 @@ timeout 30 bash -c 'until curl -s localhost:3000 > /dev/null 2>&1; do sleep 1; d
 | Auth error | Create auth gate checkpoint |
 | Network timeout | Retry with backoff, then checkpoint if persistent |
 
-**Never present a checkpoint with broken verification environment.** If
+**Never present a checkpoint with broken verification environment.** If the local server isn't responding, don't ask user to "visit localhost:3000".
+
+> **Cross-platform note:** Use `node -e "fetch('http://localhost:3000').then(r=>console.log(r.status))"` instead of `curl` for health checks. `curl` is broken on Windows MSYS/Git bash due to SSL/path mangling issues.
 
 ```xml
 <!-- WRONG: Checkpoint with broken environment -->
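The `fetch()`-based health check introduced in the hunk above can be factored into a small reusable helper; a sketch assuming Node 18+ (where `fetch` is a global) — the `healthCheck`/`waitForServer` names are illustrative, not part of the package:

```javascript
// healthCheck(url) — resolves true when the endpoint answers with a 2xx
// status, false on any other status or network error. No dependencies,
// which sidesteps the curl breakage on Windows MSYS/Git Bash noted above.
async function healthCheck(url) {
  try {
    const res = await fetch(url); // rejects on connection refused / DNS error
    return res.ok;                // res.ok is true for statuses 200–299
  } catch {
    return false;                 // report "not ready" rather than throwing
  }
}

// Poll until the dev server is up, or give up after timeoutMs (default ~30s),
// mirroring the `timeout 30 bash -c 'until …'` loop in the diff.
async function waitForServer(url, timeoutMs = 30000) {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    if (await healthCheck(url)) return true;
    await new Promise((resolve) => setTimeout(resolve, 1000));
  }
  return false;
}
```

Unlike the shell one-liner, the failure mode here is explicit: a `false` return means "keep waiting or checkpoint", never a crashed verification step.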
@@ -502,7 +504,7 @@ timeout 30 bash -c 'until curl -s localhost:3000 > /dev/null 2>&1; do sleep 1; d
 <task type="auto">
 <name>Fix server startup issue</name>
 <action>Investigate error, fix root cause, restart server</action>
-<verify>
+<verify>fetch http://localhost:3000 returns 200</verify>
 </task>
 
 <task type="checkpoint:human-verify">
@@ -608,7 +610,7 @@ timeout 30 bash -c 'until curl -s localhost:3000 > /dev/null 2>&1; do sleep 1; d
 <task type="auto">
 <name>Start dev server for auth testing</name>
 <action>Run `npm run dev` in background, wait for ready signal</action>
-<verify>
+<verify>fetch http://localhost:3000 returns 200</verify>
 <done>Dev server running at http://localhost:3000</done>
 </task>
 
@@ -651,7 +653,7 @@ timeout 30 bash -c 'until curl -s localhost:3000 > /dev/null 2>&1; do sleep 1; d
 <task type="auto">
 <name>Start dev server</name>
 <action>Run `npm run dev` in background</action>
-<verify>
+<verify>fetch http://localhost:3000 returns 200</verify>
 </task>
 
 <task type="checkpoint:human-verify" gate="blocking">
@@ -677,7 +679,7 @@ timeout 30 bash -c 'until curl -s localhost:3000 > /dev/null 2>&1; do sleep 1; d
 <task type="auto">
 <name>Deploy to Vercel</name>
 <action>Run `vercel --yes`. Capture URL.</action>
-<verify>vercel ls shows deployment,
+<verify>vercel ls shows deployment, fetch returns 200</verify>
 </task>
 
 <task type="checkpoint:human-verify">
@@ -32,9 +32,8 @@ With existing decimals:
 ## Extract Values
 
 ```bash
-
-
-BASE_PHASE=$(printf '%s\n' "$DECIMAL_INFO" | jq -r '.base_phase')
+DECIMAL_PHASE=$(node "$HOME/.config/opencode/get-shit-done/bin/gsd-tools.cjs" phase next-decimal "${AFTER_PHASE}" --pick next)
+BASE_PHASE=$(node "$HOME/.config/opencode/get-shit-done/bin/gsd-tools.cjs" phase next-decimal "${AFTER_PHASE}" --pick base_phase)
 ```
 
 Or with --raw flag:
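For intuition, the `--pick next` / `--pick base_phase` values above can be approximated by a small function. This is a sketch only: the real logic lives in `gsd-tools.cjs`, and the function name, signature, and numbering rule here are assumptions.

```javascript
// Hypothetical approximation of "next decimal phase" selection:
// given a phase and the list of existing phases, derive the base
// phase and the next free decimal under it.
function nextDecimal(afterPhase, existingPhases) {
  const base = afterPhase.split('.')[0]; // "02.1" -> "02"
  // highest existing decimal under this base, e.g. ["02.1", "02.3"] -> 3
  const highest = existingPhases
    .filter((p) => p.startsWith(base + '.'))
    .map((p) => parseInt(p.split('.')[1], 10))
    .reduce((a, b) => Math.max(a, b), 0);
  return { base_phase: base, next: `${base}.${highest + 1}` };
}
```

So with phases `02.1` and `02.3` already present, the next decimal after phase `02` would be `02.4`.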
@@ -61,6 +61,10 @@ node "$HOME/.config/opencode/get-shit-done/bin/gsd-tools.cjs" commit "docs: init
 
 Each task gets its own commit immediately after completion.
 
+> **Parallel agents:** When running as a parallel executor (spawned by execute-phase),
+> use `--no-verify` on all commits to avoid pre-commit hook lock contention.
+> The orchestrator validates hooks once after all agents complete.
+
 ```
 {type}({phase}-{plan}): {task-name}
 
@@ -246,3 +250,46 @@ Each plan produces 2-4 commits (tasks + metadata). Clear, granular, bisectable.
 - "Commit noise" irrelevant when consumer is OpenCode, not humans
 
 </commit_strategy_rationale>
+
+<sub_repos_support>
+
+## Multi-Repo Workspace Support (sub_repos)
+
+For workspaces with separate git repos (e.g., `backend/`, `frontend/`, `shared/`), GSD routes commits to each repo independently.
+
+### Configuration
+
+In `.planning/config.json`, list sub-repo directories under `planning.sub_repos`:
+
+```json
+{
+  "planning": {
+    "commit_docs": false,
+    "sub_repos": ["backend", "frontend", "shared"]
+  }
+}
+```
+
+Set `commit_docs: false` so planning docs stay local and are not committed to any sub-repo.
+
+### How It Works
+
+1. **Auto-detection:** During `/gsd-new-project`, directories with their own `.git` folder are detected and offered for selection as sub-repos. On subsequent runs, `loadConfig` auto-syncs the `sub_repos` list with the filesystem — adding newly created repos and removing deleted ones. This means `config.json` may be rewritten automatically when repos change on disk.
+2. **File grouping:** Code files are grouped by their sub-repo prefix (e.g., `backend/src/api/users.ts` belongs to the `backend/` repo).
+3. **Independent commits:** Each sub-repo receives its own atomic commit via `gsd-tools.cjs commit-to-subrepo`. File paths are made relative to the sub-repo root before staging.
+4. **Planning stays local:** The `.planning/` directory is not committed; it acts as cross-repo coordination.
+
+### Commit Routing
+
+Instead of the standard `commit` command, use `commit-to-subrepo` when `sub_repos` is configured:
+
+```bash
+node $HOME/.config/opencode/get-shit-done/bin/gsd-tools.cjs commit-to-subrepo "feat(02-01): add user API" \
+  --files backend/src/api/users.ts backend/src/types/user.ts frontend/src/components/UserForm.tsx
+```
+
+This stages `src/api/users.ts` and `src/types/user.ts` in the `backend/` repo, and `src/components/UserForm.tsx` in the `frontend/` repo, then commits each independently with the same message.
+
+Files that don't match any configured sub-repo are reported as unmatched.
+
+</sub_repos_support>
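The routing the new section describes reduces to a grouping step: split the file list by sub-repo prefix and strip that prefix so paths are relative to each repo root. A hypothetical approximation (the real implementation is inside `gsd-tools.cjs`, and `groupBySubRepo` is an invented name):

```javascript
// Hypothetical sketch of commit-to-subrepo's file routing: group files
// by configured sub-repo prefix and make each path relative to its repo.
// Assumes Node 18+ for the ??= operator.
function groupBySubRepo(files, subRepos) {
  const groups = {};    // repo name -> repo-relative paths to stage
  const unmatched = []; // files outside every configured sub-repo
  for (const file of files) {
    const repo = subRepos.find((r) => file.startsWith(r + '/'));
    if (repo) {
      (groups[repo] ??= []).push(file.slice(repo.length + 1));
    } else {
      unmatched.push(file); // reported back to the caller, not committed
    }
  }
  return { groups, unmatched };
}
```

Each entry in `groups` would then receive one `git add` plus one commit in its own repo, all with the same message.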
@@ -26,6 +26,8 @@ task(
 
 **Note:** Opus-tier agents resolve to `"inherit"` (not `"opus"`). This causes the agent to use the parent session's model, avoiding conflicts with organization policies that may block specific opus versions.
 
+If `model_profile` is `"inherit"`, all agents resolve to `"inherit"` (useful for OpenCode `/model`).
+
 ## Usage
 
 1. Resolve once at orchestration start