worclaude 1.8.0 → 1.9.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +11 -7
- package/package.json +1 -1
- package/src/commands/doctor.js +319 -0
- package/src/commands/init.js +8 -13
- package/src/commands/upgrade.js +6 -11
- package/src/core/config.js +19 -1
- package/src/core/merger.js +34 -16
- package/src/core/remover.js +6 -1
- package/src/core/scaffolder.js +20 -5
- package/src/index.js +6 -0
- package/templates/commands/commit-push-pr.md +31 -9
- package/templates/commands/end.md +7 -3
- package/templates/commands/refactor-clean.md +39 -31
- package/templates/commands/start.md +67 -7
- package/templates/core/claude-md.md +1 -0
- package/templates/settings/base.json +18 -2
- package/templates/skills/universal/context-management.md +55 -0
package/README.md
CHANGED

@@ -10,7 +10,7 @@
 
 [Full Documentation](https://sefaertunc.github.io/Worclaude/) · [Interactive Demo](https://sefaertunc.github.io/Worclaude/demo/) · [npm](https://www.npmjs.com/package/worclaude)
 
-Worclaude scaffolds a complete Claude Code workflow into any project in seconds. It implements all [tips by Boris Cherny](https://www.howborisusesclaudecode.com/) — the creator of Claude Code at Anthropic — as a reusable, upgradable scaffold. One `init` command gives you
+Worclaude scaffolds a complete Claude Code workflow into any project in seconds. It implements all [tips by Boris Cherny](https://www.howborisusesclaudecode.com/) — the creator of Claude Code at Anthropic — as a reusable, upgradable scaffold. One `init` command gives you 25 agents, 16 slash commands, 14 skills, hooks, permissions, and a CLAUDE.md template tuned for your tech stack. Whether you're starting fresh or adding structure to an existing project, Worclaude handles the setup so you can focus on building.
 
 ---
 
@@ -18,25 +18,27 @@ Worclaude scaffolds a complete Claude Code workflow into any project in seconds.
 
 `worclaude init` installs a production-ready Claude Code workflow:
 
-**Agents (
+**Agents (25 total)**
 
 - 5 universal: plan-reviewer, code-simplifier, test-writer, build-validator, verify-app
--
+- 20 optional across 6 categories: Backend, Frontend, DevOps, Quality, Documentation, Data/AI
 
-**Slash Commands (
-`/start` `/end` `/commit-push-pr` `/review-plan` `/techdebt` `/verify` `/compact-safe` `/status` `/update-claude-md` `/setup` `/sync` `/conflict-resolver`
+**Slash Commands (16)**
+`/start` `/end` `/commit-push-pr` `/review-plan` `/techdebt` `/verify` `/compact-safe` `/status` `/update-claude-md` `/setup` `/sync` `/conflict-resolver` `/review-changes` `/build-fix` `/refactor-clean` `/test-coverage`
 
-**Skills (
+**Skills (14)**
 
--
+- 10 universal knowledge files (testing, git conventions, context management, security, and more)
 - 3 project-specific templates filled in by `/setup`
 - 1 generated agent routing guide (dynamically built from your agent selection)
 
 **Hooks**
 
+- SessionStart context injection (auto-loads CLAUDE.md, PROGRESS.md, and last session on launch)
 - PostToolUse formatter (auto-formats on every write)
 - PostCompact re-injection (re-reads key files after compaction)
 - Stop notifications (desktop alert when Claude finishes)
+- Hook profiles (`WORCLAUDE_HOOK_PROFILE`) — minimal, standard, or strict
 
 **Configuration**
 
@@ -70,6 +72,8 @@ For parallel tasks, run Claude with worktrees: `claude --worktree --tmux`
 | `worclaude backup` | Create timestamped backup of workflow files |
 | `worclaude restore` | Restore from a previous backup |
 | `worclaude diff` | Compare current setup vs latest version |
+| `worclaude delete` | Remove worclaude workflow from project |
+| `worclaude doctor` | Validate workflow installation health |
 
 The `init` command detects existing setups and merges intelligently — no data is overwritten without your confirmation. Use `upgrade` to pull in new features while preserving your customizations.
 
package/package.json
CHANGED

package/src/commands/doctor.js
ADDED

@@ -0,0 +1,319 @@
+import path from 'node:path';
+import { readWorkflowMeta, workflowMetaExists, getPackageVersion } from '../core/config.js';
+import { hashFile } from '../utils/hash.js';
+import { fileExists, readFile, listFilesRecursive } from '../utils/file.js';
+import {
+  UNIVERSAL_AGENTS,
+  COMMAND_FILES,
+  UNIVERSAL_SKILLS,
+  TEMPLATE_SKILLS,
+} from '../data/agents.js';
+import * as display from '../utils/display.js';
+
+// Check categories
+const PASS = 'pass';
+const WARN = 'warn';
+const FAIL = 'fail';
+
+function result(status, label, detail) {
+  return { status, label, detail };
+}
+
+function printResult(r) {
+  const icon =
+    r.status === PASS
+      ? display.green('✓')
+      : r.status === WARN
+        ? display.yellow('⚠')
+        : display.red('✗');
+  const text = r.status === PASS ? display.dimColor(r.label) : display.white(r.label);
+  console.log(`  ${icon} ${text}`);
+  if (r.detail && r.status !== PASS) {
+    console.log(`    ${display.dimColor(r.detail)}`);
+  }
+}
+
+async function checkWorkflowMeta(projectRoot) {
+  if (!(await workflowMetaExists(projectRoot))) {
+    return result(FAIL, 'workflow-meta.json', 'Missing — run `worclaude init` to create');
+  }
+  const meta = await readWorkflowMeta(projectRoot);
+  if (!meta) {
+    return result(FAIL, 'workflow-meta.json', 'Exists but contains invalid JSON');
+  }
+  if (!meta.version || !meta.projectTypes || !meta.techStack) {
+    return result(
+      WARN,
+      'workflow-meta.json',
+      'Missing required fields (version, projectTypes, or techStack)'
+    );
+  }
+  return result(PASS, 'workflow-meta.json', null);
+}
+
+async function checkClaudeMd(projectRoot) {
+  const claudeMdPath = path.join(projectRoot, 'CLAUDE.md');
+  if (!(await fileExists(claudeMdPath))) {
+    return result(FAIL, 'CLAUDE.md', 'Missing — run `worclaude init` to create');
+  }
+  try {
+    const content = await readFile(claudeMdPath);
+    const lines = content.split('\n').length;
+    if (lines < 10) {
+      return result(
+        WARN,
+        'CLAUDE.md',
+        `Only ${lines} lines — may be a stub. Run /setup to fill it in.`
+      );
+    }
+    return result(PASS, 'CLAUDE.md', null);
+  } catch {
+    return result(FAIL, 'CLAUDE.md', 'Exists but could not be read');
+  }
+}
+
+async function checkSettingsJson(projectRoot) {
+  const settingsPath = path.join(projectRoot, '.claude', 'settings.json');
+  if (!(await fileExists(settingsPath))) {
+    return result(FAIL, 'settings.json', 'Missing — run `worclaude init` to create');
+  }
+  try {
+    const raw = await readFile(settingsPath);
+    const settings = JSON.parse(raw);
+
+    const issues = [];
+    if (!settings.permissions?.allow || settings.permissions.allow.length === 0) {
+      issues.push('no permissions configured');
+    }
+    if (!settings.hooks || Object.keys(settings.hooks).length === 0) {
+      issues.push('no hooks configured');
+    }
+    if (!settings.hooks?.PostCompact) {
+      issues.push('missing PostCompact hook (context recovery after compaction)');
+    }
+    if (!settings.hooks?.SessionStart) {
+      issues.push('missing SessionStart hook (session persistence)');
+    }
+
+    if (issues.length > 0) {
+      return result(WARN, 'settings.json', issues.join('; '));
+    }
+    return result(PASS, 'settings.json', null);
+  } catch {
+    return result(FAIL, 'settings.json', 'Contains invalid JSON');
+  }
+}
+
+async function checkAgents(projectRoot, meta) {
+  const agentsDir = path.join(projectRoot, '.claude', 'agents');
+  const results = [];
+
+  // Check universal agents
+  for (const agent of UNIVERSAL_AGENTS) {
+    const agentPath = path.join(agentsDir, `${agent}.md`);
+    if (!(await fileExists(agentPath))) {
+      results.push(result(FAIL, `agents/${agent}.md`, 'Missing universal agent'));
+    }
+  }
+
+  // Check selected optional agents from meta
+  if (meta?.optionalAgents) {
+    for (const agent of meta.optionalAgents) {
+      const agentPath = path.join(agentsDir, `${agent}.md`);
+      if (!(await fileExists(agentPath))) {
+        results.push(result(WARN, `agents/${agent}.md`, 'Selected optional agent is missing'));
+      }
+    }
+  }
+
+  if (results.length === 0) {
+    const totalExpected = UNIVERSAL_AGENTS.length + (meta?.optionalAgents?.length || 0);
+    results.push(result(PASS, `agents/ (${totalExpected} expected, all present)`, null));
+  }
+
+  return results;
+}
+
+async function checkCommands(projectRoot) {
+  const commandsDir = path.join(projectRoot, '.claude', 'commands');
+  const missing = [];
+
+  for (const cmd of COMMAND_FILES) {
+    const cmdPath = path.join(commandsDir, `${cmd}.md`);
+    if (!(await fileExists(cmdPath))) {
+      missing.push(cmd);
+    }
+  }
+
+  if (missing.length === 0) {
+    return [result(PASS, `commands/ (${COMMAND_FILES.length} expected, all present)`, null)];
+  }
+  return missing.map((cmd) => result(WARN, `commands/${cmd}.md`, 'Missing command'));
+}
+
+async function checkSkills(projectRoot) {
+  const skillsDir = path.join(projectRoot, '.claude', 'skills');
+  const missing = [];
+  const allExpected = [...UNIVERSAL_SKILLS, ...TEMPLATE_SKILLS, 'agent-routing'];
+
+  for (const skill of allExpected) {
+    const skillPath = path.join(skillsDir, `${skill}.md`);
+    if (!(await fileExists(skillPath))) {
+      missing.push(skill);
+    }
+  }
+
+  if (missing.length === 0) {
+    return [result(PASS, `skills/ (${allExpected.length} expected, all present)`, null)];
+  }
+  return missing.map((s) => result(WARN, `skills/${s}.md`, 'Missing skill'));
+}
+
+async function checkHashIntegrity(projectRoot, meta) {
+  if (!meta?.fileHashes || Object.keys(meta.fileHashes).length === 0) {
+    return [result(WARN, 'File integrity', 'No file hashes in workflow-meta.json — cannot verify')];
+  }
+
+  let modified = 0;
+  let missing = 0;
+  let intact = 0;
+
+  for (const [relPath, storedHash] of Object.entries(meta.fileHashes)) {
+    const fullPath = path.join(projectRoot, '.claude', ...relPath.split('/'));
+    if (!(await fileExists(fullPath))) {
+      missing++;
+    } else {
+      const currentHash = await hashFile(fullPath);
+      if (currentHash !== storedHash) {
+        modified++;
+      } else {
+        intact++;
+      }
+    }
+  }
+
+  const total = Object.keys(meta.fileHashes).length;
+  const results = [];
+
+  if (missing > 0) {
+    results.push(result(FAIL, `File integrity: ${missing}/${total} files missing`, null));
+  }
+  if (modified > 0) {
+    results.push(
+      result(PASS, `File integrity: ${modified}/${total} files customized (expected)`, null)
+    );
+  }
+  if (results.length === 0) {
+    results.push(
+      result(PASS, `File integrity: all ${total} files present (${intact} intact)`, null)
+    );
+  }
+
+  return results;
+}
+
+async function checkSessions(projectRoot) {
+  const sessionsDir = path.join(projectRoot, '.claude', 'sessions');
+  if (!(await fileExists(sessionsDir))) {
+    return result(
+      WARN,
+      'sessions/',
+      "Directory missing — session persistence won't work. Run `worclaude init` or create .claude/sessions/ manually."
+    );
+  }
+  return result(PASS, 'sessions/', null);
+}
+
+async function checkDocSpecs(projectRoot) {
+  const results = [];
+  const progressPath = path.join(projectRoot, 'docs', 'spec', 'PROGRESS.md');
+  const specPath = path.join(projectRoot, 'docs', 'spec', 'SPEC.md');
+
+  if (!(await fileExists(progressPath))) {
+    results.push(
+      result(WARN, 'docs/spec/PROGRESS.md', 'Missing — /start and /sync depend on this')
+    );
+  }
+  if (!(await fileExists(specPath))) {
+    results.push(result(WARN, 'docs/spec/SPEC.md', 'Missing — plan-reviewer references this'));
+  }
+
+  if (results.length === 0) {
+    results.push(result(PASS, 'docs/spec/ (PROGRESS.md + SPEC.md present)', null));
+  }
+  return results;
+}
+
+async function checkPendingReviewFiles(projectRoot) {
+  const pending = [];
+  try {
+    const claudeDir = path.join(projectRoot, '.claude');
+    const allFiles = await listFilesRecursive(claudeDir);
+    for (const fp of allFiles) {
+      const rel = path.relative(claudeDir, fp).split(path.sep).join('/');
+      if (rel.endsWith('.workflow-ref.md')) {
+        pending.push(rel);
+      }
+    }
+  } catch {
+    // .claude dir might not exist
+  }
+
+  const suggestionsPath = path.join(projectRoot, 'CLAUDE.md.workflow-suggestions');
+  if (await fileExists(suggestionsPath)) {
+    pending.push('CLAUDE.md.workflow-suggestions');
+  }
+
+  if (pending.length === 0) {
+    return [result(PASS, 'No pending review files', null)];
+  }
+  return pending.map((f) => result(WARN, `Pending review: ${f}`, 'Merge or delete this file'));
+}
+
+export async function doctorCommand() {
+  const projectRoot = process.cwd();
+  const version = await getPackageVersion();
+
+  display.newline();
+  display.sectionHeader('WORCLAUDE DOCTOR');
+  display.dim(`CLI version: v${version}`);
+  display.newline();
+
+  // Core files
+  display.barLine(display.white('Core Files'));
+  const metaResult = await checkWorkflowMeta(projectRoot);
+  printResult(metaResult);
+
+  const meta = await readWorkflowMeta(projectRoot);
+
+  printResult(await checkClaudeMd(projectRoot));
+  printResult(await checkSettingsJson(projectRoot));
+  printResult(await checkSessions(projectRoot));
+  display.newline();
+
+  // Components
+  display.barLine(display.white('Components'));
+  for (const r of await checkAgents(projectRoot, meta)) printResult(r);
+  for (const r of await checkCommands(projectRoot)) printResult(r);
+  for (const r of await checkSkills(projectRoot)) printResult(r);
+  display.newline();
+
+  // Docs
+  display.barLine(display.white('Documentation'));
+  for (const r of await checkDocSpecs(projectRoot)) printResult(r);
+  display.newline();
+
+  // Integrity
+  display.barLine(display.white('Integrity'));
+  for (const r of await checkHashIntegrity(projectRoot, meta)) printResult(r);
+  for (const r of await checkPendingReviewFiles(projectRoot)) printResult(r);
+  display.newline();
+
+  // Summary
+  if (metaResult.status === FAIL) {
+    display.error('Workflow is not installed. Run `worclaude init` to set up.');
+  } else {
+    display.success('Doctor complete. Review any warnings above.');
+  }
+  display.newline();
+}
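The settings.json check above reduces to four independent guards over the parsed settings object. A standalone sketch of that logic (the `settingsIssues` helper name is illustrative, not part of the package's API):

```javascript
// Mirrors the four guards in checkSettingsJson: permissions, any hooks,
// and the two specific hooks doctor cares about.
function settingsIssues(settings) {
  const issues = [];
  if (!settings.permissions?.allow || settings.permissions.allow.length === 0) {
    issues.push('no permissions configured');
  }
  if (!settings.hooks || Object.keys(settings.hooks).length === 0) {
    issues.push('no hooks configured');
  }
  if (!settings.hooks?.PostCompact) {
    issues.push('missing PostCompact hook (context recovery after compaction)');
  }
  if (!settings.hooks?.SessionStart) {
    issues.push('missing SessionStart hook (session persistence)');
  }
  return issues;
}
```

An empty settings object trips all four guards, which is why doctor reports them joined with `; ` as a single WARN line.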
package/src/commands/init.js
CHANGED

@@ -3,13 +3,13 @@ import inquirer from 'inquirer';
 import ora from 'ora';
 import { scaffoldFile, updateGitignore } from '../core/scaffolder.js';
 import {
+  computeFileHashes,
   createWorkflowMeta,
   getPackageVersion,
   readWorkflowMeta,
   writeWorkflowMeta,
 } from '../core/config.js';
-import { fileExists, writeFile
-import { hashFile } from '../utils/hash.js';
+import { fileExists, writeFile } from '../utils/file.js';
 import * as display from '../utils/display.js';
 import { promptProjectType } from '../prompts/project-type.js';
 import { promptTechStack } from '../prompts/tech-stack.js';

@@ -349,17 +349,7 @@ function buildTemplateVariables(selections) {
 }
 
 async function computeAndWriteWorkflowMeta(projectRoot, selections, version) {
-  const fileHashes =
-  const claudeFiles = await listFilesRecursive(path.join(projectRoot, '.claude'));
-  for (const filePath of claudeFiles) {
-    const relativePath = path
-      .relative(path.join(projectRoot, '.claude'), filePath)
-      .split(path.sep)
-      .join('/');
-    if (relativePath !== 'workflow-meta.json' && relativePath !== 'settings.json') {
-      fileHashes[relativePath] = await hashFile(filePath);
-    }
-  }
+  const fileHashes = await computeFileHashes(projectRoot);
 
   const meta = createWorkflowMeta({
     version,

@@ -478,6 +468,10 @@ async function scaffoldFresh(projectRoot, selections, variables, settingsStr, ve
   }
   spinner.text = 'Created docs/spec/';
 
+  // Create sessions directory for session persistence
+  await writeFile(path.join(projectRoot, '.claude', 'sessions', '.gitkeep'), '');
+  spinner.text = 'Created .claude/sessions/';
+
   await computeAndWriteWorkflowMeta(projectRoot, selections, version);
   spinner.text = 'Created .claude/workflow-meta.json';
 
@@ -501,6 +495,7 @@ function displayFreshSuccess(selections, skipped) {
   display.success(`.claude/agents/${display.dimColor(` ${totalAgents} agents`)}`);
   display.success(`.claude/commands/${display.dimColor(` ${COMMAND_FILES.length} commands`)}`);
   display.success(`.claude/skills/${display.dimColor(` ${totalSkills} skills`)}`);
+  display.success('.claude/sessions/');
   display.success('.mcp.json');
   display.success('.gitignore');
   if (skipped.progressMd) {
package/src/commands/upgrade.js
CHANGED

@@ -3,6 +3,7 @@ import { execSync } from 'node:child_process';
 import inquirer from 'inquirer';
 import ora from 'ora';
 import {
+  computeFileHashes,
   readWorkflowMeta,
   workflowMetaExists,
   writeWorkflowMeta,

@@ -12,8 +13,7 @@ import { createBackup } from '../core/backup.js';
 import { categorizeFiles } from '../core/file-categorizer.js';
 import { buildSettingsJson, mergeSettingsPermissionsAndHooks } from '../core/merger.js';
 import { readTemplate, updateGitignore } from '../core/scaffolder.js';
-import {
-import { writeFile, fileExists, listFilesRecursive } from '../utils/file.js';
+import { writeFile, fileExists } from '../utils/file.js';
 import { getLatestNpmVersion } from '../utils/npm.js';
 import * as display from '../utils/display.js';

@@ -211,16 +211,11 @@ export async function upgradeCommand() {
     spinner.text = 'Settings merged...';
   }
 
+  // Ensure sessions directory exists for session persistence
+  await writeFile(path.join(projectRoot, '.claude', 'sessions', '.gitkeep'), '');
+
   // Recompute file hashes
-  const fileHashes =
-  const claudeDir = path.join(projectRoot, '.claude');
-  const allFiles = await listFilesRecursive(claudeDir);
-  for (const filePath of allFiles) {
-    const relKey = path.relative(claudeDir, filePath).split(path.sep).join('/');
-    if (relKey !== 'workflow-meta.json' && relKey !== 'settings.json') {
-      fileHashes[relKey] = await hashFile(filePath);
-    }
-  }
+  const fileHashes = await computeFileHashes(projectRoot);
 
   // Ensure .gitignore has worclaude entries
   await updateGitignore(projectRoot);
package/src/core/config.js
CHANGED

@@ -1,7 +1,8 @@
 import path from 'node:path';
 import { readFileSync } from 'node:fs';
 import { fileURLToPath } from 'node:url';
-import { readFile, writeFile, fileExists } from '../utils/file.js';
+import { readFile, writeFile, fileExists, listFilesRecursive } from '../utils/file.js';
+import { hashFile } from '../utils/hash.js';
 
 const __dirname = path.dirname(fileURLToPath(import.meta.url));
 const pkgPath = path.resolve(__dirname, '..', '..', 'package.json');

@@ -57,3 +58,20 @@ export async function writeWorkflowMeta(projectRoot, meta) {
   const metaPath = path.join(projectRoot, '.claude', 'workflow-meta.json');
   await writeFile(metaPath, JSON.stringify(meta, null, 2));
 }
+
+export async function computeFileHashes(projectRoot) {
+  const claudeDir = path.join(projectRoot, '.claude');
+  const allFiles = await listFilesRecursive(claudeDir);
+  const fileHashes = {};
+  for (const filePath of allFiles) {
+    const relKey = path.relative(claudeDir, filePath).split(path.sep).join('/');
+    if (
+      relKey !== 'workflow-meta.json' &&
+      relKey !== 'settings.json' &&
+      !relKey.startsWith('sessions/')
+    ) {
+      fileHashes[relKey] = await hashFile(filePath);
+    }
+  }
+  return fileHashes;
+}
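The new `computeFileHashes` centralizes the exclusion rule that init.js and upgrade.js previously duplicated, and adds `sessions/` to it. The filter on its own (the `shouldHash` helper name is mine; the real function also walks `.claude/` and hashes each file):

```javascript
// Keys are paths relative to .claude/ with forward slashes. Meta, settings,
// and anything under sessions/ are excluded from integrity hashing.
function shouldHash(relKey) {
  return (
    relKey !== 'workflow-meta.json' &&
    relKey !== 'settings.json' &&
    !relKey.startsWith('sessions/')
  );
}
```

Excluding `sessions/` matters because session summaries change every run; hashing them would make every install look "customized" to `worclaude doctor`.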
package/src/core/merger.js
CHANGED

@@ -238,36 +238,50 @@ export async function mergeSettingsPermissionsAndHooks(projectRoot, workflowSett
     if (!existing.hooks[category]) existing.hooks[category] = [];
 
     const existingEntries = existing.hooks[category];
-    const
+    const existingByMatcher = new Map();
+    for (const entry of existingEntries) {
+      if (!existingByMatcher.has(entry.matcher)) {
+        existingByMatcher.set(entry.matcher, []);
+      }
+      existingByMatcher.get(entry.matcher).push(entry);
+    }
+    const matched = new Set();
 
     for (const workflowEntry of workflowHooks[category]) {
-
-
-
-
-
-
-
-
+      const candidates = existingByMatcher.get(workflowEntry.matcher) || [];
+      const workflowCmd = workflowEntry.hooks?.[0]?.command || '';
+
+      // Try exact match first (identical command = skip)
+      const exactMatch = candidates.find(
+        (c) => !matched.has(c) && (c.hooks?.[0]?.command || '') === workflowCmd
+      );
+      if (exactMatch) {
+        matched.add(exactMatch);
+        continue;
+      }
+
+      // Try unmatched candidate with same matcher (conflict)
+      const conflictCandidate = candidates.find((c) => !matched.has(c));
+      if (conflictCandidate) {
+        matched.add(conflictCandidate);
 
         // Tier 3: conflict — ask user
-        const resolution = await promptHookConflict(category,
+        const resolution = await promptHookConflict(category, conflictCandidate, workflowEntry);
 
         if (resolution === 'replace') {
-          const idx = existingEntries.indexOf(
+          const idx = existingEntries.indexOf(conflictCandidate);
           existingEntries[idx] = workflowEntry;
           report.hookConflicts.push(
             `${category} "${workflowEntry.matcher}": replaced with workflow hook`
           );
         } else if (resolution === 'chain') {
-          const idx = existingEntries.indexOf(
+          const idx = existingEntries.indexOf(conflictCandidate);
           existingEntries[idx] = {
-            matcher:
+            matcher: conflictCandidate.matcher,
             hooks: [
               {
                 type: 'command',
-                command: `${
+                command: `${conflictCandidate.hooks[0].command} && ${workflowEntry.hooks[0].command}`,
               },
             ],
           };

@@ -276,7 +290,7 @@ export async function mergeSettingsPermissionsAndHooks(projectRoot, workflowSett
           report.hookConflicts.push(`${category} "${workflowEntry.matcher}": kept existing hook`);
         }
       } else {
-        // Tier 1: no
+        // Tier 1: no match — append
         existingEntries.push(workflowEntry);
         report.added.hooks++;
       }

@@ -422,6 +436,10 @@ export async function performMerge(
   if (spinner) spinner.start();
 
   await mergeMcpJson(projectRoot, existingScan);
+
+  // Ensure sessions directory exists for session persistence
+  await writeFile(path.join(projectRoot, '.claude', 'sessions', '.gitkeep'), '');
+
   await mergeDocSpecs(projectRoot, existingScan, variables, selections, report);
 
   // Stop spinner before CLAUDE.md merge — interactive prompts for section selection
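The hook merge above is a three-tier matching scheme: an existing entry with the same matcher and identical command is skipped, an unmatched entry with the same matcher but a different command is a conflict (the user picks replace, chain, or keep), and anything else is appended. A minimal, self-contained sketch of just the classification step (the `classifyHook` name is mine; the real code acts on the result inline instead of returning a label):

```javascript
// existingByMatcher: Map<matcher, entry[]>; matched: Set of already-consumed entries.
function classifyHook(existingByMatcher, matched, workflowEntry) {
  const candidates = existingByMatcher.get(workflowEntry.matcher) || [];
  const workflowCmd = workflowEntry.hooks?.[0]?.command || '';
  // Tier 2: identical command under the same matcher — nothing to do.
  const exactMatch = candidates.find(
    (c) => !matched.has(c) && (c.hooks?.[0]?.command || '') === workflowCmd
  );
  if (exactMatch) {
    matched.add(exactMatch);
    return 'skip';
  }
  // Tier 3: same matcher, different command — user must resolve.
  const conflictCandidate = candidates.find((c) => !matched.has(c));
  if (conflictCandidate) {
    matched.add(conflictCandidate);
    return 'conflict';
  }
  // Tier 1: no entry with this matcher — safe to append.
  return 'append';
}
```

The `matched` set is what lets two workflow hooks with the same matcher pair off against two existing entries instead of both colliding with the first one.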
package/src/core/remover.js
CHANGED

@@ -181,7 +181,12 @@ export async function cleanGitignore(projectRoot) {
   const content = await readFile(gitignorePath);
   const lines = content.split(/\r?\n/);
 
-  const REMOVE_LINES = new Set([
+  const REMOVE_LINES = new Set([
+    '# Worclaude (generated workflow files)',
+    '.claude/',
+    '.claude/sessions/',
+    '.claude/workflow-meta.json',
+  ]);
 
   const filtered = lines.filter((line) => !REMOVE_LINES.has(line.trim()));
 
package/src/core/scaffolder.js
CHANGED

@@ -46,7 +46,7 @@ export async function scaffoldDirectory(templateDir, destDir, variables, project
 
 export async function updateGitignore(projectDir) {
   const gitignorePath = path.join(projectDir, '.gitignore');
-  const entries = ['.claude/', '.claude-backup-*/'];
+  const entries = ['.claude/sessions/', '.claude/workflow-meta.json', '.claude-backup-*/'];
   const header = '# Worclaude (generated workflow files)';
 
   let content = '';

@@ -56,12 +56,27 @@ export async function updateGitignore(projectDir) {
     if (err.code !== 'ENOENT') throw err;
   }
 
+  // Migrate: remove old blanket .claude/ entry (and its header) if present
+  const lines = content.split(/\r?\n/);
+  const hasBlanketEntry = lines.some((l) => l.trim() === '.claude/');
+  if (hasBlanketEntry) {
+    const filtered = lines.filter((l) => {
+      const t = l.trim();
+      return t !== '.claude/' && t !== header;
+    });
+    content = filtered.join('\n');
+  }
+
   const missing = entries.filter((entry) => !content.includes(entry));
-  if (missing.length === 0) return false;
+  if (missing.length === 0 && !hasBlanketEntry) return false;
+
+  if (missing.length > 0) {
+    const needsNewline = content.length > 0 && !content.endsWith('\n');
+    const addition = (needsNewline ? '\n' : '') + '\n' + header + '\n' + missing.join('\n') + '\n';
+    content += addition;
+  }
 
-
-  const addition = (needsNewline ? '\n' : '') + '\n' + header + '\n' + missing.join('\n') + '\n';
-  await fs.appendFile(gitignorePath, addition);
+  await fs.writeFile(gitignorePath, content);
   return true;
 }
 
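The `updateGitignore` change switches from appending entries to a full rewrite so it can migrate away from the old blanket `.claude/` ignore (which would have hidden the now-committed agents and commands). The string transformation, extracted as a pure function for illustration (the `migrateGitignore` name and signature are mine; the real function also does the file I/O):

```javascript
// Drops a blanket '.claude/' entry plus its header, then appends any of the
// new, narrower entries that are not already present.
function migrateGitignore(content) {
  const header = '# Worclaude (generated workflow files)';
  const entries = ['.claude/sessions/', '.claude/workflow-meta.json', '.claude-backup-*/'];
  const lines = content.split(/\r?\n/);
  const hasBlanketEntry = lines.some((l) => l.trim() === '.claude/');
  if (hasBlanketEntry) {
    content = lines
      .filter((l) => l.trim() !== '.claude/' && l.trim() !== header)
      .join('\n');
  }
  const missing = entries.filter((e) => !content.includes(e));
  if (missing.length === 0 && !hasBlanketEntry) return content;
  if (missing.length > 0) {
    const needsNewline = content.length > 0 && !content.endsWith('\n');
    content += (needsNewline ? '\n' : '') + '\n' + header + '\n' + missing.join('\n') + '\n';
  }
  return content;
}
```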
package/src/index.js
CHANGED

@@ -9,6 +9,7 @@ import { backupCommand } from './commands/backup.js';
 import { restoreCommand } from './commands/restore.js';
 import { diffCommand } from './commands/diff.js';
 import { deleteCommand } from './commands/delete.js';
+import { doctorCommand } from './commands/doctor.js';
 
 const program = new Command();
 
@@ -52,4 +53,9 @@ program
   .description('Remove worclaude workflow from project')
   .action(deleteCommand);
 
+program
+  .command('doctor')
+  .description('Validate workflow installation health')
+  .action(doctorCommand);
+
 program.parse();
package/templates/commands/commit-push-pr.md
CHANGED
@@ -6,20 +6,42 @@ Do not add Co-Authored-By trailers or AI-generated footers to commits or PR desc
 Feature branches contain ONLY the task changes. Do NOT touch shared-state
 files (see git-conventions.md for the canonical list).

-1.
-
-
-
-
+1. Write a session summary to .claude/sessions/:
+   - Filename: YYYY-MM-DD-HHMM-{short-branch-name}.md
+   - Content format:
+     ```
+     # Session: {date}
+     **Branch:** {current branch}
+     **Task:** {one-line summary of what was worked on}
+
+     ## Completed
+     - {what was done, 3-5 bullet points}
+
+     ## Files Modified
+     - {list key files changed, from git diff --name-only}
+
+     ## Notes for Next Session
+     - {anything the next session should know}
+     ```
+   - Keep it concise — this is for machine consumption at session start,
+     not a detailed report
+2. Stage all changes: git add -A
+3. Write a clear, conventional commit message
+4. Push to the current branch
+5. Create a PR targeting develop: gh pr create --base develop
+6. Include in PR description: title, changes, testing done, reviewer notes

 ## On develop

 Only used for release merges after /sync has been run.

-1.
-
-
-
+1. Write a session summary to .claude/sessions/:
+   - Filename: YYYY-MM-DD-HHMM-{short-branch-name}.md
+   - Same format as the feature branch session summary above
+2. Stage all changes: git add -A
+3. Write a clear, conventional commit message
+4. Push to develop
+5. Create a PR targeting main: gh pr create --base main

 ## On any other branch

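The filename convention above can be exercised as a quick shell sketch (an illustration only, not part of the template; the `no-branch` fallback is an assumption for non-git directories):

```shell
# Sketch of the YYYY-MM-DD-HHMM-{short-branch-name}.md convention.
# "no-branch" is a hypothetical fallback, not part of the template.
branch=$(git rev-parse --abbrev-ref HEAD 2>/dev/null || echo no-branch)
# Keep only the last path segment, e.g. feature/login -> login.
short=${branch##*/}
file=".claude/sessions/$(date +%Y-%m-%d-%H%M)-${short}.md"
echo "$file"
```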
package/templates/commands/end.md
CHANGED
@@ -11,7 +11,11 @@ Do NOT update PROGRESS.md — /sync handles that on develop after merging.
 - What is left to do
 - Decisions or context the next session needs
 - Files that were modified
-3.
-
+3. Write a session summary to .claude/sessions/:
+   - Filename: YYYY-MM-DD-HHMM-{short-branch-name}.md
+   - Same format as /commit-push-pr session summaries
+   - Mark the task as "IN PROGRESS" since /end means work is unfinished
+4. git add -A
+5. git commit -m "wip: handoff for [task description]"
 Use exactly this message format — no trailers or Co-Authored-By lines.
-
+6. git push
package/templates/commands/refactor-clean.md
CHANGED
@@ -1,44 +1,52 @@
-Run a focused cleanup pass on
-
+Run a focused cleanup pass on recently changed code. This command
+runs INLINE in your current session — it reads uncommitted changes,
+improves them in place, and leaves everything uncommitted for
+/commit-push-pr.

-
+Do NOT spawn a subagent or worktree for this. Work directly on
+the files in the current working directory.

-
-- Unused imports and variables
-- Commented-out code blocks (git has history)
-- Unreachable branches after early returns
-- Functions defined but never called
+## What to Clean

-
-
-- Repeated validation patterns → centralize
-- Copy-pasted error handling → extract helper
+1. **Dead code** — unused imports, commented-out blocks, unreachable branches,
+   uncalled functions. Delete them — git has history.

-
-
-
-- Long parameter lists → group into option objects
+2. **Duplication** — identical or near-identical code blocks → extract shared
+   function. Repeated validation → centralize. Copy-pasted error handling →
+   extract helper.

-
-
-
-
+3. **Complexity** — functions over 30 lines → split by responsibility. Nesting
+   deeper than 3 levels → early returns, guard clauses. Long parameter lists →
+   group into option objects.
+
+4. **Consistency** — naming that doesn't match project conventions. Mixed patterns
+   in the same module. Inconsistent error handling.

 ## Process

-1.
-
-
-
+1. Identify changed files: `git diff --name-only` (unstaged) and
+   `git diff --cached --name-only` (staged). If no changes, check
+   `git diff --name-only HEAD~5` for recent commits.
+2. Read each changed file fully — understand context before changing
+3. Check CLAUDE.md for project-specific conventions
+4. Make one improvement at a time, smallest meaningful change first
+5. Run the full test suite after EVERY change
+6. If tests fail, revert immediately — your change broke behavior
+
+## Confidence Filtering
+
+- Only change if >80% confident it improves the code
+- Skip stylistic preferences unless they violate CLAUDE.md
+- Consolidate: "5 functions have duplicated validation" → one fix, not five
+- Prioritize changes that reduce complexity or eliminate bugs

 ## Rules
-- Never change behavior —
-- Never combine cleanup with feature work
-- If a file has low test coverage, do NOT refactor it
--
+- Never change behavior — if tests break, you changed behavior, revert
+- Never combine cleanup with feature work in logic — only clean what exists
+- If a file has low test coverage, flag it but do NOT refactor it
+- Do not commit — leave changes uncommitted for /commit-push-pr

 ## When to Use
-- After
+- After implementing a feature, before /verify and /commit-push-pr
 - Weekly maintenance pass
-- When
-- Before a major release
+- When /review-changes flagged issues you want to fix
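The file-discovery step of the process above can be collapsed into one shell pass (a sketch; it assumes a git checkout and degrades to an empty list elsewhere):

```shell
# Candidate files for cleanup: unstaged, then staged, then recent commits.
changed=$( { git diff --name-only; git diff --cached --name-only; } 2>/dev/null | sort -u)
if [ -z "$changed" ]; then
  # Working tree is clean (or not a repo): fall back to the last 5 commits.
  changed=$(git diff --name-only HEAD~5 2>/dev/null | sort -u)
fi
printf '%s\n' "$changed"
```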
package/templates/commands/start.md
CHANGED
@@ -1,11 +1,71 @@
-
-
+The SessionStart hook has already loaded CLAUDE.md, PROGRESS.md,
+and the most recent session summary into context.
+
+Your job is to supplement that with drift detection and additional context.
+
+## 1. Drift Detection
+
+Show what changed since the last session so there are no surprises.
+Present raw signals only — do NOT interpret or warn.
+
+Run these commands and report the output:
+
+```bash
+# How many commits since last session file?
+LAST_SESSION=$(ls -t .claude/sessions/*.md 2>/dev/null | head -1)
+if [ -n "$LAST_SESSION" ]; then
+  SESSION_DATE=$(echo "$LAST_SESSION" | grep -oP '\d{4}-\d{2}-\d{2}')
+  echo "Commits since last session ($SESSION_DATE):"
+  git log --oneline --since="$SESSION_DATE" 2>/dev/null | head -15
+else
+  echo "No previous session found. Recent commits:"
+  git log --oneline -10 2>/dev/null
+fi
+```
+
+Report as:
+
+```
+## Drift Since Last Session
+- **X commits** since {date}
+- {one-liner per commit, max 15}
+
+Current branch: {branch name}
+```
+
+If there are 0 commits since the last session, just say:
+
+```
+## Drift Since Last Session
+No new commits since last session ({date}). Branch: {branch name}
+```

-
-
-
+Do NOT add commentary like "you should review these" or "there may be
+conflicts." Just the facts.
+
+## 2. Check for Handoff Files
+
+Look in docs/handoffs/ for any HANDOFF*.md files:
+
+- Both HANDOFF-{branch}-{date}.md and legacy HANDOFF_{date}.md
 - Prioritize files matching the current branch name
 - If found, read them for context and report what was handed off

-
-
+## 3. Load Agent Routing
+
+Read .claude/skills/agent-routing.md for agent usage guidance.
+
+## 4. Check for Active Prompt Files
+
+If any PHASE-*-PROMPT.md or implementation prompt file exists in the
+project root, read it and note it.
+
+## 5. Report
+
+Summarize:
+
+- Drift status (from step 1)
+- Any handoffs found (from step 2)
+- What was last completed (from session summary loaded by hook)
+- What's next (from PROGRESS.md loaded by hook)
+- Any blockers or notes
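One portability note on the drift-detection block: `grep -oP` uses GNU grep's PCRE mode, which stock macOS/BSD grep does not ship. The same date extraction works with POSIX character classes under `-E` (the sample path below is hypothetical, for illustration only):

```shell
# Extract the leading YYYY-MM-DD without PCRE (-E instead of -P).
# The filename here is a made-up example, not a real session file.
LAST_SESSION=".claude/sessions/2024-05-01-0930-fix-login.md"
SESSION_DATE=$(basename "$LAST_SESSION" | grep -oE '[0-9]{4}-[0-9]{2}-[0-9]{2}' | head -1)
echo "$SESSION_DATE"   # prints 2024-05-01
```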
package/templates/core/claude-md.md
CHANGED
@@ -23,6 +23,7 @@ See `.claude/skills/` — load only what's relevant:
 - verification.md — How to verify work
 - testing.md — Test philosophy and patterns
 - subagent-usage.md — When and how to use subagents
+- security-checklist.md — Security review checklist
 - agent-routing.md — When and how to use each installed agent (READ EVERY SESSION)
 {project_specific_skills}

package/templates/settings/base.json
CHANGED
@@ -43,14 +43,21 @@
       "matcher": "Write|Edit",
       "hooks": [{
         "type": "command",
-        "command": "{formatter_command}"
+        "command": "p=${WORCLAUDE_HOOK_PROFILE:-standard}; case \"$p\" in minimal) exit 0;; esac; {formatter_command}"
       }]
     },
     {
       "matcher": "Stop",
       "hooks": [{
         "type": "command",
-        "command": "{notification_command}"
+        "command": "p=${WORCLAUDE_HOOK_PROFILE:-standard}; case \"$p\" in minimal) exit 0;; esac; {notification_command}"
+      }]
+    },
+    {
+      "matcher": "Write|Edit",
+      "hooks": [{
+        "type": "command",
+        "command": "p=${WORCLAUDE_HOOK_PROFILE:-standard}; case \"$p\" in strict) ;; *) exit 0;; esac; npx tsc --noEmit 2>/dev/null || echo '[Hook] TypeScript errors detected' >&2"
       }]
     }
   ],
@@ -62,6 +69,15 @@
         "command": "cat CLAUDE.md && cat docs/spec/PROGRESS.md 2>/dev/null || true"
       }]
     }
+  ],
+  "SessionStart": [
+    {
+      "matcher": "",
+      "hooks": [{
+        "type": "command",
+        "command": "echo '=== CLAUDE.md ==='; cat CLAUDE.md 2>/dev/null; echo; echo '=== PROGRESS ==='; cat docs/spec/PROGRESS.md 2>/dev/null; echo; echo '=== LAST SESSION ==='; f=$(ls -t .claude/sessions/*.md 2>/dev/null | head -1); if [ -n \"$f\" ]; then cat \"$f\"; else echo 'No previous session found'; fi; echo; echo '=== BRANCH ==='; git rev-parse --abbrev-ref HEAD 2>/dev/null || echo 'not a git repo'"
+      }]
+    }
   ]
 }
}
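The guard that prefixes each hook command above can be exercised in isolation (a minimal sketch; the function name and its echo text are illustrative, not part of the template):

```shell
# Same pattern as the hook commands: resolve the profile with a
# "standard" default, bail out early for "minimal", else run the body.
run_formatter_hook() {
  p=${WORCLAUDE_HOOK_PROFILE:-standard}
  case "$p" in minimal) return 0 ;; esac
  echo "hook body runs (profile: $p)"
}

WORCLAUDE_HOOK_PROFILE=minimal run_formatter_hook   # guard exits early, no output
WORCLAUDE_HOOK_PROFILE=strict run_formatter_hook    # prints: hook body runs (profile: strict)
```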
package/templates/skills/universal/context-management.md
CHANGED
@@ -61,6 +61,61 @@ cat CLAUDE.md && cat docs/spec/PROGRESS.md 2>/dev/null || true
 This ensures you never lose your bearings after compaction. The hook fires
 automatically — you don't need to re-read these files manually.

+## Session Persistence
+
+The workflow automatically maintains session continuity:
+
+### SessionStart Hook (automatic)
+When a new Claude Code session opens, a hook automatically injects:
+- CLAUDE.md — project conventions and rules
+- PROGRESS.md — current project state
+- The most recent session summary from `.claude/sessions/`
+
+You don't need to manually re-read these. The hook handles it.
+Use /start for additional context (handoff files, agent routing).
+
+### Session Summaries
+Session summaries are written to `.claude/sessions/` by:
+- `/commit-push-pr` — writes a summary before committing (completed work)
+- `/end` — writes a summary before the handoff commit (in-progress work)
+
+These are local files (gitignored) that bridge the gap between sessions.
+They're automatically picked up by the next session's SessionStart hook.
+
+### The Continuity Chain
+```
+Session 1: work → /commit-push-pr → writes session summary → push
+Session 2: SessionStart hook → reads summary → knows what happened → /start for extras
+```
+
+If the session summary is missing or stale, /start still reads PROGRESS.md
+and handoff files as fallback. The system degrades gracefully.
+
+## Hook Profiles
+
+Control which hooks fire via the `WORCLAUDE_HOOK_PROFILE` environment variable:
+
+| Profile | Hooks Active | Use When |
+|---------|-------------|----------|
+| `minimal` | SessionStart, PostCompact only | Exploring, learning, minimal overhead |
+| `standard` | All hooks (default) | Normal development |
+| `strict` | All hooks + TypeScript checking | Pre-release, team CI, maximum safety |
+
+Set in your shell:
+```bash
+export WORCLAUDE_HOOK_PROFILE=minimal   # lightweight
+export WORCLAUDE_HOOK_PROFILE=standard  # default (same as unset)
+export WORCLAUDE_HOOK_PROFILE=strict    # maximum enforcement
+```
+
+Or per-session:
+```bash
+WORCLAUDE_HOOK_PROFILE=strict claude
+```
+
+The default is `standard` if the variable is not set. You don't need to do
+anything for normal development — the default just works.
+
 ## Gotchas

 - Compacting doesn't free as much context as you think. If you've read 20 large files,