@fredericboyer/dev-team 0.4.1 → 0.6.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -30,7 +30,8 @@ Based on the classification, select:
  **Implementing agent** (one):
  | Domain | Agent | When |
  |--------|-------|------|
- | Backend, API, data, infrastructure | @dev-team-voss | API design, data modeling, system architecture |
+ | Backend, API, data | @dev-team-voss | API design, data modeling, system architecture |
+ | Infrastructure, IaC, containers, deployment | @dev-team-hamilton | Dockerfiles, CI/CD, Terraform, Helm, k8s, health checks, monitoring |
  | Frontend, UI, components | @dev-team-mori | Components, accessibility, UX patterns |
  | Tests, TDD | @dev-team-beck | Writing tests, translating audit findings into test cases |
  | Tooling, CI/CD, hooks, config | @dev-team-deming | Linters, formatters, CI/CD, automation |
@@ -42,9 +43,10 @@ Based on the classification, select:
  |---------|-------|--------------------|
  | Security | @dev-team-szabo | Always for code changes |
  | Quality/correctness | @dev-team-knuth | Always for code changes |
- | Architecture | @dev-team-brooks | When touching module boundaries, dependencies, or ADRs |
- | Documentation | @dev-team-tufte | When APIs or public interfaces change |
- | Release | @dev-team-conway | When version-related files change |
+ | Architecture & quality attributes | @dev-team-brooks | Always for code changes (structural review + performance, maintainability, scalability assessment) |
+ | Documentation | @dev-team-tufte | When APIs, public interfaces, or documentation files change |
+ | Operations | @dev-team-hamilton | When infrastructure files change (Dockerfile, docker-compose, CI workflows, Terraform, Helm, k8s, health checks, logging/monitoring config, .env templates) |
+ | Release | @dev-team-conway | When version-related files change (package.json, changelog, version bumps, release workflows) |

  ### 3. Architect pre-assessment

@@ -94,6 +96,24 @@ When no `[DEFECT]` findings remain:
  4. List which agents reviewed and their verdicts.
  5. Write learnings to agent memory files.

+ ### Parallel orchestration
+
+ When working on multiple issues simultaneously (see ADR-019):
+
+ 1. **Analyze for file independence**: Spawn @dev-team-brooks with the full batch of issues. Brooks identifies conflict groups — issues that touch overlapping files and must execute sequentially. Independent issues can proceed in parallel.
+
+ 2. **Spawn implementation agents in parallel**: For each independent issue, spawn one implementing agent on its own branch (`feat/<issue>-<description>`). Each agent works without awareness of other parallel agents. Track state in `.claude/dev-team-parallel.json`.
+
+ 3. **Wait for all implementations to complete**: Do not start reviews until every implementation agent has finished. This is the synchronization barrier.
+
+ 4. **Launch the review wave**: Spawn Szabo + Knuth + Brooks (plus conditional reviewers) in parallel across all branches. Each reviewer receives the diff for one specific branch and produces classified findings scoped to that branch.
+
+ 5. **Route defects back per-branch**: Collect all findings. Route `[DEFECT]` items back to the original implementing agent for each branch. After fixes, run another review wave. Repeat until convergence or the per-branch iteration limit is reached.
+
+ 6. **Spawn Borges once at end**: After the final review wave clears across all branches, run @dev-team-borges once with visibility into all branches. This ensures cross-branch coherence for memory files, learnings, and system improvement recommendations.
+
+ Conflict groups (issues with file overlaps) execute sequentially within the group but in parallel with other groups and independent issues.
+
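The state file that the orchestrator writes and the Stop hook reads can be sketched as follows. The field names (`mode`, `phase`, `issues`, `reviewWave`) are the ones the hook validates; the issue numbers and branch names are invented for illustration:

```javascript
// Illustrative shape of .claude/dev-team-parallel.json. Field names match what
// dev-team-parallel-loop.js validates; the concrete values are made up.
const exampleState = {
  mode: "parallel", // must be exactly "parallel"
  phase: "review-wave", // implementation | sync-barrier | review-wave | defect-routing | borges-completion | done
  issues: [
    { issue: 12, status: "reviewing" }, // one entry per issue: numeric id + status
    { issue: 15, status: "approved" },
  ],
  reviewWave: {
    // present during the review-wave phase
    wave: 1,
    branches: ["feat/12-auth", "feat/15-retries"],
    findings: { "feat/12-auth": [] }, // keyed by branch once its reviewer reports
  },
};

// The hook treats a branch as pending until it appears in findings:
const reported = Object.keys(exampleState.reviewWave.findings);
const pending = exampleState.reviewWave.branches.filter((b) => !reported.includes(b));
console.log(pending); // ["feat/15-retries"]
```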
  ## Focus areas

  You always check for:
@@ -0,0 +1,69 @@
+ ---
+ name: dev-team-hamilton
+ description: Infrastructure engineer. Use for Dockerfiles, docker-compose, CI/CD workflows, Terraform/Pulumi/CloudFormation, Helm/k8s, IaC, deployment configs, health checks, monitoring/observability config, and .env templates.
+ tools: Read, Edit, Write, Bash, Grep, Glob, Agent
+ model: sonnet
+ memory: project
+ ---
+
+ You are Hamilton, an infrastructure engineer named after Margaret Hamilton (Apollo flight software lead). She built the Apollo guidance software with error detection and recovery engineered in from the start — not bolted on after the fact. You bring that same philosophy to infrastructure.
+
+ Your philosophy: "Operational resilience is not a feature you add. It is how you build."
+
+ ## How you work
+
+ **Memory hygiene**: Read your MEMORY.md at session start. Remove stale entries (overruled challenges, outdated patterns). If approaching 200 lines, compress older entries into summaries.
+
+ Before writing any code:
+ 1. Spawn Explore subagents in parallel to understand the infrastructure landscape, find existing patterns, and map dependencies.
+ 2. Look ahead — trace what services, ports, volumes, and networks will be affected and spawn parallel subagents to analyze each dependency before you start.
+ 3. Return concise summaries to the main thread, not raw exploration output.
+
+ After completing implementation:
+ 1. Report cross-domain impacts: flag changes for @dev-team-voss (application config affected), @dev-team-szabo (security surface changed), @dev-team-knuth (coverage gaps to audit).
+ 2. Spawn @dev-team-szabo and @dev-team-knuth as background reviewers.
+
+ ## Focus areas
+
+ You always check for:
+ - **Health checks**: Every service must have a health check. No deployment config without liveness and readiness probes.
+ - **Resource limits**: Containers without CPU/memory limits are production incidents waiting to happen. Always set them.
+ - **Graceful degradation**: What happens when a dependency is unavailable? Infrastructure must handle partial failures without cascading.
+ - **State management**: IaC must manage state properly. Remote state backends, state locking, and drift detection are non-negotiable.
+ - **Observability**: Logging, metrics, and tracing must be configured at the infrastructure level. If you cannot see it, you cannot fix it.
+ - **Portability**: Infrastructure should work across environments (dev, staging, prod) with minimal config changes. Avoid hardcoded values.
+ - **Secret management**: Secrets never go in Dockerfiles, compose files, or IaC templates. Use secret managers, vault references, or environment injection.
+ - **Deployment quality**: Rolling updates, rollback strategies, and blue-green/canary patterns where appropriate. Zero-downtime deployments by default.
+
+ ## Challenge style
+
+ You construct operational failure scenarios. When reviewing or implementing, you ask "what happens in production when" questions:
+
+ - "What happens when this container exceeds its memory limit?"
+ - "What happens when the health check endpoint is slow to respond?"
+ - "What happens when you need to roll back this Terraform change?"
+ - "What happens when this service starts before its database is ready?"
+
+ Always provide a concrete operational scenario, never abstract concerns.
+
+ ## Challenge protocol
+
+ When reviewing another agent's work, classify each concern:
+ - `[DEFECT]`: Concretely wrong. Will produce incorrect behavior. **Blocks progress.**
+ - `[RISK]`: Not wrong today, but creates a likely failure mode. Advisory.
+ - `[QUESTION]`: Decision needs justification. Advisory.
+ - `[SUGGESTION]`: Works, but here is a specific improvement. Advisory.
+
+ Rules:
+ 1. Every challenge must include a concrete scenario, input, or code reference.
+ 2. Only `[DEFECT]` blocks progress.
+ 3. When challenged: address directly, concede when wrong, justify with a counter-scenario when you disagree.
+ 4. One exchange each before escalating to the human.
+ 5. Acknowledge good work when you see it.
+
+ ## Learning
+
+ After completing work, write key learnings to your MEMORY.md:
+ - Infrastructure patterns discovered in this codebase
+ - Conventions the team has established for deployment and operations
+ - Challenges you raised that were accepted (reinforce) or overruled (calibrate)
@@ -32,6 +32,7 @@ You always check for:
  - **Performance as UX**: A correct response delivered after the user has given up is a wrong response.
  - **Input validation feedback**: The user should never have to guess why something did not work. Validation must be immediate, specific, and actionable.
  - **Progressive enhancement**: The interface must degrade gracefully, not catastrophically.
+ - **API compatibility**: Backward compatibility of interfaces, data format interop at API boundaries, and breaking change detection in API contracts. A version bump the consumer did not expect is a broken contract.

  ## Challenge style

@@ -33,6 +33,23 @@ You always check for:
  - **Consistency**: Do different parts of the documentation contradict each other? Are naming conventions consistent across docs?
  - **Audience mismatch**: Is the documentation pitched at the right level for its audience? API reference should be precise; tutorials should be approachable.

+ ## Doc-code drift detection mode
+
+ When triggered by implementation changes (not documentation changes), you operate in **drift detection mode**. The hook message will say "implementation changed — check for doc drift" instead of "documentation changed."
+
+ In this mode, your job is to determine whether the implementation change has made any documentation stale, incomplete, or misleading. Check:
+
+ 1. **README accuracy**: Does the README reflect this change? New features, new agents, new CLI flags, changed behavior — all must be documented.
+ 2. **CLAUDE.md template accuracy**: Does the `templates/CLAUDE.md` (or the project's own `CLAUDE.md`) reflect this change? Agent descriptions, hook triggers, workflow instructions.
+ 3. **ADR consistency**: Does this change contradict any existing ADR in `docs/adr/`? If the implementation diverges from an ADR, either the code or the ADR is wrong.
+ 4. **ADR coverage**: Should this change have its own ADR? New patterns, new conventions, changed module boundaries.
+ 5. **Inline documentation**: Are JSDoc comments, inline comments, and type annotations still accurate after this change?
+
+ Produce classified findings as usual:
+ - `[DEFECT]` — Documentation is concretely wrong or missing and will mislead users/developers. Example: a new agent was added but the agent table in CLAUDE.md does not list it.
+ - `[RISK]` — Documentation is likely to drift further. Example: a hook's behavior changed but the description in CLAUDE.md uses vague language that still technically applies.
+ - `[SUGGESTION]` — Documentation could be improved. Example: a new CLI flag exists but the README examples do not demonstrate it.
+
  ## Challenge style

  You compare documentation claims against code reality:
@@ -1,6 +1,6 @@
  ---
  name: dev-team-voss
- description: Backend engineer. Use for API design, data modeling, system architecture, error handling, and performance. Delegates exploration to subagents and spawns reviewers after implementation.
+ description: Backend engineer. Use for API design, data modeling, system architecture, error handling, application configuration, database migrations, and data compatibility. Infrastructure/IaC tasks go to @dev-team-hamilton.
  tools: Read, Edit, Write, Bash, Grep, Glob, Agent
  model: sonnet
  memory: project
@@ -32,6 +32,7 @@ You always check for:
  - **API contract clarity**: Inputs validated. Outputs predictable. Side effects documented.
  - **Concurrency and race conditions**: Shared mutable state is guilty until proven innocent.
  - **Dependency hygiene**: Every external dependency is a liability. Justify its presence.
+ - **Data compatibility**: Schema evolution safety, migration safety, and data format versioning. A migration that cannot roll back is a time bomb.

  ## Challenge style

@@ -0,0 +1,188 @@
+ #!/usr/bin/env node
+
+ /**
+  * dev-team-parallel-loop.js
+  * Stop hook — enforces the parallel review wave protocol (ADR-019).
+  *
+  * When a parallel state file (.claude/dev-team-parallel.json) exists:
+  * - Reads current phase and issue statuses
+  * - Enforces the sync barrier: blocks review until all implementations complete
+  * - Enforces phase transitions: prevents skipping phases
+  * - Prompts the next action for each phase (review wave, defect routing, Borges)
+  *
+  * State file: .claude/dev-team-parallel.json (created by Drucker orchestrator)
+  */
+
+ "use strict";
+
+ const fs = require("fs");
+ const path = require("path");
+
+ const STATE_FILE = path.join(process.cwd(), ".claude", "dev-team-parallel.json");
+
+ // No state file means no active parallel loop — allow normal exit
+ if (!fs.existsSync(STATE_FILE)) {
+   process.exit(0);
+ }
+
+ let state;
+ try {
+   state = JSON.parse(fs.readFileSync(STATE_FILE, "utf-8"));
+ } catch {
+   // Corrupted state file — warn and allow exit
+   console.error("[dev-team parallel-loop] Warning: corrupted dev-team-parallel.json. Removing.");
+   try {
+     fs.unlinkSync(STATE_FILE);
+   } catch {
+     /* ignore */
+   }
+   process.exit(0);
+ }
+
+ // Validate required fields
+ if (!state.mode || state.mode !== "parallel" || !Array.isArray(state.issues) || !state.phase) {
+   console.error("[dev-team parallel-loop] Warning: invalid parallel state structure. Removing.");
+   try {
+     fs.unlinkSync(STATE_FILE);
+   } catch {
+     /* ignore */
+   }
+   process.exit(0);
+ }
+
+ const phase = state.phase;
+ const issues = state.issues;
+
+ // Validate per-issue required fields and status values
+ const VALID_STATUSES = [
+   "pending",
+   "implementing",
+   "implemented",
+   "reviewing",
+   "defects-found",
+   "fixing",
+   "approved",
+ ];
+ for (let idx = 0; idx < issues.length; idx++) {
+   const entry = issues[idx];
+   if (!entry || typeof entry.issue !== "number" || typeof entry.status !== "string") {
+     const output = JSON.stringify({
+       decision: "block",
+       reason: `[dev-team parallel-loop] Issue at index ${idx} is missing required fields (issue, status). Cannot proceed with invalid parallel state.`,
+     });
+     console.log(output);
+     process.exit(0);
+   }
+   if (!VALID_STATUSES.includes(entry.status)) {
+     const output = JSON.stringify({
+       decision: "block",
+       reason: `[dev-team parallel-loop] Issue #${entry.issue} has unrecognized status "${entry.status}". Cannot proceed with invalid parallel state.`,
+     });
+     console.log(output);
+     process.exit(0);
+   }
+ }
+
+ // Phase: done — clean up and exit
+ if (phase === "done") {
+   try {
+     fs.unlinkSync(STATE_FILE);
+   } catch {
+     /* ignore */
+   }
+   console.log("[dev-team parallel-loop] Parallel execution complete. State file cleaned up.");
+   process.exit(0);
+ }
+
+ // Phase: implementation — check sync barrier
+ if (phase === "implementation") {
+   const implementing = issues.filter((i) => i.status === "implementing" || i.status === "pending");
+   const implemented = issues.filter(
+     (i) => i.status === "implemented" || i.status === "reviewing" || i.status === "approved",
+   );
+
+   if (implementing.length > 0) {
+     const output = JSON.stringify({
+       decision: "block",
+       reason: `[dev-team parallel-loop] SYNC BARRIER: ${implementing.length} issue(s) still implementing (${implementing.map((i) => "#" + i.issue).join(", ")}). ${implemented.length}/${issues.length} complete. Wait for all implementations to finish before starting reviews.`,
+     });
+     console.log(output);
+     process.exit(0);
+   }
+
+   // All implementations done — allow transition to sync-barrier
+   console.log(
+     `[dev-team parallel-loop] All ${issues.length} implementations complete. Ready for review wave.`,
+   );
+   process.exit(0);
+ }
+
+ // Phase: sync-barrier — remind to start review wave
+ if (phase === "sync-barrier") {
+   const output = JSON.stringify({
+     decision: "block",
+     reason:
+       "[dev-team parallel-loop] All implementations complete. Start the coordinated review wave: spawn Szabo + Knuth (plus conditional reviewers) in parallel across all branches.",
+   });
+   console.log(output);
+   process.exit(0);
+ }
+
+ // Phase: review-wave — check if all reviews reported
+ if (phase === "review-wave") {
+   const wave = state.reviewWave;
+   if (!wave || !Array.isArray(wave.branches) || wave.branches.length === 0) {
+     const output = JSON.stringify({
+       decision: "block",
+       reason:
+         "[dev-team parallel-loop] Invalid review-wave state: reviewWave is missing or has no branches. Cannot proceed without a valid review wave.",
+     });
+     console.log(output);
+     process.exit(0);
+   }
+
+   const reported = Object.keys(wave.findings || {});
+   const pending = wave.branches.filter((b) => !reported.includes(b));
+
+   if (pending.length > 0) {
+     const output = JSON.stringify({
+       decision: "block",
+       reason: `[dev-team parallel-loop] Review wave ${wave.wave}: ${pending.length} branch(es) awaiting review results (${pending.join(", ")}). Collect all findings before routing defects.`,
+     });
+     console.log(output);
+     process.exit(0);
+   }
+
+   // All reviews complete
+   console.log("[dev-team parallel-loop] Review wave complete. Route defects or proceed to Borges.");
+   process.exit(0);
+ }
+
+ // Phase: defect-routing — check if fixes are done
+ if (phase === "defect-routing") {
+   const fixing = issues.filter((i) => i.status === "fixing");
+   if (fixing.length > 0) {
+     const output = JSON.stringify({
+       decision: "block",
+       reason: `[dev-team parallel-loop] ${fixing.length} issue(s) being fixed (${fixing.map((i) => "#" + i.issue).join(", ")}). Wait for fixes, then start another review wave.`,
+     });
+     console.log(output);
+     process.exit(0);
+   }
+   process.exit(0);
+ }
+
+ // Phase: borges-completion — remind to run Borges
+ if (phase === "borges-completion") {
+   const output = JSON.stringify({
+     decision: "block",
+     reason:
+       "[dev-team parallel-loop] Run Borges across all branches for cross-branch coherence review. After Borges completes, transition to 'done'.",
+   });
+   console.log(output);
+   process.exit(0);
+ }
+
+ // Unknown phase — allow exit with warning
+ console.error(`[dev-team parallel-loop] Warning: unknown phase "${phase}". Allowing exit.`);
+ process.exit(0);
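The sync-barrier decision in the `implementation` phase above reduces to a simple partition of issue statuses. As an illustration, it can be isolated as a pure function (this mirrors the hook's filters; it is a sketch, not code from the package):

```javascript
// Mirrors the implementation-phase filters in dev-team-parallel-loop.js:
// the barrier holds while any issue is still "pending" or "implementing".
function syncBarrier(issues) {
  const implementing = issues.filter(
    (i) => i.status === "implementing" || i.status === "pending",
  );
  return {
    blocked: implementing.length > 0,
    waitingOn: implementing.map((i) => "#" + i.issue),
  };
}

// One issue still in flight: the barrier blocks and names it.
console.log(syncBarrier([
  { issue: 3, status: "implemented" },
  { issue: 7, status: "implementing" },
])); // blocked: true, waiting on #7

// Everything implemented: reviews may start.
console.log(syncBarrier([
  { issue: 3, status: "implemented" },
  { issue: 7, status: "implemented" },
])); // blocked: false
```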
@@ -74,20 +74,14 @@ if (API_PATTERNS.some((p) => p.test(fullPath))) {
    flags.push("@dev-team-mori (API contract may affect UI)");
  }

- // Config/infra patterns → flag for Voss
- const INFRA_PATTERNS = [
-   /docker/,
-   /\.env/,
-   /config/,
-   /migration/,
-   /database/,
-   /\.sql$/,
-   /infrastructure/,
-   /deploy/,
- ];
+ // App config patterns → flag for Voss
+ // Voss owns: application config, migrations, database, .env (app-specific)
+ // Intentional overlap: Docker files trigger Hamilton below; .env files trigger
+ // Voss here for app-config review. Both perspectives are valuable.
+ const APP_CONFIG_PATTERNS = [/\.env/, /config/, /migration/, /database/, /\.sql$/];

- if (INFRA_PATTERNS.some((p) => p.test(fullPath))) {
-   flags.push("@dev-team-voss (architectural/config change)");
+ if (APP_CONFIG_PATTERNS.some((p) => p.test(fullPath))) {
+   flags.push("@dev-team-voss (app config/data change)");
  }

  // Tooling patterns → flag for Deming
@@ -123,7 +117,29 @@ if (DOC_PATTERNS.some((p) => p.test(fullPath))) {
    flags.push("@dev-team-tufte (documentation changed)");
  }

- // Architecture patterns → flag for Architect
+ // Doc-drift patterns → flag Tufte for implementation changes that may need doc updates
+ const DOC_DRIFT_PATTERNS = [
+   /(?:^|\/)src\/.*\.(ts|js)$/, // New or changed source files
+   /(?:^|\/)templates\/agents\//, // New or changed agent definitions
+   /(?:^|\/)templates\/skills\//, // New or changed skill definitions
+   /(?:^|\/)templates\/hooks\//, // New or changed hook definitions
+   /(?:^|\/)src\/init\.(ts|js)$/, // Installer changes
+   /(?:^|\/)src\/cli\.(ts|js)$/, // CLI entry point changes
+   /(?:^|\/)bin\//, // CLI shim changes
+   /(?:^|\/)package\.json$/, // Dependency or script changes
+ ];
+
+ // Only flag for doc-drift if Tufte was not already flagged for a direct doc change
+ const alreadyFlaggedTufte = flags.some((f) => f.startsWith("@dev-team-tufte"));
+ if (!alreadyFlaggedTufte && DOC_DRIFT_PATTERNS.some((p) => p.test(fullPath))) {
+   flags.push("@dev-team-tufte (implementation changed — check for doc drift)");
+ }
+
+ // Architecture patterns → flag for Architect. For architectural boundary files,
+ // Brooks is flagged here with the "architectural boundary touched" reason. The
+ // dedupe check below skips the generic "quality attribute review" reason for
+ // these files — this is intentional because Brooks's expanded agent definition
+ // already includes quality attribute assessment in every review.
  const ARCH_PATTERNS = [
    /\/adr\//,
    /architecture/,
@@ -132,6 +148,11 @@ const ARCH_PATTERNS = [
    /\/core\//,
    /\/domain\//,
    /\/shared\//,
+   /\/lib\//,
+   /\/plugins?\//,
+   /\/middleware\//,
+   /tsconfig/,
+   /webpack|vite|rollup|esbuild/,
  ];

  if (ARCH_PATTERNS.some((p) => p.test(fullPath))) {
@@ -146,18 +167,66 @@ const RELEASE_PATTERNS = [
    /changelog/i,
    /version/,
    /\.github\/workflows\/.*release/,
+   /\.github\/workflows\/.*publish/,
+   /\.github\/workflows\/.*deploy/,
+   /\.npmrc$/,
+   /\.npmignore$/,
+   /release\.config/,
+   /lerna\.json$/,
  ];

  if (RELEASE_PATTERNS.some((p) => p.test(fullPath))) {
    flags.push("@dev-team-conway (version/release artifact changed)");
  }

- // Always flag Knuth for non-test implementation files
+ // Operations/infra patterns → flag for Hamilton
+ // NOTE: the .env pattern intentionally overlaps with APP_CONFIG_PATTERNS (Voss).
+ // Voss reviews .env files for app-config correctness, while Hamilton reviews
+ // them, along with Docker, CI, and IaC files, for operational concerns
+ // (resource limits, health checks, secret management). Dual review is by design.
+ const OPS_PATTERNS = [
+   /dockerfile/,
+   /docker-compose/,
+   /\.dockerignore$/,
+   /\.github\/workflows\//,
+   /\.gitlab-ci/,
+   /jenkinsfile/i,
+   /terraform\//,
+   /pulumi\//,
+   /cloudformation\//,
+   /helm\//,
+   /k8s\//,
+   /\.tf$/,
+   /\.tfvars$/,
+   /health[-_]?check/,
+   /(?:^|\/)(?:monitoring|prometheus|grafana|datadog)\.(?:ya?ml|json|conf|config|toml)$/, // monitoring config files (not src/monitoring.ts)
+   /(?:^|\/)(?:logging|logs)\.(?:ya?ml|json|conf|config|toml)$/, // logging config files (not src/logging.ts)
+   /(?:^|\/)(?:alerting|alerts?)\.(?:ya?ml|json|conf|config|toml)$/, // alerting config files
+   /(?:^|\/)(?:observability|otel|opentelemetry)\.(?:ya?ml|json|conf|config|toml)$/, // observability config files
+   /(?<!\/src)\/(?:monitoring|logging|alerting|observability)\//, // ops directories (but not under src/)
+   /\.env\.example$/,
+   /\.env\.template$/,
+   /env\.template$/,
+ ];
+
+ if (OPS_PATTERNS.some((p) => p.test(fullPath) || p.test(basename))) {
+   flags.push("@dev-team-hamilton (infrastructure/operations change)");
+ }
+
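A quick way to sanity-check routing patterns like these is to run candidate paths through them directly. The regexes below are copied from the hunk above; the file paths are invented examples:

```javascript
// A few of the OPS_PATTERNS from the hunk above, tested against sample paths.
const patterns = [/docker-compose/, /\.tf$/, /\.env\.example$/, /health[-_]?check/];

function flagsHamilton(filePath) {
  return patterns.some((p) => p.test(filePath));
}

console.log(flagsHamilton("deploy/docker-compose.yml")); // true
console.log(flagsHamilton("terraform/main.tf")); // true
console.log(flagsHamilton("src/utils/strings.ts")); // false
```

One thing such a check surfaces quickly: most of these regexes are case-sensitive, so a pattern like `/dockerfile/` matches `dockerfile` but not `Dockerfile` unless the hook lowercases the tested path first.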
+ // Always flag Knuth and Brooks for non-test implementation files
  const isTestFile = /\.(test|spec)\.|__tests__|\/tests?\//.test(fullPath);
  const isCodeFile = /\.(js|ts|jsx|tsx|py|rb|go|java|rs|c|cpp|cs)$/.test(fullPath);

  if (isCodeFile && !isTestFile) {
    flags.push("@dev-team-knuth (new or changed code path to audit)");
+   if (!flags.some((f) => f.startsWith("@dev-team-brooks"))) {
+     flags.push("@dev-team-brooks (quality attribute review)");
+   }
+ }
+
+ // Flag Beck for test file changes (test quality review)
+ if (isTestFile && isCodeFile) {
+   flags.push("@dev-team-beck (test file changed — review test quality)");
  }

  if (flags.length === 0) {
@@ -14,16 +14,62 @@

  "use strict";

+ const { createHash } = require("crypto");
+ const { execFileSync } = require("child_process");
  const fs = require("fs");
+ const os = require("os");
  const path = require("path");
- const { execFileSync } = require("child_process");
+
+ /**
+  * Cached git diff — reads from a temp file if it was written < 5 seconds ago,
+  * otherwise shells out to git and writes the result for subsequent hooks.
+  * Cache key includes cwd hash so different repos don't share cache.
+  */
+ function cachedGitDiff(args, timeoutMs) {
+   const cwdHash = createHash("md5").update(process.cwd()).digest("hex").slice(0, 8);
+   const argsKey = args.join("-").replace(/[^a-zA-Z0-9-]/g, "");
+   const cacheFile = path.join(os.tmpdir(), `dev-team-git-cache-${cwdHash}-${argsKey}.txt`);
+   let skipWrite = false;
+   try {
+     const stat = fs.lstatSync(cacheFile);
+     // Reject symlinks to prevent symlink attacks (attacker could point cache
+     // file at a sensitive path and have us overwrite it on the next write)
+     if (stat.isSymbolicLink()) {
+       try {
+         fs.unlinkSync(cacheFile);
+       } catch {
+         // If we can't remove the symlink, skip writing to avoid following it
+         skipWrite = true;
+       }
+     } else if (Date.now() - stat.mtimeMs < 5000) {
+       return fs.readFileSync(cacheFile, "utf-8");
+     }
+   } catch {
+     // No cache or stale — fall through to git call
+   }
+   const result = execFileSync("git", args, { encoding: "utf-8", timeout: timeoutMs });
+   if (!skipWrite) {
+     try {
+       // Atomic write: write to a temp file then rename to close the TOCTOU window
+       const tmpFile = `${cacheFile}.${process.pid}.tmp`;
+       fs.writeFileSync(tmpFile, result, { mode: 0o600 });
+       fs.renameSync(tmpFile, cacheFile);
+       // Best-effort permission tightening for cache files from older versions
+       try {
+         fs.chmodSync(cacheFile, 0o600);
+       } catch {
+         /* best effort */
+       }
+     } catch {
+       // Best effort — don't fail the hook over caching
+     }
+   }
+   return result;
+ }

  let stagedFiles = "";
  try {
-   stagedFiles = execFileSync("git", ["diff", "--cached", "--name-only"], {
-     encoding: "utf-8",
-     timeout: 5000,
-   });
+   stagedFiles = cachedGitDiff(["diff", "--cached", "--name-only"], 2000);
  } catch {
    // Not in a git repo or git not available
    process.exit(0);
@@ -79,10 +125,7 @@ const hasMemoryUpdates = files.some(
  if (hasImplFiles && !hasMemoryUpdates) {
    let unstagedMemory = false;
    try {
-     const unstaged = execFileSync("git", ["diff", "--name-only"], {
-       encoding: "utf-8",
-       timeout: 5000,
-     });
+     const unstaged = cachedGitDiff(["diff", "--name-only"], 2000);
      unstagedMemory = unstaged
        .split("\n")
        .map((f) => f.split("\\").join("/"))
@@ -14,9 +14,59 @@

  "use strict";

+ const { createHash } = require("crypto");
  const { execFileSync } = require("child_process");
+ const fs = require("fs");
+ const os = require("os");
  const path = require("path");

+ /**
+  * Cached git diff — reads from a temp file if it was written < 5 seconds ago,
+  * otherwise shells out to git and writes the result for subsequent hooks.
+  * Cache key includes cwd hash so different repos don't share cache.
+  */
+ function cachedGitDiff(args, timeoutMs) {
+   const cwdHash = createHash("md5").update(process.cwd()).digest("hex").slice(0, 8);
+   const argsKey = args.join("-").replace(/[^a-zA-Z0-9-]/g, "");
+   const cacheFile = path.join(os.tmpdir(), `dev-team-git-cache-${cwdHash}-${argsKey}.txt`);
+   let skipWrite = false;
+   try {
+     const stat = fs.lstatSync(cacheFile);
+     // Reject symlinks to prevent symlink attacks (attacker could point cache
+     // file at a sensitive path and have us overwrite it on the next write)
+     if (stat.isSymbolicLink()) {
+       try {
+         fs.unlinkSync(cacheFile);
+       } catch {
+         // If we can't remove the symlink, skip writing to avoid following it
+         skipWrite = true;
+       }
+     } else if (Date.now() - stat.mtimeMs < 5000) {
+       return fs.readFileSync(cacheFile, "utf-8");
+     }
+   } catch {
+     // No cache or stale — fall through to git call
+   }
+   const result = execFileSync("git", args, { encoding: "utf-8", timeout: timeoutMs });
+   if (!skipWrite) {
+     try {
+       // Atomic write: write to a temp file then rename to close the TOCTOU window
+       const tmpFile = `${cacheFile}.${process.pid}.tmp`;
+       fs.writeFileSync(tmpFile, result, { mode: 0o600 });
+       fs.renameSync(tmpFile, cacheFile);
+       // Best-effort permission tightening for cache files from older versions
+       try {
+         fs.chmodSync(cacheFile, 0o600);
+       } catch {
+         /* best effort */
+       }
+     } catch {
+       // Best effort — don't fail the hook over caching
+     }
+   }
+   return result;
+ }
+
  let input = {};
  try {
    input = JSON.parse(process.argv[2] || "{}");
@@ -82,10 +132,7 @@ if (SKIP_PATTERNS.some((p) => p.test(filePath))) {
  // Check if any test file has been modified in this session
  let changedFiles = "";
  try {
-   changedFiles = execFileSync("git", ["diff", "--name-only"], {
-     encoding: "utf-8",
-     timeout: 5000,
-   });
+   changedFiles = cachedGitDiff(["diff", "--name-only"], 2000);
  } catch {
    // If git is not available or fails, allow the change
    process.exit(0);
@@ -105,7 +152,6 @@ if (hasTestChanges) {
  // No test changes — check if a corresponding test file already exists.
  // This allows refactoring (modifying existing tested code) without
  // requiring the test file to also be modified.
- const fs = require("fs");
  const dir = path.dirname(filePath);
  const name = path.basename(filePath, ext);
@@ -50,6 +50,10 @@
        {
          "type": "command",
          "command": "node .claude/hooks/dev-team-task-loop.js"
+       },
+       {
+         "type": "command",
+         "command": "node .claude/hooks/dev-team-parallel-loop.js"
        }
      ]
    }