flight-rules 0.13.7 → 0.13.8

package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "flight-rules",
- "version": "0.13.7",
+ "version": "0.13.8",
  "description": "An opinionated framework for AI-assisted software development",
  "type": "module",
  "main": "dist/index.js",
package/payload/AGENTS.md CHANGED
@@ -1,6 +1,6 @@
  # Flight Rules – Agent Guidelines
 
- flight_rules_version: 0.13.7
+ flight_rules_version: 0.13.8
 
  This file defines how agents (Claude Code, Cursor, etc.) should work on software projects using the Flight Rules system.
 
@@ -0,0 +1,258 @@
+ # Validate Implementation
+
+ When the user invokes `/impl.validate`, audit completed tasks to verify that code actually matches what the implementation specs claim.
+
+ Adopt the persona of a meticulous QA engineer who trusts specs as the source of truth and systematically verifies that completed work delivers what was promised.
+
+ ## 1. Check Prerequisites
+
+ ### Required: Implementation Structure Must Exist
+
+ Check if `docs/implementation/overview.md` exists with defined areas. If not:
+
+ > "I couldn't find implementation documentation at `docs/implementation/`. There's nothing to validate yet."
+
+ Stop and wait for the user's response.
+
+ ## 2. Determine Scope
+
+ Check if the user specified a scope with the command:
+ - `/impl.validate` → Ask what scope they want
+ - `/impl.validate 2-cli-core` → Validate that area
+ - `/impl.validate 2.1` → Validate that task group
+ - `/impl.validate 2.1.3` → Validate that specific task
+
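The four scope shapes above are distinguishable purely by pattern. As an illustrative sketch (the `classify_scope` helper is hypothetical, not part of Flight Rules), an agent could classify the argument like this:

```python
import re

def classify_scope(arg: str) -> str:
    """Classify an /impl.validate argument by its shape (illustrative only)."""
    if not arg:
        return "ask"          # no argument: ask the user for a scope
    if arg == "all":
        return "all"
    if re.fullmatch(r"\d+\.\d+\.\d+", arg):
        return "task"         # e.g. "2.1.3"
    if re.fullmatch(r"\d+\.\d+", arg):
        return "task-group"   # e.g. "2.1"
    if re.fullmatch(r"\d+-[\w-]+", arg):
        return "area"         # e.g. "2-cli-core"
    return "unknown"
```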
+ If no argument is provided, analyze the implementation structure to identify areas with completed tasks:
+
+ > **Implementation Areas with Completed Tasks:**
+ >
+ > [List areas and count of completed tasks in each]
+ >
+ > Which would you like to validate? You can specify:
+ > - An area (e.g., `2-cli-core`)
+ > - A task group (e.g., `2.1`)
+ > - A specific task (e.g., `2.1.3`)
+ > - `all` to validate everything
+
+ Wait for the user to specify the scope.
+
+ ## 3. Gather Context
+
+ Read the relevant implementation specs based on scope:
+
+ ### For Area Scope
+
+ - Read `docs/implementation/{N}-{area}/index.md`
+ - Read all task group files in the area
+ - Build a list of all tasks marked ✅ Complete
+
+ ### For Task Group Scope
+
+ - Read `docs/implementation/{N}-{area}/{N}.{M}-topic.md`
+ - Build a list of all tasks marked ✅ Complete
+
+ ### For Task Scope
+
+ - Read the specific task from its task group file
+ - Verify it's marked ✅ Complete (if not, nothing to validate)
+
+ Also read:
+ - `docs/prd.md` for context on goals and constraints
+ - `docs/tech-stack.md` if it exists, for understanding the codebase
+
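The "build a list of all tasks marked ✅ Complete" step can be sketched as a small scan. This helper is hypothetical: it assumes `###`-level task headings like `### 2.1.3 Task Name` and the `**Status**: ✅ Complete` line format that §6.2 later uses, which a real task group file may not follow exactly:

```python
import re
from pathlib import Path

def completed_tasks(path: Path) -> list[str]:
    """Collect task IDs whose section contains a '✅ Complete' status line."""
    tasks, current = [], None
    for line in path.read_text(encoding="utf-8").splitlines():
        m = re.match(r"###\s+(\d+\.\d+\.\d+)\s", line)
        if m:
            current = m.group(1)           # entering a new task section
        elif current and "**Status**: ✅ Complete" in line:
            tasks.append(current)          # record it once, then move on
            current = None
    return tasks
```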
+ ## 4. Validate Each Completed Task
+
+ For each task marked ✅ Complete, perform these checks:
+
+ ### 4.1 File/Code Existence
+
+ - Identify files or code mentioned in the task's Approach or Acceptance Criteria
+ - Verify those files actually exist
+ - Check that referenced functions, classes, or modules are present
+
+ ### 4.2 Implementation Match
+
+ - Read the relevant code files
+ - Compare implementation against what the spec describes
+ - Check that the approach was followed (or document if a different approach was taken)
+
+ ### 4.3 Acceptance Criteria Verification
+
+ For each acceptance criterion:
+ - Determine how to verify it (code inspection, test existence, behavior check)
+ - Verify the criterion is met
+ - Document evidence of verification or failure
+
+ ### 4.4 Categorize Results
+
+ For each task, categorize as:
+ - **Verified** — All acceptance criteria met, implementation matches spec
+ - **Partial** — Some criteria met, others missing or incomplete
+ - **Mismatched** — Implementation exists but differs from spec
+ - **Missing** — Code/files referenced don't exist
+
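A minimal sketch of how the 4.1 existence check and 4.4 categorization could combine, assuming the spec's referenced paths have already been extracted. Distinguishing **Mismatched** from **Partial** requires judgment about the approach, so this sketch conservatively folds everything short of full verification into **Partial**:

```python
from pathlib import Path

def missing_files(repo_root: Path, referenced: list[str]) -> list[str]:
    """§4.1: which spec-referenced paths do not exist in the repo?
    Extracting `referenced` from the task's Approach/Acceptance Criteria
    is the judgment-heavy part and is not shown here."""
    return [rel for rel in referenced if not (repo_root / rel).is_file()]

def categorize(criteria_met: int, criteria_total: int, files_missing: bool) -> str:
    """Collapse mechanical check results into §4.4 categories. 'Mismatched'
    needs an approach-vs-code comparison, so it is not emitted here."""
    if files_missing:
        return "Missing"
    if criteria_met == criteria_total:
        return "Verified"
    return "Partial"
```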
+ ## 5. Report Findings
+
+ Present findings organized by task:
+
+ > **Implementation Validation Report**
+ >
+ > **Scope:** [Area/Task Group/Task validated]
+ > **Date:** [Current date]
+ >
+ > ---
+ >
+ > ## Summary
+ >
+ > | Status | Count |
+ > |--------|-------|
+ > | Verified | [N] |
+ > | Partial | [N] |
+ > | Mismatched | [N] |
+ > | Missing | [N] |
+ >
+ > ---
+ >
+ > ## Verified Tasks
+ >
+ > - [N].[M].[X] [Task Name] — All criteria met
+ >
+ > ---
+ >
+ > ## Tasks with Discrepancies
+ >
+ > ### [N].[M].[Y] [Task Name]
+ >
+ > **Status:** [Partial/Mismatched/Missing]
+ >
+ > **Claimed:** [What the spec says was implemented]
+ >
+ > **Actual:** [What you found in the code]
+ >
+ > **Discrepancy:** [Specific gap or conflict]
+ >
+ > **Files examined:**
+ > - [List of files checked]
+ >
+ > **Acceptance Criteria:**
+ > - [Criterion 1]: ✅ Met / ❌ Not met — [Evidence]
+ > - [Criterion 2]: ✅ Met / ❌ Not met — [Evidence]
+ >
+ > ---
+ >
+ > [Repeat for each task with issues]
+
+ If all tasks are verified:
+
+ > **All [N] completed tasks verified successfully.**
+ >
+ > No discrepancies found. The implementation matches the specs.
+
+ ## 6. Handle Discrepancies
+
+ If discrepancies were found, ask how to proceed:
+
+ > **[N] tasks have discrepancies.**
+ >
+ > Would you like me to:
+ > 1. **Create discrepancy file** — Document issues in `docs/implementation/{area}/{N}.{M}-discrepancies.md`
+ > 2. **Update task statuses** — Mark affected tasks as 🟡 In Progress for re-implementation
+ > 3. **Both** — Create file and update statuses
+ > 4. **Neither** — Just report, don't change anything
+
+ Wait for the user's response.
+
+ ### 6.1 Create Discrepancy File
+
+ If requested, create `docs/implementation/{N}-{area}/{N}.{M}-discrepancies.md`:
+
+ ```markdown
+ # [Task Group Name] - Validation Discrepancies
+
+ Validation date: [date]
+
+ ## [Task ID]: [Task Name]
+
+ **Status:** [Partial/Mismatched/Missing]
+
+ **Claimed:** [What the spec says was implemented]
+
+ **Actual:** [What you found in the code]
+
+ **Discrepancy:** [Specific gap or conflict]
+
+ **Files examined:**
+ - [List of files checked]
+
+ **Acceptance Criteria:**
+ - [Criterion]: [Met/Not met] — [Evidence]
+
+ ---
+
+ [Repeat for each task with issues]
+ ```
+
+ ### 6.2 Update Task Statuses
+
+ If requested, for each task with discrepancies:
+
+ 1. Update the task status in the task group file:
+ - Change `**Status**: ✅ Complete` to `**Status**: 🟡 In Progress`
+ - Add a note: `**Note**: Validation discrepancy found — see [{N}.{M}-discrepancies.md](./{N}.{M}-discrepancies.md)`
+
+ 2. Update `docs/progress.md` with an entry:
+ ```markdown
+ ### [Date] - Implementation Validation
+
+ - Validated [scope]
+ - Found [N] discrepancies
+ - Tasks marked for re-implementation: [list task IDs]
+ - See: `docs/implementation/{area}/{N}.{M}-discrepancies.md`
+ ```
+
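The status flip in step 1 can be sketched as a targeted regex rewrite. This assumes the `**Status**: ✅ Complete` line sits under a `### {task id}` heading (an assumption about task group file layout, not something the spec guarantees):

```python
import re
from pathlib import Path

def mark_in_progress(path: Path, task_id: str, group: str) -> None:
    """Flip one task's status to 🟡 In Progress and append the §6.2 note."""
    text = path.read_text(encoding="utf-8")
    # Match only within the section for task_id: from its heading up to the
    # first status line after it (lazy match, DOTALL spans newlines).
    pattern = re.compile(
        rf"(###\s+{re.escape(task_id)}\b.*?)\*\*Status\*\*: ✅ Complete",
        re.DOTALL,
    )
    note = (f"**Status**: 🟡 In Progress\n"
            f"**Note**: Validation discrepancy found — see "
            f"[{group}-discrepancies.md](./{group}-discrepancies.md)")
    path.write_text(pattern.sub(rf"\g<1>{note}", text, count=1), encoding="utf-8")
```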
+ ## 7. Context Management for Large Scopes
+
+ When validating an entire area with multiple task groups, output a checkpoint after completing each task group:
+
+ > **Task Group [N].[M] Validation Complete**
+ > - Tasks verified: [count]
+ > - Discrepancies found: [count]
+ > - Tasks needing re-implementation: [list or "none"]
+ >
+ > Ready to proceed to the next task group. Continue?
+
+ This allows context to be cleared between task groups if needed.
+
+ ## 8. Final Summary
+
+ After all validation is complete:
+
+ > **Validation Complete**
+ >
+ > **Scope validated:** [Area/Task Group/Task]
+ >
+ > **Results:**
+ > - [N] tasks verified successfully
+ > - [N] tasks with discrepancies
+ >
+ > **Actions taken:**
+ > - [Created discrepancy file at X / Updated N task statuses / None]
+ >
+ > **Next steps:**
+ > - Fix discrepancies and re-run `/impl.validate` to verify
+ > - Run `/dev-session.start` to work on tasks marked for re-implementation
+
+ ## Key Behaviors
+
+ Throughout this command, maintain these behaviors:
+
+ - **Specs are source of truth** — You're checking if code matches specs, not the other way around
+ - **Be thorough** — Check every acceptance criterion, not just the obvious ones
+ - **Show evidence** — Every verification or failure should cite specific code or files
+ - **Don't assume** — If you can't verify something, say so rather than guessing
+ - **Preserve audit trail** — Discrepancy files create accountability
+ - **One scope at a time** — For large validations, pause between task groups
+ - **Flag uncertainty** — If a criterion is ambiguous and hard to verify, note that
+ - **Respect existing discrepancy files** — Append to existing files rather than overwriting