specpulse 1.3.4__py3-none-any.whl → 1.4.0__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -81,7 +81,7 @@ When called with `/sp-task $ARGUMENTS`, I will:
  * Layer-based tasks (data, business, API)
  * Module-specific tasks
  - **Common categories**:
- * Constitutional Gates Compliance
+ * SDD Gates Compliance
  * Critical Path identification
  * Parallel vs Sequential grouping
  * Progress Tracking configuration
@@ -120,7 +120,7 @@ When called with `/sp-task $ARGUMENTS`, I will:
  d. **Parse current tasks** from selected file with comprehensive status:
  - Total tasks, completed, pending, blocked
  - Parallel tasks identification
- - Constitutional gates status
+ - SDD gates status
  - Completion percentage calculation
  e. **Interactive task updates**:
  - Mark tasks as completed/in-progress/blocked
@@ -137,20 +137,20 @@ When called with `/sp-task $ARGUMENTS`, I will:
  TOTAL_TASKS=25
  COMPLETED_TASKS=10
  COMPLETION_PERCENTAGE=40%
- CONSTITUTIONAL_GATES_PENDING=2
+ SDD_GATES_PENDING=2
  ```
  d. **Display comprehensive progress**:
  - Overall completion percentage
  - Phase-by-phase progress
  - Blocker identification and resolution
  - Velocity metrics and estimates
- - Constitutional gates compliance status
+ - SDD gates compliance status

  6. **For `/sp-task execute`:**
  a. **Show existing task files**: List all task-XXX.md files in current feature directory
  b. **Ask user to select**: Which task file to execute from
  c. **Ask user to specify**: Which specific task ID to execute
- d. **Validate task readiness** using constitutional gates
+ d. **Validate task readiness** using SDD gates
  e. **Execute task** using AI assistant capabilities:
  ```markdown
  ## Task Execution: {{ TASK_ID }}
@@ -215,13 +215,13 @@ metrics:
  completion_percentage: 40%
  ```

- ## Constitutional Gates Integration
+ ## SDD Gates Integration

- Each task breakdown includes constitutional compliance validation:
- - **Simplicity Gate**: Tasks avoid unnecessary complexity
- - **Test-First Gate**: Test tasks before implementation tasks
- - **Integration-First Gate**: Real service integration preferred
- - **Research Gate**: Technology research tasks included
+ Each task breakdown includes SDD compliance validation:
+ - **Specification First**: Tasks trace to specifications
+ - **Task Decomposition**: Concrete, actionable tasks
+ - **Quality Assurance**: Appropriate testing tasks included
+ - **Traceable Implementation**: Clear linkage to requirements

  ## Examples

@@ -264,14 +264,14 @@ I will update task status and recalculate progress metrics.
  ```
  User: /sp-task status
  ```
- I will display detailed progress with constitutional gates compliance.
+ I will display detailed progress with SDD gates compliance.

  ### Execute specific task
  ```
  User: /sp-task execute T001
  ```
  I will:
- - Validate: Constitutional gates compliance and task readiness
+ - Validate: SDD gates compliance and task readiness
  - Execute: Cross-platform task execution
  ```bash
  bash scripts/sp-pulse-task.sh "$FEATURE_DIR" "execute:$TASK_ID"
@@ -283,7 +283,7 @@ I will:
  - **Script execution** with Bash
  - **AI-optimized templates** with Jinja2-style variables
  - **Script integration** for validation and execution
- - **Constitutional gates compliance** tracking
+ - **SDD gates compliance** tracking
  - **Parallel task identification** and execution
  - **Comprehensive progress tracking** with YAML configuration
  - **Automatic percentage calculation** and velocity metrics
@@ -294,7 +294,7 @@ I will:
  ## Error Handling

  - Plan existence validation before task generation
- - Constitutional gates compliance checking
+ - SDD gates compliance checking
  - Template structure validation
  - Dependency conflict detection
  - Task execution error handling with rollback
@@ -9,7 +9,7 @@ First, detect current feature context:
  - If no context found, ask user to specify feature or run /sp-pulse first

  Parse arguments to determine action:
- - If "validate": Check plan against constitution
+ - If "validate": Check plan against SDD principles
  - Otherwise: Generate new plan

  ## For /sp-plan generate or /sp-plan:
@@ -18,11 +18,11 @@ Parse arguments to determine action:
  3. Read selected specification from @{specs/*/spec-XXX.md}

  4. Run Phase Gates checks:
- - Constitutional compliance
- - Simplicity check (≤3 modules)
- - Test-first strategy defined
- - Framework selection complete
- - Research completed
+ - SDD compliance verification
+ - Architecture documentation check
+ - Quality assurance strategy defined
+ - Stakeholder alignment confirmed
+ - Incremental planning validated

  5. Generate plan sections:
  - Technology stack
@@ -32,9 +32,9 @@ Parse arguments to determine action:
  - Data models
  - Testing strategy

- 6. Track complexity:
- - If >3 modules, document justification
- - Create simplification roadmap
+ 6. Track architectural decisions:
+ - Document all technology choices
+ - Record trade-offs and rationale

  7. Check existing plan files and create next version (plan-001.md, plan-002.md, etc.)
  8. Write plan to plans/XXX-feature/plan-XXX.md
@@ -48,17 +48,17 @@ Parse arguments to determine action:
  4. Run validation:
  - !{bash scripts/sp-pulse-plan.sh "XXX-feature"}
  5. Verify Phase Gates compliance
- 6. Check complexity tracking
- 7. Ensure test-first approach
+ 6. Check architecture documentation
+ 7. Ensure quality assurance strategy
  8. Validate framework choices
  9. Report validation results

- Phase Gates (Phase -1) must pass before implementation:
- - ✅ Using ≤3 projects/modules
- - ✅ Tests defined before code
- - ✅ Using framework features directly
- - ✅ No premature abstractions
- - ✅ Research completed
+ SDD Compliance Gates must pass before implementation:
+ - ✅ Specification clearly defined
+ - ✅ Incremental phases planned
+ - ✅ Tasks properly decomposed
+ - ✅ Quality strategy appropriate
+ - ✅ Architecture documented

  Examples:
  - /sp-plan generate
@@ -1,134 +1,242 @@
  # Project Constitution
- *Immutable principles that govern all development through Specification-Driven Development (SDD)*
+ *Universal principles that enable Specification-Driven Development (SDD) for any software project*

- ## The Nine Articles of Specification-Driven Development
+ ## The Nine Universal SDD Principles

- ### Article I: Library-First Principle
- Every feature in this project MUST begin its existence as a standalone library. No feature shall be implemented directly within application code without first being abstracted into a reusable library component.
+ ### 1. SPECIFICATION FIRST
+ **Rule:** Every feature starts with a clear specification.

- **Rationale**: This ensures modularity, reusability, and clear boundaries between features.
+ **Requirements:**
+ - Define what you're building before how
+ - Include user stories and acceptance criteria
+ - Use [NEEDS CLARIFICATION] markers for unknowns
+ - Document functional and non-functional requirements

- ### Article II: CLI Interface Mandate
- Every library MUST expose its core functionality through a command-line interface.
+ **Validation:** Can someone else understand what to build from this spec?

- All CLI interfaces MUST:
- - Accept text as input (via stdin, arguments, or files)
- - Produce text as output (via stdout)
- - Support JSON format for structured data exchange
- - Provide --help documentation
- - Return appropriate exit codes (0 for success, non-zero for failure)
+ **Example:**
+ ```markdown
+ # GOOD: Clear specification
+ ## User Authentication
+ - Users can register with email and password
+ - [NEEDS CLARIFICATION]: OAuth2 provider list
+ - Password reset via email required

- **Rationale**: This ensures observability, testability, and composability of all components.
+ # BAD: Vague specification
+ "Make a login system that works"
+ ```
+
+ ### 2. INCREMENTAL PLANNING
+ **Rule:** Break specifications into manageable, phased plans.
+
+ **Requirements:**
+ - Create phase-based implementation plans
+ - Define clear milestones and checkpoints
+ - Prioritize features by business value
+ - Each phase should deliver working software
+
+ **Validation:** Is each phase independently valuable and deployable?
+
+ **Example:**
+ ```markdown
+ # ✅ GOOD: Phased delivery
+ Phase 1: Core authentication (Week 1)
+ Phase 2: User profiles (Week 2)
+ Phase 3: Role management (Week 3)
+
+ # ❌ BAD: Everything at once
+ "Complete user management system in one sprint"
+ ```
 
- ### Article III: Test-First Imperative
- This is NON-NEGOTIABLE: All implementation MUST follow strict Test-Driven Development.
+ ### 3. TASK DECOMPOSITION
+ **Rule:** Break plans into concrete, executable tasks.

- No implementation code shall be written before:
- 1. Unit tests are written and documented
- 2. Tests are validated and approved by the user
- 3. Tests are confirmed to FAIL (Red phase)
- 4. Implementation makes tests PASS (Green phase)
- 5. Code is refactored while maintaining passing tests (Refactor phase)
+ **Requirements:**
+ - Create specific, actionable tasks with clear outcomes
+ - Estimate effort in hours or days
+ - Define "Definition of Done" for each task
+ - Include acceptance criteria

- **Rationale**: This ensures correctness, prevents regression, and documents expected behavior.
+ **Validation:** Could a developer pick this up and start immediately?

- ### Article IV: Specification as Source of Truth
- Specifications don't serve code—code serves specifications. The specification is the primary artifact from which all implementation flows.
+ **Example:**
+ ```markdown
+ # ✅ GOOD: Actionable task
+ T001: Implement user registration endpoint
+ - Effort: 4 hours
+ - Done: POST /api/users accepts and validates data
+ - Test: Registration creates user in database

- Every code change MUST:
- - Trace back to a specification requirement
- - Update specifications if behavior changes
+ # BAD: Vague task
+ "Do user stuff"
+ ```
+
+ ### 4. TRACEABLE IMPLEMENTATION
+ **Rule:** Every piece of code should trace back to a specification.
+
+ **Requirements:**
+ - Reference spec requirements in code comments
+ - Link commits to tasks and specs
+ - Update specs when requirements change
  - Maintain bidirectional traceability

- **Rationale**: This eliminates the gap between intent and implementation.
+ **Validation:** Can you trace this code to a specific requirement?
+
+ **Example:**
+ ```markdown
+ # ✅ GOOD: Traceable implementation
+ - Commit message: "Implement user auth (SPEC-001, T003)"
+ - Code comment: "// Implements REQ-SEC-03: Password validation"
+ - PR description: Links to spec-001.md#security
+ - Task reference: T003 from task-001.md
+
+ # ❌ BAD: No traceability
+ - Commit: "Fixed stuff"
+ - No spec references in code
+ - No task linkage
+ ```
+
+ ### 5. CONTINUOUS VALIDATION
+ **Rule:** Validate implementation against specifications continuously.
 
- ### Article V: Continuous Refinement
- Consistency validation happens continuously, not as a one-time gate.
+ **Requirements:**
+ - Check implementation matches spec after each task
+ - Run acceptance tests regularly
+ - Update specs if reality differs from plan
+ - Maintain spec-code synchronization

- All specifications MUST be:
- - Analyzed for ambiguity and contradictions
- - Marked with [NEEDS CLARIFICATION] for uncertainties
- - Validated against the constitution
- - Refined based on implementation feedback
+ **Validation:** Does the current implementation match the specification?

- **Rationale**: This ensures specifications remain precise, complete, and implementable.
+ **Example:**
+ ```markdown
+ # ✅ GOOD: Regular validation
+ - After each task: Check against spec
+ - Daily: Run acceptance tests
+ - Weekly: Full spec review

- ### Article VI: Research-Driven Context
- Every technical decision MUST be informed by research.
+ # BAD: No validation
+ "We'll check at the end of the project"
+ ```
+
+ ### 6. QUALITY ASSURANCE
+ **Rule:** Ensure quality through appropriate testing and review.
+
+ **Requirements:**
+ - Test based on acceptance criteria
+ - Choose appropriate test types for your project
+ - Automate testing where valuable
+ - Conduct code reviews for critical features
+
+ **Validation:** Are all acceptance criteria verifiable and tested?
+
+ **Example:**
+ ```markdown
+ # ✅ GOOD: Appropriate testing
+ - Unit tests for business logic
+ - Integration tests for APIs
+ - E2E tests for critical user flows
+
+ # ❌ BAD: No testing strategy
+ "We'll test manually"
+ ```

- Before implementation:
- - Research agents investigate library options
- - Performance implications are documented
- - Security considerations are analyzed
- - Organizational constraints are identified
+ ### 7. ARCHITECTURE DOCUMENTATION
+ **Rule:** Document key architectural decisions and patterns.

- **Rationale**: This prevents uninformed decisions and technical debt.
+ **Requirements:**
+ - Record technology choices and rationale
+ - Document integration points and APIs
+ - Track technical debt and trade-offs
+ - Maintain architecture decision records (ADRs)

- ### Article VII: Simplicity and Anti-Abstraction
- Start simple, add complexity only when proven necessary.
+ **Validation:** Will someone understand these decisions in 6 months?

- Requirements:
- - Maximum 3 projects/modules for initial implementation
- - No future-proofing without documented need
- - Use framework features directly (no unnecessary wrappers)
- - Single model representation per concept
- - Additional complexity requires documented justification
+ **Example:**
+ ```markdown
+ # GOOD: Documented decisions
+ ADR-001: Choose PostgreSQL over MongoDB
+ - Date: 2025-01-15
+ - Rationale: Need ACID transactions
+ - Trade-offs: Less flexible schema

- **Rationale**: This prevents over-engineering and maintains maintainability.
+ # BAD: No documentation
+ "We just picked what we knew"
+ ```
+
+ ### 8. ITERATIVE REFINEMENT
+ **Rule:** Specifications and implementations evolve based on learnings.

- ### Article VIII: Integration-First Testing
- Tests MUST use realistic environments over mocks.
+ **Requirements:**
+ - Update specs based on user feedback
+ - Refine based on implementation discoveries
+ - Version specifications for traceability
+ - Document lessons learned

- Testing priority:
- 1. Contract tests (API boundaries)
- 2. Integration tests (component interaction)
- 3. End-to-end tests (user workflows)
- 4. Unit tests (isolated logic)
+ **Validation:** Do specs reflect current reality and learnings?

- Use:
- - Real databases over mocks
- - Actual service instances over stubs
- - Production-like data volumes
- - Realistic network conditions
+ **Example:**
+ ```markdown
+ # GOOD: Learning from implementation
+ Spec v1: "Users login with email"
+ Spec v2: "Added: Support for username login (user feedback)"
+ Spec v3: "Added: MFA option (security review)"

- **Rationale**: This ensures code works in practice, not just in theory.
+ # BAD: Never updating specs
+ "Original spec from 6 months ago"
+ ```

- ### Article IX: Executable Documentation
- All documentation MUST be executable or verifiable.
+ ### 9. STAKEHOLDER ALIGNMENT
+ **Rule:** Keep all stakeholders aligned through specifications.

- This includes:
- - Code examples that can be run
- - API contracts that can be tested
- - Quickstart guides that can be validated
- - Architecture decisions with measurable outcomes
+ **Requirements:**
+ - Share specs with team and clients
+ - Get approval before major phases
+ - Communicate changes clearly
+ - Maintain shared understanding

- **Rationale**: This prevents documentation drift and ensures accuracy.
+ **Validation:** Does everyone understand what's being built and why?

- ## Constitutional Enforcement
+ **Example:**
+ ```markdown
+ # ✅ GOOD: Clear communication
+ - Weekly spec reviews with team
+ - Client approval before each phase
+ - Documented change requests
+
+ # ❌ BAD: Working in isolation
+ "We'll show them when it's done"
+ ```
+
+ ## SDD Methodology Enforcement

  ### Phase Gates
- Every implementation plan MUST pass through constitutional gates:
+ Every implementation plan MUST pass through SDD compliance gates:

  #### Phase -1: Pre-Implementation Gates
- - [ ] Simplicity Gate (Article VII): Using <=3 projects? No future-proofing?
- - [ ] Anti-Abstraction Gate (Article VII): Using framework directly? Single model?
- - [ ] Test-First Gate (Article III): Tests written? Tests reviewed? Tests failing?
- - [ ] Integration-First Gate (Article VIII): Contracts defined? Contract tests written?
- - [ ] Research Gate (Article VI): Options researched? Trade-offs documented?
-
- ### Complexity Tracking
- Any violation of simplicity principles MUST be documented:
+ - [ ] Specification First: Requirements clear and documented?
+ - [ ] Incremental Planning: Work broken into valuable phases?
+ - [ ] Task Decomposition: Tasks concrete and actionable?
+ - [ ] Traceable Implementation: Code-to-spec mapping planned?
+ - [ ] Continuous Validation: Validation checkpoints defined?
+ - [ ] Quality Assurance: Test strategy appropriate?
+ - [ ] Architecture Documentation: Decision tracking planned?
+ - [ ] Iterative Refinement: Feedback loops established?
+ - [ ] Stakeholder Alignment: Communication plan in place?
+
+ ### Decision Tracking
+ All significant architectural decisions and trade-offs MUST be documented:
  ```yaml
- complexity_exceptions:
- - article: VII
- violation: "Using 4 projects instead of 3"
- justification: "Authentication requires separate service for security isolation"
+ architectural_decisions:
+ - decision: "Microservices for payment processing"
+ rationale: "PCI compliance requires isolation"
+ trade_offs: "Increased operational complexity"
  approved_by: "Team Lead"
  date: "2025-09-11"
+ review_date: "2025-Q2"
  ```

  ### Amendment Process
- While principles are immutable, their application can evolve:
+ While principles guide development, their application can evolve:

  1. Proposed amendments require:
  - Explicit documentation of rationale
@@ -143,46 +251,47 @@ While principles are immutable, their application can evolve:
  ## Principles in Practice

  ### When Starting a New Feature
- 1. Write specification first (Article IV)
- 2. Research technical options (Article VI)
- 3. Design as a library (Article I)
- 4. Expose via CLI (Article II)
- 5. Write tests first (Article III)
- 6. Keep it simple (Article VII)
- 7. Use real environments (Article VIII)
+ 1. Write specification first (Principle 1)
+ 2. Create phased plan (Principle 2)
+ 3. Break into tasks (Principle 3)
+ 4. Ensure traceability (Principle 4)
+ 5. Set up validation (Principle 5)
+ 6. Define quality strategy (Principle 6)
+ 7. Document architecture (Principle 7)

  ### When Reviewing Code
- - Does it trace to a specification? (Article IV)
- - Are tests written first? (Article III)
- - Is it the simplest solution? (Article VII)
- - Does it work with real services? (Article VIII)
- - Is documentation executable? (Article IX)
-
- ### When Facing Complexity
- 1. Document why simplicity isn't sufficient
- 2. Get explicit approval for complexity
- 3. Track in complexity_exceptions
- 4. Plan for future simplification
+ - Does it trace to a specification? (Principle 4)
+ - Is testing appropriate? (Principle 6)
+ - Are decisions documented? (Principle 7)
+ - Has it been validated? (Principle 5)
+ - Are stakeholders aligned? (Principle 9)
+
+ ### When Making Architectural Decisions
+ 1. Document the decision and rationale
+ 2. Get stakeholder approval for significant changes
+ 3. Track in architecture decision records
+ 4. Plan for future improvements

  ## Technical Standards

- ### Code Style
- - Python: PEP 8 with type hints
- - JavaScript/TypeScript: ESLint + Prettier
- - Tests: Descriptive names, AAA pattern
- - Documentation: Clear, concise, with examples
+ ### Code Style Guidelines
+ - Follow language-specific best practices
+ - Use consistent formatting (Prettier, Black, etc.)
+ - Write self-documenting code
+ - Add meaningful comments where needed

  ### Testing Requirements
- - Minimum 80% code coverage
- - All APIs must have contract tests
- - Critical paths require E2E tests
- - TDD red-green-refactor cycle mandatory
-
- ### Performance Targets
- - API response time: < 200ms (p95)
- - Test execution: < 5 minutes for full suite
- - Build time: < 2 minutes
- - Specification generation: < 30 seconds
+ - Choose coverage appropriate for project risk
+ - Test based on acceptance criteria
+ - Critical features need comprehensive testing
+ - Use testing approach suitable for project type
+
+ ### Project-Specific Standards
+ - Define standards appropriate for your project type
+ - Web apps: Response times, Core Web Vitals
+ - Mobile apps: Battery usage, offline capability
+ - Games: FPS targets, load times
+ - APIs: Throughput, latency percentiles

  ## Development Workflow

@@ -191,8 +300,8 @@ While principles are immutable, their application can evolve:
  2. `/sp-spec` - Create specification following template guidelines
  3. `/sp-plan` - Generate implementation plan with Phase Gates
  4. `/sp-task` - Break down into executable tasks
- 5. Execute with Test-First Development
- 6. `/sp-validate` - Validate against constitution and specification
+ 5. Execute with Quality Assurance
+ 6. `/sp-validate` - Validate against SDD principles and specification
  7. Update specifications based on learnings

  ### Version Control
@@ -203,9 +312,9 @@ While principles are immutable, their application can evolve:

  ### Continuous Integration
  - Specification validation on every commit
- - Constitutional gate checks in CI
- - Test coverage enforcement
- - Complexity tracking reports
+ - SDD compliance checks in CI
+ - Appropriate test coverage for project type
+ - Architecture decision tracking

  ## Specification Quality Standards