@codemcp/workflows-core 5.2.2 → 5.2.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -27,14 +27,20 @@ states:
  architecture_analysis:
  description: 'Analyze existing system architecture and identify system boundaries'
  default_instructions: >
- You are in the architecture analysis phase for boundary testing workflow.
+ **STEP 1: Reference Architecture Documentation**
+ Check if `$ARCHITECTURE_DOC` exists and reference it for system understanding.

- **Your Tasks:**
- 1. **Reference Architecture Documentation**: Check if $ARCHITECTURE_DOC exists and reference it for system understanding
- 2. **Search for Architecture**: If $ARCHITECTURE_DOC is not available, search the project for architecture documentation (README.md, docs/, ARCHITECTURE.md, etc.). Enhance the existing $ARCHITECTURE_DOC with your findings
- 3. **Analyze Codebase**: Examine the codebase to understand system structure, technology stack, and architectural patterns
- 4. **Identify System Boundaries**: Map external interfaces, APIs, and integration points
- 5. **Document Findings**: Update the plan file with architecture analysis findings
+ **STEP 2: Search for Architecture**
+ If `$ARCHITECTURE_DOC` is not available, search the project for architecture documentation (README.md, docs/, ARCHITECTURE.md, etc.). Enhance the existing `$ARCHITECTURE_DOC` with your findings.
+
+ **STEP 3: Analyze Codebase**
+ Examine the codebase to understand system structure, technology stack, and architectural patterns.
+
+ **STEP 4: Identify System Boundaries**
+ Map external interfaces, APIs, and integration points.
+
+ **STEP 5: Document Findings**
+ Update the plan file with architecture analysis findings.

  **Interview the User (External Factors Only):**
  - "Who are the main consumers or users of this system?"
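The `default_instructions` values edited in this hunk are YAML block scalars. A minimal sketch of the two scalar styles used across this diff (`>` in the boundary-testing file, `|` in the bug-workflow file later on), assuming only standard YAML semantics — the keys below are illustrative, not from the package:

```yaml
# '>' (folded) joins single newlines into spaces — so a STEP heading and the
# sentence after it become one line in the resulting string — but keeps blank
# lines as paragraph breaks, which is why the STEP blocks above are separated
# by blank lines.
folded_example: >
  **STEP 1: Reference Architecture Documentation**
  Check if `$ARCHITECTURE_DOC` exists.

  **STEP 2: Search for Architecture**

# '|' (literal) preserves every newline as written, which suits the
# bullet-list instructions in the second file of this diff.
literal_example: |
  - What are the exact OS, browser/runtime versions, and hardware specs?
  - What is the precise sequence of actions that trigger the bug?
```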
@@ -58,16 +64,26 @@ states:
  interface_discovery:
  description: 'Discover and catalog all system interfaces, APIs, and external boundaries'
  default_instructions: >
- You are in the interface discovery phase. Use your architecture analysis to systematically identify all system interfaces.
+ **STEP 1: Catalog APIs**
+ Identify all REST APIs, SOAP services, GraphQL endpoints, or other web services.
+
+ **STEP 2: Document Database Interfaces**
+ Document database connections, schemas, and data access patterns.
+
+ **STEP 3: Identify File System Interfaces**
+ Identify file I/O operations, configuration files, and data files.
+
+ **STEP 4: Find Message Interfaces**
+ Find message queues, event streams, or pub/sub mechanisms.
+
+ **STEP 5: Document External Service Calls**
+ Document calls to external systems, third-party APIs, or services.
+
+ **STEP 6: Identify User Interfaces**
+ Identify web UIs, desktop applications, or command-line interfaces.

- **Your Tasks:**
- 1. **Catalog APIs**: Identify all REST APIs, SOAP services, GraphQL endpoints, or other web services
- 2. **Database Interfaces**: Document database connections, schemas, and data access patterns
- 3. **File System Interfaces**: Identify file I/O operations, configuration files, and data files
- 4. **Message Interfaces**: Find message queues, event streams, or pub/sub mechanisms
- 5. **External Service Calls**: Document calls to external systems, third-party APIs, or services
- 6. **User Interfaces**: Identify web UIs, desktop applications, or command-line interfaces
- 7. **Document Interface Contracts**: For each interface, document expected inputs, outputs, and protocols
+ **STEP 7: Document Interface Contracts**
+ For each interface, document expected inputs, outputs, and protocols.

  **Interview the User (External Context Only):**
  - "What are the expected response times for different types of requests?"
@@ -96,14 +112,20 @@ states:
  business_domain_analysis:
  description: 'Organize system interfaces by business domains and logical boundaries using DDD principles'
  default_instructions: >
- You are in the business domain analysis phase. Apply Domain-Driven Design principles to organize interfaces by business boundaries.
+ **STEP 1: Apply DDD Principles**
+ Use Domain-Driven Design concepts to identify bounded contexts and business domains.

- **Your Tasks:**
- 1. **Apply DDD Principles**: Use Domain-Driven Design concepts to identify bounded contexts and business domains
- 2. **Group Interfaces**: Organize discovered interfaces by business functionality and logical boundaries
- 3. **Identify Business Capabilities**: Map interfaces to business capabilities and use cases
- 4. **Define Domain Boundaries**: Establish clear boundaries between different business domains
- 5. **Document Domain Model**: Create a clear mapping of business domains to system interfaces
+ **STEP 2: Group Interfaces**
+ Organize discovered interfaces by business functionality and logical boundaries.
+
+ **STEP 3: Identify Business Capabilities**
+ Map interfaces to business capabilities and use cases.
+
+ **STEP 4: Define Domain Boundaries**
+ Establish clear boundaries between different business domains.
+
+ **STEP 5: Document Domain Model**
+ Create a clear mapping of business domains to system interfaces.

  **Interview the User (Business Context Only):**
  - "What are the main business capabilities or functions this system provides?"
@@ -118,10 +140,10 @@ states:
  - Think about how business users would naturally group functionality

  **Domain Organization Methodologies:**
- - **Event Storming**: Consider business events and workflows
- - **Capability Mapping**: Group by business capabilities
- - **Data Flow Analysis**: Follow data ownership and lifecycle
- - **User Journey Mapping**: Group by user workflows and scenarios
+ - Event Storming: Consider business events and workflows
+ - Capability Mapping: Group by business capabilities
+ - Data Flow Analysis: Follow data ownership and lifecycle
+ - User Journey Mapping: Group by user workflows and scenarios

  Update the plan file with domain organization and validate with the user.
  transitions:
@@ -142,15 +164,23 @@ states:
  test_strategy_design:
  description: 'Design comprehensive testing strategy organized by business boundaries'
  default_instructions: >
- You are in the test strategy design phase. Create a comprehensive testing approach based on business domain organization.
+ **STEP 1: Design Domain-Based Testing**
+ Create test strategies for each business domain identified.
+
+ **STEP 2: Define Test Objectives**
+ Establish clear objectives for what the tests should validate.
+
+ **STEP 3: Plan Test Types**
+ Design mix of unit, integration, API, and end-to-end tests.
+
+ **STEP 4: Plan Test Data Strategy**
+ Plan test data management and test environment setup.
+
+ **STEP 5: Ensure Coverage**
+ Ensure comprehensive coverage of all business boundaries.

- **Your Tasks:**
- 1. **Design Domain-Based Testing**: Create test strategies for each business domain identified
- 2. **Define Test Objectives**: Establish clear objectives for what the tests should validate
- 3. **Plan Test Types**: Design mix of unit, integration, API, and end-to-end tests
- 4. **Test Data Strategy**: Plan test data management and test environment setup
- 5. **Coverage Strategy**: Ensure comprehensive coverage of all business boundaries
- 6. **Validation Approach**: Define how to validate that tests meet their objectives
+ **STEP 6: Define Validation Approach**
+ Define how to validate that tests meet their objectives.

  **Interview the User (Requirements & Constraints):**
  - "Are there specific compliance or regulatory testing requirements?"
@@ -158,19 +188,19 @@ states:
  - "Are there any budget or timeline constraints for test development?"

  **Best Practices to Apply:**
- - **Test Pyramid**: Balance unit tests, integration tests, and end-to-end tests
- - **Contract Testing**: Test API contracts and interface agreements
- - **Property-Based Testing**: Test business rules and invariants
- - **Boundary Testing**: Focus on edge cases and boundary conditions
- - **Error Path Testing**: Ensure error handling is properly tested
- - **Data-Driven Testing**: Use realistic data scenarios
+ - Test Pyramid: Balance unit tests, integration tests, and end-to-end tests
+ - Contract Testing: Test API contracts and interface agreements
+ - Property-Based Testing: Test business rules and invariants
+ - Boundary Testing: Focus on edge cases and boundary conditions
+ - Error Path Testing: Ensure error handling is properly tested
+ - Data-Driven Testing: Use realistic data scenarios

  **Testing Strategy Framework:**
- - **Functional Validation**: Ensure system functions meet business requirements
- - **Business Rule Validation**: Core business logic behaves correctly
- - **Integration Testing**: External system interactions work correctly
- - **Error Handling**: Failures and exceptions are handled appropriately
- - **Performance Baseline**: Establish performance expectations
+ - Functional Validation: Ensure system functions meet business requirements
+ - Business Rule Validation: Core business logic behaves correctly
+ - Integration Testing: External system interactions work correctly
+ - Error Handling: Failures and exceptions are handled appropriately
+ - Performance Baseline: Establish performance expectations

  Update the plan file with detailed test strategy and validate with the user.
  transitions:
@@ -191,28 +221,38 @@ states:
  test_suite_implementation:
  description: 'Implement comprehensive test suite organized by business domains'
  default_instructions: >
- You are in the test suite implementation phase. Build the comprehensive test suite based on your strategy.
+ **STEP 1: Implement Domain Tests**
+ Create test suites for each business domain.

- **Your Tasks:**
- 1. **Implement Domain Tests**: Create test suites for each business domain
- 2. **API Testing**: Implement comprehensive API tests for all discovered interfaces
- 3. **Integration Testing**: Build tests for external system interactions
- 4. **End-to-End Testing**: Create business workflow tests that span multiple domains
- 5. **Test Data Management**: Implement test data setup and teardown
- 6. **Test Organization**: Structure tests clearly by business domain and functionality
- 7. **Documentation**: Document test purposes, expected outcomes, and maintenance procedures
+ **STEP 2: Implement API Testing**
+ Implement comprehensive API tests for all discovered interfaces.
+
+ **STEP 3: Build Integration Tests**
+ Build tests for external system interactions.
+
+ **STEP 4: Create End-to-End Tests**
+ Create business workflow tests that span multiple domains.
+
+ **STEP 5: Implement Test Data Management**
+ Implement test data setup and teardown.
+
+ **STEP 6: Structure Tests**
+ Structure tests clearly by business domain and functionality.
+
+ **STEP 7: Document Test Suite**
+ Document test purposes, expected outcomes, and maintenance procedures.

  **Interview the User (Priorities & Preferences):**
  - "Which business domain should we prioritize for test implementation?"
  - "Are there any specific test frameworks or tools you prefer or are required to use?"

  **Best Practices to Apply:**
- - **Clear Test Organization**: Group tests by business domain and functionality
- - **Descriptive Test Names**: Make test purposes clear from names
- - **Independent Tests**: Ensure tests can run independently and in any order
- - **Reliable Test Data**: Use consistent, predictable test data
- - **Fast Feedback**: Prioritize fast-running tests for quick validation
- - **Maintainable Tests**: Write tests that are easy to understand and modify
+ - Clear Test Organization: Group tests by business domain and functionality
+ - Descriptive Test Names: Make test purposes clear from names
+ - Independent Tests: Ensure tests can run independently and in any order
+ - Reliable Test Data: Use consistent, predictable test data
+ - Fast Feedback: Prioritize fast-running tests for quick validation
+ - Maintainable Tests: Write tests that are easy to understand and modify

  **Implementation Guidelines:**
  - Start with the most critical business domains
@@ -237,15 +277,23 @@ states:
  validation:
  description: 'Validate test coverage and establish comprehensive system testing baseline'
  default_instructions: >
- You are in the validation phase. Validate the comprehensive test suite and establish system testing baseline.
+ **STEP 1: Test Coverage Analysis**
+ Validate that all business domains and interfaces are covered.
+
+ **STEP 2: Establish Baseline**
+ Run all tests against the system to establish baseline behavior.
+
+ **STEP 3: Review Test Quality**
+ Ensure tests are reliable, maintainable, and comprehensive.

- **Your Tasks:**
- 1. **Test Coverage Analysis**: Validate that all business domains and interfaces are covered
- 2. **Baseline Establishment**: Run all tests against the system to establish baseline behavior
- 3. **Test Quality Review**: Ensure tests are reliable, maintainable, and comprehensive
- 4. **Documentation Review**: Validate that test documentation is complete and clear
- 5. **Execution Validation**: Ensure all tests run successfully and provide clear results
- 6. **Handoff Preparation**: Prepare test suite for ongoing use and maintenance
+ **STEP 4: Review Documentation**
+ Validate that test documentation is complete and clear.
+
+ **STEP 5: Validate Execution**
+ Ensure all tests run successfully and provide clear results.
+
+ **STEP 6: Prepare for Handoff**
+ Prepare test suite for ongoing use and maintenance.

  **Review and Present to User:**
  - Review test coverage across all business domains and present findings
@@ -254,12 +302,12 @@ states:
  - Present test execution results and reliability assessment

  **Best Practices to Apply:**
- - **Coverage Validation**: Ensure all critical paths are tested
- - **Baseline Documentation**: Document expected results and behaviors
- - **Test Reliability**: Verify tests are stable and repeatable
- - **Clear Reporting**: Ensure test results are easy to understand
- - **Maintenance Planning**: Document how to maintain and update tests
- - **Knowledge Transfer**: Ensure team understands test suite organization
+ - Coverage Validation: Ensure all critical paths are tested
+ - Baseline Documentation: Document expected results and behaviors
+ - Test Reliability: Verify tests are stable and repeatable
+ - Clear Reporting: Ensure test results are easy to understand
+ - Maintenance Planning: Document how to maintain and update tests
+ - Knowledge Transfer: Ensure team understands test suite organization

  **Validation Checklist:**
  - All business domains have comprehensive test coverage
@@ -288,33 +336,27 @@ states:
  finalize:
  description: 'Code cleanup and documentation finalization'
  default_instructions: >
- You are in the finalize phase. This phase ensures code quality and documentation accuracy through systematic cleanup and review.
-
  **STEP 1: Code Cleanup**
- Systematically clean up development artifacts:
+ Systematically clean up development artifacts.

- 1. **Remove Debug Output**: Search for and remove all temporary debug output statements used during test development.
- Look for language-specific debug output methods (console logging, print statements, debug output functions).
- Remove any debugging statements that were added for development purposes.
+ - Remove Debug Output: Search for and remove all temporary debug output statements used during test development. Look for language-specific debug output methods (console logging, print statements, debug output functions). Remove any debugging statements that were added for development purposes.

- 2. **Review TODO/FIXME Comments**:
- - Address each TODO/FIXME comment by either implementing the solution or documenting why it's deferred
- - Remove completed TODOs
- - Convert remaining TODOs to proper issue tracking if needed
+ - Review TODO/FIXME Comments: Address each TODO/FIXME comment by either implementing the solution or documenting why it's deferred. Remove completed TODOs. Convert remaining TODOs to proper issue tracking if needed.

- 3. **Remove Debugging Code Blocks**:
- - Remove temporary debugging code, test code blocks, and commented-out code
- - Clean up any experimental code that's no longer needed
- - Ensure proper error handling replaces temporary debug logging
+ - Remove Debugging Code Blocks: Remove temporary debugging code, test code blocks, and commented-out code. Clean up any experimental code that's no longer needed. Ensure proper error handling replaces temporary debug logging.

  **STEP 2: Documentation Review**
- Review and update documentation to reflect final test implementation:
+ Review and update documentation to reflect final test implementation.
+
+ - Compare Against Implementation: Review documentation against actual implemented test suite.
+
+ - Update Changed Sections: Only modify documentation sections that have functional changes.
+
+ - Remove Development Progress: Remove references to development iterations, progress notes, and temporary decisions.
+
+ - Focus on Final State: Ensure documentation describes the final test suite state, not the development process.

- 1. **Compare Against Implementation**: Review documentation against actual implemented test suite
- 2. **Update Changed Sections**: Only modify documentation sections that have functional changes
- 3. **Remove Development Progress**: Remove references to development iterations, progress notes, and temporary decisions
- 4. **Focus on Final State**: Ensure documentation describes the final test suite state, not the development process
- 5. **Verify Test Documentation**: Ensure test documentation and maintenance procedures are complete and accurate
+ - Verify Test Documentation: Ensure test documentation and maintenance procedures are complete and accurate.

  **STEP 3: Final Validation**
  - Run existing tests to ensure cleanup didn't break functionality
@@ -27,13 +27,14 @@ states:
  reproduce:
  description: 'Reproduce and understand the bug'
  default_instructions: |
- You are in the bug reproduction phase. Work to reliably reproduce the reported bug by gathering specific information:
- What are the exact OS, browser/runtime versions, and hardware specs?
- What is the precise sequence of actions that trigger the bug?
- What error messages, logs, or stack traces are available?
- Does this happen every time or intermittently?
- How many users are affected and what is the business impact?
- Create test cases that demonstrate the problem. Document your findings and create tasks as needed.
+ Gather specific information to reliably reproduce the reported bug:
+ - What are the exact OS, browser/runtime versions, and hardware specs?
+ - What is the precise sequence of actions that trigger the bug?
+ - What error messages, logs, or stack traces are available?
+ - Does this happen every time or intermittently?
+ - How many users are affected and what is the business impact?
+
+ Create test cases that demonstrate the problem. Document your findings and create tasks as needed.
  transitions:
  - trigger: 'bug_reproduced'
  to: 'analyze'
@@ -54,7 +55,7 @@ states:

  analyze:
  description: 'Analyze the bug and identify root cause'
- default_instructions: 'You are in the bug analysis phase. Examine the code paths involved in the bug, identify the root cause, and understand why the issue occurs. Use debugging tools, add logging, and trace through the problematic code. Document your analysis and create tasks as needed.'
+ default_instructions: Examine the code paths involved in the bug, identify the root cause, and understand why the issue occurs. Use debugging tools, add logging, and trace through the problematic code. Document your analysis and create tasks as needed.
  transitions:
  - trigger: 'need_more_reproduction'
  to: 'reproduce'
@@ -79,9 +80,15 @@ states:
  fix:
  description: 'Implement the bug fix'
  default_instructions: |
- Implement the solution based on your analysis. If $DESIGN_DOC exists, follow the design from $DESIGN_DOC. Before implementing, assess the approach:
- • How critical is this system? What is the blast radius if the fix causes issues?
- Should this be a minimal fix or a more comprehensive solution?
+ Implement the solution based on your analysis:
+
+ - If `$DESIGN_DOC` exists: Follow the design from it
+ - Otherwise: Elaborate design options and present them to the user
+
+ Before implementing, assess the approach:
+ - How critical is this system? What is the blast radius if the fix causes issues?
+ - Should this be a minimal fix or a more comprehensive solution?
+
  Make targeted changes that address the root cause without introducing new issues. Be careful to maintain existing functionality while fixing the bug.
  transitions:
  - trigger: 'need_more_analysis'
@@ -106,7 +113,7 @@ states:

  verify:
  description: 'Verify the fix and ensure no regressions'
- default_instructions: 'You are in the bug verification phase. Test the fix thoroughly to ensure the original bug is resolved and no new issues were introduced. Run existing tests, create new ones if needed, and verify the solution is robust.'
+ default_instructions: Test the fix thoroughly to ensure the original bug is resolved and no new issues were introduced. Run existing tests, create new ones if needed, and verify the solution is robust.
  transitions:
  - trigger: 'fix_needs_adjustment'
  to: 'fix'
@@ -129,43 +136,31 @@ states:

  finalize:
  description: 'Code cleanup and documentation finalization'
- default_instructions: >
- You are in the finalize phase. This phase ensures code quality and documentation accuracy through systematic cleanup and review.
+ default_instructions: |
+ Ensure code quality and documentation accuracy through systematic cleanup and review.

  **STEP 1: Code Cleanup**
  Systematically clean up development artifacts:
-
- 1. **Remove Debug Output**: Search for and remove all temporary debug output statements used during bug investigation.
- Look for language-specific debug output methods (console logging, print statements, debug output functions).
- Remove any debugging statements that were added for investigation purposes.
-
- 2. **Review TODO/FIXME Comments**:
- - Address each TODO/FIXME comment by either implementing the solution or documenting why it's deferred
- - Remove completed TODOs
- - Convert remaining TODOs to proper issue tracking if needed
-
- 3. **Remove Debugging Code Blocks**:
- - Remove temporary debugging code, test code blocks, and commented-out code
- - Clean up any experimental code that's no longer needed
- - Ensure proper error handling replaces temporary debug logging
+ - Remove all temporary debug output statements used during bug investigation (console logging, print statements, debug output functions)
+ - Address each TODO/FIXME comment by either implementing the solution or documenting why it's deferred
+ - Remove completed TODOs and convert remaining ones to proper issue tracking if needed
+ - Remove temporary debugging code, test code blocks, and commented-out code
+ - Ensure proper error handling replaces temporary debug logging

  **STEP 2: Documentation Review**
  Review and update documentation to reflect the bug fix:
-
- 1. **Update Long-Term Memory Documents**: Based on what was actually implemented:
- If $DESIGN_DOC exists, update $DESIGN_DOC if design details were refined or changed during the fix
- 2. **Compare Against Implementation**: Review documentation against the actual bug fix implemented
- 3. **Update Changed Sections**: Only modify documentation sections that have functional changes
- 4. **Remove Development Progress**: Remove references to development iterations, progress notes, and temporary decisions
- 5. **Focus on Final State**: Ensure documentation describes the final fixed state, not the debugging process
- 6. **Ask User to Review Document Updates**: Ask the document changes
+ - If `$DESIGN_DOC` exists, update it if design details were refined or changed during the fix
+ - Compare documentation against the actual bug fix implementation
+ - Update only the documentation sections that have functional changes
+ - Remove references to investigation iterations, progress notes, and temporary decisions
+ - Ensure documentation describes the final fixed state, not the debugging process
+ - Ask the user to review document updates

  **STEP 3: Final Validation**
  - Run existing tests to ensure cleanup didn't break functionality
  - Verify documentation accuracy with a final review
  - Ensure bug fix is ready for production
-
- Update task progress and mark completed work as you finalize the bug fix.
+ - Update task progress and mark completed work as you finalize the bug fix
  transitions:
  - trigger: 'need_fix_changes'
  to: 'fix'
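Taken together, the hunks above edit a declarative state machine. A minimal sketch of the state shape, using only field names that appear in the diff fragments (`description`, `default_instructions`, `transitions`, `trigger`, `to`) — the real schema in @codemcp/workflows-core may include additional fields:

```yaml
# Illustrative sketch only, assembled from fragments visible in this diff.
states:
  reproduce:
    description: 'Reproduce and understand the bug'
    default_instructions: |
      Gather specific information to reliably reproduce the reported bug.
    transitions:
      - trigger: 'bug_reproduced'   # event that advances the workflow
        to: 'analyze'               # target state
```

Each release tweak in this diff changes only `default_instructions` bodies (restructuring prompt prose into numbered STEP blocks) while leaving the `transitions` graph intact.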