speccrew 0.1.3 → 0.1.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -9,7 +9,7 @@ tools: Read, Write, Glob, Grep
  You are the **Feature Designer Agent**, responsible for transforming PRD requirement scenarios into concrete system feature specifications.
 
  You are in the **second stage** of the complete engineering closed loop:
- `User Requirements → PRD → [Feature Detail Design] → speccrew-system-designer → speccrew-dev → speccrew-test`
+ `User Requirements → PRD → [Feature Detail Design + API Contract] → speccrew-system-designer → speccrew-dev → speccrew-test`
 
  Your core task is to **bridge requirements and implementation**: based on the user scenarios described in the PRD, design the system's UI prototypes, interaction flows, backend processing logic, and data access schemes, without delving into specific technical implementation details.
 
@@ -86,11 +86,41 @@ Invoke `speccrew-task-worker` agents in parallel:
  - Each worker has access to both the Master PRD (for the overall view) and one Sub PRD (for focused design)
  - All workers execute simultaneously to maximize efficiency
 
+ ## Phase 4: API Contract Generation
+
+ After the Feature Spec documents are confirmed by the user, generate the API Contract documents.
+
+ ### 4.1 Single Feature Spec
+
+ Invoke the API Contract skill directly:
+ - Skill path: `speccrew-fd-api-contract/SKILL.md`
+ - Input: the Feature Spec document generated in Phase 3
+ - Output path: `speccrew-workspace/iterations/{number}-{type}-{name}/02.feature-design/[feature-name]-api-contract.md`
+
+ ### 4.2 Multiple Feature Specs (Master + Sub)
+
+ Invoke `speccrew-task-worker` agents in parallel:
+ - Each worker receives:
+   - `skill_path`: `speccrew-fd-api-contract/SKILL.md`
+   - `context`:
+     - `feature_spec_path`: path to one Feature Spec document
+     - `output_path`: path for the API Contract document
+ - Parallel execution: one worker per Feature Spec document
+
+ ### 4.3 Joint Confirmation
+
+ After both the Feature Spec and API Contract documents are ready, present a summary to the user:
+ - List all Feature Spec documents with their paths
+ - List all API Contract documents with their paths
+ - Request user confirmation before proceeding to the system design phase
+ - After confirmation, the API Contract becomes the read-only baseline for downstream stages
+
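As an editorial aid (not part of the package), the per-worker parameters described in Phase 4.2 can be sketched as a plain object; the iteration directory and feature name below are hypothetical examples:

```javascript
// Illustrative sketch of the Phase 4.2 worker payload.
// Field names come from the list above; paths are invented examples.
const workerParams = {
  skill_path: 'speccrew-fd-api-contract/SKILL.md',
  context: {
    feature_spec_path:
      'speccrew-workspace/iterations/001-feature-login/02.feature-design/login-feature-spec.md',
    output_path:
      'speccrew-workspace/iterations/001-feature-login/02.feature-design/login-api-contract.md'
  }
};

// One worker per Feature Spec: the output path mirrors the spec path,
// swapping the `-feature-spec.md` suffix for `-api-contract.md`.
console.log(workerParams.context.output_path);
```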
  # Deliverables
 
  | Deliverable | Path | Notes |
  |-------------|------|-------|
  | Feature Detail Design Document | `speccrew-workspace/iterations/{number}-{type}-{name}/02.feature-design/[feature-name]-feature-spec.md` | Based on template from `speccrew-fd-feature-design/templates/FEATURE-SPEC-TEMPLATE.md` |
+ | API Contract Document | `speccrew-workspace/iterations/{number}-{type}-{name}/02.feature-design/[feature-name]-api-contract.md` | Based on template from `speccrew-fd-api-contract/templates/API-CONTRACT-TEMPLATE.md` |
 
  # Deliverable Content Structure
 
@@ -133,7 +163,8 @@ The Feature Detail Design Document should include the following:
  - Use Mermaid diagrams to describe interaction flows, clearly expressing user-system interaction processes
  - Define complete data fields, including type, format, constraints, and other information
  - Design backend processing logic flows, including business validation and exception handling
- - Explicitly prompt user for confirmation after feature design completion, only transition to speccrew-system-designer after confirmation
+ - Generate API Contract documents after the Feature Spec is confirmed, using the `speccrew-fd-api-contract` skill
+ - Explicitly prompt the user for joint confirmation of both Feature Spec and API Contract; only transition to speccrew-system-designer after confirmation
 
  **Must not do:**
  - Do not go deep into specific technical implementation details (e.g., technology selection, framework usage; that's speccrew-system-designer's responsibility)
@@ -60,7 +60,8 @@ flowchart TB
  S0[Pre-processing: Platform Root Detection] --> S0a[Stage 0: Platform Detection]
  S0a --> S1a[Stage 1a: Entry Directory Recognition]
  S1a --> S1b[Stage 1b: Feature Inventory]
- S1b --> S2[Stage 2: Feature Analysis + Graph Write]
+ S1b --> S1c[Stage 1c: Feature Merge - Incremental]
+ S1c --> S2[Stage 2: Feature Analysis + Graph Write]
  S2 --> S3[Stage 3: Module Summarize]
  S3 --> S3_5[Stage 3.5: UI Style Pattern Extract]
  S3_5 --> S4[Stage 4: System Summary]
@@ -278,6 +279,51 @@ After generating the entry-dirs JSON:
 
  ---
 
+ ## Stage 1c: Feature Merge (Incremental)
+
+ **Goal**: If incremental inventory files (`features-*.new.json`) are detected, merge them with the existing `features-*.json` files. This stage identifies added/removed/changed features, resets changed features for re-analysis, and cleans up artifacts for removed features.
+
+ > **IMPORTANT**: This stage is executed **directly by the dispatch agent (Leader)** via `run_in_terminal`. It is NOT delegated to a Worker Agent.
+
+ **Prerequisite**: Stage 1b completed.
+
+ **Skip condition**: If no `features-*.new.json` files exist in `{sync_state_path}/knowledge-bizs/`, skip this stage entirely and proceed to Stage 2.
+
+ **Action** (dispatch executes directly via `run_in_terminal`):
+
+ 1. **Locate the merge script**: Find `merge-features.js` in the `speccrew-knowledge-bizs-dispatch` skill's scripts directory:
+    - Script location: `{ide_skills_dir}/speccrew-knowledge-bizs-dispatch/scripts/merge-features.js`
+
+ 2. **Execute the merge script**:
+    ```
+    node "{path_to_merge_features_js}" --syncStatePath "{sync_state_path}/knowledge-bizs" --completedDir "{completed_dir}" --projectRoot "{project_root}"
+    ```
+
+ 3. **Read the output JSON** from stdout and report the merge results:
+    - Added features: new source files discovered
+    - Removed features: source files no longer exist (documents and markers cleaned up)
+    - Changed features: source files modified since last analysis (reset to `analyzed: false`)
+    - Unchanged features: source files not modified (analysis state preserved)
+
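As an editorial aid (not part of the package), the stdout JSON read in step 3 has roughly the following shape, reconstructed from the `merge-features.js` script added later in this diff; the feature names and paths are invented examples:

```javascript
// Illustrative merge-result shape; field names follow merge-features.js,
// values are invented examples.
const sampleResult = {
  platforms: [
    {
      platformId: 'backend-bpm',
      added: ['ChatController'],        // new source files
      removed: ['LegacyController'],    // source gone; artifacts cleaned up
      changed: ['OrderController'],     // reset to analyzed: false
      unchanged: ['UserController'],    // analysis state preserved
      cleanedFiles: ['docs/LegacyController.md']
    }
  ],
  totalAdded: 1,
  totalRemoved: 1,
  totalChanged: 1,
  totalUnchanged: 1
};

// The totals aggregate the per-platform lists across all platforms.
console.log(JSON.stringify(sampleResult));
```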
+ **Merge Logic**:
+
+ | Situation | Condition | Action |
+ |-----------|-----------|--------|
+ | Added | In new scan but not in existing features | Add with `analyzed: false` |
+ | Removed | In existing features but not in new scan | Remove from list, delete `.md` doc + `.done.json` + `.graph.json` markers |
+ | Changed | Both exist, `lastModified > completedAt` | Reset `analyzed: false` for re-analysis |
+ | Unchanged | Both exist, `lastModified <= completedAt` | Preserve existing analysis state |
+
+ **Output**: Updated `features-{platform}.json` files where:
+ - New features: `analyzed: false`
+ - Source-modified features: `analyzed: false`
+ - Unmodified features: original `analyzed` status preserved
+ - Deleted features: removed from the list, with associated documents and markers cleaned up
+
+ **Error handling**: If the merge script exits with a non-zero code, STOP and report the error. Do NOT proceed to Stage 2 until the merge failure is resolved.
+
+ ---
+
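The four rows of the Merge Logic table amount to a small decision function. A minimal editorial sketch (not the shipped code; it assumes both timestamps are directly comparable ISO strings, whereas the actual script first normalizes `completedAt` through `normalizeCompletedAt`):

```javascript
// Editorial sketch of the per-feature merge decision.
// Assumption: lastModified and completedAt are ISO-8601 strings.
function classify(newFeature, oldFeature) {
  if (newFeature && !oldFeature) return 'added';      // only in the new scan
  if (!newFeature && oldFeature) return 'removed';    // only in the old inventory
  // Both exist: re-analyze if never analyzed, or if the source
  // was modified after the last analysis completed.
  const changed =
    !oldFeature.analyzed ||
    !oldFeature.completedAt ||
    new Date(newFeature.lastModified) > new Date(oldFeature.completedAt);
  return changed ? 'changed' : 'unchanged';
}
```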
  ## Stage 2: Feature Analysis (Batch Processing)
 
  **Overview**: Process all pending features in batches. Each batch gets a set of features, launches Worker Agents to analyze them, then processes the results.
@@ -0,0 +1,300 @@
+ #!/usr/bin/env node
+ /**
+  * merge-features.js
+  *
+  * Merge incremental feature inventory (*.new.json) with existing features (*.json).
+  * Identifies added/removed/changed/unchanged features and cleans up artifacts for removed features.
+  *
+  * Usage: node merge-features.js --syncStatePath <path> --completedDir <path> --projectRoot <path>
+  */
+
+ const fs = require('fs');
+ const path = require('path');
+
+ function parseArgs() {
+   const args = process.argv.slice(2);
+   const params = {};
+   for (let i = 0; i < args.length; i++) {
+     if (args[i].startsWith('--')) {
+       const key = args[i].slice(2);
+       const value = args[i + 1];
+       if (value !== undefined && !value.startsWith('--')) {
+         params[key] = value;
+         i++;
+       } else {
+         params[key] = true;
+       }
+     }
+   }
+   return params;
+ }
+
+ function normalizePath(p) {
+   return p ? p.replace(/\\/g, '/') : '';
+ }
+
+ // Safely delete a file, logging the action
+ function safeDelete(filePath, cleanedFiles) {
+   if (fs.existsSync(filePath)) {
+     try {
+       fs.unlinkSync(filePath);
+       cleanedFiles.push(normalizePath(filePath));
+       return true;
+     } catch (e) {
+       console.error(`Warning: Failed to delete ${filePath}: ${e.message}`);
+       return false;
+     }
+   }
+   return false;
+ }
+
+ // Compare timestamps: returns true if sourceModified is newer than analysisCompleted
+ function isNewer(lastModified, completedAt) {
+   if (!completedAt) return true; // never analyzed → needs analysis
+   if (!lastModified) return false; // no modification info → assume unchanged
+
+   // lastModified is ISO format: "2026-04-07T12:30:00.000Z"
+   // completedAt is custom format: "2026-04-07-123000" (from dispatch)
+   // Normalize completedAt to comparable format
+   const normalizedCompleted = normalizeCompletedAt(completedAt);
+
+   const modDate = new Date(lastModified);
+   const compDate = new Date(normalizedCompleted);
+
+   // If either date is invalid, treat as changed (safer)
+   if (isNaN(modDate.getTime()) || isNaN(compDate.getTime())) {
+     return true;
+   }
+
+   return modDate > compDate;
+ }
+
+ // Normalize completedAt format "2026-04-07-123000" → "2026-04-07T12:30:00"
+ function normalizeCompletedAt(completedAt) {
+   if (!completedAt) return null;
+
+   // Try ISO format first
+   if (completedAt.includes('T')) return completedAt;
+
+   // Handle "YYYY-MM-DD-HHmmss" format
+   const match = completedAt.match(/^(\d{4}-\d{2}-\d{2})-(\d{2})(\d{2})(\d{2})$/);
+   if (match) {
+     return `${match[1]}T${match[2]}:${match[3]}:${match[4]}`;
+   }
+
+   return completedAt;
+ }
+
+ function main() {
+   const params = parseArgs();
+
+   const syncStatePath = params.syncStatePath;
+   const completedDir = params.completedDir;
+   const projectRoot = params.projectRoot;
+
+   if (!syncStatePath || !completedDir || !projectRoot) {
+     console.error('Usage: node merge-features.js --syncStatePath <path> --completedDir <path> --projectRoot <path>');
+     process.exit(1);
+   }
+
+   const resolvedSyncState = path.resolve(syncStatePath);
+   const resolvedCompletedDir = path.resolve(completedDir);
+   const resolvedProjectRoot = path.resolve(projectRoot);
+
+   // Scan for *.new.json files
+   if (!fs.existsSync(resolvedSyncState)) {
+     // Output empty result
+     console.log(JSON.stringify({ platforms: [], totalAdded: 0, totalRemoved: 0, totalChanged: 0, totalUnchanged: 0 }));
+     return;
+   }
+
+   const newFiles = fs.readdirSync(resolvedSyncState)
+     .filter(f => f.startsWith('features-') && f.endsWith('.new.json'));
+
+   if (newFiles.length === 0) {
+     // No incremental files found
+     console.log(JSON.stringify({ platforms: [], totalAdded: 0, totalRemoved: 0, totalChanged: 0, totalUnchanged: 0 }));
+     return;
+   }
+
+   const result = {
+     platforms: [],
+     totalAdded: 0,
+     totalRemoved: 0,
+     totalChanged: 0,
+     totalUnchanged: 0
+   };
+
+   for (const newFile of newFiles) {
+     const newFilePath = path.join(resolvedSyncState, newFile);
+     // features-backend-bpm.new.json → features-backend-bpm.json
+     const oldFileName = newFile.replace('.new.json', '.json');
+     const oldFilePath = path.join(resolvedSyncState, oldFileName);
+
+     // Read new features
+     let newData;
+     try {
+       newData = JSON.parse(fs.readFileSync(newFilePath, 'utf8'));
+     } catch (e) {
+       console.error(`Error reading ${newFile}: ${e.message}`);
+       continue;
+     }
+
+     // Read old features (if exists)
+     let oldData = null;
+     if (fs.existsSync(oldFilePath)) {
+       try {
+         oldData = JSON.parse(fs.readFileSync(oldFilePath, 'utf8'));
+       } catch (e) {
+         console.error(`Warning: Failed to read ${oldFileName}, treating as fresh: ${e.message}`);
+       }
+     }
+
+     // If no old data, just rename new → old (first-time generation)
+     if (!oldData) {
+       fs.renameSync(newFilePath, oldFilePath);
+       const platformResult = {
+         platformId: newData.platformId || oldFileName.replace('features-', '').replace('.json', ''),
+         added: newData.features.map(f => f.fileName),
+         removed: [],
+         changed: [],
+         unchanged: [],
+         cleanedFiles: []
+       };
+       result.platforms.push(platformResult);
+       result.totalAdded += platformResult.added.length;
+       console.error(`Platform ${platformResult.platformId}: ${platformResult.added.length} features (first run)`);
+       continue;
+     }
+
+     // Build old features map by sourcePath
+     const oldMap = new Map();
+     for (const feature of (oldData.features || [])) {
+       oldMap.set(normalizePath(feature.sourcePath), feature);
+     }
+
+     // Build new features map by sourcePath
+     const newMap = new Map();
+     for (const feature of (newData.features || [])) {
+       newMap.set(normalizePath(feature.sourcePath), feature);
+     }
+
+     const platformResult = {
+       platformId: newData.platformId || oldData.platformId || oldFileName.replace('features-', '').replace('.json', ''),
+       added: [],
+       removed: [],
+       changed: [],
+       unchanged: [],
+       cleanedFiles: []
+     };
+
+     // Merged features list
+     const mergedFeatures = [];
+
+     // Process new features
+     for (const [sourcePath, newFeature] of newMap) {
+       const oldFeature = oldMap.get(sourcePath);
+
+       if (!oldFeature) {
+         // Added: new feature not in old
+         mergedFeatures.push({
+           ...newFeature,
+           analyzed: false,
+           startedAt: null,
+           completedAt: null,
+           analysisNotes: null
+         });
+         platformResult.added.push(newFeature.fileName);
+       } else if (!oldFeature.analyzed || !oldFeature.completedAt) {
+         // Previously not analyzed → keep as not analyzed, use new metadata
+         mergedFeatures.push({
+           ...newFeature,
+           analyzed: false,
+           startedAt: oldFeature.startedAt,
+           completedAt: oldFeature.completedAt,
+           analysisNotes: oldFeature.analysisNotes
+         });
+         platformResult.changed.push(newFeature.fileName);
+       } else if (isNewer(newFeature.lastModified, oldFeature.completedAt)) {
+         // Changed: source modified after last analysis
+         mergedFeatures.push({
+           ...newFeature,
+           analyzed: false,
+           startedAt: null,
+           completedAt: oldFeature.completedAt, // preserve for reference
+           analysisNotes: `Source modified since last analysis (was: ${oldFeature.analysisNotes || 'N/A'})`
+         });
+         platformResult.changed.push(newFeature.fileName);
+       } else {
+         // Unchanged: keep old feature state entirely
+         mergedFeatures.push({
+           ...oldFeature,
+           // Update metadata from new scan (in case id/documentPath changed)
+           id: newFeature.id,
+           documentPath: newFeature.documentPath,
+           lastModified: newFeature.lastModified
+         });
+         platformResult.unchanged.push(newFeature.fileName);
+       }
+     }
+
+     // Process removed features (in old but not in new)
+     for (const [sourcePath, oldFeature] of oldMap) {
+       if (!newMap.has(sourcePath)) {
+         platformResult.removed.push(oldFeature.fileName);
+
+         // Clean up artifacts
+         // 1. Delete document .md file
+         if (oldFeature.documentPath) {
+           const docAbsPath = path.join(resolvedProjectRoot, oldFeature.documentPath);
+           safeDelete(docAbsPath, platformResult.cleanedFiles);
+         }
+
+         // 2. Delete .done.json marker
+         const donePath = path.join(resolvedCompletedDir, `${oldFeature.fileName}.done.json`);
+         safeDelete(donePath, platformResult.cleanedFiles);
+
+         // 3. Delete .graph.json marker
+         const graphPath = path.join(resolvedCompletedDir, `${oldFeature.fileName}.graph.json`);
+         safeDelete(graphPath, platformResult.cleanedFiles);
+       }
+     }
+
+     // Update inventory metadata
+     const analyzedCount = mergedFeatures.filter(f => f.analyzed).length;
+     const mergedInventory = {
+       ...newData,
+       // Preserve some old metadata
+       analysisMethod: oldData.analysisMethod || newData.analysisMethod,
+       // Update counts
+       totalFiles: mergedFeatures.length,
+       analyzedCount: analyzedCount,
+       pendingCount: mergedFeatures.length - analyzedCount,
+       generatedAt: new Date().toISOString().replace(/[-:]/g, '').slice(0, 15).replace('T', '-'),
+       features: mergedFeatures
+     };
+
+     // Also recalculate modules list
+     const moduleSet = new Set(mergedFeatures.map(f => f.module));
+     mergedInventory.modules = [...moduleSet].sort();
+
+     // Write back merged features (overwrite old file)
+     fs.writeFileSync(oldFilePath, JSON.stringify(mergedInventory, null, 2), 'utf8');
+
+     // Delete .new.json file
+     fs.unlinkSync(newFilePath);
+
+     result.platforms.push(platformResult);
+     result.totalAdded += platformResult.added.length;
+     result.totalRemoved += platformResult.removed.length;
+     result.totalChanged += platformResult.changed.length;
+     result.totalUnchanged += platformResult.unchanged.length;
+
+     console.error(`Platform ${platformResult.platformId}: +${platformResult.added.length} added, -${platformResult.removed.length} removed, ~${platformResult.changed.length} changed, =${platformResult.unchanged.length} unchanged`);
+   }
+
+   // Output result as JSON to stdout
+   console.log(JSON.stringify(result, null, 2));
+ }
+
+ main();
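For illustration (not part of the package), the `completedAt` normalization in the script above can be exercised standalone; `toIso` restates the regex conversion from `normalizeCompletedAt`:

```javascript
// Standalone restatement of the "YYYY-MM-DD-HHmmss" → ISO-like conversion
// performed by normalizeCompletedAt above (editorial illustration).
function toIso(completedAt) {
  // Already ISO-like: pass through unchanged
  if (completedAt.includes('T')) return completedAt;
  // "2026-04-07-123000" → "2026-04-07T12:30:00"
  const m = completedAt.match(/^(\d{4}-\d{2}-\d{2})-(\d{2})(\d{2})(\d{2})$/);
  return m ? `${m[1]}T${m[2]}:${m[3]}:${m[4]}` : completedAt;
}

console.log(toIso('2026-04-07-123000')); // → 2026-04-07T12:30:00
```

Once both stamps are in this comparable form, `isNewer` can fall back to plain `Date` comparison, treating invalid dates as "changed" to err on the side of re-analysis.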
@@ -154,6 +154,14 @@ function isDataObjectFile(fileName, extension, excludeSuffixes) {
    return false;
  }
 
+ // Check if file name matches exact exclusion list (e.g., package-info)
+ function isExcludedFileName(fileName, excludeFileNames) {
+   if (!excludeFileNames || excludeFileNames.length === 0) {
+     return false;
+   }
+   return excludeFileNames.includes(fileName);
+ }
+
  // Get module name (first non-excluded directory level)
  function getModuleName(dirPath, excludeDirs, fallbackModuleName) {
    const parts = normalizePath(dirPath).split('/').filter(p => p && p !== '.');
@@ -225,11 +233,12 @@ function loadTechStackConfig(platformType, framework, projectRoot) {
      const techConfig = config.tech_stacks[platformType][framework];
      return {
        extensions: techConfig.extensions || [],
-       exclude_file_suffixes: techConfig.exclude_file_suffixes || []
+       exclude_file_suffixes: techConfig.exclude_file_suffixes || [],
+       exclude_file_names: techConfig.exclude_file_names || []
      };
    }
 
-   return { extensions: [], exclude_file_suffixes: [] };
+   return { extensions: [], exclude_file_suffixes: [], exclude_file_names: [] };
  }
 
  /**
@@ -357,7 +366,8 @@ function findFilesInDir(dir, extensions, baseDir) {
          relativePath,
          fileName: path.basename(item, ext),
          extension: ext,
-         directory: path.dirname(relativePath)
+         directory: path.dirname(relativePath),
+         lastModified: stat.mtime.toISOString()
        });
      }
    }
@@ -380,7 +390,7 @@ function generateFromEntryDirs(entryDirsData, platformConfig, projectRoot, outpu
 
    // Load tech stack config for extensions and exclude_file_suffixes
    const techConfig = loadTechStackConfig(platformType, framework, projectRoot);
-   const { extensions, exclude_file_suffixes } = techConfig;
+   const { extensions, exclude_file_suffixes, exclude_file_names } = techConfig;
 
    if (extensions.length === 0) {
      console.error(`Error: No extensions found for ${platformType}/${framework} in tech-stack-mappings.json`);
@@ -428,6 +438,11 @@ function generateFromEntryDirs(entryDirsData, platformConfig, projectRoot, outpu
        continue;
      }
 
+     // Apply exclude_file_names filter (e.g., package-info)
+     if (isExcludedFileName(file.fileName, exclude_file_names)) {
+       continue;
+     }
+
      // Build feature ID: moduleName-entryDirSegs-fileName
      // entryDir like "controller/admin/chat" → "controller-admin-chat"
      const entryDirNormalized = normalizePath(entryDir).replace(/[\/\\]/g, '-');
@@ -445,6 +460,7 @@ function generateFromEntryDirs(entryDirsData, platformConfig, projectRoot, outpu
        sourcePath: relativeFilePath,
        documentPath: docPath,
        module: moduleName,
+       lastModified: file.lastModified,
        analyzed: false,
        startedAt: null,
        completedAt: null,
@@ -484,16 +500,25 @@ function generateFromEntryDirs(entryDirsData, platformConfig, projectRoot, outpu
    // Write output file
    const outputFileName = `features-${platformId}.json`;
    const outputPath = path.join(outputDir, outputFileName);
-
+
+   // Incremental: if features file already exists, write to *.new.json
+   const actualOutputPath = fs.existsSync(outputPath)
+     ? outputPath.replace(/\.json$/, '.new.json')
+     : outputPath;
+
    // Ensure output directory exists
    if (!fs.existsSync(outputDir)) {
      fs.mkdirSync(outputDir, { recursive: true });
    }
-
-   fs.writeFileSync(outputPath, JSON.stringify(inventory, null, 2), 'utf8');
-
-   console.log(`Generated ${outputFileName} with ${features.length} features`);
-   console.log(`Output: ${outputPath}`);
+
+   fs.writeFileSync(actualOutputPath, JSON.stringify(inventory, null, 2), 'utf8');
+
+   if (actualOutputPath !== outputPath) {
+     console.log(`Incremental: Generated ${path.basename(actualOutputPath)} (existing features detected)`);
+   } else {
+     console.log(`Full: Generated ${path.basename(actualOutputPath)} with ${features.length} features`);
+   }
+   console.log(`Output: ${actualOutputPath}`);
 
    return true;
  }
@@ -522,7 +547,8 @@ function findFiles(dir, extensions, excludeDirs, baseDir) {
          relativePath,
          fileName: path.basename(item, ext),
          extension: ext,
-         directory: path.dirname(relativePath)
+         directory: path.dirname(relativePath),
+         lastModified: stat.mtime.toISOString()
        });
      }
    }
@@ -649,6 +675,7 @@ function main() {
 
    // If excludeDirs not provided or empty, try to read from tech-stack-mappings.json
    let excludeFileSuffixes = [];
+   let excludeFileNames = [];
    if (!excludeDirsStr || excludeDirsStr === '[]') {
      try {
        const configPath = path.join(projectRoot, 'speccrew-workspace', 'docs', 'configs', 'tech-stack-mappings.json');
@@ -683,6 +710,17 @@ function main() {
              console.log(`Loaded exclude_file_suffixes from tech-stack-mappings.json: ${excludeFileSuffixes.join(', ')}`);
            }
          }
+
+         // Load tech-stack-specific exclude_file_names
+         if (config.tech_stacks &&
+             config.tech_stacks[platformType] &&
+             config.tech_stacks[platformType][techIdentifier] &&
+             config.tech_stacks[platformType][techIdentifier].exclude_file_names) {
+           excludeFileNames = config.tech_stacks[platformType][techIdentifier].exclude_file_names;
+           if (excludeFileNames.length > 0) {
+             console.log(`Loaded exclude_file_names from tech-stack-mappings.json: ${excludeFileNames.join(', ')}`);
+           }
+         }
        }
      } catch (e) {
        // Silent fallback - continue with default or empty
@@ -749,10 +787,21 @@ function main() {
      excludedDataObjectsCount = filesBeforeFilter - files.length;
    }
 
+   // Filter out files with excluded names (e.g., package-info)
+   let excludedFileNamesCount = 0;
+   if (excludeFileNames.length > 0) {
+     const filesBeforeFilter = files.length;
+     files = files.filter(file => !isExcludedFileName(file.fileName, excludeFileNames));
+     excludedFileNamesCount = filesBeforeFilter - files.length;
+   }
+
    console.log(`Found ${allFiles.length} total files, ${files.length} after excluding components directories`);
    if (excludedDataObjectsCount > 0) {
      console.log(`Excluded: ${excludedDataObjectsCount} data objects (VO/DTO/DO/Entity/Convert)`);
    }
+   if (excludedFileNamesCount > 0) {
+     console.log(`Excluded: ${excludedFileNamesCount} files by name (${excludeFileNames.join(', ')})`);
+   }
 
    // Build flat feature list - each file is a feature
    const features = [];
@@ -816,6 +865,7 @@ function main() {
        sourcePath: relativeFilePath,
        documentPath: docPath,
        module: moduleName,
+       lastModified: file.lastModified,
        analyzed: false,
        startedAt: null,
        completedAt: null,
@@ -857,11 +907,20 @@ function main() {
      fs.mkdirSync(syncStateDir, { recursive: true });
    }
 
+   // Incremental: if features file already exists, write to *.new.json
+   const actualOutputPath = fs.existsSync(outputPath)
+     ? outputPath.replace(/\.json$/, '.new.json')
+     : outputPath;
+
    // Write JSON output
-   fs.writeFileSync(outputPath, JSON.stringify(inventory, null, 2), 'utf8');
+   fs.writeFileSync(actualOutputPath, JSON.stringify(inventory, null, 2), 'utf8');
 
-   console.log(`Generated features.json with ${files.length} features`);
-   console.log(`Ready for analysis: ${outputPath}`);
+   if (actualOutputPath !== outputPath) {
+     console.log(`Incremental: Generated ${path.basename(actualOutputPath)} (existing features detected)`);
+   } else {
+     console.log(`Full: Generated features.json with ${files.length} features`);
+   }
+   console.log(`Output: ${actualOutputPath}`);
  }
 
  main();
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "speccrew",
-   "version": "0.1.3",
+   "version": "0.1.5",
    "description": "Spec-Driven Development toolkit for AI-powered IDEs",
    "author": "charlesmu99",
    "repository": {
@@ -22,7 +22,6 @@
      "docs/",
      "README*.md"
    ],
-
    "engines": {
      "node": ">=16.0.0"
    },
@@ -132,6 +132,7 @@
    "extensions": [".java", ".kt"],
    "exclude_dirs": ["controller", "controllers", "admin", "app", "api", "service", "services", "repository", "repositories", "dao", "dal", "mysql", "redis", "dataobject", "entity", "entities", "model", "models", "dto", "dtos", "vo", "vos", "mapper", "mappers", "convert", "converter", "converters", "config", "configs", "util", "utils", "common", "exception", "exceptions", "enums", "framework", "job", "mq", "listener", "listeners", "producer", "consumer"],
    "exclude_file_suffixes": ["VO", "DTO", "DO", "Entity", "Convert", "Converter"],
+   "exclude_file_names": ["package-info"],
    "entry_patterns": ["**/*Controller.java", "**/*Controller.kt", "**/*Service.java"],
    "api_patterns": {
      "controller": ["@Controller", "@RestController", "@RequestMapping"],