speccrew 0.1.4 → 0.1.6

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -9,7 +9,7 @@ tools: Read, Write, Glob, Grep
  You are the **Feature Designer Agent**, responsible for transforming PRD requirement scenarios into concrete system feature specifications.
 
  You are in the **second stage** of the complete engineering closed loop:
- `User Requirements → PRD → [Feature Detail Design] → speccrew-system-designer → speccrew-dev → speccrew-test`
+ `User Requirements → PRD → [Feature Detail Design + API Contract] → speccrew-system-designer → speccrew-dev → speccrew-test`
 
  Your core task is to **bridge requirements and implementation**: based on the user scenarios described in the PRD, design the system's UI prototypes, interaction flows, backend processing logic, and data access schemes, without delving into specific technical implementation details.
 
@@ -86,11 +86,41 @@ Invoke `speccrew-task-worker` agents in parallel:
  - Each worker has access to both the Master PRD (for the overall view) and one Sub PRD (for focused design)
  - All workers execute simultaneously to maximize efficiency
 
+ ## Phase 4: API Contract Generation
+
+ After the Feature Spec documents are confirmed by the user, generate the API Contract documents.
+
+ ### 4.1 Single Feature Spec
+
+ Invoke the API Contract skill directly:
+ - Skill path: `speccrew-fd-api-contract/SKILL.md`
+ - Input: the Feature Spec document generated in Phase 3
+ - Output path: `speccrew-workspace/iterations/{number}-{type}-{name}/02.feature-design/[feature-name]-api-contract.md`
+
+ ### 4.2 Multiple Feature Specs (Master + Sub)
+
+ Invoke `speccrew-task-worker` agents in parallel:
+ - Each worker receives:
+   - `skill_path`: `speccrew-fd-api-contract/SKILL.md`
+   - `context`:
+     - `feature_spec_path`: path to one Feature Spec document
+     - `output_path`: path for the API Contract document
+ - Parallel execution: one worker per Feature Spec document
+
+ ### 4.3 Joint Confirmation
+
+ After both the Feature Spec and API Contract documents are ready, present a summary to the user:
+ - List all Feature Spec documents with paths
+ - List all API Contract documents with paths
+ - Request user confirmation before proceeding to the system design phase
+ - After confirmation, the API Contract becomes the read-only baseline for downstream stages
+
  # Deliverables
 
  | Deliverable | Path | Notes |
  |-------------|------|-------|
  | Feature Detail Design Document | `speccrew-workspace/iterations/{number}-{type}-{name}/02.feature-design/[feature-name]-feature-spec.md` | Based on the template at `speccrew-fd-feature-design/templates/FEATURE-SPEC-TEMPLATE.md` |
+ | API Contract Document | `speccrew-workspace/iterations/{number}-{type}-{name}/02.feature-design/[feature-name]-api-contract.md` | Based on the template at `speccrew-fd-api-contract/templates/API-CONTRACT-TEMPLATE.md` |
 
  # Deliverable Content Structure
 
@@ -133,7 +163,8 @@ The Feature Detail Design Document should include the following:
  - Use Mermaid diagrams to describe interaction flows, clearly expressing user-system interaction processes
  - Define complete data fields, including type, format, constraints, and other information
  - Design backend processing logic flows, including business validation and exception handling
- - Explicitly prompt the user for confirmation after feature design completion; only transition to speccrew-system-designer after confirmation
+ - Generate API Contract documents after the Feature Spec is confirmed, using the `speccrew-fd-api-contract` skill
+ - Explicitly prompt the user for joint confirmation of both the Feature Spec and the API Contract; only transition to speccrew-system-designer after confirmation
 
 **Must not do:**
  - Do not go deep into specific technical implementation details (e.g., technology selection, framework usage; that is speccrew-system-designer's responsibility)
@@ -60,7 +60,8 @@ flowchart TB
   S0[Pre-processing: Platform Root Detection] --> S0a[Stage 0: Platform Detection]
   S0a --> S1a[Stage 1a: Entry Directory Recognition]
   S1a --> S1b[Stage 1b: Feature Inventory]
-  S1b --> S2[Stage 2: Feature Analysis + Graph Write]
+  S1b --> S1c[Stage 1c: Feature Merge - Incremental]
+  S1c --> S2[Stage 2: Feature Analysis + Graph Write]
   S2 --> S3[Stage 3: Module Summarize]
   S3 --> S3_5[Stage 3.5: UI Style Pattern Extract]
   S3_5 --> S4[Stage 4: System Summary]
@@ -278,6 +279,51 @@ After generating the entry-dirs JSON:
 
  ---
 
+ ## Stage 1c: Feature Merge (Incremental)
+
+ **Goal**: If incremental inventory files (`features-*.new.json`) are detected, merge them with the existing `features-*.json` files. This identifies added/removed/changed features, resets changed features for re-analysis, and cleans up artifacts for removed features.
+
+ > **IMPORTANT**: This stage is executed **directly by the dispatch agent (Leader)** via `run_in_terminal`. It is NOT delegated to a Worker Agent.
+
+ **Prerequisite**: Stage 1b completed.
+
+ **Skip condition**: If no `features-*.new.json` files exist in `{sync_state_path}/knowledge-bizs/`, skip this stage entirely and proceed to Stage 2.
+
+ **Action** (dispatch executes directly via `run_in_terminal`):
+
+ 1. **Locate the merge script**: find `merge-features.js` in the `speccrew-knowledge-bizs-dispatch` skill's scripts directory:
+    - Script location: `{ide_skills_dir}/speccrew-knowledge-bizs-dispatch/scripts/merge-features.js`
+
+ 2. **Execute the merge script**:
+    ```
+    node "{path_to_merge_features_js}" --syncStatePath "{sync_state_path}/knowledge-bizs" --completedDir "{completed_dir}" --projectRoot "{project_root}"
+    ```
+
+ 3. **Read the output JSON** from stdout and report the merge results:
+    - Added features: new source files discovered
+    - Removed features: source files that no longer exist (documents and markers cleaned up)
+    - Changed features: source files modified since the last analysis (reset to `analyzed: false`)
+    - Unchanged features: source files not modified (analysis state preserved)
+
+ **Merge Logic**:
+
+ | Situation | Condition | Action |
+ |-----------|-----------|--------|
+ | Added | In new scan but not in existing features | Add with `analyzed: false` |
+ | Removed | In existing features but not in new scan | Remove from list; delete the `.md` doc plus the `.done.json` and `.graph.json` markers |
+ | Changed | Both exist, `lastModified > completedAt` | Reset `analyzed: false` for re-analysis |
+ | Unchanged | Both exist, `lastModified <= completedAt` | Preserve existing analysis state |
+
+ **Output**: Updated `features-{platform}.json` files where:
+ - New features: `analyzed: false`
+ - Source-modified features: `analyzed: false`
+ - Unmodified features: original `analyzed` status preserved
+ - Deleted features: removed from the list, with associated documents and markers cleaned up
+
+ **Error handling**: If the merge script exits with a non-zero code, STOP and report the error. Do NOT proceed to Stage 2 until the merge issue is resolved.
+
+ ---
+
  ## Stage 2: Feature Analysis (Batch Processing)
 
  **Overview**: Process all pending features in batches. Each batch gets a set of features, launches Worker Agents to analyze them, then processes the results.
@@ -0,0 +1,300 @@
+ #!/usr/bin/env node
+ /**
+  * merge-features.js
+  *
+  * Merge incremental feature inventory (*.new.json) with existing features (*.json).
+  * Identifies added/removed/changed/unchanged features and cleans up artifacts for removed features.
+  *
+  * Usage: node merge-features.js --syncStatePath <path> --completedDir <path> --projectRoot <path>
+  */
+
+ const fs = require('fs');
+ const path = require('path');
+
+ function parseArgs() {
+   const args = process.argv.slice(2);
+   const params = {};
+   for (let i = 0; i < args.length; i++) {
+     if (args[i].startsWith('--')) {
+       const key = args[i].slice(2);
+       const value = args[i + 1];
+       if (value !== undefined && !value.startsWith('--')) {
+         params[key] = value;
+         i++;
+       } else {
+         params[key] = true;
+       }
+     }
+   }
+   return params;
+ }
+
+ function normalizePath(p) {
+   return p ? p.replace(/\\/g, '/') : '';
+ }
+
+ // Safely delete a file, logging the action
+ function safeDelete(filePath, cleanedFiles) {
+   if (fs.existsSync(filePath)) {
+     try {
+       fs.unlinkSync(filePath);
+       cleanedFiles.push(normalizePath(filePath));
+       return true;
+     } catch (e) {
+       console.error(`Warning: Failed to delete ${filePath}: ${e.message}`);
+       return false;
+     }
+   }
+   return false;
+ }
+
+ // Compare timestamps: returns true if the source was modified after the analysis completed
+ function isNewer(lastModified, completedAt) {
+   if (!completedAt) return true;   // never analyzed → needs analysis
+   if (!lastModified) return false; // no modification info → assume unchanged
+
+   // lastModified is ISO format: "2026-04-07T12:30:00.000Z"
+   // completedAt is a custom format: "2026-04-07-123000" (from dispatch)
+   // Normalize completedAt to a comparable format
+   const normalizedCompleted = normalizeCompletedAt(completedAt);
+
+   const modDate = new Date(lastModified);
+   const compDate = new Date(normalizedCompleted);
+
+   // If either date is invalid, treat as changed (safer)
+   if (isNaN(modDate.getTime()) || isNaN(compDate.getTime())) {
+     return true;
+   }
+
+   return modDate > compDate;
+ }
+
+ // Normalize completedAt format "2026-04-07-123000" → "2026-04-07T12:30:00"
+ function normalizeCompletedAt(completedAt) {
+   if (!completedAt) return null;
+
+   // Try ISO format first
+   if (completedAt.includes('T')) return completedAt;
+
+   // Handle "YYYY-MM-DD-HHmmss" format
+   const match = completedAt.match(/^(\d{4}-\d{2}-\d{2})-(\d{2})(\d{2})(\d{2})$/);
+   if (match) {
+     return `${match[1]}T${match[2]}:${match[3]}:${match[4]}`;
+   }
+
+   return completedAt;
+ }
+
+ function main() {
+   const params = parseArgs();
+
+   const syncStatePath = params.syncStatePath;
+   const completedDir = params.completedDir;
+   const projectRoot = params.projectRoot;
+
+   if (!syncStatePath || !completedDir || !projectRoot) {
+     console.error('Usage: node merge-features.js --syncStatePath <path> --completedDir <path> --projectRoot <path>');
+     process.exit(1);
+   }
+
+   const resolvedSyncState = path.resolve(syncStatePath);
+   const resolvedCompletedDir = path.resolve(completedDir);
+   const resolvedProjectRoot = path.resolve(projectRoot);
+
+   // Scan for *.new.json files
+   if (!fs.existsSync(resolvedSyncState)) {
+     // Output empty result
+     console.log(JSON.stringify({ platforms: [], totalAdded: 0, totalRemoved: 0, totalChanged: 0, totalUnchanged: 0 }));
+     return;
+   }
+
+   const newFiles = fs.readdirSync(resolvedSyncState)
+     .filter(f => f.startsWith('features-') && f.endsWith('.new.json'));
+
+   if (newFiles.length === 0) {
+     // No incremental files found
+     console.log(JSON.stringify({ platforms: [], totalAdded: 0, totalRemoved: 0, totalChanged: 0, totalUnchanged: 0 }));
+     return;
+   }
+
+   const result = {
+     platforms: [],
+     totalAdded: 0,
+     totalRemoved: 0,
+     totalChanged: 0,
+     totalUnchanged: 0
+   };
+
+   for (const newFile of newFiles) {
+     const newFilePath = path.join(resolvedSyncState, newFile);
+     // features-backend-bpm.new.json → features-backend-bpm.json
+     const oldFileName = newFile.replace('.new.json', '.json');
+     const oldFilePath = path.join(resolvedSyncState, oldFileName);
+
+     // Read new features
+     let newData;
+     try {
+       newData = JSON.parse(fs.readFileSync(newFilePath, 'utf8'));
+     } catch (e) {
+       console.error(`Error reading ${newFile}: ${e.message}`);
+       continue;
+     }
+
+     // Read old features (if the file exists)
+     let oldData = null;
+     if (fs.existsSync(oldFilePath)) {
+       try {
+         oldData = JSON.parse(fs.readFileSync(oldFilePath, 'utf8'));
+       } catch (e) {
+         console.error(`Warning: Failed to read ${oldFileName}, treating as fresh: ${e.message}`);
+       }
+     }
+
+     // If there is no old data, just rename new → old (first-time generation)
+     if (!oldData) {
+       fs.renameSync(newFilePath, oldFilePath);
+       const platformResult = {
+         platformId: newData.platformId || oldFileName.replace('features-', '').replace('.json', ''),
+         added: newData.features.map(f => f.fileName),
+         removed: [],
+         changed: [],
+         unchanged: [],
+         cleanedFiles: []
+       };
+       result.platforms.push(platformResult);
+       result.totalAdded += platformResult.added.length;
+       console.error(`Platform ${platformResult.platformId}: ${platformResult.added.length} features (first run)`);
+       continue;
+     }
+
+     // Build the old features map keyed by sourcePath
+     const oldMap = new Map();
+     for (const feature of (oldData.features || [])) {
+       oldMap.set(normalizePath(feature.sourcePath), feature);
+     }
+
+     // Build the new features map keyed by sourcePath
+     const newMap = new Map();
+     for (const feature of (newData.features || [])) {
+       newMap.set(normalizePath(feature.sourcePath), feature);
+     }
+
+     const platformResult = {
+       platformId: newData.platformId || oldData.platformId || oldFileName.replace('features-', '').replace('.json', ''),
+       added: [],
+       removed: [],
+       changed: [],
+       unchanged: [],
+       cleanedFiles: []
+     };
+
+     // Merged features list
+     const mergedFeatures = [];
+
+     // Process new features
+     for (const [sourcePath, newFeature] of newMap) {
+       const oldFeature = oldMap.get(sourcePath);
+
+       if (!oldFeature) {
+         // Added: new feature not in old
+         mergedFeatures.push({
+           ...newFeature,
+           analyzed: false,
+           startedAt: null,
+           completedAt: null,
+           analysisNotes: null
+         });
+         platformResult.added.push(newFeature.fileName);
+       } else if (!oldFeature.analyzed || !oldFeature.completedAt) {
+         // Previously not analyzed → keep as not analyzed, use new metadata
+         mergedFeatures.push({
+           ...newFeature,
+           analyzed: false,
+           startedAt: oldFeature.startedAt,
+           completedAt: oldFeature.completedAt,
+           analysisNotes: oldFeature.analysisNotes
+         });
+         platformResult.changed.push(newFeature.fileName);
+       } else if (isNewer(newFeature.lastModified, oldFeature.completedAt)) {
+         // Changed: source modified after last analysis
+         mergedFeatures.push({
+           ...newFeature,
+           analyzed: false,
+           startedAt: null,
+           completedAt: oldFeature.completedAt, // preserve for reference
+           analysisNotes: `Source modified since last analysis (was: ${oldFeature.analysisNotes || 'N/A'})`
+         });
+         platformResult.changed.push(newFeature.fileName);
+       } else {
+         // Unchanged: keep the old feature state entirely
+         mergedFeatures.push({
+           ...oldFeature,
+           // Update metadata from the new scan (in case id/documentPath changed)
+           id: newFeature.id,
+           documentPath: newFeature.documentPath,
+           lastModified: newFeature.lastModified
+         });
+         platformResult.unchanged.push(newFeature.fileName);
+       }
+     }
+
+     // Process removed features (in old but not in new)
+     for (const [sourcePath, oldFeature] of oldMap) {
+       if (!newMap.has(sourcePath)) {
+         platformResult.removed.push(oldFeature.fileName);
+
+         // Clean up artifacts
+         // 1. Delete the document .md file
+         if (oldFeature.documentPath) {
+           const docAbsPath = path.join(resolvedProjectRoot, oldFeature.documentPath);
+           safeDelete(docAbsPath, platformResult.cleanedFiles);
+         }
+
+         // 2. Delete the .done.json marker
+         const donePath = path.join(resolvedCompletedDir, `${oldFeature.fileName}.done.json`);
+         safeDelete(donePath, platformResult.cleanedFiles);
+
+         // 3. Delete the .graph.json marker
+         const graphPath = path.join(resolvedCompletedDir, `${oldFeature.fileName}.graph.json`);
+         safeDelete(graphPath, platformResult.cleanedFiles);
+       }
+     }
+
+     // Update inventory metadata
+     const analyzedCount = mergedFeatures.filter(f => f.analyzed).length;
+     const mergedInventory = {
+       ...newData,
+       // Preserve some old metadata
+       analysisMethod: oldData.analysisMethod || newData.analysisMethod,
+       // Update counts
+       totalFiles: mergedFeatures.length,
+       analyzedCount: analyzedCount,
+       pendingCount: mergedFeatures.length - analyzedCount,
+       generatedAt: new Date().toISOString().replace(/[-:]/g, '').slice(0, 15).replace('T', '-'),
+       features: mergedFeatures
+     };
+
+     // Also recalculate the modules list
+     const moduleSet = new Set(mergedFeatures.map(f => f.module));
+     mergedInventory.modules = [...moduleSet].sort();
+
+     // Write back the merged features (overwrite the old file)
+     fs.writeFileSync(oldFilePath, JSON.stringify(mergedInventory, null, 2), 'utf8');
+
+     // Delete the .new.json file
+     fs.unlinkSync(newFilePath);
+
+     result.platforms.push(platformResult);
+     result.totalAdded += platformResult.added.length;
+     result.totalRemoved += platformResult.removed.length;
+     result.totalChanged += platformResult.changed.length;
+     result.totalUnchanged += platformResult.unchanged.length;
+
+     console.error(`Platform ${platformResult.platformId}: +${platformResult.added.length} added, -${platformResult.removed.length} removed, ~${platformResult.changed.length} changed, =${platformResult.unchanged.length} unchanged`);
+   }
+
+   // Output the result as JSON to stdout
+   console.log(JSON.stringify(result, null, 2));
+ }
+
+ main();
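As a quick sanity check on the timestamp handling above, the `normalizeCompletedAt` / `isNewer` pair can be exercised standalone. The logic is copied from the script; the sample timestamps are made up for illustration:

```javascript
// Standalone copy of merge-features.js's timestamp helpers, for illustration only.
function normalizeCompletedAt(completedAt) {
  if (!completedAt) return null;
  if (completedAt.includes('T')) return completedAt; // already ISO-like
  // "YYYY-MM-DD-HHmmss" → "YYYY-MM-DDTHH:mm:ss"
  const m = completedAt.match(/^(\d{4}-\d{2}-\d{2})-(\d{2})(\d{2})(\d{2})$/);
  return m ? `${m[1]}T${m[2]}:${m[3]}:${m[4]}` : completedAt;
}

function isNewer(lastModified, completedAt) {
  if (!completedAt) return true;   // never analyzed → needs analysis
  if (!lastModified) return false; // no mtime info → assume unchanged
  const mod = new Date(lastModified);
  const comp = new Date(normalizeCompletedAt(completedAt));
  if (isNaN(mod.getTime()) || isNaN(comp.getTime())) return true;
  return mod > comp;
}

console.log(normalizeCompletedAt('2026-04-07-123000')); // "2026-04-07T12:30:00"
// A file modified well after the recorded analysis time counts as changed:
console.log(isNewer('2026-04-09T00:00:00.000Z', '2026-04-07-123000')); // true
```

Note that the normalized `completedAt` has no timezone suffix, so `new Date` parses it as local time while `lastModified` carries a `Z`; for mtimes far from the analysis time this does not affect the verdict.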
@@ -366,7 +366,8 @@ function findFilesInDir(dir, extensions, baseDir) {
         relativePath,
         fileName: path.basename(item, ext),
         extension: ext,
-        directory: path.dirname(relativePath)
+        directory: path.dirname(relativePath),
+        lastModified: stat.mtime.toISOString()
       });
     }
   }
@@ -459,6 +460,7 @@ function generateFromEntryDirs(entryDirsData, platformConfig, projectRoot, outpu
       sourcePath: relativeFilePath,
       documentPath: docPath,
       module: moduleName,
+      lastModified: file.lastModified,
       analyzed: false,
       startedAt: null,
       completedAt: null,
@@ -498,16 +500,25 @@ function generateFromEntryDirs(entryDirsData, platformConfig, projectRoot, outpu
   // Write output file
   const outputFileName = `features-${platformId}.json`;
   const outputPath = path.join(outputDir, outputFileName);
-
+
+  // Incremental: if the features file already exists, write to *.new.json
+  const actualOutputPath = fs.existsSync(outputPath)
+    ? outputPath.replace(/\.json$/, '.new.json')
+    : outputPath;
+
   // Ensure output directory exists
   if (!fs.existsSync(outputDir)) {
     fs.mkdirSync(outputDir, { recursive: true });
   }
-
-  fs.writeFileSync(outputPath, JSON.stringify(inventory, null, 2), 'utf8');
-
-  console.log(`Generated ${outputFileName} with ${features.length} features`);
-  console.log(`Output: ${outputPath}`);
+
+  fs.writeFileSync(actualOutputPath, JSON.stringify(inventory, null, 2), 'utf8');
+
+  if (actualOutputPath !== outputPath) {
+    console.log(`Incremental: Generated ${path.basename(actualOutputPath)} (existing features detected)`);
+  } else {
+    console.log(`Full: Generated ${path.basename(actualOutputPath)} with ${features.length} features`);
+  }
+  console.log(`Output: ${actualOutputPath}`);
 
   return true;
 }
@@ -536,7 +547,8 @@ function findFiles(dir, extensions, excludeDirs, baseDir) {
         relativePath,
         fileName: path.basename(item, ext),
         extension: ext,
-        directory: path.dirname(relativePath)
+        directory: path.dirname(relativePath),
+        lastModified: stat.mtime.toISOString()
       });
     }
   }
@@ -853,6 +865,7 @@ function main() {
       sourcePath: relativeFilePath,
       documentPath: docPath,
       module: moduleName,
+      lastModified: file.lastModified,
       analyzed: false,
       startedAt: null,
       completedAt: null,
@@ -894,11 +907,20 @@ function main() {
     fs.mkdirSync(syncStateDir, { recursive: true });
   }
 
+  // Incremental: if the features file already exists, write to *.new.json
+  const actualOutputPath = fs.existsSync(outputPath)
+    ? outputPath.replace(/\.json$/, '.new.json')
+    : outputPath;
+
   // Write JSON output
-  fs.writeFileSync(outputPath, JSON.stringify(inventory, null, 2), 'utf8');
+  fs.writeFileSync(actualOutputPath, JSON.stringify(inventory, null, 2), 'utf8');
 
-  console.log(`Generated features.json with ${files.length} features`);
-  console.log(`Ready for analysis: ${outputPath}`);
+  if (actualOutputPath !== outputPath) {
+    console.log(`Incremental: Generated ${path.basename(actualOutputPath)} (existing features detected)`);
+  } else {
+    console.log(`Full: Generated features.json with ${files.length} features`);
+  }
+  console.log(`Output: ${actualOutputPath}`);
 }
 
 main();
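The incremental redirect above is a plain string substitution on the output path. Extracted into a pure helper (a restructuring for illustration; the real code inlines this with `fs.existsSync`), it behaves as follows, using a made-up path:

```javascript
// Mirrors the inventory script's incremental naming rule: when a features
// file already exists, the fresh scan is written to a *.new.json sibling
// so Stage 1c can merge it instead of clobbering prior analysis state.
function incrementalOutputPath(outputPath, alreadyExists) {
  return alreadyExists
    ? outputPath.replace(/\.json$/, '.new.json')
    : outputPath;
}

console.log(incrementalOutputPath('sync/features-backend-bpm.json', true));
// sync/features-backend-bpm.new.json
console.log(incrementalOutputPath('sync/features-backend-bpm.json', false));
// sync/features-backend-bpm.json
```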
@@ -376,6 +376,28 @@ function updatePackageDocs(packageRoot, workspaceDir, stats) {
 function run() {
   try {
     const args = parseArgs();
+
+    // Self-update: check if a newer version exists on the npm registry
+    const localVersion = getPackageVersion();
+    let latestVersion = null;
+    try {
+      const { execSync } = require('child_process');
+      latestVersion = execSync('npm view speccrew version', { encoding: 'utf8', timeout: 15000 }).trim();
+    } catch (e) {
+      // Network error or npm not available; skip self-update
+    }
+
+    if (latestVersion && latestVersion !== localVersion) {
+      console.log(`Updating CLI: v${localVersion} → v${latestVersion} ...`);
+      try {
+        const { execSync } = require('child_process');
+        execSync('npm install -g speccrew@latest', { stdio: 'inherit', timeout: 60000 });
+        console.log(`CLI updated to v${latestVersion}\n`);
+      } catch (e) {
+        console.warn(`Warning: CLI self-update failed (${e.message}). Continuing with current version.\n`);
+      }
+    }
+
     const projectRoot = process.cwd();
 
     // Read .speccrewrc (with migration support)
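One caveat about the hunk above: `latestVersion !== localVersion` fires on any difference, including when the local version is ahead of the registry (for example a local dev build). A semver-aware check would only trigger on a genuinely newer registry version; a minimal sketch (the `isRemoteNewer` helper is hypothetical, not part of the package):

```javascript
// Hypothetical helper: true only when `remote` is strictly newer than
// `local` under numeric major.minor.patch ordering (no prerelease handling).
function isRemoteNewer(local, remote) {
  const a = local.split('.').map(Number);
  const b = remote.split('.').map(Number);
  for (let i = 0; i < 3; i++) {
    if ((b[i] || 0) > (a[i] || 0)) return true;
    if ((b[i] || 0) < (a[i] || 0)) return false;
  }
  return false; // equal versions
}

console.log(isRemoteNewer('0.1.4', '0.1.6')); // true  → update
console.log(isRemoteNewer('0.2.0', '0.1.6')); // false → keep local dev build
```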
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "speccrew",
-  "version": "0.1.4",
+  "version": "0.1.6",
   "description": "Spec-Driven Development toolkit for AI-powered IDEs",
   "author": "charlesmu99",
   "repository": {