@boshu2/vibe-check 1.5.0 → 1.6.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/.agents/bundles/insight-mining-dashboard-research-2025-11-30.md +400 -0
- package/.agents/bundles/storage-enhancement-research-2025-11-30.md +292 -0
- package/.agents/bundles/timeline-feature-research-complete-2025-11-30.md +301 -0
- package/.agents/plans/insight-dashboard-plan-2025-11-30.md +1130 -0
- package/.agents/plans/json-storage-enhancement-plan.md +717 -0
- package/.agents/plans/storage-hardening-and-cache-plan.md +592 -0
- package/.agents/plans/test-coverage-gaps-plan.md +1117 -0
- package/.agents/plans/timeline-feature-plan.md +193 -0
- package/.agents/plans/vibe_timeline_research_findings.md +553 -0
- package/.claude/settings.local.json +1 -0
- package/.vibe-check/.gitignore +6 -0
- package/CHANGELOG.md +46 -0
- package/CLAUDE.md +24 -0
- package/CONTRIBUTING.md +227 -0
- package/README.md +165 -144
- package/claude-progress.json +191 -9
- package/claude-progress.txt +257 -0
- package/dashboard/app.js +75 -2
- package/dashboard/dashboard-data.json +653 -0
- package/dashboard/index.html +13 -0
- package/dashboard/styles.css +61 -0
- package/dist/analysis/cross-session-analysis.d.ts +68 -0
- package/dist/analysis/cross-session-analysis.d.ts.map +1 -0
- package/dist/analysis/cross-session-analysis.js +174 -0
- package/dist/analysis/cross-session-analysis.js.map +1 -0
- package/dist/analysis/index.d.ts +2 -0
- package/dist/analysis/index.d.ts.map +1 -0
- package/dist/analysis/index.js +12 -0
- package/dist/analysis/index.js.map +1 -0
- package/dist/cli.js +10 -1
- package/dist/cli.js.map +1 -1
- package/dist/commands/analyze.d.ts +2 -0
- package/dist/commands/analyze.d.ts.map +1 -1
- package/dist/commands/analyze.js +105 -2
- package/dist/commands/analyze.js.map +1 -1
- package/dist/commands/cache.d.ts +6 -0
- package/dist/commands/cache.d.ts.map +1 -0
- package/dist/commands/cache.js +168 -0
- package/dist/commands/cache.js.map +1 -0
- package/dist/commands/dashboard.d.ts +8 -0
- package/dist/commands/dashboard.d.ts.map +1 -0
- package/dist/commands/dashboard.js +109 -0
- package/dist/commands/dashboard.js.map +1 -0
- package/dist/commands/index.d.ts +3 -0
- package/dist/commands/index.d.ts.map +1 -1
- package/dist/commands/index.js +8 -1
- package/dist/commands/index.js.map +1 -1
- package/dist/commands/timeline.d.ts +14 -0
- package/dist/commands/timeline.d.ts.map +1 -0
- package/dist/commands/timeline.js +462 -0
- package/dist/commands/timeline.js.map +1 -0
- package/dist/git.d.ts +24 -0
- package/dist/git.d.ts.map +1 -1
- package/dist/git.js +94 -0
- package/dist/git.js.map +1 -1
- package/dist/insights/generators.d.ts +44 -0
- package/dist/insights/generators.d.ts.map +1 -0
- package/dist/insights/generators.js +289 -0
- package/dist/insights/generators.js.map +1 -0
- package/dist/insights/index.d.ts +16 -0
- package/dist/insights/index.d.ts.map +1 -0
- package/dist/insights/index.js +171 -0
- package/dist/insights/index.js.map +1 -0
- package/dist/insights/types.d.ts +93 -0
- package/dist/insights/types.d.ts.map +1 -0
- package/dist/insights/types.js +6 -0
- package/dist/insights/types.js.map +1 -0
- package/dist/output/timeline-html.d.ts +6 -0
- package/dist/output/timeline-html.d.ts.map +1 -0
- package/dist/output/timeline-html.js +389 -0
- package/dist/output/timeline-html.js.map +1 -0
- package/dist/output/timeline-markdown.d.ts +6 -0
- package/dist/output/timeline-markdown.d.ts.map +1 -0
- package/dist/output/timeline-markdown.js +167 -0
- package/dist/output/timeline-markdown.js.map +1 -0
- package/dist/output/timeline.d.ts +9 -0
- package/dist/output/timeline.d.ts.map +1 -0
- package/dist/output/timeline.js +318 -0
- package/dist/output/timeline.js.map +1 -0
- package/dist/patterns/detour.d.ts +32 -0
- package/dist/patterns/detour.d.ts.map +1 -0
- package/dist/patterns/detour.js +137 -0
- package/dist/patterns/detour.js.map +1 -0
- package/dist/patterns/flow-state.d.ts +16 -0
- package/dist/patterns/flow-state.d.ts.map +1 -0
- package/dist/patterns/flow-state.js +40 -0
- package/dist/patterns/flow-state.js.map +1 -0
- package/dist/patterns/index.d.ts +8 -0
- package/dist/patterns/index.d.ts.map +1 -0
- package/dist/patterns/index.js +22 -0
- package/dist/patterns/index.js.map +1 -0
- package/dist/patterns/intervention-effectiveness.d.ts +42 -0
- package/dist/patterns/intervention-effectiveness.d.ts.map +1 -0
- package/dist/patterns/intervention-effectiveness.js +196 -0
- package/dist/patterns/intervention-effectiveness.js.map +1 -0
- package/dist/patterns/late-night.d.ts +30 -0
- package/dist/patterns/late-night.d.ts.map +1 -0
- package/dist/patterns/late-night.js +141 -0
- package/dist/patterns/late-night.js.map +1 -0
- package/dist/patterns/post-delete-sprint.d.ts +28 -0
- package/dist/patterns/post-delete-sprint.d.ts.map +1 -0
- package/dist/patterns/post-delete-sprint.js +85 -0
- package/dist/patterns/post-delete-sprint.js.map +1 -0
- package/dist/patterns/spiral-regression.d.ts +49 -0
- package/dist/patterns/spiral-regression.d.ts.map +1 -0
- package/dist/patterns/spiral-regression.js +219 -0
- package/dist/patterns/spiral-regression.js.map +1 -0
- package/dist/patterns/thrashing.d.ts +25 -0
- package/dist/patterns/thrashing.d.ts.map +1 -0
- package/dist/patterns/thrashing.js +111 -0
- package/dist/patterns/thrashing.js.map +1 -0
- package/dist/storage/atomic.d.ts +40 -0
- package/dist/storage/atomic.d.ts.map +1 -0
- package/dist/storage/atomic.js +155 -0
- package/dist/storage/atomic.js.map +1 -0
- package/dist/storage/commit-log.d.ts +35 -0
- package/dist/storage/commit-log.d.ts.map +1 -0
- package/dist/storage/commit-log.js +128 -0
- package/dist/storage/commit-log.js.map +1 -0
- package/dist/storage/index.d.ts +5 -0
- package/dist/storage/index.d.ts.map +1 -0
- package/dist/storage/index.js +33 -0
- package/dist/storage/index.js.map +1 -0
- package/dist/storage/schema.d.ts +32 -0
- package/dist/storage/schema.d.ts.map +1 -0
- package/dist/storage/schema.js +37 -0
- package/dist/storage/schema.js.map +1 -0
- package/dist/storage/timeline-store.d.ts +117 -0
- package/dist/storage/timeline-store.d.ts.map +1 -0
- package/dist/storage/timeline-store.js +438 -0
- package/dist/storage/timeline-store.js.map +1 -0
- package/dist/types.d.ts +96 -0
- package/dist/types.d.ts.map +1 -1
- package/docs/ARCHITECTURE.md +458 -0
- package/docs/DATA-ARCHITECTURE.md +565 -0
- package/docs/GAMIFICATION.md +564 -0
- package/docs/JSON-STORAGE-PATTERNS.md +512 -0
- package/docs/METRICS-EXPLAINED.md +394 -0
- package/docs/UNIFIED-ECOSYSTEM.md +560 -0
- package/docs/VIBE-ECOSYSTEM.md +406 -0
- package/docs/images/dashboard.png +0 -0
- package/feature-list.json +48 -0
- package/package.json +2 -1
- package/vitest.config.ts +1 -5
- package/.vibe-check/calibration.json +0 -38
- package/.vibe-check/latest.json +0 -114
- package/.vibe-check/sessions.json +0 -44
- package/PLAN-ultimate-game.md +0 -1362
@@ -0,0 +1,717 @@
# JSON Storage Enhancement Plan

**Type:** Plan
**Created:** 2025-11-30
**Depends On:** docs/DATA-ARCHITECTURE.md, docs/JSON-STORAGE-PATTERNS.md
**Loop:** Middle (bridges research to implementation)
**Tags:** storage, json, ndjson, event-sourcing

---

## Overview

Enhance vibe-check's JSON storage layer to follow best practices:
1. Add NDJSON append-only log for commits (source of truth)
2. Implement atomic writes to prevent corruption
3. Add schema versioning with migrations
4. Create utility module for common storage patterns

**Scope:** Storage layer only. No changes to commands or output.

---

## Approach Selected

**Hybrid Pattern (Pattern 5 from research):**
- `commits.ndjson` = append-only source of truth
- `timeline.json` = computed view (existing, enhanced)
- Atomic writes via temp file + rename

**Rationale:**
- Git-friendly (NDJSON diffs show only new lines)
- Resilient (can regenerate timeline.json from commits.ndjson)
- Backward compatible (existing timeline.json format preserved)
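
As a sanity check on the hybrid pattern: the computed view is a pure function of the append-only log, so `timeline.json` can always be regenerated. A minimal sketch — names like `LoggedCommit` and `rebuildView` are illustrative here, not the package's actual API:

```typescript
// The NDJSON log is the source of truth; the view is derived, never edited.
interface LoggedCommit {
  h: string; // hash
  d: string; // ISO date
  t: string; // conventional-commit type
}

interface ComputedView {
  totalCommits: number;
  byType: Record<string, number>;
  lastHash: string | null;
}

// Rebuild the view from raw NDJSON text (one JSON object per line).
function rebuildView(ndjson: string): ComputedView {
  const commits = ndjson
    .split('\n')
    .filter(line => line.trim().length > 0)
    .map(line => JSON.parse(line) as LoggedCommit);

  const byType: Record<string, number> = {};
  for (const c of commits) {
    byType[c.t] = (byType[c.t] ?? 0) + 1;
  }

  return {
    totalCommits: commits.length,
    byType,
    lastHash: commits.length > 0 ? commits[commits.length - 1].h : null,
  };
}

const log = [
  '{"h":"abc1234","d":"2025-11-30T10:00:00.000Z","t":"feat"}',
  '{"h":"def5678","d":"2025-11-30T11:00:00.000Z","t":"fix"}',
  '{"h":"9ab0cde","d":"2025-11-30T12:00:00.000Z","t":"feat"}',
].join('\n');

const view = rebuildView(log);
console.log(view.totalCommits, view.byType.feat, view.lastHash);
```

Because the derivation is deterministic, a corrupted or deleted `timeline.json` costs nothing but a rebuild.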

---

## PDC Strategy

### Prevent
- [x] Research completed (DATA-ARCHITECTURE.md, JSON-STORAGE-PATTERNS.md)
- [ ] Write tracer test for atomic write
- [ ] Verify NDJSON append works

### Detect
- [ ] Test corruption recovery
- [ ] Test backward compatibility with existing timeline.json

### Correct
- [ ] Rollback procedure documented below
- [ ] Old files preserved (not deleted)
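
The "tracer test for atomic write" item could start as small as the following, using only Node built-ins (the real test would live in the vitest suite). `atomicWriteSync` is re-sketched inline so the snippet stands alone:

```typescript
import * as fs from 'fs';
import * as os from 'os';
import * as path from 'path';
import { randomBytes } from 'crypto';

// Same shape as the planned src/storage/atomic.ts function.
function atomicWriteSync(filePath: string, data: string): void {
  const dir = path.dirname(filePath);
  const tempPath = path.join(dir, `.${path.basename(filePath)}.${randomBytes(6).toString('hex')}.tmp`);
  fs.mkdirSync(dir, { recursive: true });
  fs.writeFileSync(tempPath, data, 'utf-8');
  fs.renameSync(tempPath, filePath);
}

// Tracer: write twice, confirm the last write wins and no temp files remain.
const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'vibe-atomic-'));
const target = path.join(dir, 'timeline.json');

atomicWriteSync(target, '{"version":"1.0.0"}');
atomicWriteSync(target, '{"version":"2.0.0"}');

const content = fs.readFileSync(target, 'utf-8');
const leftoverTmp = fs.readdirSync(dir).filter(f => f.endsWith('.tmp'));
console.log(content, leftoverTmp.length);
```

A leftover `.tmp` file or partial content would indicate the rename step is not firing.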

---

## Files to Create

### 1. `src/storage/atomic.ts`

**Purpose:** Atomic write utilities to prevent corruption

```typescript
/**
 * Atomic file operations for safe JSON storage
 */
import * as fs from 'fs';
import * as path from 'path';
import { randomBytes } from 'crypto';

/**
 * Write data atomically using temp file + rename pattern.
 * This prevents corruption if the process is killed mid-write.
 */
export function atomicWriteSync(filePath: string, data: string): void {
  const dir = path.dirname(filePath);
  const tempPath = path.join(dir, `.${path.basename(filePath)}.${randomBytes(6).toString('hex')}.tmp`);

  // Ensure directory exists
  if (!fs.existsSync(dir)) {
    fs.mkdirSync(dir, { recursive: true });
  }

  // Write to temp file
  fs.writeFileSync(tempPath, data, 'utf-8');

  // Atomic rename (POSIX guarantees atomicity)
  fs.renameSync(tempPath, filePath);
}

/**
 * Append a line to a file (for NDJSON).
 * Creates file if it doesn't exist.
 */
export function appendLineSync(filePath: string, line: string): void {
  const dir = path.dirname(filePath);

  if (!fs.existsSync(dir)) {
    fs.mkdirSync(dir, { recursive: true });
  }

  // Append with newline
  fs.appendFileSync(filePath, line + '\n', 'utf-8');
}

/**
 * Read NDJSON file and parse each line.
 * Returns empty array if file doesn't exist.
 */
export function readNdjsonSync<T>(filePath: string): T[] {
  if (!fs.existsSync(filePath)) {
    return [];
  }

  const content = fs.readFileSync(filePath, 'utf-8');
  const lines = content.split('\n').filter(line => line.trim().length > 0);

  return lines.map(line => JSON.parse(line) as T);
}

/**
 * Safely read and parse JSON with fallback.
 * Returns fallback if file doesn't exist or is corrupted.
 */
export function safeReadJsonSync<T>(filePath: string, fallback: T): T {
  if (!fs.existsSync(filePath)) {
    return fallback;
  }

  try {
    const content = fs.readFileSync(filePath, 'utf-8');
    return JSON.parse(content) as T;
  } catch (error) {
    // Log corruption but don't crash
    console.warn(`Warning: Could not parse ${filePath}, using fallback`);

    // Backup corrupted file for debugging
    const backupPath = `${filePath}.corrupted.${Date.now()}`;
    try {
      fs.renameSync(filePath, backupPath);
      console.warn(`Corrupted file backed up to: ${backupPath}`);
    } catch {
      // Ignore backup failures
    }

    return fallback;
  }
}
```

**Validation:**
```bash
npm run build
# Should compile without errors
```
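
A quick round-trip sketch of the append/read pair, with the two functions re-declared inline so it runs standalone; the behavior mirrors `appendLineSync`/`readNdjsonSync` above:

```typescript
import * as fs from 'fs';
import * as os from 'os';
import * as path from 'path';

function appendLineSync(filePath: string, line: string): void {
  fs.mkdirSync(path.dirname(filePath), { recursive: true });
  fs.appendFileSync(filePath, line + '\n', 'utf-8');
}

function readNdjsonSync<T>(filePath: string): T[] {
  if (!fs.existsSync(filePath)) return [];
  return fs
    .readFileSync(filePath, 'utf-8')
    .split('\n')
    .filter(line => line.trim().length > 0)
    .map(line => JSON.parse(line) as T);
}

// Each append is one new line, which is exactly what a git diff will show.
const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'vibe-ndjson-'));
const log = path.join(dir, 'commits.ndjson');

appendLineSync(log, JSON.stringify({ h: 'abc1234' }));
appendLineSync(log, JSON.stringify({ h: 'def5678' }));

const rows = readNdjsonSync<{ h: string }>(log);
console.log(rows.length, rows[1].h);
```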

---

### 2. `src/storage/commit-log.ts`

**Purpose:** NDJSON commit log (append-only source of truth)

```typescript
/**
 * Append-only commit log using NDJSON format.
 * This is the source of truth - timeline.json is derived from this.
 */
import * as path from 'path';
import { Commit } from '../types';
import { appendLineSync, readNdjsonSync } from './atomic';

const STORE_DIR = '.vibe-check';
const COMMIT_LOG_FILE = 'commits.ndjson';

/**
 * Stored commit format (minimal for storage efficiency)
 */
export interface StoredCommit {
  h: string;        // hash (7 char)
  d: string;        // date (ISO string)
  m: string;        // message (first line)
  t: string;        // type
  s: string | null; // scope
  a: string;        // author
}

/**
 * Get commit log file path
 */
export function getCommitLogPath(repoPath: string = process.cwd()): string {
  return path.join(repoPath, STORE_DIR, COMMIT_LOG_FILE);
}

/**
 * Convert Commit to StoredCommit (compressed format)
 */
function toStoredCommit(commit: Commit): StoredCommit {
  return {
    h: commit.hash,
    d: commit.date.toISOString(),
    m: commit.message,
    t: commit.type,
    s: commit.scope,
    a: commit.author,
  };
}

/**
 * Convert StoredCommit back to Commit
 */
function fromStoredCommit(stored: StoredCommit): Commit {
  return {
    hash: stored.h,
    date: new Date(stored.d),
    message: stored.m,
    type: stored.t as Commit['type'],
    scope: stored.s,
    author: stored.a,
  };
}

/**
 * Append new commits to the log.
 * Skips commits that already exist (by hash).
 */
export function appendCommits(commits: Commit[], repoPath: string = process.cwd()): number {
  const logPath = getCommitLogPath(repoPath);

  // Load existing hashes to prevent duplicates
  const existingHashes = new Set(
    readNdjsonSync<StoredCommit>(logPath).map(c => c.h)
  );

  let appendedCount = 0;

  for (const commit of commits) {
    if (!existingHashes.has(commit.hash)) {
      const stored = toStoredCommit(commit);
      appendLineSync(logPath, JSON.stringify(stored));
      existingHashes.add(commit.hash);
      appendedCount++;
    }
  }

  return appendedCount;
}

/**
 * Read all commits from the log.
 */
export function readCommitLog(repoPath: string = process.cwd()): Commit[] {
  const logPath = getCommitLogPath(repoPath);
  const stored = readNdjsonSync<StoredCommit>(logPath);
  return stored.map(fromStoredCommit);
}

/**
 * Get the most recent commit hash from the log.
 * Returns empty string if log is empty.
 */
export function getLastLoggedCommitHash(repoPath: string = process.cwd()): string {
  const commits = readCommitLog(repoPath);
  if (commits.length === 0) return '';

  // Sort by date descending and return most recent
  const sorted = commits.sort((a, b) => b.date.getTime() - a.date.getTime());
  return sorted[0].hash;
}

/**
 * Get commit count in the log.
 */
export function getCommitLogCount(repoPath: string = process.cwd()): number {
  const logPath = getCommitLogPath(repoPath);
  const commits = readNdjsonSync<StoredCommit>(logPath);
  return commits.length;
}
```

**Validation:**
```bash
npm run build
# Should compile without errors
```
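
The key invariant of `appendCommits` — re-appending the same batch adds nothing — can be shown in memory with a stripped-down commit shape (`MiniCommit` and `appendUnique` are illustrative, not part of the module):

```typescript
interface MiniCommit {
  hash: string;
  message: string;
}

// In-memory analogue of appendCommits: dedupe by hash, return count appended.
function appendUnique(log: MiniCommit[], incoming: MiniCommit[]): number {
  const seen = new Set(log.map(c => c.hash));
  let appended = 0;
  for (const commit of incoming) {
    if (!seen.has(commit.hash)) {
      log.push(commit);
      seen.add(commit.hash);
      appended++;
    }
  }
  return appended;
}

const log: MiniCommit[] = [];
const batch = [
  { hash: 'abc1234', message: 'feat: add timeline' },
  { hash: 'def5678', message: 'fix: handle empty repo' },
];

const first = appendUnique(log, batch);
const second = appendUnique(log, batch); // idempotent re-run
console.log(first, second, log.length);
```

Idempotence matters here because the timeline command may scan overlapping git ranges on successive runs.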

---

### 3. `src/storage/schema.ts`

**Purpose:** Schema versioning and migration utilities

```typescript
/**
 * Schema versioning and migration utilities
 */

export type SchemaVersion = '1.0.0' | '1.1.0' | '2.0.0';

export const CURRENT_SCHEMA_VERSION: SchemaVersion = '2.0.0';

/**
 * Base interface for all versioned stores
 */
export interface VersionedStore {
  version: SchemaVersion;
  lastUpdated: string;
}

/**
 * Migration function type
 */
export type MigrationFn<T> = (store: T) => T;

/**
 * Migration registry
 */
export interface MigrationRegistry<T> {
  '1.0.0_to_1.1.0'?: MigrationFn<T>;
  '1.1.0_to_2.0.0'?: MigrationFn<T>;
}

/**
 * Apply migrations to bring store to current version.
 * NOTE: each migration must set `version` on the store it returns;
 * otherwise the chain stops after the first step.
 */
export function migrateStore<T extends VersionedStore>(
  store: T,
  migrations: MigrationRegistry<T>
): T {
  let currentStore = store;

  // Migration path
  const migrationPath: Array<keyof MigrationRegistry<T>> = [
    '1.0.0_to_1.1.0',
    '1.1.0_to_2.0.0',
  ];

  for (const migrationKey of migrationPath) {
    const [fromVersion] = (migrationKey as string).split('_to_');

    if (currentStore.version === fromVersion) {
      const migration = migrations[migrationKey];
      if (migration) {
        currentStore = migration(currentStore);
      }
    }
  }

  return currentStore;
}

/**
 * Check if store needs migration
 */
export function needsMigration(store: VersionedStore): boolean {
  return store.version !== CURRENT_SCHEMA_VERSION;
}
```

**Validation:**
```bash
npm run build
# Should compile without errors
```
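
A worked run of the migration chain, with a hypothetical store shape (`DemoStore`) and a simplified registry. It assumes each migration bumps `version` itself, which is what lets the loop chain `1.0.0 → 1.1.0 → 2.0.0` in one pass:

```typescript
type SchemaVersion = '1.0.0' | '1.1.0' | '2.0.0';

interface DemoStore {
  version: SchemaVersion;
  lastUpdated: string;
  sessions?: string[]; // field introduced in 1.1.0
}

type MigrationFn = (store: DemoStore) => DemoStore;

// Each migration sets the new version on the store it returns.
const migrations: Record<string, MigrationFn> = {
  '1.0.0_to_1.1.0': s => ({ ...s, version: '1.1.0', sessions: s.sessions ?? [] }),
  '1.1.0_to_2.0.0': s => ({ ...s, version: '2.0.0' }),
};

function migrate(store: DemoStore): DemoStore {
  let current = store;
  for (const key of ['1.0.0_to_1.1.0', '1.1.0_to_2.0.0']) {
    const [from] = key.split('_to_');
    if (current.version === from) current = migrations[key](current);
  }
  return current;
}

const migrated = migrate({ version: '1.0.0', lastUpdated: '2025-11-30T00:00:00.000Z' });
console.log(migrated.version, migrated.sessions);
```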

---

## Files to Modify

### 1. `src/storage/timeline-store.ts:189-199`

**Purpose:** Use atomic write instead of direct writeFileSync

**Before:**
```typescript
export function saveStore(store: TimelineStore, repoPath: string = process.cwd()): void {
  const dirPath = getStoreDir(repoPath);
  const filePath = getStorePath(repoPath);

  if (!fs.existsSync(dirPath)) {
    fs.mkdirSync(dirPath, { recursive: true });
  }

  store.lastUpdated = new Date().toISOString();
  fs.writeFileSync(filePath, JSON.stringify(store, null, 2));
}
```

**After:**
```typescript
export function saveStore(store: TimelineStore, repoPath: string = process.cwd()): void {
  const filePath = getStorePath(repoPath);

  store.lastUpdated = new Date().toISOString();

  // Use atomic write to prevent corruption
  atomicWriteSync(filePath, JSON.stringify(store, null, 2));
}
```

**Reason:** Prevents file corruption if the process is killed mid-write
**Validation:** `npm test` should still pass

---

### 2. `src/storage/timeline-store.ts:170-184`

**Purpose:** Use safe read with corruption recovery

**Before:**
```typescript
export function loadStore(repoPath: string = process.cwd()): TimelineStore {
  const filePath = getStorePath(repoPath);

  if (fs.existsSync(filePath)) {
    try {
      const data = fs.readFileSync(filePath, 'utf-8');
      const store = JSON.parse(data) as TimelineStore;
      return migrateStore(store);
    } catch {
      return createInitialStore();
    }
  }

  return createInitialStore();
}
```

**After:**
```typescript
export function loadStore(repoPath: string = process.cwd()): TimelineStore {
  const filePath = getStorePath(repoPath);
  const initialStore = createInitialStore();

  const store = safeReadJsonSync<TimelineStore>(filePath, initialStore);

  // Always migrate (handles both old versions and fresh stores)
  return migrateTimelineStore(store);
}
```

**Reason:** Centralized corruption handling with backup
**Validation:** `npm test` should still pass

---

### 3. `src/storage/timeline-store.ts:1-10`

**Purpose:** Add imports for new utilities

**Before:**
```typescript
import * as fs from 'fs';
import * as path from 'path';
import {
  TimelineResult,
  TimelineSession,
  TimelineDay,
  TimelineEvent,
} from '../types';
```

**After:**
```typescript
import * as path from 'path';
import {
  TimelineResult,
  TimelineSession,
  TimelineDay,
  TimelineEvent,
} from '../types';
import { atomicWriteSync, safeReadJsonSync } from './atomic';
```

**Reason:** Use new atomic utilities
**Validation:** `npm run build` should compile

---

### 4. `src/storage/timeline-store.ts:557-575` (rename function)

**Purpose:** Rename migrateStore to migrateTimelineStore to avoid conflict

**Before:**
```typescript
function migrateStore(store: TimelineStore): TimelineStore {
```

**After:**
```typescript
function migrateTimelineStore(store: TimelineStore): TimelineStore {
```

**Reason:** Avoid naming conflict with schema.ts migrateStore
**Validation:** `npm run build` should compile

---

### 5. `src/storage/index.ts`

**Purpose:** Export new modules

**Before:**
```typescript
export {
  TimelineStore,
  StoredSession,
  StoredInsight,
  PatternStats,
  TrendData,
  WeekTrend,
  MonthTrend,
  loadStore,
  saveStore,
  createInitialStore,
  updateStore,
  getLastCommitHash,
  getStorePath,
  getStoreDir,
} from './timeline-store';
```

**After:**
```typescript
// Timeline store (computed view)
export {
  TimelineStore,
  StoredSession,
  StoredInsight,
  PatternStats,
  TrendData,
  WeekTrend,
  MonthTrend,
  loadStore,
  saveStore,
  createInitialStore,
  updateStore,
  getLastCommitHash,
  getStorePath,
  getStoreDir,
} from './timeline-store';

// Atomic file operations
export {
  atomicWriteSync,
  appendLineSync,
  readNdjsonSync,
  safeReadJsonSync,
} from './atomic';

// Commit log (NDJSON source of truth)
export {
  StoredCommit,
  getCommitLogPath,
  appendCommits,
  readCommitLog,
  getLastLoggedCommitHash,
  getCommitLogCount,
} from './commit-log';

// Schema versioning
export {
  SchemaVersion,
  CURRENT_SCHEMA_VERSION,
  VersionedStore,
  migrateStore,
  needsMigration,
} from './schema';
```

**Reason:** Expose new modules through barrel export
**Validation:** `npm run build` should compile

---

## Files NOT to Modify (Yet)

These files will use the new storage in a future PR:

- `src/commands/timeline.ts` - Will use commit-log.ts
- `src/gamification/profile.ts` - Will use atomic.ts

**Rationale:** Keep this PR focused on the storage layer only. Integration comes next.

---

## Implementation Order

**CRITICAL: Sequence matters. Do not reorder.**

| Step | Action | Validation | Rollback |
|------|--------|------------|----------|
| 1 | Create `src/storage/atomic.ts` | `npm run build` | Delete file |
| 2 | Create `src/storage/schema.ts` | `npm run build` | Delete file |
| 3 | Create `src/storage/commit-log.ts` | `npm run build` | Delete file |
| 4 | Modify `src/storage/timeline-store.ts` imports | `npm run build` | Revert |
| 5 | Rename migrateStore → migrateTimelineStore | `npm run build` | Revert |
| 6 | Update loadStore() | `npm run build` | Revert |
| 7 | Update saveStore() | `npm run build` | Revert |
| 8 | Update `src/storage/index.ts` | `npm run build` | Revert |
| 9 | Run full test suite | `npm test` | Revert all |
| 10 | Manual test timeline | `npm run dev -- timeline` | Revert all |

---

## Validation Strategy

### Syntax Validation
```bash
npm run build
# Expected: Compiled successfully
```

### Unit Tests
```bash
npm test
# Expected: All tests pass (existing tests should not break)
```

### Manual Integration Test
```bash
# Test timeline with new storage (note the `--` so npm forwards the flags)
npm run dev -- timeline --since "1 week ago"
# Expected: Timeline output displays correctly

# Verify .vibe-check/timeline.json was updated
head -5 .vibe-check/timeline.json
# Expected: JSON with version field

# Test corruption recovery (optional)
echo "corrupted" > .vibe-check/timeline.json
npm run dev -- timeline --since "1 week ago"
# Expected: Warning about corruption, creates fresh store
ls .vibe-check/timeline.json.corrupted.*
# Expected: Backup file exists
```

---

## Rollback Procedure

**Time to rollback:** ~2 minutes

### Full Rollback
```bash
# Step 1: Revert changes to tracked files
git checkout src/storage/

# Step 2: Remove the new (untracked) files
rm -f src/storage/atomic.ts
rm -f src/storage/schema.ts
rm -f src/storage/commit-log.ts

# Step 3: Verify
npm run build
npm test

# Step 4: Clean up any corrupted backup files
rm -f .vibe-check/*.corrupted.*
```

---

## Failure Pattern Risks

| Pattern | Risk | Prevention in Plan |
|---------|------|-------------------|
| Tests Passing Lie | LOW | Manual integration test required |
| Instruction Drift | LOW | Precise file:line specs |
| Bridge Torching | LOW | Backward compatible (same JSON format) |

---

## Risk Assessment

### Medium Risk: Import Ordering
- **What:** Circular imports between storage modules
- **Mitigation:** atomic.ts imports nothing from the project, and schema.ts has no imports at all
- **Detection:** Build fails with circular dependency error
- **Recovery:** Inline the problematic function

### Low Risk: Path Handling
- **What:** Windows path separators in NDJSON
- **Mitigation:** Using path.join() consistently
- **Detection:** Tests fail on Windows CI
- **Recovery:** Add path normalization

---

## Approval Checklist

**Human must verify before /implement:**

- [ ] Every file specified precisely (file:line)
- [ ] All templates complete (no placeholders)
- [ ] Validation commands provided
- [ ] Rollback procedure complete
- [ ] Implementation order is correct
- [ ] Risks identified and mitigated
- [ ] Scope limited to storage layer only

---

## Progress Files

This plan does not require `feature-list.json` or `claude-progress.json`, as it is a focused enhancement rather than a multi-day project.

---

## Next Step

Once approved: `/implement json-storage-enhancement-plan.md`

---

## Future Work (Not in This Plan)

1. **Integration PR:** Use commit-log.ts in timeline.ts
2. **Profile atomic writes:** Use atomic.ts in profile.ts
3. **Weekly cache:** Add cache/ directory for precomputed views
4. **Cross-repo aggregation:** Central SQLite for multi-repo (separate plan)