@getmikk/watcher 1.7.1 → 1.9.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,217 +1,92 @@
- # @getmikk/watcher
+ # @getmikk/watcher

- > Your architecture map, always in sync — zero manual re-analysis.
+ > Live file watcher daemon — incremental, debounced, atomic.

 [![npm](https://img.shields.io/npm/v/@getmikk/watcher)](https://www.npmjs.com/package/@getmikk/watcher)
 [![License: Apache-2.0](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](../../LICENSE)

- `@getmikk/watcher` keeps `mikk.lock.json` in sync with your source code in real time. It uses Chokidar to watch for file events, batches and debounces changes (so saving 20 files triggers one re-analysis, not twenty), incrementally re-parses only affected files, patches the dependency graph, recomputes Merkle hashes, and writes the lock file atomically — all with a PID-based singleton that prevents duplicate daemon processes.
+ Background daemon that keeps `mikk.lock.json` in sync as you edit code. Detects file changes via chokidar, re-parses only what changed, updates the lock atomically, and emits typed events for downstream consumers.

- The result: your AI context, impact analysis, and contract validation are always based on the current state of your codebase, not a stale snapshot.
-
- > Part of [Mikk](../../README.md) — the codebase nervous system for AI-assisted development.
+ > Part of [Mikk](../../README.md) — live architectural context for your AI agent.

 ---

- ## Installation
+ ## Usage
+
+ Started via the CLI:

 ```bash
- npm install @getmikk/watcher
- # or
- bun add @getmikk/watcher
+ mikk watch
 ```

- **Peer dependency:** `@getmikk/core`
-
- ---
-
- ## Quick Start
+ Or programmatically:

 ```typescript
 import { WatcherDaemon } from '@getmikk/watcher'

 const daemon = new WatcherDaemon({
- projectRoot: process.cwd(),
- include: ['src/**/*.ts', 'src/**/*.tsx'],
- exclude: ['node_modules', 'dist', '.mikk'],
+ projectRoot: '/path/to/project',
+ include: ['**/*.ts', '**/*.tsx'],
+ exclude: ['**/node_modules/**', '**/dist/**'],
 debounceMs: 100,
 })

 daemon.on((event) => {
- switch (event.type) {
- case 'file:changed':
- console.log(`Changed: ${event.path}`)
- break
- case 'graph:updated':
- console.log('Dependency graph rebuilt')
- break
- case 'sync:clean':
- console.log('Lock file is in sync')
- break
- case 'sync:drifted':
- console.log('Lock file has drifted')
- break
+ if (event.type === 'graph:updated') {
+ console.log(`Graph updated: ${event.data.changedNodes.length} changed`)
 }
 })

 await daemon.start()
- // Lock file is now kept in sync automatically
-
- // Later...
- await daemon.stop()
- ```
-
- ---
-
- ## Architecture
-
- ```
- Filesystem Events (Chokidar)
-         ↓
- ┌─────────────┐
- │ FileWatcher │ ← Hash computation, deduplication
- └──────┬──────┘
-        │ FileChangeEvent[]
-        ↓
- ┌──────────────────┐
- │  WatcherDaemon   │ ← Debouncing (100ms), batching
- └──────┬───────────┘
-        │ Batch of events
-        ↓
- ┌─────────────────────┐
- │ IncrementalAnalyzer │ ← Re-parse, graph patch, hash update
- └──────────┬──────────┘
-            ↓
- Atomic lock file write
 ```

 ---

- ## API Reference
-
- ### WatcherDaemon
-
- The main entry point — a long-running process that keeps the lock file in sync.
-
- ```typescript
- import { WatcherDaemon } from '@getmikk/watcher'
-
- const daemon = new WatcherDaemon(config)
- ```
-
- **`WatcherConfig`:**
-
- | Field | Type | Default | Description |
- |-------|------|---------|-------------|
- | `projectRoot` | `string` | — | Absolute path to the project |
- | `include` | `string[]` | `['**/*.ts']` | Glob patterns for watched files |
- | `exclude` | `string[]` | `['node_modules']` | Glob patterns to ignore |
- | `debounceMs` | `number` | `100` | Debounce window in milliseconds |
-
- **Methods:**
-
- | Method | Description |
- |--------|-------------|
- | `start()` | Start watching. Creates PID file at `.mikk/watcher.pid` for single-instance enforcement |
- | `stop()` | Stop watching. Cleans up PID file |
- | `on(handler)` | Register event handler |
-
- **Features:**
-
- - **Debouncing** — Batches rapid file changes (e.g., save-all) into a single analysis pass
- - **PID file** — Prevents multiple watcher instances via `.mikk/watcher.pid`
- - **Atomic writes** — Lock file is written atomically to prevent corruption
- - **Sync state** — Emits `sync:clean` or `sync:drifted` after each cycle
-
- ---
+ ## How It Works

 ### FileWatcher

- Lower-level wrapper around Chokidar with hash-based change detection:
+ Wraps chokidar. Watches `.ts` and `.tsx` files (configurable). On change:
+ 1. Computes a SHA-256 hash of the new file content
+ 2. Compares against the stored hash — skips true no-ops (content unchanged)
+ 3. Emits a typed `FileChangeEvent` with old hash, new hash, and change type

- ```typescript
- import { FileWatcher } from '@getmikk/watcher'
+ The hash store is seeded at startup from the lock file, so first-change dedup works correctly from the beginning.

- const watcher = new FileWatcher(config)
-
- watcher.on((event) => {
- console.log(event.type) // 'added' | 'changed' | 'deleted'
- console.log(event.path) // Absolute file path
- console.log(event.oldHash) // Previous content hash (undefined for 'added')
- console.log(event.newHash) // New content hash (undefined for 'deleted')
- console.log(event.timestamp) // Event timestamp
- console.log(event.affectedModuleIds) // Modules containing this file
- })
-
- await watcher.start()
-
- // Seed with known hashes to detect only actual content changes
- watcher.setHash('/src/index.ts', 'abc123...')
-
- await watcher.stop()
- ```
+ ### WatcherDaemon

- **Hash-based deduplication:** Even if the OS reports a file change, the watcher computes a SHA-256 hash and only emits an event if the content actually changed. This prevents redundant re-analysis from editor auto-saves or format-on-save.
+ Orchestrates everything:

- ---
+ - **Debounce** — collects file change events for 100ms, then flushes as a batch
+ - **Deduplication** — if the same file changes twice in a batch, only the latest event is kept
+ - **Batch threshold** — batches under 15 files → incremental analysis; 15+ files → full re-analysis
+ - **Atomic writes** — lock file written as temp file then renamed; zero corruption risk on crash
+ - **PID file** — `.mikk/watcher.pid` prevents duplicate daemon instances
+ - **Sync state** — `.mikk/sync-state.json` tracks `clean | syncing | drifted | conflict`

 ### IncrementalAnalyzer

- Incrementally updates the dependency graph and lock file for a batch of changed files:
-
- ```typescript
- import { IncrementalAnalyzer } from '@getmikk/watcher'
-
- const analyzer = new IncrementalAnalyzer(graph, lock, contract, projectRoot)
-
- const result = await analyzer.analyzeBatch(events)
-
- console.log(result.graph) // Updated DependencyGraph
- console.log(result.lock) // Updated MikkLock
- console.log(result.impactResult) // ImpactResult from @getmikk/core
- console.log(result.mode) // 'incremental' | 'full'
- ```
-
- **How it works:**
+ Re-parses only changed files, updates graph nodes, and recompiles the lock. O(changed files), not O(whole repo).

- 1. **Small batches (≤15 files)** → incremental mode:
- - Re-parse only changed files
- - Patch the existing graph (remove old nodes/edges, add new ones)
- - Recompute affected hashes only
- - Run impact analysis on changed nodes
+ **Race condition handling:** after parsing a file, re-hashes it. If the hash changed during the parse (the file was modified while being read), re-parses up to 3 times. Accepts the final state after retries are exhausted.

- 2. **Large batches (>15 files)** → full re-analysis:
- - Re-parse all files from scratch
- - Rebuild entire graph
- - Recompute all hashes
-
- **Race-condition protection:** After parsing a file, the analyzer re-hashes it. If the hash changed during parsing (the file was modified again), it retries up to 3 times before falling back to the latest parsed version.
+ **Full re-analysis path:** triggered when batch size exceeds 15 files (e.g. `git checkout`, bulk rename). Re-parses all changed files in parallel, rebuilds the full graph, recompiles the lock.

 ---

- ### Events
-
- All events emitted through the `on()` handler:
+ ## Events

 ```typescript
 type WatcherEvent =
- | { type: 'file:changed'; event: FileChangeEvent }
- | { type: 'module:updated'; moduleId: string }
- | { type: 'graph:updated'; stats: { nodes: number; edges: number } }
- | { type: 'sync:clean' }
- | { type: 'sync:drifted'; driftedModules: string[] }
- ```
+ | { type: 'file:changed'; data: FileChangeEvent }
+ | { type: 'graph:updated'; data: { changedNodes: string[]; impactedNodes: string[] } }
+ | { type: 'sync:drifted'; data: { reason: string; affectedModules: string[] } }

- **`FileChangeEvent`:**
-
- ```typescript
 type FileChangeEvent = {
- type: 'added' | 'changed' | 'deleted'
- path: string
- oldHash?: string
- newHash?: string
+ type: 'changed' | 'added' | 'deleted'
+ path: string // relative to project root
+ oldHash: string | null
+ newHash: string | null
 timestamp: number
 affectedModuleIds: string[]
 }
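The SHA-256 deduplication step the FileWatcher section describes can be sketched in isolation. This is an illustrative sketch, not the package's actual internals; `hashStore` and `shouldEmit` are hypothetical names.

```typescript
// Hash-based change dedup: an OS-level file event only becomes a
// FileChangeEvent when the content hash actually differs from the last
// one seen for that path.
import { createHash } from 'node:crypto'

const hashStore = new Map<string, string>() // path -> last seen content hash

function sha256(content: string): string {
  return createHash('sha256').update(content).digest('hex')
}

/** Returns true only when the content actually changed since last seen. */
function shouldEmit(filePath: string, content: string): boolean {
  const newHash = sha256(content)
  if (hashStore.get(filePath) === newHash) return false // same bytes: no-op save
  hashStore.set(filePath, newHash)
  return true
}
```

Seeding the store from the lock file at startup (as the README notes) means a re-save of unchanged content right after start is also treated as a no-op.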
@@ -219,51 +94,15 @@ type FileChangeEvent = {

 ---

- ## Usage with the CLI
-
- The `mikk watch` command starts the watcher daemon:
-
- ```bash
- mikk watch
- # Watching src/**/*.ts, src/**/*.tsx...
- # [sync:clean] Lock file is up to date
- # [file:changed] src/auth/login.ts
- # [graph:updated] 142 nodes, 87 edges
- # [sync:clean] Lock file updated
- ```
-
- Press `Ctrl+C` to stop.
-
- ---
-
- ## Single-Instance Enforcement
-
- The daemon writes a PID file to `.mikk/watcher.pid` on start and removes it on stop. If another watcher is already running, `start()` will throw an error. This prevents multiple watchers from fighting over the lock file.
-
- ```typescript
- try {
- await daemon.start()
- } catch (err) {
- if (err.message.includes('already running')) {
- console.log('Another watcher is already running')
- }
- }
- ```
-
- ---
-
- ## Types
+ ## Sync State

- ```typescript
- import type {
- FileChangeEvent,
- WatcherConfig,
- WatcherEvent,
- } from '@getmikk/watcher'
- ```
-
- ---
+ Written atomically to `.mikk/sync-state.json` on every transition:

- ## License
+ | Status | Meaning |
+ |--------|---------|
+ | `clean` | Lock file matches filesystem |
+ | `syncing` | Batch in progress |
+ | `drifted` | Analysis failed — lock is stale |
+ | `conflict` | Manual intervention needed |

- [Apache-2.0](../../LICENSE)
+ The MCP server reads sync state to surface staleness warnings on every tool call.
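The "written atomically" claim above rests on the write-temp-then-rename pattern. A minimal sketch, assuming POSIX rename semantics (a rename within one filesystem atomically replaces the target, so a reader sees either the old file or the new one, never a partial write); the function name is illustrative, not the package's API.

```typescript
// Atomic file replacement: write the full payload to a sibling temp file,
// then rename it over the target in a single step.
import * as fs from 'node:fs'
import * as path from 'node:path'

function writeFileAtomic(target: string, contents: string): void {
  // Sibling temp file, so the rename stays on the same filesystem.
  const tmp = path.join(
    path.dirname(target),
    `.${path.basename(target)}.${process.pid}.tmp`,
  )
  fs.writeFileSync(tmp, contents) // full payload lands in the temp file first
  fs.renameSync(tmp, target)      // atomic replace of the target
}
```

A crash between the two calls leaves only a stray temp file behind; the target itself is never half-written.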
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
 "name": "@getmikk/watcher",
- "version": "1.7.1",
+ "version": "1.9.0",
 "license": "Apache-2.0",
 "repository": {
 "type": "git",
@@ -21,7 +21,7 @@
 "dev": "tsc --watch"
 },
 "dependencies": {
- "@getmikk/core": "^1.7.1",
+ "@getmikk/core": "^1.9.0",
 "chokidar": "^4.0.0"
 },
 "devDependencies": {
@@ -4,7 +4,7 @@ import { hashFile } from '@getmikk/core'
 import type { WatcherConfig, WatcherEvent, FileChangeEvent } from './types.js'

 /**
- * FileWatcher — wraps Chokidar to watch filesystem for changes.
+ * FileWatcher -- wraps Chokidar to watch filesystem for changes.
 * Computes hash of changed files and emits typed events.
 */
 export class FileWatcher {
@@ -14,12 +14,12 @@ export class FileWatcher {

 constructor(private config: WatcherConfig) { }

- /** Start watching — non-blocking */
+ /** Start watching -- non-blocking */
 start(): void {
 const excludesRegexes = this.config.exclude.map(
 pattern => new RegExp(pattern.replace(/\*/g, '.*').replace(/\//g, '[\\\\/]'))
 )
- const includeExts = ['.ts', '.tsx']
+ const includeExts = ['.ts', '.tsx', '.js', '.jsx', '.mjs', '.cjs', '.go']

 this.watcher = watch(this.config.projectRoot, {
 ignored: (testPath: string, stats?: import('fs').Stats) => {
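The exclude handling in this hunk converts glob patterns to regexes with a simple character rewrite. Pulled out on its own (same rewrite as the source; the wrapper name is illustrative), it behaves like this:

```typescript
// Same translation as in the diff: '*' becomes '.*', and '/' becomes a
// character class matching either path separator. Note this is a loose
// translation: '**' and '*' are not distinguished, so '*' can cross
// directory boundaries.
function globToRegex(pattern: string): RegExp {
  return new RegExp(pattern.replace(/\*/g, '.*').replace(/\//g, '[\\\\/]'))
}
```

The `[\\/]` class is what lets the same exclude list work against both POSIX and Windows paths.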
@@ -2,7 +2,7 @@ import * as fs from 'node:fs/promises'
 import * as path from 'node:path'
 import {
 getParser, GraphBuilder, ImpactAnalyzer, LockCompiler, hashFile,
- type ParsedFile, type DependencyGraph, type MikkLock, type MikkContract, type ImpactResult
+ type ParsedFile, type DependencyGraph, type MikkLock, type MikkContract, type ImpactResult, type GraphEdge
 } from '@getmikk/core'
 import type { FileChangeEvent } from './types.js'

@@ -13,14 +13,9 @@ const FULL_ANALYSIS_THRESHOLD = 15
 const MAX_RETRIES = 3

 /**
- * IncrementalAnalyzer — re-parses only changed files, updates graph nodes,
- * and recomputes affected module hashes. O(changed files) not O(whole repo).
- *
- * Supports batch analysis: if > 15 files change at once (e.g. git checkout),
- * runs a full re-analysis instead of incremental.
- *
- * Race condition handling: after parsing, re-hashes the file and re-parses
- * if the content changed during parsing (up to 3 retries).
+ * IncrementalAnalyzer — re-parses only changed files, performs a surgical
+ * graph update (removes stale nodes/edges, inserts new ones), then runs
+ * impact analysis over the affected subgraph.
 */
 export class IncrementalAnalyzer {
 private parsedFiles: Map<string, ParsedFile> = new Map()
@@ -30,59 +25,96 @@ export class IncrementalAnalyzer {
 private lock: MikkLock,
 private contract: MikkContract,
 private projectRoot: string
- ) { }
+ ) {
+ if (!this.graph.outEdges) this.graph.outEdges = new Map()
+ if (!this.graph.inEdges) this.graph.inEdges = new Map()
+ }

- /** Handle a batch of file change events (debounced by daemon) */
+ public get fileCount(): number {
+ return this.parsedFiles.size
+ }
+
+ /** Handle a batch of file change events */
 async analyzeBatch(events: FileChangeEvent[]): Promise<{
 graph: DependencyGraph
 lock: MikkLock
 impactResult: ImpactResult
 mode: 'incremental' | 'full'
 }> {
- // If too many changes at once, run full analysis
 if (events.length > FULL_ANALYSIS_THRESHOLD) {
 return this.runFullAnalysis(events)
 }

- // Incremental: process each event, collecting changed file paths
 const changedFilePaths: string[] = []

 for (const event of events) {
 if (event.type === 'deleted') {
 this.parsedFiles.delete(event.path)
- changedFilePaths.push(event.path)
 } else {
 const parsed = await this.parseWithRaceCheck(event.path)
- if (parsed) {
- this.parsedFiles.set(event.path, parsed)
- }
- changedFilePaths.push(event.path)
+ if (parsed) this.parsedFiles.set(event.path, parsed)
 }
+ changedFilePaths.push(event.path)
 }

- // Rebuild graph from all parsed files BEFORE deriving node IDs,
- // so newly-added files are present in the graph when we look them up.
- const allParsedFiles = [...this.parsedFiles.values()]
- const builder = new GraphBuilder()
- this.graph = builder.build(allParsedFiles)
+ // --- Surgical graph update ---
+ const staleNodeIds = new Set<string>(
+ changedFilePaths.flatMap(fp => this.findNodeIdsForFile(fp))
+ )
+
+ for (const nodeId of staleNodeIds) {
+ this.graph.nodes.delete(nodeId)
+ }
+ for (const fp of changedFilePaths) {
+ this.graph.nodes.delete(fp)
+ }
+
+ const allStaleIds = new Set([...staleNodeIds, ...changedFilePaths])
+ this.graph.edges = this.graph.edges.filter(
+ edge => !allStaleIds.has(edge.from) && !allStaleIds.has(edge.to)
+ )
+
+ const changedParsedFiles = changedFilePaths
+ .map(fp => this.parsedFiles.get(fp))
+ .filter((f): f is ParsedFile => f !== undefined)
+
+ if (changedParsedFiles.length > 0) {
+ const miniBuilder = new GraphBuilder()
+ const miniGraph = miniBuilder.build(changedParsedFiles)
+
+ for (const [id, node] of miniGraph.nodes) {
+ this.graph.nodes.set(id, node)
+ }
+
+ for (const edge of miniGraph.edges) {
+ this.graph.edges.push(edge)
+ }
+ }
+
+ // Rebuild adjacency maps
+ this.graph.outEdges = new Map()
+ this.graph.inEdges = new Map()
+ for (const edge of this.graph.edges) {
+ if (!this.graph.outEdges.has(edge.from)) this.graph.outEdges.set(edge.from, [])
+ this.graph.outEdges.get(edge.from)!.push(edge)
+ if (!this.graph.inEdges.has(edge.to)) this.graph.inEdges.set(edge.to, [])
+ this.graph.inEdges.get(edge.to)!.push(edge)
+ }

- // Map changed file paths → graph node IDs using the updated graph
- const changedNodeIds = [...new Set(
- changedFilePaths.flatMap(fp => this.findAffectedNodes(fp))
- )]
+ const changedNodeIds = [
+ ...new Set(changedFilePaths.flatMap(fp => this.findNodeIdsForFile(fp)))
+ ]

- // Run impact analysis on all changed nodes
 const analyzer = new ImpactAnalyzer(this.graph)
 const impactResult = analyzer.analyze(changedNodeIds)

- // Recompile lock
+ const allParsedFiles = [...this.parsedFiles.values()]
 const compiler = new LockCompiler()
- this.lock = compiler.compile(this.graph, this.contract, allParsedFiles)
+ this.lock = compiler.compile(this.graph, this.contract, allParsedFiles, undefined, this.projectRoot)

 return { graph: this.graph, lock: this.lock, impactResult, mode: 'incremental' }
 }

- /** Handle a single file change event */
 async analyze(event: FileChangeEvent): Promise<{
 graph: DependencyGraph
 lock: MikkLock
@@ -92,112 +124,75 @@ export class IncrementalAnalyzer {
 return { graph: result.graph, lock: result.lock, impactResult: result.impactResult }
 }

- /** Add a parsed file to the tracker */
 addParsedFile(file: ParsedFile): void {
 this.parsedFiles.set(file.path, file)
 }

- /** Get the current parsed file count */
- get fileCount(): number {
- return this.parsedFiles.size
- }
-
- // ─── Private ──────────────────────────────────────────────────
-
- /**
- * Parse a file with race-condition detection.
- * After parsing, re-hash the file. If the hash differs from what we started with,
- * the file changed during parsing — re-parse (up to MAX_RETRIES).
- */
 private async parseWithRaceCheck(changedFile: string): Promise<ParsedFile | null> {
 const fullPath = path.join(this.projectRoot, changedFile)
-
 for (let attempt = 0; attempt < MAX_RETRIES; attempt++) {
 try {
 const content = await fs.readFile(fullPath, 'utf-8')
 const parser = getParser(changedFile)
- const parsedFile = parser.parse(changedFile, content)
+ const parsedFile = await parser.parse(changedFile, content)

- // Race condition check: re-hash after parse
 try {
 const postParseHash = await hashFile(fullPath)
- if (postParseHash === parsedFile.hash) {
- return parsedFile // Content stable
- }
- // Content changed during parse — retry
+ if (postParseHash === parsedFile.hash) return parsedFile
 } catch {
- return parsedFile // File may have been deleted, return what we have
+ return parsedFile
 }
 } catch {
- return null // File unreadable
+ return null
 }
 }
-
- // Exhausted retries — parse one final time and accept
- try {
- const content = await fs.readFile(fullPath, 'utf-8')
- const parser = getParser(changedFile)
- return parser.parse(changedFile, content)
- } catch {
- return null
- }
+ return null
 }

- /** Run a full re-analysis (for large batches like git checkout) */
 private async runFullAnalysis(events: FileChangeEvent[]): Promise<{
 graph: DependencyGraph
 lock: MikkLock
 impactResult: ImpactResult
 mode: 'full'
 }> {
- // Remove deleted files
 for (const event of events) {
- if (event.type === 'deleted') {
- this.parsedFiles.delete(event.path)
- }
+ if (event.type === 'deleted') this.parsedFiles.delete(event.path)
 }

- // Re-parse all non-deleted changed files
 const nonDeleted = events.filter(e => e.type !== 'deleted')
 await Promise.all(nonDeleted.map(async (event) => {
 const parsed = await this.parseWithRaceCheck(event.path)
- if (parsed) {
- this.parsedFiles.set(event.path, parsed)
- }
+ if (parsed) this.parsedFiles.set(event.path, parsed)
 }))

- // Full rebuild
 const allParsedFiles = [...this.parsedFiles.values()]
 const builder = new GraphBuilder()
 this.graph = builder.build(allParsedFiles)

 const compiler = new LockCompiler()
- this.lock = compiler.compile(this.graph, this.contract, allParsedFiles)
-
- const changedPaths = events.map(e => e.path)
-
- return {
- graph: this.graph,
- lock: this.lock,
- impactResult: {
- changed: changedPaths,
- impacted: [],
- classified: {
- critical: [],
- high: [],
- medium: [],
- low: [],
- },
- depth: 0,
- confidence: 'low', // Full rebuild = can't determine precise impact
- },
- mode: 'full',
+ this.lock = compiler.compile(this.graph, this.contract, allParsedFiles, undefined, this.projectRoot)
+
+ const impactResult: ImpactResult = {
+ changed: events.map(e => e.path),
+ impacted: [],
+ allImpacted: [],
+ depth: 0,
+ entryPoints: [],
+ criticalModules: [],
+ paths: [],
+ confidence: 1.0,
+ riskScore: 0,
+ classified: { critical: [], high: [], medium: [], low: [] }
 }
+
+ return { graph: this.graph, lock: this.lock, impactResult, mode: 'full' }
 }

- private findAffectedNodes(filePath: string): string[] {
- return [...this.graph.nodes.values()]
- .filter(n => n.file === filePath)
- .map(n => n.id)
+ private findNodeIdsForFile(filePath: string): string[] {
+ const ids: string[] = []
+ for (const [id, node] of this.graph.nodes) {
+ if (node.file === filePath) ids.push(id)
+ }
+ return ids
 }
 }
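The race-check retry loop in this hunk is easiest to see with the filesystem and parser stubbed out. This sketch shows only the control flow; the real `parseWithRaceCheck` is async and uses `fs`, `getParser`, and `hashFile`, while `readAndHash` and `parseSource` here are hypothetical stand-ins, and it is shown synchronously for clarity.

```typescript
// After each parse, re-hash the file. A mismatch means the file was
// modified mid-parse, so loop and re-parse; give up after MAX_RETRIES.
const MAX_RETRIES = 3

type Parsed = { hash: string }

function parseWithRaceCheck(
  readAndHash: () => string, // hash of the file's current on-disk content
  parseSource: () => Parsed, // parse the file, recording the hash it saw
): Parsed | null {
  for (let attempt = 0; attempt < MAX_RETRIES; attempt++) {
    const parsed = parseSource()
    // Re-hash after parsing: equal hashes mean the content was stable.
    if (readAndHash() === parsed.hash) return parsed
  }
  return null // retries exhausted; caller treats the file as unparseable
}
```

One retry usually suffices in practice: a save that lands mid-parse settles by the next iteration.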
@@ -0,0 +1,158 @@
+ import { describe, it, expect } from 'bun:test'
+ import { IncrementalAnalyzer } from '../src/incremental-analyzer.js'
+ import type { MikkLock, DependencyGraph, MikkContract } from '@getmikk/core'
+ import type { FileChangeEvent } from '../src/types.js'
+
+ describe('IncrementalAnalyzer', () => {
+ const mockGraph = (): DependencyGraph => ({
+ nodes: new Map(),
+ edges: [],
+ outEdges: new Map(),
+ inEdges: new Map()
+ })
+
+ const mockLock: MikkLock = {
+ version: '2.0.0',
+ generatedAt: new Date().toISOString(),
+ generatorVersion: '1.0.0',
+ projectRoot: '/project',
+ syncState: {
+ status: 'clean',
+ lastSyncAt: new Date().toISOString(),
+ lockHash: 'abc',
+ contractHash: 'xyz'
+ },
+ files: {
+ 'src/index.ts': {
+ path: 'src/index.ts',
+ hash: 'abc',
+ moduleId: 'root',
+ lastModified: new Date().toISOString(),
+ imports: []
+ }
+ },
+ functions: {},
+ classes: {},
+ modules: {},
+ graph: {
+ nodes: 1,
+ edges: 0,
+ rootHash: 'abc'
+ }
+ }
+
+ const contract: MikkContract = {
+ version: '2.0.0',
+ project: {
+ name: 'test',
+ description: 'test project',
+ language: 'typescript',
+ framework: 'none',
+ entryPoints: []
+ },
+ declared: {
+ modules: [],
+ constraints: [],
+ decisions: []
+ },
+ overwrite: {
+ mode: 'never',
+ requireConfirmation: false
+ },
+ policies: {
+ maxRiskScore: 70,
+ maxImpactNodes: 10,
+ protectedModules: [],
+ enforceStrictBoundaries: true,
+ requireReasoningForCritical: true
+ }
+ }
+
+ it('detects changes correctly without throwing', async () => {
+ const analyzer = new IncrementalAnalyzer(
+ mockGraph(),
+ mockLock,
+ contract,
+ '/project'
+ )
+
+ const event: FileChangeEvent = {
+ path: 'src/index.ts',
+ type: 'changed',
+ oldHash: 'old',
+ newHash: 'abc',
+ timestamp: Date.now(),
+ affectedModuleIds: []
+ }
+ const result = await analyzer.analyze(event)
+ expect(result.graph).toBeDefined()
+ expect(result.lock).toBeDefined()
+ expect(result.impactResult).toBeDefined()
+ })
+
+ describe('Edge Cases and Batch Processing', () => {
+ it('handles file deletions by removing nodes from graph and lock', async () => {
+ const analyzer = new IncrementalAnalyzer(mockGraph(), mockLock, contract, '/project')
+ // First add it
+ analyzer.addParsedFile({
+ path: 'src/to-delete.ts',
+ language: 'typescript',
+ hash: 'foo',
+ parsedAt: Date.now(),
+ functions: [],
+ classes: [],
+ imports: [],
+ exports: [],
+ routes: [],
+ variables: [],
+ generics: [],
+ calls: []
+ })
+ expect(analyzer.fileCount).toBe(1)
+
+ // Now send a deleted event
+ const event: FileChangeEvent = {
+ path: 'src/to-delete.ts',
+ type: 'deleted',
+ oldHash: 'foo',
+ newHash: '',
+ timestamp: Date.now(),
+ affectedModuleIds: []
+ }
+ await analyzer.analyze(event)
+ expect(analyzer.fileCount).toBe(0)
+ })
+
+ it('survives analyze events on completely non-existent OS files gracefully', async () => {
+ const analyzer = new IncrementalAnalyzer(mockGraph(), mockLock, contract, '/project')
+ const event: FileChangeEvent = {
+ path: 'does/not/exist.ts',
+ type: 'changed',
+ oldHash: '',
+ newHash: 'new',
+ timestamp: Date.now(),
+ affectedModuleIds: []
+ }
+ const result = await analyzer.analyzeBatch([event])
+ expect(result.mode).toBe('incremental')
+ expect(result.impactResult).toBeDefined()
+ expect(analyzer.fileCount).toBe(0)
+ })
+
+ it('triggers a full re-analysis if file batch exceeds FULL_ANALYSIS_THRESHOLD (15)', async () => {
+ const analyzer = new IncrementalAnalyzer(mockGraph(), mockLock, contract, '/project')
+ const events: FileChangeEvent[] = Array.from({ length: 16 }).map((_, i) => ({
+ path: `src/file_${i}.ts`,
+ type: 'changed',
+ oldHash: '',
+ newHash: `hash_${i}`,
+ timestamp: Date.now(),
+ affectedModuleIds: []
+ }))
+
+ const result = await analyzer.analyzeBatch(events)
+ expect(result.mode).toBe('full')
+ expect(result.impactResult.confidence).toBe(1.0)
+ })
+ })
+ })
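The batches fed into `analyzeBatch` above come from the daemon's debounce-and-dedupe stage described in the README portion of this diff: events accumulate for the debounce window, the latest event per path wins, then the whole batch flushes at once. A sketch of that stage; `Batcher` is an illustrative name, not one of the package's exports.

```typescript
// Debounced batching: each push resets the window; when it elapses, all
// pending events flush together, deduped to the latest event per path.
type Change = { path: string; type: 'added' | 'changed' | 'deleted' }

class Batcher {
  private pending = new Map<string, Change>()
  private timer: ReturnType<typeof setTimeout> | null = null

  constructor(
    private debounceMs: number,
    private onFlush: (batch: Change[]) => void,
  ) {}

  push(event: Change): void {
    this.pending.set(event.path, event) // same path twice in a window: latest wins
    if (this.timer) clearTimeout(this.timer) // reset the debounce window
    this.timer = setTimeout(() => this.flush(), this.debounceMs)
  }

  flush(): void {
    if (this.timer) clearTimeout(this.timer)
    this.timer = null
    const batch = [...this.pending.values()]
    this.pending.clear()
    if (batch.length > 0) this.onFlush(batch)
  }
}
```

A consumer in the daemon's position would hand the flushed batch to `analyzeBatch`, which then picks incremental versus full mode by batch size.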