@getmikk/watcher 1.7.0 → 1.8.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,217 +1,92 @@
- # @getmikk/watcher
+ # @getmikk/watcher
 
- > Your architecture map, always in sync, zero manual re-analysis.
+ > Live file watcher daemon: incremental, debounced, atomic.
 
  [![npm](https://img.shields.io/npm/v/@getmikk/watcher)](https://www.npmjs.com/package/@getmikk/watcher)
  [![License: Apache-2.0](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](../../LICENSE)
 
- `@getmikk/watcher` keeps `mikk.lock.json` in sync with your source code in real time. It uses Chokidar to watch for file events, batches and debounces changes (so saving 20 files triggers one re-analysis, not twenty), incrementally re-parses only affected files, patches the dependency graph, recomputes Merkle hashes, and writes the lock file atomically — all with a PID-based singleton that prevents duplicate daemon processes.
+ Background daemon that keeps `mikk.lock.json` in sync as you edit code. Detects file changes via chokidar, re-parses only what changed, updates the lock atomically, and emits typed events for downstream consumers.
 
- The result: your AI context, impact analysis, and contract validation are always based on the current state of your codebase, not a stale snapshot.
-
- > Part of [Mikk](../../README.md) — the codebase nervous system for AI-assisted development.
+ > Part of [Mikk](../../README.md): live architectural context for your AI agent.
 
  ---
 
- ## Installation
+ ## Usage
+
+ Started via the CLI:
 
  ```bash
- npm install @getmikk/watcher
- # or
- bun add @getmikk/watcher
+ mikk watch
  ```
 
- **Peer dependency:** `@getmikk/core`
-
- ---
-
- ## Quick Start
+ Or programmatically:
 
  ```typescript
 import { WatcherDaemon } from '@getmikk/watcher'
 
 const daemon = new WatcherDaemon({
-  projectRoot: process.cwd(),
-  include: ['src/**/*.ts', 'src/**/*.tsx'],
-  exclude: ['node_modules', 'dist', '.mikk'],
+  projectRoot: '/path/to/project',
+  include: ['**/*.ts', '**/*.tsx'],
+  exclude: ['**/node_modules/**', '**/dist/**'],
   debounceMs: 100,
 })
 
 daemon.on((event) => {
-  switch (event.type) {
-    case 'file:changed':
-      console.log(`Changed: ${event.path}`)
-      break
-    case 'graph:updated':
-      console.log('Dependency graph rebuilt')
-      break
-    case 'sync:clean':
-      console.log('Lock file is in sync')
-      break
-    case 'sync:drifted':
-      console.log('Lock file has drifted')
-      break
+  if (event.type === 'graph:updated') {
+    console.log(`Graph updated: ${event.data.changedNodes.length} changed`)
   }
 })
 
 await daemon.start()
-// Lock file is now kept in sync automatically
-
-// Later...
-await daemon.stop()
-```
-
- ---
-
-## Architecture
-
-```
-Filesystem Events (Chokidar)
-        │
-        ▼
-┌─────────────┐
-│ FileWatcher │ ← Hash computation, deduplication
-└──────┬──────┘
-       │ FileChangeEvent[]
-       ▼
-┌──────────────────┐
-│  WatcherDaemon   │ ← Debouncing (100ms), batching
-└──────┬───────────┘
-       │ Batch of events
-       ▼
-┌─────────────────────┐
-│ IncrementalAnalyzer │ ← Re-parse, graph patch, hash update
-└──────────┬──────────┘
-           │
-           ▼
-  Atomic lock file write
 ```
 
  ---
 
- ## API Reference
-
- ### WatcherDaemon
-
- The main entry point — a long-running process that keeps the lock file in sync.
-
- ```typescript
- import { WatcherDaemon } from '@getmikk/watcher'
-
- const daemon = new WatcherDaemon(config)
- ```
-
- **`WatcherConfig`:**
-
- | Field | Type | Default | Description |
- |-------|------|---------|-------------|
- | `projectRoot` | `string` | — | Absolute path to the project |
- | `include` | `string[]` | `['**/*.ts']` | Glob patterns for watched files |
- | `exclude` | `string[]` | `['node_modules']` | Glob patterns to ignore |
- | `debounceMs` | `number` | `100` | Debounce window in milliseconds |
-
- **Methods:**
-
- | Method | Description |
- |--------|-------------|
- | `start()` | Start watching. Creates PID file at `.mikk/watcher.pid` for single-instance enforcement |
- | `stop()` | Stop watching. Cleans up PID file |
- | `on(handler)` | Register event handler |
-
- **Features:**
-
- - **Debouncing** — Batches rapid file changes (e.g., save-all) into a single analysis pass
- - **PID file** — Prevents multiple watcher instances via `.mikk/watcher.pid`
- - **Atomic writes** — Lock file is written atomically to prevent corruption
- - **Sync state** — Emits `sync:clean` or `sync:drifted` after each cycle
-
- ---
+ ## How It Works
 
  ### FileWatcher
 
- Lower-level wrapper around Chokidar with hash-based change detection:
+ Wraps chokidar. Watches `.ts` and `.tsx` files (configurable). On change:
+ 1. Computes a SHA-256 hash of the new file content
+ 2. Compares against the stored hash — skips true no-ops (content unchanged)
+ 3. Emits a typed `FileChangeEvent` with old hash, new hash, and change type
 
- ```typescript
- import { FileWatcher } from '@getmikk/watcher'
+ The hash store is seeded at startup from the lock file, so first-change dedup works correctly from the beginning.
 
- const watcher = new FileWatcher(config)
-
- watcher.on((event) => {
-   console.log(event.type)              // 'added' | 'changed' | 'deleted'
-   console.log(event.path)              // Absolute file path
-   console.log(event.oldHash)           // Previous content hash (undefined for 'added')
-   console.log(event.newHash)           // New content hash (undefined for 'deleted')
-   console.log(event.timestamp)         // Event timestamp
-   console.log(event.affectedModuleIds) // Modules containing this file
- })
-
- await watcher.start()
-
- // Seed with known hashes to detect only actual content changes
- watcher.setHash('/src/index.ts', 'abc123...')
-
- await watcher.stop()
- ```
+ ### WatcherDaemon
 
- **Hash-based deduplication:** Even if the OS reports a file change, the watcher computes a SHA-256 hash and only emits an event if the content actually changed. This prevents redundant re-analysis from editor auto-saves or format-on-save.
+ Orchestrates everything:
 
- ---
+ - **Debounce** — collects file change events for 100ms, then flushes as a batch
+ - **Deduplication** — if the same file changes twice in a batch, only the latest event is kept
+ - **Batch threshold** — batches of 15 or fewer files → incremental analysis; more than 15 → full re-analysis
+ - **Atomic writes** — lock file written as a temp file then renamed; no corruption risk on crash
+ - **PID file** — `.mikk/watcher.pid` prevents duplicate daemon instances
+ - **Sync state** — `.mikk/sync-state.json` tracks `clean | syncing | drifted | conflict`
 
  ### IncrementalAnalyzer
 
- Incrementally updates the dependency graph and lock file for a batch of changed files:
-
- ```typescript
- import { IncrementalAnalyzer } from '@getmikk/watcher'
-
- const analyzer = new IncrementalAnalyzer(graph, lock, contract, projectRoot)
-
- const result = await analyzer.analyzeBatch(events)
-
- console.log(result.graph)        // Updated DependencyGraph
- console.log(result.lock)         // Updated MikkLock
- console.log(result.impactResult) // ImpactResult from @getmikk/core
- console.log(result.mode)         // 'incremental' | 'full'
- ```
-
- **How it works:**
+ Re-parses only changed files, updates graph nodes, and recompiles the lock. O(changed files), not O(whole repo).
 
- 1. **Small batches (≤15 files)** → Incremental mode:
-    - Re-parse only changed files
-    - Patch the existing graph (remove old nodes/edges, add new ones)
-    - Recompute affected hashes only
-    - Run impact analysis on changed nodes
+ **Race condition handling:** after parsing a file, the analyzer re-hashes it. If the hash changed during the parse (the file was modified while being read), it re-parses up to 3 times, then accepts the final state once retries are exhausted.
 
- 2. **Large batches (>15 files)** → Full re-analysis:
-    - Re-parse all files from scratch
-    - Rebuild entire graph
-    - Recompute all hashes
-
- **Race-condition protection:** After parsing a file, the analyzer re-hashes it. If the hash changed during parsing (the file was modified again), it retries up to 3 times before falling back to the latest parsed version.
+ **Full re-analysis path:** triggered when the batch size exceeds 15 files (e.g. `git checkout`, bulk rename). Re-parses all changed files in parallel, rebuilds the full graph, recompiles the lock.
 
  ---
 
- ### Events
-
- All events emitted through the `on()` handler:
+ ## Events
 
  ```typescript
 type WatcherEvent =
-  | { type: 'file:changed'; event: FileChangeEvent }
-  | { type: 'module:updated'; moduleId: string }
-  | { type: 'graph:updated'; stats: { nodes: number; edges: number } }
-  | { type: 'sync:clean' }
-  | { type: 'sync:drifted'; driftedModules: string[] }
-```
+  | { type: 'file:changed'; data: FileChangeEvent }
+  | { type: 'graph:updated'; data: { changedNodes: string[]; impactedNodes: string[] } }
+  | { type: 'sync:drifted'; data: { reason: string; affectedModules: string[] } }
 
-**`FileChangeEvent`:**
-
-```typescript
 type FileChangeEvent = {
-  type: 'added' | 'changed' | 'deleted'
-  path: string
-  oldHash?: string
-  newHash?: string
+  type: 'changed' | 'added' | 'deleted'
+  path: string // relative to project root
+  oldHash: string | null
+  newHash: string | null
   timestamp: number
   affectedModuleIds: string[]
 }
@@ -219,51 +94,15 @@ type FileChangeEvent = {
 
  ---
 
- ## Usage with the CLI
-
- The `mikk watch` command starts the watcher daemon:
-
- ```bash
- mikk watch
- # Watching src/**/*.ts, src/**/*.tsx...
- # [sync:clean] Lock file is up to date
- # [file:changed] src/auth/login.ts
- # [graph:updated] 142 nodes, 87 edges
- # [sync:clean] Lock file updated
- ```
-
- Press `Ctrl+C` to stop.
-
- ---
-
- ## Single-Instance Enforcement
-
- The daemon writes a PID file to `.mikk/watcher.pid` on start and removes it on stop. If another watcher is already running, `start()` will throw an error. This prevents multiple watchers from fighting over the lock file.
-
- ```typescript
- try {
-   await daemon.start()
- } catch (err) {
-   if (err.message.includes('already running')) {
-     console.log('Another watcher is already running')
-   }
- }
- ```
-
- ---
-
- ## Types
+ ## Sync State
 
- ```typescript
- import type {
-   FileChangeEvent,
-   WatcherConfig,
-   WatcherEvent,
- } from '@getmikk/watcher'
- ```
-
- ---
+ Written atomically to `.mikk/sync-state.json` on every transition:
 
- ## License
+ | Status | Meaning |
+ |--------|---------|
+ | `clean` | Lock file matches filesystem |
+ | `syncing` | Batch in progress |
+ | `drifted` | Analysis failed — lock is stale |
+ | `conflict` | Manual intervention needed |
 
- [Apache-2.0](../../LICENSE)
+ The MCP server reads sync state to surface staleness warnings on every tool call.
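The debounce-and-dedup behaviour the new README describes (collect events for a quiet window, keep only the latest event per file, flush as one batch) can be sketched as follows. This is an illustrative stand-in, not the package's actual API; `Batcher` and every name in it are hypothetical.

```typescript
// Sketch of debounce + per-path deduplication. Events are keyed by path
// (a later event for the same file replaces the earlier one), and the
// batch is flushed only after `debounceMs` of quiet.
type ChangeType = 'added' | 'changed' | 'deleted'
interface Change { path: string; type: ChangeType }

class Batcher {
  private pending = new Map<string, Change>()
  private timer: ReturnType<typeof setTimeout> | null = null

  constructor(
    private debounceMs: number,
    private onFlush: (batch: Change[]) => void,
  ) {}

  push(change: Change): void {
    // Deduplicate: latest event per path wins.
    this.pending.set(change.path, change)
    // Restart the quiet-period timer on every new event.
    if (this.timer) clearTimeout(this.timer)
    this.timer = setTimeout(() => this.flush(), this.debounceMs)
  }

  private flush(): void {
    const batch = [...this.pending.values()]
    this.pending.clear()
    this.timer = null
    this.onFlush(batch)
  }
}
```

Saving the same file twice plus one other file inside the window would therefore produce a single batch of two events, which matches the "saving 20 files triggers one re-analysis" behaviour the old README advertised.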
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@getmikk/watcher",
-   "version": "1.7.0",
+   "version": "1.8.0",
    "license": "Apache-2.0",
    "repository": {
      "type": "git",
@@ -21,7 +21,7 @@
      "dev": "tsc --watch"
    },
    "dependencies": {
-     "@getmikk/core": "^1.7.0",
+     "@getmikk/core": "^1.8.0",
      "chokidar": "^4.0.0"
    },
    "devDependencies": {
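The "atomic writes" behaviour described in the README (lock file written as a temp file, then renamed) can be sketched generically. `writeAtomic` is a hypothetical helper, not the package's implementation; the key property is that `rename` within one directory is atomic on POSIX filesystems, so a crash mid-write leaves either the old file or the new one, never a torn mix.

```typescript
import { promises as fs } from 'node:fs'
import path from 'node:path'

// Write to a hidden sibling temp file first, then atomically swap it in.
async function writeAtomic(filePath: string, data: string): Promise<void> {
  const tmp = path.join(
    path.dirname(filePath),
    `.${path.basename(filePath)}.tmp`,
  )
  await fs.writeFile(tmp, data, 'utf-8')
  await fs.rename(tmp, filePath) // atomic replacement of the target
}
```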
package/src/daemon.ts CHANGED
@@ -70,6 +70,16 @@ export class WatcherDaemon {
      this.analyzer.addParsedFile(file)
    }
 
+   // Seed the file watcher's hash store with initial hashes so the first
+   // change to any file can be properly deduplicated by content.
+   const initialHashes = new Map<string, string>()
+   for (const file of parsedFiles) {
+     if (file.hash) {
+       initialHashes.set(file.path.replace(/\\/g, '/'), file.hash)
+     }
+   }
+   this.watcher.seedHashes(initialHashes)
+
    // Subscribe to file changes with debouncing
    this.watcher.on(async (event: WatcherEvent) => {
      if (event.type === 'file:changed') {
@@ -4,7 +4,7 @@ import { hashFile } from '@getmikk/core'
 import type { WatcherConfig, WatcherEvent, FileChangeEvent } from './types.js'
 
 /**
- * FileWatcher wraps Chokidar to watch filesystem for changes.
+ * FileWatcher -- wraps Chokidar to watch filesystem for changes.
  * Computes hash of changed files and emits typed events.
  */
 export class FileWatcher {
@@ -14,27 +14,39 @@ export class FileWatcher {
 
   constructor(private config: WatcherConfig) { }
 
-  /** Start watching non-blocking */
+  /** Start watching -- non-blocking */
   start(): void {
-    this.watcher = watch(this.config.include, {
-      ignored: this.config.exclude,
+    const excludesRegexes = this.config.exclude.map(
+      pattern => new RegExp(pattern.replace(/\*/g, '.*').replace(/\//g, '[\\\\/]'))
+    )
+    const includeExts = ['.ts', '.tsx', '.js', '.jsx', '.mjs', '.cjs', '.go']
+
+    this.watcher = watch(this.config.projectRoot, {
+      ignored: (testPath: string, stats?: import('fs').Stats) => {
+        // Ignore matching exclude patterns
+        if (excludesRegexes.some(r => r.test(testPath))) return true
+        // Keep directories so we can recurse
+        if (!stats || stats.isDirectory()) return false
+        // Ignore non-matching file extensions
+        return !includeExts.some(ext => testPath.endsWith(ext))
+      },
      cwd: this.config.projectRoot,
      ignoreInitial: true,
      persistent: true,
      awaitWriteFinish: {
-       stabilityThreshold: 50,
-       pollInterval: 10,
+       stabilityThreshold: 300,
+       pollInterval: 50,
      },
    })
 
-   this.watcher.on('change', (relativePath: string) => {
-     this.handleChange(relativePath, 'changed')
-   })
-   this.watcher.on('add', (relativePath: string) => {
-     this.handleChange(relativePath, 'added')
-   })
-   this.watcher.on('unlink', (relativePath: string) => {
-     this.handleChange(relativePath, 'deleted')
+   this.watcher.on('all', (event, relativePath) => {
+     if (event === 'change') {
+       this.handleChange(relativePath, 'changed')
+     } else if (event === 'add') {
+       this.handleChange(relativePath, 'added')
+     } else if (event === 'unlink') {
+       this.handleChange(relativePath, 'deleted')
+     }
    })
  }
 
@@ -49,11 +61,18 @@ export class FileWatcher {
     this.handlers.push(handler)
   }
 
-  /** Set initial hash for a file */
+  /** Seed the initial hash for a file (called at startup for all known files) */
   setHash(filePath: string, hash: string): void {
     this.hashStore.set(filePath, hash)
   }
 
+  /** Bulk-seed hashes for all known files so first-change dedup works correctly */
+  seedHashes(entries: ReadonlyMap<string, string>): void {
+    for (const [p, h] of entries) {
+      this.hashStore.set(p.replace(/\\/g, '/'), h)
+    }
+  }
+
   private async handleChange(relativePath: string, type: FileChangeEvent['type']): Promise<void> {
     const fullPath = path.join(this.config.projectRoot, relativePath)
     const normalizedPath = relativePath.replace(/\\/g, '/')
@@ -68,7 +87,8 @@ export class FileWatcher {
      }
    }
 
-   if (oldHash === newHash) return // Content unchanged
+   // Skip only when both hashes are known and identical (true no-op change)
+   if (oldHash !== null && newHash !== null && oldHash === newHash) return
 
    if (newHash) this.hashStore.set(normalizedPath, newHash)
    if (type === 'deleted') this.hashStore.delete(normalizedPath)
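The hash-based deduplication that `handleChange` performs can be sketched in isolation. All names below are illustrative stand-ins, not the package's code; the point is that an OS-level event only becomes a real change when the SHA-256 of the content differs from the last stored hash.

```typescript
import { createHash } from 'node:crypto'

// Last-seen content hash per normalized path.
const hashStore = new Map<string, string>()

function sha256(content: string): string {
  return createHash('sha256').update(content).digest('hex')
}

/** Returns true only when the file's content actually changed. */
function isRealChange(filePath: string, content: string): boolean {
  const newHash = sha256(content)
  if (hashStore.get(filePath) === newHash) {
    return false // e.g. editor re-save with identical bytes: suppress
  }
  hashStore.set(filePath, newHash)
  return true
}
```

This is why seeding the store at startup matters: without known initial hashes, the very first event for every file would always look like a real change.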
@@ -44,31 +44,36 @@ export class IncrementalAnalyzer {
      return this.runFullAnalysis(events)
    }
 
-   // Incremental: process each event
-   let combinedChanged: string[] = []
-   let combinedImpacted: string[] = []
+   // Incremental: process each event, collecting changed file paths
+   const changedFilePaths: string[] = []
 
    for (const event of events) {
      if (event.type === 'deleted') {
        this.parsedFiles.delete(event.path)
-       combinedChanged.push(event.path)
+       changedFilePaths.push(event.path)
      } else {
        const parsed = await this.parseWithRaceCheck(event.path)
        if (parsed) {
          this.parsedFiles.set(event.path, parsed)
        }
-       combinedChanged.push(...this.findAffectedNodes(event.path))
+       changedFilePaths.push(event.path)
      }
    }
 
-   // Rebuild graph from all parsed files
+   // Rebuild graph from all parsed files BEFORE deriving node IDs,
+   // so newly-added files are present in the graph when we look them up.
    const allParsedFiles = [...this.parsedFiles.values()]
    const builder = new GraphBuilder()
    this.graph = builder.build(allParsedFiles)
 
+   // Map changed file paths → graph node IDs using the updated graph
+   const changedNodeIds = [...new Set(
+     changedFilePaths.flatMap(fp => this.findAffectedNodes(fp))
+   )]
+
    // Run impact analysis on all changed nodes
    const analyzer = new ImpactAnalyzer(this.graph)
-   const impactResult = analyzer.analyze([...new Set(combinedChanged)])
+   const impactResult = analyzer.analyze(changedNodeIds)
 
    // Recompile lock
    const compiler = new LockCompiler()
@@ -111,7 +116,7 @@ export class IncrementalAnalyzer {
    try {
      const content = await fs.readFile(fullPath, 'utf-8')
      const parser = getParser(changedFile)
-     const parsedFile = parser.parse(changedFile, content)
+     const parsedFile = await parser.parse(changedFile, content)
 
      // Race condition check: re-hash after parse
      try {
@@ -132,7 +137,7 @@ export class IncrementalAnalyzer {
    try {
      const content = await fs.readFile(fullPath, 'utf-8')
      const parser = getParser(changedFile)
-     return parser.parse(changedFile, content)
+     return await parser.parse(changedFile, content)
    } catch {
      return null
    }
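The parse-retry loop behind `parseWithRaceCheck` (re-hash after parsing; if the file changed mid-parse, parse again, up to 3 times) can be sketched generically. The function below is a simplified, hypothetical stand-in: `read`, `parse`, and `hashOf` are injected callbacks, not the package's actual signatures.

```typescript
// Parse a file, then verify its content did not change while we were
// parsing; if it did, re-read and re-parse, bounded by maxRetries.
async function parseWithRaceCheck<T>(
  read: () => Promise<string>,
  parse: (content: string) => T,
  hashOf: (content: string) => string,
  maxRetries = 3,
): Promise<T> {
  let content = await read()
  let parsed = parse(content)
  for (let i = 0; i < maxRetries; i++) {
    const latest = await read()
    if (hashOf(latest) === hashOf(content)) break // stable: no mid-parse edit
    // File changed while we were parsing: retry with the newer content.
    content = latest
    parsed = parse(content)
  }
  // After retries are exhausted, the last parsed state is accepted.
  return parsed
}
```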
@@ -0,0 +1,91 @@
+ import { describe, it, expect } from 'bun:test'
+ import { IncrementalAnalyzer } from '../src/incremental-analyzer.js'
+ import type { MikkLock, ParsedFile } from '@getmikk/core'
+
+ describe('IncrementalAnalyzer', () => {
+   it('detects changes correctly without throwing', async () => {
+     const mockLock: MikkLock = {
+       version: '1',
+       lastUpdated: new Date().toISOString(),
+       files: {
+         'src/index.ts': {
+           hash: 'abc',
+           lastModified: new Date().toISOString(),
+           path: 'src/index.ts',
+           moduleId: 'root',
+           imports: [],
+           exports: []
+         }
+       },
+       functions: {},
+       classes: {},
+       modules: {}
+     }
+
+     const analyzer = new IncrementalAnalyzer(
+       { nodes: new Map(), edges: [] },
+       mockLock,
+       {
+         project: { name: 'test', language: 'typescript', framework: null },
+         declared: { modules: [], constraints: [], decisions: [] },
+         overwrite: { mode: 'never', requireConfirmation: false }
+       },
+       '/project'
+     )
+
+     // This simulates a file change event
+     const result = await analyzer.analyze({ path: 'src/index.ts', type: 'modified' })
+     expect(result.graph).toBeDefined()
+     expect(result.lock).toBeDefined()
+     expect(result.impactResult).toBeDefined()
+   })
+
+   describe('Edge Cases and Batch Processing', () => {
+     const mockLock: MikkLock = {
+       version: '1',
+       lastUpdated: new Date().toISOString(),
+       files: {}, functions: {}, classes: {}, modules: {}
+     }
+
+     const contract = {
+       project: { name: 'test', language: 'typescript', framework: null },
+       declared: { modules: [], constraints: [], decisions: [] },
+       overwrite: { mode: 'never', requireConfirmation: false }
+     }
+
+     it('handles file deletions by removing nodes from graph and lock', async () => {
+       const analyzer = new IncrementalAnalyzer({ nodes: new Map(), edges: [] }, mockLock, contract, '/project')
+       // First add it
+       analyzer.addParsedFile({ path: 'src/to-delete.ts', language: 'ts', hash: 'foo', parsedAt: Date.now(), functions: [], classes: [], imports: [], exports: [], routes: [] })
+       expect(analyzer.fileCount).toBe(1)
+
+       // Now send a deleted event
+       await analyzer.analyze({ path: 'src/to-delete.ts', type: 'deleted' })
+       expect(analyzer.fileCount).toBe(0)
+     })
+
+     it('survives analyze events on completely non-existent OS files gracefully', async () => {
+       const analyzer = new IncrementalAnalyzer({ nodes: new Map(), edges: [] }, mockLock, contract, '/project')
+       // Will fail to fs.readFile inside parseWithRaceCheck
+       const result = await analyzer.analyze({ path: 'does/not/exist.ts', type: 'modified' })
+       expect(result.mode).toBeUndefined() // Returns incremental by default
+       expect(result.impactResult).toBeDefined()
+       // Should not have crashed the analyzer
+       expect(analyzer.fileCount).toBe(0)
+     })
+
+     it('triggers a full re-analysis if file batch exceeds FULL_ANALYSIS_THRESHOLD (15)', async () => {
+       const analyzer = new IncrementalAnalyzer({ nodes: new Map(), edges: [] }, mockLock, contract, '/project')
+       const events = Array.from({ length: 16 }).map((_, i) => ({
+         path: `src/file_${i}.ts`,
+         type: 'modified' as const
+       }))
+
+       const result = await analyzer.analyzeBatch(events)
+       // It should have hit runFullAnalysis, which returns mode: 'full'
+       expect(result.mode).toBe('full')
+       // It will also have gracefully continued despite the 16 files failing to load off disk
+       expect(result.impactResult.confidence).toBe('low')
+     })
+   })
+ })
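The `.mikk/watcher.pid` single-instance enforcement mentioned in the README (write a PID file on start, refuse to start if another live watcher holds it) can be sketched as follows. This is an assumed, minimal approach using a signal-0 liveness probe; `acquirePidFile` and `isAlive` are hypothetical names, not the package's implementation, and it notably does not handle PID reuse.

```typescript
import { promises as fs } from 'node:fs'
import process from 'node:process'

// Signal 0 probes whether a PID exists without delivering a real signal.
function isAlive(pid: number): boolean {
  try {
    process.kill(pid, 0)
    return true
  } catch {
    return false // ESRCH: no such process (or no permission)
  }
}

// Claim the PID file; throw if another live instance already holds it.
// A stale file left by a dead process is silently reclaimed.
async function acquirePidFile(pidPath: string): Promise<void> {
  let existing: number | null = null
  try {
    existing = parseInt(await fs.readFile(pidPath, 'utf-8'), 10)
  } catch {
    // No PID file yet: nothing is running.
  }
  if (existing !== null && !Number.isNaN(existing) && isAlive(existing)) {
    throw new Error(`watcher already running (pid ${existing})`)
  }
  await fs.writeFile(pidPath, String(process.pid), 'utf-8')
}
```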