prjct-cli 1.16.0 → 1.18.0
- package/CHANGELOG.md +92 -1
- package/core/__tests__/domain/bm25.test.ts +225 -0
- package/core/__tests__/domain/change-propagator.test.ts +100 -0
- package/core/__tests__/domain/file-hasher.test.ts +146 -0
- package/core/__tests__/domain/file-ranker.test.ts +169 -0
- package/core/__tests__/domain/git-cochange.test.ts +121 -0
- package/core/__tests__/domain/import-graph.test.ts +156 -0
- package/core/agentic/smart-context.ts +33 -2
- package/core/commands/analysis.ts +9 -2
- package/core/commands/command-data.ts +3 -1
- package/core/commands/commands.ts +1 -0
- package/core/domain/bm25.ts +525 -0
- package/core/domain/change-propagator.ts +162 -0
- package/core/domain/file-hasher.ts +296 -0
- package/core/domain/file-ranker.ts +151 -0
- package/core/domain/git-cochange.ts +250 -0
- package/core/domain/import-graph.ts +315 -0
- package/core/index.ts +1 -0
- package/core/services/sync-service.ts +133 -2
- package/core/services/watch-service.ts +1 -1
- package/core/types/index.ts +1 -0
- package/core/types/project-sync.ts +20 -0
- package/dist/bin/prjct.mjs +1238 -374
- package/package.json +1 -1
package/CHANGELOG.md
CHANGED
@@ -1,12 +1,103 @@
 # Changelog
 
+## [1.18.0] - 2026-02-09
+
+### Features
+
+- implement incremental sync with file hashing (PRJ-305) (#160)
+
+
+## [1.18.0] - 2026-02-09
+
+### Features
+
+- **Incremental sync**: `prjct sync` now only re-analyzes files that changed since last sync (PRJ-305)
+- File hashing with Bun.hash (xxHash64) — <100ms for 500 files
+- Change propagation through import graph (1-level reverse edges)
+- Conditional index rebuilds: BM25, import graph, co-change only when source files change
+- Conditional agent regeneration: only when config files (package.json, tsconfig.json) change
+- `prjct sync --full` flag to force complete re-analysis
+
+### Implementation Details
+
+New modules:
+- `core/domain/file-hasher.ts` — Hash computation via Bun.hash, SQLite registry using `index_checksums` table, diff detection (added/modified/deleted/unchanged)
+- `core/domain/change-propagator.ts` — Import graph reverse-edge lookup for 1-level change propagation, domain classification for affected files
+
+Modified:
+- `core/services/sync-service.ts` — Incremental decision logic: detect changes → propagate → conditionally rebuild indexes and agents
+- `core/services/watch-service.ts` — Passes accumulated `changedFiles` to sync options
+- `core/types/project-sync.ts` — Added `full`, `changedFiles` to `SyncOptions` + `IncrementalInfo` result type
+- CLI chain (`core/index.ts` → `commands.ts` → `analysis.ts`) — Wired `--full` flag through
+
+### Learnings
+
+- Bun's `fs.readdir` with `withFileTypes` returns `Dirent<NonSharedBuffer>` — need `String()` cast for `.name`
+- Existing `index_checksums` SQLite table was already set up (PRJ-303) — zero schema changes needed
+- Import graph reverse edges (from PRJ-304) enable efficient 1-level propagation without rebuilding the graph
+
+### Test Plan
+
+#### For QA
+1. Run `prjct sync` on fresh project (no hash cache) — should behave as full sync
+2. Run `prjct sync` again without changes — should skip index rebuilds and agent regeneration
+3. Modify a `.ts` file, run `prjct sync` — should detect change and rebuild indexes
+4. Modify `package.json`, run `prjct sync` — should regenerate agents
+5. Run `prjct sync --full` — should force complete re-analysis
+6. Run `prjct watch`, change a file — should pass changedFiles to sync
+
+#### For Users
+**What changed:** `prjct sync` is now incremental by default.
+**How to use:** No changes needed. Use `prjct sync --full` to force complete re-analysis.
+**Breaking changes:** None
+
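The diff detection described in the 1.18.0 notes (added/modified/deleted/unchanged) comes down to comparing two path→hash maps. A minimal standalone sketch of that step — illustrative only, not the package's actual `core/domain/file-hasher.ts`:

```typescript
// Sketch of hash-map diffing for incremental sync. FileHash is trimmed to
// the fields the comparison needs; the real type also carries size/mtime.
interface FileHash {
  path: string
  hash: string
}

interface FileDiff {
  added: string[]
  modified: string[]
  deleted: string[]
  unchanged: string[]
}

function diffHashes(current: Map<string, FileHash>, stored: Map<string, FileHash>): FileDiff {
  const diff: FileDiff = { added: [], modified: [], deleted: [], unchanged: [] }
  // Walk current files: new path => added, changed hash => modified.
  for (const [p, f] of current) {
    const prev = stored.get(p)
    if (!prev) diff.added.push(p)
    else if (prev.hash !== f.hash) diff.modified.push(p)
    else diff.unchanged.push(p)
  }
  // Anything stored but no longer present was deleted.
  for (const p of stored.keys()) {
    if (!current.has(p)) diff.deleted.push(p)
  }
  return diff
}
```

On a first sync the stored map is empty, so every file lands in `added` — which is why a fresh project behaves like a full sync.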
+## [1.17.0] - 2026-02-09
+
+### Features
+
+- implement BM25 + import graph + git co-change for zero-cost file selection (PRJ-304) (#159)
+
+
+## [1.17.0] - 2026-02-08
+
+### Features
+- **BM25 + import graph + git co-change file selection** (PRJ-304): Zero-cost file selection using three mathematical signals combined into a weighted ranker. Replaces keyword matching in smart-context with precision that matches LLM-based classification — at zero API cost.
+
+### Implementation Details
+- `core/domain/bm25.ts` — BM25 indexer: tokenizes files (exports, functions, imports, comments, path segments), builds inverted index, scores queries using Okapi BM25 (k1=1.2, b=0.75). Stores in SQLite kv_store.
+- `core/domain/import-graph.ts` — Import graph builder: parses TS/JS imports to build directed adjacency list, follows chains 2 levels deep, scores by proximity (1/(depth+1)).
+- `core/domain/git-cochange.ts` — Git co-change analyzer: parses last 100 commits, builds Jaccard similarity matrix for file pairs that change together.
+- `core/domain/file-ranker.ts` — Combined ranker: `BM25 × 0.5 + imports × 0.3 + cochange × 0.2`, normalizes each signal to [0,1], returns top 15 files.
+- `core/agentic/smart-context.ts` — Uses ranker when indexes exist, graceful fallback to regex-based domain filtering.
+- `core/services/sync-service.ts` — Builds all 3 indexes in parallel during `prjct sync`.
+
+### Learnings
+- `tokenizeQuery` must split camelCase BEFORE lowercasing — otherwise "getUserById" becomes "getuserbyid" and doesn't split
+- Jaccard similarity > cosine for co-change because data is binary (file present or not in commit)
+- Batch file reads (50 at a time) needed for indexing performance on large projects
+- Stop words list must include code keywords (import, export, const) to reduce noise in scoring
+
+### Test Plan
+
+#### For QA
+1. Run `prjct sync` — verify BM25, import graph, and co-change indexes build without errors
+2. Query "Fix auth middleware" — verify auth-related files rank higher than unrelated files
+3. Query "Build responsive dashboard" — verify frontend files rank higher than backend files
+4. Verify index rebuild time <5 seconds on 300+ file project
+5. Verify query time <50ms
+6. Verify zero API calls during file selection
+
+#### For Users
+**What changed:** File selection during task context is now powered by BM25 text search, import graph proximity, and git co-change analysis instead of keyword matching.
+**How to use:** Run `p. sync` to build indexes — file selection is automatic and more accurate.
+**Breaking changes:** None. Falls back to previous filtering if indexes don't exist.
+
 ## [1.16.0] - 2026-02-09
 
 ### Features
 
 - remove JSON storage redundancy, SQLite-only backend (PRJ-303) (#158)
 
-
 ## [1.16.0] - 2026-02-08
 
 ### Features
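The 1.17.0 learning that camelCase must be split before lowercasing is easy to demonstrate with a toy tokenizer. This is a sketch only — the stop-word list below is a tiny hypothetical stand-in for the package's real list, which also filters code keywords:

```typescript
// Toy query tokenizer: split camelCase FIRST, then lowercase, then filter.
// If we lowercased first, 'getUserById' would collapse to the single token
// 'getuserbyid' and the camelCase boundaries would be unrecoverable.
const STOP = new Set(['the', 'in', 'for', 'and', 'fix']) // hypothetical, minimal
STOP.delete('fix') // keep action verbs like 'fix' scoreable

function splitCamel(word: string): string[] {
  // Insert a space before each lower->upper boundary, then split.
  return word.replace(/([a-z0-9])([A-Z])/g, '$1 $2').split(/\s+/)
}

function tokenizeQuery(query: string): string[] {
  return query
    .split(/[^A-Za-z0-9]+/)
    .flatMap(splitCamel) // split BEFORE lowercasing
    .map((t) => t.toLowerCase())
    .filter((t) => t.length > 2 && !STOP.has(t))
}
```

With this ordering, `tokenizeQuery('update getUserById function')` yields tokens including `get` and `user`, while short fragments like `by` and `id` fall to the length filter — matching the behavior the bm25 tests below assert.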
package/core/__tests__/domain/bm25.test.ts
ADDED
@@ -0,0 +1,225 @@
+/**
+ * Tests for BM25 Text Search Index
+ */
+
+import { afterEach, beforeEach, describe, expect, it } from 'bun:test'
+import fs from 'node:fs/promises'
+import os from 'node:os'
+import path from 'node:path'
+import { buildIndex, score, tokenizeFile, tokenizeQuery } from '../../domain/bm25'
+
+// =============================================================================
+// Tokenization Tests
+// =============================================================================
+
+describe('BM25', () => {
+  describe('tokenizeFile', () => {
+    it('should extract path segments', () => {
+      const tokens = tokenizeFile('', 'core/domain/bm25.ts')
+      expect(tokens).toContain('core')
+      expect(tokens).toContain('domain')
+      expect(tokens).toContain('bm25')
+    })
+
+    it('should extract function names and split camelCase', () => {
+      const content = 'export function getUserById(id: string) { return id }'
+      const tokens = tokenizeFile(content, 'user-service.ts')
+      expect(tokens).toContain('get')
+      expect(tokens).toContain('user')
+      // 'by' and 'id' are filtered (stop word / too short)
+    })
+
+    it('should extract class names', () => {
+      const content = 'export class AuthMiddleware { handle() {} }'
+      const tokens = tokenizeFile(content, 'middleware.ts')
+      expect(tokens).toContain('auth')
+      expect(tokens).toContain('middleware')
+    })
+
+    it('should extract interface names', () => {
+      const content = 'export interface JwtPayload { sub: string }'
+      const tokens = tokenizeFile(content, 'types.ts')
+      expect(tokens).toContain('jwt')
+      expect(tokens).toContain('payload')
+    })
+
+    it('should extract import sources', () => {
+      const content = `import { Router } from './router'\nimport express from 'express'`
+      const tokens = tokenizeFile(content, 'app.ts')
+      expect(tokens).toContain('router')
+      expect(tokens).toContain('express')
+    })
+
+    it('should extract words from comments', () => {
+      const content = '// Handle authentication for JWT tokens'
+      const tokens = tokenizeFile(content, 'auth.ts')
+      expect(tokens).toContain('handle')
+      expect(tokens).toContain('authentication')
+      expect(tokens).toContain('jwt')
+      expect(tokens).toContain('tokens')
+    })
+
+    it('should extract words from JSDoc comments', () => {
+      const content = '/** Validates user session and refreshes token */'
+      const tokens = tokenizeFile(content, 'session.ts')
+      expect(tokens).toContain('validates')
+      expect(tokens).toContain('session')
+      expect(tokens).toContain('refreshes')
+      expect(tokens).toContain('token')
+    })
+
+    it('should filter out stop words', () => {
+      const content = 'export function getTheData() {}'
+      const tokens = tokenizeFile(content, 'data.ts')
+      expect(tokens).not.toContain('the')
+      expect(tokens).not.toContain('export')
+      expect(tokens).not.toContain('function')
+    })
+
+    it('should handle empty content', () => {
+      const tokens = tokenizeFile('', 'empty.ts')
+      // Should still have path segments
+      expect(tokens).toContain('empty')
+    })
+  })
+
+  describe('tokenizeQuery', () => {
+    it('should tokenize a task description', () => {
+      const tokens = tokenizeQuery('Fix the auth middleware for JWT validation')
+      expect(tokens).toContain('fix')
+      expect(tokens).toContain('auth')
+      expect(tokens).toContain('middleware')
+      expect(tokens).toContain('jwt')
+      expect(tokens).toContain('validation')
+    })
+
+    it('should split camelCase in queries', () => {
+      const tokens = tokenizeQuery('update getUserById function')
+      expect(tokens).toContain('update')
+      expect(tokens).toContain('get')
+      expect(tokens).toContain('user')
+    })
+
+    it('should remove stop words from queries', () => {
+      const tokens = tokenizeQuery('Fix the bug in the login')
+      expect(tokens).not.toContain('the')
+      expect(tokens).not.toContain('in')
+      expect(tokens).toContain('fix')
+      expect(tokens).toContain('bug')
+      expect(tokens).toContain('login')
+    })
+  })
+
+  // =============================================================================
+  // Index Building & Scoring Tests
+  // =============================================================================
+
+  describe('buildIndex + score', () => {
+    let testDir: string
+
+    beforeEach(async () => {
+      testDir = path.join(os.tmpdir(), `prjct-bm25-test-${Date.now()}`)
+      await fs.mkdir(testDir, { recursive: true })
+    })
+
+    afterEach(async () => {
+      try {
+        await fs.rm(testDir, { recursive: true, force: true })
+      } catch {
+        // Ignore cleanup errors
+      }
+    })
+
+    it('should build an index from project files', async () => {
+      // Create test files
+      await fs.writeFile(
+        path.join(testDir, 'auth.ts'),
+        `export class AuthService {\n validateJwt(token: string) {}\n refreshSession() {}\n}`
+      )
+      await fs.writeFile(
+        path.join(testDir, 'middleware.ts'),
+        `import { AuthService } from './auth'\nexport function authMiddleware(req: any) {}`
+      )
+      await fs.writeFile(
+        path.join(testDir, 'button.tsx'),
+        `export function Button({ label }: { label: string }) {\n return <button>{label}</button>\n}`
+      )
+
+      const index = await buildIndex(testDir)
+
+      expect(index.totalDocs).toBe(3)
+      expect(index.avgDocLength).toBeGreaterThan(0)
+      expect(Object.keys(index.documents)).toContain('auth.ts')
+      expect(Object.keys(index.documents)).toContain('middleware.ts')
+      expect(Object.keys(index.documents)).toContain('button.tsx')
+    })
+
+    it('should rank auth files higher for auth query', async () => {
+      await fs.writeFile(
+        path.join(testDir, 'auth.ts'),
+        `export class AuthService {\n // Authenticate user with JWT\n validateJwt(token: string) {}\n refreshSession() {}\n}`
+      )
+      await fs.writeFile(
+        path.join(testDir, 'middleware.ts'),
+        `import { AuthService } from './auth'\n// Auth middleware for JWT validation\nexport function authMiddleware(req: any) {}`
+      )
+      await fs.writeFile(
+        path.join(testDir, 'button.tsx'),
+        `// Render a UI button component\nexport function Button({ label }: { label: string }) {\n return <button>{label}</button>\n}`
+      )
+
+      const index = await buildIndex(testDir)
+      const results = score('Fix the auth middleware for JWT validation', index)
+
+      // Auth and middleware should rank higher than button
+      expect(results.length).toBeGreaterThan(0)
+      const authIndex = results.findIndex((r) => r.path === 'auth.ts')
+      const middlewareIndex = results.findIndex((r) => r.path === 'middleware.ts')
+      const buttonIndex = results.findIndex((r) => r.path === 'button.tsx')
+
+      expect(authIndex).toBeLessThan(buttonIndex === -1 ? Infinity : buttonIndex)
+      expect(middlewareIndex).toBeLessThan(buttonIndex === -1 ? Infinity : buttonIndex)
+    })
+
+    it('should rank frontend files higher for UI query', async () => {
+      await fs.writeFile(
+        path.join(testDir, 'dashboard.tsx'),
+        `// Responsive dashboard with charts and data grid\nexport function Dashboard() {\n return <div className="dashboard">Charts here</div>\n}`
+      )
+      await fs.writeFile(
+        path.join(testDir, 'api-handler.ts'),
+        `// Handle API requests for user data\nexport function handleRequest(req: any) { return {} }`
+      )
+
+      const index = await buildIndex(testDir)
+      const results = score('Build responsive dashboard', index)
+
+      expect(results.length).toBeGreaterThan(0)
+      expect(results[0].path).toBe('dashboard.tsx')
+    })
+
+    it('should skip node_modules', async () => {
+      await fs.mkdir(path.join(testDir, 'node_modules', 'pkg'), { recursive: true })
+      await fs.writeFile(path.join(testDir, 'node_modules', 'pkg', 'index.ts'), 'export default {}')
+      await fs.writeFile(path.join(testDir, 'app.ts'), 'export function main() {}')
+
+      const index = await buildIndex(testDir)
+      expect(index.totalDocs).toBe(1)
+      expect(Object.keys(index.documents)).toContain('app.ts')
+    })
+
+    it('should handle empty query', async () => {
+      await fs.writeFile(path.join(testDir, 'app.ts'), 'export function main() {}')
+
+      const index = await buildIndex(testDir)
+      const results = score('', index)
+      expect(results).toEqual([])
+    })
+
+    it('should handle empty project', async () => {
+      const index = await buildIndex(testDir)
+      expect(index.totalDocs).toBe(0)
+      expect(score('anything', index)).toEqual([])
+    })
+  })
+})
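The Okapi BM25 math these tests exercise (k1=1.2, b=0.75, per the changelog) reduces to a short scorer. A toy in-memory sketch over pre-tokenized documents — not the package's SQLite-backed index, and the IDF variant shown is one common choice:

```typescript
// Minimal Okapi BM25 scorer: for each query term t and document d,
// score += IDF(t) * tf * (k1 + 1) / (tf + k1 * (1 - b + b * |d| / avgdl))
// with IDF(t) = ln(1 + (N - df + 0.5) / (df + 0.5)).
const K1 = 1.2
const B = 0.75

type Doc = { path: string; tokens: string[] }

function bm25Scores(queryTokens: string[], docs: Doc[]): { path: string; score: number }[] {
  const N = docs.length
  const avgLen = docs.reduce((s, d) => s + d.tokens.length, 0) / Math.max(N, 1)
  // Document frequency: how many docs contain each term at least once.
  const df = new Map<string, number>()
  for (const d of docs) {
    for (const t of new Set(d.tokens)) df.set(t, (df.get(t) ?? 0) + 1)
  }
  return docs
    .map((d) => {
      let total = 0
      for (const t of queryTokens) {
        const tf = d.tokens.filter((x) => x === t).length
        if (tf === 0) continue
        const f = df.get(t) ?? 0
        const idf = Math.log(1 + (N - f + 0.5) / (f + 0.5))
        total += (idf * tf * (K1 + 1)) / (tf + K1 * (1 - B + (B * d.tokens.length) / avgLen))
      }
      return { path: d.path, score: total }
    })
    .filter((r) => r.score > 0)
    .sort((a, b) => b.score - a.score)
}
```

The `b` term penalizes long documents relative to the average length, which is why `avgDocLength` is stored on the index alongside `totalDocs`.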
package/core/__tests__/domain/change-propagator.test.ts
ADDED
@@ -0,0 +1,100 @@
+import { describe, expect, test } from 'bun:test'
+import { affectedDomains, propagateChanges } from '../../domain/change-propagator'
+import type { FileDiff } from '../../domain/file-hasher'
+
+describe('change-propagator', () => {
+  // =========================================================================
+  // propagateChanges
+  // =========================================================================
+
+  describe('propagateChanges', () => {
+    test('returns direct changes when no import graph exists', () => {
+      const diff: FileDiff = {
+        added: ['src/new.ts'],
+        modified: ['src/changed.ts'],
+        deleted: ['src/removed.ts'],
+        unchanged: ['src/same.ts'],
+      }
+
+      // Use a fake projectId that won't have a graph
+      const result = propagateChanges(diff, 'nonexistent-project')
+
+      expect(result.directlyChanged).toEqual(['src/new.ts', 'src/changed.ts'])
+      expect(result.affectedByImports).toEqual([])
+      expect(result.allAffected).toEqual(['src/new.ts', 'src/changed.ts'])
+      expect(result.deleted).toEqual(['src/removed.ts'])
+    })
+
+    test('empty diff returns empty propagation', () => {
+      const diff: FileDiff = {
+        added: [],
+        modified: [],
+        deleted: [],
+        unchanged: ['src/a.ts', 'src/b.ts'],
+      }
+
+      const result = propagateChanges(diff, 'nonexistent-project')
+
+      expect(result.directlyChanged).toEqual([])
+      expect(result.affectedByImports).toEqual([])
+      expect(result.allAffected).toEqual([])
+    })
+  })
+
+  // =========================================================================
+  // affectedDomains
+  // =========================================================================
+
+  describe('affectedDomains', () => {
+    test('detects frontend files', () => {
+      const domains = affectedDomains([
+        'src/components/Button.tsx',
+        'src/pages/Home.jsx',
+        'styles/main.css',
+      ])
+      expect(domains.has('frontend')).toBe(true)
+      expect(domains.has('uxui')).toBe(true)
+    })
+
+    test('detects backend files', () => {
+      const domains = affectedDomains(['core/services/auth.ts', 'core/domain/user.ts'])
+      expect(domains.has('backend')).toBe(true)
+    })
+
+    test('detects testing files', () => {
+      const domains = affectedDomains([
+        'core/__tests__/auth.test.ts',
+        'src/components/Button.spec.tsx',
+      ])
+      expect(domains.has('testing')).toBe(true)
+    })
+
+    test('detects devops files', () => {
+      const domains = affectedDomains(['Dockerfile', '.github/workflows/ci.yml'])
+      expect(domains.has('devops')).toBe(true)
+    })
+
+    test('detects database files', () => {
+      const domains = affectedDomains(['prisma/schema.prisma', 'db/migrations/001.sql'])
+      expect(domains.has('database')).toBe(true)
+    })
+
+    test('handles mixed domain files', () => {
+      const domains = affectedDomains([
+        'src/components/Form.tsx', // frontend + uxui
+        'core/services/api.ts', // backend
+        'Dockerfile', // devops
+        'core/__tests__/api.test.ts', // testing + backend
+      ])
+      expect(domains.has('frontend')).toBe(true)
+      expect(domains.has('backend')).toBe(true)
+      expect(domains.has('devops')).toBe(true)
+      expect(domains.has('testing')).toBe(true)
+    })
+
+    test('empty file list returns empty domains', () => {
+      const domains = affectedDomains([])
+      expect(domains.size).toBe(0)
+    })
+  })
+})
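The path-based domain classification exercised above can be sketched with a small pattern table. The patterns below are illustrative guesses consistent with these tests, not the package's actual rules in `core/domain/change-propagator.ts`:

```typescript
// Classify changed file paths into domains via regex rules. Each rule may
// contribute multiple domains (e.g. .tsx files are both frontend and uxui).
function affectedDomains(files: string[]): Set<string> {
  const rules: [RegExp, string[]][] = [
    [/\.(tsx|jsx|css|scss)$/, ['frontend', 'uxui']],
    [/(^|\/)core\/(services|domain)\//, ['backend']],
    [/\.(test|spec)\.[tj]sx?$/, ['testing']],
    [/(^|\/)(Dockerfile$|\.github\/workflows\/)/, ['devops']],
    [/(\.prisma$|\.sql$|\/migrations\/)/, ['database']],
  ]
  const domains = new Set<string>()
  for (const f of files) {
    for (const [re, ds] of rules) {
      if (re.test(f)) for (const d of ds) domains.add(d)
    }
  }
  return domains
}
```

A file can hit several rules at once — `Button.spec.tsx` is both `testing` and `frontend`/`uxui` — which is the "mixed domain files" case above.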
package/core/__tests__/domain/file-hasher.test.ts
ADDED
@@ -0,0 +1,146 @@
+import { describe, expect, test } from 'bun:test'
+import path from 'node:path'
+import { computeHashes, diffHashes, type FileHash } from '../../domain/file-hasher'
+
+describe('file-hasher', () => {
+  // =========================================================================
+  // diffHashes
+  // =========================================================================
+
+  describe('diffHashes', () => {
+    const makeHash = (filePath: string, hash: string): FileHash => ({
+      path: filePath,
+      hash,
+      size: 100,
+      mtime: '2026-01-01T00:00:00.000Z',
+    })
+
+    test('detects added files', () => {
+      const current = new Map<string, FileHash>([
+        ['src/new-file.ts', makeHash('src/new-file.ts', 'xxh64:abc')],
+        ['src/existing.ts', makeHash('src/existing.ts', 'xxh64:def')],
+      ])
+      const stored = new Map<string, FileHash>([
+        ['src/existing.ts', makeHash('src/existing.ts', 'xxh64:def')],
+      ])
+
+      const diff = diffHashes(current, stored)
+      expect(diff.added).toEqual(['src/new-file.ts'])
+      expect(diff.modified).toEqual([])
+      expect(diff.unchanged).toEqual(['src/existing.ts'])
+      expect(diff.deleted).toEqual([])
+    })
+
+    test('detects modified files', () => {
+      const current = new Map<string, FileHash>([
+        ['src/changed.ts', makeHash('src/changed.ts', 'xxh64:new-hash')],
+      ])
+      const stored = new Map<string, FileHash>([
+        ['src/changed.ts', makeHash('src/changed.ts', 'xxh64:old-hash')],
+      ])
+
+      const diff = diffHashes(current, stored)
+      expect(diff.added).toEqual([])
+      expect(diff.modified).toEqual(['src/changed.ts'])
+      expect(diff.unchanged).toEqual([])
+      expect(diff.deleted).toEqual([])
+    })
+
+    test('detects deleted files', () => {
+      const current = new Map<string, FileHash>()
+      const stored = new Map<string, FileHash>([
+        ['src/removed.ts', makeHash('src/removed.ts', 'xxh64:abc')],
+      ])
+
+      const diff = diffHashes(current, stored)
+      expect(diff.added).toEqual([])
+      expect(diff.modified).toEqual([])
+      expect(diff.unchanged).toEqual([])
+      expect(diff.deleted).toEqual(['src/removed.ts'])
+    })
+
+    test('handles empty maps', () => {
+      const diff = diffHashes(new Map(), new Map())
+      expect(diff.added).toEqual([])
+      expect(diff.modified).toEqual([])
+      expect(diff.unchanged).toEqual([])
+      expect(diff.deleted).toEqual([])
+    })
+
+    test('handles first sync (no stored hashes)', () => {
+      const current = new Map<string, FileHash>([
+        ['src/a.ts', makeHash('src/a.ts', 'xxh64:1')],
+        ['src/b.ts', makeHash('src/b.ts', 'xxh64:2')],
+        ['src/c.ts', makeHash('src/c.ts', 'xxh64:3')],
+      ])
+      const stored = new Map<string, FileHash>()
+
+      const diff = diffHashes(current, stored)
+      expect(diff.added).toHaveLength(3)
+      expect(diff.modified).toEqual([])
+      expect(diff.unchanged).toEqual([])
+      expect(diff.deleted).toEqual([])
+    })
+
+    test('mixed changes: added + modified + deleted + unchanged', () => {
+      const current = new Map<string, FileHash>([
+        ['src/new.ts', makeHash('src/new.ts', 'xxh64:new')],
+        ['src/changed.ts', makeHash('src/changed.ts', 'xxh64:v2')],
+        ['src/same.ts', makeHash('src/same.ts', 'xxh64:same')],
+      ])
+      const stored = new Map<string, FileHash>([
+        ['src/changed.ts', makeHash('src/changed.ts', 'xxh64:v1')],
+        ['src/same.ts', makeHash('src/same.ts', 'xxh64:same')],
+        ['src/gone.ts', makeHash('src/gone.ts', 'xxh64:gone')],
+      ])
+
+      const diff = diffHashes(current, stored)
+      expect(diff.added).toEqual(['src/new.ts'])
+      expect(diff.modified).toEqual(['src/changed.ts'])
+      expect(diff.unchanged).toEqual(['src/same.ts'])
+      expect(diff.deleted).toEqual(['src/gone.ts'])
+    })
+  })
+
+  // =========================================================================
+  // computeHashes (integration — reads actual files)
+  // =========================================================================
+
+  describe('computeHashes', () => {
+    test('computes hashes for project files', async () => {
+      // Hash the prjct-cli project itself (small subset)
+      const projectPath = path.resolve(__dirname, '..', '..', '..')
+      const hashes = await computeHashes(projectPath)
+
+      // Should find many files
+      expect(hashes.size).toBeGreaterThan(50)
+
+      // Check a known file exists
+      const packageJson = hashes.get('package.json')
+      expect(packageJson).toBeDefined()
+      expect(packageJson!.hash).toMatch(/^(xxh64|fnv1a):/)
+      expect(packageJson!.size).toBeGreaterThan(0)
+    })
+
+    test('excludes node_modules and .git', async () => {
+      const projectPath = path.resolve(__dirname, '..', '..', '..')
+      const hashes = await computeHashes(projectPath)
+
+      for (const [filePath] of hashes) {
+        expect(filePath).not.toContain('node_modules')
+        expect(filePath).not.toContain('.git/')
+      }
+    })
+
+    test('hash is deterministic', async () => {
+      const projectPath = path.resolve(__dirname, '..', '..', '..')
+      const hashes1 = await computeHashes(projectPath)
+      const hashes2 = await computeHashes(projectPath)
+
+      // Same file should produce same hash
+      const pkg1 = hashes1.get('package.json')
+      const pkg2 = hashes2.get('package.json')
+      expect(pkg1?.hash).toBe(pkg2?.hash)
+    })
+  })
+})
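One learning from 1.17.0 is worth a worked sketch: because "file appears in commit" is binary data, git co-change uses Jaccard similarity rather than cosine. A minimal standalone version — illustrative only, not the package's `core/domain/git-cochange.ts`:

```typescript
// Jaccard similarity over commit sets: |A ∩ B| / |A ∪ B|.
// For binary membership data this is a natural fit; cosine would need
// vectorizing each file over all commits for the same effect.
function jaccard(a: Set<string>, b: Set<string>): number {
  let inter = 0
  for (const x of a) if (b.has(x)) inter++
  const union = a.size + b.size - inter
  return union === 0 ? 0 : inter / union
}

// commitsByFile maps a file path to the set of commit hashes that touched it
// (in the package, built from the last 100 commits).
function cochangeScore(
  commitsByFile: Map<string, Set<string>>,
  fileA: string,
  fileB: string
): number {
  const a = commitsByFile.get(fileA) ?? new Set<string>()
  const b = commitsByFile.get(fileB) ?? new Set<string>()
  return jaccard(a, b)
}
```

Two files that always change together score 1, files that never co-occur score 0, and the result slots directly into the ranker's normalized [0,1] co-change signal.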