@blamechris/repo-memory 0.4.0 → 0.5.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +124 -33
- package/dist/server.js +1 -1
- package/dist/telemetry/tokens.d.ts +19 -0
- package/dist/telemetry/tokens.d.ts.map +1 -0
- package/dist/telemetry/tokens.js +25 -0
- package/dist/telemetry/tokens.js.map +1 -0
- package/dist/tools/force-reread.d.ts.map +1 -1
- package/dist/tools/force-reread.js +4 -0
- package/dist/tools/force-reread.js.map +1 -1
- package/dist/tools/get-file-summary.d.ts.map +1 -1
- package/dist/tools/get-file-summary.js +6 -0
- package/dist/tools/get-file-summary.js.map +1 -1
- package/dist/tools/invalidate.d.ts.map +1 -1
- package/dist/tools/invalidate.js +4 -0
- package/dist/tools/invalidate.js.map +1 -1
- package/dist/tools/search-by-purpose.d.ts.map +1 -1
- package/dist/tools/search-by-purpose.js +12 -1
- package/dist/tools/search-by-purpose.js.map +1 -1
- package/package.json +1 -1
package/README.md
CHANGED
@@ -1,15 +1,17 @@
 # repo-memory

-An MCP server that gives AI coding agents persistent memory about your codebase.
+An MCP server that gives AI coding agents persistent memory about your codebase. Stop wasting tokens re-reading files your agent already understands.

 ## Why?

-AI
-
--
-- **
-- **
-- **
+Every time an AI agent explores your project, it re-reads files from scratch — burning tokens on code it's already seen. On a 200-file project, that's **~43,000 tokens wasted per exploration pass**.
+
+repo-memory fixes this:
+- **Caches file summaries** — exports, imports, purpose, declarations, line count
+- **Tracks changes** — only re-reads files that actually changed (SHA-256 hash comparison)
+- **Dependency graphs** — understands which files depend on which
+- **Task memory** — remembers what's been explored across conversation turns
+- **Token telemetry** — measures and proves the savings

 ## Quick Start

@@ -32,31 +34,112 @@ npm install -g @blamechris/repo-memory
 repo-memory # starts MCP server on stdio
 ```

+## How It Works
+
+### The problem
+Your agent wants to understand `src/server.ts`. Normally it reads the whole file — 300 lines, ~800 tokens. But it really just needs: "what does this file export, import, and do?" That answer is ~200 tokens.
+
+### The flow
+
+**First access (cache miss):**
+1. Agent calls `get_file_summary("src/server.ts")`
+2. repo-memory reads the file, SHA-256 hashes it, extracts a summary via regex (exports, imports, purpose, declarations, line count)
+3. Stores the hash + summary in SQLite (`.repo-memory/cache.db` in your project)
+4. Returns the compact summary
+5. No savings yet — we had to read the file anyway
+
+**Every subsequent access (cache hit):**
+1. Agent calls `get_file_summary("src/server.ts")` again
+2. repo-memory reads and hashes the file — hash matches what's stored
+3. Returns the cached summary instantly, without re-parsing
+4. Savings logged: `(full file tokens) - (summary tokens)` = tokens your agent didn't consume
+
+**When files change:**
+- The hash won't match, so repo-memory generates a fresh summary automatically
+- You never get stale data
+
+The savings compound fast. An agent exploring a project touches the same files 3-5 times per session. First pass costs full price. Every subsequent hit returns a tiny summary instead of the full file — that's where the ~3.6x compression ratio comes from.
+
 ## Tools

 | Tool | Description |
 |------|-------------|
 | `get_file_summary` | Cached file summary (exports, imports, purpose) |
+| `batch_file_summaries` | Get summaries for multiple files at once |
 | `get_changed_files` | Files changed since last check |
 | `get_project_map` | Structural overview of project |
+| `search_by_purpose` | Search files by purpose/exports keywords |
+| `get_related_files` | Find related files ranked by relevance |
+| `get_dependency_graph` | File dependency relationships |
+| `create_task` / `get_task_context` / `mark_explored` | Track investigation progress across turns |
+| `get_token_report` | Token usage and savings report |
 | `force_reread` | Force fresh summary generation |
 | `invalidate` | Clear cache entries |
-| `get_dependency_graph` | File dependency relationships |
-| `create_task` | Create investigation task |
-| `get_task_context` | Task state and explored files |
-| `mark_explored` | Mark file as explored for task |
-| `get_token_report` | Token usage telemetry report |
-| `get_related_files` | Find related files ranked by relevance |
-| `batch_file_summaries` | Get summaries for multiple files at once |
-| `search_by_purpose` | Search files by purpose/exports keywords |

-##
+## Token Savings Tracking
+
+repo-memory tracks every cache interaction so you can measure exactly how many tokens you're saving. Call `get_token_report` at any time to see your stats.
+
+### What gets tracked
+
+| Event | When | Tokens Recorded |
+|-------|------|-----------------|
+| `cache_hit` | Summary served from cache (hash unchanged) | Tokens saved (raw file - summary) |
+| `cache_miss` | File changed or first access | 0 (no savings on first read) |
+| `force_reread` | Explicit re-read requested | Raw file token count |
+| `invalidation` | Cache entry cleared | — |
+| `summary_served` | File matched via `search_by_purpose` | Estimated raw file tokens |
+
+### How savings are calculated
+
+Token estimates use the standard heuristic of **~4 characters per token**, which closely matches major LLM tokenizers (cl100k_base, o200k_base).
+
+For each cache hit:
+```
+tokensSaved = ceil(rawFileChars / 4) - ceil(summaryJsonChars / 4)
+```

-
-
-
-
-
+- **rawFileChars** — the full file contents your agent would have consumed
+- **summaryJsonChars** — the compact summary served instead (purpose, exports, imports, declarations, line count)
+
+The reported savings represent real tokens that never entered your context window.
+
+### Querying your savings
+
+```
+# All-time stats
+get_token_report()
+
+# Last 24 hours
+get_token_report(period: "last_n_hours", hours: 24)
+
+# Current session only
+get_token_report(period: "session", session_id: "<id>")
+
+# With cache health diagnostics
+get_token_report(include_diagnostics: true)
+```
+
+The report includes:
+- **Cache hit ratio** — percentage of requests served from cache
+- **Estimated tokens saved** — cumulative tokens your agent didn't consume
+- **Top files** — most frequently accessed files and their token impact
+- **Event breakdown** — counts by event type
+
+## Performance
+
+Benchmarks measured on synthetic TypeScript projects with realistic imports and class structures:
+
+| Scenario | Files | Raw Size | Summary Size | Compression | Tokens Saved | Speed |
+|----------|-------|----------|--------------|-------------|--------------|-------|
+| Explore project | 10 | 11.7 KB | 3.3 KB | 3.6x | ~2,100 | 3.7 ms/file |
+| Explore project | 50 | 58.0 KB | 16.2 KB | 3.6x | ~10,700 | 0.7 ms/file |
+| Explore project | 100 | 116.1 KB | 32.3 KB | 3.6x | ~21,500 | 0.4 ms/file |
+| Explore project | 200 | 233.4 KB | 65.7 KB | 3.6x | ~42,900 | 0.3 ms/file |
+
+~3.6x compression ratio at all scales. Sub-millisecond per file on cached reads.
+
+Run benchmarks yourself: `npm run benchmark`

 ## Architecture

@@ -66,26 +149,34 @@ MCP Server (stdio transport)
 ├── Indexer Pipeline (scanner, summarizer, imports, diff-analyzer)
 ├── Dependency Graph (in-memory adjacency maps backed by SQLite)
 ├── Task Memory (CRUD, exploration tracking, frontier)
-├── Telemetry (token tracking, sampling, export)
+├── Telemetry (token tracking, sampling, export, retention)
 ├── Session Manager (cross-turn persistence)
 └── Persistence Layer (SQLite with WAL mode)
 ```

-##
+## Configuration

-
+Create a `.repo-memory.json` in your project root to customize behavior:

-
-
-
-
-
-
-
+```json
+{
+  "ignore": ["dist", "node_modules", "*.generated.ts"],
+  "maxFiles": 5000,
+  "gc": {
+    "cacheMaxAgeDays": 30
+  }
+}
+```

-
+## Language Support

-
+Summaries are extracted via regex analysis. Supported languages:
+- **TypeScript / JavaScript** — exports, imports, declarations, purpose classification
+- **Python** — functions, classes, `__all__`, `from`/`import` statements
+- **Go** — exported names (uppercase), imports, type/func/var/const declarations
+- **Rust** — `pub` items, `use`/`mod` statements, structs/enums/traits/impls
+
+Config files (JSON, YAML, TOML) and other file types get basic classification.

 ## Development

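The savings formula in the README hunk above can be sanity-checked with a short standalone sketch. The helper names and constants here are illustrative, not the package's exports:

```typescript
// Hedged sketch of the README's savings arithmetic.
const CHARS_PER_TOKEN = 4;

// ~4 characters per token, rounded up.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / CHARS_PER_TOKEN);
}

// tokensSaved = ceil(rawFileChars / 4) - ceil(summaryJsonChars / 4)
function tokensSaved(rawFileChars: number, summaryJsonChars: number): number {
  return (
    Math.ceil(rawFileChars / CHARS_PER_TOKEN) -
    Math.ceil(summaryJsonChars / CHARS_PER_TOKEN)
  );
}

// A ~3,200-character file served as an ~800-character summary:
console.log(tokensSaved(3200, 800)); // 800 - 200 = 600 tokens saved
```

At the README's quoted ~3.6x compression ratio, roughly 70% of a file's estimated tokens are avoided on every cache hit.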
package/dist/server.js
CHANGED
@@ -18,7 +18,7 @@ import { SessionManager } from './memory/session.js';
 import { loadConfig } from './config.js';
 const server = new McpServer({
     name: 'repo-memory',
-    version: '0.
+    version: '0.5.0',
 });
 server.registerTool('get_file_summary', {
     title: 'Get File Summary',
package/dist/telemetry/tokens.d.ts
ADDED
@@ -0,0 +1,19 @@
+/**
+ * Token estimation utilities for telemetry tracking.
+ *
+ * Uses the widely-accepted heuristic of ~4 characters per token for English
+ * text and code. This matches OpenAI's documentation and empirical analysis
+ * across major LLM tokenizers (cl100k_base, o200k_base).
+ *
+ * For cache hit savings, we calculate:
+ *   tokensAvoided = rawFileTokens - summaryTokens
+ *
+ * This represents the tokens the agent would have consumed reading the full
+ * file, minus the (much smaller) summary that was served instead.
+ */
+import type { FileSummary } from '../types.js';
+/** Estimate token count from a string's character length. */
+export declare function estimateTokens(text: string): number;
+/** Estimate tokens saved by serving a summary instead of raw file contents. */
+export declare function estimateTokensSaved(rawContents: string, summary: FileSummary): number;
+//# sourceMappingURL=tokens.d.ts.map
package/dist/telemetry/tokens.d.ts.map
ADDED
@@ -0,0 +1 @@
+{"version":3,"file":"tokens.d.ts","sourceRoot":"","sources":["../../src/telemetry/tokens.ts"],"names":[],"mappings":"AAAA;;;;;;;;;;;;GAYG;AAEH,OAAO,KAAK,EAAE,WAAW,EAAE,MAAM,aAAa,CAAC;AAI/C,6DAA6D;AAC7D,wBAAgB,cAAc,CAAC,IAAI,EAAE,MAAM,GAAG,MAAM,CAEnD;AAED,+EAA+E;AAC/E,wBAAgB,mBAAmB,CAAC,WAAW,EAAE,MAAM,EAAE,OAAO,EAAE,WAAW,GAAG,MAAM,CAIrF"}
package/dist/telemetry/tokens.js
ADDED
@@ -0,0 +1,25 @@
+/**
+ * Token estimation utilities for telemetry tracking.
+ *
+ * Uses the widely-accepted heuristic of ~4 characters per token for English
+ * text and code. This matches OpenAI's documentation and empirical analysis
+ * across major LLM tokenizers (cl100k_base, o200k_base).
+ *
+ * For cache hit savings, we calculate:
+ *   tokensAvoided = rawFileTokens - summaryTokens
+ *
+ * This represents the tokens the agent would have consumed reading the full
+ * file, minus the (much smaller) summary that was served instead.
+ */
+const CHARS_PER_TOKEN = 4;
+/** Estimate token count from a string's character length. */
+export function estimateTokens(text) {
+    return Math.ceil(text.length / CHARS_PER_TOKEN);
+}
+/** Estimate tokens saved by serving a summary instead of raw file contents. */
+export function estimateTokensSaved(rawContents, summary) {
+    const rawTokens = estimateTokens(rawContents);
+    const summaryTokens = estimateTokens(JSON.stringify(summary));
+    return Math.max(0, rawTokens - summaryTokens);
+}
+//# sourceMappingURL=tokens.js.map
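The `estimateTokensSaved` helper added above clamps at zero, so a summary larger than its file never reports negative savings. A self-contained replica for illustration (the summary object here is a plain stand-in, not the package's full `FileSummary` type):

```typescript
// Replica of the shipped helpers, for demonstration only.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

function estimateTokensSaved(rawContents: string, summary: object): number {
  const rawTokens = estimateTokens(rawContents);
  const summaryTokens = estimateTokens(JSON.stringify(summary));
  return Math.max(0, rawTokens - summaryTokens); // clamped at zero
}

const summary = { purpose: "demo", exports: ["a"], lineCount: 100 };
console.log(estimateTokensSaved("x".repeat(4000), summary)); // raw tokens minus the summary's tokens
console.log(estimateTokensSaved("", summary)); // 0 (never negative)
```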
package/dist/telemetry/tokens.js.map
ADDED
@@ -0,0 +1 @@
+{"version":3,"file":"tokens.js","sourceRoot":"","sources":["../../src/telemetry/tokens.ts"],"names":[],"mappings":"AAAA;;;;;;;;;;;;GAYG;AAIH,MAAM,eAAe,GAAG,CAAC,CAAC;AAE1B,6DAA6D;AAC7D,MAAM,UAAU,cAAc,CAAC,IAAY;IACzC,OAAO,IAAI,CAAC,IAAI,CAAC,IAAI,CAAC,MAAM,GAAG,eAAe,CAAC,CAAC;AAClD,CAAC;AAED,+EAA+E;AAC/E,MAAM,UAAU,mBAAmB,CAAC,WAAmB,EAAE,OAAoB;IAC3E,MAAM,SAAS,GAAG,cAAc,CAAC,WAAW,CAAC,CAAC;IAC9C,MAAM,aAAa,GAAG,cAAc,CAAC,IAAI,CAAC,SAAS,CAAC,OAAO,CAAC,CAAC,CAAC;IAC9D,OAAO,IAAI,CAAC,GAAG,CAAC,CAAC,EAAE,SAAS,GAAG,aAAa,CAAC,CAAC;AAChD,CAAC"}
package/dist/tools/force-reread.d.ts.map
CHANGED
@@ -1 +1 @@
-{"version":3,"file":"force-reread.d.ts","sourceRoot":"","sources":["../../src/tools/force-reread.ts"],"names":[],"mappings":"
+{"version":3,"file":"force-reread.d.ts","sourceRoot":"","sources":["../../src/tools/force-reread.ts"],"names":[],"mappings":"AAOA,OAAO,KAAK,EAAE,WAAW,EAAE,MAAM,aAAa,CAAC;AAG/C,wBAAsB,WAAW,CAC/B,WAAW,EAAE,MAAM,EACnB,YAAY,EAAE,MAAM,GACnB,OAAO,CAAC;IAAE,IAAI,EAAE,MAAM,CAAC;IAAC,IAAI,EAAE,MAAM,CAAC;IAAC,OAAO,EAAE,WAAW,CAAC;IAAC,MAAM,EAAE,IAAI,CAAC;IAAC,MAAM,EAAE,MAAM,CAAA;CAAE,CAAC,CAc7F"}
package/dist/tools/force-reread.js
CHANGED
@@ -3,6 +3,8 @@ import { join } from 'node:path';
 import { hashContents } from '../cache/hash.js';
 import { CacheStore } from '../cache/store.js';
 import { summarizeFile } from '../indexer/summarizer.js';
+import { TelemetryTracker } from '../telemetry/tracker.js';
+import { estimateTokens } from '../telemetry/tokens.js';
 import { validatePath } from '../utils/validate-path.js';
 export async function forceReread(projectRoot, relativePath) {
     relativePath = validatePath(projectRoot, relativePath);
@@ -12,6 +14,8 @@ export async function forceReread(projectRoot, relativePath) {
     const summary = summarizeFile(relativePath, contents);
     const store = new CacheStore(projectRoot);
     store.setEntry(relativePath, hash, summary);
+    const tracker = new TelemetryTracker(projectRoot);
+    tracker.trackEvent('force_reread', relativePath, estimateTokens(contents));
     return { path: relativePath, hash, summary, reread: true, reason: 'force_reread: explicitly requested' };
 }
 //# sourceMappingURL=force-reread.js.map
package/dist/tools/force-reread.js.map
CHANGED
@@ -1 +1 @@
-{"version":3,"file":"force-reread.js","sourceRoot":"","sources":["../../src/tools/force-reread.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,QAAQ,EAAE,MAAM,kBAAkB,CAAC;AAC5C,OAAO,EAAE,IAAI,EAAE,MAAM,WAAW,CAAC;AACjC,OAAO,EAAE,YAAY,EAAE,MAAM,kBAAkB,CAAC;AAChD,OAAO,EAAE,UAAU,EAAE,MAAM,mBAAmB,CAAC;AAC/C,OAAO,EAAE,aAAa,EAAE,MAAM,0BAA0B,CAAC;
+{"version":3,"file":"force-reread.js","sourceRoot":"","sources":["../../src/tools/force-reread.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,QAAQ,EAAE,MAAM,kBAAkB,CAAC;AAC5C,OAAO,EAAE,IAAI,EAAE,MAAM,WAAW,CAAC;AACjC,OAAO,EAAE,YAAY,EAAE,MAAM,kBAAkB,CAAC;AAChD,OAAO,EAAE,UAAU,EAAE,MAAM,mBAAmB,CAAC;AAC/C,OAAO,EAAE,aAAa,EAAE,MAAM,0BAA0B,CAAC;AACzD,OAAO,EAAE,gBAAgB,EAAE,MAAM,yBAAyB,CAAC;AAC3D,OAAO,EAAE,cAAc,EAAE,MAAM,wBAAwB,CAAC;AAExD,OAAO,EAAE,YAAY,EAAE,MAAM,2BAA2B,CAAC;AAEzD,MAAM,CAAC,KAAK,UAAU,WAAW,CAC/B,WAAmB,EACnB,YAAoB;IAEpB,YAAY,GAAG,YAAY,CAAC,WAAW,EAAE,YAAY,CAAC,CAAC;IACvD,MAAM,YAAY,GAAG,IAAI,CAAC,WAAW,EAAE,YAAY,CAAC,CAAC;IACrD,MAAM,QAAQ,GAAG,MAAM,QAAQ,CAAC,YAAY,EAAE,OAAO,CAAC,CAAC;IACvD,MAAM,IAAI,GAAG,YAAY,CAAC,QAAQ,CAAC,CAAC;IACpC,MAAM,OAAO,GAAG,aAAa,CAAC,YAAY,EAAE,QAAQ,CAAC,CAAC;IAEtD,MAAM,KAAK,GAAG,IAAI,UAAU,CAAC,WAAW,CAAC,CAAC;IAC1C,KAAK,CAAC,QAAQ,CAAC,YAAY,EAAE,IAAI,EAAE,OAAO,CAAC,CAAC;IAE5C,MAAM,OAAO,GAAG,IAAI,gBAAgB,CAAC,WAAW,CAAC,CAAC;IAClD,OAAO,CAAC,UAAU,CAAC,cAAc,EAAE,YAAY,EAAE,cAAc,CAAC,QAAQ,CAAC,CAAC,CAAC;IAE3E,OAAO,EAAE,IAAI,EAAE,YAAY,EAAE,IAAI,EAAE,OAAO,EAAE,MAAM,EAAE,IAAI,EAAE,MAAM,EAAE,oCAAoC,EAAE,CAAC;AAC3G,CAAC"}
package/dist/tools/get-file-summary.d.ts.map
CHANGED
@@ -1 +1 @@
-{"version":3,"file":"get-file-summary.d.ts","sourceRoot":"","sources":["../../src/tools/get-file-summary.ts"],"names":[],"mappings":"
+{"version":3,"file":"get-file-summary.d.ts","sourceRoot":"","sources":["../../src/tools/get-file-summary.ts"],"names":[],"mappings":"AAOA,OAAO,KAAK,EAAE,WAAW,EAAE,MAAM,aAAa,CAAC;AAG/C,MAAM,WAAW,iBAAiB;IAChC,IAAI,EAAE,MAAM,CAAC;IACb,IAAI,EAAE,MAAM,CAAC;IACb,OAAO,EAAE,WAAW,CAAC;IACrB,SAAS,EAAE,OAAO,CAAC;IACnB,MAAM,EAAE,MAAM,CAAC;IACf,QAAQ,EAAE,MAAM,GAAG,IAAI,CAAC;IACxB,eAAe,EAAE,OAAO,CAAC;CAC1B;AAED,wBAAsB,cAAc,CAClC,WAAW,EAAE,MAAM,EACnB,YAAY,EAAE,MAAM,GACnB,OAAO,CAAC,iBAAiB,CAAC,CAwD5B"}
package/dist/tools/get-file-summary.js
CHANGED
@@ -3,6 +3,8 @@ import { join } from 'node:path';
 import { hashContents } from '../cache/hash.js';
 import { CacheStore } from '../cache/store.js';
 import { summarizeFile } from '../indexer/summarizer.js';
+import { TelemetryTracker } from '../telemetry/tracker.js';
+import { estimateTokensSaved } from '../telemetry/tokens.js';
 import { validatePath } from '../utils/validate-path.js';
 export async function getFileSummary(projectRoot, relativePath) {
     relativePath = validatePath(projectRoot, relativePath);
@@ -16,7 +18,10 @@ export async function getFileSummary(projectRoot, relativePath) {
     const cacheAge = cached
         ? Math.floor((Date.now() - cached.lastChecked) / 1000)
         : null;
+    const tracker = new TelemetryTracker(projectRoot);
     if (cached && cached.hash === currentHash && cached.summary) {
+        const tokensSaved = estimateTokensSaved(contents, cached.summary);
+        tracker.trackEvent('cache_hit', relativePath, tokensSaved);
         return {
             path: relativePath,
             hash: currentHash,
@@ -40,6 +45,7 @@ export async function getFileSummary(projectRoot, relativePath) {
     else {
         reason = 'cache_miss: no summary in cache';
     }
+    tracker.trackEvent('cache_miss', relativePath, 0);
     return {
         path: relativePath,
         hash: currentHash,
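The `cache_hit` / `cache_miss` branch wired into `getFileSummary` above is a hash-gated cache check. A minimal sketch of the same flow, with an in-memory `Map` standing in for the SQLite `CacheStore` and a stub summarizer (names here are illustrative):

```typescript
import { createHash } from "node:crypto";

interface Entry { hash: string; summary: string; }
const cache = new Map<string, Entry>();

// Stand-in for the package's regex summarizer.
function summarize(contents: string): string {
  return `lines=${contents.split("\n").length}`;
}

function getSummary(path: string, contents: string): { summary: string; fromCache: boolean } {
  const hash = createHash("sha256").update(contents).digest("hex");
  const cached = cache.get(path);
  if (cached && cached.hash === hash) {
    return { summary: cached.summary, fromCache: true };  // cache_hit
  }
  const summary = summarize(contents);                    // cache_miss: re-summarize
  cache.set(path, { hash, summary });
  return { summary, fromCache: false };
}

console.log(getSummary("a.ts", "let x = 1;").fromCache); // false (first read)
console.log(getSummary("a.ts", "let x = 1;").fromCache); // true  (unchanged)
console.log(getSummary("a.ts", "let x = 2;").fromCache); // false (file changed)
```

Because the gate is a content hash rather than an mtime, a file that is touched but unchanged still counts as a hit.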
package/dist/tools/get-file-summary.js.map
CHANGED
@@ -1 +1 @@
-{"version":3,"file":"get-file-summary.js","sourceRoot":"","sources":["../../src/tools/get-file-summary.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,QAAQ,EAAE,MAAM,kBAAkB,CAAC;AAC5C,OAAO,EAAE,IAAI,EAAE,MAAM,WAAW,CAAC;AACjC,OAAO,EAAE,YAAY,EAAE,MAAM,kBAAkB,CAAC;AAChD,OAAO,EAAE,UAAU,EAAE,MAAM,mBAAmB,CAAC;AAC/C,OAAO,EAAE,aAAa,EAAE,MAAM,0BAA0B,CAAC;
+{"version":3,"file":"get-file-summary.js","sourceRoot":"","sources":["../../src/tools/get-file-summary.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,QAAQ,EAAE,MAAM,kBAAkB,CAAC;AAC5C,OAAO,EAAE,IAAI,EAAE,MAAM,WAAW,CAAC;AACjC,OAAO,EAAE,YAAY,EAAE,MAAM,kBAAkB,CAAC;AAChD,OAAO,EAAE,UAAU,EAAE,MAAM,mBAAmB,CAAC;AAC/C,OAAO,EAAE,aAAa,EAAE,MAAM,0BAA0B,CAAC;AACzD,OAAO,EAAE,gBAAgB,EAAE,MAAM,yBAAyB,CAAC;AAC3D,OAAO,EAAE,mBAAmB,EAAE,MAAM,wBAAwB,CAAC;AAE7D,OAAO,EAAE,YAAY,EAAE,MAAM,2BAA2B,CAAC;AAYzD,MAAM,CAAC,KAAK,UAAU,cAAc,CAClC,WAAmB,EACnB,YAAoB;IAEpB,YAAY,GAAG,YAAY,CAAC,WAAW,EAAE,YAAY,CAAC,CAAC;IACvD,MAAM,KAAK,GAAG,IAAI,UAAU,CAAC,WAAW,CAAC,CAAC;IAC1C,MAAM,YAAY,GAAG,IAAI,CAAC,WAAW,EAAE,YAAY,CAAC,CAAC;IAErD,yBAAyB;IACzB,MAAM,QAAQ,GAAG,MAAM,QAAQ,CAAC,YAAY,EAAE,OAAO,CAAC,CAAC;IACvD,MAAM,WAAW,GAAG,YAAY,CAAC,QAAQ,CAAC,CAAC;IAE3C,cAAc;IACd,MAAM,MAAM,GAAG,KAAK,CAAC,QAAQ,CAAC,YAAY,CAAC,CAAC;IAC5C,MAAM,QAAQ,GAAG,MAAM;QACrB,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC,IAAI,CAAC,GAAG,EAAE,GAAG,MAAM,CAAC,WAAW,CAAC,GAAG,IAAI,CAAC;QACtD,CAAC,CAAC,IAAI,CAAC;IAET,MAAM,OAAO,GAAG,IAAI,gBAAgB,CAAC,WAAW,CAAC,CAAC;IAElD,IAAI,MAAM,IAAI,MAAM,CAAC,IAAI,KAAK,WAAW,IAAI,MAAM,CAAC,OAAO,EAAE,CAAC;QAC5D,MAAM,WAAW,GAAG,mBAAmB,CAAC,QAAQ,EAAE,MAAM,CAAC,OAAO,CAAC,CAAC;QAClE,OAAO,CAAC,UAAU,CAAC,WAAW,EAAE,YAAY,EAAE,WAAW,CAAC,CAAC;QAE3D,OAAO;YACL,IAAI,EAAE,YAAY;YAClB,IAAI,EAAE,WAAW;YACjB,OAAO,EAAE,MAAM,CAAC,OAAO;YACvB,SAAS,EAAE,IAAI;YACf,MAAM,EAAE,2BAA2B;YACnC,QAAQ;YACR,eAAe,EAAE,MAAM,CAAC,OAAO,CAAC,UAAU,KAAK,KAAK;SACrD,CAAC;IACJ,CAAC;IAED,yBAAyB;IACzB,MAAM,OAAO,GAAG,aAAa,CAAC,YAAY,EAAE,QAAQ,CAAC,CAAC;IACtD,KAAK,CAAC,QAAQ,CAAC,YAAY,EAAE,WAAW,EAAE,OAAO,CAAC,CAAC;IAEnD,IAAI,MAAc,CAAC;IACnB,IAAI,CAAC,MAAM,EAAE,CAAC;QACZ,MAAM,GAAG,4BAA4B,CAAC;IACxC,CAAC;SAAM,IAAI,MAAM,CAAC,IAAI,KAAK,WAAW,EAAE,CAAC;QACvC,MAAM,GAAG,0BAA0B,CAAC;IACtC,CAAC;SAAM,CAAC;QACN,MAAM,GAAG,iCAAiC,CAAC;IAC7C,CAAC;IAED,OAAO,CAAC,UAAU,CAAC,YAAY,EAAE,YAAY,EAAE,CAAC,CAAC,CAAC;IAElD,OAAO;QACL,IAAI,EAAE,YAAY;QAClB,IAAI,EAAE,WAAW;QACjB,OAAO;QACP,SAAS,EAAE,KAAK;QAChB,MAAM;QACN,QAAQ;QACR,eAAe,EAAE,OAAO,CAAC,UAAU,KAAK,KAAK;KAC9C,CAAC;AACJ,CAAC"}
package/dist/tools/invalidate.d.ts.map
CHANGED
@@ -1 +1 @@
-{"version":3,"file":"invalidate.d.ts","sourceRoot":"","sources":["../../src/tools/invalidate.ts"],"names":[],"mappings":"
+{"version":3,"file":"invalidate.d.ts","sourceRoot":"","sources":["../../src/tools/invalidate.ts"],"names":[],"mappings":"AAIA,wBAAsB,eAAe,CACnC,WAAW,EAAE,MAAM,EACnB,IAAI,CAAC,EAAE,MAAM,GACZ,OAAO,CAAC;IAAE,WAAW,EAAE,MAAM,GAAG,KAAK,CAAC;IAAC,cAAc,EAAE,MAAM,CAAA;CAAE,CAAC,CAqBlE"}
package/dist/tools/invalidate.js
CHANGED
@@ -1,18 +1,22 @@
 import { CacheStore } from '../cache/store.js';
+import { TelemetryTracker } from '../telemetry/tracker.js';
 import { validatePath } from '../utils/validate-path.js';
 export async function invalidateCache(projectRoot, path) {
     if (path) {
         path = validatePath(projectRoot, path);
     }
     const store = new CacheStore(projectRoot);
+    const tracker = new TelemetryTracker(projectRoot);
     if (path) {
         store.deleteEntry(path);
+        tracker.trackEvent('invalidation', path);
         return { invalidated: path, entriesRemoved: 1 };
     }
     const entries = store.getAllEntries();
     const count = entries.length;
     for (const entry of entries) {
         store.deleteEntry(entry.path);
+        tracker.trackEvent('invalidation', entry.path);
     }
     return { invalidated: 'all', entriesRemoved: count };
 }
package/dist/tools/invalidate.js.map
CHANGED
@@ -1 +1 @@
-{"version":3,"file":"invalidate.js","sourceRoot":"","sources":["../../src/tools/invalidate.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,UAAU,EAAE,MAAM,mBAAmB,CAAC;AAC/C,OAAO,EAAE,YAAY,EAAE,MAAM,2BAA2B,CAAC;AAEzD,MAAM,CAAC,KAAK,UAAU,eAAe,CACnC,WAAmB,EACnB,IAAa;IAEb,IAAI,IAAI,EAAE,CAAC;QACT,IAAI,GAAG,YAAY,CAAC,WAAW,EAAE,IAAI,CAAC,CAAC;IACzC,CAAC;IACD,MAAM,KAAK,GAAG,IAAI,UAAU,CAAC,WAAW,CAAC,CAAC;
+{"version":3,"file":"invalidate.js","sourceRoot":"","sources":["../../src/tools/invalidate.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,UAAU,EAAE,MAAM,mBAAmB,CAAC;AAC/C,OAAO,EAAE,gBAAgB,EAAE,MAAM,yBAAyB,CAAC;AAC3D,OAAO,EAAE,YAAY,EAAE,MAAM,2BAA2B,CAAC;AAEzD,MAAM,CAAC,KAAK,UAAU,eAAe,CACnC,WAAmB,EACnB,IAAa;IAEb,IAAI,IAAI,EAAE,CAAC;QACT,IAAI,GAAG,YAAY,CAAC,WAAW,EAAE,IAAI,CAAC,CAAC;IACzC,CAAC;IACD,MAAM,KAAK,GAAG,IAAI,UAAU,CAAC,WAAW,CAAC,CAAC;IAC1C,MAAM,OAAO,GAAG,IAAI,gBAAgB,CAAC,WAAW,CAAC,CAAC;IAElD,IAAI,IAAI,EAAE,CAAC;QACT,KAAK,CAAC,WAAW,CAAC,IAAI,CAAC,CAAC;QACxB,OAAO,CAAC,UAAU,CAAC,cAAc,EAAE,IAAI,CAAC,CAAC;QACzC,OAAO,EAAE,WAAW,EAAE,IAAI,EAAE,cAAc,EAAE,CAAC,EAAE,CAAC;IAClD,CAAC;IAED,MAAM,OAAO,GAAG,KAAK,CAAC,aAAa,EAAE,CAAC;IACtC,MAAM,KAAK,GAAG,OAAO,CAAC,MAAM,CAAC;IAC7B,KAAK,MAAM,KAAK,IAAI,OAAO,EAAE,CAAC;QAC5B,KAAK,CAAC,WAAW,CAAC,KAAK,CAAC,IAAI,CAAC,CAAC;QAC9B,OAAO,CAAC,UAAU,CAAC,cAAc,EAAE,KAAK,CAAC,IAAI,CAAC,CAAC;IACjD,CAAC;IAED,OAAO,EAAE,WAAW,EAAE,KAAK,EAAE,cAAc,EAAE,KAAK,EAAE,CAAC;AACvD,CAAC"}
package/dist/tools/search-by-purpose.d.ts.map
CHANGED
@@ -1 +1 @@
-{"version":3,"file":"search-by-purpose.d.ts","sourceRoot":"","sources":["../../src/tools/search-by-purpose.ts"],"names":[],"mappings":"
+{"version":3,"file":"search-by-purpose.d.ts","sourceRoot":"","sources":["../../src/tools/search-by-purpose.ts"],"names":[],"mappings":"AAGA,MAAM,WAAW,YAAY;IAC3B,IAAI,EAAE,MAAM,CAAC;IACb,OAAO,EAAE,MAAM,CAAC;IAChB,SAAS,EAAE,MAAM,EAAE,CAAC;IACpB,OAAO,EAAE,MAAM,EAAE,CAAC;IAClB,UAAU,EAAE,MAAM,CAAC;CACpB;AAED,MAAM,WAAW,qBAAqB;IACpC,KAAK,EAAE,MAAM,CAAC;IACd,OAAO,EAAE,YAAY,EAAE,CAAC;IACxB,WAAW,EAAE,MAAM,CAAC;CACrB;AAED,wBAAgB,eAAe,CAC7B,WAAW,EAAE,MAAM,EACnB,KAAK,EAAE,MAAM,EACb,KAAK,CAAC,EAAE,MAAM,GACb,qBAAqB,CAuEvB"}
|
@@ -1,4 +1,5 @@
|
|
|
1
1
|
import { CacheStore } from '../cache/store.js';
|
|
2
|
+
import { TelemetryTracker } from '../telemetry/tracker.js';
|
|
2
3
|
export function searchByPurpose(projectRoot, query, limit) {
|
|
3
4
|
const store = new CacheStore(projectRoot);
|
|
4
5
|
const allEntries = store.getAllEntries();
|
|
@@ -41,9 +42,19 @@ export function searchByPurpose(projectRoot, query, limit) {
|
|
|
41
42
|
}
|
|
42
43
|
// Sort by score descending
|
|
43
44
|
results.sort((a, b) => b.score - a.score);
|
|
45
|
+
const matched = results.slice(0, effectiveLimit);
|
|
46
|
+
// Track each matched file as a summary_served event.
|
|
47
|
+
// Token estimate: approximate raw file tokens from lineCount (avg ~40 chars/line).
|
|
48
|
+
const tracker = new TelemetryTracker(projectRoot);
|
|
49
|
+
for (const result of matched) {
|
|
50
|
+
const entry = allEntries.find(e => e.path === result.path);
|
|
51
|
+
// Approximate raw file tokens from lineCount (avg ~40 chars/line, ~4 chars/token)
|
|
52
|
+
const estimatedRawTokens = entry?.summary ? entry.summary.lineCount * 10 : 0;
|
|
53
|
+
tracker.trackEvent('summary_served', result.path, estimatedRawTokens);
|
|
54
|
+
}
|
|
44
55
|
return {
|
|
45
56
|
query,
|
|
46
|
-
results:
|
|
57
|
+
results: matched.map(({ score: _score, ...rest }) => rest),
|
|
47
58
|
totalCached: allEntries.filter(e => e.summary).length,
|
|
48
59
|
};
|
|
49
60
|
}
|
|
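The `lineCount * 10` factor in the `summary_served` tracking above follows from the two assumptions stated in its comments: ~40 characters per line and ~4 characters per token. A small sketch making the arithmetic explicit (the function name is ours, not the package's):

```typescript
// Raw-token approximation used for summary_served events.
const CHARS_PER_LINE = 40;  // assumed average line length
const CHARS_PER_TOKEN = 4;  // standard tokenizer heuristic

function approxRawTokens(lineCount: number): number {
  return lineCount * (CHARS_PER_LINE / CHARS_PER_TOKEN); // i.e. lineCount * 10
}

console.log(approxRawTokens(300)); // 3000 tokens for a 300-line file
```

Unlike `cache_hit` savings, this path never re-reads the file, so the estimate is derived entirely from the cached `lineCount`.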
package/dist/tools/search-by-purpose.js.map
CHANGED
@@ -1 +1 @@
-{"version":3,"file":"search-by-purpose.js","sourceRoot":"","sources":["../../src/tools/search-by-purpose.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,UAAU,EAAE,MAAM,mBAAmB,CAAC;
+{"version":3,"file":"search-by-purpose.js","sourceRoot":"","sources":["../../src/tools/search-by-purpose.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,UAAU,EAAE,MAAM,mBAAmB,CAAC;AAC/C,OAAO,EAAE,gBAAgB,EAAE,MAAM,yBAAyB,CAAC;AAgB3D,MAAM,UAAU,eAAe,CAC7B,WAAmB,EACnB,KAAa,EACb,KAAc;IAEd,MAAM,KAAK,GAAG,IAAI,UAAU,CAAC,WAAW,CAAC,CAAC;IAC1C,MAAM,UAAU,GAAG,KAAK,CAAC,aAAa,EAAE,CAAC;IACzC,MAAM,cAAc,GAAG,KAAK,IAAI,EAAE,CAAC;IAEnC,MAAM,UAAU,GAAG,KAAK,CAAC,WAAW,EAAE,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC;IACpE,MAAM,OAAO,GAA4C,EAAE,CAAC;IAE5D,KAAK,MAAM,KAAK,IAAI,UAAU,EAAE,CAAC;QAC/B,IAAI,CAAC,KAAK,CAAC,OAAO;YAAE,SAAS;QAE7B,MAAM,SAAS,GAAa,EAAE,CAAC;QAC/B,IAAI,KAAK,GAAG,CAAC,CAAC;QAEd,MAAM,OAAO,GAAG,KAAK,CAAC,OAAO,CAAC,OAAO,CAAC,WAAW,EAAE,CAAC;QACpD,MAAM,cAAc,GAAG,UAAU,CAAC,MAAM,CAAC,IAAI,CAAC,EAAE,CAAC,OAAO,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAAC;QACzE,IAAI,cAAc,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;YAC9B,SAAS,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;YAC1B,KAAK,IAAI,cAAc,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC,mCAAmC;QACzE,CAAC;QAED,MAAM,YAAY,GAAG,KAAK,CAAC,OAAO,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,WAAW,EAAE,CAAC,CAAC;QACrE,MAAM,aAAa,GAAG,UAAU,CAAC,MAAM,CAAC,IAAI,CAAC,EAAE,CAC7C,YAAY,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAC7C,CAAC;QACF,IAAI,aAAa,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;YAC7B,SAAS,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;YAC1B,KAAK,IAAI,aAAa,CAAC,MAAM,GAAG,CAAC,CAAC;QACpC,CAAC;QAED,MAAM,UAAU,GAAG,KAAK,CAAC,OAAO,CAAC,oBAAoB,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,WAAW,EAAE,CAAC,CAAC;QAChF,MAAM,WAAW,GAAG,UAAU,CAAC,MAAM,CAAC,IAAI,CAAC,EAAE,CAC3C,UAAU,CAAC,IAAI,CAAC,IAAI,CAAC,EAAE,CAAC,IAAI,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAC7C,CAAC;QACF,IAAI,WAAW,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;YAC3B,SAAS,CAAC,IAAI,CAAC,cAAc,CAAC,CAAC;YAC/B,KAAK,IAAI,WAAW,CAAC,MAAM,CAAC;QAC9B,CAAC;QAED,IAAI,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;YACzB,OAAO,CAAC,IAAI,CAAC;gBACX,IAAI,EAAE,KAAK,CAAC,IAAI;gBAChB,OAAO,EAAE,KAAK,CAAC,OAAO,CAAC,OAAO;gBAC9B,SAAS;gBACT,OAAO,EAAE,KAAK,CAAC,OAAO,CAAC,OAAO;gBAC9B,UAAU,EAAE,KAAK,CAAC,OAAO,CAAC,UAAU;gBACpC,KAAK;aACN,CAAC,CAAC;QACL,CAAC;IACH,CAAC;IAED,2BAA2B;IAC3B,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,CAAC,CAAC,KAAK,GAAG,CAAC,CAAC,KAAK,CAAC,CAAC;IAE1C,MAAM,OAAO,GAAG,OAAO,CAAC,KAAK,CAAC,CAAC,EAAE,cAAc,CAAC,CAAC;IAEjD,qDAAqD;IACrD,mFAAmF;IACnF,MAAM,OAAO,GAAG,IAAI,gBAAgB,CAAC,WAAW,CAAC,CAAC;IAClD,KAAK,MAAM,MAAM,IAAI,OAAO,EAAE,CAAC;QAC7B,MAAM,KAAK,GAAG,UAAU,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,IAAI,KAAK,MAAM,CAAC,IAAI,CAAC,CAAC;QAC3D,kFAAkF;QAClF,MAAM,kBAAkB,GAAG,KAAK,EAAE,OAAO,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,SAAS,GAAG,EAAE,CAAC,CAAC,CAAC,CAAC,CAAC;QAC7E,OAAO,CAAC,UAAU,CAAC,gBAAgB,EAAE,MAAM,CAAC,IAAI,EAAE,kBAAkB,CAAC,CAAC;IACxE,CAAC;IAED,OAAO;QACL,KAAK;QACL,OAAO,EAAE,OAAO,CAAC,GAAG,CAAC,CAAC,EAAE,KAAK,EAAE,MAAM,EAAE,GAAG,IAAI,EAAE,EAAE,EAAE,CAAC,IAAI,CAAC;QAC1D,WAAW,EAAE,UAAU,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC,MAAM;KACtD,CAAC;AACJ,CAAC"}