ctxo-mcp 0.3.1 → 0.4.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,25 +1,43 @@
1
- # Ctxo
1
+ <div align="center">
2
2
 
3
- **Code intelligence for AI agents — one call instead of hundreds.**
3
+ [![npm version](https://img.shields.io/npm/v/ctxo-mcp.svg)](https://www.npmjs.com/package/ctxo-mcp)
4
+ [![CI](https://github.com/alperhankendi/Ctxo/actions/workflows/ci.yml/badge.svg)](https://github.com/alperhankendi/Ctxo/actions/workflows/ci.yml)
5
+ [![Release](https://github.com/alperhankendi/Ctxo/actions/workflows/release.yml/badge.svg)](https://github.com/alperhankendi/Ctxo/actions/workflows/release.yml)
4
6
 
5
- AI coding assistants waste context window reading files one by one, still missing dependencies. Ctxo gives them the full picture in a single MCP call: symbol graphs, blast radius, git intent, and risk scores.
7
+ **Code intelligence for AI agents: one call instead of hundreds.**
6
8
 
7
- ```
8
- Context per query set (full codebase investigation):
9
+ <picture>
10
+ <source media="(prefers-color-scheme: dark)" srcset="docs/img/hero-svg.svg">
11
+ <source media="(prefers-color-scheme: light)" srcset="docs/img/hero-svg.svg">
12
+ <img alt="Ctxo — Code intelligence for AI agents" src="docs/img/hero-svg.svg" width="100%">
13
+ </picture>
9
14
 
10
- Manual ████████████████████████████████████████ 140,000 tokens
11
- Ctxo █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 2,900 tokens → 48x less
15
+ </div>
12
16
 
13
- Tool calls per query set:
17
+ ***
14
18
 
15
- Manual ████████████████████████████████████████ 409+ calls
16
- Ctxo █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 8 calls → 51x fewer
19
+ ### The Problem
17
20
 
18
- Context after 10 investigation rounds:
21
+ AI coding assistants like Copilot, Claude Code, and Cursor rely on generic tools (`grep`, `find`, file reads) to understand your codebase. On brownfield projects with thousands of files, this brute-force exploration creates a chain of problems:
19
22
 
20
- Manual ██████████████████████████████████████████ 140% OOM at round 7
21
- Ctxo █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 2.9% 97% free for coding
22
- ```
23
+ * **Context window saturation:** The agent fills its window reading files one by one, leaving little room for actual reasoning
24
+ * **Partial-context hallucination:** It sees a function but misses its dependencies, leading to wrong assumptions and broken suggestions
25
+ * **Lost-in-the-middle:** Critical information buried deep in a long context gets ignored by the model
26
+ * **Context poisoning:** Irrelevant code pulled in during exploration biases the model's output
27
+ * **Iteration overhead:** Understanding one symbol takes 10-20 tool calls, each adding more noise to the context
28
+ * **Stale reasoning:** After too many iterations, the agent contradicts its own earlier assumptions
29
+
30
+ The result: more tokens burned, slower responses, higher cost, and lower quality output.
31
+
32
+ ### The Solution
33
+
34
+ Ctxo is an **MCP server** that **enhances** your existing AI tools with dependency-aware, history-enriched code intelligence. Instead of hundreds of `grep` and `read_file` calls, your agent gets the full picture (symbol graphs, blast radius, git intent, and risk scores) in a **single MCP call**.
35
+
36
+ * **Clean context:** Only relevant symbols and their transitive dependencies, nothing more
37
+ * **Fewer iterations:** One call replaces an entire exploration cycle
38
+ * **Higher quality:** The agent reasons over structured, complete context instead of fragmented file reads
39
+
40
+ > A senior developer takes \~10 minutes to gather context across files. Ctxo delivers that same context in **under 500ms**.
23
41
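
For context, a "single MCP call" is one JSON-RPC `tools/call` request to the server. A hypothetical request against the `get_logic_slice` tool might look like this (the `symbolId` follows the package's documented `<file>::<name>::<kind>` format; the `level` argument name is illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_logic_slice",
    "arguments": {
      "symbolId": "src/foo.ts::myFn::function",
      "level": 2
    }
  }
}
```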
 
24
42
  ## Quick Start
25
43
 
@@ -55,22 +73,22 @@ npx ctxo-mcp index
55
73
 
56
74
  ## 14 Tools
57
75
 
58
- | Tool | What it does |
59
- | --------------------------- | ---------------------------------------------------------------- |
60
- | `get_logic_slice` | Symbol + transitive dependencies (L1-L4 progressive detail) |
61
- | `get_blast_radius` | What breaks if this changes (3-tier: confirmed/likely/potential) |
62
- | `get_architectural_overlay` | Project layer map (Domain/Infrastructure/Adapter) |
63
- | `get_why_context` | Git commit intent + anti-pattern warnings (reverts, rollbacks) |
64
- | `get_change_intelligence` | Complexity x churn composite score |
65
- | `find_dead_code` | Unreachable symbols, unused exports, scaffolding markers |
66
- | `get_context_for_task` | Task-optimized context (fix/extend/refactor/understand) |
67
- | `get_ranked_context` | BM25 + PageRank search within token budget |
68
- | `search_symbols` | Symbol name/regex search across index |
69
- | `get_changed_symbols` | Symbols in recently changed files (git diff) |
70
- | `find_importers` | Reverse dependency lookup ("who uses this?") |
71
- | `get_class_hierarchy` | Class inheritance tree (ancestors + descendants) |
72
- | `get_symbol_importance` | PageRank centrality ranking |
73
- | `get_pr_impact` | Full PR risk assessment in a single call |
76
+ | Tool | What it does |
77
+ | --------------------------- | ---------------------------------------------------------------------------------------- |
78
+ | `get_logic_slice` | Symbol + transitive dependencies (L1-L4 progressive detail) |
79
+ | `get_blast_radius` | What breaks if this changes (3-tier: confirmed/likely/potential) |
80
+ | `get_architectural_overlay` | Project layer map (Domain/Infrastructure/Adapter) |
81
+ | `get_why_context` | Git commit intent + anti-pattern warnings (reverts, rollbacks) |
82
+ | `get_change_intelligence` | Complexity x churn composite score |
83
+ | `find_dead_code` | Unreachable symbols, unused exports, scaffolding markers |
84
+ | `get_context_for_task` | Task-optimized context (fix/extend/refactor/understand) |
85
+ | `get_ranked_context` | Two-phase BM25 search (camelCase-aware, fuzzy correction) + PageRank within token budget |
86
+ | `search_symbols` | Symbol name/regex search across index (`mode: 'fts'` for BM25) |
87
+ | `get_changed_symbols` | Symbols in recently changed files (git diff) |
88
+ | `find_importers` | Reverse dependency lookup ("who uses this?") |
89
+ | `get_class_hierarchy` | Class inheritance tree (ancestors + descendants) |
90
+ | `get_symbol_importance` | PageRank centrality ranking |
91
+ | `get_pr_impact` | Full PR risk assessment in a single call |
74
92
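
As a rough illustration of how a relevance signal and an importance signal can be folded into one ranking, here is a minimal sketch (field names taken from the package's `SearchResult` type; the 0.7/0.3 weighting and the normalization are illustrative assumptions, not Ctxo's actual internals):

```typescript
// Illustrative only: combine a BM25 relevance score with a PageRank
// importance score into one ranking value. Weights are hypothetical.
interface Candidate {
  symbolId: string;
  relevanceScore: number;  // BM25 (higher = more relevant)
  importanceScore: number; // PageRank, normalized to 0..1
}

function rank(candidates: Candidate[], bm25Weight = 0.7): Candidate[] {
  const prWeight = 1 - bm25Weight;
  // Normalize BM25 to 0..1 so the two signals are comparable.
  const maxBm25 = Math.max(...candidates.map((c) => c.relevanceScore), 1);
  const combined = (c: Candidate) =>
    bm25Weight * (c.relevanceScore / maxBm25) + prWeight * c.importanceScore;
  return [...candidates].sort((a, b) => combined(b) - combined(a));
}

const ranked = rank([
  { symbolId: "src/a.ts::helper::function", relevanceScore: 8.1, importanceScore: 0.05 },
  { symbolId: "src/b.ts::CoreService::class", relevanceScore: 6.0, importanceScore: 0.9 },
]);
// The highly central class outranks the textually closer helper.
console.log(ranked[0].symbolId);
```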
 
75
93
  ## Tool Selection Guide
76
94
 
@@ -96,19 +114,55 @@ npx ctxo-mcp watch # File watcher for incremental re-index
96
114
  npx ctxo-mcp init # Install git hooks (post-commit, post-merge)
97
115
  npx ctxo-mcp status # Show index manifest
98
116
  npx ctxo-mcp sync # Rebuild SQLite cache from committed JSON
117
+ npx ctxo-mcp stats # Show usage statistics
118
+ npx ctxo-mcp stats --json # Machine-readable output
119
+ npx ctxo-mcp stats --days 7 # Last 7 days only
120
+ npx ctxo-mcp stats --clear # Delete all session data
121
+ ```
122
+
123
+ **Example output:**
124
+
125
+ ```
126
+ npx ctxo-mcp stats
127
+
128
+ Usage Summary (all time)
129
+ ────────────────────────────────────────
130
+ Total tool calls: 30
131
+ Total tokens served: 26.0K
132
+
133
+ Top Tools
134
+ ────────────────────────────────────────
135
+ get_logic_slice 14 calls avg 352 tokens
136
+ get_blast_radius 3 calls avg 1,279 tokens
137
+ get_ranked_context 3 calls avg 898 tokens
138
+ find_importers 2 calls avg 1,342 tokens
139
+ get_context_for_task 2 calls avg 740 tokens
140
+
141
+ Top Queried Symbols
142
+ ────────────────────────────────────────
143
+ SymbolNode 15 queries
144
+ LogicSliceQuery 4 queries
145
+ SqliteStorageAdapter 2 queries
146
+
147
+ Detail Level Distribution
148
+ ────────────────────────────────────────
149
+ L1: ███░░░░░░░ 25%
150
+ L2: ███░░░░░░░ 25%
151
+ L3: ███░░░░░░░ 25%
152
+ L4: ███░░░░░░░ 25%
99
153
  ```
100
154
 
101
155
  ## Features
102
156
 
103
- **Response Envelope** All responses include `_meta` with item counts, truncation info, and drill-in hints. Large results auto-truncated at 8KB (configurable via `CTXO_RESPONSE_LIMIT`).
157
+ **Response Envelope:** All responses include `_meta` with item counts, truncation info, and drill-in hints. Large results are auto-truncated at 8KB (configurable via `CTXO_RESPONSE_LIMIT`).
104
158
 
105
- **Intent Filtering** 4 tools accept `intent` parameter for keyword-based result filtering. `get_blast_radius(symbolId, intent: "test")` returns only test-related impacts.
159
+ **Intent Filtering** 4 tools accept `intent` parameter for keyword-based result filtering. `get_blast_radius(symbolId, intent: "test")` returns only test-related impacts.
106
160
 
107
- **Tool Annotations** All tools declare `readOnlyHint: true`, `idempotentHint: true`, `openWorldHint: false` for safe auto-approval in agent frameworks.
161
+ **Tool Annotations:** All tools declare `readOnlyHint: true`, `idempotentHint: true`, `openWorldHint: false` for safe auto-approval in agent frameworks.
108
162
 
109
- **Privacy Masking** AWS keys, GCP service accounts, Azure connection strings, JWTs, private IPs, env secrets automatically redacted. Extensible via `.ctxo/masking.json`.
163
+ **Privacy Masking** AWS keys, GCP service accounts, Azure connection strings, JWTs, private IPs, env secrets automatically redacted. Extensible via `.ctxo/masking.json`.
110
164
 
111
- **Debug Mode** `DEBUG=ctxo:*` for all debug output, or `DEBUG=ctxo:git,ctxo:storage` for specific namespaces.
165
+ **Debug Mode:** `DEBUG=ctxo:*` for all debug output, or `DEBUG=ctxo:git,ctxo:storage` for specific namespaces.
112
166
 
113
167
  **Per-tool savings vs manual approach:**
114
168
 
@@ -162,9 +216,19 @@ See [Agentic AI Integration Guide](docs/agentic-ai-integration.md) for LangChain
162
216
  | Go | tree-sitter | Syntax | Structs, interfaces, functions, methods, import edges |
163
217
  | C# | tree-sitter | Syntax | Classes, interfaces, methods, enums, namespace qualification |
164
218
 
219
+ ## Index Visualizer
220
+
221
+ Ctxo ships with an interactive visualizer that renders your codebase index as a dependency graph. Explore symbols, edges, layers, and PageRank scores visually; the visualizer is deployed automatically to GitHub Pages on every push.
222
+
223
+ ![Ctxo Index Visualizer](docs/img/ui.png)
224
+
225
+ [Open Live Visualizer](https://alperhankendi.github.io/Ctxo/ctxo-visualizer.html)
226
+
165
227
  ## How It Works
166
228
 
167
- Ctxo builds a **committed JSON index** (`.ctxo/index/`) that captures symbols, dependency edges, git history, and co-change data. The MCP server reads this index to answer queries — no runtime parsing, no external services.
229
+ <img src="docs/img/mcp-server.png" alt="Ctxo MCP Server Architecture" width="400">
230
+
231
+ Ctxo builds a **committed JSON index** (`.ctxo/index/`) that captures symbols, dependency edges, git history, and co-change data. The MCP server reads this index to answer queries: no runtime parsing, no external services.
168
232
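
To make the index shape concrete, a single committed index file might look roughly like this (field names match the package's exported `FileIndexSchema`; the values are hypothetical):

```json
{
  "file": "src/foo.ts",
  "lastModified": 1700000000000,
  "symbols": [
    {
      "symbolId": "src/foo.ts::myFn::function",
      "name": "myFn",
      "kind": "function",
      "startLine": 3,
      "endLine": 12
    }
  ],
  "edges": [
    {
      "from": "src/foo.ts::myFn::function",
      "to": "src/foo.ts::helper::function",
      "kind": "calls"
    }
  ],
  "intent": [],
  "antiPatterns": []
}
```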
 
169
233
  ```
170
234
  .ctxo/
@@ -179,7 +243,8 @@ Ctxo builds a **committed JSON index** (`.ctxo/index/`) that captures symbols, d
179
243
  * [npm](https://www.npmjs.com/package/ctxo-mcp)
180
244
  * [Changelog](CHANGELOG.md)
181
245
  * [LLM Reference](llms-full.txt)
182
- * [Validation Runbook](docs/runbook/mcp-validation/mcp-validation.md)
246
+ * [MCP Validation Runbook](docs/runbook/mcp-validation/mcp-validation.md)
247
+ * [CLI Validation Runbook](docs/runbook/cli-validation/cli-validation.md)
183
248
  * [Architecture](docs/artifacts/architecture.md)
184
249
  * [PRD](docs/artifacts/prd.md)
185
250
 
@@ -146,4 +146,4 @@ export {
146
146
  DetailLevelSchema,
147
147
  JsonIndexReader
148
148
  };
149
- //# sourceMappingURL=chunk-N6GPODUY.js.map
149
+ //# sourceMappingURL=chunk-4OI75JDS.js.map
@@ -1 +1 @@
1
- {"version":3,"sources":["../src/adapters/storage/json-index-reader.ts","../src/core/types.ts"],"sourcesContent":["import { readFileSync, readdirSync, existsSync, realpathSync } from 'node:fs';\nimport { join, relative } from 'node:path';\nimport { FileIndexSchema, type FileIndex } from '../../core/types.js';\n\nexport class JsonIndexReader {\n private readonly indexDir: string;\n\n constructor(ctxoRoot: string) {\n this.indexDir = join(ctxoRoot, 'index');\n }\n\n readAll(): FileIndex[] {\n if (!existsSync(this.indexDir)) {\n return [];\n }\n\n const jsonFiles = this.collectJsonFiles(this.indexDir);\n const results: FileIndex[] = [];\n\n for (const filePath of jsonFiles) {\n const parsed = this.readSingle(filePath);\n if (parsed) {\n results.push(parsed);\n }\n }\n\n return results;\n }\n\n readSingle(absolutePath: string): FileIndex | undefined {\n try {\n const raw = readFileSync(absolutePath, 'utf-8');\n const data: unknown = JSON.parse(raw);\n const result = FileIndexSchema.safeParse(data);\n\n if (!result.success) {\n const rel = relative(this.indexDir, absolutePath);\n console.error(\n `[ctxo:json-reader] Invalid schema in ${rel}: ${result.error.message}`,\n );\n return undefined;\n }\n\n return result.data;\n } catch (err) {\n const rel = relative(this.indexDir, absolutePath);\n console.error(\n `[ctxo:json-reader] Failed to read ${rel}: ${(err as Error).message}`,\n );\n return undefined;\n }\n }\n\n private collectJsonFiles(dir: string, visited: Set<string> = new Set()): string[] {\n const realDir = realpathSync(dir);\n if (visited.has(realDir)) return []; // Guard against symlink loops\n visited.add(realDir);\n\n const files: string[] = [];\n\n for (const entry of readdirSync(dir, { withFileTypes: true })) {\n if (entry.isSymbolicLink()) continue; // Skip symlinks entirely\n\n const fullPath = join(dir, entry.name);\n if (entry.isDirectory()) {\n files.push(...this.collectJsonFiles(fullPath, visited));\n } else if (entry.name.endsWith('.json') && 
entry.name !== 'co-changes.json') {\n files.push(fullPath);\n }\n }\n\n return files;\n }\n}\n","import { z } from 'zod';\n\n// ── Symbol Kinds ────────────────────────────────────────────────\n\nexport const SYMBOL_KINDS = [\n 'function',\n 'class',\n 'interface',\n 'method',\n 'variable',\n 'type',\n] as const;\n\nexport const SymbolKindSchema = z.enum(SYMBOL_KINDS);\nexport type SymbolKind = z.infer<typeof SymbolKindSchema>;\n\n// ── Edge Kinds ──────────────────────────────────────────────────\n\nexport const EDGE_KINDS = [\n 'imports',\n 'calls',\n 'extends',\n 'implements',\n 'uses',\n] as const;\n\nexport const EdgeKindSchema = z.enum(EDGE_KINDS);\nexport type EdgeKind = z.infer<typeof EdgeKindSchema>;\n\n// ── Detail Levels ───────────────────────────────────────────────\n\nexport const DETAIL_LEVELS = [1, 2, 3, 4] as const;\n\nexport const DetailLevelSchema = z.union([\n z.literal(1),\n z.literal(2),\n z.literal(3),\n z.literal(4),\n]);\nexport type DetailLevel = z.infer<typeof DetailLevelSchema>;\n\n// ── Symbol ID ───────────────────────────────────────────────────\n\n/**\n * Format: \"<relativeFile>::<name>::<kind>\"\n * Example: \"src/foo.ts::myFn::function\"\n */\nexport const SymbolIdSchema = z\n .string()\n .min(1)\n .refine(\n (value) => {\n const parts = value.split('::');\n if (parts.length !== 3) return false;\n const [file, name, kind] = parts;\n if (!file || !name || !kind) return false;\n return SymbolKindSchema.safeParse(kind).success;\n },\n { message: 'Symbol ID must match format \"<file>::<name>::<kind>\"' },\n );\nexport type SymbolId = z.infer<typeof SymbolIdSchema>;\n\n// ── Symbol Node ─────────────────────────────────────────────────\n\nexport const SymbolNodeSchema = z\n .object({\n symbolId: SymbolIdSchema,\n name: z.string().min(1),\n kind: SymbolKindSchema,\n startLine: z.number().int().nonnegative(),\n endLine: z.number().int().nonnegative(),\n startOffset: z.number().int().nonnegative().optional(),\n endOffset: 
z.number().int().nonnegative().optional(),\n })\n .refine((node) => node.endLine >= node.startLine, {\n message: 'endLine must be >= startLine',\n });\nexport type SymbolNode = z.infer<typeof SymbolNodeSchema>;\n\n// ── Graph Edge ──────────────────────────────────────────────────\n\nexport const GraphEdgeSchema = z.object({\n from: SymbolIdSchema,\n to: SymbolIdSchema,\n kind: EdgeKindSchema,\n typeOnly: z.boolean().optional(),\n});\nexport type GraphEdge = z.infer<typeof GraphEdgeSchema>;\n\n// ── Commit Intent ───────────────────────────────────────────────\n\nexport const CommitIntentSchema = z.object({\n hash: z.string().min(1),\n message: z.string(),\n date: z.string().min(1),\n kind: z.literal('commit'),\n});\nexport type CommitIntent = z.infer<typeof CommitIntentSchema>;\n\n// ── Anti-Pattern ────────────────────────────────────────────────\n\nexport const AntiPatternSchema = z.object({\n hash: z.string().min(1),\n message: z.string(),\n date: z.string().min(1),\n});\nexport type AntiPattern = z.infer<typeof AntiPatternSchema>;\n\n// ── File Index ──────────────────────────────────────────────────\n\nexport const ComplexityMetricsSchema = z.object({\n symbolId: z.string().min(1),\n cyclomatic: z.number().nonnegative(),\n});\n\nexport const FileIndexSchema = z.object({\n file: z.string().min(1),\n lastModified: z.number().nonnegative(),\n contentHash: z.string().optional(),\n symbols: z.array(SymbolNodeSchema),\n edges: z.array(GraphEdgeSchema),\n complexity: z.array(ComplexityMetricsSchema).optional(),\n intent: z.array(CommitIntentSchema),\n antiPatterns: z.array(AntiPatternSchema),\n});\nexport type FileIndex = z.infer<typeof FileIndexSchema>;\n\n// ── Logic-Slice Result ──────────────────────────────────────────\n\nexport interface LogicSliceResult {\n readonly root: SymbolNode;\n readonly dependencies: readonly SymbolNode[];\n readonly edges: readonly GraphEdge[];\n}\n\n// ── Formatted Slice ─────────────────────────────────────────────\n\nexport 
interface TruncationInfo {\n readonly truncated: true;\n readonly reason: 'token_budget_exceeded';\n}\n\nexport interface FormattedSlice {\n readonly root: SymbolNode;\n readonly dependencies: readonly SymbolNode[];\n readonly edges: readonly GraphEdge[];\n readonly level: DetailLevel;\n readonly levelDescription?: string;\n readonly truncation?: TruncationInfo;\n}\n\n// ── Complexity & Churn ──────────────────────────────────────────\n\nexport interface ComplexityMetrics {\n readonly symbolId: string;\n readonly cyclomatic: number;\n}\n\nexport interface ChurnData {\n readonly filePath: string;\n readonly commitCount: number;\n}\n\nexport const SCORE_BANDS = ['low', 'medium', 'high'] as const;\nexport type ScoreBand = (typeof SCORE_BANDS)[number];\n\nexport interface ChangeIntelligenceScore {\n readonly symbolId: string;\n readonly complexity: number;\n readonly churn: number;\n readonly composite: number;\n readonly band: ScoreBand;\n}\n\n// ── Why-Context Result ──────────────────────────────────────────\n\nexport interface WhyContextResult {\n readonly commitHistory: readonly CommitIntent[];\n readonly antiPatternWarnings: readonly AntiPattern[];\n readonly changeIntelligence?: ChangeIntelligenceScore;\n}\n\n// ── Co-Change Analysis ─────────────────────────────────────────\n\nexport interface CoChangeEntry {\n readonly file1: string;\n readonly file2: string;\n readonly sharedCommits: number;\n readonly frequency: number;\n}\n\nexport interface CoChangeMatrix {\n readonly version: 1;\n readonly timestamp: number;\n readonly entries: readonly CoChangeEntry[];\n}\n\n// ── Commit Record (from git adapter) ────────────────────────────\n\nexport interface CommitRecord {\n readonly hash: string;\n readonly message: string;\n readonly date: string;\n readonly author: string;\n}\n\n// ── Blame Line (from git adapter) ───────────────────────────────\n\nexport interface BlameLine {\n readonly hash: string;\n readonly lineNumber: number;\n readonly author: string;\n 
readonly date: string;\n}\n"],"mappings":";;;AAAA,SAAS,cAAc,aAAa,YAAY,oBAAoB;AACpE,SAAS,MAAM,gBAAgB;;;ACD/B,SAAS,SAAS;AAIX,IAAM,eAAe;AAAA,EAC1B;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AACF;AAEO,IAAM,mBAAmB,EAAE,KAAK,YAAY;AAK5C,IAAM,aAAa;AAAA,EACxB;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AACF;AAEO,IAAM,iBAAiB,EAAE,KAAK,UAAU;AAOxC,IAAM,oBAAoB,EAAE,MAAM;AAAA,EACvC,EAAE,QAAQ,CAAC;AAAA,EACX,EAAE,QAAQ,CAAC;AAAA,EACX,EAAE,QAAQ,CAAC;AAAA,EACX,EAAE,QAAQ,CAAC;AACb,CAAC;AASM,IAAM,iBAAiB,EAC3B,OAAO,EACP,IAAI,CAAC,EACL;AAAA,EACC,CAAC,UAAU;AACT,UAAM,QAAQ,MAAM,MAAM,IAAI;AAC9B,QAAI,MAAM,WAAW,EAAG,QAAO;AAC/B,UAAM,CAAC,MAAM,MAAM,IAAI,IAAI;AAC3B,QAAI,CAAC,QAAQ,CAAC,QAAQ,CAAC,KAAM,QAAO;AACpC,WAAO,iBAAiB,UAAU,IAAI,EAAE;AAAA,EAC1C;AAAA,EACA,EAAE,SAAS,uDAAuD;AACpE;AAKK,IAAM,mBAAmB,EAC7B,OAAO;AAAA,EACN,UAAU;AAAA,EACV,MAAM,EAAE,OAAO,EAAE,IAAI,CAAC;AAAA,EACtB,MAAM;AAAA,EACN,WAAW,EAAE,OAAO,EAAE,IAAI,EAAE,YAAY;AAAA,EACxC,SAAS,EAAE,OAAO,EAAE,IAAI,EAAE,YAAY;AAAA,EACtC,aAAa,EAAE,OAAO,EAAE,IAAI,EAAE,YAAY,EAAE,SAAS;AAAA,EACrD,WAAW,EAAE,OAAO,EAAE,IAAI,EAAE,YAAY,EAAE,SAAS;AACrD,CAAC,EACA,OAAO,CAAC,SAAS,KAAK,WAAW,KAAK,WAAW;AAAA,EAChD,SAAS;AACX,CAAC;AAKI,IAAM,kBAAkB,EAAE,OAAO;AAAA,EACtC,MAAM;AAAA,EACN,IAAI;AAAA,EACJ,MAAM;AAAA,EACN,UAAU,EAAE,QAAQ,EAAE,SAAS;AACjC,CAAC;AAKM,IAAM,qBAAqB,EAAE,OAAO;AAAA,EACzC,MAAM,EAAE,OAAO,EAAE,IAAI,CAAC;AAAA,EACtB,SAAS,EAAE,OAAO;AAAA,EAClB,MAAM,EAAE,OAAO,EAAE,IAAI,CAAC;AAAA,EACtB,MAAM,EAAE,QAAQ,QAAQ;AAC1B,CAAC;AAKM,IAAM,oBAAoB,EAAE,OAAO;AAAA,EACxC,MAAM,EAAE,OAAO,EAAE,IAAI,CAAC;AAAA,EACtB,SAAS,EAAE,OAAO;AAAA,EAClB,MAAM,EAAE,OAAO,EAAE,IAAI,CAAC;AACxB,CAAC;AAKM,IAAM,0BAA0B,EAAE,OAAO;AAAA,EAC9C,UAAU,EAAE,OAAO,EAAE,IAAI,CAAC;AAAA,EAC1B,YAAY,EAAE,OAAO,EAAE,YAAY;AACrC,CAAC;AAEM,IAAM,kBAAkB,EAAE,OAAO;AAAA,EACtC,MAAM,EAAE,OAAO,EAAE,IAAI,CAAC;AAAA,EACtB,cAAc,EAAE,OAAO,EAAE,YAAY;AAAA,EACrC,aAAa,EAAE,OAAO,EAAE,SAAS;AAAA,EACjC,SAAS,EAAE,MAAM,gBAAgB;AAAA,EACjC,OAAO,EAAE,MAAM,eAAe;AAAA,EAC9B,YAAY,EAAE,MAAM,uBAAuB,EAAE,SAAS;AAAA,EACtD,QAAQ,EAAE,MAAM,kBAAkB;AAAA,EAClC,cAAc,EAAE,M
AAM,iBAAiB;AACzC,CAAC;;;ADxHM,IAAM,kBAAN,MAAsB;AAAA,EACV;AAAA,EAEjB,YAAY,UAAkB;AAC5B,SAAK,WAAW,KAAK,UAAU,OAAO;AAAA,EACxC;AAAA,EAEA,UAAuB;AACrB,QAAI,CAAC,WAAW,KAAK,QAAQ,GAAG;AAC9B,aAAO,CAAC;AAAA,IACV;AAEA,UAAM,YAAY,KAAK,iBAAiB,KAAK,QAAQ;AACrD,UAAM,UAAuB,CAAC;AAE9B,eAAW,YAAY,WAAW;AAChC,YAAM,SAAS,KAAK,WAAW,QAAQ;AACvC,UAAI,QAAQ;AACV,gBAAQ,KAAK,MAAM;AAAA,MACrB;AAAA,IACF;AAEA,WAAO;AAAA,EACT;AAAA,EAEA,WAAW,cAA6C;AACtD,QAAI;AACF,YAAM,MAAM,aAAa,cAAc,OAAO;AAC9C,YAAM,OAAgB,KAAK,MAAM,GAAG;AACpC,YAAM,SAAS,gBAAgB,UAAU,IAAI;AAE7C,UAAI,CAAC,OAAO,SAAS;AACnB,cAAM,MAAM,SAAS,KAAK,UAAU,YAAY;AAChD,gBAAQ;AAAA,UACN,wCAAwC,GAAG,KAAK,OAAO,MAAM,OAAO;AAAA,QACtE;AACA,eAAO;AAAA,MACT;AAEA,aAAO,OAAO;AAAA,IAChB,SAAS,KAAK;AACZ,YAAM,MAAM,SAAS,KAAK,UAAU,YAAY;AAChD,cAAQ;AAAA,QACN,qCAAqC,GAAG,KAAM,IAAc,OAAO;AAAA,MACrE;AACA,aAAO;AAAA,IACT;AAAA,EACF;AAAA,EAEQ,iBAAiB,KAAa,UAAuB,oBAAI,IAAI,GAAa;AAChF,UAAM,UAAU,aAAa,GAAG;AAChC,QAAI,QAAQ,IAAI,OAAO,EAAG,QAAO,CAAC;AAClC,YAAQ,IAAI,OAAO;AAEnB,UAAM,QAAkB,CAAC;AAEzB,eAAW,SAAS,YAAY,KAAK,EAAE,eAAe,KAAK,CAAC,GAAG;AAC7D,UAAI,MAAM,eAAe,EAAG;AAE5B,YAAM,WAAW,KAAK,KAAK,MAAM,IAAI;AACrC,UAAI,MAAM,YAAY,GAAG;AACvB,cAAM,KAAK,GAAG,KAAK,iBAAiB,UAAU,OAAO,CAAC;AAAA,MACxD,WAAW,MAAM,KAAK,SAAS,OAAO,KAAK,MAAM,SAAS,mBAAmB;AAC3E,cAAM,KAAK,QAAQ;AAAA,MACrB;AAAA,IACF;AAEA,WAAO;AAAA,EACT;AACF;","names":[]}
1
+ {"version":3,"sources":["../src/adapters/storage/json-index-reader.ts","../src/core/types.ts"],"sourcesContent":["import { readFileSync, readdirSync, existsSync, realpathSync } from 'node:fs';\nimport { join, relative } from 'node:path';\nimport { FileIndexSchema, type FileIndex } from '../../core/types.js';\n\nexport class JsonIndexReader {\n private readonly indexDir: string;\n\n constructor(ctxoRoot: string) {\n this.indexDir = join(ctxoRoot, 'index');\n }\n\n readAll(): FileIndex[] {\n if (!existsSync(this.indexDir)) {\n return [];\n }\n\n const jsonFiles = this.collectJsonFiles(this.indexDir);\n const results: FileIndex[] = [];\n\n for (const filePath of jsonFiles) {\n const parsed = this.readSingle(filePath);\n if (parsed) {\n results.push(parsed);\n }\n }\n\n return results;\n }\n\n readSingle(absolutePath: string): FileIndex | undefined {\n try {\n const raw = readFileSync(absolutePath, 'utf-8');\n const data: unknown = JSON.parse(raw);\n const result = FileIndexSchema.safeParse(data);\n\n if (!result.success) {\n const rel = relative(this.indexDir, absolutePath);\n console.error(\n `[ctxo:json-reader] Invalid schema in ${rel}: ${result.error.message}`,\n );\n return undefined;\n }\n\n return result.data;\n } catch (err) {\n const rel = relative(this.indexDir, absolutePath);\n console.error(\n `[ctxo:json-reader] Failed to read ${rel}: ${(err as Error).message}`,\n );\n return undefined;\n }\n }\n\n private collectJsonFiles(dir: string, visited: Set<string> = new Set()): string[] {\n const realDir = realpathSync(dir);\n if (visited.has(realDir)) return []; // Guard against symlink loops\n visited.add(realDir);\n\n const files: string[] = [];\n\n for (const entry of readdirSync(dir, { withFileTypes: true })) {\n if (entry.isSymbolicLink()) continue; // Skip symlinks entirely\n\n const fullPath = join(dir, entry.name);\n if (entry.isDirectory()) {\n files.push(...this.collectJsonFiles(fullPath, visited));\n } else if (entry.name.endsWith('.json') && 
entry.name !== 'co-changes.json') {\n files.push(fullPath);\n }\n }\n\n return files;\n }\n}\n","import { z } from 'zod';\n\n// ── Symbol Kinds ────────────────────────────────────────────────\n\nexport const SYMBOL_KINDS = [\n 'function',\n 'class',\n 'interface',\n 'method',\n 'variable',\n 'type',\n] as const;\n\nexport const SymbolKindSchema = z.enum(SYMBOL_KINDS);\nexport type SymbolKind = z.infer<typeof SymbolKindSchema>;\n\n// ── Edge Kinds ──────────────────────────────────────────────────\n\nexport const EDGE_KINDS = [\n 'imports',\n 'calls',\n 'extends',\n 'implements',\n 'uses',\n] as const;\n\nexport const EdgeKindSchema = z.enum(EDGE_KINDS);\nexport type EdgeKind = z.infer<typeof EdgeKindSchema>;\n\n// ── Detail Levels ───────────────────────────────────────────────\n\nexport const DETAIL_LEVELS = [1, 2, 3, 4] as const;\n\nexport const DetailLevelSchema = z.union([\n z.literal(1),\n z.literal(2),\n z.literal(3),\n z.literal(4),\n]);\nexport type DetailLevel = z.infer<typeof DetailLevelSchema>;\n\n// ── Symbol ID ───────────────────────────────────────────────────\n\n/**\n * Format: \"<relativeFile>::<name>::<kind>\"\n * Example: \"src/foo.ts::myFn::function\"\n */\nexport const SymbolIdSchema = z\n .string()\n .min(1)\n .refine(\n (value) => {\n const parts = value.split('::');\n if (parts.length !== 3) return false;\n const [file, name, kind] = parts;\n if (!file || !name || !kind) return false;\n return SymbolKindSchema.safeParse(kind).success;\n },\n { message: 'Symbol ID must match format \"<file>::<name>::<kind>\"' },\n );\nexport type SymbolId = z.infer<typeof SymbolIdSchema>;\n\n// ── Symbol Node ─────────────────────────────────────────────────\n\nexport const SymbolNodeSchema = z\n .object({\n symbolId: SymbolIdSchema,\n name: z.string().min(1),\n kind: SymbolKindSchema,\n startLine: z.number().int().nonnegative(),\n endLine: z.number().int().nonnegative(),\n startOffset: z.number().int().nonnegative().optional(),\n endOffset: 
z.number().int().nonnegative().optional(),\n })\n .refine((node) => node.endLine >= node.startLine, {\n message: 'endLine must be >= startLine',\n });\nexport type SymbolNode = z.infer<typeof SymbolNodeSchema>;\n\n// ── Graph Edge ──────────────────────────────────────────────────\n\nexport const GraphEdgeSchema = z.object({\n from: SymbolIdSchema,\n to: SymbolIdSchema,\n kind: EdgeKindSchema,\n typeOnly: z.boolean().optional(),\n});\nexport type GraphEdge = z.infer<typeof GraphEdgeSchema>;\n\n// ── Commit Intent ───────────────────────────────────────────────\n\nexport const CommitIntentSchema = z.object({\n hash: z.string().min(1),\n message: z.string(),\n date: z.string().min(1),\n kind: z.literal('commit'),\n});\nexport type CommitIntent = z.infer<typeof CommitIntentSchema>;\n\n// ── Anti-Pattern ────────────────────────────────────────────────\n\nexport const AntiPatternSchema = z.object({\n hash: z.string().min(1),\n message: z.string(),\n date: z.string().min(1),\n});\nexport type AntiPattern = z.infer<typeof AntiPatternSchema>;\n\n// ── File Index ──────────────────────────────────────────────────\n\nexport const ComplexityMetricsSchema = z.object({\n symbolId: z.string().min(1),\n cyclomatic: z.number().nonnegative(),\n});\n\nexport const FileIndexSchema = z.object({\n file: z.string().min(1),\n lastModified: z.number().nonnegative(),\n contentHash: z.string().optional(),\n symbols: z.array(SymbolNodeSchema),\n edges: z.array(GraphEdgeSchema),\n complexity: z.array(ComplexityMetricsSchema).optional(),\n intent: z.array(CommitIntentSchema),\n antiPatterns: z.array(AntiPatternSchema),\n});\nexport type FileIndex = z.infer<typeof FileIndexSchema>;\n\n// ── Logic-Slice Result ──────────────────────────────────────────\n\nexport interface LogicSliceResult {\n readonly root: SymbolNode;\n readonly dependencies: readonly SymbolNode[];\n readonly edges: readonly GraphEdge[];\n}\n\n// ── Formatted Slice ─────────────────────────────────────────────\n\nexport 
interface TruncationInfo {\n readonly truncated: true;\n readonly reason: 'token_budget_exceeded';\n}\n\nexport interface FormattedSlice {\n readonly root: SymbolNode;\n readonly dependencies: readonly SymbolNode[];\n readonly edges: readonly GraphEdge[];\n readonly level: DetailLevel;\n readonly levelDescription?: string;\n readonly truncation?: TruncationInfo;\n}\n\n// ── Complexity & Churn ──────────────────────────────────────────\n\nexport interface ComplexityMetrics {\n readonly symbolId: string;\n readonly cyclomatic: number;\n}\n\nexport interface ChurnData {\n readonly filePath: string;\n readonly commitCount: number;\n}\n\nexport const SCORE_BANDS = ['low', 'medium', 'high'] as const;\nexport type ScoreBand = (typeof SCORE_BANDS)[number];\n\nexport interface ChangeIntelligenceScore {\n readonly symbolId: string;\n readonly complexity: number;\n readonly churn: number;\n readonly composite: number;\n readonly band: ScoreBand;\n}\n\n// ── Why-Context Result ──────────────────────────────────────────\n\nexport interface WhyContextResult {\n readonly commitHistory: readonly CommitIntent[];\n readonly antiPatternWarnings: readonly AntiPattern[];\n readonly changeIntelligence?: ChangeIntelligenceScore;\n}\n\n// ── Co-Change Analysis ─────────────────────────────────────────\n\nexport interface CoChangeEntry {\n readonly file1: string;\n readonly file2: string;\n readonly sharedCommits: number;\n readonly frequency: number;\n}\n\nexport interface CoChangeMatrix {\n readonly version: 1;\n readonly timestamp: number;\n readonly entries: readonly CoChangeEntry[];\n}\n\n// ── Commit Record (from git adapter) ────────────────────────────\n\nexport interface CommitRecord {\n readonly hash: string;\n readonly message: string;\n readonly date: string;\n readonly author: string;\n}\n\n// ── Blame Line (from git adapter) ───────────────────────────────\n\nexport interface BlameLine {\n readonly hash: string;\n readonly lineNumber: number;\n readonly author: string;\n 
readonly date: string;\n}\n\n// ── Search Types ───────────────────────────────────────────────\n\nexport interface SearchResult {\n symbolId: string;\n name: string;\n kind: string;\n filePath: string;\n /** BM25 relevance score (higher = more relevant) */\n relevanceScore: number;\n /** PageRank importance score (0-1) */\n importanceScore: number;\n /** Final combined score after all boosts */\n combinedScore: number;\n}\n\nexport interface SearchMetrics {\n porterHits: number;\n trigramHits: number;\n phase2Activated: boolean;\n fuzzyApplied: boolean;\n latencyMs: number;\n}\n\nexport interface FuzzyCorrection {\n originalQuery: string;\n correctedQuery: string;\n corrections: Array<{\n original: string;\n corrected: string;\n distance: number;\n }>;\n}\n\nexport interface SearchResponse {\n query: string;\n results: SearchResult[];\n metrics: SearchMetrics;\n fuzzyCorrection?: FuzzyCorrection;\n}\n"],"mappings":";;;AAAA,SAAS,cAAc,aAAa,YAAY,oBAAoB;AACpE,SAAS,MAAM,gBAAgB;;;ACD/B,SAAS,SAAS;AAIX,IAAM,eAAe;AAAA,EAC1B;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AACF;AAEO,IAAM,mBAAmB,EAAE,KAAK,YAAY;AAK5C,IAAM,aAAa;AAAA,EACxB;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AACF;AAEO,IAAM,iBAAiB,EAAE,KAAK,UAAU;AAOxC,IAAM,oBAAoB,EAAE,MAAM;AAAA,EACvC,EAAE,QAAQ,CAAC;AAAA,EACX,EAAE,QAAQ,CAAC;AAAA,EACX,EAAE,QAAQ,CAAC;AAAA,EACX,EAAE,QAAQ,CAAC;AACb,CAAC;AASM,IAAM,iBAAiB,EAC3B,OAAO,EACP,IAAI,CAAC,EACL;AAAA,EACC,CAAC,UAAU;AACT,UAAM,QAAQ,MAAM,MAAM,IAAI;AAC9B,QAAI,MAAM,WAAW,EAAG,QAAO;AAC/B,UAAM,CAAC,MAAM,MAAM,IAAI,IAAI;AAC3B,QAAI,CAAC,QAAQ,CAAC,QAAQ,CAAC,KAAM,QAAO;AACpC,WAAO,iBAAiB,UAAU,IAAI,EAAE;AAAA,EAC1C;AAAA,EACA,EAAE,SAAS,uDAAuD;AACpE;AAKK,IAAM,mBAAmB,EAC7B,OAAO;AAAA,EACN,UAAU;AAAA,EACV,MAAM,EAAE,OAAO,EAAE,IAAI,CAAC;AAAA,EACtB,MAAM;AAAA,EACN,WAAW,EAAE,OAAO,EAAE,IAAI,EAAE,YAAY;AAAA,EACxC,SAAS,EAAE,OAAO,EAAE,IAAI,EAAE,YAAY;AAAA,EACtC,aAAa,EAAE,OAAO,EAAE,IAAI,EAAE,YAAY,EAAE,SAAS;AAAA,EACrD,WAAW,EAAE,OAAO,EAAE,IAAI,EAAE,YAAY,EAAE,SAAS;AACrD,CAAC,EACA,OAAO,CAAC,SAAS,KAAK,WAAW,KAAK,WA
AW;AAAA,EAChD,SAAS;AACX,CAAC;AAKI,IAAM,kBAAkB,EAAE,OAAO;AAAA,EACtC,MAAM;AAAA,EACN,IAAI;AAAA,EACJ,MAAM;AAAA,EACN,UAAU,EAAE,QAAQ,EAAE,SAAS;AACjC,CAAC;AAKM,IAAM,qBAAqB,EAAE,OAAO;AAAA,EACzC,MAAM,EAAE,OAAO,EAAE,IAAI,CAAC;AAAA,EACtB,SAAS,EAAE,OAAO;AAAA,EAClB,MAAM,EAAE,OAAO,EAAE,IAAI,CAAC;AAAA,EACtB,MAAM,EAAE,QAAQ,QAAQ;AAC1B,CAAC;AAKM,IAAM,oBAAoB,EAAE,OAAO;AAAA,EACxC,MAAM,EAAE,OAAO,EAAE,IAAI,CAAC;AAAA,EACtB,SAAS,EAAE,OAAO;AAAA,EAClB,MAAM,EAAE,OAAO,EAAE,IAAI,CAAC;AACxB,CAAC;AAKM,IAAM,0BAA0B,EAAE,OAAO;AAAA,EAC9C,UAAU,EAAE,OAAO,EAAE,IAAI,CAAC;AAAA,EAC1B,YAAY,EAAE,OAAO,EAAE,YAAY;AACrC,CAAC;AAEM,IAAM,kBAAkB,EAAE,OAAO;AAAA,EACtC,MAAM,EAAE,OAAO,EAAE,IAAI,CAAC;AAAA,EACtB,cAAc,EAAE,OAAO,EAAE,YAAY;AAAA,EACrC,aAAa,EAAE,OAAO,EAAE,SAAS;AAAA,EACjC,SAAS,EAAE,MAAM,gBAAgB;AAAA,EACjC,OAAO,EAAE,MAAM,eAAe;AAAA,EAC9B,YAAY,EAAE,MAAM,uBAAuB,EAAE,SAAS;AAAA,EACtD,QAAQ,EAAE,MAAM,kBAAkB;AAAA,EAClC,cAAc,EAAE,MAAM,iBAAiB;AACzC,CAAC;;;ADxHM,IAAM,kBAAN,MAAsB;AAAA,EACV;AAAA,EAEjB,YAAY,UAAkB;AAC5B,SAAK,WAAW,KAAK,UAAU,OAAO;AAAA,EACxC;AAAA,EAEA,UAAuB;AACrB,QAAI,CAAC,WAAW,KAAK,QAAQ,GAAG;AAC9B,aAAO,CAAC;AAAA,IACV;AAEA,UAAM,YAAY,KAAK,iBAAiB,KAAK,QAAQ;AACrD,UAAM,UAAuB,CAAC;AAE9B,eAAW,YAAY,WAAW;AAChC,YAAM,SAAS,KAAK,WAAW,QAAQ;AACvC,UAAI,QAAQ;AACV,gBAAQ,KAAK,MAAM;AAAA,MACrB;AAAA,IACF;AAEA,WAAO;AAAA,EACT;AAAA,EAEA,WAAW,cAA6C;AACtD,QAAI;AACF,YAAM,MAAM,aAAa,cAAc,OAAO;AAC9C,YAAM,OAAgB,KAAK,MAAM,GAAG;AACpC,YAAM,SAAS,gBAAgB,UAAU,IAAI;AAE7C,UAAI,CAAC,OAAO,SAAS;AACnB,cAAM,MAAM,SAAS,KAAK,UAAU,YAAY;AAChD,gBAAQ;AAAA,UACN,wCAAwC,GAAG,KAAK,OAAO,MAAM,OAAO;AAAA,QACtE;AACA,eAAO;AAAA,MACT;AAEA,aAAO,OAAO;AAAA,IAChB,SAAS,KAAK;AACZ,YAAM,MAAM,SAAS,KAAK,UAAU,YAAY;AAChD,cAAQ;AAAA,QACN,qCAAqC,GAAG,KAAM,IAAc,OAAO;AAAA,MACrE;AACA,aAAO;AAAA,IACT;AAAA,EACF;AAAA,EAEQ,iBAAiB,KAAa,UAAuB,oBAAI,IAAI,GAAa;AAChF,UAAM,UAAU,aAAa,GAAG;AAChC,QAAI,QAAQ,IAAI,OAAO,EAAG,QAAO,CAAC;AAClC,YAAQ,IAAI,OAAO;AAEnB,UAAM,QAAkB,CAAC;AAEzB,eAAW,SAAS,YAAY,KAAK,EAAE,eAAe,KAAK,CAAC,GAAG;AAC7D,UAAI,MAAM,eAAe,EAAG;AAE5B,YAAM,WAAW,KAAK,KAAK,MAAM,IAAI;AACrC,UAAI,MAAM,YAAY,GA
AG;AACvB,cAAM,KAAK,GAAG,KAAK,iBAAiB,UAAU,OAAO,CAAC;AAAA,MACxD,WAAW,MAAM,KAAK,SAAS,OAAO,KAAK,MAAM,SAAS,mBAAmB;AAC3E,cAAM,KAAK,QAAQ;AAAA,MACrB;AAAA,IACF;AAEA,WAAO;AAAA,EACT;AACF;","names":[]}
@@ -1,7 +1,7 @@
 #!/usr/bin/env node
 import {
   JsonIndexReader
-} from "./chunk-N6GPODUY.js";
+} from "./chunk-4OI75JDS.js";
 
 // src/core/logger.ts
 function isDebugEnabled(namespace) {
@@ -68,15 +68,29 @@ var SqliteStorageAdapter = class {
   }
   async initEmpty() {
     const SQL = await initSqlJs();
-    this.db = new SQL.Database();
+    if (existsSync(this.dbPath)) {
+      try {
+        const buffer = readFileSync(this.dbPath);
+        this.db = new SQL.Database(buffer);
+      } catch {
+        this.db = new SQL.Database();
+      }
+    } else {
+      this.db = new SQL.Database();
+    }
     this.createTables();
+    this.clearIndexTables();
   }
-  database() {
+  /** Returns the underlying sql.js Database instance. Throws if not initialized. */
+  getDb() {
     if (!this.db) {
       throw new Error("SqliteStorageAdapter not initialized. Call init() first.");
     }
     return this.db;
   }
+  database() {
+    return this.getDb();
+  }
   verifyIntegrity() {
     const db = this.database();
     const result = db.exec("PRAGMA integrity_check");
@@ -121,6 +135,12 @@ var SqliteStorageAdapter = class {
     db.run("CREATE INDEX IF NOT EXISTS idx_edges_from ON edges(from_symbol)");
     db.run("CREATE INDEX IF NOT EXISTS idx_edges_to ON edges(to_symbol)");
   }
+  clearIndexTables() {
+    const db = this.database();
+    db.run("DELETE FROM edges");
+    db.run("DELETE FROM symbols");
+    db.run("DELETE FROM files");
+  }
   rebuildFromJson() {
     const reader = new JsonIndexReader(this.ctxoRoot);
     const indices = reader.readAll();
@@ -371,12 +391,12 @@ var SimpleGitAdapter = class {
   }
   async getCommitHistory(filePath, maxCount) {
     try {
-      const log3 = await this.git.log({
+      const log4 = await this.git.log({
        file: filePath,
        "--follow": null,
        ...maxCount ? { maxCount } : {}
      });
-      return log3.all.map((entry) => ({
+      return log4.all.map((entry) => ({
        hash: entry.hash,
        message: entry.message,
        date: entry.date,
@@ -389,9 +409,9 @@ var SimpleGitAdapter = class {
   }
   async getBatchHistory(maxCount = 20) {
     try {
-      const log3 = await this.git.log({ maxCount: maxCount * 50, "--name-only": null });
+      const log4 = await this.git.log({ maxCount: maxCount * 50, "--name-only": null });
      const result = /* @__PURE__ */ new Map();
-      for (const entry of log3.all) {
+      for (const entry of log4.all) {
        const record = {
          hash: entry.hash,
          message: entry.message,
@@ -428,10 +448,10 @@ var SimpleGitAdapter = class {
   }
   async getFileChurn(filePath) {
     try {
-      const log3 = await this.git.log({ file: filePath, "--follow": null });
+      const log4 = await this.git.log({ file: filePath, "--follow": null });
      return {
        filePath,
-        commitCount: log3.total
+        commitCount: log4.total
      };
    } catch (err) {
      log2.error(` Failed to get churn for ${filePath}: ${err.message}`);
@@ -440,6 +460,126 @@ var SimpleGitAdapter = class {
   }
 };
 
+// src/adapters/stats/session-recorder-adapter.ts
+var log3 = createLogger("ctxo:stats");
+var SessionRecorderAdapter = class {
+  db;
+  onWrite;
+  tableCreated = false;
+  constructor(db, onWrite) {
+    this.db = db;
+    this.onWrite = onWrite ?? null;
+  }
+  ensureTable() {
+    if (this.tableCreated) return;
+    this.db.run(`
+      CREATE TABLE IF NOT EXISTS session_events (
+        id INTEGER PRIMARY KEY AUTOINCREMENT,
+        timestamp TEXT DEFAULT (datetime('now')),
+        tool TEXT NOT NULL,
+        symbol_id TEXT,
+        detail_level TEXT,
+        response_tokens INTEGER,
+        response_bytes INTEGER,
+        latency_ms REAL,
+        truncated BOOLEAN DEFAULT 0
+      )
+    `);
+    this.db.run("CREATE INDEX IF NOT EXISTS idx_session_timestamp ON session_events(timestamp)");
+    this.db.run("CREATE INDEX IF NOT EXISTS idx_session_tool ON session_events(tool)");
+    this.tableCreated = true;
+  }
+  record(event) {
+    try {
+      this.ensureTable();
+      this.db.run(
+        `INSERT INTO session_events (tool, symbol_id, detail_level, response_tokens, response_bytes, latency_ms, truncated)
+         VALUES (?, ?, ?, ?, ?, ?, ?)`,
+        [
+          event.tool,
+          event.symbolId,
+          event.detailLevel,
+          event.responseTokens,
+          event.responseBytes,
+          event.latencyMs,
+          event.truncated ? 1 : 0
+        ]
+      );
+      this.flushToDisk();
+    } catch (err) {
+      log3.error(`Failed to record session event: ${err.message}`);
+    }
+  }
+  queryStats(days) {
+    this.ensureTable();
+    const whereClause = days != null && days > 0 ? `WHERE timestamp >= datetime('now', '-${Math.floor(days)} days')` : "";
+    const summaryResult = this.db.exec(
+      `SELECT COUNT(*) as total_calls, COALESCE(SUM(response_tokens), 0) as total_tokens
+       FROM session_events ${whereClause}`
+    );
+    const summaryRow = summaryResult[0]?.values[0];
+    const totalCalls = Number(summaryRow?.[0] ?? 0);
+    const totalTokensServed = Number(summaryRow?.[1] ?? 0);
+    const toolsResult = this.db.exec(
+      `SELECT tool, COUNT(*) as calls, CAST(AVG(response_tokens) AS INTEGER) as avg_tokens
+       FROM session_events ${whereClause}
+       GROUP BY tool ORDER BY calls DESC LIMIT 5`
+    );
+    const topTools = (toolsResult[0]?.values ?? []).map((row) => ({
+      tool: String(row[0]),
+      calls: Number(row[1]),
+      avgTokens: Number(row[2])
+    }));
+    const symbolWherePrefix = whereClause ? whereClause + " AND" : "WHERE";
+    const symbolsResult = this.db.exec(
+      `SELECT symbol_id, COUNT(*) as queries
+       FROM session_events ${symbolWherePrefix} symbol_id IS NOT NULL
+       GROUP BY symbol_id ORDER BY queries DESC LIMIT 5`
+    );
+    const topSymbols = (symbolsResult[0]?.values ?? []).map((row) => {
+      const symbolId = String(row[0]);
+      const parts = symbolId.split("::");
+      const name = parts.length >= 2 ? parts[1] : symbolId;
+      return {
+        symbolId,
+        name,
+        queries: Number(row[1])
+      };
+    });
+    const levelResult = this.db.exec(
+      `SELECT detail_level, COUNT(*) as count
+       FROM session_events ${symbolWherePrefix} detail_level IS NOT NULL
+       GROUP BY detail_level ORDER BY detail_level`
+    );
+    const levelRows = levelResult[0]?.values ?? [];
+    const levelTotal = levelRows.reduce((sum, row) => sum + Number(row[1]), 0);
+    const detailLevelDistribution = levelRows.map((row) => ({
+      level: String(row[0]),
+      count: Number(row[1]),
+      percentage: levelTotal > 0 ? Math.round(Number(row[1]) / levelTotal * 100) : 0
+    }));
+    return {
+      totalCalls,
+      totalTokensServed,
+      topTools,
+      topSymbols,
+      detailLevelDistribution
+    };
+  }
+  clear() {
+    this.ensureTable();
+    this.db.run("DELETE FROM session_events");
+    this.flushToDisk();
+  }
+  flushToDisk() {
+    try {
+      this.onWrite?.();
+    } catch (err) {
+      log3.error(`Failed to flush stats to disk: ${err.message}`);
+    }
+  }
+};
+
 // src/core/why-context/revert-detector.ts
 var REVERT_QUOTED_PATTERN = /^Revert "(.+)"$/;
 var REVERT_PREFIX_PATTERN = /^revert:\s*(.+)$/i;
@@ -556,6 +696,7 @@ export {
   RevertDetector,
   SimpleGitAdapter,
   aggregateCoChanges,
-  loadCoChangeMap
+  loadCoChangeMap,
+  SessionRecorderAdapter
 };
-//# sourceMappingURL=chunk-XSHNN6PU.js.map
+//# sourceMappingURL=chunk-WZKXGKKI.js.map
@@ -0,0 +1 @@
+ {"version":3,"sources":["../src/core/logger.ts","../src/adapters/storage/sqlite-storage-adapter.ts","../src/adapters/git/simple-git-adapter.ts","../src/adapters/stats/session-recorder-adapter.ts","../src/core/why-context/revert-detector.ts","../src/core/co-change/co-change-analyzer.ts"],"sourcesContent":["/**\n * Structured logger for Ctxo — all output goes to stderr (stdout reserved for MCP JSON-RPC).\n *\n * Supports debug mode via DEBUG environment variable:\n * DEBUG=ctxo:* — all debug output\n * DEBUG=ctxo:git — only git adapter debug\n * DEBUG=ctxo:index — only indexing debug\n * DEBUG=ctxo:mcp — only MCP handler debug\n * DEBUG=ctxo:storage — only storage adapter debug\n * DEBUG=ctxo:masking — only masking pipeline debug\n *\n * Always outputs: info, warn, error (regardless of DEBUG).\n * Only outputs debug when matching namespace is enabled.\n */\n\nfunction isDebugEnabled(namespace: string): boolean {\n const debugEnv = process.env['DEBUG'] ?? '';\n if (!debugEnv) return false;\n const patterns = debugEnv.split(',').map(p => p.trim());\n return patterns.some(p => {\n if (p === '*' || p === 'ctxo:*') return true;\n if (p === namespace) return true;\n if (p.endsWith(':*') && namespace.startsWith(p.slice(0, -1))) return true;\n return false;\n });\n}\n\nexport interface Logger {\n debug(message: string, ...args: unknown[]): void;\n info(message: string, ...args: unknown[]): void;\n warn(message: string, ...args: unknown[]): void;\n error(message: string, ...args: unknown[]): void;\n}\n\nexport function createLogger(namespace: string): Logger {\n const prefix = `[${namespace}]`;\n\n return {\n debug(message: string, ...args: unknown[]) {\n if (isDebugEnabled(namespace)) {\n console.error(`${prefix} DEBUG ${message}`, ...args);\n }\n },\n info(message: string, ...args: unknown[]) {\n console.error(`${prefix} ${message}`, ...args);\n },\n warn(message: string, ...args: unknown[]) {\n console.error(`${prefix} WARN ${message}`, ...args);\n },\n error(message: 
string, ...args: unknown[]) {\n console.error(`${prefix} ERROR ${message}`, ...args);\n },\n };\n}\n","import initSqlJs, { type Database } from 'sql.js';\nimport { readFileSync, writeFileSync, existsSync, mkdirSync } from 'node:fs';\nimport { join, dirname } from 'node:path';\nimport type { FileIndex, GraphEdge, SymbolNode } from '../../core/types.js';\nimport type { IStoragePort } from '../../ports/i-storage-port.js';\nimport { JsonIndexReader } from './json-index-reader.js';\nimport { createLogger } from '../../core/logger.js';\n\nconst log = createLogger('ctxo:storage');\n\nexport class SqliteStorageAdapter implements IStoragePort {\n private db: Database | undefined;\n private readonly dbPath: string;\n private readonly ctxoRoot: string;\n\n constructor(ctxoRoot: string) {\n this.ctxoRoot = ctxoRoot;\n this.dbPath = join(ctxoRoot, '.cache', 'symbols.db');\n }\n\n async init(): Promise<void> {\n const SQL = await initSqlJs();\n\n if (existsSync(this.dbPath)) {\n try {\n const buffer = readFileSync(this.dbPath);\n this.db = new SQL.Database(buffer);\n this.verifyIntegrity();\n } catch {\n log.warn('Corrupt DB detected, rebuilding from JSON index');\n this.db = new SQL.Database();\n this.createTables();\n }\n } else {\n this.db = new SQL.Database();\n this.createTables();\n }\n\n // Always rebuild from JSON — it is the source of truth\n this.rebuildFromJson();\n }\n\n async initEmpty(): Promise<void> {\n const SQL = await initSqlJs();\n if (existsSync(this.dbPath)) {\n try {\n const buffer = readFileSync(this.dbPath);\n this.db = new SQL.Database(buffer);\n } catch {\n this.db = new SQL.Database();\n }\n } else {\n this.db = new SQL.Database();\n }\n this.createTables();\n this.clearIndexTables();\n }\n\n /** Returns the underlying sql.js Database instance. Throws if not initialized. */\n getDb(): Database {\n if (!this.db) {\n throw new Error('SqliteStorageAdapter not initialized. 
Call init() first.');\n }\n return this.db;\n }\n\n private database(): Database {\n return this.getDb();\n }\n\n private verifyIntegrity(): void {\n const db = this.database();\n const result = db.exec('PRAGMA integrity_check');\n const firstRow = result[0]?.values[0];\n if (!firstRow || firstRow[0] !== 'ok') {\n throw new Error('SQLite integrity check failed');\n }\n\n const tables = db.exec(\n \"SELECT name FROM sqlite_master WHERE type='table' AND name IN ('symbols', 'edges', 'files')\",\n );\n if (!tables[0] || tables[0].values.length < 3) {\n throw new Error('Missing required tables');\n }\n }\n\n private createTables(): void {\n const db = this.database();\n db.run(`\n CREATE TABLE IF NOT EXISTS files (\n file_path TEXT PRIMARY KEY,\n last_modified INTEGER NOT NULL\n )\n `);\n db.run(`\n CREATE TABLE IF NOT EXISTS symbols (\n symbol_id TEXT PRIMARY KEY,\n name TEXT NOT NULL,\n kind TEXT NOT NULL,\n file_path TEXT NOT NULL,\n start_line INTEGER NOT NULL,\n end_line INTEGER NOT NULL\n )\n `);\n db.run(`\n CREATE TABLE IF NOT EXISTS edges (\n id INTEGER PRIMARY KEY AUTOINCREMENT,\n from_symbol TEXT NOT NULL,\n to_symbol TEXT NOT NULL,\n kind TEXT NOT NULL\n )\n `);\n db.run('CREATE INDEX IF NOT EXISTS idx_symbols_file ON symbols(file_path)');\n db.run('CREATE INDEX IF NOT EXISTS idx_edges_from ON edges(from_symbol)');\n db.run('CREATE INDEX IF NOT EXISTS idx_edges_to ON edges(to_symbol)');\n }\n\n private clearIndexTables(): void {\n const db = this.database();\n db.run('DELETE FROM edges');\n db.run('DELETE FROM symbols');\n db.run('DELETE FROM files');\n }\n\n private rebuildFromJson(): void {\n const reader = new JsonIndexReader(this.ctxoRoot);\n const indices = reader.readAll();\n\n if (indices.length > 0) {\n this.bulkWrite(indices);\n }\n }\n\n writeSymbolFile(fileIndex: FileIndex): void {\n const db = this.database();\n\n db.run('BEGIN TRANSACTION');\n try {\n this.deleteFileData(db, fileIndex.file);\n\n db.run(\n 'INSERT INTO files (file_path, 
last_modified) VALUES (?, ?)',\n [fileIndex.file, fileIndex.lastModified],\n );\n\n for (const sym of fileIndex.symbols) {\n db.run(\n 'INSERT OR REPLACE INTO symbols (symbol_id, name, kind, file_path, start_line, end_line) VALUES (?, ?, ?, ?, ?, ?)',\n [sym.symbolId, sym.name, sym.kind, fileIndex.file, sym.startLine, sym.endLine],\n );\n }\n\n for (const edge of fileIndex.edges) {\n db.run(\n 'INSERT INTO edges (from_symbol, to_symbol, kind) VALUES (?, ?, ?)',\n [edge.from, edge.to, edge.kind],\n );\n }\n\n db.run('COMMIT');\n this.persistIfNeeded();\n } catch (err) {\n db.run('ROLLBACK');\n throw err;\n }\n }\n\n /**\n * Read symbol file from SQLite cache.\n * NOTE: SQLite only caches symbols + edges (graph topology).\n * intent, antiPatterns, and complexity live in committed JSON index only.\n * Use JsonIndexReader for full FileIndex data.\n */\n readSymbolFile(relativePath: string): FileIndex | undefined {\n const db = this.database();\n\n const fileResult = db.exec(\n 'SELECT file_path, last_modified FROM files WHERE file_path = ?',\n [relativePath],\n );\n if (!fileResult[0] || fileResult[0].values.length === 0) {\n return undefined;\n }\n\n const [filePath, lastModified] = fileResult[0].values[0] as [string, number];\n\n const symbols = this.getSymbolsForFile(db, filePath);\n const edges = this.getEdgesForFile(db, filePath);\n\n return {\n file: filePath,\n lastModified,\n symbols,\n edges,\n // Not stored in SQLite — use JsonIndexReader for these fields\n intent: [],\n antiPatterns: [],\n };\n }\n\n listIndexedFiles(): string[] {\n const db = this.database();\n const result = db.exec('SELECT file_path FROM files ORDER BY file_path');\n if (!result[0]) return [];\n return result[0].values.map((row) => row[0] as string);\n }\n\n deleteSymbolFile(relativePath: string): void {\n const db = this.database();\n this.deleteFileData(db, relativePath);\n }\n\n getSymbolById(symbolId: string): SymbolNode | undefined {\n const db = this.database();\n const result = 
db.exec(\n 'SELECT symbol_id, name, kind, start_line, end_line FROM symbols WHERE symbol_id = ?',\n [symbolId],\n );\n if (!result[0] || result[0].values.length === 0) {\n return undefined;\n }\n\n const [sid, name, kind, startLine, endLine] = result[0].values[0] as [string, string, string, number, number];\n return { symbolId: sid, name, kind: kind as SymbolNode['kind'], startLine, endLine };\n }\n\n getEdgesFrom(symbolId: string): GraphEdge[] {\n const db = this.database();\n const result = db.exec(\n 'SELECT from_symbol, to_symbol, kind FROM edges WHERE from_symbol = ?',\n [symbolId],\n );\n if (!result[0]) return [];\n return result[0].values.map((row) => ({\n from: row[0] as string,\n to: row[1] as string,\n kind: row[2] as GraphEdge['kind'],\n }));\n }\n\n getEdgesTo(symbolId: string): GraphEdge[] {\n const db = this.database();\n const result = db.exec(\n 'SELECT from_symbol, to_symbol, kind FROM edges WHERE to_symbol = ?',\n [symbolId],\n );\n if (!result[0]) return [];\n return result[0].values.map((row) => ({\n from: row[0] as string,\n to: row[1] as string,\n kind: row[2] as GraphEdge['kind'],\n }));\n }\n\n getAllSymbols(): SymbolNode[] {\n const db = this.database();\n const result = db.exec(\n 'SELECT symbol_id, name, kind, start_line, end_line FROM symbols',\n );\n if (!result[0]) return [];\n return result[0].values.map((row) => ({\n symbolId: row[0] as string,\n name: row[1] as string,\n kind: row[2] as SymbolNode['kind'],\n startLine: row[3] as number,\n endLine: row[4] as number,\n }));\n }\n\n getAllEdges(): GraphEdge[] {\n const db = this.database();\n const result = db.exec(\n 'SELECT from_symbol, to_symbol, kind FROM edges',\n );\n if (!result[0]) return [];\n return result[0].values.map((row) => ({\n from: row[0] as string,\n to: row[1] as string,\n kind: row[2] as GraphEdge['kind'],\n }));\n }\n\n bulkWrite(indices: FileIndex[]): void {\n const db = this.database();\n\n db.run('BEGIN TRANSACTION');\n try {\n for (const fileIndex of indices) 
{\n this.deleteFileData(db, fileIndex.file);\n\n db.run(\n 'INSERT INTO files (file_path, last_modified) VALUES (?, ?)',\n [fileIndex.file, fileIndex.lastModified],\n );\n\n for (const sym of fileIndex.symbols) {\n db.run(\n 'INSERT OR REPLACE INTO symbols (symbol_id, name, kind, file_path, start_line, end_line) VALUES (?, ?, ?, ?, ?, ?)',\n [sym.symbolId, sym.name, sym.kind, fileIndex.file, sym.startLine, sym.endLine],\n );\n }\n\n for (const edge of fileIndex.edges) {\n db.run(\n 'INSERT INTO edges (from_symbol, to_symbol, kind) VALUES (?, ?, ?)',\n [edge.from, edge.to, edge.kind],\n );\n }\n }\n\n db.run('COMMIT');\n this.persistIfNeeded();\n } catch (err) {\n db.run('ROLLBACK');\n throw err;\n }\n }\n\n persist(): void {\n const db = this.database();\n const data = db.export();\n const buffer = Buffer.from(data);\n mkdirSync(dirname(this.dbPath), { recursive: true });\n writeFileSync(this.dbPath, buffer);\n }\n\n close(): void {\n if (this.db) {\n this.persist();\n this.db.close();\n this.db = undefined;\n }\n }\n\n private persistIfNeeded(): void {\n try {\n this.persist();\n } catch (err) {\n log.error(`Failed to persist DB: ${(err as Error).message}`);\n }\n }\n\n private deleteFileData(db: Database, filePath: string): void {\n // Delete edges originating from this file's symbols\n const symResult = db.exec(\n 'SELECT symbol_id FROM symbols WHERE file_path = ?',\n [filePath],\n );\n if (symResult[0]) {\n for (const row of symResult[0].values) {\n db.run('DELETE FROM edges WHERE from_symbol = ?', [row[0]]);\n }\n }\n db.run('DELETE FROM symbols WHERE file_path = ?', [filePath]);\n db.run('DELETE FROM files WHERE file_path = ?', [filePath]);\n }\n\n private getSymbolsForFile(db: Database, filePath: string): SymbolNode[] {\n const result = db.exec(\n 'SELECT symbol_id, name, kind, start_line, end_line FROM symbols WHERE file_path = ? 
ORDER BY start_line',\n [filePath],\n );\n if (!result[0]) return [];\n return result[0].values.map((row) => ({\n symbolId: row[0] as string,\n name: row[1] as string,\n kind: row[2] as SymbolNode['kind'],\n startLine: row[3] as number,\n endLine: row[4] as number,\n }));\n }\n\n private getEdgesForFile(db: Database, filePath: string): GraphEdge[] {\n const result = db.exec(\n `SELECT e.from_symbol, e.to_symbol, e.kind FROM edges e\n INNER JOIN symbols s ON e.from_symbol = s.symbol_id\n WHERE s.file_path = ?`,\n [filePath],\n );\n if (!result[0]) return [];\n return result[0].values.map((row) => ({\n from: row[0] as string,\n to: row[1] as string,\n kind: row[2] as GraphEdge['kind'],\n }));\n }\n}\n","import { simpleGit, type SimpleGit } from 'simple-git';\nimport type { IGitPort } from '../../ports/i-git-port.js';\nimport type { CommitRecord, ChurnData } from '../../core/types.js';\nimport { createLogger } from '../../core/logger.js';\n\nconst log = createLogger('ctxo:git');\n\nexport class SimpleGitAdapter implements IGitPort {\n private readonly git: SimpleGit;\n\n constructor(projectRoot: string) {\n this.git = simpleGit(projectRoot);\n log.debug('Initialized with root: %s', projectRoot);\n }\n\n async isAvailable(): Promise<boolean> {\n try {\n await this.git.version();\n return true;\n } catch {\n return false;\n }\n }\n\n async getCommitHistory(filePath: string, maxCount?: number): Promise<CommitRecord[]> {\n try {\n const log = await this.git.log({\n file: filePath,\n '--follow': null,\n ...(maxCount ? 
{ maxCount } : {}),\n });\n\n return log.all.map((entry) => ({\n hash: entry.hash,\n message: entry.message,\n date: entry.date,\n author: entry.author_name,\n }));\n } catch (err) {\n log.error(` Failed to get history for ${filePath}: ${(err as Error).message}`);\n return [];\n }\n }\n\n async getBatchHistory(maxCount = 20): Promise<Map<string, CommitRecord[]>> {\n try {\n const log = await this.git.log({ maxCount: maxCount * 50, '--name-only': null });\n const result = new Map<string, CommitRecord[]>();\n\n for (const entry of log.all) {\n const record: CommitRecord = {\n hash: entry.hash,\n message: entry.message,\n date: entry.date,\n author: entry.author_name,\n };\n\n const diff = entry.diff;\n const files: string[] = diff?.files?.map((f: { file: string }) => f.file.replace(/\\\\/g, '/')) ?? [];\n\n for (const file of files) {\n let list = result.get(file);\n if (!list) { list = []; result.set(file, list); }\n if (list.length < maxCount) {\n list.push(record);\n }\n }\n }\n\n return result;\n } catch (err) {\n log.error(` Batch history failed: ${(err as Error).message}`);\n return new Map();\n }\n }\n\n async getChangedFiles(since: string): Promise<string[]> {\n try {\n const diff = await this.git.diffSummary([since]);\n return diff.files.map((f) => f.file.replace(/\\\\/g, '/'));\n } catch (err) {\n log.error(` Failed to get changed files since ${since}: ${(err as Error).message}`);\n return [];\n }\n }\n\n async getFileChurn(filePath: string): Promise<ChurnData> {\n try {\n const log = await this.git.log({ file: filePath, '--follow': null });\n\n return {\n filePath,\n commitCount: log.total,\n };\n } catch (err) {\n log.error(` Failed to get churn for ${filePath}: ${(err as Error).message}`);\n return { filePath, commitCount: 0 };\n }\n }\n}\n","import type { Database } from 'sql.js';\nimport type {\n ISessionRecorderPort,\n SessionEvent,\n AggregatedStats,\n ToolStats,\n SymbolStats,\n DetailLevelStats,\n} from 
'../../ports/i-session-recorder-port.js';\nimport { createLogger } from '../../core/logger.js';\n\nconst log = createLogger('ctxo:stats');\n\nexport class SessionRecorderAdapter implements ISessionRecorderPort {\n private readonly db: Database;\n private readonly onWrite: (() => void) | null;\n private tableCreated = false;\n\n constructor(db: Database, onWrite?: () => void) {\n this.db = db;\n this.onWrite = onWrite ?? null;\n }\n\n private ensureTable(): void {\n if (this.tableCreated) return;\n this.db.run(`\n CREATE TABLE IF NOT EXISTS session_events (\n id INTEGER PRIMARY KEY AUTOINCREMENT,\n timestamp TEXT DEFAULT (datetime('now')),\n tool TEXT NOT NULL,\n symbol_id TEXT,\n detail_level TEXT,\n response_tokens INTEGER,\n response_bytes INTEGER,\n latency_ms REAL,\n truncated BOOLEAN DEFAULT 0\n )\n `);\n this.db.run('CREATE INDEX IF NOT EXISTS idx_session_timestamp ON session_events(timestamp)');\n this.db.run('CREATE INDEX IF NOT EXISTS idx_session_tool ON session_events(tool)');\n this.tableCreated = true;\n }\n\n record(event: SessionEvent): void {\n try {\n this.ensureTable();\n this.db.run(\n `INSERT INTO session_events (tool, symbol_id, detail_level, response_tokens, response_bytes, latency_ms, truncated)\n VALUES (?, ?, ?, ?, ?, ?, ?)`,\n [\n event.tool,\n event.symbolId,\n event.detailLevel,\n event.responseTokens,\n event.responseBytes,\n event.latencyMs,\n event.truncated ? 1 : 0,\n ],\n );\n this.flushToDisk();\n } catch (err) {\n log.error(`Failed to record session event: ${(err as Error).message}`);\n }\n }\n\n queryStats(days?: number): AggregatedStats {\n this.ensureTable();\n\n const whereClause = days != null && days > 0\n ? 
`WHERE timestamp >= datetime('now', '-${Math.floor(days)} days')`\n : '';\n\n // Total calls and tokens\n const summaryResult = this.db.exec(\n `SELECT COUNT(*) as total_calls, COALESCE(SUM(response_tokens), 0) as total_tokens\n FROM session_events ${whereClause}`,\n );\n const summaryRow = summaryResult[0]?.values[0];\n const totalCalls = Number(summaryRow?.[0] ?? 0);\n const totalTokensServed = Number(summaryRow?.[1] ?? 0);\n\n // Top tools\n const toolsResult = this.db.exec(\n `SELECT tool, COUNT(*) as calls, CAST(AVG(response_tokens) AS INTEGER) as avg_tokens\n FROM session_events ${whereClause}\n GROUP BY tool ORDER BY calls DESC LIMIT 5`,\n );\n const topTools: ToolStats[] = (toolsResult[0]?.values ?? []).map((row) => ({\n tool: String(row[0]),\n calls: Number(row[1]),\n avgTokens: Number(row[2]),\n }));\n\n // Top symbols (exclude nulls)\n const symbolWherePrefix = whereClause ? whereClause + ' AND' : 'WHERE';\n const symbolsResult = this.db.exec(\n `SELECT symbol_id, COUNT(*) as queries\n FROM session_events ${symbolWherePrefix} symbol_id IS NOT NULL\n GROUP BY symbol_id ORDER BY queries DESC LIMIT 5`,\n );\n const topSymbols: SymbolStats[] = (symbolsResult[0]?.values ?? []).map((row) => {\n const symbolId = String(row[0]);\n const parts = symbolId.split('::');\n const name = parts.length >= 2 ? parts[1]! : symbolId;\n return {\n symbolId,\n name,\n queries: Number(row[1]),\n };\n });\n\n // Detail level distribution\n const levelResult = this.db.exec(\n `SELECT detail_level, COUNT(*) as count\n FROM session_events ${symbolWherePrefix} detail_level IS NOT NULL\n GROUP BY detail_level ORDER BY detail_level`,\n );\n const levelRows = levelResult[0]?.values ?? [];\n const levelTotal = levelRows.reduce((sum, row) => sum + Number(row[1]), 0);\n const detailLevelDistribution: DetailLevelStats[] = levelRows.map((row) => ({\n level: String(row[0]),\n count: Number(row[1]),\n percentage: levelTotal > 0 ? 
Math.round((Number(row[1]) / levelTotal) * 100) : 0,\n }));\n\n return {\n totalCalls,\n totalTokensServed,\n topTools,\n topSymbols,\n detailLevelDistribution,\n };\n }\n\n clear(): void {\n this.ensureTable();\n this.db.run('DELETE FROM session_events');\n this.flushToDisk();\n }\n\n private flushToDisk(): void {\n try {\n this.onWrite?.();\n } catch (err) {\n log.error(`Failed to flush stats to disk: ${(err as Error).message}`);\n }\n }\n}\n","import type { CommitRecord, AntiPattern } from '../types.js';\n\n// Explicit revert patterns\nconst REVERT_QUOTED_PATTERN = /^Revert \"(.+)\"$/;\nconst REVERT_PREFIX_PATTERN = /^revert:\\s*(.+)$/i;\n\n// Undo/rollback patterns\nconst UNDO_PATTERN = /^undo[:\\s]/i;\nconst ROLLBACK_PATTERN = /^rollback[:\\s]/i;\n\n// Indirect revert indicators (keywords in commit message body)\nconst INDIRECT_KEYWORDS = [\n /\\brevert(?:s|ed|ing)?\\b/i,\n /\\broll(?:s|ed|ing)?\\s*back\\b/i,\n /\\bundo(?:es|ne|ing)?\\b/i,\n /\\bbacked?\\s*out\\b/i,\n /\\bremov(?:e|es|ed|ing)\\s+(?:broken|buggy|faulty)\\b/i,\n];\n\nexport class RevertDetector {\n detect(commits: readonly CommitRecord[]): AntiPattern[] {\n const antiPatterns: AntiPattern[] = [];\n\n for (const commit of commits) {\n if (!commit.message) continue;\n\n if (this.isRevert(commit.message)) {\n antiPatterns.push({\n hash: commit.hash,\n message: commit.message,\n date: commit.date,\n });\n }\n }\n\n return antiPatterns;\n }\n\n private isRevert(message: string): boolean {\n // Explicit patterns (high confidence)\n if (REVERT_QUOTED_PATTERN.test(message)) return true;\n if (REVERT_PREFIX_PATTERN.test(message)) return true;\n if (UNDO_PATTERN.test(message)) return true;\n if (ROLLBACK_PATTERN.test(message)) return true;\n\n // Indirect indicators (keyword search in full message)\n for (const pattern of INDIRECT_KEYWORDS) {\n if (pattern.test(message)) return true;\n }\n\n return false;\n }\n}\n","import type { CoChangeEntry, CoChangeMatrix, FileIndex } from '../types.js';\n\nconst 
MIN_SHARED_COMMITS = 2;\nconst MIN_FREQUENCY = 0.1;\n\nexport function aggregateCoChanges(indices: readonly FileIndex[]): CoChangeMatrix {\n // Step 1: Build commit → files map\n const commitToFiles = new Map<string, Set<string>>();\n const fileCommitCounts = new Map<string, number>();\n\n for (const idx of indices) {\n const file = idx.file;\n let count = 0;\n for (const intent of idx.intent) {\n count++;\n let files = commitToFiles.get(intent.hash);\n if (!files) {\n files = new Set();\n commitToFiles.set(intent.hash, files);\n }\n files.add(file);\n }\n if (count > 0) {\n fileCommitCounts.set(file, count);\n }\n }\n\n // Step 2: Count shared commits per file pair\n const pairCounts = new Map<string, number>();\n for (const files of commitToFiles.values()) {\n if (files.size < 2) continue;\n const sorted = [...files].sort();\n for (let i = 0; i < sorted.length; i++) {\n for (let j = i + 1; j < sorted.length; j++) {\n const key = `${sorted[i]}|${sorted[j]}`;\n pairCounts.set(key, (pairCounts.get(key) ?? 0) + 1);\n }\n }\n }\n\n // Step 3: Calculate frequency and filter\n const entries: CoChangeEntry[] = [];\n for (const [key, sharedCommits] of pairCounts) {\n if (sharedCommits < MIN_SHARED_COMMITS) continue;\n\n const [file1, file2] = key.split('|') as [string, string];\n const count1 = fileCommitCounts.get(file1) ?? 0;\n const count2 = fileCommitCounts.get(file2) ?? 
0;\n const minCount = Math.min(count1, count2);\n if (minCount === 0) continue;\n\n const frequency = Math.round((sharedCommits / minCount) * 1000) / 1000;\n if (frequency < MIN_FREQUENCY) continue;\n\n entries.push({ file1, file2, sharedCommits, frequency });\n }\n\n entries.sort((a, b) => b.frequency - a.frequency || b.sharedCommits - a.sharedCommits);\n\n return {\n version: 1,\n timestamp: Math.floor(Date.now() / 1000),\n entries,\n };\n}\n\nexport function loadCoChangeMap(matrix: CoChangeMatrix): Map<string, CoChangeEntry[]> {\n const map = new Map<string, CoChangeEntry[]>();\n for (const entry of matrix.entries) {\n let list1 = map.get(entry.file1);\n if (!list1) { list1 = []; map.set(entry.file1, list1); }\n list1.push(entry);\n\n let list2 = map.get(entry.file2);\n if (!list2) { list2 = []; map.set(entry.file2, list2); }\n list2.push(entry);\n }\n return map;\n}\n"],"mappings":";;;;;;AAeA,SAAS,eAAe,WAA4B;AAClD,QAAM,WAAW,QAAQ,IAAI,OAAO,KAAK;AACzC,MAAI,CAAC,SAAU,QAAO;AACtB,QAAM,WAAW,SAAS,MAAM,GAAG,EAAE,IAAI,OAAK,EAAE,KAAK,CAAC;AACtD,SAAO,SAAS,KAAK,OAAK;AACxB,QAAI,MAAM,OAAO,MAAM,SAAU,QAAO;AACxC,QAAI,MAAM,UAAW,QAAO;AAC5B,QAAI,EAAE,SAAS,IAAI,KAAK,UAAU,WAAW,EAAE,MAAM,GAAG,EAAE,CAAC,EAAG,QAAO;AACrE,WAAO;AAAA,EACT,CAAC;AACH;AASO,SAAS,aAAa,WAA2B;AACtD,QAAM,SAAS,IAAI,SAAS;AAE5B,SAAO;AAAA,IACL,MAAM,YAAoB,MAAiB;AACzC,UAAI,eAAe,SAAS,GAAG;AAC7B,gBAAQ,MAAM,GAAG,MAAM,UAAU,OAAO,IAAI,GAAG,IAAI;AAAA,MACrD;AAAA,IACF;AAAA,IACA,KAAK,YAAoB,MAAiB;AACxC,cAAQ,MAAM,GAAG,MAAM,IAAI,OAAO,IAAI,GAAG,IAAI;AAAA,IAC/C;AAAA,IACA,KAAK,YAAoB,MAAiB;AACxC,cAAQ,MAAM,GAAG,MAAM,SAAS,OAAO,IAAI,GAAG,IAAI;AAAA,IACpD;AAAA,IACA,MAAM,YAAoB,MAAiB;AACzC,cAAQ,MAAM,GAAG,MAAM,UAAU,OAAO,IAAI,GAAG,IAAI;AAAA,IACrD;AAAA,EACF;AACF;;;ACrDA,OAAO,eAAkC;AACzC,SAAS,cAAc,eAAe,YAAY,iBAAiB;AACnE,SAAS,MAAM,eAAe;AAM9B,IAAM,MAAM,aAAa,cAAc;AAEhC,IAAM,uBAAN,MAAmD;AAAA,EAChD;AAAA,EACS;AAAA,EACA;AAAA,EAEjB,YAAY,UAAkB;AAC5B,SAAK,WAAW;AAChB,SAAK,SAAS,KAAK,UAAU,UAAU,YAAY;AAAA,EACrD;AAAA,EAEA,MAAM,OAAsB;AAC1B,UAAM,MAAM,MAAM,UAAU;AAE5B
,QAAI,WAAW,KAAK,MAAM,GAAG;AAC3B,UAAI;AACF,cAAM,SAAS,aAAa,KAAK,MAAM;AACvC,aAAK,KAAK,IAAI,IAAI,SAAS,MAAM;AACjC,aAAK,gBAAgB;AAAA,MACvB,QAAQ;AACN,YAAI,KAAK,iDAAiD;AAC1D,aAAK,KAAK,IAAI,IAAI,SAAS;AAC3B,aAAK,aAAa;AAAA,MACpB;AAAA,IACF,OAAO;AACL,WAAK,KAAK,IAAI,IAAI,SAAS;AAC3B,WAAK,aAAa;AAAA,IACpB;AAGA,SAAK,gBAAgB;AAAA,EACvB;AAAA,EAEA,MAAM,YAA2B;AAC/B,UAAM,MAAM,MAAM,UAAU;AAC5B,QAAI,WAAW,KAAK,MAAM,GAAG;AAC3B,UAAI;AACF,cAAM,SAAS,aAAa,KAAK,MAAM;AACvC,aAAK,KAAK,IAAI,IAAI,SAAS,MAAM;AAAA,MACnC,QAAQ;AACN,aAAK,KAAK,IAAI,IAAI,SAAS;AAAA,MAC7B;AAAA,IACF,OAAO;AACL,WAAK,KAAK,IAAI,IAAI,SAAS;AAAA,IAC7B;AACA,SAAK,aAAa;AAClB,SAAK,iBAAiB;AAAA,EACxB;AAAA;AAAA,EAGA,QAAkB;AAChB,QAAI,CAAC,KAAK,IAAI;AACZ,YAAM,IAAI,MAAM,0DAA0D;AAAA,IAC5E;AACA,WAAO,KAAK;AAAA,EACd;AAAA,EAEQ,WAAqB;AAC3B,WAAO,KAAK,MAAM;AAAA,EACpB;AAAA,EAEQ,kBAAwB;AAC9B,UAAM,KAAK,KAAK,SAAS;AACzB,UAAM,SAAS,GAAG,KAAK,wBAAwB;AAC/C,UAAM,WAAW,OAAO,CAAC,GAAG,OAAO,CAAC;AACpC,QAAI,CAAC,YAAY,SAAS,CAAC,MAAM,MAAM;AACrC,YAAM,IAAI,MAAM,+BAA+B;AAAA,IACjD;AAEA,UAAM,SAAS,GAAG;AAAA,MAChB;AAAA,IACF;AACA,QAAI,CAAC,OAAO,CAAC,KAAK,OAAO,CAAC,EAAE,OAAO,SAAS,GAAG;AAC7C,YAAM,IAAI,MAAM,yBAAyB;AAAA,IAC3C;AAAA,EACF;AAAA,EAEQ,eAAqB;AAC3B,UAAM,KAAK,KAAK,SAAS;AACzB,OAAG,IAAI;AAAA;AAAA;AAAA;AAAA;AAAA,KAKN;AACD,OAAG,IAAI;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,KASN;AACD,OAAG,IAAI;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,KAON;AACD,OAAG,IAAI,mEAAmE;AAC1E,OAAG,IAAI,iEAAiE;AACxE,OAAG,IAAI,6DAA6D;AAAA,EACtE;AAAA,EAEQ,mBAAyB;AAC/B,UAAM,KAAK,KAAK,SAAS;AACzB,OAAG,IAAI,mBAAmB;AAC1B,OAAG,IAAI,qBAAqB;AAC5B,OAAG,IAAI,mBAAmB;AAAA,EAC5B;AAAA,EAEQ,kBAAwB;AAC9B,UAAM,SAAS,IAAI,gBAAgB,KAAK,QAAQ;AAChD,UAAM,UAAU,OAAO,QAAQ;AAE/B,QAAI,QAAQ,SAAS,GAAG;AACtB,WAAK,UAAU,OAAO;AAAA,IACxB;AAAA,EACF;AAAA,EAEA,gBAAgB,WAA4B;AAC1C,UAAM,KAAK,KAAK,SAAS;AAEzB,OAAG,IAAI,mBAAmB;AAC1B,QAAI;AACF,WAAK,eAAe,IAAI,UAAU,IAAI;AAEtC,SAAG;AAAA,QACD;AAAA,QACA,CAAC,UAAU,MAAM,UAAU,YAAY;AAAA,MACzC;AAEA,iBAAW,OAAO,UAAU,SAAS;AACnC,WAAG;AAAA,UACD;AAAA,UACA,CAAC,IAAI,UAAU,IAAI,MAAM,IAAI,MAAM,UAAU,MAAM,IAAI,WAAW,IAAI,OAAO;AAAA,
QAC/E;AAAA,MACF;AAEA,iBAAW,QAAQ,UAAU,OAAO;AAClC,WAAG;AAAA,UACD;AAAA,UACA,CAAC,KAAK,MAAM,KAAK,IAAI,KAAK,IAAI;AAAA,QAChC;AAAA,MACF;AAEA,SAAG,IAAI,QAAQ;AACf,WAAK,gBAAgB;AAAA,IACvB,SAAS,KAAK;AACZ,SAAG,IAAI,UAAU;AACjB,YAAM;AAAA,IACR;AAAA,EACF;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAQA,eAAe,cAA6C;AAC1D,UAAM,KAAK,KAAK,SAAS;AAEzB,UAAM,aAAa,GAAG;AAAA,MACpB;AAAA,MACA,CAAC,YAAY;AAAA,IACf;AACA,QAAI,CAAC,WAAW,CAAC,KAAK,WAAW,CAAC,EAAE,OAAO,WAAW,GAAG;AACvD,aAAO;AAAA,IACT;AAEA,UAAM,CAAC,UAAU,YAAY,IAAI,WAAW,CAAC,EAAE,OAAO,CAAC;AAEvD,UAAM,UAAU,KAAK,kBAAkB,IAAI,QAAQ;AACnD,UAAM,QAAQ,KAAK,gBAAgB,IAAI,QAAQ;AAE/C,WAAO;AAAA,MACL,MAAM;AAAA,MACN;AAAA,MACA;AAAA,MACA;AAAA;AAAA,MAEA,QAAQ,CAAC;AAAA,MACT,cAAc,CAAC;AAAA,IACjB;AAAA,EACF;AAAA,EAEA,mBAA6B;AAC3B,UAAM,KAAK,KAAK,SAAS;AACzB,UAAM,SAAS,GAAG,KAAK,gDAAgD;AACvE,QAAI,CAAC,OAAO,CAAC,EAAG,QAAO,CAAC;AACxB,WAAO,OAAO,CAAC,EAAE,OAAO,IAAI,CAAC,QAAQ,IAAI,CAAC,CAAW;AAAA,EACvD;AAAA,EAEA,iBAAiB,cAA4B;AAC3C,UAAM,KAAK,KAAK,SAAS;AACzB,SAAK,eAAe,IAAI,YAAY;AAAA,EACtC;AAAA,EAEA,cAAc,UAA0C;AACtD,UAAM,KAAK,KAAK,SAAS;AACzB,UAAM,SAAS,GAAG;AAAA,MAChB;AAAA,MACA,CAAC,QAAQ;AAAA,IACX;AACA,QAAI,CAAC,OAAO,CAAC,KAAK,OAAO,CAAC,EAAE,OAAO,WAAW,GAAG;AAC/C,aAAO;AAAA,IACT;AAEA,UAAM,CAAC,KAAK,MAAM,MAAM,WAAW,OAAO,IAAI,OAAO,CAAC,EAAE,OAAO,CAAC;AAChE,WAAO,EAAE,UAAU,KAAK,MAAM,MAAkC,WAAW,QAAQ;AAAA,EACrF;AAAA,EAEA,aAAa,UAA+B;AAC1C,UAAM,KAAK,KAAK,SAAS;AACzB,UAAM,SAAS,GAAG;AAAA,MAChB;AAAA,MACA,CAAC,QAAQ;AAAA,IACX;AACA,QAAI,CAAC,OAAO,CAAC,EAAG,QAAO,CAAC;AACxB,WAAO,OAAO,CAAC,EAAE,OAAO,IAAI,CAAC,SAAS;AAAA,MACpC,MAAM,IAAI,CAAC;AAAA,MACX,IAAI,IAAI,CAAC;AAAA,MACT,MAAM,IAAI,CAAC;AAAA,IACb,EAAE;AAAA,EACJ;AAAA,EAEA,WAAW,UAA+B;AACxC,UAAM,KAAK,KAAK,SAAS;AACzB,UAAM,SAAS,GAAG;AAAA,MAChB;AAAA,MACA,CAAC,QAAQ;AAAA,IACX;AACA,QAAI,CAAC,OAAO,CAAC,EAAG,QAAO,CAAC;AACxB,WAAO,OAAO,CAAC,EAAE,OAAO,IAAI,CAAC,SAAS;AAAA,MACpC,MAAM,IAAI,CAAC;AAAA,MACX,IAAI,IAAI,CAAC;AAAA,MACT,MAAM,IAAI,CAAC;AAAA,IACb,EAAE;AAAA,EACJ;AAAA,EAEA,gBAA8B;AAC5B,UAAM,KAAK,KAAK,SAAS;AACzB,UAAM,SAAS,GAAG;AAAA,MAChB;AAAA,IACF;AACA,QAAI,CAAC,OAAO
,CAAC,EAAG,QAAO,CAAC;AACxB,WAAO,OAAO,CAAC,EAAE,OAAO,IAAI,CAAC,SAAS;AAAA,MACpC,UAAU,IAAI,CAAC;AAAA,MACf,MAAM,IAAI,CAAC;AAAA,MACX,MAAM,IAAI,CAAC;AAAA,MACX,WAAW,IAAI,CAAC;AAAA,MAChB,SAAS,IAAI,CAAC;AAAA,IAChB,EAAE;AAAA,EACJ;AAAA,EAEA,cAA2B;AACzB,UAAM,KAAK,KAAK,SAAS;AACzB,UAAM,SAAS,GAAG;AAAA,MAChB;AAAA,IACF;AACA,QAAI,CAAC,OAAO,CAAC,EAAG,QAAO,CAAC;AACxB,WAAO,OAAO,CAAC,EAAE,OAAO,IAAI,CAAC,SAAS;AAAA,MACpC,MAAM,IAAI,CAAC;AAAA,MACX,IAAI,IAAI,CAAC;AAAA,MACT,MAAM,IAAI,CAAC;AAAA,IACb,EAAE;AAAA,EACJ;AAAA,EAEA,UAAU,SAA4B;AACpC,UAAM,KAAK,KAAK,SAAS;AAEzB,OAAG,IAAI,mBAAmB;AAC1B,QAAI;AACF,iBAAW,aAAa,SAAS;AAC/B,aAAK,eAAe,IAAI,UAAU,IAAI;AAEtC,WAAG;AAAA,UACD;AAAA,UACA,CAAC,UAAU,MAAM,UAAU,YAAY;AAAA,QACzC;AAEA,mBAAW,OAAO,UAAU,SAAS;AACnC,aAAG;AAAA,YACD;AAAA,YACA,CAAC,IAAI,UAAU,IAAI,MAAM,IAAI,MAAM,UAAU,MAAM,IAAI,WAAW,IAAI,OAAO;AAAA,UAC/E;AAAA,QACF;AAEA,mBAAW,QAAQ,UAAU,OAAO;AAClC,aAAG;AAAA,YACD;AAAA,YACA,CAAC,KAAK,MAAM,KAAK,IAAI,KAAK,IAAI;AAAA,UAChC;AAAA,QACF;AAAA,MACF;AAEA,SAAG,IAAI,QAAQ;AACf,WAAK,gBAAgB;AAAA,IACvB,SAAS,KAAK;AACZ,SAAG,IAAI,UAAU;AACjB,YAAM;AAAA,IACR;AAAA,EACF;AAAA,EAEA,UAAgB;AACd,UAAM,KAAK,KAAK,SAAS;AACzB,UAAM,OAAO,GAAG,OAAO;AACvB,UAAM,SAAS,OAAO,KAAK,IAAI;AAC/B,cAAU,QAAQ,KAAK,MAAM,GAAG,EAAE,WAAW,KAAK,CAAC;AACnD,kBAAc,KAAK,QAAQ,MAAM;AAAA,EACnC;AAAA,EAEA,QAAc;AACZ,QAAI,KAAK,IAAI;AACX,WAAK,QAAQ;AACb,WAAK,GAAG,MAAM;AACd,WAAK,KAAK;AAAA,IACZ;AAAA,EACF;AAAA,EAEQ,kBAAwB;AAC9B,QAAI;AACF,WAAK,QAAQ;AAAA,IACf,SAAS,KAAK;AACZ,UAAI,MAAM,yBAA0B,IAAc,OAAO,EAAE;AAAA,IAC7D;AAAA,EACF;AAAA,EAEQ,eAAe,IAAc,UAAwB;AAE3D,UAAM,YAAY,GAAG;AAAA,MACnB;AAAA,MACA,CAAC,QAAQ;AAAA,IACX;AACA,QAAI,UAAU,CAAC,GAAG;AAChB,iBAAW,OAAO,UAAU,CAAC,EAAE,QAAQ;AACrC,WAAG,IAAI,2CAA2C,CAAC,IAAI,CAAC,CAAC,CAAC;AAAA,MAC5D;AAAA,IACF;AACA,OAAG,IAAI,2CAA2C,CAAC,QAAQ,CAAC;AAC5D,OAAG,IAAI,yCAAyC,CAAC,QAAQ,CAAC;AAAA,EAC5D;AAAA,EAEQ,kBAAkB,IAAc,UAAgC;AACtE,UAAM,SAAS,GAAG;AAAA,MAChB;AAAA,MACA,CAAC,QAAQ;AAAA,IACX;AACA,QAAI,CAAC,OAAO,CAAC,EAAG,QAAO,CAAC;AACxB,WAAO,OAAO,CAAC,EAAE,OAAO,IAAI,CAAC,SAAS;AAAA,MACpC,UAAU,IAAI,CAAC;AAAA,MACf,MAAM
,IAAI,CAAC;AAAA,MACX,MAAM,IAAI,CAAC;AAAA,MACX,WAAW,IAAI,CAAC;AAAA,MAChB,SAAS,IAAI,CAAC;AAAA,IAChB,EAAE;AAAA,EACJ;AAAA,EAEQ,gBAAgB,IAAc,UAA+B;AACnE,UAAM,SAAS,GAAG;AAAA,MAChB;AAAA;AAAA;AAAA,MAGA,CAAC,QAAQ;AAAA,IACX;AACA,QAAI,CAAC,OAAO,CAAC,EAAG,QAAO,CAAC;AACxB,WAAO,OAAO,CAAC,EAAE,OAAO,IAAI,CAAC,SAAS;AAAA,MACpC,MAAM,IAAI,CAAC;AAAA,MACX,IAAI,IAAI,CAAC;AAAA,MACT,MAAM,IAAI,CAAC;AAAA,IACb,EAAE;AAAA,EACJ;AACF;;;AClYA,SAAS,iBAAiC;AAK1C,IAAMA,OAAM,aAAa,UAAU;AAE5B,IAAM,mBAAN,MAA2C;AAAA,EAC/B;AAAA,EAEjB,YAAY,aAAqB;AAC/B,SAAK,MAAM,UAAU,WAAW;AAChC,IAAAA,KAAI,MAAM,6BAA6B,WAAW;AAAA,EACpD;AAAA,EAEA,MAAM,cAAgC;AACpC,QAAI;AACF,YAAM,KAAK,IAAI,QAAQ;AACvB,aAAO;AAAA,IACT,QAAQ;AACN,aAAO;AAAA,IACT;AAAA,EACF;AAAA,EAEA,MAAM,iBAAiB,UAAkB,UAA4C;AACnF,QAAI;AACF,YAAMA,OAAM,MAAM,KAAK,IAAI,IAAI;AAAA,QAC7B,MAAM;AAAA,QACN,YAAY;AAAA,QACZ,GAAI,WAAW,EAAE,SAAS,IAAI,CAAC;AAAA,MACjC,CAAC;AAED,aAAOA,KAAI,IAAI,IAAI,CAAC,WAAW;AAAA,QAC7B,MAAM,MAAM;AAAA,QACZ,SAAS,MAAM;AAAA,QACf,MAAM,MAAM;AAAA,QACZ,QAAQ,MAAM;AAAA,MAChB,EAAE;AAAA,IACJ,SAAS,KAAK;AACZ,MAAAA,KAAI,MAAM,8BAA8B,QAAQ,KAAM,IAAc,OAAO,EAAE;AAC7E,aAAO,CAAC;AAAA,IACV;AAAA,EACF;AAAA,EAEA,MAAM,gBAAgB,WAAW,IAA0C;AACzE,QAAI;AACF,YAAMA,OAAM,MAAM,KAAK,IAAI,IAAI,EAAE,UAAU,WAAW,IAAI,eAAe,KAAK,CAAC;AAC/E,YAAM,SAAS,oBAAI,IAA4B;AAE/C,iBAAW,SAASA,KAAI,KAAK;AAC3B,cAAM,SAAuB;AAAA,UAC3B,MAAM,MAAM;AAAA,UACZ,SAAS,MAAM;AAAA,UACf,MAAM,MAAM;AAAA,UACZ,QAAQ,MAAM;AAAA,QAChB;AAEA,cAAM,OAAO,MAAM;AACnB,cAAM,QAAkB,MAAM,OAAO,IAAI,CAAC,MAAwB,EAAE,KAAK,QAAQ,OAAO,GAAG,CAAC,KAAK,CAAC;AAElG,mBAAW,QAAQ,OAAO;AACxB,cAAI,OAAO,OAAO,IAAI,IAAI;AAC1B,cAAI,CAAC,MAAM;AAAE,mBAAO,CAAC;AAAG,mBAAO,IAAI,MAAM,IAAI;AAAA,UAAG;AAChD,cAAI,KAAK,SAAS,UAAU;AAC1B,iBAAK,KAAK,MAAM;AAAA,UAClB;AAAA,QACF;AAAA,MACF;AAEA,aAAO;AAAA,IACT,SAAS,KAAK;AACZ,MAAAA,KAAI,MAAM,0BAA2B,IAAc,OAAO,EAAE;AAC5D,aAAO,oBAAI,IAAI;AAAA,IACjB;AAAA,EACF;AAAA,EAEA,MAAM,gBAAgB,OAAkC;AACtD,QAAI;AACF,YAAM,OAAO,MAAM,KAAK,IAAI,YAAY,CAAC,KAAK,CAAC;AAC/C,aAAO,KAAK,MAAM,IAAI,CAAC,MAAM,EAAE,KAAK,QAAQ,OAAO,GAAG,CAAC;AAAA,IACzD,SAAS,KAAK;AACZ,MAAAA,KAAI,MA
AM,sCAAsC,KAAK,KAAM,IAAc,OAAO,EAAE;AAClF,aAAO,CAAC;AAAA,IACV;AAAA,EACF;AAAA,EAEA,MAAM,aAAa,UAAsC;AACvD,QAAI;AACF,YAAMA,OAAM,MAAM,KAAK,IAAI,IAAI,EAAE,MAAM,UAAU,YAAY,KAAK,CAAC;AAEnE,aAAO;AAAA,QACL;AAAA,QACA,aAAaA,KAAI;AAAA,MACnB;AAAA,IACF,SAAS,KAAK;AACZ,MAAAA,KAAI,MAAM,4BAA4B,QAAQ,KAAM,IAAc,OAAO,EAAE;AAC3E,aAAO,EAAE,UAAU,aAAa,EAAE;AAAA,IACpC;AAAA,EACF;AACF;;;ACxFA,IAAMC,OAAM,aAAa,YAAY;AAE9B,IAAM,yBAAN,MAA6D;AAAA,EACjD;AAAA,EACA;AAAA,EACT,eAAe;AAAA,EAEvB,YAAY,IAAc,SAAsB;AAC9C,SAAK,KAAK;AACV,SAAK,UAAU,WAAW;AAAA,EAC5B;AAAA,EAEQ,cAAoB;AAC1B,QAAI,KAAK,aAAc;AACvB,SAAK,GAAG,IAAI;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,KAYX;AACD,SAAK,GAAG,IAAI,+EAA+E;AAC3F,SAAK,GAAG,IAAI,qEAAqE;AACjF,SAAK,eAAe;AAAA,EACtB;AAAA,EAEA,OAAO,OAA2B;AAChC,QAAI;AACF,WAAK,YAAY;AACjB,WAAK,GAAG;AAAA,QACN;AAAA;AAAA,QAEA;AAAA,UACE,MAAM;AAAA,UACN,MAAM;AAAA,UACN,MAAM;AAAA,UACN,MAAM;AAAA,UACN,MAAM;AAAA,UACN,MAAM;AAAA,UACN,MAAM,YAAY,IAAI;AAAA,QACxB;AAAA,MACF;AACA,WAAK,YAAY;AAAA,IACnB,SAAS,KAAK;AACZ,MAAAA,KAAI,MAAM,mCAAoC,IAAc,OAAO,EAAE;AAAA,IACvE;AAAA,EACF;AAAA,EAEA,WAAW,MAAgC;AACzC,SAAK,YAAY;AAEjB,UAAM,cAAc,QAAQ,QAAQ,OAAO,IACvC,wCAAwC,KAAK,MAAM,IAAI,CAAC,YACxD;AAGJ,UAAM,gBAAgB,KAAK,GAAG;AAAA,MAC5B;AAAA,6BACuB,WAAW;AAAA,IACpC;AACA,UAAM,aAAa,cAAc,CAAC,GAAG,OAAO,CAAC;AAC7C,UAAM,aAAa,OAAO,aAAa,CAAC,KAAK,CAAC;AAC9C,UAAM,oBAAoB,OAAO,aAAa,CAAC,KAAK,CAAC;AAGrD,UAAM,cAAc,KAAK,GAAG;AAAA,MAC1B;AAAA,6BACuB,WAAW;AAAA;AAAA,IAEpC;AACA,UAAM,YAAyB,YAAY,CAAC,GAAG,UAAU,CAAC,GAAG,IAAI,CAAC,SAAS;AAAA,MACzE,MAAM,OAAO,IAAI,CAAC,CAAC;AAAA,MACnB,OAAO,OAAO,IAAI,CAAC,CAAC;AAAA,MACpB,WAAW,OAAO,IAAI,CAAC,CAAC;AAAA,IAC1B,EAAE;AAGF,UAAM,oBAAoB,cAAc,cAAc,SAAS;AAC/D,UAAM,gBAAgB,KAAK,GAAG;AAAA,MAC5B;AAAA,6BACuB,iBAAiB;AAAA;AAAA,IAE1C;AACA,UAAM,cAA6B,cAAc,CAAC,GAAG,UAAU,CAAC,GAAG,IAAI,CAAC,QAAQ;AAC9E,YAAM,WAAW,OAAO,IAAI,CAAC,CAAC;AAC9B,YAAM,QAAQ,SAAS,MAAM,IAAI;AACjC,YAAM,OAAO,MAAM,UAAU,IAAI,MAAM,CAAC,IAAK;AAC7C,aAAO;AAAA,QACL;AAAA,QACA;AAAA,QACA,SAAS,OAAO,IAAI,CAAC,CAAC;AAAA,MACxB;AAAA,IACF,CAAC;AAGD,UAAM,cAAc,KAAK,GAAG;AAA
A,MAC1B;AAAA,6BACuB,iBAAiB;AAAA;AAAA,IAE1C;AACA,UAAM,YAAY,YAAY,CAAC,GAAG,UAAU,CAAC;AAC7C,UAAM,aAAa,UAAU,OAAO,CAAC,KAAK,QAAQ,MAAM,OAAO,IAAI,CAAC,CAAC,GAAG,CAAC;AACzE,UAAM,0BAA8C,UAAU,IAAI,CAAC,SAAS;AAAA,MAC1E,OAAO,OAAO,IAAI,CAAC,CAAC;AAAA,MACpB,OAAO,OAAO,IAAI,CAAC,CAAC;AAAA,MACpB,YAAY,aAAa,IAAI,KAAK,MAAO,OAAO,IAAI,CAAC,CAAC,IAAI,aAAc,GAAG,IAAI;AAAA,IACjF,EAAE;AAEF,WAAO;AAAA,MACL;AAAA,MACA;AAAA,MACA;AAAA,MACA;AAAA,MACA;AAAA,IACF;AAAA,EACF;AAAA,EAEA,QAAc;AACZ,SAAK,YAAY;AACjB,SAAK,GAAG,IAAI,4BAA4B;AACxC,SAAK,YAAY;AAAA,EACnB;AAAA,EAEQ,cAAoB;AAC1B,QAAI;AACF,WAAK,UAAU;AAAA,IACjB,SAAS,KAAK;AACZ,MAAAA,KAAI,MAAM,kCAAmC,IAAc,OAAO,EAAE;AAAA,IACtE;AAAA,EACF;AACF;;;AChJA,IAAM,wBAAwB;AAC9B,IAAM,wBAAwB;AAG9B,IAAM,eAAe;AACrB,IAAM,mBAAmB;AAGzB,IAAM,oBAAoB;AAAA,EACxB;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AACF;AAEO,IAAM,iBAAN,MAAqB;AAAA,EAC1B,OAAO,SAAiD;AACtD,UAAM,eAA8B,CAAC;AAErC,eAAW,UAAU,SAAS;AAC5B,UAAI,CAAC,OAAO,QAAS;AAErB,UAAI,KAAK,SAAS,OAAO,OAAO,GAAG;AACjC,qBAAa,KAAK;AAAA,UAChB,MAAM,OAAO;AAAA,UACb,SAAS,OAAO;AAAA,UAChB,MAAM,OAAO;AAAA,QACf,CAAC;AAAA,MACH;AAAA,IACF;AAEA,WAAO;AAAA,EACT;AAAA,EAEQ,SAAS,SAA0B;AAEzC,QAAI,sBAAsB,KAAK,OAAO,EAAG,QAAO;AAChD,QAAI,sBAAsB,KAAK,OAAO,EAAG,QAAO;AAChD,QAAI,aAAa,KAAK,OAAO,EAAG,QAAO;AACvC,QAAI,iBAAiB,KAAK,OAAO,EAAG,QAAO;AAG3C,eAAW,WAAW,mBAAmB;AACvC,UAAI,QAAQ,KAAK,OAAO,EAAG,QAAO;AAAA,IACpC;AAEA,WAAO;AAAA,EACT;AACF;;;AClDA,IAAM,qBAAqB;AAC3B,IAAM,gBAAgB;AAEf,SAAS,mBAAmB,SAA+C;AAEhF,QAAM,gBAAgB,oBAAI,IAAyB;AACnD,QAAM,mBAAmB,oBAAI,IAAoB;AAEjD,aAAW,OAAO,SAAS;AACzB,UAAM,OAAO,IAAI;AACjB,QAAI,QAAQ;AACZ,eAAW,UAAU,IAAI,QAAQ;AAC/B;AACA,UAAI,QAAQ,cAAc,IAAI,OAAO,IAAI;AACzC,UAAI,CAAC,OAAO;AACV,gBAAQ,oBAAI,IAAI;AAChB,sBAAc,IAAI,OAAO,MAAM,KAAK;AAAA,MACtC;AACA,YAAM,IAAI,IAAI;AAAA,IAChB;AACA,QAAI,QAAQ,GAAG;AACb,uBAAiB,IAAI,MAAM,KAAK;AAAA,IAClC;AAAA,EACF;AAGA,QAAM,aAAa,oBAAI,IAAoB;AAC3C,aAAW,SAAS,cAAc,OAAO,GAAG;AAC1C,QAAI,MAAM,OAAO,EAAG;AACpB,UAAM,SAAS,CAAC,GAAG,KAAK,EAAE,KAAK;AAC/B,aAAS,IAAI,GAAG,IAAI,OAAO,QAAQ,KAAK;AACtC,eAAS,IAAI,IAAI,GAAG,IAAI,OAAO,QAAQ,KAAK;AAC1C
,cAAM,MAAM,GAAG,OAAO,CAAC,CAAC,IAAI,OAAO,CAAC,CAAC;AACrC,mBAAW,IAAI,MAAM,WAAW,IAAI,GAAG,KAAK,KAAK,CAAC;AAAA,MACpD;AAAA,IACF;AAAA,EACF;AAGA,QAAM,UAA2B,CAAC;AAClC,aAAW,CAAC,KAAK,aAAa,KAAK,YAAY;AAC7C,QAAI,gBAAgB,mBAAoB;AAExC,UAAM,CAAC,OAAO,KAAK,IAAI,IAAI,MAAM,GAAG;AACpC,UAAM,SAAS,iBAAiB,IAAI,KAAK,KAAK;AAC9C,UAAM,SAAS,iBAAiB,IAAI,KAAK,KAAK;AAC9C,UAAM,WAAW,KAAK,IAAI,QAAQ,MAAM;AACxC,QAAI,aAAa,EAAG;AAEpB,UAAM,YAAY,KAAK,MAAO,gBAAgB,WAAY,GAAI,IAAI;AAClE,QAAI,YAAY,cAAe;AAE/B,YAAQ,KAAK,EAAE,OAAO,OAAO,eAAe,UAAU,CAAC;AAAA,EACzD;AAEA,UAAQ,KAAK,CAAC,GAAG,MAAM,EAAE,YAAY,EAAE,aAAa,EAAE,gBAAgB,EAAE,aAAa;AAErF,SAAO;AAAA,IACL,SAAS;AAAA,IACT,WAAW,KAAK,MAAM,KAAK,IAAI,IAAI,GAAI;AAAA,IACvC;AAAA,EACF;AACF;AAEO,SAAS,gBAAgB,QAAsD;AACpF,QAAM,MAAM,oBAAI,IAA6B;AAC7C,aAAW,SAAS,OAAO,SAAS;AAClC,QAAI,QAAQ,IAAI,IAAI,MAAM,KAAK;AAC/B,QAAI,CAAC,OAAO;AAAE,cAAQ,CAAC;AAAG,UAAI,IAAI,MAAM,OAAO,KAAK;AAAA,IAAG;AACvD,UAAM,KAAK,KAAK;AAEhB,QAAI,QAAQ,IAAI,IAAI,MAAM,KAAK;AAC/B,QAAI,CAAC,OAAO;AAAE,cAAQ,CAAC;AAAG,UAAI,IAAI,MAAM,OAAO,KAAK;AAAA,IAAG;AACvD,UAAM,KAAK,KAAK;AAAA,EAClB;AACA,SAAO;AACT;","names":["log","log"]}