@tekmidian/pai 0.5.7 → 0.6.0
This diff shows the changes between publicly released package versions as they appear in their respective public registries, and is provided for informational purposes only.
- package/ARCHITECTURE.md +72 -1
- package/README.md +87 -1
- package/dist/{auto-route-BG6I_4B1.mjs → auto-route-C-DrW6BL.mjs} +3 -3
- package/dist/{auto-route-BG6I_4B1.mjs.map → auto-route-C-DrW6BL.mjs.map} +1 -1
- package/dist/cli/index.mjs +1482 -1628
- package/dist/cli/index.mjs.map +1 -1
- package/dist/clusters-JIDQW65f.mjs +201 -0
- package/dist/clusters-JIDQW65f.mjs.map +1 -0
- package/dist/{config-Cf92lGX_.mjs → config-BuhHWyOK.mjs} +21 -6
- package/dist/config-BuhHWyOK.mjs.map +1 -0
- package/dist/daemon/index.mjs +11 -8
- package/dist/daemon/index.mjs.map +1 -1
- package/dist/{daemon-2ND5WO2j.mjs → daemon-D3hYb5_C.mjs} +669 -218
- package/dist/daemon-D3hYb5_C.mjs.map +1 -0
- package/dist/daemon-mcp/index.mjs +4597 -4
- package/dist/daemon-mcp/index.mjs.map +1 -1
- package/dist/db-DdUperSl.mjs +110 -0
- package/dist/db-DdUperSl.mjs.map +1 -0
- package/dist/{detect-BU3Nx_2L.mjs → detect-CdaA48EI.mjs} +1 -1
- package/dist/{detect-BU3Nx_2L.mjs.map → detect-CdaA48EI.mjs.map} +1 -1
- package/dist/{detector-Bp-2SM3x.mjs → detector-jGBuYQJM.mjs} +2 -2
- package/dist/{detector-Bp-2SM3x.mjs.map → detector-jGBuYQJM.mjs.map} +1 -1
- package/dist/{factory-Bzcy70G9.mjs → factory-Ygqe_bVZ.mjs} +7 -5
- package/dist/{factory-Bzcy70G9.mjs.map → factory-Ygqe_bVZ.mjs.map} +1 -1
- package/dist/helpers-BEST-4Gx.mjs +420 -0
- package/dist/helpers-BEST-4Gx.mjs.map +1 -0
- package/dist/hooks/capture-all-events.mjs +2 -2
- package/dist/hooks/capture-all-events.mjs.map +3 -3
- package/dist/hooks/capture-session-summary.mjs +38 -0
- package/dist/hooks/capture-session-summary.mjs.map +3 -3
- package/dist/hooks/cleanup-session-files.mjs +6 -12
- package/dist/hooks/cleanup-session-files.mjs.map +4 -4
- package/dist/hooks/context-compression-hook.mjs +93 -104
- package/dist/hooks/context-compression-hook.mjs.map +4 -4
- package/dist/hooks/initialize-session.mjs +14 -11
- package/dist/hooks/initialize-session.mjs.map +4 -4
- package/dist/hooks/inject-observations.mjs +220 -0
- package/dist/hooks/inject-observations.mjs.map +7 -0
- package/dist/hooks/load-core-context.mjs +2 -2
- package/dist/hooks/load-core-context.mjs.map +3 -3
- package/dist/hooks/load-project-context.mjs +90 -91
- package/dist/hooks/load-project-context.mjs.map +4 -4
- package/dist/hooks/observe.mjs +354 -0
- package/dist/hooks/observe.mjs.map +7 -0
- package/dist/hooks/stop-hook.mjs +94 -107
- package/dist/hooks/stop-hook.mjs.map +4 -4
- package/dist/hooks/sync-todo-to-md.mjs +31 -33
- package/dist/hooks/sync-todo-to-md.mjs.map +4 -4
- package/dist/index.d.mts +30 -7
- package/dist/index.d.mts.map +1 -1
- package/dist/index.mjs +5 -8
- package/dist/indexer-D53l5d1U.mjs +1 -0
- package/dist/{indexer-backend-CIMXedqk.mjs → indexer-backend-jcJFsmB4.mjs} +37 -127
- package/dist/indexer-backend-jcJFsmB4.mjs.map +1 -0
- package/dist/{ipc-client-Bjg_a1dc.mjs → ipc-client-CoyUHPod.mjs} +2 -7
- package/dist/{ipc-client-Bjg_a1dc.mjs.map → ipc-client-CoyUHPod.mjs.map} +1 -1
- package/dist/latent-ideas-bTJo6Omd.mjs +191 -0
- package/dist/latent-ideas-bTJo6Omd.mjs.map +1 -0
- package/dist/neighborhood-BYYbEkUJ.mjs +135 -0
- package/dist/neighborhood-BYYbEkUJ.mjs.map +1 -0
- package/dist/note-context-BK24bX8Y.mjs +126 -0
- package/dist/note-context-BK24bX8Y.mjs.map +1 -0
- package/dist/postgres-CKf-EDtS.mjs +846 -0
- package/dist/postgres-CKf-EDtS.mjs.map +1 -0
- package/dist/{reranker-D7bRAHi6.mjs → reranker-CMNZcfVx.mjs} +1 -1
- package/dist/{reranker-D7bRAHi6.mjs.map → reranker-CMNZcfVx.mjs.map} +1 -1
- package/dist/{search-_oHfguA5.mjs → search-DC1qhkKn.mjs} +2 -58
- package/dist/search-DC1qhkKn.mjs.map +1 -0
- package/dist/{sqlite-WWBq7_2C.mjs → sqlite-l-s9xPjY.mjs} +160 -3
- package/dist/sqlite-l-s9xPjY.mjs.map +1 -0
- package/dist/state-C6_vqz7w.mjs +102 -0
- package/dist/state-C6_vqz7w.mjs.map +1 -0
- package/dist/stop-words-BaMEGVeY.mjs +326 -0
- package/dist/stop-words-BaMEGVeY.mjs.map +1 -0
- package/dist/{indexer-CMPOiY1r.mjs → sync-BOsnEj2-.mjs} +14 -216
- package/dist/sync-BOsnEj2-.mjs.map +1 -0
- package/dist/themes-BvYF0W8T.mjs +148 -0
- package/dist/themes-BvYF0W8T.mjs.map +1 -0
- package/dist/{tools-DV_lsiCc.mjs → tools-DcaJlYDN.mjs} +162 -273
- package/dist/tools-DcaJlYDN.mjs.map +1 -0
- package/dist/trace-CRx9lPuc.mjs +137 -0
- package/dist/trace-CRx9lPuc.mjs.map +1 -0
- package/dist/{vault-indexer-k-kUlaZ-.mjs → vault-indexer-Bi2cRmn7.mjs} +134 -132
- package/dist/vault-indexer-Bi2cRmn7.mjs.map +1 -0
- package/dist/zettelkasten-cdajbnPr.mjs +708 -0
- package/dist/zettelkasten-cdajbnPr.mjs.map +1 -0
- package/package.json +1 -2
- package/src/hooks/ts/lib/project-utils/index.ts +50 -0
- package/src/hooks/ts/lib/project-utils/notify.ts +75 -0
- package/src/hooks/ts/lib/project-utils/paths.ts +218 -0
- package/src/hooks/ts/lib/project-utils/session-notes.ts +363 -0
- package/src/hooks/ts/lib/project-utils/todo.ts +178 -0
- package/src/hooks/ts/lib/project-utils/tokens.ts +39 -0
- package/src/hooks/ts/lib/project-utils.ts +40 -1018
- package/src/hooks/ts/post-tool-use/observe.ts +327 -0
- package/src/hooks/ts/session-end/capture-session-summary.ts +41 -0
- package/src/hooks/ts/session-start/inject-observations.ts +254 -0
- package/dist/chunker-CbnBe0s0.mjs +0 -191
- package/dist/chunker-CbnBe0s0.mjs.map +0 -1
- package/dist/config-Cf92lGX_.mjs.map +0 -1
- package/dist/daemon-2ND5WO2j.mjs.map +0 -1
- package/dist/db-Dp8VXIMR.mjs +0 -212
- package/dist/db-Dp8VXIMR.mjs.map +0 -1
- package/dist/indexer-CMPOiY1r.mjs.map +0 -1
- package/dist/indexer-backend-CIMXedqk.mjs.map +0 -1
- package/dist/mcp/index.d.mts +0 -1
- package/dist/mcp/index.mjs +0 -500
- package/dist/mcp/index.mjs.map +0 -1
- package/dist/postgres-FXrHDPcE.mjs +0 -358
- package/dist/postgres-FXrHDPcE.mjs.map +0 -1
- package/dist/schemas-BFIgGntb.mjs +0 -3405
- package/dist/schemas-BFIgGntb.mjs.map +0 -1
- package/dist/search-_oHfguA5.mjs.map +0 -1
- package/dist/sqlite-WWBq7_2C.mjs.map +0 -1
- package/dist/tools-DV_lsiCc.mjs.map +0 -1
- package/dist/vault-indexer-k-kUlaZ-.mjs.map +0 -1
- package/dist/zettelkasten-e-a4rW_6.mjs +0 -901
- package/dist/zettelkasten-e-a4rW_6.mjs.map +0 -1
- package/templates/README.md +0 -181
- package/templates/skills/CORE/Aesthetic.md +0 -333
- package/templates/skills/CORE/CONSTITUTION.md +0 -1502
- package/templates/skills/CORE/HistorySystem.md +0 -427
- package/templates/skills/CORE/HookSystem.md +0 -1082
- package/templates/skills/CORE/Prompting.md +0 -509
- package/templates/skills/CORE/ProsodyAgentTemplate.md +0 -53
- package/templates/skills/CORE/ProsodyGuide.md +0 -416
- package/templates/skills/CORE/SKILL.md +0 -741
- package/templates/skills/CORE/SkillSystem.md +0 -213
- package/templates/skills/CORE/TerminalTabs.md +0 -119
- package/templates/skills/CORE/VOICE.md +0 -106
- package/templates/skills/createskill-skill.template.md +0 -78
- package/templates/skills/history-system.template.md +0 -371
- package/templates/skills/hook-system.template.md +0 -913
- package/templates/skills/sessions-skill.template.md +0 -102
- package/templates/skills/skill-system.template.md +0 -214
- package/templates/skills/terminal-tabs.template.md +0 -120
- package/templates/templates.md +0 -20
@@ -1 +0,0 @@
- (deleted: source map for zettelkasten-e-a4rW_6.mjs; its inlined sources covered src/zettelkasten/explore.ts, surprise.ts, converse.ts, themes.ts, health.ts, and suggest.ts — truncated map content omitted)
similarity\n let merged = true;\n while (merged && clusters.length > 1) {\n merged = false;\n let bestSim = similarityThreshold;\n let bestI = -1;\n let bestJ = -1;\n\n for (let i = 0; i < clusters.length; i++) {\n for (let j = i + 1; j < clusters.length; j++) {\n const sim = cosineSimilarity(clusters[i].centroid, clusters[j].centroid);\n if (sim > bestSim) {\n bestSim = sim;\n bestI = i;\n bestJ = j;\n }\n }\n }\n\n if (bestI === -1) break;\n\n // Merge cluster j into cluster i\n const ci = clusters[bestI];\n const cj = clusters[bestJ];\n const mergedPaths = [...ci.paths, ...cj.paths];\n const mergedTitles = [...ci.titles, ...cj.titles];\n const mergedIndexedAts = [...ci.indexedAts, ...cj.indexedAts];\n\n // Recompute centroid from averaged embeddings of all member paths\n const memberEmbeddings: Float32Array[] = [];\n for (const p of mergedPaths) {\n const emb = fileEmbeddings.get(p);\n if (emb) memberEmbeddings.push(emb);\n }\n\n clusters[bestI] = {\n paths: mergedPaths,\n titles: mergedTitles,\n indexedAts: mergedIndexedAts,\n centroid: averageEmbeddings(memberEmbeddings),\n };\n\n clusters.splice(bestJ, 1);\n merged = true;\n }\n\n // Step 5: filter and annotate clusters\n const themes: ThemeCluster[] = [];\n let clusterIndex = 0;\n\n for (const cluster of clusters) {\n if (cluster.paths.length < minClusterSize) continue;\n\n const label = generateLabel(cluster.titles) || `Theme ${clusterIndex + 1}`;\n const avgRecency =\n cluster.indexedAts.reduce((sum, t) => sum + t, 0) / cluster.indexedAts.length;\n\n const uniqueFolders = new Set(cluster.paths.map(getTopFolder));\n const folderDiversity = uniqueFolders.size / cluster.paths.length;\n\n const linkedRatio = computeLinkedRatio(db, cluster.paths);\n const suggestIndexNote = linkedRatio < 0.3 && cluster.paths.length >= 5;\n\n themes.push({\n id: clusterIndex++,\n label,\n notes: cluster.paths.map((path, idx) => ({\n path,\n title: cluster.titles[idx],\n })),\n size: cluster.paths.length,\n folderDiversity,\n 
avgRecency,\n linkedRatio,\n suggestIndexNote,\n });\n }\n\n // Step 6: rank by size * folderDiversity * recency_ratio\n themes.sort(\n (a, b) =>\n b.size * b.folderDiversity * (b.avgRecency / now) -\n a.size * a.folderDiversity * (a.avgRecency / now),\n );\n\n return {\n themes: themes.slice(0, maxThemes),\n totalNotesAnalyzed,\n timeWindow: { from, to: now },\n };\n}\n","import type { Database } from \"better-sqlite3\";\n\nexport interface HealthOptions {\n scope?: \"full\" | \"recent\" | \"project\";\n projectPath?: string;\n recentDays?: number;\n include?: Array<\"dead_links\" | \"orphans\" | \"disconnected\" | \"low_connectivity\">;\n}\n\nexport interface DeadLink {\n sourcePath: string;\n targetRaw: string;\n lineNumber: number;\n}\n\nexport interface HealthResult {\n totalFiles: number;\n totalLinks: number;\n deadLinks: DeadLink[];\n orphans: string[];\n disconnectedClusters: number;\n lowConnectivity: string[];\n healthScore: number;\n computedAt: number;\n}\n\nfunction buildScopeFilter(\n opts: HealthOptions,\n tableAlias: string,\n pathColumn: string,\n): { clause: string; params: unknown[] } {\n const scope = opts.scope ?? \"full\";\n\n if (scope === \"project\") {\n const prefix = opts.projectPath ?? \"\";\n return {\n clause: `WHERE ${tableAlias}.${pathColumn} LIKE ? || '%'`,\n params: [prefix],\n };\n }\n\n if (scope === \"recent\") {\n const days = opts.recentDays ?? 
30;\n const cutoff = Date.now() - days * 86400000;\n return {\n clause: `WHERE ${tableAlias}.indexed_at > ?`,\n params: [cutoff],\n };\n }\n\n return { clause: \"\", params: [] };\n}\n\nfunction countComponents(nodes: string[], edges: Array<{ source: string; target: string }>): number {\n if (nodes.length === 0) return 0;\n\n const parent = new Map<string, string>();\n const rank = new Map<string, number>();\n\n for (const n of nodes) {\n parent.set(n, n);\n rank.set(n, 0);\n }\n\n function find(x: string): string {\n let root = x;\n while (parent.get(root) !== root) {\n root = parent.get(root)!;\n }\n let current = x;\n while (current !== root) {\n const next = parent.get(current)!;\n parent.set(current, root);\n current = next;\n }\n return root;\n }\n\n function union(a: string, b: string): void {\n const ra = find(a);\n const rb = find(b);\n if (ra === rb) return;\n const rankA = rank.get(ra) ?? 0;\n const rankB = rank.get(rb) ?? 0;\n if (rankA < rankB) {\n parent.set(ra, rb);\n } else if (rankA > rankB) {\n parent.set(rb, ra);\n } else {\n parent.set(rb, ra);\n rank.set(ra, rankA + 1);\n }\n }\n\n for (const { source, target } of edges) {\n if (parent.has(source) && parent.has(target)) {\n union(source, target);\n }\n }\n\n const roots = new Set<string>();\n for (const n of nodes) {\n roots.add(find(n));\n }\n return roots.size;\n}\n\n/**\n * Audit the structural health of the Zettelkasten vault using graph metrics.\n * Designed to complete in under 60ms for a full vault.\n */\nexport function zettelHealth(db: Database, opts?: HealthOptions): HealthResult {\n const options = opts ?? {};\n const scope = options.scope ?? \"full\";\n const include = options.include ?? 
[\"dead_links\", \"orphans\", \"disconnected\", \"low_connectivity\"];\n\n const computedAt = Date.now();\n\n // --- totalFiles ---\n let totalFiles = 0;\n if (scope === \"full\") {\n totalFiles = (\n db.prepare(\"SELECT COUNT(*) AS n FROM vault_files\").get() as { n: number }\n ).n;\n } else if (scope === \"project\") {\n const prefix = options.projectPath ?? \"\";\n totalFiles = (\n db\n .prepare(\"SELECT COUNT(*) AS n FROM vault_files WHERE vault_path LIKE ? || '%'\")\n .get(prefix) as { n: number }\n ).n;\n } else {\n const days = options.recentDays ?? 30;\n const cutoff = computedAt - days * 86400000;\n totalFiles = (\n db\n .prepare(\"SELECT COUNT(*) AS n FROM vault_files WHERE indexed_at > ?\")\n .get(cutoff) as { n: number }\n ).n;\n }\n\n // --- totalLinks ---\n let totalLinks = 0;\n if (scope === \"full\") {\n totalLinks = (\n db.prepare(\"SELECT COUNT(*) AS n FROM vault_links\").get() as { n: number }\n ).n;\n } else if (scope === \"project\") {\n const prefix = options.projectPath ?? \"\";\n totalLinks = (\n db\n .prepare(\"SELECT COUNT(*) AS n FROM vault_links WHERE source_path LIKE ? || '%'\")\n .get(prefix) as { n: number }\n ).n;\n } else {\n const days = options.recentDays ?? 30;\n const cutoff = computedAt - days * 86400000;\n totalLinks = (\n db\n .prepare(\n \"SELECT COUNT(*) AS n FROM vault_links WHERE source_path IN (SELECT vault_path FROM vault_files WHERE indexed_at > ?)\",\n )\n .get(cutoff) as { n: number }\n ).n;\n }\n\n // --- deadLinks ---\n let deadLinks: DeadLink[] = [];\n if (include.includes(\"dead_links\")) {\n if (scope === \"full\") {\n deadLinks = (\n db\n .prepare(\n \"SELECT source_path, target_raw, line_number FROM vault_links WHERE target_path IS NULL\",\n )\n .all() as Array<{ source_path: string; target_raw: string; line_number: number }>\n ).map((r) => ({\n sourcePath: r.source_path,\n targetRaw: r.target_raw,\n lineNumber: r.line_number,\n }));\n } else if (scope === \"project\") {\n const prefix = options.projectPath ?? 
\"\";\n deadLinks = (\n db\n .prepare(\n \"SELECT source_path, target_raw, line_number FROM vault_links WHERE target_path IS NULL AND source_path LIKE ? || '%'\",\n )\n .all(prefix) as Array<{ source_path: string; target_raw: string; line_number: number }>\n ).map((r) => ({\n sourcePath: r.source_path,\n targetRaw: r.target_raw,\n lineNumber: r.line_number,\n }));\n } else {\n const days = options.recentDays ?? 30;\n const cutoff = computedAt - days * 86400000;\n deadLinks = (\n db\n .prepare(\n \"SELECT source_path, target_raw, line_number FROM vault_links WHERE target_path IS NULL AND source_path IN (SELECT vault_path FROM vault_files WHERE indexed_at > ?)\",\n )\n .all(cutoff) as Array<{ source_path: string; target_raw: string; line_number: number }>\n ).map((r) => ({\n sourcePath: r.source_path,\n targetRaw: r.target_raw,\n lineNumber: r.line_number,\n }));\n }\n }\n\n // --- orphans ---\n let orphans: string[] = [];\n if (include.includes(\"orphans\")) {\n if (scope === \"full\") {\n orphans = (\n db\n .prepare(\"SELECT vault_path FROM vault_health WHERE is_orphan = 1\")\n .all() as Array<{ vault_path: string }>\n ).map((r) => r.vault_path);\n } else if (scope === \"project\") {\n const prefix = options.projectPath ?? \"\";\n orphans = (\n db\n .prepare(\n \"SELECT vault_path FROM vault_health WHERE is_orphan = 1 AND vault_path LIKE ? || '%'\",\n )\n .all(prefix) as Array<{ vault_path: string }>\n ).map((r) => r.vault_path);\n } else {\n const days = options.recentDays ?? 
30;\n const cutoff = computedAt - days * 86400000;\n orphans = (\n db\n .prepare(\n \"SELECT vh.vault_path FROM vault_health vh JOIN vault_files vf ON vh.vault_path = vf.vault_path WHERE vh.is_orphan = 1 AND vf.indexed_at > ?\",\n )\n .all(cutoff) as Array<{ vault_path: string }>\n ).map((r) => r.vault_path);\n }\n }\n\n // --- disconnectedClusters (union-find) ---\n let disconnectedClusters = 1;\n if (include.includes(\"disconnected\")) {\n let allNodes: string[];\n let allEdges: Array<{ source: string; target: string }>;\n\n if (scope === \"full\") {\n allNodes = (\n db.prepare(\"SELECT vault_path FROM vault_files\").all() as Array<{ vault_path: string }>\n ).map((r) => r.vault_path);\n\n allEdges = (\n db\n .prepare(\n \"SELECT DISTINCT source_path AS source, target_path AS target FROM vault_links WHERE target_path IS NOT NULL\",\n )\n .all() as Array<{ source: string; target: string }>\n );\n } else if (scope === \"project\") {\n const prefix = options.projectPath ?? \"\";\n allNodes = (\n db\n .prepare(\"SELECT vault_path FROM vault_files WHERE vault_path LIKE ? || '%'\")\n .all(prefix) as Array<{ vault_path: string }>\n ).map((r) => r.vault_path);\n\n allEdges = (\n db\n .prepare(\n \"SELECT DISTINCT source_path AS source, target_path AS target FROM vault_links WHERE target_path IS NOT NULL AND source_path LIKE ? || '%'\",\n )\n .all(prefix) as Array<{ source: string; target: string }>\n );\n } else {\n const days = options.recentDays ?? 
30;\n const cutoff = computedAt - days * 86400000;\n allNodes = (\n db\n .prepare(\"SELECT vault_path FROM vault_files WHERE indexed_at > ?\")\n .all(cutoff) as Array<{ vault_path: string }>\n ).map((r) => r.vault_path);\n\n allEdges = (\n db\n .prepare(\n \"SELECT DISTINCT source_path AS source, target_path AS target FROM vault_links WHERE target_path IS NOT NULL AND source_path IN (SELECT vault_path FROM vault_files WHERE indexed_at > ?)\",\n )\n .all(cutoff) as Array<{ source: string; target: string }>\n );\n }\n\n disconnectedClusters = countComponents(allNodes, allEdges);\n }\n\n // --- lowConnectivity ---\n let lowConnectivity: string[] = [];\n if (include.includes(\"low_connectivity\")) {\n if (scope === \"full\") {\n lowConnectivity = (\n db\n .prepare(\n \"SELECT vault_path FROM vault_health WHERE inbound_count + outbound_count <= 1\",\n )\n .all() as Array<{ vault_path: string }>\n ).map((r) => r.vault_path);\n } else if (scope === \"project\") {\n const prefix = options.projectPath ?? \"\";\n lowConnectivity = (\n db\n .prepare(\n \"SELECT vault_path FROM vault_health WHERE inbound_count + outbound_count <= 1 AND vault_path LIKE ? || '%'\",\n )\n .all(prefix) as Array<{ vault_path: string }>\n ).map((r) => r.vault_path);\n } else {\n const days = options.recentDays ?? 30;\n const cutoff = computedAt - days * 86400000;\n lowConnectivity = (\n db\n .prepare(\n \"SELECT vh.vault_path FROM vault_health vh JOIN vault_files vf ON vh.vault_path = vf.vault_path WHERE vh.inbound_count + vh.outbound_count <= 1 AND vf.indexed_at > ?\",\n )\n .all(cutoff) as Array<{ vault_path: string }>\n ).map((r) => r.vault_path);\n }\n }\n\n // --- healthScore ---\n const deadRatio = totalLinks > 0 ? deadLinks.length / totalLinks : 0;\n const orphanRatio = totalFiles > 0 ? orphans.length / totalFiles : 0;\n const lowConnRatio = totalFiles > 0 ? 
lowConnectivity.length / totalFiles : 0;\n const healthScore = Math.round(\n 100 * (1 - deadRatio) * (1 - orphanRatio * 0.5) * (1 - lowConnRatio * 0.3),\n );\n\n return {\n totalFiles,\n totalLinks,\n deadLinks,\n orphans,\n disconnectedClusters,\n lowConnectivity,\n healthScore,\n computedAt,\n };\n}\n","import type { Database } from \"better-sqlite3\";\nimport { deserializeEmbedding, cosineSimilarity } from \"../memory/embeddings.js\";\nimport { basename } from \"node:path\";\n\nexport interface SuggestOptions {\n notePath: string;\n vaultProjectId: number;\n limit?: number;\n excludeLinked?: boolean;\n}\n\nexport interface Suggestion {\n path: string;\n title: string | null;\n score: number;\n semanticScore: number;\n tagScore: number;\n neighborScore: number;\n reason: string;\n suggestedWikilink: string;\n}\n\nconst MAX_CHUNKS = 5000;\nconst SEMANTIC_WEIGHT = 0.5;\nconst TAG_WEIGHT = 0.2;\nconst NEIGHBOR_WEIGHT = 0.3;\n\n// Stop words to ignore when generating tag/label strings\nconst STOP_WORDS = new Set([\n \"a\", \"an\", \"the\", \"and\", \"or\", \"but\", \"in\", \"on\", \"at\", \"to\", \"for\",\n \"of\", \"with\", \"by\", \"from\", \"is\", \"it\", \"as\", \"be\", \"was\", \"are\",\n \"has\", \"had\", \"have\", \"not\", \"this\", \"that\", \"i\", \"my\", \"we\", \"our\",\n]);\n\nfunction extractTagsFromChunkTexts(texts: string[]): Set<string> {\n const tags = new Set<string>();\n for (const text of texts) {\n // Match YAML frontmatter tags block: \"tags:\\n - tag1\\n - tag2\"\n const match = text.match(/^tags:\\s*\\n((?:[ \\t]*-[ \\t]*.+\\n?)*)/m);\n if (!match) continue;\n const block = match[1];\n const lines = block.split(\"\\n\");\n for (const line of lines) {\n const tagMatch = line.match(/^[ \\t]*-[ \\t]*(.+)/);\n if (tagMatch) {\n const tag = tagMatch[1].trim().toLowerCase();\n if (tag) tags.add(tag);\n }\n }\n }\n return tags;\n}\n\nfunction getFileAvgEmbedding(\n db: Database,\n projectId: number,\n path: string,\n): Float32Array | null {\n const 
rows = db\n .prepare(\n `SELECT embedding FROM memory_chunks\n WHERE project_id = ? AND path = ? AND embedding IS NOT NULL`,\n )\n .all(projectId, path) as Array<{ embedding: Buffer }>;\n\n if (rows.length === 0) return null;\n\n const first = deserializeEmbedding(rows[0].embedding);\n const sum = new Float32Array(first.length);\n for (const row of rows) {\n const vec = deserializeEmbedding(row.embedding);\n for (let i = 0; i < vec.length; i++) {\n sum[i] += vec[i];\n }\n }\n const avg = new Float32Array(sum.length);\n for (let i = 0; i < sum.length; i++) {\n avg[i] = sum[i] / rows.length;\n }\n return avg;\n}\n\nfunction getAllFileEmbeddings(\n db: Database,\n projectId: number,\n): Map<string, Float32Array> {\n const rows = db\n .prepare(\n `SELECT path, embedding FROM memory_chunks\n WHERE project_id = ? AND embedding IS NOT NULL\n ORDER BY path, start_line\n LIMIT ?`,\n )\n .all(projectId, MAX_CHUNKS) as Array<{ path: string; embedding: Buffer }>;\n\n const byPath = new Map<string, { sum: Float32Array; count: number }>();\n for (const row of rows) {\n const vec = deserializeEmbedding(row.embedding);\n const entry = byPath.get(row.path);\n if (!entry) {\n byPath.set(row.path, { sum: new Float32Array(vec), count: 1 });\n } else {\n for (let i = 0; i < vec.length; i++) {\n entry.sum[i] += vec[i];\n }\n entry.count++;\n }\n }\n\n const result = new Map<string, Float32Array>();\n for (const [path, { sum, count }] of byPath) {\n const avg = new Float32Array(sum.length);\n for (let i = 0; i < sum.length; i++) {\n avg[i] = sum[i] / count;\n }\n result.set(path, avg);\n }\n return result;\n}\n\nfunction getFileTags(db: Database, projectId: number, path: string): Set<string> {\n const rows = db\n .prepare(\n `SELECT text FROM memory_chunks\n WHERE project_id = ? 
AND path = ?\n ORDER BY start_line\n LIMIT 5`,\n )\n .all(projectId, path) as Array<{ text: string }>;\n return extractTagsFromChunkTexts(rows.map((r) => r.text));\n}\n\nfunction jaccardSimilarity(a: Set<string>, b: Set<string>): number {\n if (a.size === 0 && b.size === 0) return 0;\n let intersection = 0;\n for (const tag of a) {\n if (b.has(tag)) intersection++;\n }\n const union = a.size + b.size - intersection;\n return union === 0 ? 0 : intersection / union;\n}\n\nfunction buildReason(\n semanticScore: number,\n tagScore: number,\n neighborScore: number,\n neighborCount: number,\n): string {\n const signals: Array<{ label: string; value: number }> = [\n { label: `Semantically similar (${semanticScore.toFixed(2)})`, value: semanticScore * SEMANTIC_WEIGHT },\n { label: `Shared tags (${tagScore.toFixed(2)} Jaccard)`, value: tagScore * TAG_WEIGHT },\n { label: `Linked by ${neighborCount} mutual connection${neighborCount !== 1 ? \"s\" : \"\"}`, value: neighborScore * NEIGHBOR_WEIGHT },\n ];\n signals.sort((a, b) => b.value - a.value);\n return signals[0].label;\n}\n\nfunction suggestedWikilink(vaultPath: string): string {\n const base = basename(vaultPath);\n const name = base.endsWith(\".md\") ? base.slice(0, -3) : base;\n return `[[${name}]]`;\n}\n\n/**\n * Proactively find notes worth linking to a given note, combining semantic similarity,\n * shared tags, and graph-neighborhood signals into a ranked list of suggestions.\n */\nexport async function zettelSuggest(\n db: Database,\n opts: SuggestOptions,\n): Promise<Suggestion[]> {\n const limit = opts.limit ?? 5;\n const excludeLinked = opts.excludeLinked ?? true;\n\n // Step 1: get current outbound links\n const outboundRows = db\n .prepare(\n `SELECT target_path FROM vault_links\n WHERE source_path = ? 
AND target_path IS NOT NULL`,\n )\n .all(opts.notePath) as Array<{ target_path: string }>;\n const linkedPaths = new Set(outboundRows.map((r) => r.target_path));\n\n // Step 2: get source embedding\n const sourceEmbedding = getFileAvgEmbedding(db, opts.vaultProjectId, opts.notePath);\n\n // Step 3a: get all file-level embeddings for semantic scoring\n const allEmbeddings = getAllFileEmbeddings(db, opts.vaultProjectId);\n allEmbeddings.delete(opts.notePath);\n\n // Step 3b: get source tags\n const sourceTags = getFileTags(db, opts.vaultProjectId, opts.notePath);\n\n // Step 3c: compute graph neighborhood (friends-of-friends)\n const friendTargetRows = db\n .prepare(\n `SELECT DISTINCT target_path AS path FROM vault_links\n WHERE source_path IN (\n SELECT target_path FROM vault_links\n WHERE source_path = ? AND target_path IS NOT NULL\n ) AND target_path IS NOT NULL`,\n )\n .all(opts.notePath) as Array<{ path: string }>;\n\n // For each friend-of-friend, count how many of source's direct friends link to them\n const friendLinkCounts = new Map<string, number>();\n for (const { path } of friendTargetRows) {\n if (path === opts.notePath) continue;\n friendLinkCounts.set(path, (friendLinkCounts.get(path) ?? 
0) + 1);\n }\n const maxFriendLinks = Math.max(1, ...friendLinkCounts.values());\n\n // Get all vault files to enumerate candidates\n const allFiles = db\n .prepare(\"SELECT vault_path, title FROM vault_files\")\n .all() as Array<{ vault_path: string; title: string | null }>;\n\n const suggestions: Suggestion[] = [];\n\n for (const { vault_path, title } of allFiles) {\n if (vault_path === opts.notePath) continue;\n if (excludeLinked && linkedPaths.has(vault_path)) continue;\n\n // Semantic score\n let semanticScore = 0;\n if (sourceEmbedding) {\n const candidateEmbedding = allEmbeddings.get(vault_path);\n if (candidateEmbedding) {\n semanticScore = Math.max(0, cosineSimilarity(sourceEmbedding, candidateEmbedding));\n }\n }\n\n // Tag score (only compute if candidate might have chunks)\n let tagScore = 0;\n if (allEmbeddings.has(vault_path)) {\n const candidateTags = getFileTags(db, opts.vaultProjectId, vault_path);\n tagScore = jaccardSimilarity(sourceTags, candidateTags);\n }\n\n // Neighbor score\n const friendCount = friendLinkCounts.get(vault_path) ?? 
0;\n const neighborScore = friendCount / maxFriendLinks;\n\n const score =\n SEMANTIC_WEIGHT * semanticScore +\n TAG_WEIGHT * tagScore +\n NEIGHBOR_WEIGHT * neighborScore;\n\n // Only include if there is at least some signal\n if (score <= 0) continue;\n\n const reason = buildReason(semanticScore, tagScore, neighborScore, friendCount);\n\n suggestions.push({\n path: vault_path,\n title,\n score,\n semanticScore,\n tagScore,\n neighborScore,\n reason,\n suggestedWikilink: suggestedWikilink(vault_path),\n });\n }\n\n suggestions.sort((a, b) => b.score - a.score);\n return suggestions.slice(0, limit);\n}\n"],"mappings":";;;;;AA2BA,SAAS,aAAa,QAAgB,QAA8C;AAClF,QAAO,QAAQ,OAAO,KAAK,QAAQ,OAAO,GAAG,eAAe;;AAG9D,SAAS,aAAa,IAAc,WAAkC;CACpE,MAAM,UAAU,GACb,QAAQ,0DAA0D,CAClE,IAAI,UAAU;AACjB,KAAI,QAAS,QAAO,QAAQ;CAE5B,MAAM,QAAQ,GACX,QAAQ,gEAAgE,CACxE,IAAI,UAAU;AACjB,KAAI,CAAC,MAAO,QAAO;CAEnB,MAAM,YAAY,GACf,QAAQ,0DAA0D,CAClE,IAAI,MAAM,eAAe;AAC5B,QAAO,YAAY,UAAU,aAAa;;AAG5C,SAAS,oBAAoB,IAAc,MAAwB;AACjE,QACE,GACG,QACC,wFACD,CACA,IAAI,KAAK,CACZ,KAAK,MAAM,EAAE,YAAY;;AAG7B,SAAS,qBAAqB,IAAc,MAAwB;AAClE,QACE,GACG,QACC,4DACD,CACA,IAAI,KAAK,CACZ,KAAK,MAAM,EAAE,YAAY;;AAG7B,SAAS,YACP,IACA,MAC6D;CAC7D,MAAM,OAAO,GACV,QAAQ,qDAAqD,CAC7D,IAAI,KAAK;CAEZ,MAAM,SAAS,GACZ,QAAQ,8EAA8E,CACtF,IAAI,KAAK;AAEZ,QAAO;EACL,OAAO,MAAM,SAAS;EACtB,SAAS,QAAQ,iBAAiB;EAClC,UAAU,QAAQ,kBAAkB;EACrC;;;;;;AAOH,SAAgB,cAAc,IAAc,MAAqC;CAC/E,MAAM,QAAQ,KAAK,IAAI,KAAK,IAAI,KAAK,SAAS,GAAG,EAAE,EAAE,GAAG;CACxD,MAAM,YAAY,KAAK,aAAa;CACpC,MAAM,OAAO,KAAK,QAAQ;CAE1B,MAAM,OAAO,aAAa,IAAI,KAAK,UAAU;AAC7C,KAAI,CAAC,KACH,QAAO;EACL,MAAM,KAAK;EACX,OAAO,EAAE;EACT,OAAO,EAAE;EACT,iBAAiB,EAAE;EACnB,iBAAiB;EAClB;CAGH,MAAM,UAAU,IAAI,IAAY,CAAC,KAAK,CAAC;CACvC,MAAM,QAAuB,EAAE;CAC/B,MAAM,QAAiF,EAAE;CACzF,IAAI,kBAAkB;CAEtB,MAAM,QAAgD,CAAC;EAAE,MAAM;EAAM,OAAO;EAAG,CAAC;AAEhF,QAAO,MAAM,SAAS,GAAG;EACvB,MAAM,UAAU,MAAM,OAAO;AAE7B,MAAI,QAAQ,SAAS,OAAO;AAC1B,qBAAkB;AAClB;;EAGF,MAAM,YAAmE,EAAE;AAE3E,MAAI,cAAc,aAAa,cAAc,OAC3C,MAAK,MAAM,KAAK,oBAAoB,IAAI,QAAQ,KAAK,CA
CnD,WAAU,KAAK;GAAE,UAAU;GAAG,MAAM,QAAQ;GAAM,IAAI;GAAG,CAAC;AAI9D,MAAI,cAAc,cAAc,cAAc,OAC5C,MAAK,MAAM,KAAK,qBAAqB,IAAI,QAAQ,KAAK,CACpD,WAAU,KAAK;GAAE,UAAU;GAAG,MAAM;GAAG,IAAI,QAAQ;GAAM,CAAC;AAI9D,OAAK,MAAM,EAAE,UAAU,MAAM,QAAQ,WAAW;GAC9C,MAAM,WAAW,aAAa,MAAM,GAAG;AAEvC,OAAI,SAAS,SAAS,aAAa,KACjC;AAGc,MAAG,KAAH,EAAW,GAAX;AAEhB,OAAI,CADmB,MAAM,MAAM,MAAM,EAAE,SAAS,QAAQ,EAAE,OAAO,GAAG,CAEtE,OAAM,KAAK;IAAE;IAAM;IAAI,MAAM;IAAU,CAAC;AAG1C,OAAI,CAAC,QAAQ,IAAI,SAAS,EAAE;AAC1B,YAAQ,IAAI,SAAS;IAErB,MAAM,OAAO,YAAY,IAAI,SAAS;AACtC,UAAM,KAAK;KACT,MAAM;KACN,OAAO,KAAK;KACZ,OAAO,QAAQ,QAAQ;KACvB,UAAU;KACV,SAAS,KAAK;KACd,UAAU,KAAK;KAChB,CAAC;AAEF,UAAM,KAAK;KAAE,MAAM;KAAU,OAAO,QAAQ,QAAQ;KAAG,CAAC;;;;CAK9D,MAAM,kBAAkB,MACrB,QAAQ,MAAM,EAAE,WAAW,EAAE,CAC7B,KAAK,MAAM,EAAE,KAAK;AAGrB,KADiB,YAAY,IAAI,KAAK,CACzB,WAAW,EACtB,iBAAgB,QAAQ,KAAK;AAG/B,QAAO;EAAE;EAAM;EAAO;EAAO;EAAiB;EAAiB;;;;;ACvJjE,MAAMA,eAAa;AACnB,MAAM,cAAc;AAEpB,SAAS,kBACP,IACA,WACwD;CACxD,MAAM,OAAO,GACV,QACC;;;gBAID,CACA,IAAI,WAAWA,aAAW;CAO7B,MAAM,yBAAS,IAAI,KAAiE;AAEpF,MAAK,MAAM,OAAO,MAAM;EACtB,MAAM,MAAM,qBAAqB,IAAI,UAAU;EAC/C,MAAM,QAAQ,OAAO,IAAI,IAAI,KAAK;AAClC,MAAI,CAAC,MACH,QAAO,IAAI,IAAI,MAAM;GAAE,KAAK,IAAI,aAAa,IAAI;GAAE,OAAO;GAAG,MAAM,IAAI;GAAM,CAAC;OACzE;AACL,QAAK,IAAI,IAAI,GAAG,IAAI,IAAI,QAAQ,IAC9B,OAAM,IAAI,MAAM,IAAI;AAEtB,SAAM;;;CAIV,MAAM,yBAAS,IAAI,KAAwD;AAC3E,MAAK,MAAM,CAAC,MAAM,EAAE,KAAK,OAAO,WAAW,QAAQ;EACjD,MAAM,MAAM,IAAI,aAAa,IAAI,OAAO;AACxC,OAAK,IAAI,IAAI,GAAG,IAAI,IAAI,QAAQ,IAC9B,KAAI,KAAK,IAAI,KAAK;AAEpB,SAAO,IAAI,MAAM;GAAE,WAAW;GAAK;GAAM,CAAC;;AAE5C,QAAO;;AAGT,SAAS,sBACP,IACA,WACA,MAC6C;CAC7C,MAAM,OAAO,GACV,QACC;oEAED,CACA,IAAI,WAAW,KAAK;AAEvB,KAAI,KAAK,WAAW,EAClB,QAAO;EAAE,WAAW,IAAI,aAAa,EAAE;EAAE,OAAO;EAAO;CAGzD,MAAM,MAAM,qBAAqB,KAAK,GAAG,UAAU,CAAC;CACpD,MAAM,MAAM,IAAI,aAAa,IAAI;AACjC,MAAK,MAAM,OAAO,MAAM;EACtB,MAAM,MAAM,qBAAqB,IAAI,UAAU;AAC/C,OAAK,IAAI,IAAI,GAAG,IAAI,KAAK,IACvB,KAAI,MAAM,IAAI;;CAGlB,MAAM,MAAM,IAAI,aAAa,IAAI;AACjC,MAAK,IAAI,IAAI,GAAG,IAAI,KAAK,IACvB,KAAI,KAAK,IAAI,KAAK,KAAK;AAEzB,QAAO;EAAE,WAAW;
EAAK,OAAO;EAAM;;AAGxC,SAAS,iBAAiB,IAAc,QAAgB,QAAwB;AAC9E,KAAI,WAAW,OAAQ,QAAO;CAE9B,MAAM,UAAU,IAAI,IAAY,CAAC,OAAO,CAAC;CACzC,MAAM,QAA+C,CAAC;EAAE,MAAM;EAAQ,MAAM;EAAG,CAAC;AAEhF,QAAO,MAAM,SAAS,GAAG;EACvB,MAAM,EAAE,MAAM,SAAS,MAAM,OAAO;AACpC,MAAI,QAAQ,YAAa;EAEzB,MAAM,YAAY,GACf,QACC;;;;gCAKD,CACA,IAAI,MAAM,KAAK;AAElB,OAAK,MAAM,EAAE,cAAc,WAAW;AACpC,OAAI,aAAa,OAAQ,QAAO,OAAO;AACvC,OAAI,CAAC,QAAQ,IAAI,SAAS,EAAE;AAC1B,YAAQ,IAAI,SAAS;AACrB,UAAM,KAAK;KAAE,MAAM;KAAU,MAAM,OAAO;KAAG,CAAC;;;;AAKpD,QAAO;;AAGT,SAAS,iBACP,IACA,WACA,MACA,cACQ;CACR,MAAM,OAAO,GACV,QACC;;iBAGD,CACA,IAAI,WAAW,KAAK;AAEvB,KAAI,KAAK,WAAW,EAAG,QAAO;CAE9B,IAAI,WAAW,KAAK,GAAG;CACvB,IAAI,UAAU;AAEd,MAAK,MAAM,OAAO,MAAM;EAEtB,MAAM,MAAM,iBAAiB,cADjB,qBAAqB,IAAI,UAAU,CACA;AAC/C,MAAI,MAAM,SAAS;AACjB,aAAU;AACV,cAAW,IAAI;;;AAInB,QAAO,SAAS,MAAM,CAAC,MAAM,GAAG,IAAI;;;;;;AAOtC,eAAsB,eACpB,IACA,MAC2B;CAC3B,MAAM,QAAQ,KAAK,SAAS;CAC5B,MAAM,gBAAgB,KAAK,iBAAiB;CAC5C,MAAM,mBAAmB,KAAK,oBAAoB;CAElD,IAAI,EAAE,WAAW,cAAc,UAAU,sBACvC,IACA,KAAK,gBACL,KAAK,cACN;AAGD,KAAI,CAAC,MAKH,gBAAe,MAAM,kBAJR,GACV,QAAQ,qDAAqD,CAC7D,IAAI,KAAK,cAAc,EACP,SAAS,KAAK,eACY,KAAK;CAGpD,MAAM,oBAAoB,kBAAkB,IAAI,KAAK,eAAe;AAGpE,mBAAkB,OAAO,KAAK,cAAc;CAG5C,MAAM,qBAA2D,EAAE;AACnE,MAAK,MAAM,CAAC,MAAM,EAAE,gBAAgB,mBAAmB;EACrD,MAAM,MAAM,iBAAiB,cAAc,UAAU;AACrD,MAAI,OAAO,cACT,oBAAmB,KAAK;GAAE;GAAM;GAAK,CAAC;;CAK1C,MAAM,UAA4B,EAAE;AAEpC,MAAK,MAAM,EAAE,MAAM,SAAS,oBAAoB;EAC9C,MAAM,gBAAgB,iBAAiB,IAAI,KAAK,eAAe,KAAK;EAEpE,MAAM,oBAAoB,SAAS,cAAc,GAAG,gBAAgB;AACpE,MAAI,oBAAoB,iBAAkB;EAE1C,MAAM,OAAO,GACV,QAAQ,qDAAqD,CAC7D,IAAI,KAAK;EAEZ,MAAM,gBAAgB,MAAM,KAAK,KAAK,oBAAoB,EAAE;EAC5D,MAAM,gBAAgB,iBAAiB,IAAI,KAAK,gBAAgB,MAAM,aAAa;AAEnF,UAAQ,KAAK;GACX;GACA,OAAO,MAAM,SAAS;GACtB,kBAAkB;GAClB,eAAe,SAAS,cAAc,GAAG,gBAAgB;GACzD;GACA;GACD,CAAC;;AAGJ,SAAQ,MAAM,GAAG,MAAM,EAAE,gBAAgB,EAAE,cAAc;AACzD,QAAO,QAAQ,MAAM,GAAG,MAAM;;;;;;AC3LhC,SAAS,cAAc,WAA2B;CAChD,MAAM,QAAQ,UAAU,QAAQ,IAAI;AACpC,QAAO,UAAU,KAAK,YAAY,UAAU,MAAM,GAAG,MAAM;;;;;;AAO7D,SAAS,gBAAgB,IAAc,OAA8B;AACnE,KAAI,MAAM,SAAS,EAAG,QAAO,E
AAE;CAE/B,MAAM,eAAe,MAAM,KAAK,MAAM,CAAC,UAAU,IAAI,CAAC,KAAK,KAAK;CAChE,MAAM,WAAW,MAAM,KAAK,MAAM;CAElC,MAAM,UAAU,GACb,QACC,sEAAsE,aAAa,+BACpF,CACA,IAAI,GAAG,SAAS;CAEnB,MAAM,WAAW,GACd,QACC,sEAAsE,aAAa,GACpF,CACA,IAAI,GAAG,SAAS;CAEnB,MAAM,YAAsB,EAAE;AAC9B,MAAK,MAAM,KAAK,QAAS,WAAU,KAAK,EAAE,YAAY;AACtD,MAAK,MAAM,KAAK,SAAU,WAAU,KAAK,EAAE,YAAY;AACvD,QAAO;;;;;;AAOT,SAAS,SAAS,IAAc,MAA6B;AAI3D,QAHY,GACT,QAAQ,qDAAqD,CAC7D,IAAI,KAAK,EACA,SAAS;;;;;;AAOvB,SAAS,gBAAgB,IAAc,MAAsB;AAI3D,QAHY,GACT,QAAQ,8DAA8D,CACtE,IAAI,KAAK,EACA,iBAAiB;;;;;;;AAY/B,eAAsB,eACpB,IACA,MACyB;CACzB,MAAM,QAAQ,KAAK,IAAI,KAAK,SAAS,GAAG,EAAE;CAC1C,MAAM,QAAQ,KAAK,IAAI,KAAK,SAAS,IAAI,EAAE;CAC3C,MAAM,iBAAiB;CAKvB,MAAM,iBAAiB,MAAM,kBAAkB,KAAK,UAAU,KAAK;CAEnE,MAAM,gBAAgB,mBACpB,IACA,KAAK,UACL,gBACA;EACE,YAAY,CAAC,KAAK,eAAe;EACjC,YAAY;EACb,CACF;CAGD,MAAM,6BAAa,IAAI,KAAiD;AACxE,MAAK,MAAM,KAAK,eAAe;EAC7B,MAAM,WAAW,WAAW,IAAI,EAAE,KAAK;AACvC,MAAI,CAAC,YAAY,EAAE,QAAQ,SAAS,MAClC,YAAW,IAAI,EAAE,MAAM;GAAE,OAAO,EAAE;GAAO,SAAS,EAAE;GAAS,CAAC;;CAOlE,MAAM,WAAW,IAAI,IAAY,WAAW,MAAM,CAAC;CACnD,IAAI,WAAW,IAAI,IAAY,WAAW,MAAM,CAAC;AAEjD,MAAK,IAAI,IAAI,GAAG,IAAI,OAAO,KAAK;EAC9B,MAAM,YAAY,gBAAgB,IAAI,SAAS;EAC/C,MAAM,8BAAc,IAAI,KAAa;AACrC,OAAK,MAAM,KAAK,UACd,KAAI,CAAC,SAAS,IAAI,EAAE,EAAE;AACpB,YAAS,IAAI,EAAE;AACf,eAAY,IAAI,EAAE;;AAGtB,MAAI,YAAY,SAAS,EAAG;AAC5B,aAAW;;CAOb,MAAM,eAAe,MAAM,KAAK,WAAW,SAAS,CAAC,CAClD,MAAM,GAAG,MAAM,EAAE,GAAG,QAAQ,EAAE,GAAG,MAAM,CACvC,KAAK,CAAC,MAAM,WAAW;EAAE;EAAM,GAAG;EAAM,gBAAgB;EAAM,EAAE;CAMnE,MAAM,iBAJgB,MAAM,KAAK,SAAS,CAAC,QAAQ,MAAM,CAAC,WAAW,IAAI,EAAE,CAAC,CAKzE,KAAK,UAAU;EACd;EACA,OAAO;EACP,SAAS;EACT,SAAS,gBAAgB,IAAI,KAAK;EAClC,gBAAgB;EACjB,EAAE,CACF,MAAM,GAAG,MAAM,EAAE,UAAU,EAAE,QAAQ;CAGxC,MAAM,qBAAqB,KAAK,IAAI,QAAQ,aAAa,QAAQ,EAAE;CACnE,MAAM,oBAAoB,eAAe,MAAM,GAAG,mBAAmB;CAErE,MAAM,sBAAsB,aAAa,MAAM,GAAG,MAAM;CACxD,MAAM,gBAAgB,IAAI,IAAY,CACpC,GAAG,oBAAoB,KAAK,MAAM,EAAE,KAAK,EACzC,GAAG,kBAAkB,KAAK,MAAM,EAAE,KAAK,CACxC,CAAC;CAKF,MAAM,gBAAiD,EAAE;AAEzD,MAAK,MAAM,KAAK,qBAAqB;AACnC,MAAI,CAAC,cAAc,IAAI,EAAE,K
AAK,CAAE;AAChC,gBAAc,KAAK;GACjB,MAAM,EAAE;GACR,OAAO,SAAS,IAAI,EAAE,KAAK;GAC3B,SAAS,EAAE;GACX,OAAO,EAAE;GACT,QAAQ,cAAc,EAAE,KAAK;GAC9B,CAAC;;AAGJ,MAAK,MAAM,KAAK,kBACd,eAAc,KAAK;EACjB,MAAM,EAAE;EACR,OAAO,SAAS,IAAI,EAAE,KAAK;EAC3B,SAAS,EAAE;EACX,OAAO;EACP,QAAQ,cAAc,EAAE,KAAK;EAC9B,CAAC;CAMJ,IAAI,cAAoC,EAAE;AAE1C,KAAI,cAAc,OAAO,GAAG;EAC1B,MAAM,WAAW,MAAM,KAAK,cAAc;EAC1C,MAAM,eAAe,SAAS,UAAU,IAAI,CAAC,KAAK,KAAK;EAEvD,MAAM,WAAW,GACd,QACC;;iCAEyB,aAAa;iCACb,aAAa;4CAEvC,CACA,IAAI,GAAG,UAAU,GAAG,SAAS;AAMhC,OAAK,MAAM,OAAO,SAChB,aAAY,KAAK;GACf,UAAU,IAAI;GACd,QAAQ,IAAI;GACZ,YAAY,cAAc,IAAI,YAAY;GAC1C,UAAU,cAAc,IAAI,YAAY;GACxC,UAAU,IAAI;GACf,CAAC;;CAON,MAAM,YAAY,IAAI,IAAY,cAAc,KAAK,MAAM,EAAE,OAAO,CAAC;CACrE,MAAM,UAAU,MAAM,KAAK,UAAU,CAAC,MAAM;CAE5C,MAAM,yBAAyB,YAAY,QACxC,MAAM,EAAE,eAAe,EAAE,SAC3B;CAKD,MAAM,eAAe,cAClB,KAAK,GAAG,MAAM;EACb,MAAM,QAAQ,EAAE,QAAQ,IAAI,EAAE,MAAM,KAAK;EACzC,MAAM,SAAS,EAAE;EACjB,MAAM,aAAa,EAAE,QAAQ,IAAI,gBAAgB,EAAE,MAAM,QAAQ,EAAE,CAAC,KAAK;EACzE,MAAM,UAAU,EAAE,QAAQ,MAAM,CAAC,MAAM,GAAG,IAAI;AAC9C,SAAO,GAAG,IAAI,EAAE,KAAK,OAAO,IAAI,QAAQ,WAAW,aAAa,EAAE,KAAK,QAAQ,QAAQ;GACvF,CACD,KAAK,OAAO;CAEf,MAAM,oBACJ,uBAAuB,SAAS,IAC5B,uBACG,KACE,MACC,MAAM,EAAE,SAAS,KAAK,EAAE,WAAW,OAAO,EAAE,OAAO,KAAK,EAAE,SAAS,eAAe,EAAE,SAAS,GAChG,CACA,KAAK,KAAK,GACb;CAEN,MAAM,aAAa,QAAQ,KAAK,KAAK;AAgCrC,QAAO;EACL;EACA,aAAa;EACb;EACA,iBAlCsB;;YAEd,KAAK,SAAS;;;;kBAIR,cAAc,OAAO,gBAAgB,QAAQ,OAAO,cAAc,WAAW;;EAE7F,aAAa;;;;;;EAMb,kBAAkB;;;;;;;;;gFAS4D,WAAW;;;;;;EAYxF;;;;;ACrSH,MAAMC,eAAa;AAEnB,MAAM,aAAa,IAAI,IAAI;CACzB;CAAK;CAAM;CAAO;CAAO;CAAM;CAAO;CAAM;CAAM;CAAM;CAAM;CAC9D;CAAM;CAAQ;CAAM;CAAQ;CAAM;CAAM;CAAM;CAAM;CAAO;CAC3D;CAAO;CAAO;CAAQ;CAAO;CAAQ;CAAQ;CAAK;CAAM;CAAM;CAC9D;CAAO;CAAQ;CAAY;CAAQ;CAAQ;CAC5C,CAAC;AAEF,SAAS,aAAa,WAA2B;CAC/C,MAAM,QAAQ,UAAU,MAAM,IAAI;AAClC,QAAO,MAAM,SAAS,IAAI,MAAM,KAAK;;AAGvC,SAAS,cAAc,QAAsC;CAC3D,MAAM,6BAAa,IAAI,KAAqB;AAC5C,MAAK,MAAM,SAAS,QAAQ;AAC1B,MAAI,CAAC,MAAO;EACZ,MAAM,QAAQ,MACX,aAAa,CACb,QAAQ,gBAAgB,IAAI,CAC5B,MAAM,MAAM,CACZ,QAAQ,MAAM,EAAE,SAAS,KAAK,CAAC,WAAW,IAAI,EA
AE,CAAC;AACpD,OAAK,MAAM,QAAQ,MACjB,YAAW,IAAI,OAAO,WAAW,IAAI,KAAK,IAAI,KAAK,EAAE;;AAIzD,QADe,CAAC,GAAG,WAAW,SAAS,CAAC,CAAC,MAAM,GAAG,MAAM,EAAE,KAAK,EAAE,GAAG,CAEjE,MAAM,GAAG,EAAE,CACX,KAAK,CAAC,OAAO,EAAE,CACf,KAAK,MAAM;;AAGhB,SAAS,mBAAmB,IAAc,OAAyB;AACjE,KAAI,MAAM,SAAS,EAAG,QAAO;CAC7B,MAAM,aAAc,MAAM,UAAU,MAAM,SAAS,KAAM;CACzD,MAAM,UAAU,IAAI,IAAI,MAAM;CAC9B,IAAI,cAAc;AAElB,MAAK,MAAM,QAAQ,OAAO;EACxB,MAAM,OAAO,GACV,QACC;4DAED,CACA,IAAI,KAAK;AACZ,OAAK,MAAM,EAAE,iBAAiB,KAC5B,KAAI,QAAQ,IAAI,YAAY,CAC1B;;CAMN,MAAM,cAAc,cAAc;AAClC,QAAO,KAAK,IAAI,GAAG,cAAc,WAAW;;AAU9C,SAAS,kBAAkB,YAA0C;AACnE,KAAI,WAAW,WAAW,EAAG,QAAO,IAAI,aAAa,EAAE;CACvD,MAAM,MAAM,WAAW,GAAG;CAC1B,MAAM,MAAM,IAAI,aAAa,IAAI;AACjC,MAAK,MAAM,OAAO,WAChB,MAAK,IAAI,IAAI,GAAG,IAAI,KAAK,IACvB,KAAI,MAAM,IAAI;CAGlB,MAAM,MAAM,IAAI,aAAa,IAAI;AACjC,MAAK,IAAI,IAAI,GAAG,IAAI,KAAK,IACvB,KAAI,KAAK,IAAI,KAAK,WAAW;AAE/B,QAAO;;;;;;AAOT,eAAsB,aACpB,IACA,MACsB;CACtB,MAAM,eAAe,KAAK,gBAAgB;CAC1C,MAAM,iBAAiB,KAAK,kBAAkB;CAC9C,MAAM,YAAY,KAAK,aAAa;CACpC,MAAM,sBAAsB,KAAK,uBAAuB;CAExD,MAAM,MAAM,KAAK,KAAK;CACtB,MAAM,OAAO,MAAM,eAAe;CAGlC,MAAM,cAAc,GACjB,QACC,6EACD,CACA,IAAI,KAAK;CAGZ,MAAM,YAAY,GACf,QACC;;;gBAID,CACA,IAAI,KAAK,gBAAgBA,aAAW;CAEvC,MAAM,mCAAmB,IAAI,KAA6B;AAC1D,MAAK,MAAM,OAAO,WAAW;EAC3B,MAAM,MAAM,qBAAqB,IAAI,UAAU;EAC/C,MAAM,MAAM,iBAAiB,IAAI,IAAI,KAAK;AAC1C,MAAI,CAAC,IACH,kBAAiB,IAAI,IAAI,MAAM,CAAC,IAAI,CAAC;MAErC,KAAI,KAAK,IAAI;;CAIjB,MAAM,iCAAiB,IAAI,KAA2B;AACtD,MAAK,MAAM,CAAC,MAAM,SAAS,iBACzB,gBAAe,IAAI,MAAM,kBAAkB,KAAK,CAAC;CAInD,MAAM,WAA0B,EAAE;AAClC,MAAK,MAAM,QAAQ,aAAa;EAC9B,MAAM,YAAY,eAAe,IAAI,KAAK,WAAW;AACrD,MAAI,CAAC,UAAW;AAChB,WAAS,KAAK;GACZ,OAAO,CAAC,KAAK,WAAW;GACxB,QAAQ,CAAC,KAAK,MAAM;GACpB,YAAY,CAAC,KAAK,WAAW;GAC7B,UAAU;GACX,CAAC;;CAGJ,MAAM,qBAAqB,SAAS;CAKpC,IAAI,SAAS;AACb,QAAO,UAAU,SAAS,SAAS,GAAG;AACpC,WAAS;EACT,IAAI,UAAU;EACd,IAAI,QAAQ;EACZ,IAAI,QAAQ;AAEZ,OAAK,IAAI,IAAI,GAAG,IAAI,SAAS,QAAQ,IACnC,MAAK,IAAI,IAAI,IAAI,GAAG,IAAI,SAAS,QAAQ,KAAK;GAC5C,MAAM,MAAM,iBAAiB,SAAS,GAAG,UAAU,SAAS,GAAG,SAAS;AACxE,OAAI,MAAM,SAAS;AACjB,
cAAU;AACV,YAAQ;AACR,YAAQ;;;AAKd,MAAI,UAAU,GAAI;EAGlB,MAAM,KAAK,SAAS;EACpB,MAAM,KAAK,SAAS;EACpB,MAAM,cAAc,CAAC,GAAG,GAAG,OAAO,GAAG,GAAG,MAAM;EAC9C,MAAM,eAAe,CAAC,GAAG,GAAG,QAAQ,GAAG,GAAG,OAAO;EACjD,MAAM,mBAAmB,CAAC,GAAG,GAAG,YAAY,GAAG,GAAG,WAAW;EAG7D,MAAM,mBAAmC,EAAE;AAC3C,OAAK,MAAM,KAAK,aAAa;GAC3B,MAAM,MAAM,eAAe,IAAI,EAAE;AACjC,OAAI,IAAK,kBAAiB,KAAK,IAAI;;AAGrC,WAAS,SAAS;GAChB,OAAO;GACP,QAAQ;GACR,YAAY;GACZ,UAAU,kBAAkB,iBAAiB;GAC9C;AAED,WAAS,OAAO,OAAO,EAAE;AACzB,WAAS;;CAIX,MAAM,SAAyB,EAAE;CACjC,IAAI,eAAe;AAEnB,MAAK,MAAM,WAAW,UAAU;AAC9B,MAAI,QAAQ,MAAM,SAAS,eAAgB;EAE3C,MAAM,QAAQ,cAAc,QAAQ,OAAO,IAAI,SAAS,eAAe;EACvE,MAAM,aACJ,QAAQ,WAAW,QAAQ,KAAK,MAAM,MAAM,GAAG,EAAE,GAAG,QAAQ,WAAW;EAGzE,MAAM,kBADgB,IAAI,IAAI,QAAQ,MAAM,IAAI,aAAa,CAAC,CACxB,OAAO,QAAQ,MAAM;EAE3D,MAAM,cAAc,mBAAmB,IAAI,QAAQ,MAAM;EACzD,MAAM,mBAAmB,cAAc,MAAO,QAAQ,MAAM,UAAU;AAEtE,SAAO,KAAK;GACV,IAAI;GACJ;GACA,OAAO,QAAQ,MAAM,KAAK,MAAM,SAAS;IACvC;IACA,OAAO,QAAQ,OAAO;IACvB,EAAE;GACH,MAAM,QAAQ,MAAM;GACpB;GACA;GACA;GACA;GACD,CAAC;;AAIJ,QAAO,MACJ,GAAG,MACF,EAAE,OAAO,EAAE,mBAAmB,EAAE,aAAa,OAC7C,EAAE,OAAO,EAAE,mBAAmB,EAAE,aAAa,KAChD;AAED,QAAO;EACL,QAAQ,OAAO,MAAM,GAAG,UAAU;EAClC;EACA,YAAY;GAAE;GAAM,IAAI;GAAK;EAC9B;;;;;ACvNH,SAAS,gBAAgB,OAAiB,OAA0D;AAClG,KAAI,MAAM,WAAW,EAAG,QAAO;CAE/B,MAAM,yBAAS,IAAI,KAAqB;CACxC,MAAM,uBAAO,IAAI,KAAqB;AAEtC,MAAK,MAAM,KAAK,OAAO;AACrB,SAAO,IAAI,GAAG,EAAE;AAChB,OAAK,IAAI,GAAG,EAAE;;CAGhB,SAAS,KAAK,GAAmB;EAC/B,IAAI,OAAO;AACX,SAAO,OAAO,IAAI,KAAK,KAAK,KAC1B,QAAO,OAAO,IAAI,KAAK;EAEzB,IAAI,UAAU;AACd,SAAO,YAAY,MAAM;GACvB,MAAM,OAAO,OAAO,IAAI,QAAQ;AAChC,UAAO,IAAI,SAAS,KAAK;AACzB,aAAU;;AAEZ,SAAO;;CAGT,SAAS,MAAM,GAAW,GAAiB;EACzC,MAAM,KAAK,KAAK,EAAE;EAClB,MAAM,KAAK,KAAK,EAAE;AAClB,MAAI,OAAO,GAAI;EACf,MAAM,QAAQ,KAAK,IAAI,GAAG,IAAI;EAC9B,MAAM,QAAQ,KAAK,IAAI,GAAG,IAAI;AAC9B,MAAI,QAAQ,MACV,QAAO,IAAI,IAAI,GAAG;WACT,QAAQ,MACjB,QAAO,IAAI,IAAI,GAAG;OACb;AACL,UAAO,IAAI,IAAI,GAAG;AAClB,QAAK,IAAI,IAAI,QAAQ,EAAE;;;AAI3B,MAAK,MAAM,EAAE,QAAQ,YAAY,MAC/B,KAAI,OAAO,IAAI,OAAO,IAAI,OAAO,IAAI,OAAO,CAC1C,OAAM,QAAQ,OAAO;CAIzB
,MAAM,wBAAQ,IAAI,KAAa;AAC/B,MAAK,MAAM,KAAK,MACd,OAAM,IAAI,KAAK,EAAE,CAAC;AAEpB,QAAO,MAAM;;;;;;AAOf,SAAgB,aAAa,IAAc,MAAoC;CAC7E,MAAM,UAAU,QAAQ,EAAE;CAC1B,MAAM,QAAQ,QAAQ,SAAS;CAC/B,MAAM,UAAU,QAAQ,WAAW;EAAC;EAAc;EAAW;EAAgB;EAAmB;CAEhG,MAAM,aAAa,KAAK,KAAK;CAG7B,IAAI,aAAa;AACjB,KAAI,UAAU,OACZ,cACE,GAAG,QAAQ,wCAAwC,CAAC,KAAK,CACzD;UACO,UAAU,WAAW;EAC9B,MAAM,SAAS,QAAQ,eAAe;AACtC,eACE,GACG,QAAQ,uEAAuE,CAC/E,IAAI,OAAO,CACd;QACG;EAEL,MAAM,SAAS,cADF,QAAQ,cAAc,MACA;AACnC,eACE,GACG,QAAQ,6DAA6D,CACrE,IAAI,OAAO,CACd;;CAIJ,IAAI,aAAa;AACjB,KAAI,UAAU,OACZ,cACE,GAAG,QAAQ,wCAAwC,CAAC,KAAK,CACzD;UACO,UAAU,WAAW;EAC9B,MAAM,SAAS,QAAQ,eAAe;AACtC,eACE,GACG,QAAQ,wEAAwE,CAChF,IAAI,OAAO,CACd;QACG;EAEL,MAAM,SAAS,cADF,QAAQ,cAAc,MACA;AACnC,eACE,GACG,QACC,uHACD,CACA,IAAI,OAAO,CACd;;CAIJ,IAAI,YAAwB,EAAE;AAC9B,KAAI,QAAQ,SAAS,aAAa,CAChC,KAAI,UAAU,OACZ,aACE,GACG,QACC,yFACD,CACA,KAAK,CACR,KAAK,OAAO;EACZ,YAAY,EAAE;EACd,WAAW,EAAE;EACb,YAAY,EAAE;EACf,EAAE;UACM,UAAU,WAAW;EAC9B,MAAM,SAAS,QAAQ,eAAe;AACtC,cACE,GACG,QACC,uHACD,CACA,IAAI,OAAO,CACd,KAAK,OAAO;GACZ,YAAY,EAAE;GACd,WAAW,EAAE;GACb,YAAY,EAAE;GACf,EAAE;QACE;EAEL,MAAM,SAAS,cADF,QAAQ,cAAc,MACA;AACnC,cACE,GACG,QACC,sKACD,CACA,IAAI,OAAO,CACd,KAAK,OAAO;GACZ,YAAY,EAAE;GACd,WAAW,EAAE;GACb,YAAY,EAAE;GACf,EAAE;;CAKP,IAAI,UAAoB,EAAE;AAC1B,KAAI,QAAQ,SAAS,UAAU,CAC7B,KAAI,UAAU,OACZ,WACE,GACG,QAAQ,0DAA0D,CAClE,KAAK,CACR,KAAK,MAAM,EAAE,WAAW;UACjB,UAAU,WAAW;EAC9B,MAAM,SAAS,QAAQ,eAAe;AACtC,YACE,GACG,QACC,uFACD,CACA,IAAI,OAAO,CACd,KAAK,MAAM,EAAE,WAAW;QACrB;EAEL,MAAM,SAAS,cADF,QAAQ,cAAc,MACA;AACnC,YACE,GACG,QACC,8IACD,CACA,IAAI,OAAO,CACd,KAAK,MAAM,EAAE,WAAW;;CAK9B,IAAI,uBAAuB;AAC3B,KAAI,QAAQ,SAAS,eAAe,EAAE;EACpC,IAAI;EACJ,IAAI;AAEJ,MAAI,UAAU,QAAQ;AACpB,cACE,GAAG,QAAQ,qCAAqC,CAAC,KAAK,CACtD,KAAK,MAAM,EAAE,WAAW;AAE1B,cACE,GACG,QACC,8GACD,CACA,KAAK;aAED,UAAU,WAAW;GAC9B,MAAM,SAAS,QAAQ,eAAe;AACtC,cACE,GACG,QAAQ,oEAAoE,CAC5E,IAAI,OAAO,CACd,KAAK,MAAM,EAAE,WAAW;AAE1B,cACE,GACG,QACC,4IACD,CACA,IAAI,OAAO;SAEX;GAEL,MAAM,SAAS,cADF,QAAQ,cAAc,MACA;AACnC,cACE,GACG,QAAQ,0DAA0D,CAClE,IAAI,
OAAO,CACd,KAAK,MAAM,EAAE,WAAW;AAE1B,cACE,GACG,QACC,2LACD,CACA,IAAI,OAAO;;AAIlB,yBAAuB,gBAAgB,UAAU,SAAS;;CAI5D,IAAI,kBAA4B,EAAE;AAClC,KAAI,QAAQ,SAAS,mBAAmB,CACtC,KAAI,UAAU,OACZ,mBACE,GACG,QACC,gFACD,CACA,KAAK,CACR,KAAK,MAAM,EAAE,WAAW;UACjB,UAAU,WAAW;EAC9B,MAAM,SAAS,QAAQ,eAAe;AACtC,oBACE,GACG,QACC,6GACD,CACA,IAAI,OAAO,CACd,KAAK,MAAM,EAAE,WAAW;QACrB;EAEL,MAAM,SAAS,cADF,QAAQ,cAAc,MACA;AACnC,oBACE,GACG,QACC,uKACD,CACA,IAAI,OAAO,CACd,KAAK,MAAM,EAAE,WAAW;;CAK9B,MAAM,YAAY,aAAa,IAAI,UAAU,SAAS,aAAa;CACnE,MAAM,cAAc,aAAa,IAAI,QAAQ,SAAS,aAAa;CACnE,MAAM,eAAe,aAAa,IAAI,gBAAgB,SAAS,aAAa;CAC5E,MAAM,cAAc,KAAK,MACvB,OAAO,IAAI,cAAc,IAAI,cAAc,OAAQ,IAAI,eAAe,IACvE;AAED,QAAO;EACL;EACA;EACA;EACA;EACA;EACA;EACA;EACA;EACD;;;;;ACpUH,MAAM,aAAa;AACnB,MAAM,kBAAkB;AACxB,MAAM,aAAa;AACnB,MAAM,kBAAkB;AASxB,SAAS,0BAA0B,OAA8B;CAC/D,MAAM,uBAAO,IAAI,KAAa;AAC9B,MAAK,MAAM,QAAQ,OAAO;EAExB,MAAM,QAAQ,KAAK,MAAM,wCAAwC;AACjE,MAAI,CAAC,MAAO;EAEZ,MAAM,QADQ,MAAM,GACA,MAAM,KAAK;AAC/B,OAAK,MAAM,QAAQ,OAAO;GACxB,MAAM,WAAW,KAAK,MAAM,qBAAqB;AACjD,OAAI,UAAU;IACZ,MAAM,MAAM,SAAS,GAAG,MAAM,CAAC,aAAa;AAC5C,QAAI,IAAK,MAAK,IAAI,IAAI;;;;AAI5B,QAAO;;AAGT,SAAS,oBACP,IACA,WACA,MACqB;CACrB,MAAM,OAAO,GACV,QACC;oEAED,CACA,IAAI,WAAW,KAAK;AAEvB,KAAI,KAAK,WAAW,EAAG,QAAO;CAE9B,MAAM,QAAQ,qBAAqB,KAAK,GAAG,UAAU;CACrD,MAAM,MAAM,IAAI,aAAa,MAAM,OAAO;AAC1C,MAAK,MAAM,OAAO,MAAM;EACtB,MAAM,MAAM,qBAAqB,IAAI,UAAU;AAC/C,OAAK,IAAI,IAAI,GAAG,IAAI,IAAI,QAAQ,IAC9B,KAAI,MAAM,IAAI;;CAGlB,MAAM,MAAM,IAAI,aAAa,IAAI,OAAO;AACxC,MAAK,IAAI,IAAI,GAAG,IAAI,IAAI,QAAQ,IAC9B,KAAI,KAAK,IAAI,KAAK,KAAK;AAEzB,QAAO;;AAGT,SAAS,qBACP,IACA,WAC2B;CAC3B,MAAM,OAAO,GACV,QACC;;;gBAID,CACA,IAAI,WAAW,WAAW;CAE7B,MAAM,yBAAS,IAAI,KAAmD;AACtE,MAAK,MAAM,OAAO,MAAM;EACtB,MAAM,MAAM,qBAAqB,IAAI,UAAU;EAC/C,MAAM,QAAQ,OAAO,IAAI,IAAI,KAAK;AAClC,MAAI,CAAC,MACH,QAAO,IAAI,IAAI,MAAM;GAAE,KAAK,IAAI,aAAa,IAAI;GAAE,OAAO;GAAG,CAAC;OACzD;AACL,QAAK,IAAI,IAAI,GAAG,IAAI,IAAI,QAAQ,IAC9B,OAAM,IAAI,MAAM,IAAI;AAEtB,SAAM;;;CAIV,MAAM,yBAAS,IAAI,KAA2B;AAC9C,MAAK,MAAM,CAAC,MAAM,EAAE,KAAK,YAAY,QAAQ;EAC3C,MAAM,MAAM,
IAAI,aAAa,IAAI,OAAO;AACxC,OAAK,IAAI,IAAI,GAAG,IAAI,IAAI,QAAQ,IAC9B,KAAI,KAAK,IAAI,KAAK;AAEpB,SAAO,IAAI,MAAM,IAAI;;AAEvB,QAAO;;AAGT,SAAS,YAAY,IAAc,WAAmB,MAA2B;AAS/E,QAAO,0BARM,GACV,QACC;;;gBAID,CACA,IAAI,WAAW,KAAK,CACe,KAAK,MAAM,EAAE,KAAK,CAAC;;AAG3D,SAAS,kBAAkB,GAAgB,GAAwB;AACjE,KAAI,EAAE,SAAS,KAAK,EAAE,SAAS,EAAG,QAAO;CACzC,IAAI,eAAe;AACnB,MAAK,MAAM,OAAO,EAChB,KAAI,EAAE,IAAI,IAAI,CAAE;CAElB,MAAM,QAAQ,EAAE,OAAO,EAAE,OAAO;AAChC,QAAO,UAAU,IAAI,IAAI,eAAe;;AAG1C,SAAS,YACP,eACA,UACA,eACA,eACQ;CACR,MAAM,UAAmD;EACvD;GAAE,OAAO,yBAAyB,cAAc,QAAQ,EAAE,CAAC;GAAI,OAAO,gBAAgB;GAAiB;EACvG;GAAE,OAAO,gBAAgB,SAAS,QAAQ,EAAE,CAAC;GAAY,OAAO,WAAW;GAAY;EACvF;GAAE,OAAO,aAAa,cAAc,oBAAoB,kBAAkB,IAAI,MAAM;GAAM,OAAO,gBAAgB;GAAiB;EACnI;AACD,SAAQ,MAAM,GAAG,MAAM,EAAE,QAAQ,EAAE,MAAM;AACzC,QAAO,QAAQ,GAAG;;AAGpB,SAAS,kBAAkB,WAA2B;CACpD,MAAM,OAAO,SAAS,UAAU;AAEhC,QAAO,KADM,KAAK,SAAS,MAAM,GAAG,KAAK,MAAM,GAAG,GAAG,GAAG,KACvC;;;;;;AAOnB,eAAsB,cACpB,IACA,MACuB;CACvB,MAAM,QAAQ,KAAK,SAAS;CAC5B,MAAM,gBAAgB,KAAK,iBAAiB;CAG5C,MAAM,eAAe,GAClB,QACC;0DAED,CACA,IAAI,KAAK,SAAS;CACrB,MAAM,cAAc,IAAI,IAAI,aAAa,KAAK,MAAM,EAAE,YAAY,CAAC;CAGnE,MAAM,kBAAkB,oBAAoB,IAAI,KAAK,gBAAgB,KAAK,SAAS;CAGnF,MAAM,gBAAgB,qBAAqB,IAAI,KAAK,eAAe;AACnE,eAAc,OAAO,KAAK,SAAS;CAGnC,MAAM,aAAa,YAAY,IAAI,KAAK,gBAAgB,KAAK,SAAS;CAGtE,MAAM,mBAAmB,GACtB,QACC;;;;sCAKD,CACA,IAAI,KAAK,SAAS;CAGrB,MAAM,mCAAmB,IAAI,KAAqB;AAClD,MAAK,MAAM,EAAE,UAAU,kBAAkB;AACvC,MAAI,SAAS,KAAK,SAAU;AAC5B,mBAAiB,IAAI,OAAO,iBAAiB,IAAI,KAAK,IAAI,KAAK,EAAE;;CAEnE,MAAM,iBAAiB,KAAK,IAAI,GAAG,GAAG,iBAAiB,QAAQ,CAAC;CAGhE,MAAM,WAAW,GACd,QAAQ,4CAA4C,CACpD,KAAK;CAER,MAAM,cAA4B,EAAE;AAEpC,MAAK,MAAM,EAAE,YAAY,WAAW,UAAU;AAC5C,MAAI,eAAe,KAAK,SAAU;AAClC,MAAI,iBAAiB,YAAY,IAAI,WAAW,CAAE;EAGlD,IAAI,gBAAgB;AACpB,MAAI,iBAAiB;GACnB,MAAM,qBAAqB,cAAc,IAAI,WAAW;AACxD,OAAI,mBACF,iBAAgB,KAAK,IAAI,GAAG,iBAAiB,iBAAiB,mBAAmB,CAAC;;EAKtF,IAAI,WAAW;AACf,MAAI,cAAc,IAAI,WAAW,CAE/B,YAAW,kBAAkB,YADP,YAAY,IAAI,KAAK,gBAAgB,WAAW,CACf;EAIzD,MAAM,cAAc,iBAAiB,IAAI,WAAW,IAAI;EACxD,MAAM,gBAAgB,cAAc;EAEpC,MAAM,QACJ,kBAAk
B,gBAClB,aAAa,WACb,kBAAkB;AAGpB,MAAI,SAAS,EAAG;EAEhB,MAAM,SAAS,YAAY,eAAe,UAAU,eAAe,YAAY;AAE/E,cAAY,KAAK;GACf,MAAM;GACN;GACA;GACA;GACA;GACA;GACA;GACA,mBAAmB,kBAAkB,WAAW;GACjD,CAAC;;AAGJ,aAAY,MAAM,GAAG,MAAM,EAAE,QAAQ,EAAE,MAAM;AAC7C,QAAO,YAAY,MAAM,GAAG,MAAM"}
package/templates/README.md
DELETED
@@ -1,181 +0,0 @@
-# PAI Configuration Templates
-
-This directory contains example configuration files for personalizing your PAI Knowledge OS setup.
-
-## Files
-
-### `agent-prefs.example.md`
-
-**Template for**: `~/.config/pai/agent-prefs.md`
-
-Contains your personal preferences for PAI agents:
-- Identity and role information
-- Directory search restrictions
-- Project-code directory mappings
-- Notification preferences (WhatsApp, ntfy.sh)
-- Voice configuration for TTS
-- Git commit rules
-- Code quality standards
-- Language and framework preferences
-- Workflow customization options
-
-**Size**: 362 lines, comprehensive with examples
-
-**Setup**:
-```bash
-cp agent-prefs.example.md ~/.config/pai/agent-prefs.md
-# Then customize with your settings
-```
-
-### `voices.example.json`
-
-**Template for**: `~/.config/pai/voices.json`
-
-Configures voice output for different agents and channels:
-- TTS system selection (Kokoro local or ElevenLabs cloud)
-- Voice assignments per agent type
-- WhatsApp voice note configuration
-- Local speaker output settings
-- Named personas for easy reference
-- Multiple profiles (professional, casual, focus)
-- Complete examples for different setups
-- Troubleshooting guide
-
-**Size**: 251 lines, valid JSON with comments
-
-**Setup**:
-```bash
-cp voices.example.json ~/.config/pai/voices.json
-# Then customize voice selections
-```
-
-## Quick Start
-
-```bash
-# Create config directory if needed
-mkdir -p ~/.config/pai
-
-# Copy templates
-cp agent-prefs.example.md ~/.config/pai/agent-prefs.md
-cp voices.example.json ~/.config/pai/voices.json
-
-# Edit with your preferences
-nano ~/.config/pai/agent-prefs.md
-nano ~/.config/pai/voices.json
-```
-
-## Key Features
-
-### Agent Preferences
-
-- **Directory Restrictions**: Prevent expensive searches in large directories
-- **Project Mappings**: Automate end-session commits to the right directories
-- **Workflow Customization**: Tailor agent behavior to your development style
-- **Voice and Notifications**: Configure how agents communicate with you
-
-### Voice Configuration
-
-- **Kokoro TTS**: Free, local, no API key required (~160MB download once)
-- **ElevenLabs**: Premium quality voices, requires API key
-- **Agent-Specific Voices**: Different voices for different agent types
-- **Profiles**: Switch between professional/casual/focus modes
-
-## Privacy and Version Control
-
-These files contain personal preferences and should **NOT** be committed:
-
-```bash
-# Add to your .gitignore
-echo "~/.config/pai/" >> ~/.gitignore
-```
-
-Keep these files private:
-```bash
-chmod 600 ~/.config/pai/agent-prefs.md
-chmod 600 ~/.config/pai/voices.json
-```
-
-## Integration with PAI
-
-The PAI daemon automatically:
-- Reads `~/.config/pai/agent-prefs.md` on startup
-- Loads voice configuration from `~/.config/pai/voices.json`
-- Uses these settings for all agent operations
-- Falls back to sensible defaults if files are missing
-
-**No restart needed**: Changes to these files take effect immediately on next operation.
-
-## Customization Examples
-
-### Minimal Setup (Just Defaults)
-
-```bash
-# Copy templates
-cp agent-prefs.example.md ~/.config/pai/agent-prefs.md
-cp voices.example.json ~/.config/pai/voices.json
-
-# Use defaults as-is, customize later as needed
-```
-
-### Engineer-Heavy Workflow
-
-Focus on technical tools and code quality:
-- Set default language to TypeScript
-- Configure strict testing requirements
-- Map multiple code directories
-- Use technical-sounding voices
-
-### Research-Heavy Workflow
-
-Focus on information gathering:
-- Configure ntfy.sh notifications for task completion
-- Set longer timeouts for research agents
-- Map knowledge base directories
-- Use calm, analytical voices
-
-## Environment Variables
-
-Some settings can also be configured via environment variables:
-
-```bash
-export ELEVENLABS_API_KEY="sk-..."   # ElevenLabs API key
-export NTFY_TOPIC="my-private-topic" # ntfy.sh topic
-export PAI_CONFIG_DIR="~/.config/pai" # Custom config directory
-```
-
-## Troubleshooting
-
-**Q: My preferences aren't being used**
-- Check file exists: `ls ~/.config/pai/agent-prefs.md`
-- Check JSON is valid: `python3 -m json.tool ~/.config/pai/voices.json`
-- Check permissions: `ls -l ~/.config/pai/`
-
-**Q: Voice not working**
-- Verify ElevenLabs API key if using ElevenLabs
-- Check internet connection if using cloud TTS
-- Try local Kokoro TTS as fallback
-- See troubleshooting section in `voices.example.json`
-
-**Q: Configuration not loading**
-- Daemon reads from `~/.config/pai/` (not `~/.claude/`)
-- Create directory if missing: `mkdir -p ~/.config/pai`
-- Copy templates to this location
-
-## Further Customization
-
-Once you've customized the basic templates, you can:
-
-1. **Create project-specific overrides** in project `.claude.json` files
-2. **Add team standards** to agent-prefs.md under "Team Standards"
-3. **Define CI/CD rules** in git commit section
-4. **Experiment with voice profiles** for different working modes
-
-## Related Documentation
-
-- PAI Knowledge OS: `~/projects/PAI/README.md`
-- Configuration reference: `~/.claude.json` (Claude Code configuration)
-- PAI daemon docs: `~/projects/PAI/src/daemon/`
-
----
-
-**Template Version**: 1.0