bluera-knowledge 0.9.32 → 0.9.34

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (196)
  1. package/.claude/hooks/post-edit-check.sh +5 -3
  2. package/.claude/skills/atomic-commits/SKILL.md +3 -1
  3. package/.husky/pre-commit +3 -2
  4. package/.prettierrc +9 -0
  5. package/.versionrc.json +1 -1
  6. package/CHANGELOG.md +33 -0
  7. package/CLAUDE.md +6 -0
  8. package/README.md +25 -13
  9. package/bun.lock +277 -33
  10. package/dist/{chunk-L2YVNC63.js → chunk-6FHWC36B.js} +9 -1
  11. package/dist/chunk-6FHWC36B.js.map +1 -0
  12. package/dist/{chunk-RST4XGRL.js → chunk-DC7CGSGT.js} +288 -241
  13. package/dist/chunk-DC7CGSGT.js.map +1 -0
  14. package/dist/{chunk-6PBP5DVD.js → chunk-WFNPNAAP.js} +3212 -3054
  15. package/dist/chunk-WFNPNAAP.js.map +1 -0
  16. package/dist/{chunk-WT2DAEO7.js → chunk-Z2KKVH45.js} +548 -482
  17. package/dist/chunk-Z2KKVH45.js.map +1 -0
  18. package/dist/index.js +871 -758
  19. package/dist/index.js.map +1 -1
  20. package/dist/mcp/server.js +3 -3
  21. package/dist/watch.service-BJV3TI3F.js +7 -0
  22. package/dist/workers/background-worker-cli.js +46 -45
  23. package/dist/workers/background-worker-cli.js.map +1 -1
  24. package/eslint.config.js +43 -1
  25. package/package.json +18 -11
  26. package/plugin.json +8 -0
  27. package/python/requirements.txt +1 -1
  28. package/src/analysis/ast-parser.test.ts +12 -11
  29. package/src/analysis/ast-parser.ts +28 -22
  30. package/src/analysis/code-graph.test.ts +52 -62
  31. package/src/analysis/code-graph.ts +9 -13
  32. package/src/analysis/dependency-usage-analyzer.test.ts +91 -271
  33. package/src/analysis/dependency-usage-analyzer.ts +52 -24
  34. package/src/analysis/go-ast-parser.test.ts +22 -22
  35. package/src/analysis/go-ast-parser.ts +18 -25
  36. package/src/analysis/parser-factory.test.ts +9 -9
  37. package/src/analysis/parser-factory.ts +3 -3
  38. package/src/analysis/python-ast-parser.test.ts +27 -27
  39. package/src/analysis/python-ast-parser.ts +2 -2
  40. package/src/analysis/repo-url-resolver.test.ts +82 -82
  41. package/src/analysis/rust-ast-parser.test.ts +19 -19
  42. package/src/analysis/rust-ast-parser.ts +17 -27
  43. package/src/analysis/tree-sitter-parser.test.ts +3 -3
  44. package/src/analysis/tree-sitter-parser.ts +10 -16
  45. package/src/cli/commands/crawl.test.ts +40 -24
  46. package/src/cli/commands/crawl.ts +186 -166
  47. package/src/cli/commands/index-cmd.test.ts +90 -90
  48. package/src/cli/commands/index-cmd.ts +52 -36
  49. package/src/cli/commands/mcp.test.ts +6 -6
  50. package/src/cli/commands/mcp.ts +2 -2
  51. package/src/cli/commands/plugin-api.test.ts +16 -18
  52. package/src/cli/commands/plugin-api.ts +9 -6
  53. package/src/cli/commands/search.test.ts +16 -7
  54. package/src/cli/commands/search.ts +124 -87
  55. package/src/cli/commands/serve.test.ts +67 -25
  56. package/src/cli/commands/serve.ts +18 -3
  57. package/src/cli/commands/setup.test.ts +176 -101
  58. package/src/cli/commands/setup.ts +140 -117
  59. package/src/cli/commands/store.test.ts +82 -53
  60. package/src/cli/commands/store.ts +56 -37
  61. package/src/cli/program.ts +2 -2
  62. package/src/crawl/article-converter.test.ts +4 -1
  63. package/src/crawl/article-converter.ts +46 -31
  64. package/src/crawl/bridge.test.ts +240 -132
  65. package/src/crawl/bridge.ts +87 -30
  66. package/src/crawl/claude-client.test.ts +124 -56
  67. package/src/crawl/claude-client.ts +7 -15
  68. package/src/crawl/intelligent-crawler.test.ts +65 -22
  69. package/src/crawl/intelligent-crawler.ts +86 -53
  70. package/src/crawl/markdown-utils.ts +1 -4
  71. package/src/db/embeddings.ts +4 -6
  72. package/src/db/lance.test.ts +4 -4
  73. package/src/db/lance.ts +16 -12
  74. package/src/index.ts +26 -17
  75. package/src/logging/index.ts +1 -5
  76. package/src/logging/logger.ts +3 -5
  77. package/src/logging/payload.test.ts +1 -1
  78. package/src/logging/payload.ts +3 -5
  79. package/src/mcp/commands/index.ts +2 -2
  80. package/src/mcp/commands/job.commands.ts +12 -18
  81. package/src/mcp/commands/meta.commands.ts +13 -13
  82. package/src/mcp/commands/registry.ts +5 -8
  83. package/src/mcp/commands/store.commands.ts +19 -19
  84. package/src/mcp/handlers/execute.handler.test.ts +10 -10
  85. package/src/mcp/handlers/execute.handler.ts +4 -5
  86. package/src/mcp/handlers/index.ts +10 -14
  87. package/src/mcp/handlers/job.handler.test.ts +10 -10
  88. package/src/mcp/handlers/job.handler.ts +22 -25
  89. package/src/mcp/handlers/search.handler.test.ts +36 -65
  90. package/src/mcp/handlers/search.handler.ts +135 -104
  91. package/src/mcp/handlers/store.handler.test.ts +41 -52
  92. package/src/mcp/handlers/store.handler.ts +108 -88
  93. package/src/mcp/schemas/index.test.ts +73 -68
  94. package/src/mcp/schemas/index.ts +18 -12
  95. package/src/mcp/server.test.ts +1 -1
  96. package/src/mcp/server.ts +59 -46
  97. package/src/plugin/commands.test.ts +230 -95
  98. package/src/plugin/commands.ts +24 -25
  99. package/src/plugin/dependency-analyzer.test.ts +52 -52
  100. package/src/plugin/dependency-analyzer.ts +85 -22
  101. package/src/plugin/git-clone.test.ts +24 -13
  102. package/src/plugin/git-clone.ts +3 -7
  103. package/src/server/app.test.ts +109 -109
  104. package/src/server/app.ts +32 -23
  105. package/src/server/index.test.ts +64 -66
  106. package/src/services/chunking.service.test.ts +32 -32
  107. package/src/services/chunking.service.ts +16 -9
  108. package/src/services/code-graph.service.test.ts +30 -36
  109. package/src/services/code-graph.service.ts +24 -10
  110. package/src/services/code-unit.service.test.ts +55 -11
  111. package/src/services/code-unit.service.ts +85 -11
  112. package/src/services/config.service.test.ts +37 -18
  113. package/src/services/config.service.ts +30 -7
  114. package/src/services/index.service.test.ts +49 -18
  115. package/src/services/index.service.ts +98 -48
  116. package/src/services/index.ts +6 -9
  117. package/src/services/job.service.test.ts +22 -22
  118. package/src/services/job.service.ts +18 -18
  119. package/src/services/project-root.service.test.ts +1 -3
  120. package/src/services/search.service.test.ts +248 -120
  121. package/src/services/search.service.ts +286 -156
  122. package/src/services/services.test.ts +1 -1
  123. package/src/services/snippet.service.test.ts +14 -6
  124. package/src/services/snippet.service.ts +7 -5
  125. package/src/services/store.service.test.ts +68 -29
  126. package/src/services/store.service.ts +41 -12
  127. package/src/services/watch.service.test.ts +34 -14
  128. package/src/services/watch.service.ts +11 -1
  129. package/src/types/brands.test.ts +3 -1
  130. package/src/types/index.ts +2 -13
  131. package/src/types/search.ts +10 -8
  132. package/src/utils/type-guards.test.ts +20 -15
  133. package/src/utils/type-guards.ts +1 -1
  134. package/src/workers/background-worker-cli.ts +2 -2
  135. package/src/workers/background-worker.test.ts +54 -40
  136. package/src/workers/background-worker.ts +76 -60
  137. package/src/workers/spawn-worker.test.ts +22 -10
  138. package/src/workers/spawn-worker.ts +6 -6
  139. package/tests/analysis/ast-parser.test.ts +3 -3
  140. package/tests/analysis/code-graph.test.ts +5 -5
  141. package/tests/fixtures/code-snippets/api/error-handling.ts +4 -15
  142. package/tests/fixtures/code-snippets/api/rest-controller.ts +3 -9
  143. package/tests/fixtures/code-snippets/auth/jwt-auth.ts +5 -21
  144. package/tests/fixtures/code-snippets/auth/oauth-flow.ts +4 -4
  145. package/tests/fixtures/code-snippets/database/repository-pattern.ts +11 -3
  146. package/tests/fixtures/corpus/oss-repos/hono/src/adapter/aws-lambda/handler.ts +2 -2
  147. package/tests/fixtures/corpus/oss-repos/hono/src/adapter/cloudflare-pages/handler.ts +1 -1
  148. package/tests/fixtures/corpus/oss-repos/hono/src/adapter/cloudflare-workers/serve-static.ts +2 -2
  149. package/tests/fixtures/corpus/oss-repos/hono/src/client/client.ts +2 -2
  150. package/tests/fixtures/corpus/oss-repos/hono/src/client/types.ts +22 -20
  151. package/tests/fixtures/corpus/oss-repos/hono/src/context.ts +13 -10
  152. package/tests/fixtures/corpus/oss-repos/hono/src/helper/accepts/accepts.ts +10 -7
  153. package/tests/fixtures/corpus/oss-repos/hono/src/helper/adapter/index.ts +2 -2
  154. package/tests/fixtures/corpus/oss-repos/hono/src/helper/css/index.ts +1 -1
  155. package/tests/fixtures/corpus/oss-repos/hono/src/helper/factory/index.ts +16 -16
  156. package/tests/fixtures/corpus/oss-repos/hono/src/helper/ssg/ssg.ts +2 -2
  157. package/tests/fixtures/corpus/oss-repos/hono/src/hono-base.ts +3 -3
  158. package/tests/fixtures/corpus/oss-repos/hono/src/hono.ts +1 -1
  159. package/tests/fixtures/corpus/oss-repos/hono/src/jsx/dom/css.ts +2 -2
  160. package/tests/fixtures/corpus/oss-repos/hono/src/jsx/dom/intrinsic-element/components.ts +1 -1
  161. package/tests/fixtures/corpus/oss-repos/hono/src/jsx/dom/render.ts +7 -7
  162. package/tests/fixtures/corpus/oss-repos/hono/src/jsx/hooks/index.ts +3 -3
  163. package/tests/fixtures/corpus/oss-repos/hono/src/jsx/intrinsic-element/components.ts +1 -1
  164. package/tests/fixtures/corpus/oss-repos/hono/src/jsx/utils.ts +6 -6
  165. package/tests/fixtures/corpus/oss-repos/hono/src/middleware/jsx-renderer/index.ts +3 -3
  166. package/tests/fixtures/corpus/oss-repos/hono/src/middleware/serve-static/index.ts +1 -1
  167. package/tests/fixtures/corpus/oss-repos/hono/src/preset/quick.ts +1 -1
  168. package/tests/fixtures/corpus/oss-repos/hono/src/preset/tiny.ts +1 -1
  169. package/tests/fixtures/corpus/oss-repos/hono/src/router/pattern-router/router.ts +2 -2
  170. package/tests/fixtures/corpus/oss-repos/hono/src/router/reg-exp-router/node.ts +4 -4
  171. package/tests/fixtures/corpus/oss-repos/hono/src/router/reg-exp-router/router.ts +1 -1
  172. package/tests/fixtures/corpus/oss-repos/hono/src/router/trie-router/node.ts +1 -1
  173. package/tests/fixtures/corpus/oss-repos/hono/src/types.ts +166 -169
  174. package/tests/fixtures/corpus/oss-repos/hono/src/utils/body.ts +8 -8
  175. package/tests/fixtures/corpus/oss-repos/hono/src/utils/color.ts +3 -3
  176. package/tests/fixtures/corpus/oss-repos/hono/src/utils/cookie.ts +2 -2
  177. package/tests/fixtures/corpus/oss-repos/hono/src/utils/encode.ts +2 -2
  178. package/tests/fixtures/corpus/oss-repos/hono/src/utils/types.ts +30 -33
  179. package/tests/fixtures/corpus/oss-repos/hono/src/validator/validator.ts +2 -2
  180. package/tests/fixtures/test-server.ts +3 -2
  181. package/tests/helpers/performance-metrics.ts +8 -25
  182. package/tests/helpers/search-relevance.ts +14 -69
  183. package/tests/integration/cli-consistency.test.ts +5 -4
  184. package/tests/integration/python-bridge.test.ts +13 -3
  185. package/tests/mcp/server.test.ts +1 -1
  186. package/tests/services/code-unit.service.test.ts +48 -0
  187. package/tests/services/job.service.test.ts +124 -0
  188. package/tests/services/search.progressive-context.test.ts +2 -2
  189. package/.claude-plugin/plugin.json +0 -13
  190. package/dist/chunk-6PBP5DVD.js.map +0 -1
  191. package/dist/chunk-L2YVNC63.js.map +0 -1
  192. package/dist/chunk-RST4XGRL.js.map +0 -1
  193. package/dist/chunk-WT2DAEO7.js.map +0 -1
  194. package/dist/watch.service-YAIKKDCF.js +0 -7
  195. package/skills/atomic-commits/SKILL.md +0 -77
  196. /package/dist/{watch.service-YAIKKDCF.js.map → watch.service-BJV3TI3F.js.map} +0 -0
@@ -1 +0,0 @@
- {"version":3,"sources":["../src/services/watch.service.ts"],"sourcesContent":["import { watch, type FSWatcher } from 'chokidar';\nimport type { FileStore, RepoStore } from '../types/store.js';\nimport type { IndexService } from './index.service.js';\nimport type { LanceStore } from '../db/lance.js';\n\nexport class WatchService {\n private readonly watchers: Map<string, FSWatcher> = new Map();\n private readonly indexService: IndexService;\n private readonly lanceStore: LanceStore;\n\n constructor(indexService: IndexService, lanceStore: LanceStore) {\n this.indexService = indexService;\n this.lanceStore = lanceStore;\n }\n\n async watch(\n store: FileStore | RepoStore,\n debounceMs = 1000,\n onReindex?: () => void\n ): Promise<void> {\n if (this.watchers.has(store.id)) {\n return Promise.resolve(); // Already watching\n }\n\n let timeout: NodeJS.Timeout | null = null;\n\n const watcher = watch(store.path, {\n ignored: /(^|[/\\\\])\\.(git|node_modules|dist|build)/,\n persistent: true,\n ignoreInitial: true,\n });\n\n const reindexHandler = (): void => {\n if (timeout) clearTimeout(timeout);\n timeout = setTimeout(() => {\n void (async (): Promise<void> => {\n try {\n await this.lanceStore.initialize(store.id);\n await this.indexService.indexStore(store);\n onReindex?.();\n } catch (error) {\n console.error('Error during reindexing:', error);\n }\n })();\n }, debounceMs);\n };\n\n watcher.on('all', reindexHandler);\n\n watcher.on('error', (error) => {\n console.error('Watcher error:', error);\n });\n\n this.watchers.set(store.id, watcher);\n return Promise.resolve();\n }\n\n async unwatch(storeId: string): Promise<void> {\n const watcher = this.watchers.get(storeId);\n if (watcher) {\n await watcher.close();\n this.watchers.delete(storeId);\n }\n }\n\n async unwatchAll(): Promise<void> {\n for (const [id] of this.watchers) {\n await this.unwatch(id);\n }\n }\n}\n"],"mappings":";AAAA,SAAS,aAA6B;AAK/B,IAAM,eAAN,MAAmB;AAAA,EACP,WAAmC,oBAAI,IAAI;AAAA,EAC3C;AAAA,EACA;AAAA,EAEjB,YAAY,cAA4B,YAAwB;AAC9D,SAAK,eAAe;AACpB,SAAK,aAAa;AAAA,EACpB;AAAA,EAEA,MAAM,MACJ,OACA,aAAa,KACb,WACe;AACf,QAAI,KAAK,SAAS,IAAI,MAAM,EAAE,GAAG;AAC/B,aAAO,QAAQ,QAAQ;AAAA,IACzB;AAEA,QAAI,UAAiC;AAErC,UAAM,UAAU,MAAM,MAAM,MAAM;AAAA,MAChC,SAAS;AAAA,MACT,YAAY;AAAA,MACZ,eAAe;AAAA,IACjB,CAAC;AAED,UAAM,iBAAiB,MAAY;AACjC,UAAI,QAAS,cAAa,OAAO;AACjC,gBAAU,WAAW,MAAM;AACzB,cAAM,YAA2B;AAC/B,cAAI;AACF,kBAAM,KAAK,WAAW,WAAW,MAAM,EAAE;AACzC,kBAAM,KAAK,aAAa,WAAW,KAAK;AACxC,wBAAY;AAAA,UACd,SAAS,OAAO;AACd,oBAAQ,MAAM,4BAA4B,KAAK;AAAA,UACjD;AAAA,QACF,GAAG;AAAA,MACL,GAAG,UAAU;AAAA,IACf;AAEA,YAAQ,GAAG,OAAO,cAAc;AAEhC,YAAQ,GAAG,SAAS,CAAC,UAAU;AAC7B,cAAQ,MAAM,kBAAkB,KAAK;AAAA,IACvC,CAAC;AAED,SAAK,SAAS,IAAI,MAAM,IAAI,OAAO;AACnC,WAAO,QAAQ,QAAQ;AAAA,EACzB;AAAA,EAEA,MAAM,QAAQ,SAAgC;AAC5C,UAAM,UAAU,KAAK,SAAS,IAAI,OAAO;AACzC,QAAI,SAAS;AACX,YAAM,QAAQ,MAAM;AACpB,WAAK,SAAS,OAAO,OAAO;AAAA,IAC9B;AAAA,EACF;AAAA,EAEA,MAAM,aAA4B;AAChC,eAAW,CAAC,EAAE,KAAK,KAAK,UAAU;AAChC,YAAM,KAAK,QAAQ,EAAE;AAAA,IACvB;AAAA,EACF;AACF;","names":[]}
@@ -1 +0,0 @@
- {"version":3,"sources":["../src/crawl/intelligent-crawler.ts","../src/crawl/claude-client.ts","../src/crawl/article-converter.ts","../src/crawl/markdown-utils.ts"],"sourcesContent":["/**\n * Intelligent web crawler with natural language control\n * Two modes: Intelligent (Claude-driven) and Simple (BFS)\n */\n\nimport { EventEmitter } from 'node:events';\nimport axios from 'axios';\nimport { ClaudeClient, type CrawlStrategy } from './claude-client.js';\nimport { convertHtmlToMarkdown } from './article-converter.js';\nimport { PythonBridge, type CrawledLink } from './bridge.js';\nimport { createLogger, summarizePayload } from '../logging/index.js';\n\nconst logger = createLogger('crawler');\n\nexport interface CrawlOptions {\n crawlInstruction?: string; // Natural language: what to crawl\n extractInstruction?: string; // Natural language: what to extract\n maxPages?: number; // Max pages to crawl (default: 50)\n timeout?: number; // Per-page timeout in ms (default: 30000)\n simple?: boolean; // Force simple BFS mode\n useHeadless?: boolean; // Enable headless browser for JavaScript-rendered sites\n}\n\nexport interface CrawlResult {\n url: string;\n title?: string;\n markdown: string;\n extracted?: string;\n depth?: number;\n}\n\nexport interface CrawlProgress {\n type: 'start' | 'strategy' | 'page' | 'extraction' | 'complete' | 'error';\n pagesVisited: number;\n totalPages: number;\n currentUrl?: string;\n message?: string;\n error?: Error;\n}\n\n/**\n * Intelligent crawler that uses Claude CLI for strategy and extraction\n */\nexport class IntelligentCrawler extends EventEmitter {\n private readonly claudeClient: ClaudeClient;\n private readonly pythonBridge: PythonBridge;\n private readonly visited: Set<string>;\n private stopped: boolean;\n\n constructor() {\n super();\n this.claudeClient = new ClaudeClient();\n this.pythonBridge = new PythonBridge();\n this.visited = new Set();\n this.stopped = false;\n }\n\n /**\n * Crawl a website with intelligent or simple mode\n */\n async *crawl(\n seedUrl: string,\n options: CrawlOptions = {},\n ): AsyncIterable<CrawlResult> {\n const {\n crawlInstruction,\n extractInstruction,\n maxPages = 50,\n simple = false,\n } = options;\n\n this.visited.clear();\n this.stopped = false;\n\n logger.info({\n seedUrl,\n maxPages,\n mode: simple ? 'simple' : (crawlInstruction !== undefined && crawlInstruction !== '' ? 'intelligent' : 'simple'),\n hasExtractInstruction: extractInstruction !== undefined,\n }, 'Starting crawl');\n\n const startProgress: CrawlProgress = {\n type: 'start',\n pagesVisited: 0,\n totalPages: maxPages,\n };\n this.emit('progress', startProgress);\n\n // Determine mode: intelligent (with crawl instruction) or simple (BFS)\n const useIntelligentMode = !simple && crawlInstruction !== undefined && crawlInstruction !== '';\n\n if (useIntelligentMode) {\n // TypeScript knows crawlInstruction is defined here due to useIntelligentMode check\n yield* this.crawlIntelligent(seedUrl, crawlInstruction, extractInstruction, maxPages, options.useHeadless ?? false);\n } else {\n yield* this.crawlSimple(seedUrl, extractInstruction, maxPages, options.useHeadless ?? 
false);\n }\n\n logger.info({\n seedUrl,\n pagesVisited: this.visited.size,\n }, 'Crawl complete');\n\n const completeProgress: CrawlProgress = {\n type: 'complete',\n pagesVisited: this.visited.size,\n totalPages: this.visited.size,\n };\n this.emit('progress', completeProgress);\n }\n\n /**\n * Intelligent mode: Use Claude to determine which URLs to crawl\n */\n private async *crawlIntelligent(\n seedUrl: string,\n crawlInstruction: string,\n extractInstruction: string | undefined,\n maxPages: number,\n useHeadless: boolean = false,\n ): AsyncIterable<CrawlResult> {\n // Check if Claude CLI is available before attempting intelligent mode\n if (!ClaudeClient.isAvailable()) {\n const fallbackProgress: CrawlProgress = {\n type: 'error',\n pagesVisited: 0,\n totalPages: maxPages,\n message: 'Claude CLI not found, using simple crawl mode (install Claude Code for intelligent crawling)',\n error: new Error('Claude CLI not available'),\n };\n this.emit('progress', fallbackProgress);\n yield* this.crawlSimple(seedUrl, extractInstruction, maxPages, useHeadless);\n return;\n }\n\n let strategy: CrawlStrategy;\n\n try {\n // Step 1: Fetch seed page HTML\n const strategyStartProgress: CrawlProgress = {\n type: 'strategy',\n pagesVisited: 0,\n totalPages: maxPages,\n currentUrl: seedUrl,\n message: 'Analyzing page structure with Claude...',\n };\n this.emit('progress', strategyStartProgress);\n\n const seedHtml = await this.fetchHtml(seedUrl, useHeadless);\n\n // Step 2: Ask Claude which URLs to crawl\n strategy = await this.claudeClient.determineCrawlUrls(seedHtml, crawlInstruction);\n\n const strategyCompleteProgress: CrawlProgress = {\n type: 'strategy',\n pagesVisited: 0,\n totalPages: maxPages,\n message: `Claude identified ${String(strategy.urls.length)} URLs to crawl: ${strategy.reasoning}`,\n };\n this.emit('progress', strategyCompleteProgress);\n } catch (error) {\n // Fallback to simple mode if Claude fails\n const errorProgress: CrawlProgress = {\n type: 'error',\n pagesVisited: 0,\n totalPages: maxPages,\n message: 'Claude crawl strategy failed, falling back to simple mode',\n error: error instanceof Error ? error : new Error(String(error)),\n };\n this.emit('progress', errorProgress);\n\n yield* this.crawlSimple(seedUrl, extractInstruction, maxPages);\n return;\n }\n\n // Step 3: Crawl each URL from Claude's strategy\n let pagesVisited = 0;\n\n for (const url of strategy.urls) {\n if (this.stopped || pagesVisited >= maxPages) break;\n if (this.visited.has(url)) continue;\n\n try {\n const result = await this.crawlSinglePage(url, extractInstruction, pagesVisited, useHeadless);\n pagesVisited++;\n yield result;\n } catch (error) {\n const pageErrorProgress: CrawlProgress = {\n type: 'error',\n pagesVisited,\n totalPages: maxPages,\n currentUrl: url,\n error: error instanceof Error ? 
error : new Error(String(error)),\n };\n this.emit('progress', pageErrorProgress);\n }\n }\n }\n\n /**\n * Simple mode: BFS crawling with depth limit\n */\n private async *crawlSimple(\n seedUrl: string,\n extractInstruction: string | undefined,\n maxPages: number,\n useHeadless: boolean = false,\n ): AsyncIterable<CrawlResult> {\n const queue: Array<{ url: string; depth: number }> = [{ url: seedUrl, depth: 0 }];\n const maxDepth = 2; // Default depth limit for simple mode\n let pagesVisited = 0;\n\n while (queue.length > 0 && pagesVisited < maxPages && !this.stopped) {\n const current = queue.shift();\n\n if (!current || this.visited.has(current.url) || current.depth > maxDepth) {\n continue;\n }\n\n try {\n const result = await this.crawlSinglePage(\n current.url,\n extractInstruction,\n pagesVisited,\n useHeadless,\n );\n result.depth = current.depth;\n pagesVisited++;\n\n yield result;\n\n // Add links to queue if we haven't reached max depth\n if (current.depth < maxDepth) {\n try {\n const links = await this.extractLinks(current.url, useHeadless);\n\n if (links.length === 0) {\n logger.debug({ url: current.url }, 'No links found - page may be a leaf node');\n } else {\n logger.debug({ url: current.url, linkCount: links.length }, 'Links extracted from page');\n }\n\n for (const link of links) {\n if (!this.visited.has(link) && this.isSameDomain(seedUrl, link)) {\n queue.push({ url: link, depth: current.depth + 1 });\n }\n }\n } catch (error) {\n // Log link extraction failure but continue crawling other pages\n const errorProgress: CrawlProgress = {\n type: 'error',\n pagesVisited,\n totalPages: maxPages,\n currentUrl: current.url,\n message: `Failed to extract links from ${current.url}`,\n error: error instanceof Error ? error : new Error(String(error)),\n };\n this.emit('progress', errorProgress);\n }\n }\n } catch (error) {\n const simpleErrorProgress: CrawlProgress = {\n type: 'error',\n pagesVisited,\n totalPages: maxPages,\n currentUrl: current.url,\n error: error instanceof Error ? error : new Error(String(error)),\n };\n this.emit('progress', simpleErrorProgress);\n }\n }\n }\n\n /**\n * Crawl a single page: fetch, convert to markdown, optionally extract\n */\n private async crawlSinglePage(\n url: string,\n extractInstruction: string | undefined,\n pagesVisited: number,\n useHeadless: boolean = false,\n ): Promise<CrawlResult> {\n const pageProgress: CrawlProgress = {\n type: 'page',\n pagesVisited,\n totalPages: 0,\n currentUrl: url,\n };\n this.emit('progress', pageProgress);\n\n // Mark as visited\n this.visited.add(url);\n\n // Fetch HTML\n const html = await this.fetchHtml(url, useHeadless);\n\n // Convert to clean markdown using slurp-ai techniques\n const conversion = await convertHtmlToMarkdown(html, url);\n\n if (!conversion.success) {\n logger.error({ url, error: conversion.error }, 'HTML to markdown conversion failed');\n throw new Error(`Failed to convert HTML: ${conversion.error ?? 
'Unknown error'}`);\n }\n\n logger.debug({\n url,\n title: conversion.title,\n markdownLength: conversion.markdown.length,\n }, 'Article converted to markdown');\n\n let extracted: string | undefined;\n\n // Optional: Extract specific information using Claude\n if (extractInstruction !== undefined && extractInstruction !== '') {\n // Skip extraction if Claude CLI isn't available\n if (!ClaudeClient.isAvailable()) {\n const skipProgress: CrawlProgress = {\n type: 'error',\n pagesVisited,\n totalPages: 0,\n currentUrl: url,\n message: 'Skipping extraction (Claude CLI not available), storing raw markdown',\n error: new Error('Claude CLI not available'),\n };\n this.emit('progress', skipProgress);\n } else {\n try {\n const extractionProgress: CrawlProgress = {\n type: 'extraction',\n pagesVisited,\n totalPages: 0,\n currentUrl: url,\n };\n this.emit('progress', extractionProgress);\n\n extracted = await this.claudeClient.extractContent(\n conversion.markdown,\n extractInstruction,\n );\n } catch (error) {\n // If extraction fails, just store raw markdown\n const extractionErrorProgress: CrawlProgress = {\n type: 'error',\n pagesVisited,\n totalPages: 0,\n currentUrl: url,\n message: 'Extraction failed, storing raw markdown',\n error: error instanceof Error ? error : new Error(String(error)),\n };\n this.emit('progress', extractionErrorProgress);\n }\n }\n }\n\n return {\n url,\n ...(conversion.title !== undefined && { title: conversion.title }),\n markdown: conversion.markdown,\n ...(extracted !== undefined && { extracted }),\n };\n }\n\n /**\n * Fetch HTML content from a URL\n */\n private async fetchHtml(url: string, useHeadless: boolean = false): Promise<string> {\n const startTime = Date.now();\n logger.debug({ url, useHeadless }, 'Fetching HTML');\n\n if (useHeadless) {\n try {\n const result = await this.pythonBridge.fetchHeadless(url);\n const durationMs = Date.now() - startTime;\n logger.info({\n url,\n useHeadless: true,\n durationMs,\n ...summarizePayload(result.html, 'raw-html', url),\n }, 'Raw HTML fetched');\n return result.html;\n } catch (error) {\n // Fallback to axios if headless fails\n logger.warn({ url, error: error instanceof Error ? error.message : String(error) }, 'Headless fetch failed, falling back to axios');\n }\n }\n\n // Original axios implementation for static sites\n try {\n const response = await axios.get<string>(url, {\n timeout: 30000,\n headers: {\n 'User-Agent':\n 'Mozilla/5.0 (compatible; bluera-knowledge-crawler/1.0)',\n },\n });\n\n const durationMs = Date.now() - startTime;\n logger.info({\n url,\n useHeadless: false,\n durationMs,\n ...summarizePayload(response.data, 'raw-html', url),\n }, 'Raw HTML fetched');\n\n return response.data;\n } catch (error) {\n logger.error({ url, error: error instanceof Error ? error.message : String(error) }, 'Failed to fetch HTML');\n throw new Error(\n `Failed to fetch ${url}: ${error instanceof Error ? 
error.message : String(error)}`,\n );\n }\n }\n\n /**\n * Extract links from a page using Python bridge\n */\n private async extractLinks(url: string, useHeadless: boolean = false): Promise<string[]> {\n try {\n // Use headless mode for link extraction if enabled\n if (useHeadless) {\n const result = await this.pythonBridge.fetchHeadless(url);\n // Extract href strings from link objects (crawl4ai returns objects, not strings)\n return result.links.map((link: CrawledLink | string) => {\n if (typeof link === 'string') return link;\n return link.href;\n });\n }\n\n const result = await this.pythonBridge.crawl(url);\n\n // Validate response structure (handle potential runtime type mismatches)\n // eslint-disable-next-line @typescript-eslint/no-unnecessary-condition -- TypeScript types claim pages exists but Python bridge may return invalid structure at runtime\n const firstPage = result.pages?.[0];\n if (!firstPage) {\n throw new Error(`Invalid crawl response structure for ${url}: missing pages array`);\n }\n\n return firstPage.links;\n } catch (error: unknown) {\n // Log the error for debugging\n const errorMessage = error instanceof Error ? error.message : String(error);\n logger.error({ url, error: errorMessage }, 'Failed to extract links');\n\n // Re-throw the error instead of silently swallowing it\n throw new Error(`Link extraction failed for ${url}: ${errorMessage}`);\n }\n }\n\n /**\n * Check if two URLs are from the same domain\n */\n private isSameDomain(url1: string, url2: string): boolean {\n try {\n const domain1 = new URL(url1).hostname.toLowerCase();\n const domain2 = new URL(url2).hostname.toLowerCase();\n return domain1 === domain2 || domain1.endsWith(`.${domain2}`) || domain2.endsWith(`.${domain1}`);\n } catch {\n return false;\n }\n }\n\n /**\n * Stop the crawler\n */\n async stop(): Promise<void> {\n this.stopped = true;\n await this.pythonBridge.stop();\n }\n}\n","/**\n * Claude CLI client for intelligent crawling and extraction\n * Uses `claude -p` programmatically to analyze page structure and extract content\n */\n\nimport { spawn, execSync } from 'node:child_process';\n\n/**\n * Schema for crawl strategy response from Claude\n */\nexport interface CrawlStrategy {\n urls: string[];\n reasoning: string;\n}\n\nconst CRAWL_STRATEGY_SCHEMA = {\n type: 'object',\n properties: {\n urls: {\n type: 'array',\n items: { type: 'string' },\n description: 'List of URLs to crawl based on the instruction',\n },\n reasoning: {\n type: 'string',\n description: 'Brief explanation of why these URLs were selected',\n },\n },\n required: ['urls', 'reasoning'],\n};\n\n/**\n * Client for interacting with Claude Code CLI\n */\nexport class ClaudeClient {\n private readonly timeout: number;\n private static availabilityChecked = false;\n private static available = false;\n\n /**\n * Check if Claude CLI is available in PATH\n * Result is cached after first check for performance\n */\n static isAvailable(): boolean {\n if (!ClaudeClient.availabilityChecked) {\n try {\n execSync('which claude', { stdio: 'ignore' });\n ClaudeClient.available = true;\n } catch {\n ClaudeClient.available = false;\n }\n ClaudeClient.availabilityChecked = true;\n }\n return ClaudeClient.available;\n }\n\n /**\n * Reset availability cache (for testing)\n */\n static resetAvailabilityCache(): void {\n ClaudeClient.availabilityChecked = false;\n ClaudeClient.available = false;\n }\n\n constructor(options: { timeout?: number } = {}) {\n this.timeout = options.timeout ?? 
30000; // 30s default\n }\n\n /**\n * Determine which URLs to crawl based on natural language instruction\n *\n * @param seedHtml - HTML content of the seed page\n * @param instruction - Natural language crawl instruction (e.g., \"scrape all Getting Started pages\")\n * @returns List of URLs to crawl with reasoning\n */\n async determineCrawlUrls(\n seedHtml: string,\n instruction: string,\n ): Promise<CrawlStrategy> {\n const prompt = `You are analyzing a webpage to determine which pages to crawl based on the user's instruction.\n\nInstruction: ${instruction}\n\nWebpage HTML (analyze the navigation structure, links, and content):\n${this.truncateHtml(seedHtml, 50000)}\n\nBased on the instruction, extract and return a list of absolute URLs that should be crawled. Look for navigation menus, sidebars, headers, and link structures that match the instruction.\n\nReturn only URLs that are relevant to the instruction. If the instruction mentions specific sections (e.g., \"Getting Started\"), find links in those sections.`;\n\n try {\n const result = await this.callClaude(prompt, CRAWL_STRATEGY_SCHEMA);\n const parsed: unknown = JSON.parse(result);\n\n // Validate and narrow type\n if (\n typeof parsed !== 'object' ||\n parsed === null ||\n !('urls' in parsed) ||\n !('reasoning' in parsed) ||\n !Array.isArray(parsed.urls) ||\n parsed.urls.length === 0 ||\n typeof parsed.reasoning !== 'string' ||\n !parsed.urls.every((url) => typeof url === 'string')\n ) {\n throw new Error('Claude returned invalid crawl strategy');\n }\n\n // Type is now properly narrowed - urls is string[] after validation\n return { urls: parsed.urls, reasoning: parsed.reasoning };\n } catch (error) {\n throw new Error(\n `Failed to determine crawl strategy: ${error instanceof Error ? error.message : String(error)}`,\n );\n }\n }\n\n /**\n * Extract specific information from markdown content using natural language\n *\n * @param markdown - Page content in markdown format\n * @param instruction - Natural language extraction instruction (e.g., \"extract pricing info\")\n * @returns Extracted information as text\n */\n async extractContent(markdown: string, instruction: string): Promise<string> {\n const prompt = `${instruction}\n\nContent to analyze:\n${this.truncateMarkdown(markdown, 100000)}`;\n\n try {\n const result = await this.callClaude(prompt);\n return result.trim();\n } catch (error) {\n throw new Error(\n `Failed to extract content: ${error instanceof Error ? 
error.message : String(error)}`,\n );\n }\n }\n\n /**\n * Call Claude CLI with a prompt\n *\n * @param prompt - The prompt to send to Claude\n * @param jsonSchema - Optional JSON schema for structured output\n * @returns Claude's response as a string\n */\n private async callClaude(\n prompt: string,\n jsonSchema?: Record<string, unknown>,\n ): Promise<string> {\n return new Promise<string>((resolve, reject) => {\n const args = ['-p'];\n\n // Add JSON schema if provided\n if (jsonSchema) {\n args.push('--json-schema', JSON.stringify(jsonSchema));\n args.push('--output-format', 'json');\n }\n\n const proc = spawn('claude', args, {\n stdio: ['pipe', 'pipe', 'pipe'],\n cwd: process.cwd(),\n env: { ...process.env },\n });\n\n let stdout = '';\n let stderr = '';\n let timeoutId: NodeJS.Timeout | undefined;\n\n // Set timeout\n if (this.timeout > 0) {\n timeoutId = setTimeout(() => {\n proc.kill('SIGTERM');\n reject(new Error(`Claude CLI timed out after ${String(this.timeout)}ms`));\n }, this.timeout);\n }\n\n proc.stdout.on('data', (chunk: Buffer) => {\n stdout += chunk.toString();\n });\n\n proc.stderr.on('data', (chunk: Buffer) => {\n stderr += chunk.toString();\n });\n\n proc.on('close', (code: number | null) => {\n if (timeoutId !== undefined) {\n clearTimeout(timeoutId);\n }\n\n if (code === 0) {\n resolve(stdout.trim());\n } else {\n reject(\n new Error(\n `Claude CLI exited with code ${String(code)}${stderr ? `: ${stderr}` : ''}`,\n ),\n );\n }\n });\n\n proc.on('error', (err) => {\n if (timeoutId !== undefined) {\n clearTimeout(timeoutId);\n }\n reject(new Error(`Failed to spawn Claude CLI: ${err.message}`));\n });\n\n // Write prompt to stdin\n proc.stdin.write(prompt);\n proc.stdin.end();\n });\n }\n\n /**\n * Truncate HTML to a maximum length (keep important parts)\n */\n private truncateHtml(html: string, maxLength: number): string {\n if (html.length <= maxLength) return html;\n\n // Try to keep the beginning (usually has navigation)\n return html.substring(0, maxLength) + '\\n\\n[... HTML truncated ...]';\n }\n\n /**\n * Truncate markdown to a maximum length\n */\n private truncateMarkdown(markdown: string, maxLength: number): string {\n if (markdown.length <= maxLength) return markdown;\n\n return markdown.substring(0, maxLength) + '\\n\\n[... content truncated ...]';\n }\n}\n","/**\n * Article converter using @extractus/article-extractor and Turndown\n * Produces clean markdown from HTML using slurp-ai techniques\n */\n\nimport { extractFromHtml } from '@extractus/article-extractor';\nimport TurndownService from 'turndown';\nimport { gfm } from 'turndown-plugin-gfm';\nimport { preprocessHtmlForCodeBlocks, cleanupMarkdown } from './markdown-utils.js';\nimport { createLogger, truncateForLog } from '../logging/index.js';\n\nconst logger = createLogger('article-converter');\n\nexport interface ConversionResult {\n markdown: string;\n title?: string;\n success: boolean;\n error?: string;\n}\n\n/**\n * Convert HTML to clean markdown using best practices from slurp-ai\n *\n * Pipeline:\n * 1. Extract main article content (strips navigation, ads, boilerplate)\n * 2. Preprocess HTML (handle MkDocs code blocks)\n * 3. Convert to markdown with Turndown + GFM\n * 4. 
Cleanup markdown (regex patterns)\n */\nexport async function convertHtmlToMarkdown(\n html: string,\n url: string,\n): Promise<ConversionResult> {\n logger.debug({ url, htmlLength: html.length }, 'Starting HTML conversion');\n\n try {\n // Step 1: Extract main article content\n let articleHtml: string;\n let title: string | undefined;\n\n try {\n const article = await extractFromHtml(html, url);\n if (article !== null && article.content !== undefined && article.content !== '') {\n articleHtml = article.content;\n title = article.title !== undefined && article.title !== '' ? article.title : undefined;\n logger.debug({\n url,\n title,\n extractedLength: articleHtml.length,\n usedFullHtml: false,\n }, 'Article content extracted');\n } else {\n // Fallback to full HTML if extraction fails\n articleHtml = html;\n logger.debug({ url, usedFullHtml: true }, 'Article extraction returned empty, using full HTML');\n }\n } catch (extractError) {\n // Fallback to full HTML if extraction fails\n articleHtml = html;\n logger.debug({\n url,\n usedFullHtml: true,\n error: extractError instanceof Error ? extractError.message : String(extractError),\n }, 'Article extraction failed, using full HTML');\n }\n\n // Step 2: Preprocess HTML for code blocks\n const preprocessed = preprocessHtmlForCodeBlocks(articleHtml);\n\n // Step 3: Configure Turndown with custom rules\n const turndownService = new TurndownService({\n headingStyle: 'atx', // Use # style headings\n codeBlockStyle: 'fenced', // Use ``` style code blocks\n fence: '```',\n emDelimiter: '*',\n strongDelimiter: '**',\n linkStyle: 'inlined',\n });\n\n // Add GitHub Flavored Markdown support (tables, strikethrough, task lists)\n turndownService.use(gfm);\n\n // Custom rule for headings with anchors (from slurp-ai)\n turndownService.addRule('headingsWithAnchors', {\n filter: ['h1', 'h2', 'h3', 'h4', 'h5', 'h6'],\n replacement(content: string, node: HTMLElement): string {\n const level = Number(node.nodeName.charAt(1));\n const hashes = '#'.repeat(level);\n const cleanContent = content\n .replace(/\\[\\]\\([^)]*\\)/g, '') // Remove empty links\n .replace(/\\s+/g, ' ') // Normalize whitespace\n .trim();\n return cleanContent !== '' ? `\\n\\n${hashes} ${cleanContent}\\n\\n` : '';\n },\n });\n\n // Convert to markdown\n const rawMarkdown = turndownService.turndown(preprocessed);\n\n // Step 4: Cleanup markdown with comprehensive regex patterns\n const markdown = cleanupMarkdown(rawMarkdown);\n\n logger.debug({\n url,\n title,\n rawMarkdownLength: rawMarkdown.length,\n finalMarkdownLength: markdown.length,\n }, 'HTML to markdown conversion complete');\n\n // Log markdown preview at trace level\n logger.trace({\n url,\n markdownPreview: truncateForLog(markdown, 1000),\n }, 'Markdown content preview');\n\n return {\n markdown,\n ...(title !== undefined && { title }),\n success: true,\n };\n } catch (error) {\n logger.error({\n url,\n error: error instanceof Error ? error.message : String(error),\n }, 'HTML to markdown conversion failed');\n\n return {\n markdown: '',\n success: false,\n error: error instanceof Error ? 
error.message : String(error),\n };\n }\n}\n","/**\n * Markdown conversion utilities ported from slurp-ai\n * Source: https://github.com/ratacat/slurp-ai\n *\n * These utilities handle complex documentation site patterns (MkDocs, Sphinx, etc.)\n * and produce clean, well-formatted markdown.\n */\n\nimport * as cheerio from 'cheerio';\n\n/**\n * Detect language from code element class names.\n * Handles various class naming patterns from different highlighters.\n */\nfunction detectLanguageFromClass(className: string | undefined): string {\n if (className === undefined || className === '') return '';\n\n // Common patterns: \"language-python\", \"lang-js\", \"highlight-python\", \"python\", \"hljs language-python\"\n const patterns = [\n /language-(\\w+)/i,\n /lang-(\\w+)/i,\n /highlight-(\\w+)/i,\n /hljs\\s+(\\w+)/i,\n /^(\\w+)$/i,\n ];\n\n for (const pattern of patterns) {\n const match = className.match(pattern);\n if (match?.[1] !== undefined) {\n const lang = match[1].toLowerCase();\n // Filter out common non-language classes\n if (!['hljs', 'highlight', 'code', 'pre', 'block', 'inline'].includes(lang)) {\n return lang;\n }\n }\n }\n\n return '';\n}\n\n/**\n * Escape HTML special characters for safe embedding in HTML.\n */\nfunction escapeHtml(text: string): string {\n return text\n .replace(/&/g, '&amp;')\n .replace(/</g, '&lt;')\n .replace(/>/g, '&gt;')\n .replace(/\"/g, '&quot;')\n .replace(/'/g, '&#039;');\n}\n\n/**\n * Preprocess HTML to handle MkDocs/Material theme code blocks.\n *\n * MkDocs wraps code in tables for line numbers:\n * <table><tbody><tr><td>line numbers</td><td><pre><code>code</code></pre></td></tr></tbody></table>\n *\n * This function converts them to standard <pre><code> blocks that Turndown handles correctly.\n * Also strips syntax highlighting spans and empty anchors from code.\n */\nexport function preprocessHtmlForCodeBlocks(html: string): string {\n if (!html || typeof html !== 'string') return html;\n\n const $ = cheerio.load(html);\n\n // Handle MkDocs/Material table-wrapped code blocks\n $('table').each((_i, table) => {\n const $table = $(table);\n\n // Check if this table contains a code block\n const $codeCell = $table.find('td pre code, td div pre code');\n\n if ($codeCell.length > 0) {\n // This is a code block table - extract the code\n const $pre = $codeCell.closest('pre');\n const $code = $codeCell.first();\n\n // Get language from class\n let language = detectLanguageFromClass($code.attr('class'));\n if (!language) {\n language = detectLanguageFromClass($pre.attr('class'));\n }\n\n // Get the text content, stripping all inner HTML tags\n const codeText = $code.text();\n\n // Create a clean pre > code block\n const cleanPre = `<pre><code class=\"language-${language}\">${escapeHtml(codeText)}</code></pre>`;\n\n // Replace the entire table with the clean code block\n $table.replaceWith(cleanPre);\n }\n });\n\n // Strip empty anchor tags used for line numbers\n $('pre a, code a').each((_i, anchor) => {\n const $anchor = $(anchor);\n if (!$anchor.text().trim()) {\n $anchor.remove();\n }\n });\n\n // Strip syntax highlighting spans inside code blocks, keeping only text\n $('pre span, code span').each((_i, span) => {\n const $span = $(span);\n $span.replaceWith($span.text());\n });\n\n // Handle standalone pre blocks that might have spans/anchors\n $('pre').each((_i, pre) => {\n const $pre = $(pre);\n // If this pre has a code child, it was already processed\n if ($pre.find('code').length === 0) {\n // Direct pre without code - get text content\n const 
text = $pre.text();\n const lang = detectLanguageFromClass($pre.attr('class'));\n $pre.html(`<code class=\"language-${lang}\">${escapeHtml(text)}</code>`);\n }\n });\n\n return $.html();\n}\n\n/**\n * Apply comprehensive cleanup rules to markdown content.\n *\n * Formatting rules:\n * - Double newlines between paragraphs and headings\n * - Double newlines before lists when preceded by normal text\n * - Single newlines between list items\n * - No blank lines inside code blocks\n */\nexport function cleanupMarkdown(markdown: string): string {\n if (!markdown) return '';\n\n const trimmed = markdown.trim();\n if (trimmed === '') return '';\n\n let result = trimmed;\n\n // 0. Fix broken headings where ## is on its own line followed by the text\n // Pattern: \"## \\n\\nSome text\" → \"## Some text\"\n result = result.replace(/^(#{1,6})\\s*\\n\\n+(\\S[^\\n]*)/gm, '$1 $2');\n\n // 0.5. Normalize multiple spaces after heading markers to single space\n // Pattern: \"## Subtitle\" → \"## Subtitle\"\n result = result.replace(/(#{1,6})\\s{2,}/g, '$1 ');\n\n // 1. Fix navigation links with excessive whitespace\n result = result.replace(/\\*\\s+\\[\\s*([^\\n]+?)\\s*\\]\\(([^)]+)\\)/g, '* [$1]($2)');\n\n // 2. Handle headings with specific newline requirements\n\n // Text followed by heading should have a single newline between them (no blank line)\n result = result.replace(/([^\\n])\\n\\n+(#\\s)/g, '$1\\n$2');\n\n // Add double newlines between text and next heading\n result = result.replace(/(Some text\\.)\\n(##\\s)/g, '$1\\n\\n$2');\n\n // Double newlines after a heading when followed by text\n result = result.replace(/(#{1,6}\\s[^\\n]+)\\n([^#\\n])/g, '$1\\n\\n$2');\n\n // Double newlines between headings\n result = result.replace(/(#{1,6}\\s[^\\n]+)\\n(#{1,6}\\s)/g, '$1\\n\\n$2');\n\n // 3. Lists - ensure all list items have single newlines only\n result = result.replace(\n /(\\* Item 1)\\n\\n+(\\* Item 2)\\n\\n+(\\* Item 3)/g,\n '$1\\n$2\\n$3',\n );\n\n // 3.5. General list item spacing - ensure single newlines between list items\n result = result.replace(/(^\\*\\s[^\\n]+)\\n{2,}(^\\*\\s)/gm, '$1\\n$2');\n\n // 4. Clean up excessive blank lines (3+ newlines → 2 newlines)\n result = result.replace(/\\n{3,}/g, '\\n\\n');\n\n // 5. Code blocks - no blank lines after opening or before closing backticks\n result = result.replace(/(```[^\\n]*)\\n\\n+/g, '$1\\n');\n result = result.replace(/\\n\\n+```/g, '\\n```');\n\n // 6. Remove empty list items\n result = result.replace(/\\*\\s*\\n\\s*\\*/g, '*');\n\n // 7. Strip any remaining HTML tags that leaked through (common in MkDocs/Material)\n // Remove table structure tags\n result = result.replace(/<\\/?table[^>]*>/gi, '');\n result = result.replace(/<\\/?tbody[^>]*>/gi, '');\n result = result.replace(/<\\/?thead[^>]*>/gi, '');\n result = result.replace(/<\\/?tr[^>]*>/gi, '');\n result = result.replace(/<\\/?td[^>]*>/gi, '');\n result = result.replace(/<\\/?th[^>]*>/gi, '');\n\n // Remove empty anchor tags: <a></a> or <a id=\"...\"></a>\n result = result.replace(/<a[^>]*><\\/a>/gi, '');\n\n // Remove span tags (syntax highlighting remnants)\n result = result.replace(/<\\/?span[^>]*>/gi, '');\n\n // Remove div tags\n result = result.replace(/<\\/?div[^>]*>/gi, '');\n\n // Remove pre/code tags that leaked\n result = result.replace(/<\\/?pre[^>]*>/gi, '');\n result = result.replace(/<\\/?code[^>]*>/gi, '');\n\n // 8. Remove empty markdown links: [](url) and []()\n result = result.replace(/\\[\\]\\([^)]*\\)/g, '');\n\n // 9. 
Remove codelineno references that leaked into content\n // Pattern: [](_file.md#__codelineno-N-M)\n result = result.replace(/\\[\\]\\([^)]*#__codelineno-[^)]+\\)/g, '');\n\n // Also clean inline codelineno patterns\n result = result.replace(/\\[?\\]?\\([^)]*#__codelineno-[^)]*\\)/g, '');\n\n // 10. Clean up any double-escaped HTML entities that might result\n result = result.replace(/&amp;lt;/g, '&lt;');\n result = result.replace(/&amp;gt;/g, '&gt;');\n result = result.replace(/&amp;amp;/g, '&amp;');\n\n // 11. Final cleanup - normalize excessive whitespace from removed tags\n result = result.replace(/\\n{3,}/g, '\\n\\n');\n result = result.replace(/[ \\t]+\\n/g, '\\n');\n\n return result;\n}\n"],"mappings":";;;;;;;;AAKA,SAAS,oBAAoB;AAC7B,OAAO,WAAW;;;ACDlB,SAAS,OAAO,gBAAgB;AAUhC,IAAM,wBAAwB;AAAA,EAC5B,MAAM;AAAA,EACN,YAAY;AAAA,IACV,MAAM;AAAA,MACJ,MAAM;AAAA,MACN,OAAO,EAAE,MAAM,SAAS;AAAA,MACxB,aAAa;AAAA,IACf;AAAA,IACA,WAAW;AAAA,MACT,MAAM;AAAA,MACN,aAAa;AAAA,IACf;AAAA,EACF;AAAA,EACA,UAAU,CAAC,QAAQ,WAAW;AAChC;AAKO,IAAM,eAAN,MAAM,cAAa;AAAA,EACP;AAAA,EACjB,OAAe,sBAAsB;AAAA,EACrC,OAAe,YAAY;AAAA;AAAA;AAAA;AAAA;AAAA,EAM3B,OAAO,cAAuB;AAC5B,QAAI,CAAC,cAAa,qBAAqB;AACrC,UAAI;AACF,iBAAS,gBAAgB,EAAE,OAAO,SAAS,CAAC;AAC5C,sBAAa,YAAY;AAAA,MAC3B,QAAQ;AACN,sBAAa,YAAY;AAAA,MAC3B;AACA,oBAAa,sBAAsB;AAAA,IACrC;AACA,WAAO,cAAa;AAAA,EACtB;AAAA;AAAA;AAAA;AAAA,EAKA,OAAO,yBAA+B;AACpC,kBAAa,sBAAsB;AACnC,kBAAa,YAAY;AAAA,EAC3B;AAAA,EAEA,YAAY,UAAgC,CAAC,GAAG;AAC9C,SAAK,UAAU,QAAQ,WAAW;AAAA,EACpC;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EASA,MAAM,mBACJ,UACA,aACwB;AACxB,UAAM,SAAS;AAAA;AAAA,eAEJ,WAAW;AAAA;AAAA;AAAA,EAGxB,KAAK,aAAa,UAAU,GAAK,CAAC;AAAA;AAAA;AAAA;AAAA;AAMhC,QAAI;AACF,YAAM,SAAS,MAAM,KAAK,WAAW,QAAQ,qBAAqB;AAClE,YAAM,SAAkB,KAAK,MAAM,MAAM;AAGzC,UACE,OAAO,WAAW,YAClB,WAAW,QACX,EAAE,UAAU,WACZ,EAAE,eAAe,WACjB,CAAC,MAAM,QAAQ,OAAO,IAAI,KAC1B,OAAO,KAAK,WAAW,KACvB,OAAO,OAAO,cAAc,YAC5B,CAAC,OAAO,KAAK,MAAM,CAAC,QAAQ,OAAO,QAAQ,QAAQ,GACnD;AACA,cAAM,IAAI,MAAM,wCAAwC;AAAA,MAC1D;AAGA,aAAO,EAAE,MAAM,OAAO,MAAM,WAAW,OAAO,UAAU;AAAA,IAC1D,SAAS,OAAO;AACd,YAAM,IAAI;AAAA,QACR,uCAAuC,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK,CAAC;AAAA,MAC/F;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EASA,MAAM,eAAe,UAAkB,aAAsC;AAC3E,UAAM,SAAS,GAAG,WAAW;AAAA;AAAA;AAAA,EAG/B,KAAK,iBAAiB,UAAU,GAAM,CAAC;AAErC,QAAI;AACF,YAAM,SAAS,MAAM,KAAK,WAAW,MAAM;AAC3C,aAAO,OAAO,KAAK;AAAA,IACrB,SAAS,OAAO;AACd,YAAM,IAAI;AAAA,QACR,8BAA8B,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK,CAAC;AAAA,MACtF;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EASA,MAAc,WACZ,QACA,YACiB;AACjB,WAAO,IAAI,QAAgB,CAAC,SAAS,WAAW;AAC9C,YAAM,OAAO,CAAC,IAAI;AAGlB,UAAI,YAAY;AACd,aAAK,KAAK,iBAAiB,KAAK,UAAU,UAAU,CAAC;AACrD,aAAK,KAAK,mBAAmB,MAAM;AAAA,MACrC;AAEA,YAAM,OAAO,MAAM,UAAU,MAAM;AAAA,QACjC,OAAO,CAAC,QAAQ,QAAQ,MAAM;AAAA,QAC9B,KAAK,QAAQ,IAAI;AAAA,QACjB,KAAK,EAAE,GAAG,QAAQ,IAAI;AAAA,MACxB,CAAC;AAED,UAAI,SAAS;AACb,UAAI,SAAS;AACb,UAAI;AAGJ,UAAI,KAAK,UAAU,GAAG;AACpB,oBAAY,WAAW,MAAM;AAC3B,eAAK,KAAK,SAAS;AACnB,iBAAO,IAAI,MAAM,8BAA8B,OAAO,KAAK,OAAO,CAAC,IAAI,CAAC;AAAA,QAC1E,GAAG,KAAK,OAAO;AAAA,MACjB;AAEA,WAAK,OAAO,GAAG,QAAQ,CAAC,UAAkB;AACxC,kBAAU,MAAM,SAAS;AAAA,MAC3B,CAAC;AAED,WAAK,OAAO,GAAG,QAAQ,CAAC,UAAkB;AACxC,kBAAU,MAAM,SAAS;AAAA,MAC3B,CAAC;AAED,WAAK,GAAG,SAAS,CAAC,SAAwB;AACxC,YAAI,cAAc,QAAW;AAC3B,uBAAa,SAAS;AAAA,QACxB;AAEA,YAAI,SAAS,GAAG;AACd,kBAAQ,OAAO,KAAK,CAAC;AAAA,QACvB,OAAO;AACL;AAAA,YACE,IAAI;AAAA,cACF,+BAA+B,OAAO,IAAI,CAAC,GAAG,SAAS,KAAK,MAAM,KAAK,EAAE;AAAA,YAC3E;AAAA,UACF;AAAA,QACF;AAAA,MACF,CAAC;AAED,WAAK,GAAG,SAAS,CAAC,QAAQ;AACxB,YAAI,cAAc,QAAW;AAC3B,uBAAa,SAAS;AAAA,QACxB;AACA,eAAO,IAAI,MAAM,
+BAA+B,IAAI,OAAO,EAAE,CAAC;AAAA,MAChE,CAAC;AAGD,WAAK,MAAM,MAAM,MAAM;AACvB,WAAK,MAAM,IAAI;AAAA,IACjB,CAAC;AAAA,EACH;AAAA;AAAA;AAAA;AAAA,EAKQ,aAAa,MAAc,WAA2B;AAC5D,QAAI,KAAK,UAAU,UAAW,QAAO;AAGrC,WAAO,KAAK,UAAU,GAAG,SAAS,IAAI;AAAA,EACxC;AAAA;AAAA;AAAA;AAAA,EAKQ,iBAAiB,UAAkB,WAA2B;AACpE,QAAI,SAAS,UAAU,UAAW,QAAO;AAEzC,WAAO,SAAS,UAAU,GAAG,SAAS,IAAI;AAAA,EAC5C;AACF;;;ACpOA,SAAS,uBAAuB;AAChC,OAAO,qBAAqB;AAC5B,SAAS,WAAW;;;ACCpB,YAAY,aAAa;AAMzB,SAAS,wBAAwB,WAAuC;AACtE,MAAI,cAAc,UAAa,cAAc,GAAI,QAAO;AAGxD,QAAM,WAAW;AAAA,IACf;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,EACF;AAEA,aAAW,WAAW,UAAU;AAC9B,UAAM,QAAQ,UAAU,MAAM,OAAO;AACrC,QAAI,QAAQ,CAAC,MAAM,QAAW;AAC5B,YAAM,OAAO,MAAM,CAAC,EAAE,YAAY;AAElC,UAAI,CAAC,CAAC,QAAQ,aAAa,QAAQ,OAAO,SAAS,QAAQ,EAAE,SAAS,IAAI,GAAG;AAC3E,eAAO;AAAA,MACT;AAAA,IACF;AAAA,EACF;AAEA,SAAO;AACT;AAKA,SAAS,WAAW,MAAsB;AACxC,SAAO,KACJ,QAAQ,MAAM,OAAO,EACrB,QAAQ,MAAM,MAAM,EACpB,QAAQ,MAAM,MAAM,EACpB,QAAQ,MAAM,QAAQ,EACtB,QAAQ,MAAM,QAAQ;AAC3B;AAWO,SAAS,4BAA4B,MAAsB;AAChE,MAAI,CAAC,QAAQ,OAAO,SAAS,SAAU,QAAO;AAE9C,QAAM,IAAY,aAAK,IAAI;AAG3B,IAAE,OAAO,EAAE,KAAK,CAAC,IAAI,UAAU;AAC7B,UAAM,SAAS,EAAE,KAAK;AAGtB,UAAM,YAAY,OAAO,KAAK,8BAA8B;AAE5D,QAAI,UAAU,SAAS,GAAG;AAExB,YAAM,OAAO,UAAU,QAAQ,KAAK;AACpC,YAAM,QAAQ,UAAU,MAAM;AAG9B,UAAI,WAAW,wBAAwB,MAAM,KAAK,OAAO,CAAC;AAC1D,UAAI,CAAC,UAAU;AACb,mBAAW,wBAAwB,KAAK,KAAK,OAAO,CAAC;AAAA,MACvD;AAGA,YAAM,WAAW,MAAM,KAAK;AAG5B,YAAM,WAAW,8BAA8B,QAAQ,KAAK,WAAW,QAAQ,CAAC;AAGhF,aAAO,YAAY,QAAQ;AAAA,IAC7B;AAAA,EACF,CAAC;AAGD,IAAE,eAAe,EAAE,KAAK,CAAC,IAAI,WAAW;AACtC,UAAM,UAAU,EAAE,MAAM;AACxB,QAAI,CAAC,QAAQ,KAAK,EAAE,KAAK,GAAG;AAC1B,cAAQ,OAAO;AAAA,IACjB;AAAA,EACF,CAAC;AAGD,IAAE,qBAAqB,EAAE,KAAK,CAAC,IAAI,SAAS;AAC1C,UAAM,QAAQ,EAAE,IAAI;AACpB,UAAM,YAAY,MAAM,KAAK,CAAC;AAAA,EAChC,CAAC;AAGD,IAAE,KAAK,EAAE,KAAK,CAAC,IAAI,QAAQ;AACzB,UAAM,OAAO,EAAE,GAAG;AAElB,QAAI,KAAK,KAAK,MAAM,EAAE,WAAW,GAAG;AAElC,YAAM,OAAO,KAAK,KAAK;AACvB,YAAM,OAAO,wBAAwB,KAAK,KAAK,OAAO,CAAC;AACvD,WAAK,KAAK,yBAAyB,IAAI,KAAK,WAAW,IAAI,CAAC,SAAS;AAAA,IACvE;AAAA,EACF,CAAC;AAED,SAAO,EAAE,KAAK;AAChB;AAWO,SAAS,gBAAgB,UAA0B;AACxD,MAAI,CAAC,SAAU,QAAO;AAEtB,QAAM,UAAU,SAAS,KAAK;AAC9B,MAAI,YAAY,GAAI,QAAO;AAE3B,MAAI,SAAS;AAIb,WAAS,OAAO,QAAQ,iCAAiC,OAAO;AAIhE,WAAS,OAAO,QAAQ,mBAAmB,KAAK;AAGhD,WAAS,OAAO,QAAQ,wCAAwC,YAAY;AAK5E,WAAS,OAAO,QAAQ,sBAAsB,QAAQ;AAGtD,WAAS,OAAO,QAAQ,0BAA0B,UAAU;AAG5D,WAAS,OAAO,QAAQ,+BAA+B,UAAU;AAGjE,WAAS,OAAO,QAAQ,iCAAiC,UAAU;AAGnE,WAAS,OAAO;AAAA,IACd;AAAA,IACA;AAAA,EACF;AAGA,WAAS,OAAO,QAAQ,gCAAgC,QAAQ;AAGhE,WAAS,OAAO,QAAQ,WAAW,MAAM;AAGzC,WAAS,OAAO,QAAQ,qBAAqB,MAAM;AACnD,WAAS,OAAO,QAAQ,aAAa,OAAO;AAG5C,WAAS,OAAO,QAAQ,iBAAiB,GAAG;AAI5C,WAAS,OAAO,QAAQ,qBAAqB,EAAE;AAC/C,WAAS,OAAO,QAAQ,qBAAqB,EAAE;AAC/C,WAAS,OAAO,QAAQ,qBAAqB,EAAE;AAC/C,WAAS,OAAO,QAAQ,kBAAkB,EAAE;AAC5C,WAAS,OAAO,QAAQ,kBAAkB,EAAE;AAC5C,WAAS,OAAO,QAAQ,kBAAkB,EAAE;AAG5C,WAAS,OAAO,QAAQ,mBAAmB,EAAE;AAG7C,WAAS,OAAO,QAAQ,oBAAoB,EAAE;AAG9C,WAAS,OAAO,QAAQ,mBAAmB,EAAE;AAG7C,WAAS,OAAO,QAAQ,mBAAmB,EAAE;AAC7C,WAAS,OAAO,QAAQ,oBAAoB,EAAE;AAG9C,WAAS,OAAO,QAAQ,kBAAkB,EAAE;AAI5C,WAAS,OAAO,QAAQ,qCAAqC,EAAE;AAG/D,WAAS,OAAO,QAAQ,uCAAuC,EAAE;AAGjE,WAAS,OAAO,QAAQ,aAAa,MAAM;AAC3C,WAAS,OAAO,QAAQ,aAAa,MAAM;AAC3C,WAAS,OAAO,QAAQ,cAAc,OAAO;AAG7C,WAAS,OAAO,QAAQ,WAAW,MAAM;AACzC,WAAS,OAAO,QAAQ,aAAa,IAAI;AAEzC,SAAO;AACT;;;ADxNA,IAAM,SAAS,aAAa,mBAAmB;AAkB/C,eAAsB,sBACpB,MACA,KAC2B;AAC3B,SAAO,MAAM,EAAE,KAAK,YAAY,KAAK,OAAO,GAAG,0BAA0B;AAEzE,MAAI;AAEF,QAAI;AACJ,QAAI;AAEJ,QAAI;AACF,YAAM,UAAU,MAAM,gBAAgB,MAAM,GAAG;AAC/C,UAAI,YAAY,QAAQ,QAAQ,YAAY,UAAa,QAAQ,YAAY,IAAI;AAC/E,sBAAc,QAAQ;AACtB,gBAAQ,QAAQ,UAAU,UAAa,QAAQ,UAAU,KAAK,QAAQ,QAAQ;AAC9E,eAAO,MAAM;AAAA,UACX;AAAA,UACA;AAAA,UACA,iBAA
iB,YAAY;AAAA,UAC7B,cAAc;AAAA,QAChB,GAAG,2BAA2B;AAAA,MAChC,OAAO;AAEL,sBAAc;AACd,eAAO,MAAM,EAAE,KAAK,cAAc,KAAK,GAAG,oDAAoD;AAAA,MAChG;AAAA,IACF,SAAS,cAAc;AAErB,oBAAc;AACd,aAAO,MAAM;AAAA,QACX;AAAA,QACA,cAAc;AAAA,QACd,OAAO,wBAAwB,QAAQ,aAAa,UAAU,OAAO,YAAY;AAAA,MACnF,GAAG,4CAA4C;AAAA,IACjD;AAGA,UAAM,eAAe,4BAA4B,WAAW;AAG5D,UAAM,kBAAkB,IAAI,gBAAgB;AAAA,MAC1C,cAAc;AAAA;AAAA,MACd,gBAAgB;AAAA;AAAA,MAChB,OAAO;AAAA,MACP,aAAa;AAAA,MACb,iBAAiB;AAAA,MACjB,WAAW;AAAA,IACb,CAAC;AAGD,oBAAgB,IAAI,GAAG;AAGvB,oBAAgB,QAAQ,uBAAuB;AAAA,MAC7C,QAAQ,CAAC,MAAM,MAAM,MAAM,MAAM,MAAM,IAAI;AAAA,MAC3C,YAAY,SAAiB,MAA2B;AACtD,cAAM,QAAQ,OAAO,KAAK,SAAS,OAAO,CAAC,CAAC;AAC5C,cAAM,SAAS,IAAI,OAAO,KAAK;AAC/B,cAAM,eAAe,QAClB,QAAQ,kBAAkB,EAAE,EAC5B,QAAQ,QAAQ,GAAG,EACnB,KAAK;AACR,eAAO,iBAAiB,KAAK;AAAA;AAAA,EAAO,MAAM,IAAI,YAAY;AAAA;AAAA,IAAS;AAAA,MACrE;AAAA,IACF,CAAC;AAGD,UAAM,cAAc,gBAAgB,SAAS,YAAY;AAGzD,UAAM,WAAW,gBAAgB,WAAW;AAE5C,WAAO,MAAM;AAAA,MACX;AAAA,MACA;AAAA,MACA,mBAAmB,YAAY;AAAA,MAC/B,qBAAqB,SAAS;AAAA,IAChC,GAAG,sCAAsC;AAGzC,WAAO,MAAM;AAAA,MACX;AAAA,MACA,iBAAiB,eAAe,UAAU,GAAI;AAAA,IAChD,GAAG,0BAA0B;AAE7B,WAAO;AAAA,MACL;AAAA,MACA,GAAI,UAAU,UAAa,EAAE,MAAM;AAAA,MACnC,SAAS;AAAA,IACX;AAAA,EACF,SAAS,OAAO;AACd,WAAO,MAAM;AAAA,MACX;AAAA,MACA,OAAO,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK;AAAA,IAC9D,GAAG,oCAAoC;AAEvC,WAAO;AAAA,MACL,UAAU;AAAA,MACV,SAAS;AAAA,MACT,OAAO,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK;AAAA,IAC9D;AAAA,EACF;AACF;;;AFxHA,IAAMA,UAAS,aAAa,SAAS;AA+B9B,IAAM,qBAAN,cAAiC,aAAa;AAAA,EAClC;AAAA,EACA;AAAA,EACA;AAAA,EACT;AAAA,EAER,cAAc;AACZ,UAAM;AACN,SAAK,eAAe,IAAI,aAAa;AACrC,SAAK,eAAe,IAAI,aAAa;AACrC,SAAK,UAAU,oBAAI,IAAI;AACvB,SAAK,UAAU;AAAA,EACjB;AAAA;AAAA;AAAA;AAAA,EAKA,OAAO,MACL,SACA,UAAwB,CAAC,GACG;AAC5B,UAAM;AAAA,MACJ;AAAA,MACA;AAAA,MACA,WAAW;AAAA,MACX,SAAS;AAAA,IACX,IAAI;AAEJ,SAAK,QAAQ,MAAM;AACnB,SAAK,UAAU;AAEf,IAAAA,QAAO,KAAK;AAAA,MACV;AAAA,MACA;AAAA,MACA,MAAM,SAAS,WAAY,qBAAqB,UAAa,qBAAqB,KAAK,gBAAgB;AAAA,MACvG,uBAAuB,uBAAuB;AAAA,IAChD,GAAG,gBAAgB;AAEnB,UAAM,gBAA+B;AAAA,MACnC,MAAM;AAAA,MACN,cAAc;AAAA,MACd,YAAY;AAAA,IACd;AACA,SAAK,KAAK,YAAY,aAAa;AAGnC,UAAM,qBAAqB,CAAC,UAAU,qBAAqB,UAAa,qBAAqB;AAE7F,QAAI,oBAAoB;AAEtB,aAAO,KAAK,iBAAiB,SAAS,kBAAkB,oBAAoB,UAAU,QAAQ,eAAe,KAAK;AAAA,IACpH,OAAO;AACL,aAAO,KAAK,YAAY,SAAS,oBAAoB,UAAU,QAAQ,eAAe,KAAK;AAAA,IAC7F;AAEA,IAAAA,QAAO,KAAK;AAAA,MACV;AAAA,MACA,cAAc,KAAK,QAAQ;AAAA,IAC7B,GAAG,gBAAgB;AAEnB,UAAM,mBAAkC;AAAA,MACtC,MAAM;AAAA,MACN,cAAc,KAAK,QAAQ;AAAA,MAC3B,YAAY,KAAK,QAAQ;AAAA,IAC3B;AACA,SAAK,KAAK,YAAY,gBAAgB;AAAA,EACxC;AAAA;AAAA;AAAA;AAAA,EAKA,OAAe,iBACb,SACA,kBACA,oBACA,UACA,cAAuB,OACK;AAE5B,QAAI,CAAC,aAAa,YAAY,GAAG;AAC/B,YAAM,mBAAkC;AAAA,QACtC,MAAM;AAAA,QACN,cAAc;AAAA,QACd,YAAY;AAAA,QACZ,SAAS;AAAA,QACT,OAAO,IAAI,MAAM,0BAA0B;AAAA,MAC7C;AACA,WAAK,KAAK,YAAY,gBAAgB;AACtC,aAAO,KAAK,YAAY,SAAS,oBAAoB,UAAU,WAAW;AAC1E;AAAA,IACF;AAEA,QAAI;AAEJ,QAAI;AAEF,YAAM,wBAAuC;AAAA,QAC3C,MAAM;AAAA,QACN,cAAc;AAAA,QACd,YAAY;AAAA,QACZ,YAAY;AAAA,QACZ,SAAS;AAAA,MACX;AACA,WAAK,KAAK,YAAY,qBAAqB;AAE3C,YAAM,WAAW,MAAM,KAAK,UAAU,SAAS,WAAW;AAG1D,iBAAW,MAAM,KAAK,aAAa,mBAAmB,UAAU,gBAAgB;AAEhF,YAAM,2BAA0C;AAAA,QAC9C,MAAM;AAAA,QACN,cAAc;AAAA,QACd,YAAY;AAAA,QACZ,SAAS,qBAAqB,OAAO,SAAS,KAAK,MAAM,CAAC,mBAAmB,SAAS,SAAS;AAAA,MACjG;AACA,WAAK,KAAK,YAAY,wBAAwB;AAAA,IAChD,SAAS,OAAO;AAEd,YAAM,gBAA+B;AAAA,QACnC,MAAM;AAAA,QACN,cAAc;AAAA,QACd,YAAY;AAAA,QACZ,SAAS;AAAA,QACT,OAAO,iBAAiB,QAAQ,QAAQ,IAAI,MAAM,OAAO,KAAK,CAAC;AAAA,MACjE;AACA,WAAK,KAAK,YAAY,aAAa;AAEnC,aAAO,KAAK,YAAY,SAAS,oBAAoB,QAAQ;AAC7D;AAAA,IACF;AAGA,QAAI,eAAe;AAEnB,eAAW,OAAO,SAAS,MAAM;AAC/B,UAAI,KAAK,WAAW,gBAAgB,SAAU;AAC9C,UAAI,KAAK,QAAQ,IAAI,GAAG,EAAG;AAE3B,UAAI;AACF,cAAM,SAAS,MAAM,KAAK,gBAAg
B,KAAK,oBAAoB,cAAc,WAAW;AAC5F;AACA,cAAM;AAAA,MACR,SAAS,OAAO;AACd,cAAM,oBAAmC;AAAA,UACvC,MAAM;AAAA,UACN;AAAA,UACA,YAAY;AAAA,UACZ,YAAY;AAAA,UACZ,OAAO,iBAAiB,QAAQ,QAAQ,IAAI,MAAM,OAAO,KAAK,CAAC;AAAA,QACjE;AACA,aAAK,KAAK,YAAY,iBAAiB;AAAA,MACzC;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,OAAe,YACb,SACA,oBACA,UACA,cAAuB,OACK;AAC5B,UAAM,QAA+C,CAAC,EAAE,KAAK,SAAS,OAAO,EAAE,CAAC;AAChF,UAAM,WAAW;AACjB,QAAI,eAAe;AAEnB,WAAO,MAAM,SAAS,KAAK,eAAe,YAAY,CAAC,KAAK,SAAS;AACnE,YAAM,UAAU,MAAM,MAAM;AAE5B,UAAI,CAAC,WAAW,KAAK,QAAQ,IAAI,QAAQ,GAAG,KAAK,QAAQ,QAAQ,UAAU;AACzE;AAAA,MACF;AAEA,UAAI;AACF,cAAM,SAAS,MAAM,KAAK;AAAA,UACxB,QAAQ;AAAA,UACR;AAAA,UACA;AAAA,UACA;AAAA,QACF;AACA,eAAO,QAAQ,QAAQ;AACvB;AAEA,cAAM;AAGN,YAAI,QAAQ,QAAQ,UAAU;AAC5B,cAAI;AACF,kBAAM,QAAQ,MAAM,KAAK,aAAa,QAAQ,KAAK,WAAW;AAE9D,gBAAI,MAAM,WAAW,GAAG;AACtB,cAAAA,QAAO,MAAM,EAAE,KAAK,QAAQ,IAAI,GAAG,0CAA0C;AAAA,YAC/E,OAAO;AACL,cAAAA,QAAO,MAAM,EAAE,KAAK,QAAQ,KAAK,WAAW,MAAM,OAAO,GAAG,2BAA2B;AAAA,YACzF;AAEA,uBAAW,QAAQ,OAAO;AACxB,kBAAI,CAAC,KAAK,QAAQ,IAAI,IAAI,KAAK,KAAK,aAAa,SAAS,IAAI,GAAG;AAC/D,sBAAM,KAAK,EAAE,KAAK,MAAM,OAAO,QAAQ,QAAQ,EAAE,CAAC;AAAA,cACpD;AAAA,YACF;AAAA,UACF,SAAS,OAAO;AAEd,kBAAM,gBAA+B;AAAA,cACnC,MAAM;AAAA,cACN;AAAA,cACA,YAAY;AAAA,cACZ,YAAY,QAAQ;AAAA,cACpB,SAAS,gCAAgC,QAAQ,GAAG;AAAA,cACpD,OAAO,iBAAiB,QAAQ,QAAQ,IAAI,MAAM,OAAO,KAAK,CAAC;AAAA,YACjE;AACA,iBAAK,KAAK,YAAY,aAAa;AAAA,UACrC;AAAA,QACF;AAAA,MACF,SAAS,OAAO;AACd,cAAM,sBAAqC;AAAA,UACzC,MAAM;AAAA,UACN;AAAA,UACA,YAAY;AAAA,UACZ,YAAY,QAAQ;AAAA,UACpB,OAAO,iBAAiB,QAAQ,QAAQ,IAAI,MAAM,OAAO,KAAK,CAAC;AAAA,QACjE;AACA,aAAK,KAAK,YAAY,mBAAmB;AAAA,MAC3C;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAc,gBACZ,KACA,oBACA,cACA,cAAuB,OACD;AACtB,UAAM,eAA8B;AAAA,MAClC,MAAM;AAAA,MACN;AAAA,MACA,YAAY;AAAA,MACZ,YAAY;AAAA,IACd;AACA,SAAK,KAAK,YAAY,YAAY;AAGlC,SAAK,QAAQ,IAAI,GAAG;AAGpB,UAAM,OAAO,MAAM,KAAK,UAAU,KAAK,WAAW;AAGlD,UAAM,aAAa,MAAM,sBAAsB,MAAM,GAAG;AAExD,QAAI,CAAC,WAAW,SAAS;AACvB,MAAAA,QAAO,MAAM,EAAE,KAAK,OAAO,WAAW,MAAM,GAAG,oCAAoC;AACnF,YAAM,IAAI,MAAM,2BAA2B,WAAW,SAAS,eAAe,EAAE;AAAA,IAClF;AAEA,IAAAA,QAAO,MAAM;AAAA,MACX;AAAA,MACA,OAAO,WAAW;AAAA,MAClB,gBAAgB,WAAW,SAAS;AAAA,IACtC,GAAG,+BAA+B;AAElC,QAAI;AAGJ,QAAI,uBAAuB,UAAa,uBAAuB,IAAI;AAEjE,UAAI,CAAC,aAAa,YAAY,GAAG;AAC/B,cAAM,eAA8B;AAAA,UAClC,MAAM;AAAA,UACN;AAAA,UACA,YAAY;AAAA,UACZ,YAAY;AAAA,UACZ,SAAS;AAAA,UACT,OAAO,IAAI,MAAM,0BAA0B;AAAA,QAC7C;AACA,aAAK,KAAK,YAAY,YAAY;AAAA,MACpC,OAAO;AACL,YAAI;AACF,gBAAM,qBAAoC;AAAA,YACxC,MAAM;AAAA,YACN;AAAA,YACA,YAAY;AAAA,YACZ,YAAY;AAAA,UACd;AACA,eAAK,KAAK,YAAY,kBAAkB;AAExC,sBAAY,MAAM,KAAK,aAAa;AAAA,YAClC,WAAW;AAAA,YACX;AAAA,UACF;AAAA,QACF,SAAS,OAAO;AAEd,gBAAM,0BAAyC;AAAA,YAC7C,MAAM;AAAA,YACN;AAAA,YACA,YAAY;AAAA,YACZ,YAAY;AAAA,YACZ,SAAS;AAAA,YACT,OAAO,iBAAiB,QAAQ,QAAQ,IAAI,MAAM,OAAO,KAAK,CAAC;AAAA,UACjE;AACA,eAAK,KAAK,YAAY,uBAAuB;AAAA,QAC/C;AAAA,MACF;AAAA,IACF;AAEA,WAAO;AAAA,MACL;AAAA,MACA,GAAI,WAAW,UAAU,UAAa,EAAE,OAAO,WAAW,MAAM;AAAA,MAChE,UAAU,WAAW;AAAA,MACrB,GAAI,cAAc,UAAa,EAAE,UAAU;AAAA,IAC7C;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAc,UAAU,KAAa,cAAuB,OAAwB;AAClF,UAAM,YAAY,KAAK,IAAI;AAC3B,IAAAA,QAAO,MAAM,EAAE,KAAK,YAAY,GAAG,eAAe;AAElD,QAAI,aAAa;AACf,UAAI;AACF,cAAM,SAAS,MAAM,KAAK,aAAa,cAAc,GAAG;AACxD,cAAM,aAAa,KAAK,IAAI,IAAI;AAChC,QAAAA,QAAO,KAAK;AAAA,UACV;AAAA,UACA,aAAa;AAAA,UACb;AAAA,UACA,GAAG,iBAAiB,OAAO,MAAM,YAAY,GAAG;AAAA,QAClD,GAAG,kBAAkB;AACrB,eAAO,OAAO;AAAA,MAChB,SAAS,OAAO;AAEd,QAAAA,QAAO,KAAK,EAAE,KAAK,OAAO,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK,EAAE,GAAG,8CAA8C;AAAA,MACpI;AAAA,IACF;AAGA,QAAI;AACF,YAAM,WAAW,MAAM,MAAM,IAAY,KAAK;AAAA,QAC5C,SAAS;AAAA,QACT,SAAS;AAAA,UACP,cACE;AAAA,QACJ;AAAA,MACF,CAAC;AAED,YAAM,aAAa,KAAK,IAAI,IA
AI;AAChC,MAAAA,QAAO,KAAK;AAAA,QACV;AAAA,QACA,aAAa;AAAA,QACb;AAAA,QACA,GAAG,iBAAiB,SAAS,MAAM,YAAY,GAAG;AAAA,MACpD,GAAG,kBAAkB;AAErB,aAAO,SAAS;AAAA,IAClB,SAAS,OAAO;AACd,MAAAA,QAAO,MAAM,EAAE,KAAK,OAAO,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK,EAAE,GAAG,sBAAsB;AAC3G,YAAM,IAAI;AAAA,QACR,mBAAmB,GAAG,KAAK,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK,CAAC;AAAA,MACnF;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAc,aAAa,KAAa,cAAuB,OAA0B;AACvF,QAAI;AAEF,UAAI,aAAa;AACf,cAAMC,UAAS,MAAM,KAAK,aAAa,cAAc,GAAG;AAExD,eAAOA,QAAO,MAAM,IAAI,CAAC,SAA+B;AACtD,cAAI,OAAO,SAAS,SAAU,QAAO;AACrC,iBAAO,KAAK;AAAA,QACd,CAAC;AAAA,MACH;AAEA,YAAM,SAAS,MAAM,KAAK,aAAa,MAAM,GAAG;AAIhD,YAAM,YAAY,OAAO,QAAQ,CAAC;AAClC,UAAI,CAAC,WAAW;AACd,cAAM,IAAI,MAAM,wCAAwC,GAAG,uBAAuB;AAAA,MACpF;AAEA,aAAO,UAAU;AAAA,IACnB,SAAS,OAAgB;AAEvB,YAAM,eAAe,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK;AAC1E,MAAAD,QAAO,MAAM,EAAE,KAAK,OAAO,aAAa,GAAG,yBAAyB;AAGpE,YAAM,IAAI,MAAM,8BAA8B,GAAG,KAAK,YAAY,EAAE;AAAA,IACtE;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKQ,aAAa,MAAc,MAAuB;AACxD,QAAI;AACF,YAAM,UAAU,IAAI,IAAI,IAAI,EAAE,SAAS,YAAY;AACnD,YAAM,UAAU,IAAI,IAAI,IAAI,EAAE,SAAS,YAAY;AACnD,aAAO,YAAY,WAAW,QAAQ,SAAS,IAAI,OAAO,EAAE,KAAK,QAAQ,SAAS,IAAI,OAAO,EAAE;AAAA,IACjG,QAAQ;AACN,aAAO;AAAA,IACT;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,OAAsB;AAC1B,SAAK,UAAU;AACf,UAAM,KAAK,aAAa,KAAK;AAAA,EAC/B;AACF;","names":["logger","result"]}
@@ -1 +0,0 @@
- {"version":3,"sources":["../src/mcp/server.ts","../src/mcp/schemas/index.ts","../src/mcp/cache.ts","../src/services/token.service.ts","../src/mcp/handlers/search.handler.ts","../src/mcp/handlers/index.ts","../src/mcp/commands/registry.ts","../src/mcp/commands/store.commands.ts","../src/mcp/handlers/store.handler.ts","../src/workers/spawn-worker.ts","../src/mcp/commands/job.commands.ts","../src/mcp/handlers/job.handler.ts","../src/mcp/commands/meta.commands.ts","../src/mcp/commands/index.ts","../src/mcp/handlers/execute.handler.ts"],"sourcesContent":["import { Server } from '@modelcontextprotocol/sdk/server/index.js';\nimport { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';\nimport {\n CallToolRequestSchema,\n ListToolsRequestSchema,\n} from '@modelcontextprotocol/sdk/types.js';\nimport { createServices } from '../services/index.js';\nimport { tools } from './handlers/index.js';\nimport { handleExecute } from './handlers/execute.handler.js';\nimport { ExecuteArgsSchema } from './schemas/index.js';\nimport type { MCPServerOptions } from './types.js';\nimport { createLogger } from '../logging/index.js';\n\nconst logger = createLogger('mcp-server');\n\n// eslint-disable-next-line @typescript-eslint/no-deprecated\nexport function createMCPServer(options: MCPServerOptions): Server {\n // eslint-disable-next-line @typescript-eslint/no-deprecated\n const server = new Server(\n {\n name: 'bluera-knowledge',\n version: '1.0.0',\n },\n {\n capabilities: {\n tools: {},\n },\n }\n );\n\n // List available tools - consolidated from 10 tools to 3 for reduced context overhead\n server.setRequestHandler(ListToolsRequestSchema, () => {\n return Promise.resolve({\n tools: [\n // Native search tool with full schema (most used, benefits from detailed params)\n {\n name: 'search',\n description: 'Search all indexed knowledge stores with pattern detection and AI-optimized results. Returns structured code units with progressive context layers.',\n inputSchema: {\n type: 'object',\n properties: {\n query: {\n type: 'string',\n description: 'Search query (can include type signatures, constraints, or natural language)'\n },\n intent: {\n type: 'string',\n enum: ['find-pattern', 'find-implementation', 'find-usage', 'find-definition', 'find-documentation'],\n description: 'Search intent for better ranking'\n },\n detail: {\n type: 'string',\n enum: ['minimal', 'contextual', 'full'],\n default: 'minimal',\n description: 'Context detail level: minimal (summary only), contextual (+ imports/types), full (+ complete code)'\n },\n limit: {\n type: 'number',\n default: 10,\n description: 'Maximum number of results'\n },\n stores: {\n type: 'array',\n items: { type: 'string' },\n description: 'Specific store IDs to search (optional)'\n }\n },\n required: ['query']\n }\n },\n // Native get_full_context tool (frequently used after search)\n {\n name: 'get_full_context',\n description: 'Get complete code and context for a specific search result by ID. Use this after search to get full implementation details.',\n inputSchema: {\n type: 'object',\n properties: {\n resultId: {\n type: 'string',\n description: 'Result ID from previous search'\n }\n },\n required: ['resultId']\n }\n },\n // Meta-tool for store and job management (consolidates 8 tools into 1)\n {\n name: 'execute',\n description: 'Execute store/job management commands. 
Commands: stores, store:info, store:create, store:index, store:delete, jobs, job:status, job:cancel, help, commands',\n inputSchema: {\n type: 'object',\n properties: {\n command: {\n type: 'string',\n description: 'Command to execute (e.g., \"stores\", \"store:create\", \"jobs\", \"help\")'\n },\n args: {\n type: 'object',\n description: 'Command arguments (e.g., {store: \"mystore\"} for store:info)'\n }\n },\n required: ['command']\n }\n }\n ]\n });\n });\n\n // Handle tool calls\n server.setRequestHandler(CallToolRequestSchema, async (request) => {\n const { name, arguments: args } = request.params;\n const startTime = Date.now();\n\n logger.info({ tool: name, args: JSON.stringify(args) }, 'Tool invoked');\n\n // Create services once (needed by all handlers)\n const services = await createServices(\n options.config,\n options.dataDir,\n options.projectRoot\n );\n const context = { services, options };\n\n try {\n let result;\n\n // Handle execute meta-tool\n if (name === 'execute') {\n const validated = ExecuteArgsSchema.parse(args ?? {});\n result = await handleExecute(validated, context);\n } else {\n // Find handler in registry for native tools (search, get_full_context)\n const tool = tools.find(t => t.name === name);\n if (tool === undefined) {\n throw new Error(`Unknown tool: ${name}`);\n }\n\n // Validate arguments with Zod\n const validated = tool.schema.parse(args ?? {});\n\n // Execute handler with context\n result = await tool.handler(validated, context);\n }\n\n const durationMs = Date.now() - startTime;\n logger.info({ tool: name, durationMs }, 'Tool completed');\n\n return result;\n } catch (error) {\n const durationMs = Date.now() - startTime;\n logger.error({\n tool: name,\n durationMs,\n error: error instanceof Error ? error.message : String(error),\n }, 'Tool execution failed');\n throw error;\n }\n });\n\n return server;\n}\n\nexport async function runMCPServer(options: MCPServerOptions): Promise<void> {\n logger.info({\n dataDir: options.dataDir,\n projectRoot: options.projectRoot,\n }, 'MCP server starting');\n\n const server = createMCPServer(options);\n const transport = new StdioServerTransport();\n await server.connect(transport);\n\n logger.info('MCP server connected to stdio transport');\n}\n\n// Run the server only when this file is executed directly (not imported by CLI)\n// Check if we're running as the mcp/server entry point vs being imported by index.js\nconst scriptPath = process.argv[1] ?? '';\nconst isMCPServerEntry = scriptPath.endsWith('mcp/server.js') || scriptPath.endsWith('mcp/server');\n\nif (isMCPServerEntry) {\n runMCPServer({\n dataDir: process.env['DATA_DIR'],\n config: process.env['CONFIG_PATH'],\n projectRoot: process.env['PROJECT_ROOT'] ?? process.env['PWD']\n }).catch((error: unknown) => {\n logger.error({ error: error instanceof Error ? 
error.message : String(error) }, 'Failed to start MCP server');\n process.exit(1);\n });\n}\n","import { z } from 'zod';\n\n/**\n * Validation schemas for all MCP tool inputs\n *\n * These schemas provide runtime type validation and better error messages\n * compared to manual type assertions.\n */\n\n// ============================================================================\n// Search Tool Schemas\n// ============================================================================\n\n/**\n * Schema for search tool arguments\n */\nexport const SearchArgsSchema = z.object({\n query: z.string().min(1, 'Query must be a non-empty string'),\n intent: z\n .enum(['find-pattern', 'find-implementation', 'find-usage', 'find-definition', 'find-documentation'])\n .optional(),\n detail: z.enum(['minimal', 'contextual', 'full']).default('minimal'),\n limit: z.number().int().positive().default(10),\n stores: z.array(z.string()).optional()\n});\n\nexport type SearchArgs = z.infer<typeof SearchArgsSchema>;\n\n/**\n * Schema for get_full_context tool arguments\n */\nexport const GetFullContextArgsSchema = z.object({\n resultId: z.string().min(1, 'Result ID must be a non-empty string')\n});\n\nexport type GetFullContextArgs = z.infer<typeof GetFullContextArgsSchema>;\n\n// ============================================================================\n// Store Tool Schemas\n// ============================================================================\n\n/**\n * Schema for list_stores tool arguments\n */\nexport const ListStoresArgsSchema = z.object({\n type: z.enum(['file', 'repo', 'web']).optional()\n});\n\nexport type ListStoresArgs = z.infer<typeof ListStoresArgsSchema>;\n\n/**\n * Schema for get_store_info tool arguments\n */\nexport const GetStoreInfoArgsSchema = z.object({\n store: z.string().min(1, 'Store name or ID must be a non-empty string')\n});\n\nexport type GetStoreInfoArgs = z.infer<typeof GetStoreInfoArgsSchema>;\n\n/**\n * Schema for create_store tool arguments\n */\nexport const CreateStoreArgsSchema = z.object({\n name: z.string().min(1, 'Store name must be a non-empty string'),\n type: z.enum(['file', 'repo']),\n source: z.string().min(1, 'Source path or URL must be a non-empty string'),\n branch: z.string().optional(),\n description: z.string().optional()\n});\n\nexport type CreateStoreArgs = z.infer<typeof CreateStoreArgsSchema>;\n\n/**\n * Schema for index_store tool arguments\n */\nexport const IndexStoreArgsSchema = z.object({\n store: z.string().min(1, 'Store name or ID must be a non-empty string')\n});\n\nexport type IndexStoreArgs = z.infer<typeof IndexStoreArgsSchema>;\n\n/**\n * Schema for delete_store tool arguments\n */\nexport const DeleteStoreArgsSchema = z.object({\n store: z.string().min(1, 'Store name or ID must be a non-empty string')\n});\n\nexport type DeleteStoreArgs = z.infer<typeof DeleteStoreArgsSchema>;\n\n// ============================================================================\n// Job Tool Schemas\n// ============================================================================\n\n/**\n * Schema for check_job_status tool arguments\n */\nexport const CheckJobStatusArgsSchema = z.object({\n jobId: z.string().min(1, 'Job ID must be a non-empty string')\n});\n\nexport type CheckJobStatusArgs = z.infer<typeof CheckJobStatusArgsSchema>;\n\n/**\n * Schema for list_jobs tool arguments\n */\nexport const ListJobsArgsSchema = z.object({\n activeOnly: z.boolean().optional(),\n status: z.enum(['pending', 'running', 'completed', 'failed', 
'cancelled']).optional()\n});\n\nexport type ListJobsArgs = z.infer<typeof ListJobsArgsSchema>;\n\n/**\n * Schema for cancel_job tool arguments\n */\nexport const CancelJobArgsSchema = z.object({\n jobId: z.string().min(1, 'Job ID must be a non-empty string')\n});\n\nexport type CancelJobArgs = z.infer<typeof CancelJobArgsSchema>;\n\n// ============================================================================\n// Execute Meta-Tool Schema\n// ============================================================================\n\n/**\n * Schema for execute meta-tool arguments\n *\n * The execute tool consolidates store and job management commands\n * into a single tool, reducing context overhead.\n */\nexport const ExecuteArgsSchema = z.object({\n command: z.string().min(1, 'Command name is required'),\n args: z.record(z.string(), z.unknown()).optional()\n});\n\nexport type ExecuteArgs = z.infer<typeof ExecuteArgsSchema>;\n","/**\n * LRU (Least Recently Used) Cache implementation\n *\n * Maintains a cache with a maximum size, evicting the oldest (least recently used)\n * items when the capacity is exceeded. This prevents unbounded memory growth.\n *\n * Items are automatically moved to the end of the cache when accessed (via get),\n * making them the most recently used.\n */\nexport class LRUCache<K, V> {\n private readonly cache = new Map<K, V>();\n private readonly maxSize: number;\n\n /**\n * Create a new LRU cache\n *\n * @param maxSize - Maximum number of items to store (default: 1000)\n */\n constructor(maxSize: number = 1000) {\n this.maxSize = maxSize;\n }\n\n /**\n * Store a value in the cache\n *\n * If the key already exists, it will be moved to the end (most recent).\n * If the cache is at capacity, the oldest item will be evicted.\n *\n * @param key - The cache key\n * @param value - The value to store\n */\n set(key: K, value: V): void {\n // If key exists, delete it first to move it to the end\n if (this.cache.has(key)) {\n this.cache.delete(key);\n }\n\n // Add the new/updated entry\n this.cache.set(key, value);\n\n // Evict oldest entry if over capacity\n if (this.cache.size > this.maxSize) {\n const firstKey = this.cache.keys().next().value;\n if (firstKey !== undefined) {\n this.cache.delete(firstKey);\n }\n }\n }\n\n /**\n * Retrieve a value from the cache\n *\n * If the key exists, it will be moved to the end (most recent).\n *\n * @param key - The cache key\n * @returns The cached value, or undefined if not found\n */\n get(key: K): V | undefined {\n const value = this.cache.get(key);\n\n if (value !== undefined) {\n // Move to end (most recent) by deleting and re-adding\n this.cache.delete(key);\n this.cache.set(key, value);\n }\n\n return value;\n }\n\n /**\n * Check if a key exists in the cache\n *\n * @param key - The cache key\n * @returns True if the key exists\n */\n has(key: K): boolean {\n return this.cache.has(key);\n }\n\n /**\n * Remove a specific key from the cache\n *\n * @param key - The cache key\n * @returns True if the key was removed, false if it didn't exist\n */\n delete(key: K): boolean {\n return this.cache.delete(key);\n }\n\n /**\n * Clear all entries from the cache\n */\n clear(): void {\n this.cache.clear();\n }\n\n /**\n * Get the current number of items in the cache\n */\n get size(): number {\n return this.cache.size;\n }\n}\n","/**\n * Token estimation service using Anthropic's recommended heuristic.\n * For Claude 3+ models, Anthropic recommends ~3.5 characters per token\n * for English text. 
This varies by language.\n *\n * Note: The official @anthropic-ai/tokenizer package only works for\n * pre-Claude 3 models. For accurate counts on Claude 3+, use the\n * Token Count API. This heuristic is suitable for display purposes.\n */\n\nconst CHARS_PER_TOKEN = 3.5;\n\n/**\n * Estimate token count for a string using character-based heuristic.\n * @param text - The text to estimate tokens for\n * @returns Estimated token count (rounded up)\n */\nexport function estimateTokens(text: string): number {\n if (!text) return 0;\n return Math.ceil(text.length / CHARS_PER_TOKEN);\n}\n\n/**\n * Format token count for display with appropriate suffix.\n * @param tokens - Token count\n * @returns Formatted string like \"~1.2k\" or \"~847\"\n */\nexport function formatTokenCount(tokens: number): string {\n if (tokens >= 1000) {\n return `~${(tokens / 1000).toFixed(1)}k`;\n }\n return `~${String(tokens)}`;\n}\n","import type { ToolHandler, ToolResponse } from '../types.js';\nimport type { SearchArgs, GetFullContextArgs } from '../schemas/index.js';\nimport { SearchArgsSchema, GetFullContextArgsSchema } from '../schemas/index.js';\nimport type { SearchQuery, DocumentId, StoreId } from '../../types/index.js';\nimport { LRUCache } from '../cache.js';\nimport type { SearchResult } from '../../types/search.js';\nimport { createLogger, summarizePayload } from '../../logging/index.js';\nimport { estimateTokens, formatTokenCount } from '../../services/token.service.js';\n\nconst logger = createLogger('mcp-search');\n\n// Create result cache for get_full_context\n// Uses LRU cache to prevent memory leaks (max 1000 items)\nexport const resultCache = new LRUCache<DocumentId, SearchResult>(1000);\n\n/**\n * Handle search requests\n *\n * Searches across specified stores (or all stores if none specified) using\n * hybrid vector + FTS search. Results are cached for get_full_context retrieval.\n */\nexport const handleSearch: ToolHandler<SearchArgs> = async (\n args,\n context\n): Promise<ToolResponse> => {\n // Validate arguments with Zod\n const validated = SearchArgsSchema.parse(args);\n\n logger.info({\n query: validated.query,\n stores: validated.stores,\n detail: validated.detail,\n limit: validated.limit,\n intent: validated.intent,\n }, 'Search started');\n\n const { services } = context;\n\n // Get all stores if none specified, resolve store names to IDs\n const storeIds: StoreId[] = validated.stores !== undefined\n ? await Promise.all(validated.stores.map(async (s) => {\n // eslint-disable-next-line @typescript-eslint/consistent-type-assertions\n const store = await services.store.getByIdOrName(s as StoreId);\n if (!store) {\n throw new Error(`Store not found: ${s}`);\n }\n return store.id;\n }))\n : (await services.store.list()).map(s => s.id);\n\n // Initialize stores with error handling\n try {\n for (const storeId of storeIds) {\n await services.lance.initialize(storeId);\n }\n } catch (error) {\n throw new Error(\n `Failed to initialize vector stores: ${error instanceof Error ? 
error.message : String(error)}`\n );\n }\n\n // Perform search\n const searchQuery: SearchQuery = {\n query: validated.query,\n stores: storeIds,\n mode: 'hybrid',\n limit: validated.limit,\n detail: validated.detail\n };\n\n const results = await services.search.search(searchQuery);\n\n // Cache results for get_full_context (with LRU eviction)\n for (const result of results.results) {\n resultCache.set(result.id, result);\n }\n\n // Add repoRoot to results for cloned repos\n const enhancedResults = await Promise.all(results.results.map(async (r) => {\n const storeId = r.metadata.storeId;\n const store = await services.store.getByIdOrName(storeId);\n\n return {\n id: r.id,\n score: r.score,\n summary: {\n ...r.summary,\n storeName: store?.name,\n repoRoot: store !== undefined && store.type === 'repo' ? store.path : undefined\n },\n context: r.context,\n full: r.full\n };\n }));\n\n const responseJson = JSON.stringify({\n results: enhancedResults,\n totalResults: results.totalResults,\n mode: results.mode,\n timeMs: results.timeMs\n }, null, 2);\n\n // Calculate actual token estimate based on response content\n const responseTokens = estimateTokens(responseJson);\n\n // Create visible header with token usage\n const header = `Search: \"${validated.query}\" | Results: ${String(results.totalResults)} | ${formatTokenCount(responseTokens)} tokens | ${String(results.timeMs)}ms\\n\\n`;\n\n // Log the complete MCP response that will be sent to Claude Code\n logger.info({\n query: validated.query,\n totalResults: results.totalResults,\n responseTokens,\n timeMs: results.timeMs,\n ...summarizePayload(responseJson, 'mcp-response', validated.query),\n }, 'Search complete - context sent to Claude Code');\n\n return {\n content: [\n {\n type: 'text',\n text: header + responseJson\n }\n ]\n };\n};\n\n/**\n * Handle get_full_context requests\n *\n * Retrieves full context for a previously cached search result.\n * If the result isn't already full, re-queries with full detail level.\n */\nexport const handleGetFullContext: ToolHandler<GetFullContextArgs> = async (\n args,\n context\n): Promise<ToolResponse> => {\n // Validate arguments with Zod\n const validated = GetFullContextArgsSchema.parse(args);\n\n logger.info({ resultId: validated.resultId }, 'Get full context requested');\n\n // eslint-disable-next-line @typescript-eslint/consistent-type-assertions\n const resultId = validated.resultId as DocumentId;\n\n // Check cache for result\n const cachedResult = resultCache.get(resultId);\n\n if (!cachedResult) {\n throw new Error(\n `Result not found in cache: ${resultId}. 
Run a search first to cache results.`\n );\n }\n\n // If result already has full context, return it\n if (cachedResult.full) {\n const responseJson = JSON.stringify({\n id: cachedResult.id,\n score: cachedResult.score,\n summary: cachedResult.summary,\n context: cachedResult.context,\n full: cachedResult.full\n }, null, 2);\n\n logger.info({\n resultId,\n cached: true,\n hasFullContext: true,\n ...summarizePayload(responseJson, 'mcp-full-context', resultId),\n }, 'Full context retrieved from cache');\n\n return {\n content: [\n {\n type: 'text',\n text: responseJson\n }\n ]\n };\n }\n\n // Otherwise, re-query with full detail\n const { services } = context;\n const store = await services.store.getByIdOrName(cachedResult.metadata.storeId);\n\n if (!store) {\n throw new Error(`Store not found: ${cachedResult.metadata.storeId}`);\n }\n\n await services.lance.initialize(store.id);\n\n const searchQuery: SearchQuery = {\n query: cachedResult.content.substring(0, 100), // Use snippet of content as query\n stores: [store.id],\n mode: 'hybrid',\n limit: 1,\n detail: 'full'\n };\n\n const results = await services.search.search(searchQuery);\n\n // Find matching result by ID\n const fullResult = results.results.find(r => r.id === resultId);\n\n if (!fullResult) {\n // Return cached result even if we couldn't get full detail\n return {\n content: [\n {\n type: 'text',\n text: JSON.stringify({\n id: cachedResult.id,\n score: cachedResult.score,\n summary: cachedResult.summary,\n context: cachedResult.context,\n warning: 'Could not retrieve full context, returning cached minimal result'\n }, null, 2)\n }\n ]\n };\n }\n\n // Update cache with full result\n resultCache.set(resultId, fullResult);\n\n const responseJson = JSON.stringify({\n id: fullResult.id,\n score: fullResult.score,\n summary: fullResult.summary,\n context: fullResult.context,\n full: fullResult.full\n }, null, 2);\n\n logger.info({\n resultId,\n cached: false,\n hasFullContext: true,\n ...summarizePayload(responseJson, 'mcp-full-context', resultId),\n }, 'Full context retrieved via re-query');\n\n return {\n content: [\n {\n type: 'text',\n text: responseJson\n }\n ]\n };\n};\n","import { z } from 'zod';\nimport type { ToolHandler } from '../types.js';\nimport {\n SearchArgsSchema,\n GetFullContextArgsSchema\n} from '../schemas/index.js';\nimport {\n handleSearch,\n handleGetFullContext\n} from './search.handler.js';\n\n/**\n * Tool definition with schema and handler\n */\nexport interface ToolDefinition {\n name: string;\n description: string;\n schema: z.ZodType;\n // eslint-disable-next-line @typescript-eslint/no-explicit-any\n handler: ToolHandler<any>;\n}\n\n/**\n * Registry of native MCP tools\n *\n * Only search and get_full_context are native tools with full schemas.\n * Store and job management is consolidated into the execute meta-tool\n * (see commands/ directory and execute.handler.ts).\n */\nexport const tools: ToolDefinition[] = [\n {\n name: 'search',\n description: 'Search all indexed knowledge stores with pattern detection and AI-optimized results. Returns structured code units with progressive context layers.',\n schema: SearchArgsSchema,\n handler: handleSearch\n },\n {\n name: 'get_full_context',\n description: 'Get complete code and context for a specific search result by ID. 
Use this after search to get full implementation details.',\n schema: GetFullContextArgsSchema,\n handler: handleGetFullContext\n }\n];\n","import { z } from 'zod';\nimport type { HandlerContext, ToolResponse } from '../types.js';\n\n/**\n * Command definition for the execute meta-tool\n *\n * Each command has a name, description, optional args schema, and handler.\n * Commands are registered in the registry and invoked via execute(command, args).\n */\nexport interface CommandDefinition {\n /** Command name (e.g., 'stores', 'store:info', 'job:status') */\n name: string;\n /** Human-readable description shown in help output */\n description: string;\n /** Optional Zod schema for argument validation */\n argsSchema?: z.ZodType;\n /** Handler function that executes the command */\n handler: CommandHandler;\n}\n\n/**\n * Command handler function signature\n *\n * @param args - Validated command arguments (or empty object if no args)\n * @param context - Handler context with services and options\n * @returns Promise resolving to tool response\n */\nexport type CommandHandler = (\n args: Record<string, unknown>,\n context: HandlerContext\n) => Promise<ToolResponse>;\n\n/**\n * Command registry - singleton that holds all registered commands\n */\nclass CommandRegistry {\n private readonly commands = new Map<string, CommandDefinition>();\n\n /**\n * Register a command\n */\n register(command: CommandDefinition): void {\n if (this.commands.has(command.name)) {\n throw new Error(`Command already registered: ${command.name}`);\n }\n this.commands.set(command.name, command);\n }\n\n /**\n * Register multiple commands at once\n */\n registerAll(commands: CommandDefinition[]): void {\n for (const command of commands) {\n this.register(command);\n }\n }\n\n /**\n * Get a command by name\n */\n get(name: string): CommandDefinition | undefined {\n return this.commands.get(name);\n }\n\n /**\n * Check if a command exists\n */\n has(name: string): boolean {\n return this.commands.has(name);\n }\n\n /**\n * Get all registered commands\n */\n all(): CommandDefinition[] {\n return Array.from(this.commands.values());\n }\n\n /**\n * Get commands grouped by category (prefix before colon)\n */\n grouped(): Map<string, CommandDefinition[]> {\n const groups = new Map<string, CommandDefinition[]>();\n\n for (const cmd of this.commands.values()) {\n const colonIndex = cmd.name.indexOf(':');\n const category = colonIndex === -1 ? 'general' : cmd.name.slice(0, colonIndex);\n\n const existing = groups.get(category) ?? [];\n existing.push(cmd);\n groups.set(category, existing);\n }\n\n return groups;\n }\n}\n\n/** Global command registry instance */\nexport const commandRegistry = new CommandRegistry();\n\n/**\n * Execute a command by name\n *\n * @param commandName - The command to execute\n * @param args - Arguments to pass to the command\n * @param context - Handler context\n * @returns Promise resolving to tool response\n */\nexport async function executeCommand(\n commandName: string,\n args: Record<string, unknown>,\n context: HandlerContext\n): Promise<ToolResponse> {\n const command = commandRegistry.get(commandName);\n\n if (command === undefined) {\n throw new Error(\n `Unknown command: ${commandName}. 
Use execute(\"commands\") to list available commands.`\n );\n }\n\n // Validate args if schema provided (Zod parse returns unknown, safe to cast after validation)\n /* eslint-disable @typescript-eslint/consistent-type-assertions */\n const validatedArgs: Record<string, unknown> = command.argsSchema !== undefined\n ? (command.argsSchema.parse(args) as Record<string, unknown>)\n : args;\n /* eslint-enable @typescript-eslint/consistent-type-assertions */\n\n return command.handler(validatedArgs, context);\n}\n\n/**\n * Generate help text for a command or all commands\n */\nexport function generateHelp(commandName?: string): string {\n if (commandName !== undefined) {\n const command = commandRegistry.get(commandName);\n if (command === undefined) {\n throw new Error(`Unknown command: ${commandName}`);\n }\n\n const lines = [\n `Command: ${command.name}`,\n `Description: ${command.description}`,\n ''\n ];\n\n if (command.argsSchema !== undefined) {\n lines.push('Arguments:');\n // Extract schema shape for documentation\n const schema = command.argsSchema;\n if (schema instanceof z.ZodObject) {\n // eslint-disable-next-line @typescript-eslint/consistent-type-assertions\n const shape = schema.shape as Record<string, z.ZodType>;\n for (const [key, fieldSchema] of Object.entries(shape)) {\n const isOptional = fieldSchema.safeParse(undefined).success;\n const desc = fieldSchema.description ?? '';\n lines.push(` ${key}${isOptional ? ' (optional)' : ''}: ${desc}`);\n }\n }\n } else {\n lines.push('Arguments: none');\n }\n\n return lines.join('\\n');\n }\n\n // Generate help for all commands\n const groups = commandRegistry.grouped();\n const lines = ['Available commands:', ''];\n\n for (const [category, commands] of groups) {\n lines.push(`${category}:`);\n for (const cmd of commands) {\n lines.push(` ${cmd.name} - ${cmd.description}`);\n }\n lines.push('');\n }\n\n lines.push('Use execute(\"help\", {command: \"name\"}) for detailed command help.');\n\n return lines.join('\\n');\n}\n","import { z } from 'zod';\nimport type { CommandDefinition } from './registry.js';\nimport type {\n ListStoresArgs,\n GetStoreInfoArgs,\n CreateStoreArgs,\n IndexStoreArgs,\n DeleteStoreArgs\n} from '../schemas/index.js';\nimport {\n handleListStores,\n handleGetStoreInfo,\n handleCreateStore,\n handleIndexStore,\n handleDeleteStore\n} from '../handlers/store.handler.js';\n\n/**\n * Store management commands for the execute meta-tool\n *\n * These commands wrap the existing store handlers, providing\n * a unified interface through the execute command.\n *\n * Note: Type assertions are necessary here because CommandHandler uses\n * Record<string, unknown> for generic command args, while handlers expect\n * specific typed args. 
Zod validates at runtime before the cast.\n */\n/* eslint-disable @typescript-eslint/consistent-type-assertions */\nexport const storeCommands: CommandDefinition[] = [\n {\n name: 'stores',\n description: 'List all indexed knowledge stores',\n argsSchema: z.object({\n type: z.enum(['file', 'repo', 'web']).optional().describe('Filter by store type')\n }),\n handler: (args, context) => handleListStores(args as unknown as ListStoresArgs, context)\n },\n {\n name: 'store:info',\n description: 'Get detailed information about a specific store',\n argsSchema: z.object({\n store: z.string().min(1).describe('Store name or ID')\n }),\n handler: (args, context) => handleGetStoreInfo(args as unknown as GetStoreInfoArgs, context)\n },\n {\n name: 'store:create',\n description: 'Create a new knowledge store from git URL or local path',\n argsSchema: z.object({\n name: z.string().min(1).describe('Store name'),\n type: z.enum(['file', 'repo']).describe('Store type'),\n source: z.string().min(1).describe('Git URL or local path'),\n branch: z.string().optional().describe('Git branch (for repo type)'),\n description: z.string().optional().describe('Store description')\n }),\n handler: (args, context) => handleCreateStore(args as unknown as CreateStoreArgs, context)\n },\n {\n name: 'store:index',\n description: 'Re-index a knowledge store to update search data',\n argsSchema: z.object({\n store: z.string().min(1).describe('Store name or ID')\n }),\n handler: (args, context) => handleIndexStore(args as unknown as IndexStoreArgs, context)\n },\n {\n name: 'store:delete',\n description: 'Delete a knowledge store and all associated data',\n argsSchema: z.object({\n store: z.string().min(1).describe('Store name or ID')\n }),\n handler: (args, context) => handleDeleteStore(args as unknown as DeleteStoreArgs, context)\n }\n];\n/* eslint-enable @typescript-eslint/consistent-type-assertions */\n","import { rm } from 'node:fs/promises';\nimport { join } from 'node:path';\nimport type { ToolHandler, ToolResponse } from '../types.js';\nimport type {\n ListStoresArgs,\n GetStoreInfoArgs,\n CreateStoreArgs,\n IndexStoreArgs,\n DeleteStoreArgs\n} from '../schemas/index.js';\nimport {\n ListStoresArgsSchema,\n GetStoreInfoArgsSchema,\n CreateStoreArgsSchema,\n IndexStoreArgsSchema,\n DeleteStoreArgsSchema\n} from '../schemas/index.js';\nimport { JobService } from '../../services/job.service.js';\nimport { spawnBackgroundWorker } from '../../workers/spawn-worker.js';\nimport { createStoreId } from '../../types/brands.js';\n\n/**\n * Handle list_stores requests\n *\n * Lists all knowledge stores with optional type filtering.\n */\nexport const handleListStores: ToolHandler<ListStoresArgs> = async (\n args,\n context\n): Promise<ToolResponse> => {\n // Validate arguments with Zod\n const validated = ListStoresArgsSchema.parse(args);\n\n const { services } = context;\n\n const stores = await services.store.list();\n const filtered = validated.type !== undefined\n ? stores.filter(s => s.type === validated.type)\n : stores;\n\n return {\n content: [\n {\n type: 'text',\n text: JSON.stringify({\n stores: filtered.map(s => ({\n id: s.id,\n name: s.name,\n type: s.type,\n path: 'path' in s ? s.path : undefined,\n url: 'url' in s && s.url !== undefined ? 
s.url : undefined,\n description: s.description,\n createdAt: s.createdAt.toISOString()\n }))\n }, null, 2)\n }\n ]\n };\n};\n\n/**\n * Handle get_store_info requests\n *\n * Retrieves detailed information about a specific store.\n */\nexport const handleGetStoreInfo: ToolHandler<GetStoreInfoArgs> = async (\n args,\n context\n): Promise<ToolResponse> => {\n // Validate arguments with Zod\n const validated = GetStoreInfoArgsSchema.parse(args);\n\n const { services } = context;\n\n const store = await services.store.getByIdOrName(createStoreId(validated.store));\n\n if (store === undefined) {\n throw new Error(`Store not found: ${validated.store}`);\n }\n\n return {\n content: [\n {\n type: 'text',\n text: JSON.stringify({\n id: store.id,\n name: store.name,\n type: store.type,\n path: 'path' in store ? store.path : undefined,\n url: 'url' in store && store.url !== undefined ? store.url : undefined,\n branch: 'branch' in store ? store.branch : undefined,\n description: store.description,\n status: store.status,\n createdAt: store.createdAt.toISOString(),\n updatedAt: store.updatedAt.toISOString()\n }, null, 2)\n }\n ]\n };\n};\n\n/**\n * Handle create_store requests\n *\n * Creates a new knowledge store and starts background indexing.\n * Returns store info and job ID for tracking progress.\n */\nexport const handleCreateStore: ToolHandler<CreateStoreArgs> = async (\n args,\n context\n): Promise<ToolResponse> => {\n // Validate arguments with Zod\n const validated = CreateStoreArgsSchema.parse(args);\n\n const { services, options } = context;\n\n // Determine if source is a URL or path\n const isUrl = validated.source.startsWith('http://') ||\n validated.source.startsWith('https://') ||\n validated.source.startsWith('git@');\n\n const result = await services.store.create({\n name: validated.name,\n type: validated.type,\n ...(isUrl ? { url: validated.source } : { path: validated.source }),\n ...(validated.branch !== undefined ? { branch: validated.branch } : {}),\n ...(validated.description !== undefined ? { description: validated.description } : {})\n });\n\n if (!result.success) {\n throw new Error(result.error.message);\n }\n\n // Create background job for indexing\n const jobService = new JobService(options.dataDir);\n const jobDetails: Record<string, unknown> = {\n storeName: result.data.name,\n storeId: result.data.id\n };\n if (isUrl) {\n jobDetails['url'] = validated.source;\n }\n if ('path' in result.data && result.data.path) {\n jobDetails['path'] = result.data.path;\n }\n const job = jobService.createJob({\n type: validated.type === 'repo' && isUrl ? 'clone' : 'index',\n details: jobDetails,\n message: `Indexing ${result.data.name}...`\n });\n\n // Spawn background worker (dataDir defaults to project-local .bluera if undefined)\n spawnBackgroundWorker(job.id, options.dataDir ?? '');\n\n return {\n content: [\n {\n type: 'text',\n text: JSON.stringify({\n store: {\n id: result.data.id,\n name: result.data.name,\n type: result.data.type,\n path: 'path' in result.data ? result.data.path : undefined\n },\n job: {\n id: job.id,\n status: job.status,\n message: job.message\n },\n message: `Store created. 
Indexing started in background (Job ID: ${job.id})`\n }, null, 2)\n }\n ]\n };\n};\n\n/**\n * Handle index_store requests\n *\n * Re-indexes an existing store in the background.\n * Returns job ID for tracking progress.\n */\nexport const handleIndexStore: ToolHandler<IndexStoreArgs> = async (\n args,\n context\n): Promise<ToolResponse> => {\n // Validate arguments with Zod\n const validated = IndexStoreArgsSchema.parse(args);\n\n const { services, options } = context;\n\n const store = await services.store.getByIdOrName(createStoreId(validated.store));\n\n if (store === undefined) {\n throw new Error(`Store not found: ${validated.store}`);\n }\n\n // Create background job for indexing\n const jobService = new JobService(options.dataDir);\n const jobDetails: Record<string, unknown> = {\n storeName: store.name,\n storeId: store.id\n };\n if ('path' in store && store.path) {\n jobDetails['path'] = store.path;\n }\n const job = jobService.createJob({\n type: 'index',\n details: jobDetails,\n message: `Re-indexing ${store.name}...`\n });\n\n // Spawn background worker (dataDir defaults to project-local .bluera if undefined)\n spawnBackgroundWorker(job.id, options.dataDir ?? '');\n\n return {\n content: [\n {\n type: 'text',\n text: JSON.stringify({\n store: {\n id: store.id,\n name: store.name\n },\n job: {\n id: job.id,\n status: job.status,\n message: job.message\n },\n message: `Indexing started in background (Job ID: ${job.id})`\n }, null, 2)\n }\n ]\n };\n};\n\n/**\n * Handle delete_store requests\n *\n * Deletes a store and all associated data:\n * - Removes from store registry\n * - Drops LanceDB table\n * - For repo stores with URL, removes cloned files\n */\nexport const handleDeleteStore: ToolHandler<DeleteStoreArgs> = async (\n args,\n context\n): Promise<ToolResponse> => {\n // Validate arguments with Zod\n const validated = DeleteStoreArgsSchema.parse(args);\n\n const { services, options } = context;\n\n const store = await services.store.getByIdOrName(createStoreId(validated.store));\n\n if (store === undefined) {\n throw new Error(`Store not found: ${validated.store}`);\n }\n\n // Delete LanceDB table\n await services.lance.deleteStore(store.id);\n\n // For repo stores cloned from URL, remove the cloned directory\n if (store.type === 'repo' && 'url' in store && store.url !== undefined) {\n if (options.dataDir === undefined) {\n throw new Error('dataDir is required to delete cloned repository files');\n }\n const repoPath = join(options.dataDir, 'repos', store.id);\n await rm(repoPath, { recursive: true, force: true });\n }\n\n // Delete from registry\n const result = await services.store.delete(store.id);\n if (!result.success) {\n throw new Error(result.error.message);\n }\n\n return {\n content: [\n {\n type: 'text',\n text: JSON.stringify({\n deleted: true,\n store: {\n id: store.id,\n name: store.name,\n type: store.type\n },\n message: `Successfully deleted store: ${store.name}`\n }, null, 2)\n }\n ]\n };\n};\n","import { spawn } from 'child_process';\nimport { fileURLToPath } from 'url';\nimport path from 'path';\n\n/**\n * Spawn a background worker process to execute a job\n *\n * The worker runs detached from the parent process, allowing the\n * parent to exit while the worker continues running.\n *\n * @param jobId - The ID of the job to execute\n */\nexport function spawnBackgroundWorker(jobId: string, dataDir: string): void {\n // Determine the worker script path\n // In production, this will be the compiled dist file\n // In development, we need to use tsx to run 
TypeScript\n const __dirname = path.dirname(fileURLToPath(import.meta.url));\n\n // Check if we're running from dist (production) or src (development)\n const isProduction = __dirname.includes('/dist/');\n\n let command: string;\n let args: string[];\n\n if (isProduction) {\n // Production: Use Node.js directly with compiled file\n const workerScript = path.join(__dirname, 'background-worker-cli.js');\n command = process.execPath; // Use the same Node.js binary\n args = [workerScript, jobId];\n } else {\n // Development: Use tsx to run TypeScript directly\n const workerScript = path.join(__dirname, 'background-worker-cli.ts');\n command = 'npx';\n args = ['tsx', workerScript, jobId];\n }\n\n // Spawn the worker process\n const worker = spawn(command, args, {\n detached: true, // Detach from parent process\n stdio: 'ignore', // Don't pipe stdio (fully independent)\n env: {\n ...process.env, // Inherit environment variables\n BLUERA_DATA_DIR: dataDir // Pass dataDir to worker\n }\n });\n\n // Unref the worker so the parent can exit\n worker.unref();\n}\n","import { z } from 'zod';\nimport type { CommandDefinition } from './registry.js';\nimport type {\n CheckJobStatusArgs,\n ListJobsArgs,\n CancelJobArgs\n} from '../schemas/index.js';\nimport {\n handleCheckJobStatus,\n handleListJobs,\n handleCancelJob\n} from '../handlers/job.handler.js';\n\n/**\n * Job management commands for the execute meta-tool\n *\n * These commands wrap the existing job handlers, providing\n * a unified interface through the execute command.\n *\n * Note: Type assertions are necessary here because CommandHandler uses\n * Record<string, unknown> for generic command args, while handlers expect\n * specific typed args. Zod validates at runtime before the cast.\n */\n/* eslint-disable @typescript-eslint/consistent-type-assertions */\nexport const jobCommands: CommandDefinition[] = [\n {\n name: 'jobs',\n description: 'List all background jobs',\n argsSchema: z.object({\n activeOnly: z.boolean().optional().describe('Only show active jobs'),\n status: z.enum(['pending', 'running', 'completed', 'failed', 'cancelled'])\n .optional()\n .describe('Filter by job status')\n }),\n handler: (args, context) => handleListJobs(args as unknown as ListJobsArgs, context)\n },\n {\n name: 'job:status',\n description: 'Check the status of a specific background job',\n argsSchema: z.object({\n jobId: z.string().min(1).describe('Job ID to check')\n }),\n handler: (args, context) => handleCheckJobStatus(args as unknown as CheckJobStatusArgs, context)\n },\n {\n name: 'job:cancel',\n description: 'Cancel a running or pending background job',\n argsSchema: z.object({\n jobId: z.string().min(1).describe('Job ID to cancel')\n }),\n handler: (args, context) => handleCancelJob(args as unknown as CancelJobArgs, context)\n }\n];\n/* eslint-enable @typescript-eslint/consistent-type-assertions */\n","import type { ToolHandler, ToolResponse } from '../types.js';\nimport type {\n CheckJobStatusArgs,\n ListJobsArgs,\n CancelJobArgs\n} from '../schemas/index.js';\nimport {\n CheckJobStatusArgsSchema,\n ListJobsArgsSchema,\n CancelJobArgsSchema\n} from '../schemas/index.js';\nimport { JobService } from '../../services/job.service.js';\n\n/**\n * Handle check_job_status requests\n *\n * Retrieves the current status of a background job.\n */\nexport const handleCheckJobStatus: ToolHandler<CheckJobStatusArgs> = (\n args,\n context\n): Promise<ToolResponse> => {\n // Validate arguments with Zod\n const validated = CheckJobStatusArgsSchema.parse(args);\n\n 
const { options } = context;\n\n const jobService = new JobService(options.dataDir);\n const job = jobService.getJob(validated.jobId);\n\n if (!job) {\n throw new Error(`Job not found: ${validated.jobId}`);\n }\n\n return Promise.resolve({\n content: [\n {\n type: 'text',\n text: JSON.stringify(job, null, 2)\n }\n ]\n });\n};\n\n/**\n * Handle list_jobs requests\n *\n * Lists all jobs with optional filtering by status or active status.\n */\nexport const handleListJobs: ToolHandler<ListJobsArgs> = (\n args,\n context\n): Promise<ToolResponse> => {\n // Validate arguments with Zod\n const validated = ListJobsArgsSchema.parse(args);\n\n const { options } = context;\n\n const jobService = new JobService(options.dataDir);\n\n let jobs;\n if (validated.activeOnly === true) {\n jobs = jobService.listActiveJobs();\n } else if (validated.status !== undefined) {\n jobs = jobService.listJobs(validated.status);\n } else {\n jobs = jobService.listJobs();\n }\n\n return Promise.resolve({\n content: [\n {\n type: 'text',\n text: JSON.stringify({ jobs }, null, 2)\n }\n ]\n });\n};\n\n/**\n * Handle cancel_job requests\n *\n * Cancels a running or pending background job.\n * Kills the worker process if it exists.\n */\nexport const handleCancelJob: ToolHandler<CancelJobArgs> = (\n args,\n context\n): Promise<ToolResponse> => {\n // Validate arguments with Zod\n const validated = CancelJobArgsSchema.parse(args);\n\n const { options } = context;\n\n const jobService = new JobService(options.dataDir);\n const result = jobService.cancelJob(validated.jobId);\n\n if (!result.success) {\n throw new Error(result.error.message);\n }\n\n const job = jobService.getJob(validated.jobId);\n\n return Promise.resolve({\n content: [\n {\n type: 'text',\n text: JSON.stringify({\n success: true,\n job,\n message: 'Job cancelled successfully'\n }, null, 2)\n }\n ]\n });\n};\n","import { z } from 'zod';\nimport type { CommandDefinition } from './registry.js';\nimport type { ToolResponse } from '../types.js';\nimport { commandRegistry, generateHelp } from './registry.js';\n\n/**\n * Meta commands for introspection and help\n *\n * These commands provide self-documentation for the execute tool,\n * allowing users to discover available commands and their usage.\n */\nexport const metaCommands: CommandDefinition[] = [\n {\n name: 'commands',\n description: 'List all available commands',\n handler: (): Promise<ToolResponse> => {\n const commands = commandRegistry.all();\n const commandList = commands.map(cmd => ({\n name: cmd.name,\n description: cmd.description\n }));\n\n return Promise.resolve({\n content: [\n {\n type: 'text' as const,\n text: JSON.stringify({ commands: commandList }, null, 2)\n }\n ]\n });\n }\n },\n {\n name: 'help',\n description: 'Show help for a specific command or list all commands',\n argsSchema: z.object({\n command: z.string().optional().describe('Command name to get help for')\n }),\n handler: (args: Record<string, unknown>): Promise<ToolResponse> => {\n // eslint-disable-next-line @typescript-eslint/consistent-type-assertions\n const commandName = args['command'] as string | undefined;\n const helpText = generateHelp(commandName);\n\n return Promise.resolve({\n content: [\n {\n type: 'text' as const,\n text: helpText\n }\n ]\n });\n }\n }\n];\n","/**\n * Command registration for the execute meta-tool\n *\n * This module registers all commands with the command registry.\n * Import this module to ensure all commands are available.\n */\n\nimport { commandRegistry } from './registry.js';\nimport { 
storeCommands } from './store.commands.js';\nimport { jobCommands } from './job.commands.js';\nimport { metaCommands } from './meta.commands.js';\n\n// Register all commands\ncommandRegistry.registerAll(storeCommands);\ncommandRegistry.registerAll(jobCommands);\ncommandRegistry.registerAll(metaCommands);\n\n// Re-export for convenience\nexport { commandRegistry, executeCommand, generateHelp } from './registry.js';\nexport type { CommandDefinition, CommandHandler } from './registry.js';\n","import type { ToolHandler, ToolResponse } from '../types.js';\nimport type { ExecuteArgs } from '../schemas/index.js';\nimport { ExecuteArgsSchema } from '../schemas/index.js';\n// Import commands module to register all commands\nimport '../commands/index.js';\nimport { executeCommand } from '../commands/index.js';\n\n/**\n * Handle execute requests\n *\n * This is the meta-tool handler that routes to registered commands.\n * It consolidates store and job management into a single tool surface.\n */\nexport const handleExecute: ToolHandler<ExecuteArgs> = async (\n args,\n context\n): Promise<ToolResponse> => {\n // Validate arguments with Zod\n const validated = ExecuteArgsSchema.parse(args);\n\n const commandArgs = validated.args ?? {};\n\n return executeCommand(validated.command, commandArgs, context);\n};\n"],"mappings":";;;;;;;;;AAAA,SAAS,cAAc;AACvB,SAAS,4BAA4B;AACrC;AAAA,EACE;AAAA,EACA;AAAA,OACK;;;ACLP,SAAS,SAAS;AAgBX,IAAM,mBAAmB,EAAE,OAAO;AAAA,EACvC,OAAO,EAAE,OAAO,EAAE,IAAI,GAAG,kCAAkC;AAAA,EAC3D,QAAQ,EACL,KAAK,CAAC,gBAAgB,uBAAuB,cAAc,mBAAmB,oBAAoB,CAAC,EACnG,SAAS;AAAA,EACZ,QAAQ,EAAE,KAAK,CAAC,WAAW,cAAc,MAAM,CAAC,EAAE,QAAQ,SAAS;AAAA,EACnE,OAAO,EAAE,OAAO,EAAE,IAAI,EAAE,SAAS,EAAE,QAAQ,EAAE;AAAA,EAC7C,QAAQ,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE,SAAS;AACvC,CAAC;AAOM,IAAM,2BAA2B,EAAE,OAAO;AAAA,EAC/C,UAAU,EAAE,OAAO,EAAE,IAAI,GAAG,sCAAsC;AACpE,CAAC;AAWM,IAAM,uBAAuB,EAAE,OAAO;AAAA,EAC3C,MAAM,EAAE,KAAK,CAAC,QAAQ,QAAQ,KAAK,CAAC,EAAE,SAAS;AACjD,CAAC;AAOM,IAAM,yBAAyB,EAAE,OAAO;AAAA,EAC7C,OAAO,EAAE,OAAO,EAAE,IAAI,GAAG,6CAA6C;AACxE,CAAC;AAOM,IAAM,wBAAwB,EAAE,OAAO;AAAA,EAC5C,MAAM,EAAE,OAAO,EAAE,IAAI,GAAG,uCAAuC;AAAA,EAC/D,MAAM,EAAE,KAAK,CAAC,QAAQ,MAAM,CAAC;AAAA,EAC7B,QAAQ,EAAE,OAAO,EAAE,IAAI,GAAG,+CAA+C;AAAA,EACzE,QAAQ,EAAE,OAAO,EAAE,SAAS;AAAA,EAC5B,aAAa,EAAE,OAAO,EAAE,SAAS;AACnC,CAAC;AAOM,IAAM,uBAAuB,EAAE,OAAO;AAAA,EAC3C,OAAO,EAAE,OAAO,EAAE,IAAI,GAAG,6CAA6C;AACxE,CAAC;AAOM,IAAM,wBAAwB,EAAE,OAAO;AAAA,EAC5C,OAAO,EAAE,OAAO,EAAE,IAAI,GAAG,6CAA6C;AACxE,CAAC;AAWM,IAAM,2BAA2B,EAAE,OAAO;AAAA,EAC/C,OAAO,EAAE,OAAO,EAAE,IAAI,GAAG,mCAAmC;AAC9D,CAAC;AAOM,IAAM,qBAAqB,EAAE,OAAO;AAAA,EACzC,YAAY,EAAE,QAAQ,EAAE,SAAS;AAAA,EACjC,QAAQ,EAAE,KAAK,CAAC,WAAW,WAAW,aAAa,UAAU,WAAW,CAAC,EAAE,SAAS;AACtF,CAAC;AAOM,IAAM,sBAAsB,EAAE,OAAO;AAAA,EAC1C,OAAO,EAAE,OAAO,EAAE,IAAI,GAAG,mCAAmC;AAC9D,CAAC;AAcM,IAAM,oBAAoB,EAAE,OAAO;AAAA,EACxC,SAAS,EAAE,OAAO,EAAE,IAAI,GAAG,0BAA0B;AAAA,EACrD,MAAM,EAAE,OAAO,EAAE,OAAO,GAAG,EAAE,QAAQ,CAAC,EAAE,SAAS;AACnD,CAAC;;;AC9HM,IAAM,WAAN,MAAqB;AAAA,EACT,QAAQ,oBAAI,IAAU;AAAA,EACtB;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAOjB,YAAY,UAAkB,KAAM;AAClC,SAAK,UAAU;AAAA,EACjB;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAWA,IAAI,KAAQ,OAAgB;AAE1B,QAAI,KAAK,MAAM,IAAI,GAAG,GAAG;AACvB,WAAK,MAAM,OAAO,GAAG;AAAA,IACvB;AAGA,SAAK,MAAM,IAAI,KAAK,KAAK;AAGzB,QAAI,KAAK,MAAM,OAAO,KAAK,SAAS;AAClC,YAAM,WAAW,KAAK,MAAM,KAAK,EAAE,KAAK,EAAE;AAC1C,UAAI,aAAa,QAAW;AAC1B,aAAK,MAAM,OAAO,QAAQ;AAAA,MAC5B;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAUA,IAAI,KAAuB;AACzB,UAAM,QAAQ,KAAK,MAAM,IAAI,GAAG;AAEhC,QAAI,UAAU,QAAW;AAEvB,WAAK,MAAM,OAAO,GAAG;AACr
B,WAAK,MAAM,IAAI,KAAK,KAAK;AAAA,IAC3B;AAEA,WAAO;AAAA,EACT;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAQA,IAAI,KAAiB;AACnB,WAAO,KAAK,MAAM,IAAI,GAAG;AAAA,EAC3B;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAQA,OAAO,KAAiB;AACtB,WAAO,KAAK,MAAM,OAAO,GAAG;AAAA,EAC9B;AAAA;AAAA;AAAA;AAAA,EAKA,QAAc;AACZ,SAAK,MAAM,MAAM;AAAA,EACnB;AAAA;AAAA;AAAA;AAAA,EAKA,IAAI,OAAe;AACjB,WAAO,KAAK,MAAM;AAAA,EACpB;AACF;;;AC5FA,IAAM,kBAAkB;AAOjB,SAAS,eAAe,MAAsB;AACnD,MAAI,CAAC,KAAM,QAAO;AAClB,SAAO,KAAK,KAAK,KAAK,SAAS,eAAe;AAChD;AAOO,SAAS,iBAAiB,QAAwB;AACvD,MAAI,UAAU,KAAM;AAClB,WAAO,KAAK,SAAS,KAAM,QAAQ,CAAC,CAAC;AAAA,EACvC;AACA,SAAO,IAAI,OAAO,MAAM,CAAC;AAC3B;;;ACvBA,IAAM,SAAS,aAAa,YAAY;AAIjC,IAAM,cAAc,IAAI,SAAmC,GAAI;AAQ/D,IAAM,eAAwC,OACnD,MACA,YAC0B;AAE1B,QAAM,YAAY,iBAAiB,MAAM,IAAI;AAE7C,SAAO,KAAK;AAAA,IACV,OAAO,UAAU;AAAA,IACjB,QAAQ,UAAU;AAAA,IAClB,QAAQ,UAAU;AAAA,IAClB,OAAO,UAAU;AAAA,IACjB,QAAQ,UAAU;AAAA,EACpB,GAAG,gBAAgB;AAEnB,QAAM,EAAE,SAAS,IAAI;AAGrB,QAAM,WAAsB,UAAU,WAAW,SAC7C,MAAM,QAAQ,IAAI,UAAU,OAAO,IAAI,OAAO,MAAM;AAElD,UAAM,QAAQ,MAAM,SAAS,MAAM,cAAc,CAAY;AAC7D,QAAI,CAAC,OAAO;AACV,YAAM,IAAI,MAAM,oBAAoB,CAAC,EAAE;AAAA,IACzC;AACA,WAAO,MAAM;AAAA,EACf,CAAC,CAAC,KACD,MAAM,SAAS,MAAM,KAAK,GAAG,IAAI,OAAK,EAAE,EAAE;AAG/C,MAAI;AACF,eAAW,WAAW,UAAU;AAC9B,YAAM,SAAS,MAAM,WAAW,OAAO;AAAA,IACzC;AAAA,EACF,SAAS,OAAO;AACd,UAAM,IAAI;AAAA,MACR,uCAAuC,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK,CAAC;AAAA,IAC/F;AAAA,EACF;AAGA,QAAM,cAA2B;AAAA,IAC/B,OAAO,UAAU;AAAA,IACjB,QAAQ;AAAA,IACR,MAAM;AAAA,IACN,OAAO,UAAU;AAAA,IACjB,QAAQ,UAAU;AAAA,EACpB;AAEA,QAAM,UAAU,MAAM,SAAS,OAAO,OAAO,WAAW;AAGxD,aAAW,UAAU,QAAQ,SAAS;AACpC,gBAAY,IAAI,OAAO,IAAI,MAAM;AAAA,EACnC;AAGA,QAAM,kBAAkB,MAAM,QAAQ,IAAI,QAAQ,QAAQ,IAAI,OAAO,MAAM;AACzE,UAAM,UAAU,EAAE,SAAS;AAC3B,UAAM,QAAQ,MAAM,SAAS,MAAM,cAAc,OAAO;AAExD,WAAO;AAAA,MACL,IAAI,EAAE;AAAA,MACN,OAAO,EAAE;AAAA,MACT,SAAS;AAAA,QACP,GAAG,EAAE;AAAA,QACL,WAAW,OAAO;AAAA,QAClB,UAAU,UAAU,UAAa,MAAM,SAAS,SAAS,MAAM,OAAO;AAAA,MACxE;AAAA,MACA,SAAS,EAAE;AAAA,MACX,MAAM,EAAE;AAAA,IACV;AAAA,EACF,CAAC,CAAC;AAEF,QAAM,eAAe,KAAK,UAAU;AAAA,IAClC,SAAS;AAAA,IACT,cAAc,QAAQ;AAAA,IACtB,MAAM,QAAQ;AAAA,IACd,QAAQ,QAAQ;AAAA,EAClB,GAAG,MAAM,CAAC;AAGV,QAAM,iBAAiB,eAAe,YAAY;AAGlD,QAAM,SAAS,YAAY,UAAU,KAAK,gBAAgB,OAAO,QAAQ,YAAY,CAAC,MAAM,iBAAiB,cAAc,CAAC,aAAa,OAAO,QAAQ,MAAM,CAAC;AAAA;AAAA;AAG/J,SAAO,KAAK;AAAA,IACV,OAAO,UAAU;AAAA,IACjB,cAAc,QAAQ;AAAA,IACtB;AAAA,IACA,QAAQ,QAAQ;AAAA,IAChB,GAAG,iBAAiB,cAAc,gBAAgB,UAAU,KAAK;AAAA,EACnE,GAAG,+CAA+C;AAElD,SAAO;AAAA,IACL,SAAS;AAAA,MACP;AAAA,QACE,MAAM;AAAA,QACN,MAAM,SAAS;AAAA,MACjB;AAAA,IACF;AAAA,EACF;AACF;AAQO,IAAM,uBAAwD,OACnE,MACA,YAC0B;AAE1B,QAAM,YAAY,yBAAyB,MAAM,IAAI;AAErD,SAAO,KAAK,EAAE,UAAU,UAAU,SAAS,GAAG,4BAA4B;AAG1E,QAAM,WAAW,UAAU;AAG3B,QAAM,eAAe,YAAY,IAAI,QAAQ;AAE7C,MAAI,CAAC,cAAc;AACjB,UAAM,IAAI;AAAA,MACR,8BAA8B,QAAQ;AAAA,IACxC;AAAA,EACF;AAGA,MAAI,aAAa,MAAM;AACrB,UAAMA,gBAAe,KAAK,UAAU;AAAA,MAClC,IAAI,aAAa;AAAA,MACjB,OAAO,aAAa;AAAA,MACpB,SAAS,aAAa;AAAA,MACtB,SAAS,aAAa;AAAA,MACtB,MAAM,aAAa;AAAA,IACrB,GAAG,MAAM,CAAC;AAEV,WAAO,KAAK;AAAA,MACV;AAAA,MACA,QAAQ;AAAA,MACR,gBAAgB;AAAA,MAChB,GAAG,iBAAiBA,eAAc,oBAAoB,QAAQ;AAAA,IAChE,GAAG,mCAAmC;AAEtC,WAAO;AAAA,MACL,SAAS;AAAA,QACP;AAAA,UACE,MAAM;AAAA,UACN,MAAMA;AAAA,QACR;AAAA,MACF;AAAA,IACF;AAAA,EACF;AAGA,QAAM,EAAE,SAAS,IAAI;AACrB,QAAM,QAAQ,MAAM,SAAS,MAAM,cAAc,aAAa,SAAS,OAAO;AAE9E,MAAI,CAAC,OAAO;AACV,UAAM,IAAI,MAAM,oBAAoB,aAAa,SAAS,OAAO,EAAE;AAAA,EACrE;AAEA,QAAM,SAAS,MAAM,WAAW,MAAM,EAAE;AAExC,QAAM,cAA2B;AAAA,IAC/B,OAAO,aAAa,QAAQ,UAAU,GAAG,GAAG;AAAA;AAAA,IAC5C,QAAQ,CAAC,MAAM,EAAE;AAAA,IACjB,MAAM;AAAA,IACN,OAAO;AAAA,IACP,QAAQ;AAAA,EACV;AAEA,QAAM,UAAU,MAAM,SAAS,OAAO,OAAO,WAAW;AAGxD,QAAM,aAAa,QAAQ,QAAQ,KAAK,
OAAK,EAAE,OAAO,QAAQ;AAE9D,MAAI,CAAC,YAAY;AAEf,WAAO;AAAA,MACL,SAAS;AAAA,QACP;AAAA,UACE,MAAM;AAAA,UACN,MAAM,KAAK,UAAU;AAAA,YACnB,IAAI,aAAa;AAAA,YACjB,OAAO,aAAa;AAAA,YACpB,SAAS,aAAa;AAAA,YACtB,SAAS,aAAa;AAAA,YACtB,SAAS;AAAA,UACX,GAAG,MAAM,CAAC;AAAA,QACZ;AAAA,MACF;AAAA,IACF;AAAA,EACF;AAGA,cAAY,IAAI,UAAU,UAAU;AAEpC,QAAM,eAAe,KAAK,UAAU;AAAA,IAClC,IAAI,WAAW;AAAA,IACf,OAAO,WAAW;AAAA,IAClB,SAAS,WAAW;AAAA,IACpB,SAAS,WAAW;AAAA,IACpB,MAAM,WAAW;AAAA,EACnB,GAAG,MAAM,CAAC;AAEV,SAAO,KAAK;AAAA,IACV;AAAA,IACA,QAAQ;AAAA,IACR,gBAAgB;AAAA,IAChB,GAAG,iBAAiB,cAAc,oBAAoB,QAAQ;AAAA,EAChE,GAAG,qCAAqC;AAExC,SAAO;AAAA,IACL,SAAS;AAAA,MACP;AAAA,QACE,MAAM;AAAA,QACN,MAAM;AAAA,MACR;AAAA,IACF;AAAA,EACF;AACF;;;AC3NO,IAAM,QAA0B;AAAA,EACrC;AAAA,IACE,MAAM;AAAA,IACN,aAAa;AAAA,IACb,QAAQ;AAAA,IACR,SAAS;AAAA,EACX;AAAA,EACA;AAAA,IACE,MAAM;AAAA,IACN,aAAa;AAAA,IACb,QAAQ;AAAA,IACR,SAAS;AAAA,EACX;AACF;;;AC1CA,SAAS,KAAAC,UAAS;AAmClB,IAAM,kBAAN,MAAsB;AAAA,EACH,WAAW,oBAAI,IAA+B;AAAA;AAAA;AAAA;AAAA,EAK/D,SAAS,SAAkC;AACzC,QAAI,KAAK,SAAS,IAAI,QAAQ,IAAI,GAAG;AACnC,YAAM,IAAI,MAAM,+BAA+B,QAAQ,IAAI,EAAE;AAAA,IAC/D;AACA,SAAK,SAAS,IAAI,QAAQ,MAAM,OAAO;AAAA,EACzC;AAAA;AAAA;AAAA;AAAA,EAKA,YAAY,UAAqC;AAC/C,eAAW,WAAW,UAAU;AAC9B,WAAK,SAAS,OAAO;AAAA,IACvB;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,IAAI,MAA6C;AAC/C,WAAO,KAAK,SAAS,IAAI,IAAI;AAAA,EAC/B;AAAA;AAAA;AAAA;AAAA,EAKA,IAAI,MAAuB;AACzB,WAAO,KAAK,SAAS,IAAI,IAAI;AAAA,EAC/B;AAAA;AAAA;AAAA;AAAA,EAKA,MAA2B;AACzB,WAAO,MAAM,KAAK,KAAK,SAAS,OAAO,CAAC;AAAA,EAC1C;AAAA;AAAA;AAAA;AAAA,EAKA,UAA4C;AAC1C,UAAM,SAAS,oBAAI,IAAiC;AAEpD,eAAW,OAAO,KAAK,SAAS,OAAO,GAAG;AACxC,YAAM,aAAa,IAAI,KAAK,QAAQ,GAAG;AACvC,YAAM,WAAW,eAAe,KAAK,YAAY,IAAI,KAAK,MAAM,GAAG,UAAU;AAE7E,YAAM,WAAW,OAAO,IAAI,QAAQ,KAAK,CAAC;AAC1C,eAAS,KAAK,GAAG;AACjB,aAAO,IAAI,UAAU,QAAQ;AAAA,IAC/B;AAEA,WAAO;AAAA,EACT;AACF;AAGO,IAAM,kBAAkB,IAAI,gBAAgB;AAUnD,eAAsB,eACpB,aACA,MACA,SACuB;AACvB,QAAM,UAAU,gBAAgB,IAAI,WAAW;AAE/C,MAAI,YAAY,QAAW;AACzB,UAAM,IAAI;AAAA,MACR,oBAAoB,WAAW;AAAA,IACjC;AAAA,EACF;AAIA,QAAM,gBAAyC,QAAQ,eAAe,SACjE,QAAQ,WAAW,MAAM,IAAI,IAC9B;AAGJ,SAAO,QAAQ,QAAQ,eAAe,OAAO;AAC/C;AAKO,SAAS,aAAa,aAA8B;AACzD,MAAI,gBAAgB,QAAW;AAC7B,UAAM,UAAU,gBAAgB,IAAI,WAAW;AAC/C,QAAI,YAAY,QAAW;AACzB,YAAM,IAAI,MAAM,oBAAoB,WAAW,EAAE;AAAA,IACnD;AAEA,UAAMC,SAAQ;AAAA,MACZ,YAAY,QAAQ,IAAI;AAAA,MACxB,gBAAgB,QAAQ,WAAW;AAAA,MACnC;AAAA,IACF;AAEA,QAAI,QAAQ,eAAe,QAAW;AACpC,MAAAA,OAAM,KAAK,YAAY;AAEvB,YAAM,SAAS,QAAQ;AACvB,UAAI,kBAAkBD,GAAE,WAAW;AAEjC,cAAM,QAAQ,OAAO;AACrB,mBAAW,CAAC,KAAK,WAAW,KAAK,OAAO,QAAQ,KAAK,GAAG;AACtD,gBAAM,aAAa,YAAY,UAAU,MAAS,EAAE;AACpD,gBAAM,OAAO,YAAY,eAAe;AACxC,UAAAC,OAAM,KAAK,KAAK,GAAG,GAAG,aAAa,gBAAgB,EAAE,KAAK,IAAI,EAAE;AAAA,QAClE;AAAA,MACF;AAAA,IACF,OAAO;AACL,MAAAA,OAAM,KAAK,iBAAiB;AAAA,IAC9B;AAEA,WAAOA,OAAM,KAAK,IAAI;AAAA,EACxB;AAGA,QAAM,SAAS,gBAAgB,QAAQ;AACvC,QAAM,QAAQ,CAAC,uBAAuB,EAAE;AAExC,aAAW,CAAC,UAAU,QAAQ,KAAK,QAAQ;AACzC,UAAM,KAAK,GAAG,QAAQ,GAAG;AACzB,eAAW,OAAO,UAAU;AAC1B,YAAM,KAAK,KAAK,IAAI,IAAI,MAAM,IAAI,WAAW,EAAE;AAAA,IACjD;AACA,UAAM,KAAK,EAAE;AAAA,EACf;AAEA,QAAM,KAAK,mEAAmE;AAE9E,SAAO,MAAM,KAAK,IAAI;AACxB;;;ACtLA,SAAS,KAAAC,UAAS;;;ACAlB,SAAS,UAAU;AACnB,SAAS,YAAY;;;ACDrB,SAAS,aAAa;AACtB,SAAS,qBAAqB;AAC9B,OAAO,UAAU;AAUV,SAAS,sBAAsB,OAAe,SAAuB;AAI1E,QAAMC,aAAY,KAAK,QAAQ,cAAc,YAAY,GAAG,CAAC;AAG7D,QAAM,eAAeA,WAAU,SAAS,QAAQ;AAEhD,MAAI;AACJ,MAAI;AAEJ,MAAI,cAAc;AAEhB,UAAM,eAAe,KAAK,KAAKA,YAAW,0BAA0B;AACpE,cAAU,QAAQ;AAClB,WAAO,CAAC,cAAc,KAAK;AAAA,EAC7B,OAAO;AAEL,UAAM,eAAe,KAAK,KAAKA,YAAW,0BAA0B;AACpE,cAAU;AACV,WAAO,CAAC,OAAO,cAAc,KAAK;AAAA,EACpC;AAGA,QAAM,SAAS,MAAM,SAAS,MAAM;AAAA,IAClC,UAAU;AAAA;AAAA,IACV,OAAO;AAAA;AAAA,IACP,KAAK;AAAA,MACH,GAAG,QAAQ;AAAA;AAAA,MACX,iB
AAiB;AAAA;AAAA,IACnB;AAAA,EACF,CAAC;AAGD,SAAO,MAAM;AACf;;;ADtBO,IAAM,mBAAgD,OAC3D,MACA,YAC0B;AAE1B,QAAM,YAAY,qBAAqB,MAAM,IAAI;AAEjD,QAAM,EAAE,SAAS,IAAI;AAErB,QAAM,SAAS,MAAM,SAAS,MAAM,KAAK;AACzC,QAAM,WAAW,UAAU,SAAS,SAChC,OAAO,OAAO,OAAK,EAAE,SAAS,UAAU,IAAI,IAC5C;AAEJ,SAAO;AAAA,IACL,SAAS;AAAA,MACP;AAAA,QACE,MAAM;AAAA,QACN,MAAM,KAAK,UAAU;AAAA,UACnB,QAAQ,SAAS,IAAI,QAAM;AAAA,YACzB,IAAI,EAAE;AAAA,YACN,MAAM,EAAE;AAAA,YACR,MAAM,EAAE;AAAA,YACR,MAAM,UAAU,IAAI,EAAE,OAAO;AAAA,YAC7B,KAAK,SAAS,KAAK,EAAE,QAAQ,SAAY,EAAE,MAAM;AAAA,YACjD,aAAa,EAAE;AAAA,YACf,WAAW,EAAE,UAAU,YAAY;AAAA,UACrC,EAAE;AAAA,QACJ,GAAG,MAAM,CAAC;AAAA,MACZ;AAAA,IACF;AAAA,EACF;AACF;AAOO,IAAM,qBAAoD,OAC/D,MACA,YAC0B;AAE1B,QAAM,YAAY,uBAAuB,MAAM,IAAI;AAEnD,QAAM,EAAE,SAAS,IAAI;AAErB,QAAM,QAAQ,MAAM,SAAS,MAAM,cAAc,cAAc,UAAU,KAAK,CAAC;AAE/E,MAAI,UAAU,QAAW;AACvB,UAAM,IAAI,MAAM,oBAAoB,UAAU,KAAK,EAAE;AAAA,EACvD;AAEA,SAAO;AAAA,IACL,SAAS;AAAA,MACP;AAAA,QACE,MAAM;AAAA,QACN,MAAM,KAAK,UAAU;AAAA,UACnB,IAAI,MAAM;AAAA,UACV,MAAM,MAAM;AAAA,UACZ,MAAM,MAAM;AAAA,UACZ,MAAM,UAAU,QAAQ,MAAM,OAAO;AAAA,UACrC,KAAK,SAAS,SAAS,MAAM,QAAQ,SAAY,MAAM,MAAM;AAAA,UAC7D,QAAQ,YAAY,QAAQ,MAAM,SAAS;AAAA,UAC3C,aAAa,MAAM;AAAA,UACnB,QAAQ,MAAM;AAAA,UACd,WAAW,MAAM,UAAU,YAAY;AAAA,UACvC,WAAW,MAAM,UAAU,YAAY;AAAA,QACzC,GAAG,MAAM,CAAC;AAAA,MACZ;AAAA,IACF;AAAA,EACF;AACF;AAQO,IAAM,oBAAkD,OAC7D,MACA,YAC0B;AAE1B,QAAM,YAAY,sBAAsB,MAAM,IAAI;AAElD,QAAM,EAAE,UAAU,QAAQ,IAAI;AAG9B,QAAM,QAAQ,UAAU,OAAO,WAAW,SAAS,KACrC,UAAU,OAAO,WAAW,UAAU,KACtC,UAAU,OAAO,WAAW,MAAM;AAEhD,QAAM,SAAS,MAAM,SAAS,MAAM,OAAO;AAAA,IACzC,MAAM,UAAU;AAAA,IAChB,MAAM,UAAU;AAAA,IAChB,GAAI,QAAQ,EAAE,KAAK,UAAU,OAAO,IAAI,EAAE,MAAM,UAAU,OAAO;AAAA,IACjE,GAAI,UAAU,WAAW,SAAY,EAAE,QAAQ,UAAU,OAAO,IAAI,CAAC;AAAA,IACrE,GAAI,UAAU,gBAAgB,SAAY,EAAE,aAAa,UAAU,YAAY,IAAI,CAAC;AAAA,EACtF,CAAC;AAED,MAAI,CAAC,OAAO,SAAS;AACnB,UAAM,IAAI,MAAM,OAAO,MAAM,OAAO;AAAA,EACtC;AAGA,QAAM,aAAa,IAAI,WAAW,QAAQ,OAAO;AACjD,QAAM,aAAsC;AAAA,IAC1C,WAAW,OAAO,KAAK;AAAA,IACvB,SAAS,OAAO,KAAK;AAAA,EACvB;AACA,MAAI,OAAO;AACT,eAAW,KAAK,IAAI,UAAU;AAAA,EAChC;AACA,MAAI,UAAU,OAAO,QAAQ,OAAO,KAAK,MAAM;AAC7C,eAAW,MAAM,IAAI,OAAO,KAAK;AAAA,EACnC;AACA,QAAM,MAAM,WAAW,UAAU;AAAA,IAC/B,MAAM,UAAU,SAAS,UAAU,QAAQ,UAAU;AAAA,IACrD,SAAS;AAAA,IACT,SAAS,YAAY,OAAO,KAAK,IAAI;AAAA,EACvC,CAAC;AAGD,wBAAsB,IAAI,IAAI,QAAQ,WAAW,EAAE;AAEnD,SAAO;AAAA,IACL,SAAS;AAAA,MACP;AAAA,QACE,MAAM;AAAA,QACN,MAAM,KAAK,UAAU;AAAA,UACnB,OAAO;AAAA,YACL,IAAI,OAAO,KAAK;AAAA,YAChB,MAAM,OAAO,KAAK;AAAA,YAClB,MAAM,OAAO,KAAK;AAAA,YAClB,MAAM,UAAU,OAAO,OAAO,OAAO,KAAK,OAAO;AAAA,UACnD;AAAA,UACA,KAAK;AAAA,YACH,IAAI,IAAI;AAAA,YACR,QAAQ,IAAI;AAAA,YACZ,SAAS,IAAI;AAAA,UACf;AAAA,UACA,SAAS,0DAA0D,IAAI,EAAE;AAAA,QAC3E,GAAG,MAAM,CAAC;AAAA,MACZ;AAAA,IACF;AAAA,EACF;AACF;AAQO,IAAM,mBAAgD,OAC3D,MACA,YAC0B;AAE1B,QAAM,YAAY,qBAAqB,MAAM,IAAI;AAEjD,QAAM,EAAE,UAAU,QAAQ,IAAI;AAE9B,QAAM,QAAQ,MAAM,SAAS,MAAM,cAAc,cAAc,UAAU,KAAK,CAAC;AAE/E,MAAI,UAAU,QAAW;AACvB,UAAM,IAAI,MAAM,oBAAoB,UAAU,KAAK,EAAE;AAAA,EACvD;AAGA,QAAM,aAAa,IAAI,WAAW,QAAQ,OAAO;AACjD,QAAM,aAAsC;AAAA,IAC1C,WAAW,MAAM;AAAA,IACjB,SAAS,MAAM;AAAA,EACjB;AACA,MAAI,UAAU,SAAS,MAAM,MAAM;AACjC,eAAW,MAAM,IAAI,MAAM;AAAA,EAC7B;AACA,QAAM,MAAM,WAAW,UAAU;AAAA,IAC/B,MAAM;AAAA,IACN,SAAS;AAAA,IACT,SAAS,eAAe,MAAM,IAAI;AAAA,EACpC,CAAC;AAGD,wBAAsB,IAAI,IAAI,QAAQ,WAAW,EAAE;AAEnD,SAAO;AAAA,IACL,SAAS;AAAA,MACP;AAAA,QACE,MAAM;AAAA,QACN,MAAM,KAAK,UAAU;AAAA,UACnB,OAAO;AAAA,YACL,IAAI,MAAM;AAAA,YACV,MAAM,MAAM;AAAA,UACd;AAAA,UACA,KAAK;AAAA,YACH,IAAI,IAAI;AAAA,YACR,QAAQ,IAAI;AAAA,YACZ,SAAS,IAAI;AAAA,UACf;AAAA,UACA,SAAS,2CAA2C,IAAI,EAAE;AAAA,QAC5D,GAAG,MAAM,CAAC;AAAA,MACZ;AAAA,IACF;AAAA,EACF;AACF;AAUO,IAAM,oBAAkD,OAC7D,MACA,YAC0B;AAE1B,QAAM
,YAAY,sBAAsB,MAAM,IAAI;AAElD,QAAM,EAAE,UAAU,QAAQ,IAAI;AAE9B,QAAM,QAAQ,MAAM,SAAS,MAAM,cAAc,cAAc,UAAU,KAAK,CAAC;AAE/E,MAAI,UAAU,QAAW;AACvB,UAAM,IAAI,MAAM,oBAAoB,UAAU,KAAK,EAAE;AAAA,EACvD;AAGA,QAAM,SAAS,MAAM,YAAY,MAAM,EAAE;AAGzC,MAAI,MAAM,SAAS,UAAU,SAAS,SAAS,MAAM,QAAQ,QAAW;AACtE,QAAI,QAAQ,YAAY,QAAW;AACjC,YAAM,IAAI,MAAM,uDAAuD;AAAA,IACzE;AACA,UAAM,WAAW,KAAK,QAAQ,SAAS,SAAS,MAAM,EAAE;AACxD,UAAM,GAAG,UAAU,EAAE,WAAW,MAAM,OAAO,KAAK,CAAC;AAAA,EACrD;AAGA,QAAM,SAAS,MAAM,SAAS,MAAM,OAAO,MAAM,EAAE;AACnD,MAAI,CAAC,OAAO,SAAS;AACnB,UAAM,IAAI,MAAM,OAAO,MAAM,OAAO;AAAA,EACtC;AAEA,SAAO;AAAA,IACL,SAAS;AAAA,MACP;AAAA,QACE,MAAM;AAAA,QACN,MAAM,KAAK,UAAU;AAAA,UACnB,SAAS;AAAA,UACT,OAAO;AAAA,YACL,IAAI,MAAM;AAAA,YACV,MAAM,MAAM;AAAA,YACZ,MAAM,MAAM;AAAA,UACd;AAAA,UACA,SAAS,+BAA+B,MAAM,IAAI;AAAA,QACpD,GAAG,MAAM,CAAC;AAAA,MACZ;AAAA,IACF;AAAA,EACF;AACF;;;AD1QO,IAAM,gBAAqC;AAAA,EAChD;AAAA,IACE,MAAM;AAAA,IACN,aAAa;AAAA,IACb,YAAYC,GAAE,OAAO;AAAA,MACnB,MAAMA,GAAE,KAAK,CAAC,QAAQ,QAAQ,KAAK,CAAC,EAAE,SAAS,EAAE,SAAS,sBAAsB;AAAA,IAClF,CAAC;AAAA,IACD,SAAS,CAAC,MAAM,YAAY,iBAAiB,MAAmC,OAAO;AAAA,EACzF;AAAA,EACA;AAAA,IACE,MAAM;AAAA,IACN,aAAa;AAAA,IACb,YAAYA,GAAE,OAAO;AAAA,MACnB,OAAOA,GAAE,OAAO,EAAE,IAAI,CAAC,EAAE,SAAS,kBAAkB;AAAA,IACtD,CAAC;AAAA,IACD,SAAS,CAAC,MAAM,YAAY,mBAAmB,MAAqC,OAAO;AAAA,EAC7F;AAAA,EACA;AAAA,IACE,MAAM;AAAA,IACN,aAAa;AAAA,IACb,YAAYA,GAAE,OAAO;AAAA,MACnB,MAAMA,GAAE,OAAO,EAAE,IAAI,CAAC,EAAE,SAAS,YAAY;AAAA,MAC7C,MAAMA,GAAE,KAAK,CAAC,QAAQ,MAAM,CAAC,EAAE,SAAS,YAAY;AAAA,MACpD,QAAQA,GAAE,OAAO,EAAE,IAAI,CAAC,EAAE,SAAS,uBAAuB;AAAA,MAC1D,QAAQA,GAAE,OAAO,EAAE,SAAS,EAAE,SAAS,4BAA4B;AAAA,MACnE,aAAaA,GAAE,OAAO,EAAE,SAAS,EAAE,SAAS,mBAAmB;AAAA,IACjE,CAAC;AAAA,IACD,SAAS,CAAC,MAAM,YAAY,kBAAkB,MAAoC,OAAO;AAAA,EAC3F;AAAA,EACA;AAAA,IACE,MAAM;AAAA,IACN,aAAa;AAAA,IACb,YAAYA,GAAE,OAAO;AAAA,MACnB,OAAOA,GAAE,OAAO,EAAE,IAAI,CAAC,EAAE,SAAS,kBAAkB;AAAA,IACtD,CAAC;AAAA,IACD,SAAS,CAAC,MAAM,YAAY,iBAAiB,MAAmC,OAAO;AAAA,EACzF;AAAA,EACA;AAAA,IACE,MAAM;AAAA,IACN,aAAa;AAAA,IACb,YAAYA,GAAE,OAAO;AAAA,MACnB,OAAOA,GAAE,OAAO,EAAE,IAAI,CAAC,EAAE,SAAS,kBAAkB;AAAA,IACtD,CAAC;AAAA,IACD,SAAS,CAAC,MAAM,YAAY,kBAAkB,MAAoC,OAAO;AAAA,EAC3F;AACF;;;AGzEA,SAAS,KAAAC,UAAS;;;ACkBX,IAAM,uBAAwD,CACnE,MACA,YAC0B;AAE1B,QAAM,YAAY,yBAAyB,MAAM,IAAI;AAErD,QAAM,EAAE,QAAQ,IAAI;AAEpB,QAAM,aAAa,IAAI,WAAW,QAAQ,OAAO;AACjD,QAAM,MAAM,WAAW,OAAO,UAAU,KAAK;AAE7C,MAAI,CAAC,KAAK;AACR,UAAM,IAAI,MAAM,kBAAkB,UAAU,KAAK,EAAE;AAAA,EACrD;AAEA,SAAO,QAAQ,QAAQ;AAAA,IACrB,SAAS;AAAA,MACP;AAAA,QACE,MAAM;AAAA,QACN,MAAM,KAAK,UAAU,KAAK,MAAM,CAAC;AAAA,MACnC;AAAA,IACF;AAAA,EACF,CAAC;AACH;AAOO,IAAM,iBAA4C,CACvD,MACA,YAC0B;AAE1B,QAAM,YAAY,mBAAmB,MAAM,IAAI;AAE/C,QAAM,EAAE,QAAQ,IAAI;AAEpB,QAAM,aAAa,IAAI,WAAW,QAAQ,OAAO;AAEjD,MAAI;AACJ,MAAI,UAAU,eAAe,MAAM;AACjC,WAAO,WAAW,eAAe;AAAA,EACnC,WAAW,UAAU,WAAW,QAAW;AACzC,WAAO,WAAW,SAAS,UAAU,MAAM;AAAA,EAC7C,OAAO;AACL,WAAO,WAAW,SAAS;AAAA,EAC7B;AAEA,SAAO,QAAQ,QAAQ;AAAA,IACrB,SAAS;AAAA,MACP;AAAA,QACE,MAAM;AAAA,QACN,MAAM,KAAK,UAAU,EAAE,KAAK,GAAG,MAAM,CAAC;AAAA,MACxC;AAAA,IACF;AAAA,EACF,CAAC;AACH;AAQO,IAAM,kBAA8C,CACzD,MACA,YAC0B;AAE1B,QAAM,YAAY,oBAAoB,MAAM,IAAI;AAEhD,QAAM,EAAE,QAAQ,IAAI;AAEpB,QAAM,aAAa,IAAI,WAAW,QAAQ,OAAO;AACjD,QAAM,SAAS,WAAW,UAAU,UAAU,KAAK;AAEnD,MAAI,CAAC,OAAO,SAAS;AACnB,UAAM,IAAI,MAAM,OAAO,MAAM,OAAO;AAAA,EACtC;AAEA,QAAM,MAAM,WAAW,OAAO,UAAU,KAAK;AAE7C,SAAO,QAAQ,QAAQ;AAAA,IACrB,SAAS;AAAA,MACP;AAAA,QACE,MAAM;AAAA,QACN,MAAM,KAAK,UAAU;AAAA,UACnB,SAAS;AAAA,UACT;AAAA,UACA,SAAS;AAAA,QACX,GAAG,MAAM,CAAC;AAAA,MACZ;AAAA,IACF;AAAA,EACF,CAAC;AACH;;;AD3FO,IAAM,cAAmC;AAAA,EAC9C;AAAA,IACE,MAAM;AAAA,IACN,aAAa;AAAA,IACb,YAAYC,GAAE,OAAO;AAAA,MACnB,YAAYA,GAAE,QAAQ,EAAE,SAAS,EAAE,SAA
S,uBAAuB;AAAA,MACnE,QAAQA,GAAE,KAAK,CAAC,WAAW,WAAW,aAAa,UAAU,WAAW,CAAC,EACtE,SAAS,EACT,SAAS,sBAAsB;AAAA,IACpC,CAAC;AAAA,IACD,SAAS,CAAC,MAAM,YAAY,eAAe,MAAiC,OAAO;AAAA,EACrF;AAAA,EACA;AAAA,IACE,MAAM;AAAA,IACN,aAAa;AAAA,IACb,YAAYA,GAAE,OAAO;AAAA,MACnB,OAAOA,GAAE,OAAO,EAAE,IAAI,CAAC,EAAE,SAAS,iBAAiB;AAAA,IACrD,CAAC;AAAA,IACD,SAAS,CAAC,MAAM,YAAY,qBAAqB,MAAuC,OAAO;AAAA,EACjG;AAAA,EACA;AAAA,IACE,MAAM;AAAA,IACN,aAAa;AAAA,IACb,YAAYA,GAAE,OAAO;AAAA,MACnB,OAAOA,GAAE,OAAO,EAAE,IAAI,CAAC,EAAE,SAAS,kBAAkB;AAAA,IACtD,CAAC;AAAA,IACD,SAAS,CAAC,MAAM,YAAY,gBAAgB,MAAkC,OAAO;AAAA,EACvF;AACF;;;AEpDA,SAAS,KAAAC,UAAS;AAWX,IAAM,eAAoC;AAAA,EAC/C;AAAA,IACE,MAAM;AAAA,IACN,aAAa;AAAA,IACb,SAAS,MAA6B;AACpC,YAAM,WAAW,gBAAgB,IAAI;AACrC,YAAM,cAAc,SAAS,IAAI,UAAQ;AAAA,QACvC,MAAM,IAAI;AAAA,QACV,aAAa,IAAI;AAAA,MACnB,EAAE;AAEF,aAAO,QAAQ,QAAQ;AAAA,QACrB,SAAS;AAAA,UACP;AAAA,YACE,MAAM;AAAA,YACN,MAAM,KAAK,UAAU,EAAE,UAAU,YAAY,GAAG,MAAM,CAAC;AAAA,UACzD;AAAA,QACF;AAAA,MACF,CAAC;AAAA,IACH;AAAA,EACF;AAAA,EACA;AAAA,IACE,MAAM;AAAA,IACN,aAAa;AAAA,IACb,YAAYC,GAAE,OAAO;AAAA,MACnB,SAASA,GAAE,OAAO,EAAE,SAAS,EAAE,SAAS,8BAA8B;AAAA,IACxE,CAAC;AAAA,IACD,SAAS,CAAC,SAAyD;AAEjE,YAAM,cAAc,KAAK,SAAS;AAClC,YAAM,WAAW,aAAa,WAAW;AAEzC,aAAO,QAAQ,QAAQ;AAAA,QACrB,SAAS;AAAA,UACP;AAAA,YACE,MAAM;AAAA,YACN,MAAM;AAAA,UACR;AAAA,QACF;AAAA,MACF,CAAC;AAAA,IACH;AAAA,EACF;AACF;;;ACxCA,gBAAgB,YAAY,aAAa;AACzC,gBAAgB,YAAY,WAAW;AACvC,gBAAgB,YAAY,YAAY;;;ACFjC,IAAM,gBAA0C,OACrD,MACA,YAC0B;AAE1B,QAAM,YAAY,kBAAkB,MAAM,IAAI;AAE9C,QAAM,cAAc,UAAU,QAAQ,CAAC;AAEvC,SAAO,eAAe,UAAU,SAAS,aAAa,OAAO;AAC/D;;;AdVA,IAAMC,UAAS,aAAa,YAAY;AAGjC,SAAS,gBAAgB,SAAmC;AAEjE,QAAM,SAAS,IAAI;AAAA,IACjB;AAAA,MACE,MAAM;AAAA,MACN,SAAS;AAAA,IACX;AAAA,IACA;AAAA,MACE,cAAc;AAAA,QACZ,OAAO,CAAC;AAAA,MACV;AAAA,IACF;AAAA,EACF;AAGA,SAAO,kBAAkB,wBAAwB,MAAM;AACrD,WAAO,QAAQ,QAAQ;AAAA,MACrB,OAAO;AAAA;AAAA,QAEL;AAAA,UACE,MAAM;AAAA,UACN,aAAa;AAAA,UACb,aAAa;AAAA,YACX,MAAM;AAAA,YACN,YAAY;AAAA,cACV,OAAO;AAAA,gBACL,MAAM;AAAA,gBACN,aAAa;AAAA,cACf;AAAA,cACA,QAAQ;AAAA,gBACN,MAAM;AAAA,gBACN,MAAM,CAAC,gBAAgB,uBAAuB,cAAc,mBAAmB,oBAAoB;AAAA,gBACnG,aAAa;AAAA,cACf;AAAA,cACA,QAAQ;AAAA,gBACN,MAAM;AAAA,gBACN,MAAM,CAAC,WAAW,cAAc,MAAM;AAAA,gBACtC,SAAS;AAAA,gBACT,aAAa;AAAA,cACf;AAAA,cACA,OAAO;AAAA,gBACL,MAAM;AAAA,gBACN,SAAS;AAAA,gBACT,aAAa;AAAA,cACf;AAAA,cACA,QAAQ;AAAA,gBACN,MAAM;AAAA,gBACN,OAAO,EAAE,MAAM,SAAS;AAAA,gBACxB,aAAa;AAAA,cACf;AAAA,YACF;AAAA,YACA,UAAU,CAAC,OAAO;AAAA,UACpB;AAAA,QACF;AAAA;AAAA,QAEA;AAAA,UACE,MAAM;AAAA,UACN,aAAa;AAAA,UACb,aAAa;AAAA,YACX,MAAM;AAAA,YACN,YAAY;AAAA,cACV,UAAU;AAAA,gBACR,MAAM;AAAA,gBACN,aAAa;AAAA,cACf;AAAA,YACF;AAAA,YACA,UAAU,CAAC,UAAU;AAAA,UACvB;AAAA,QACF;AAAA;AAAA,QAEA;AAAA,UACE,MAAM;AAAA,UACN,aAAa;AAAA,UACb,aAAa;AAAA,YACX,MAAM;AAAA,YACN,YAAY;AAAA,cACV,SAAS;AAAA,gBACP,MAAM;AAAA,gBACN,aAAa;AAAA,cACf;AAAA,cACA,MAAM;AAAA,gBACJ,MAAM;AAAA,gBACN,aAAa;AAAA,cACf;AAAA,YACF;AAAA,YACA,UAAU,CAAC,SAAS;AAAA,UACtB;AAAA,QACF;AAAA,MACF;AAAA,IACF,CAAC;AAAA,EACH,CAAC;AAGD,SAAO,kBAAkB,uBAAuB,OAAO,YAAY;AACjE,UAAM,EAAE,MAAM,WAAW,KAAK,IAAI,QAAQ;AAC1C,UAAM,YAAY,KAAK,IAAI;AAE3B,IAAAA,QAAO,KAAK,EAAE,MAAM,MAAM,MAAM,KAAK,UAAU,IAAI,EAAE,GAAG,cAAc;AAGtE,UAAM,WAAW,MAAM;AAAA,MACrB,QAAQ;AAAA,MACR,QAAQ;AAAA,MACR,QAAQ;AAAA,IACV;AACA,UAAM,UAAU,EAAE,UAAU,QAAQ;AAEpC,QAAI;AACF,UAAI;AAGJ,UAAI,SAAS,WAAW;AACtB,cAAM,YAAY,kBAAkB,MAAM,QAAQ,CAAC,CAAC;AACpD,iBAAS,MAAM,cAAc,WAAW,OAAO;AAAA,MACjD,OAAO;AAEL,cAAM,OAAO,MAAM,KAAK,OAAK,EAAE,SAAS,IAAI;AAC5C,YAAI,SAAS,QAAW;AACtB,gBAAM,IAAI,MAAM,iBAAiB,IAAI,EAAE;AAAA,QACzC;AAGA,cAAM,YAAY,KAAK,OAAO,MAAM,QAAQ,CAAC,CAAC;AAG9C,iBAAS,MAAM,KAAK,QAAQ,WAAW,OAAO;AAAA,MAChD;AAEA,YAAM,aAAa,KAAK,IAAI,IAAI;AAChC,MAAAA,QAAO,
KAAK,EAAE,MAAM,MAAM,WAAW,GAAG,gBAAgB;AAExD,aAAO;AAAA,IACT,SAAS,OAAO;AACd,YAAM,aAAa,KAAK,IAAI,IAAI;AAChC,MAAAA,QAAO,MAAM;AAAA,QACX,MAAM;AAAA,QACN;AAAA,QACA,OAAO,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK;AAAA,MAC9D,GAAG,uBAAuB;AAC1B,YAAM;AAAA,IACR;AAAA,EACF,CAAC;AAED,SAAO;AACT;AAEA,eAAsB,aAAa,SAA0C;AAC3E,EAAAA,QAAO,KAAK;AAAA,IACV,SAAS,QAAQ;AAAA,IACjB,aAAa,QAAQ;AAAA,EACvB,GAAG,qBAAqB;AAExB,QAAM,SAAS,gBAAgB,OAAO;AACtC,QAAM,YAAY,IAAI,qBAAqB;AAC3C,QAAM,OAAO,QAAQ,SAAS;AAE9B,EAAAA,QAAO,KAAK,yCAAyC;AACvD;AAIA,IAAM,aAAa,QAAQ,KAAK,CAAC,KAAK;AACtC,IAAM,mBAAmB,WAAW,SAAS,eAAe,KAAK,WAAW,SAAS,YAAY;AAEjG,IAAI,kBAAkB;AACpB,eAAa;AAAA,IACX,SAAS,QAAQ,IAAI,UAAU;AAAA,IAC/B,QAAQ,QAAQ,IAAI,aAAa;AAAA,IACjC,aAAa,QAAQ,IAAI,cAAc,KAAK,QAAQ,IAAI,KAAK;AAAA,EAC/D,CAAC,EAAE,MAAM,CAAC,UAAmB;AAC3B,IAAAA,QAAO,MAAM,EAAE,OAAO,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK,EAAE,GAAG,4BAA4B;AAC5G,YAAQ,KAAK,CAAC;AAAA,EAChB,CAAC;AACH;","names":["responseJson","z","lines","z","__dirname","z","z","z","z","z","logger"]}
@@ -1,7 +0,0 @@
- import {
- WatchService
- } from "./chunk-L2YVNC63.js";
- export {
- WatchService
- };
- //# sourceMappingURL=watch.service-YAIKKDCF.js.map
@@ -1,77 +0,0 @@
- ---
- name: atomic-commits
- description: Methodology for creating atomic git commits grouped by logical features. Use when committing changes, organizing commits, or deciding how to group changes into commits.
- ---
-
- # Atomic Commit Methodology
-
- ## Grouping Rules
-
- **Group together:**
- - Feature code + its documentation
- - Feature code + its tests
- - Feature code + its configuration
- - API changes + API documentation
-
- **Keep separate:**
- - Unrelated features
- - Independent bug fixes
- - Standalone doc improvements
- - Pure refactoring
-
- ## Commit Message Format
-
- ```
- <type>(<scope>): <description>
-
- [optional body explaining why]
- ```
-
- **Types:** `feat`, `fix`, `docs`, `refactor`, `test`, `chore`, `perf`
-
- ## Workflow
-
- ### 1. Analyze
- ```bash
- git diff HEAD # All changes
- git diff --cached # Staged only
- git ls-files --others --exclude-standard # Untracked
- ```
-
- ### 2. Evaluate README Impact
- **STOP and evaluate each row - does this change warrant README updates?**
-
- | Category | Question | If YES, update |
- |----------|----------|----------------|
- | Features | New CLI command, Skill, MCP tool, or capability? | Features/Commands section |
- | Setup | Changed install steps, settings, or data paths? | Setup/Installation section |
- | Scripts | New/changed npm scripts? | package.json scripts table |
- | Behavior | Changed how existing features work? | Relevant feature section |
- | Deps | New package or common error scenario? | Technologies/Troubleshooting |
-
- **If any YES**: Update the README appropriately and include it in this commit (feature + docs = atomic unit)
-
- ### 3. Group by Feature
- Each commit = one complete, testable change.
-
- ### 4. Commit Each Group
- ```bash
- git add <files> # Stage related files
- bun run precommit # Validate (already quiet)
- # If fails: fix issues, re-stage, re-validate
- git commit -m "..." # Commit when valid
- ```
-
- ### 5. Handle Untracked Files
- - **Should commit**: Add to appropriate commit
- - **Should ignore**: Suggest `.gitignore` (ask first)
- - **Intentional**: Document why untracked
-
- Goal: Clean `git status` after commits
-
- ## Safety
-
- - Never force push
- - Never amend commits from other sessions
- - Ask if unsure about grouping
- - Always report final `git status`