@vpxa/aikit 0.1.63 → 0.1.64

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,10 +1,26 @@
  # @vpxa/aikit
 
- Local-first AI developer toolkit — knowledge base, code analysis, context management, and developer tools for LLM agents.
+ Local-first AI developer toolkit and multi-agent operating system for LLM-powered software teams.
+
+ ## Why AI Kit?
+
+ AI Kit is built around 17 specialized agents with clear roles, not one general-purpose agent improvising across research, planning, coding, debugging, review, and architecture at the same time. The Orchestrator conducts. The Planner researches and designs. The Implementer writes code. The Debugger diagnoses failures. Security audits risk. Four Researcher variants analyze hard decisions from different model perspectives. Dual code and architecture reviewers catch what one model alone will miss.
+
+ That specialization is paired with parallel dispatch. The Orchestrator decomposes complex work into independent sub-tasks, runs read-only agents concurrently without artificial limits, and can run up to four file-modifying agents in parallel when their file sets do not overlap. The result is lower wall-clock time without turning the repo into a race condition.
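The dispatch rule described above can be sketched in a few lines of TypeScript. This is a toy illustration with invented names (`AgentTask`, `planBatches`), not AI Kit's actual scheduler:

```typescript
// Hypothetical sketch of the rule above: group file-modifying tasks into
// parallel batches of at most 4, where no two tasks in a batch touch the
// same file. Illustrative only; not AI Kit's real internals.
interface AgentTask {
  id: string;
  files: string[]; // files the agent will modify
}

function planBatches(tasks: AgentTask[], maxParallel = 4): AgentTask[][] {
  const batches: AgentTask[][] = [];
  for (const task of tasks) {
    // Find an existing batch with room and no file overlap.
    const fits = batches.find(
      (b) =>
        b.length < maxParallel &&
        b.every((t) => !t.files.some((f) => task.files.includes(f))),
    );
    if (fits) fits.push(task);
    else batches.push([task]);
  }
  return batches;
}

const batches = planBatches([
  { id: "implementer", files: ["src/api.ts"] },
  { id: "refactor", files: ["src/utils.ts"] },
  { id: "frontend", files: ["src/api.ts", "src/ui.tsx"] }, // overlaps implementer
]);
// implementer and refactor share a batch; frontend is deferred to the next one.
```

The key property is that safety comes from the disjoint-file check, not from a global lock, which is what keeps parallelism from "turning the repo into a race condition."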
+
+ For non-trivial technical choices, AI Kit uses multi-model decision analysis. Researcher-Alpha, Beta, Gamma, and Delta run in parallel on different LLMs, then the Orchestrator synthesizes agreement, disagreement, trade-offs, and implementation risk into one recommendation. Important decisions do not depend on a single model's blind spots.
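The synthesis step can be pictured as merging four verdicts into agreement, dissent, and a combined risk list. The shape below (`Verdict`, `synthesize`) is an invented sketch, not AI Kit's API:

```typescript
// Hypothetical sketch: fold four researcher verdicts into a single
// recommendation with a consensus score. Names are illustrative.
type Verdict = { agent: string; recommends: string; risks: string[] };

function synthesize(verdicts: Verdict[]) {
  const votes = new Map<string, number>();
  for (const v of verdicts) {
    votes.set(v.recommends, (votes.get(v.recommends) ?? 0) + 1);
  }
  const [top, count] = [...votes.entries()].sort((a, b) => b[1] - a[1])[0];
  return {
    recommendation: top,
    consensus: count / verdicts.length, // 1.0 = all models agree
    dissent: [...votes.keys()].filter((k) => k !== top),
    risks: [...new Set(verdicts.flatMap((v) => v.risks))],
  };
}

const result = synthesize([
  { agent: "Researcher-Alpha", recommends: "postgres", risks: ["ops cost"] },
  { agent: "Researcher-Beta", recommends: "postgres", risks: ["migration"] },
  { agent: "Researcher-Gamma", recommends: "postgres", risks: [] },
  { agent: "Researcher-Delta", recommends: "sqlite", risks: ["write scaling"] },
]);
// result.recommendation === "postgres", consensus 0.75, dissent ["sqlite"]
```

A low consensus score is itself a signal: it tells the Orchestrator the decision is contested and deserves more scrutiny before implementation.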
+
+ Every code change also goes through FORGE. Tasks are classified by tier, critical claims are tracked in an evidence map, and completion is gated by provenance, commitment, and coverage checks. Floor, Standard, and Critical work do not get the same ceremony, but they do get explicit, deterministic quality control.
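"Deterministic gating" of evidence-tracked claims can be sketched as a pure function from tier and claims to pass/fail. The tier names come from the paragraph above; the thresholds and field names here are invented for illustration:

```typescript
// Hypothetical sketch in the spirit of FORGE: each claim about a change
// must carry evidence (test IDs, log refs), and each tier requires a
// minimum evidence coverage. Thresholds are illustrative, not FORGE's real values.
type Tier = "floor" | "standard" | "critical";
type Claim = { id: string; evidence: string[] };

function gate(tier: Tier, claims: Claim[]): { pass: boolean; uncovered: string[] } {
  const required: Record<Tier, number> = { floor: 0.5, standard: 0.9, critical: 1.0 };
  const uncovered = claims.filter((c) => c.evidence.length === 0).map((c) => c.id);
  const coverage =
    claims.length === 0 ? 1 : (claims.length - uncovered.length) / claims.length;
  return { pass: coverage >= required[tier], uncovered };
}

const check = gate("critical", [
  { id: "no-sql-injection", evidence: ["test:auth.spec.ts"] },
  { id: "backwards-compatible", evidence: [] },
]);
// Critical tier demands full coverage, so this change is blocked.
```

Because the gate is a deterministic function of recorded evidence, the same change always passes or fails for the same reason, with the uncovered claims named explicitly.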
+
+ AI Kit keeps that system fast by compressing code context 5-20x with `compact`, `digest`, and `stratum_card`. Agents get fresh, scoped context for the task in front of them instead of inheriting a bloated conversation transcript. Each sub-agent receives exactly the code context it needs to do one job well.
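To make the 5-20x ratio concrete, here is the back-of-envelope arithmetic for a sub-agent's token budget (the 100k figure is an illustrative input, not a documented limit):

```typescript
// Illustrative arithmetic only: what a 5x vs 20x compression ratio means
// for the context a sub-agent actually receives.
function compressedTokens(rawTokens: number, ratio: number): number {
  return Math.ceil(rawTokens / ratio);
}

// A 100k-token slice of raw code context:
const atFive = compressedTokens(100_000, 5);    // 20000 tokens at 5x
const atTwenty = compressedTokens(100_000, 20); // 5000 tokens at 20x
```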
 
  ## Features
 
  - **82 MCP tools** for AI agents to search, analyze, and manipulate codebases
+ - **17 specialized agents** — Orchestrator, Planner, Implementer, Frontend, Refactor, Debugger, Security, Documenter, Explorer, 4 Researchers, 2 Code Reviewers, 2 Architect Reviewers — each with a focused role and optimal model
+ - **Parallel agent dispatch** — independent sub-tasks run concurrently, up to 4 file-modifying agents in parallel on non-overlapping files
+ - **Multi-model decisions** — 4 Researcher variants with different LLMs analyze every non-trivial technical choice in parallel, then synthesize consensus
+ - **FORGE quality gates** — evidence-tracked code changes with tiered classification (Floor/Standard/Critical) and deterministic pass/fail gating
  - **47 CLI commands** for shell-based interaction
  - **Local-first** — ONNX embeddings, LanceDB vector store, no cloud dependencies
  - **Three interfaces**: MCP (for agent IDEs), CLI (for terminal agents), and programmatic API
@@ -15,6 +31,30 @@ Local-first AI developer toolkit — knowledge base, code analysis, context mana
  - **Middleware pipeline** — extensible plugin hooks for tool execution customization
  - **Pluggable memory** — adapter interface for curated knowledge storage backends
 
+ ## Agent Architecture
+
+ | Agent | Role | Category |
+ |-------|------|----------|
+ | Orchestrator | Master conductor — decomposes tasks, dispatches agents, gates quality | Orchestration |
+ | Planner | Autonomous research and TDD implementation planning | Orchestration |
+ | Implementer | Code writing following TDD practices | Implementation |
+ | Frontend | UI/UX specialist — React, styling, responsive design | Implementation |
+ | Refactor | Code cleanup, structure improvement, maintainability | Implementation |
+ | Debugger | Issue diagnosis, error tracing, root cause analysis | Diagnostics |
+ | Security | Vulnerability analysis, OWASP compliance, security review | Diagnostics |
+ | Explorer | Rapid codebase navigation and context gathering | Exploration |
+ | Documenter | Comprehensive project documentation generation | Documentation |
+ | Researcher-Alpha | Deep research — primary investigator | Research |
+ | Researcher-Beta | Pragmatic analysis — trade-offs and edge cases | Research |
+ | Researcher-Gamma | Broad pattern matching across domains | Research |
+ | Researcher-Delta | Implementation feasibility and performance | Research |
+ | Code-Reviewer-Alpha | Primary code review | Review |
+ | Code-Reviewer-Beta | Code review from different LLM perspective | Review |
+ | Architect-Reviewer-Alpha | Primary architecture review | Review |
+ | Architect-Reviewer-Beta | Architecture review from different LLM perspective | Review |
+
+ The Orchestrator never writes code itself; it delegates, reviews, and gates. Dual-perspective reviewers catch defects and design issues a single model can miss, while multi-model research reduces single-LLM blind spots before implementation begins. All of these agents are scaffolded by `aikit init` and can be customized per project.
+
  ## What This Is
 
  This is an **MCP (Model Context Protocol) server** that gives AI agents:
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@vpxa/aikit",
- "version": "0.1.63",
+ "version": "0.1.64",
  "type": "module",
  "description": "Local-first AI developer toolkit — knowledge base, code analysis, context management, and developer tools for LLM agents",
  "license": "MIT",
@@ -3,7 +3,7 @@ import{i as e,t}from"./constants-D3v4VDf0.js";import{copyFileSync as n,existsSyn
  `))console.log(` ${e}`)}}function z(e){console.log(e.id),console.log(` Command: ${e.command}${e.args.length>0?` ${e.args.join(` `)}`:``}`),console.log(` PID: ${e.pid??`unknown`}`),console.log(` Status: ${e.status}`),console.log(` Started: ${e.startedAt}`),e.exitCode!==void 0&&console.log(` Exit code: ${e.exitCode}`),console.log(` Logs: ${e.logs.length}`)}function pt(e){if(console.log(`Exports scanned: ${e.totalExports}`),console.log(`Dead in source: ${e.totalDeadSource} (actionable)`),console.log(`Dead in docs: ${e.totalDeadDocs} (informational)`),e.totalDeadSource===0&&e.totalDeadDocs===0){console.log(`No dead symbols found.`);return}if(e.deadInSource.length>0){console.log(`
  Dead in source (actionable):`);for(let t of e.deadInSource)console.log(` - ${t.path}:${t.line} ${t.kind} ${t.name}`)}if(e.deadInDocs.length>0){console.log(`
  Dead in docs (informational):`);for(let t of e.deadInDocs)console.log(` - ${t.path}:${t.line} ${t.kind} ${t.name}`)}}function mt(e){console.log(e.path),console.log(` Language: ${e.language}`),console.log(` Lines: ${e.lines}`),console.log(` Estimated tokens: ~${e.estimatedTokens}`),console.log(``),H(`Imports`,e.imports),H(`Exports`,e.exports),H(`Functions`,e.functions.map(e=>`${e.name} @ line ${e.line}${e.exported?` [exported]`:``}`)),H(`Classes`,e.classes.map(e=>`${e.name} @ line ${e.line}${e.exported?` [exported]`:``}`)),H(`Interfaces`,e.interfaces.map(e=>`${e.name} @ line ${e.line}`)),H(`Types`,e.types.map(e=>`${e.name} @ line ${e.line}`))}function ht(e){if(console.log(`Symbol: ${e.name}`),e.definedIn?console.log(`Defined in: ${e.definedIn.path}:${e.definedIn.line} (${e.definedIn.kind})`):console.log(`Defined in: not found`),console.log(``),console.log(`Imported by:`),e.importedBy.length===0)console.log(` none`);else for(let t of e.importedBy)console.log(` - ${t.path}:${t.line} ${t.importStatement}`);if(console.log(``),console.log(`Referenced in:`),e.referencedIn.length===0)console.log(` none`);else for(let t of e.referencedIn)console.log(` - ${t.path}:${t.line} ${t.context}`)}function B(e){console.log(e.name),console.log(` Files: ${e.files.length}`),console.log(` Updated: ${e.updated}`),e.description&&console.log(` Description: ${e.description}`);for(let t of e.files)console.log(` - ${t}`)}function V(e){if(console.log(e.id),console.log(` Label: ${e.label}`),console.log(` Created: ${e.createdAt}`),e.notes&&console.log(` Notes: ${e.notes}`),e.files?.length){console.log(` Files: ${e.files.length}`);for(let t of e.files)console.log(` - ${t}`)}console.log(` Data:`);for(let t of JSON.stringify(e.data,null,2).split(`
- `))console.log(` ${t}`)}function H(e,t){if(console.log(`${e}:`),t.length===0){console.log(` none`),console.log(``);return}for(let e of t)console.log(` - ${e}`);console.log(``)}function gt(e){let t=e.trim();if(!t)return``;try{return JSON.parse(t)}catch{return e}}function _t(e){let t=e.trim();if(!t)return{};let n=JSON.parse(t);if(!n||typeof n!=`object`||Array.isArray(n))throw Error(`Checkpoint data must be a JSON object.`);return n}function U(e,t,n=60){let r=new Map;for(let t=0;t<e.length;t++){let i=e[t];r.set(i.record.id,{record:i.record,score:1/(n+t+1)})}for(let e=0;e<t.length;e++){let i=t[e],a=r.get(i.record.id);a?a.score+=1/(n+e+1):r.set(i.record.id,{record:i.record,score:1/(n+e+1)})}return[...r.values()].sort((e,t)=>t.score-e.score)}async function vt(e,t){switch(e.type){case`search`:{if(!t)throw Error(`search operation requires AI Kit context`);let n=typeof e.args.query==`string`?e.args.query.trim():``;if(!n)throw Error(`search operation requires a query`);let r=typeof e.args.limit==`number`?e.args.limit:5,i=e.args.search_mode===`semantic`||e.args.search_mode===`keyword`?e.args.search_mode:`hybrid`,a=typeof e.args.content_type==`string`?e.args.content_type:void 0,o=typeof e.args.min_score==`number`?e.args.min_score:.25;if(i===`keyword`)return(await t.store.ftsSearch(n,{limit:r,contentType:a,minScore:o})).slice(0,r);let s=await t.embedder.embedQuery(n);if(i===`semantic`)return t.store.search(s,{limit:r,contentType:a,minScore:o});let[c,l]=await Promise.all([t.store.search(s,{limit:r*2,contentType:a,minScore:o}),t.store.ftsSearch(n,{limit:r*2,contentType:a,minScore:o}).catch(()=>[])]);return U(c,l).slice(0,r)}case`find`:{if(!t)throw Error(`find operation requires AI Kit context`);let n=typeof e.args.query==`string`?e.args.query:void 0,r=typeof e.args.glob==`string`?e.args.glob:void 0,i=typeof e.args.pattern==`string`?e.args.pattern:void 0,a=typeof e.args.limit==`number`?e.args.limit:10,o=typeof e.args.content_type==`string`?e.args.content_type:void 0,s=typeof 
e.args.cwd==`string`?e.args.cwd:void 0;if(!n&&!r&&!i)throw Error(`find operation requires query, glob, or pattern`);return g(t.embedder,t.store,{query:n,glob:r,pattern:i,limit:a,contentType:o,cwd:s})}case`check`:return m({files:Array.isArray(e.args.files)?e.args.files.filter(e=>typeof e==`string`):void 0,cwd:typeof e.args.cwd==`string`?e.args.cwd:void 0,skipTypes:e.args.skip_types===!0,skipLint:e.args.skip_lint===!0});default:throw Error(`Unsupported batch operation type: ${e.type}`)}}const yt=[{name:`parse-output`,description:`Parse build or tool output from stdin`,usage:`aikit parse-output [--tool tsc|vitest|biome|git-status]`,run:async e=>{let t=N(e,`--tool`,``).trim()||void 0,n=await F();n.trim()||(console.error(`Usage: aikit parse-output [--tool tsc|vitest|biome|git-status]`),process.exit(1)),ot(_(n,t))}},{name:`git`,description:`Show git branch, status, recent commits, and optional diff stats`,usage:`aikit git [--cwd path] [--commit-count N] [--diff]`,run:async e=>{lt(await he({cwd:N(e,`--cwd`,``).trim()||void 0,commitCount:M(e,`--commit-count`,5),includeDiff:P(e,`--diff`)}))}},{name:`diff`,description:`Parse unified diff text from stdin into structured file changes`,usage:`git diff | aikit diff`,run:async()=>{let e=await F();e.trim()||(console.error(`Usage: git diff | aikit diff`),process.exit(1)),ut(le({diff:e}))}},{name:`summarize`,description:`Show a structural summary of a file`,usage:`aikit summarize <path>`,run:async e=>{let t=e.shift()?.trim();t||(console.error(`Usage: aikit summarize <path>`),process.exit(1)),mt(await de({path:l(t)}))}},{name:`checkpoint`,description:`Save and restore lightweight session checkpoints`,usage:`aikit checkpoint <save|load|list|latest> [label-or-id] [--data json] [--notes text]`,run:async e=>{let t=e.shift()?.trim();switch(t||(console.error(`Usage: aikit checkpoint <save|load|list|latest> [label-or-id] [--data json] [--notes text]`),process.exit(1)),t){case`save`:{let 
t=e.shift()?.trim(),n=N(e,`--data`,``),r=N(e,`--notes`,``).trim()||void 0,i=n.trim()?``:await F();t||(console.error(`Usage: aikit checkpoint save <label> [--data json] [--notes text]`),process.exit(1)),V(ne(t,_t(n||i),{notes:r}));return}case`load`:{let t=e.shift()?.trim();t||(console.error(`Usage: aikit checkpoint load <id>`),process.exit(1));let n=te(t);if(!n){console.log(`No checkpoint found: ${t}`);return}V(n);return}case`list`:{let e=ee();if(e.length===0){console.log(`No checkpoints saved.`);return}console.log(`Checkpoints (${e.length})`),console.log(`─`.repeat(60));for(let t of e)console.log(`${t.id}`),console.log(` Label: ${t.label}`),console.log(` Created: ${t.createdAt}`);return}case`latest`:{let e=h();if(!e){console.log(`No checkpoints saved.`);return}V(e);return}default:console.error(`Unknown checkpoint action: ${t}`),console.error(`Actions: save, load, list, latest`),process.exit(1)}}}],bt=[{name:`proc`,description:`Manage in-memory child processes`,usage:`aikit proc <start|stop|status|list|logs> ...`,run:async e=>{let t=e.shift()?.trim()??``;switch(t){case`start`:{let t=e.shift()?.trim(),n=e.shift()?.trim();(!t||!n)&&(console.error(`Usage: aikit proc start <id> <command> [args...]`),process.exit(1)),z(b(t,n,e));return}case`stop`:{let t=e.shift()?.trim();t||(console.error(`Usage: aikit proc stop <id>`),process.exit(1));let n=S(t);if(!n){console.log(`No managed process found: ${t}`);return}z(n);return}case`status`:{let t=e.shift()?.trim();t||(console.error(`Usage: aikit proc status <id>`),process.exit(1));let n=x(t);if(!n){console.log(`No managed process found: ${t}`);return}z(n);return}case`list`:{let e=v();if(e.length===0){console.log(`No managed processes.`);return}for(let t of e)z(t),console.log(``);return}case`logs`:{let t=M(e,`--tail`,50),n=e.shift()?.trim();n||(console.error(`Usage: aikit proc logs <id> [--tail N]`),process.exit(1));let r=y(n,t);if(r.length===0){console.log(`No logs found for process: ${n}`);return}for(let e of 
r)console.log(e);return}default:console.error(`Unknown proc action: ${t}`),console.error(`Actions: start, stop, status, list, logs`),process.exit(1)}}},{name:`watch`,description:`Manage in-memory filesystem watchers`,usage:`aikit watch <start|stop|list> ...`,run:async e=>{let t=e.shift()?.trim()??``;switch(t){case`start`:{let t=e.shift()?.trim();t||(console.error(`Usage: aikit watch start <path>`),process.exit(1));let n=Ge({path:l(t)});console.log(`Started watcher: ${n.id}`),console.log(` Path: ${n.path}`),console.log(` Status: ${n.status}`);return}case`stop`:{let t=e.shift()?.trim();t||(console.error(`Usage: aikit watch stop <id>`),process.exit(1));let n=Ke(t);console.log(n?`Stopped watcher: ${t}`:`Watcher not found: ${t}`);return}case`list`:{let e=We();if(e.length===0){console.log(`No active watchers.`);return}for(let t of e)console.log(`${t.id}`),console.log(` Path: ${t.path}`),console.log(` Status: ${t.status}`),console.log(` Events: ${t.eventCount}`);return}default:console.error(`Unknown watch action: ${t}`),console.error(`Actions: start, stop, list`),process.exit(1)}}},{name:`delegate`,description:`Delegate a task to a local Ollama model`,usage:`aikit delegate [--model name] [--system prompt] [--temp 0.3] <prompt | --stdin>`,run:async e=>{if((e[0]===`models`?e.shift():void 0)===`models`){try{let e=await se();if(e.length===0){console.log(`No Ollama models available. Pull one with: ollama pull gemma4:e2b`);return}for(let t of e)console.log(t)}catch{console.error(`Ollama is not running. 
Start it with: ollama serve`),process.exit(1)}return}let t=N(e,`--model`,``),n=N(e,`--system`,``),r=M(e,`--temp`,.3),i=N(e,`--context`,``),a=e.join(` `);a||=await F(),a||(console.error(`Usage: aikit delegate [--model name] <prompt>`),process.exit(1));let o;i&&(o=await k(l(i),`utf-8`));let s=await oe({prompt:a,model:t||void 0,system:n||void 0,context:o,temperature:r});s.error&&(console.error(`Error: ${s.error}`),process.exit(1)),console.log(s.response),console.error(`\n(${s.model}, ${s.durationMs}ms, ${s.tokenCount??`?`} tokens)`)}}],xt=[{name:`eval`,description:`Evaluate JavaScript or TypeScript in a constrained VM sandbox`,usage:`aikit eval [code] [--lang js|ts] [--timeout ms]`,run:async e=>{let t=N(e,`--lang`,`js`),n=M(e,`--timeout`,5e3),r=e.join(` `),i=r.trim()?``:await F(),a=r||i;a.trim()||(console.error(`Usage: aikit eval [code] [--lang js|ts] [--timeout ms]`),process.exit(1));let o=ue({code:a,lang:t===`ts`?`ts`:`js`,timeout:n});if(!o.success){console.error(`Eval failed in ${o.durationMs}ms: ${o.error}`),process.exitCode=1;return}console.log(`Eval succeeded in ${o.durationMs}ms`),console.log(`─`.repeat(60)),console.log(o.output)}},{name:`test`,description:`Run Vitest for all tests or a specific subset`,usage:`aikit test [files...] 
[--grep pattern] [--cwd path] [--timeout ms]`,run:async e=>{let t=N(e,`--grep`,``).trim()||void 0,n=N(e,`--cwd`,``).trim()||void 0,r=M(e,`--timeout`,6e4),i=e.filter(Boolean),a=await He({files:i.length>0?i:void 0,grep:t,cwd:n,timeout:r});ct(a),a.passed||(process.exitCode=1)}},{name:`rename`,description:`Rename a symbol across files using whole-word regex matching`,usage:`aikit rename <old> <new> <path> [--dry-run] [--extensions .ts,.tsx] [--exclude dist/**]`,run:async e=>{let t=e.shift()?.trim()??``,n=e.shift()?.trim()??``,r=e.shift()?.trim()??``;(!t||!n||!r)&&(console.error(`Usage: aikit rename <old> <new> <path> [--dry-run] [--extensions .ts,.tsx] [--exclude dist/**]`),process.exit(1));let i=I(N(e,`--extensions`,``)),a=I(N(e,`--exclude`,``)),o=await Ae({oldName:t,newName:n,rootPath:l(r),extensions:i.length>0?i:void 0,exclude:a.length>0?a:void 0,dryRun:P(e,`--dry-run`)});console.log(JSON.stringify(o,null,2))}},{name:`codemod`,description:`Apply regex-based codemod rules from a JSON file across a path`,usage:`aikit codemod <path> --rules <file.json> [--dry-run] [--extensions .ts,.tsx] [--exclude dist/**]`,run:async e=>{let t=e.shift()?.trim()??``,n=N(e,`--rules`,``).trim();(!t||!n)&&(console.error(`Usage: aikit codemod <path> --rules <file.json> [--dry-run] [--extensions .ts,.tsx] [--exclude dist/**]`),process.exit(1));let r=await k(l(n),`utf-8`),i;try{i=JSON.parse(r)}catch{throw Error(`Failed to parse rules file as JSON: ${n}`)}if(!Array.isArray(i))throw Error(`Codemod rules file must contain a JSON array.`);let a=I(N(e,`--extensions`,``)),o=I(N(e,`--exclude`,``)),s=await re({rootPath:l(t),rules:i,extensions:a.length>0?a:void 0,exclude:o.length>0?o:void 0,dryRun:P(e,`--dry-run`)});console.log(JSON.stringify(s,null,2))}},{name:`transform`,description:`Apply jq-like transforms to JSON from stdin`,usage:`cat data.json | aikit transform <expression>`,run:async e=>{let t=e.join(` `).trim(),n=await F();(!t||!n.trim())&&(console.error(`Usage: cat data.json | aikit 
transform <expression>`),process.exit(1));let r=ae({input:n,expression:t});console.log(r.outputString)}}];function W(){let e=process.env.AIKIT_CONFIG_PATH??(r(l(process.cwd(),`aikit.config.json`))?l(process.cwd(),`aikit.config.json`):null);if(!e)return St();let t=a(e,`utf-8`),n;try{n=JSON.parse(t)}catch{console.error(`Failed to parse ${e} as JSON. Ensure the file contains valid JSON.`),process.exit(1)}let i=s(e);return n.sources=n.sources.map(e=>({...e,path:l(i,e.path)})),n.store.path=l(i,n.store.path),n.curated=n.curated??{path:`curated`},n.curated.path=l(i,n.curated.path),G(n,i),n}function St(){let e=process.env.AIKIT_WORKSPACE_ROOT??process.cwd(),t={sources:[{path:e,excludePatterns:[`node_modules/**`,`dist/**`,`.git/**`,`coverage/**`,`*.lock`,`pnpm-lock.yaml`]}],serverName:`aikit`,indexing:{chunkSize:1500,chunkOverlap:200,minChunkSize:100},embedding:{model:`mixedbread-ai/mxbai-embed-large-v1`,dimensions:1024},store:{backend:`lancedb`,path:l(e,A.data)},curated:{path:l(e,A.aiCurated)},stateDir:l(e,A.state)};return G(t,e),t}function G(e,t){if(!Je())return;let n=Ye(t);e.store.path=l(j(n.partition)),e.stateDir=l(j(n.partition),`state`),e.curated||={path:l(t,A.aiCurated)}}async function Ct(){let e=W(),t=new Ze({model:e.embedding.model,dimensions:e.embedding.dimensions});await t.initialize();let n=await et({backend:e.store.backend,path:e.store.path});await n.initialize();let r=new Qe(t,n),{CuratedKnowledgeManager:i}=await import(`../../server/dist/index.js`),a=new i(e.curated.path,n,t),o;try{let t=new $e({path:e.store.path});await t.initialize(),o=t,r.setGraphStore(o)}catch(e){console.error(`[aikit] Graph store init failed (non-fatal): 
${e.message}`),o={initialize:async()=>{},upsertNode:async()=>{},upsertEdge:async()=>{},upsertNodes:async()=>{},upsertEdges:async()=>{},getNode:async()=>null,getNeighbors:async()=>({nodes:[],edges:[]}),traverse:async()=>({nodes:[],edges:[]}),findNodes:async()=>[],findEdges:async()=>[],deleteNode:async()=>{},deleteBySourcePath:async()=>0,clear:async()=>{},getStats:async()=>({nodeCount:0,edgeCount:0,nodeTypes:{},edgeTypes:{}}),validate:async()=>({valid:!0,orphanNodes:[],danglingEdges:[],stats:{nodeCount:0,edgeCount:0,nodeTypes:{},edgeTypes:{}}}),setNodeCommunity:async()=>{},detectCommunities:async()=>({}),traceProcess:async()=>({id:``,entryNodeId:``,label:``,properties:{},steps:[]}),getProcesses:async()=>[],deleteProcess:async()=>{},depthGroupedTraversal:async()=>({}),getCohesionScore:async()=>0,getSymbol360:async()=>({node:{id:``,type:``,name:``,properties:{}},incoming:[],outgoing:[],community:null,processes:[]}),close:async()=>{}}}return await Xe().catch(()=>{}),{config:e,embedder:t,store:n,graphStore:o,indexer:r,curated:a}}async function wt(){let e=await import(`../../flows/dist/index.js`),{FlowLoader:t,FlowRegistryManager:n,FlowStateMachine:a,GitInstaller:o}=e,s=typeof e.getBuiltinFlows==`function`?e.getBuiltinFlows:void 0,l=W(),u=c(l.stateDir??c(l.sources[0].path,`.aikit-state`),`flows`),d=c(qe(),`flows`);i(d,{recursive:!0});let f=c(d,`registry.json`),p=c(u,`state.json`);r(u)||i(u,{recursive:!0});let m=l.sources[0].path;return{loader:new t,registry:new n(f),stateMachine:new a(p),git:new o(c(u,`installed`)),getBuiltinFlows:s,cwd:m}}const K=[e=>c(`skills`,e,`SKILL.md`),e=>c(`skills`,e,`README.md`),e=>c(e,`SKILL.md`),e=>c(e,`README.md`)];function Tt(e,t){for(let a of t.steps){let t=c(e,a.instruction);if(r(t))continue;let o=!1;for(let l of K){let u=c(e,l(a.id));if(r(u)){let e=s(t);r(e)||i(e,{recursive:!0}),n(u,t),o=!0;break}}o||console.warn(`Warning: instruction file for step "${a.id}" not found.\n Expected: ${a.instruction}\n Searched: ${K.map(e=>e(a.id)).join(`, 
`)}`)}}const q=[{name:`flow`,description:`Manage pluggable development flows`,usage:`flow <add|remove|list|info|use|update|status|start|reset> [args]`,run:async e=>{let t=e[0];if(!t){console.log(`Usage: aikit flow <add|remove|list|info|use|update|status|start|reset>`),console.log(``),console.log(`Commands:`),console.log(` add <source> Install a flow from git URL or local path`),console.log(` remove <name> Remove an installed flow`),console.log(` list List all installed flows`),console.log(` info <name> Show details of a flow`),console.log(` use <name> Set active flow (start it)`),console.log(` update <name> Update a flow from its source`),console.log(` status Show current flow execution status`),console.log(` start [name] Start a flow (or resume)`),console.log(` reset Reset the active flow`);return}let n=await wt();switch(t){case`add`:{let t=e[1];if(!t){console.error(`Usage: aikit flow add <source>`),console.error(` source: git URL (https://...) or local path`);return}let i=t.startsWith(`http`)||t.startsWith(`git@`)||t.endsWith(`.git`),a=l(t),s=r(a);if(!i&&!s){console.error(`Source not found: ${t}`),console.error(`Provide a git URL or existing local path.`);return}let c,u;if(i){console.log(`Cloning ${t}...`);let e=n.git.clone(t);if(!e.success||!e.data){console.error(e.error??`Failed to clone flow`);return}c=e.data,u=`git`}else{let r=e[2]||o(a);console.log(`Copying local flow from ${t}...`);let i=n.git.copyLocal(a,r);if(!i.success||!i.data){console.error(i.error??`Failed to copy local flow`);return}c=i.data,u=`local`}let d=await n.loader.load(c);if(!d.success||!d.data){console.error(d.error??`Failed to load flow`),n.git.remove(c);return}let{manifest:f,format:p}=d.data;if(Tt(c,f),f.install.length>0){console.log(`Installing ${f.install.length} dependencies...`);let e=n.git.runInstallDeps(f.install);if(!e.success){console.error(`Dependency install failed: ${e.error}`),n.git.remove(c);return}}let m=new 
Date().toISOString(),h=n.registry.register({name:f.name,version:f.version,source:t,sourceType:u,installPath:c,format:p,registeredAt:m,updatedAt:m,manifest:f});if(!h.success){console.error(h.error??`Failed to register flow`),n.git.remove(c);return}console.log(`✓ Flow "${f.name}" v${f.version} installed (${p} format)`),console.log(` Steps: ${f.steps.map(e=>e.id).join(` → `)}`);break}case`remove`:{let t=e[1];if(!t){console.error(`Usage: aikit flow remove <name>`);return}let r=n.registry.get(t);if(!r){console.error(`Flow "${t}" not found`);return}let i=n.git.remove(r.installPath);if(!i.success){console.error(i.error??`Failed to remove flow files`);return}let a=n.registry.unregister(t);if(!a.success){console.error(a.error??`Failed to unregister flow`);return}console.log(`✓ Flow "${t}" removed`);break}case`list`:{let e=n.registry.list();if(e.length===0){console.log("No flows installed. Use `aikit flow add <source>` to install one.");return}console.log(`Installed Flows:`),console.log(`─`.repeat(60));for(let t of e){let e=t.manifest.steps.map(e=>e.id).join(` → `);console.log(` ${t.name} v${t.version} (${t.sourceType}, ${t.format})`),console.log(` Steps: ${e}`)}let t=n.stateMachine.getStatus();t.success&&t.data&&(console.log(``),console.log(`Active: ${t.data.flow} (${t.data.status}, step: ${t.data.currentStep??`done`})`));break}case`info`:{let t=e[1];if(!t){console.error(`Usage: aikit flow info <name>`);return}let r=n.registry.get(t);if(!r){console.error(`Flow "${t}" not found`);return}console.log(`Flow: ${r.name}`),console.log(`Version: ${r.version}`),console.log(`Source: ${r.source} (${r.sourceType})`),console.log(`Format: ${r.format}`),console.log(`Path: ${r.installPath}`),console.log(`Registered: ${r.registeredAt}`),console.log(`Updated: ${r.updatedAt}`),console.log(``),console.log(`Steps:`);for(let e of r.manifest.steps){let t=e.requires.length?` (requires: ${e.requires.join(`, `)})`:``;console.log(` ${e.id}: ${e.name}${t}`),console.log(` Skill: 
${e.skill}`),console.log(` Produces: ${e.produces.join(`, `)}`)}if(r.manifest.install.length>0){console.log(``),console.log(`Dependencies:`);for(let e of r.manifest.install)console.log(` ${e}`)}break}case`use`:case`start`:{let r=e[1],i=r?n.registry.get(r):null;if(t===`use`&&!r){console.error(`Usage: aikit flow use <name>`);return}if(t===`start`&&!r){let e=n.stateMachine.getStatus();if(e.success&&e.data&&e.data.status===`active`){console.log(`Resuming flow: ${e.data.flow}`),console.log(`Current step: ${e.data.currentStep}`),console.log(`Completed: ${e.data.completedSteps.join(`, `)||`none`}`);return}console.error("No active flow. Use `aikit flow start <name>` to begin one.");return}if(!i){console.error(`Flow "${r}" not found. Use \`aikit flow list\` to see installed flows.`);return}let a=n.stateMachine.start(i.name,i.manifest);if(!a.success||!a.data){console.error(a.error??`Failed to start flow`);return}let o=a.data;console.log(`✓ Flow "${i.name}" started`),console.log(` Current step: ${o.currentStep}`),console.log(` Steps: ${i.manifest.steps.map(e=>e.id).join(` → `)}`);break}case`update`:{let t=e[1];if(!t){console.error(`Usage: aikit flow update <name>`);return}let r=n.registry.get(t);if(!r){console.error(`Flow "${t}" not found`);return}if(r.sourceType!==`git`){console.error(`Flow "${t}" is ${r.sourceType}, not updatable via git`);return}console.log(`Updating ${t}...`);let i=n.git.update(r.installPath);if(!i.success){console.error(i.error??`Failed to update flow`);return}let a=await n.loader.load(r.installPath);if(a.success&&a.data){let e=new Date().toISOString(),t=n.registry.register({...r,version:a.data.manifest.version,format:a.data.format,manifest:a.data.manifest,updatedAt:e});if(!t.success){console.error(t.error??`Failed to refresh flow registry entry`);return}}console.log(`✓ Flow "${t}" updated`);break}case`status`:{let e=n.stateMachine.getStatus();if(!e.success||!e.data){console.log(`No active flow.`);return}let t=e.data;if(console.log(`Flow: 
${t.flow}`),console.log(`Status: ${t.status}`),console.log(`Current Step: ${t.currentStep??`(completed)`}`),console.log(`Completed: ${t.completedSteps.join(`, `)||`none`}`),t.skippedSteps.length>0&&console.log(`Skipped: ${t.skippedSteps.join(`, `)}`),console.log(`Started: ${t.startedAt}`),console.log(`Updated: ${t.updatedAt}`),Object.keys(t.artifacts).length>0){console.log(`Artifacts:`);for(let[e,n]of Object.entries(t.artifacts))console.log(` ${e}: ${n}`)}break}case`reset`:{let e=n.stateMachine.reset();if(!e.success){console.error(e.error??`Failed to reset flow state`);return}console.log(`✓ Flow state reset`);break}default:console.error(`Unknown flow command: ${t}`),console.log("Use `aikit flow` for help.")}}}];let J=null;async function Y(){return J||=await Ct(),J}function Et(){return J}const Dt=[{name:`graph`,description:`Query the knowledge graph`,usage:`aikit graph <action> [options]
+ `))console.log(` ${t}`)}function H(e,t){if(console.log(`${e}:`),t.length===0){console.log(` none`),console.log(``);return}for(let e of t)console.log(` - ${e}`);console.log(``)}function gt(e){let t=e.trim();if(!t)return``;try{return JSON.parse(t)}catch{return e}}function _t(e){let t=e.trim();if(!t)return{};let n=JSON.parse(t);if(!n||typeof n!=`object`||Array.isArray(n))throw Error(`Checkpoint data must be a JSON object.`);return n}function U(e,t,n=60){let r=new Map;for(let t=0;t<e.length;t++){let i=e[t];r.set(i.record.id,{record:i.record,score:1/(n+t+1)})}for(let e=0;e<t.length;e++){let i=t[e],a=r.get(i.record.id);a?a.score+=1/(n+e+1):r.set(i.record.id,{record:i.record,score:1/(n+e+1)})}return[...r.values()].sort((e,t)=>t.score-e.score)}async function vt(e,t){switch(e.type){case`search`:{if(!t)throw Error(`search operation requires AI Kit context`);let n=typeof e.args.query==`string`?e.args.query.trim():``;if(!n)throw Error(`search operation requires a query`);let r=typeof e.args.limit==`number`?e.args.limit:5,i=e.args.search_mode===`semantic`||e.args.search_mode===`keyword`?e.args.search_mode:`hybrid`,a=typeof e.args.content_type==`string`?e.args.content_type:void 0,o=typeof e.args.min_score==`number`?e.args.min_score:.25;if(i===`keyword`)return(await t.store.ftsSearch(n,{limit:r,contentType:a,minScore:o})).slice(0,r);let s=await t.embedder.embedQuery(n);if(i===`semantic`)return t.store.search(s,{limit:r,contentType:a,minScore:o});let[c,l]=await Promise.all([t.store.search(s,{limit:r*2,contentType:a,minScore:o}),t.store.ftsSearch(n,{limit:r*2,contentType:a,minScore:o}).catch(()=>[])]);return U(c,l).slice(0,r)}case`find`:{if(!t)throw Error(`find operation requires AI Kit context`);let n=typeof e.args.query==`string`?e.args.query:void 0,r=typeof e.args.glob==`string`?e.args.glob:void 0,i=typeof e.args.pattern==`string`?e.args.pattern:void 0,a=typeof e.args.limit==`number`?e.args.limit:10,o=typeof e.args.content_type==`string`?e.args.content_type:void 0,s=typeof 
e.args.cwd==`string`?e.args.cwd:void 0;if(!n&&!r&&!i)throw Error(`find operation requires query, glob, or pattern`);return g(t.embedder,t.store,{query:n,glob:r,pattern:i,limit:a,contentType:o,cwd:s})}case`check`:return m({files:Array.isArray(e.args.files)?e.args.files.filter(e=>typeof e==`string`):void 0,cwd:typeof e.args.cwd==`string`?e.args.cwd:void 0,skipTypes:e.args.skip_types===!0,skipLint:e.args.skip_lint===!0});default:throw Error(`Unsupported batch operation type: ${e.type}`)}}const yt=[{name:`parse-output`,description:`Parse build or tool output from stdin`,usage:`aikit parse-output [--tool tsc|vitest|biome|git-status]`,run:async e=>{let t=N(e,`--tool`,``).trim()||void 0,n=await F();n.trim()||(console.error(`Usage: aikit parse-output [--tool tsc|vitest|biome|git-status]`),process.exit(1)),ot(_(n,t))}},{name:`git`,description:`Show git branch, status, recent commits, and optional diff stats`,usage:`aikit git [--cwd path] [--commit-count N] [--diff]`,run:async e=>{lt(await he({cwd:N(e,`--cwd`,``).trim()||void 0,commitCount:M(e,`--commit-count`,5),includeDiff:P(e,`--diff`)}))}},{name:`diff`,description:`Parse unified diff text from stdin into structured file changes`,usage:`git diff | aikit diff`,run:async()=>{let e=await F();e.trim()||(console.error(`Usage: git diff | aikit diff`),process.exit(1)),ut(le({diff:e}))}},{name:`summarize`,description:`Show a structural summary of a file`,usage:`aikit summarize <path>`,run:async e=>{let t=e.shift()?.trim();t||(console.error(`Usage: aikit summarize <path>`),process.exit(1)),mt(await de({path:l(t)}))}},{name:`checkpoint`,description:`Save and restore lightweight session checkpoints`,usage:`aikit checkpoint <save|load|list|latest> [label-or-id] [--data json] [--notes text]`,run:async e=>{let t=e.shift()?.trim();switch(t||(console.error(`Usage: aikit checkpoint <save|load|list|latest> [label-or-id] [--data json] [--notes text]`),process.exit(1)),t){case`save`:{let 
t=e.shift()?.trim(),n=N(e,`--data`,``),r=N(e,`--notes`,``).trim()||void 0,i=n.trim()?``:await F();t||(console.error(`Usage: aikit checkpoint save <label> [--data json] [--notes text]`),process.exit(1)),V(ne(t,_t(n||i),{notes:r}));return}case`load`:{let t=e.shift()?.trim();t||(console.error(`Usage: aikit checkpoint load <id>`),process.exit(1));let n=te(t);if(!n){console.log(`No checkpoint found: ${t}`);return}V(n);return}case`list`:{let e=ee();if(e.length===0){console.log(`No checkpoints saved.`);return}console.log(`Checkpoints (${e.length})`),console.log(`─`.repeat(60));for(let t of e)console.log(`${t.id}`),console.log(` Label: ${t.label}`),console.log(` Created: ${t.createdAt}`);return}case`latest`:{let e=h();if(!e){console.log(`No checkpoints saved.`);return}V(e);return}default:console.error(`Unknown checkpoint action: ${t}`),console.error(`Actions: save, load, list, latest`),process.exit(1)}}}],bt=[{name:`proc`,description:`Manage in-memory child processes`,usage:`aikit proc <start|stop|status|list|logs> ...`,run:async e=>{let t=e.shift()?.trim()??``;switch(t){case`start`:{let t=e.shift()?.trim(),n=e.shift()?.trim();(!t||!n)&&(console.error(`Usage: aikit proc start <id> <command> [args...]`),process.exit(1)),z(b(t,n,e));return}case`stop`:{let t=e.shift()?.trim();t||(console.error(`Usage: aikit proc stop <id>`),process.exit(1));let n=S(t);if(!n){console.log(`No managed process found: ${t}`);return}z(n);return}case`status`:{let t=e.shift()?.trim();t||(console.error(`Usage: aikit proc status <id>`),process.exit(1));let n=x(t);if(!n){console.log(`No managed process found: ${t}`);return}z(n);return}case`list`:{let e=v();if(e.length===0){console.log(`No managed processes.`);return}for(let t of e)z(t),console.log(``);return}case`logs`:{let t=M(e,`--tail`,50),n=e.shift()?.trim();n||(console.error(`Usage: aikit proc logs <id> [--tail N]`),process.exit(1));let r=y(n,t);if(r.length===0){console.log(`No logs found for process: ${n}`);return}for(let e of 
r)console.log(e);return}default:console.error(`Unknown proc action: ${t}`),console.error(`Actions: start, stop, status, list, logs`),process.exit(1)}}},{name:`watch`,description:`Manage in-memory filesystem watchers`,usage:`aikit watch <start|stop|list> ...`,run:async e=>{let t=e.shift()?.trim()??``;switch(t){case`start`:{let t=e.shift()?.trim();t||(console.error(`Usage: aikit watch start <path>`),process.exit(1));let n=Ge({path:l(t)});console.log(`Started watcher: ${n.id}`),console.log(` Path: ${n.path}`),console.log(` Status: ${n.status}`);return}case`stop`:{let t=e.shift()?.trim();t||(console.error(`Usage: aikit watch stop <id>`),process.exit(1));let n=Ke(t);console.log(n?`Stopped watcher: ${t}`:`Watcher not found: ${t}`);return}case`list`:{let e=We();if(e.length===0){console.log(`No active watchers.`);return}for(let t of e)console.log(`${t.id}`),console.log(` Path: ${t.path}`),console.log(` Status: ${t.status}`),console.log(` Events: ${t.eventCount}`);return}default:console.error(`Unknown watch action: ${t}`),console.error(`Actions: start, stop, list`),process.exit(1)}}},{name:`delegate`,description:`Delegate a task to a local Ollama model`,usage:`aikit delegate [--model name] [--system prompt] [--temp 0.3] <prompt | --stdin>`,run:async e=>{if((e[0]===`models`?e.shift():void 0)===`models`){try{let e=await se();if(e.length===0){console.log(`No Ollama models available. Pull one with: ollama pull gemma4:e2b`);return}for(let t of e)console.log(t)}catch{console.error(`Ollama is not running. 
Start it with: ollama serve`),process.exit(1)}return}let t=N(e,`--model`,``),n=N(e,`--system`,``),r=M(e,`--temp`,.3),i=N(e,`--context`,``),a=e.join(` `);a||=await F(),a||(console.error(`Usage: aikit delegate [--model name] <prompt>`),process.exit(1));let o;i&&(o=await k(l(i),`utf-8`));let s=await oe({prompt:a,model:t||void 0,system:n||void 0,context:o,temperature:r});s.error&&(console.error(`Error: ${s.error}`),process.exit(1)),console.log(s.response),console.error(`\n(${s.model}, ${s.durationMs}ms, ${s.tokenCount??`?`} tokens)`)}}],xt=[{name:`eval`,description:`Evaluate JavaScript or TypeScript in a constrained VM sandbox`,usage:`aikit eval [code] [--lang js|ts] [--timeout ms]`,run:async e=>{let t=N(e,`--lang`,`js`),n=M(e,`--timeout`,5e3),r=e.join(` `),i=r.trim()?``:await F(),a=r||i;a.trim()||(console.error(`Usage: aikit eval [code] [--lang js|ts] [--timeout ms]`),process.exit(1));let o=ue({code:a,lang:t===`ts`?`ts`:`js`,timeout:n});if(!o.success){console.error(`Eval failed in ${o.durationMs}ms: ${o.error}`),process.exitCode=1;return}console.log(`Eval succeeded in ${o.durationMs}ms`),console.log(`─`.repeat(60)),console.log(o.output)}},{name:`test`,description:`Run Vitest for all tests or a specific subset`,usage:`aikit test [files...] 
[--grep pattern] [--cwd path] [--timeout ms]`,run:async e=>{let t=N(e,`--grep`,``).trim()||void 0,n=N(e,`--cwd`,``).trim()||void 0,r=M(e,`--timeout`,6e4),i=e.filter(Boolean),a=await He({files:i.length>0?i:void 0,grep:t,cwd:n,timeout:r});ct(a),a.passed||(process.exitCode=1)}},{name:`rename`,description:`Rename a symbol across files using whole-word regex matching`,usage:`aikit rename <old> <new> <path> [--dry-run] [--extensions .ts,.tsx] [--exclude dist/**]`,run:async e=>{let t=e.shift()?.trim()??``,n=e.shift()?.trim()??``,r=e.shift()?.trim()??``;(!t||!n||!r)&&(console.error(`Usage: aikit rename <old> <new> <path> [--dry-run] [--extensions .ts,.tsx] [--exclude dist/**]`),process.exit(1));let i=I(N(e,`--extensions`,``)),a=I(N(e,`--exclude`,``)),o=await Ae({oldName:t,newName:n,rootPath:l(r),extensions:i.length>0?i:void 0,exclude:a.length>0?a:void 0,dryRun:P(e,`--dry-run`)});console.log(JSON.stringify(o,null,2))}},{name:`codemod`,description:`Apply regex-based codemod rules from a JSON file across a path`,usage:`aikit codemod <path> --rules <file.json> [--dry-run] [--extensions .ts,.tsx] [--exclude dist/**]`,run:async e=>{let t=e.shift()?.trim()??``,n=N(e,`--rules`,``).trim();(!t||!n)&&(console.error(`Usage: aikit codemod <path> --rules <file.json> [--dry-run] [--extensions .ts,.tsx] [--exclude dist/**]`),process.exit(1));let r=await k(l(n),`utf-8`),i;try{i=JSON.parse(r)}catch{throw Error(`Failed to parse rules file as JSON: ${n}`)}if(!Array.isArray(i))throw Error(`Codemod rules file must contain a JSON array.`);let a=I(N(e,`--extensions`,``)),o=I(N(e,`--exclude`,``)),s=await re({rootPath:l(t),rules:i,extensions:a.length>0?a:void 0,exclude:o.length>0?o:void 0,dryRun:P(e,`--dry-run`)});console.log(JSON.stringify(s,null,2))}},{name:`transform`,description:`Apply jq-like transforms to JSON from stdin`,usage:`cat data.json | aikit transform <expression>`,run:async e=>{let t=e.join(` `).trim(),n=await F();(!t||!n.trim())&&(console.error(`Usage: cat data.json | aikit 
transform <expression>`),process.exit(1));let r=ae({input:n,expression:t});console.log(r.outputString)}}];function W(){let e=process.env.AIKIT_CONFIG_PATH??(r(l(process.cwd(),`aikit.config.json`))?l(process.cwd(),`aikit.config.json`):null);if(!e)return St();let t=a(e,`utf-8`),n;try{n=JSON.parse(t)}catch{console.error(`Failed to parse ${e} as JSON. Ensure the file contains valid JSON.`),process.exit(1)}let i=s(e);return n.sources=n.sources.map(e=>({...e,path:l(i,e.path)})),n.store.path=l(i,n.store.path),n.curated=n.curated??{path:`curated`},n.curated.path=l(i,n.curated.path),G(n,i),n}function St(){let e=process.env.AIKIT_WORKSPACE_ROOT??process.cwd(),t={sources:[{path:e,excludePatterns:[`node_modules/**`,`dist/**`,`.git/**`,`coverage/**`,`*.lock`,`pnpm-lock.yaml`]}],serverName:`aikit`,indexing:{chunkSize:1500,chunkOverlap:200,minChunkSize:100},embedding:{model:`mixedbread-ai/mxbai-embed-large-v1`,dimensions:1024},store:{backend:`lancedb`,path:l(e,A.data)},curated:{path:l(e,A.aiCurated)},stateDir:l(e,A.state)};return G(t,e),t}function G(e,t){if(!Je())return;let n=Ye(t);e.store.path=l(j(n.partition)),e.stateDir=l(j(n.partition),`state`),e.curated||={path:l(t,A.aiCurated)}}async function Ct(){let e=W(),t=new Ze({model:e.embedding.model,dimensions:e.embedding.dimensions});await t.initialize();let n=await et({backend:e.store.backend,path:e.store.path});await n.initialize();let r=new Qe(t,n),{CuratedKnowledgeManager:i}=await import(`../../server/dist/index.js`),a=new i(e.curated.path,n,t),o;try{let t=new $e({path:e.store.path});await t.initialize(),o=t,r.setGraphStore(o)}catch(e){console.error(`[aikit] Graph store init failed (non-fatal): 
${e.message}`),o={initialize:async()=>{},upsertNode:async()=>{},upsertEdge:async()=>{},upsertNodes:async()=>{},upsertEdges:async()=>{},getNode:async()=>null,getNeighbors:async()=>({nodes:[],edges:[]}),traverse:async()=>({nodes:[],edges:[]}),findNodes:async()=>[],findEdges:async()=>[],deleteNode:async()=>{},deleteBySourcePath:async()=>0,clear:async()=>{},getStats:async()=>({nodeCount:0,edgeCount:0,nodeTypes:{},edgeTypes:{}}),validate:async()=>({valid:!0,orphanNodes:[],danglingEdges:[],stats:{nodeCount:0,edgeCount:0,nodeTypes:{},edgeTypes:{}}}),setNodeCommunity:async()=>{},detectCommunities:async()=>({}),traceProcess:async()=>({id:``,entryNodeId:``,label:``,properties:{},steps:[]}),getProcesses:async()=>[],deleteProcess:async()=>{},depthGroupedTraversal:async()=>({}),getCohesionScore:async()=>0,getSymbol360:async()=>({node:{id:``,type:``,name:``,properties:{}},incoming:[],outgoing:[],community:null,processes:[]}),close:async()=>{}}}return await Xe().catch(()=>{}),{config:e,embedder:t,store:n,graphStore:o,indexer:r,curated:a}}async function wt(){let e=await import(`../../flows/dist/index.js`),{FlowLoader:t,FlowRegistryManager:n,FlowStateMachine:r,GitInstaller:a}=e,o=typeof e.getBuiltinFlows==`function`?e.getBuiltinFlows:void 0,s=W(),l=c(s.stateDir??c(s.sources[0].path,`.aikit-state`),`flows`,`installed`),u=c(qe(),`flows`);i(u,{recursive:!0});let d=c(u,`registry.json`),f=s.sources[0].path,p=c(f,`.flows`);return{loader:new t,registry:new n(d),stateMachine:new r(p),git:new a(l),getBuiltinFlows:o,cwd:f}}const K=[e=>c(`skills`,e,`SKILL.md`),e=>c(`skills`,e,`README.md`),e=>c(e,`SKILL.md`),e=>c(e,`README.md`)];function Tt(e,t){for(let a of t.steps){let t=c(e,a.instruction);if(r(t))continue;let o=!1;for(let l of K){let u=c(e,l(a.id));if(r(u)){let e=s(t);r(e)||i(e,{recursive:!0}),n(u,t),o=!0;break}}o||console.warn(`Warning: instruction file for step "${a.id}" not found.\n Expected: ${a.instruction}\n Searched: ${K.map(e=>e(a.id)).join(`, `)}`)}}const 
q=[{name:`flow`,description:`Manage pluggable development flows`,usage:`flow <add|remove|list|info|use|update|status|start|reset> [args]`,run:async e=>{let t=e[0];if(!t){console.log(`Usage: aikit flow <add|remove|list|info|use|update|status|start|reset|runs>`),console.log(``),console.log(`Commands:`),console.log(` add <source> Install a flow from git URL or local path`),console.log(` remove <name> Remove an installed flow`),console.log(` list List all installed flows`),console.log(` info <name> Show details of a flow`),console.log(` use <name> Set active flow (start it)`),console.log(` update <name> Update a flow from its source`),console.log(` status Show current flow execution status`),console.log(` start [name] Start a flow (or resume)`),console.log(` reset Reset the active flow`),console.log(` runs List all flow runs`);return}let n=await wt();switch(t){case`add`:{let t=e[1];if(!t){console.error(`Usage: aikit flow add <source>`),console.error(` source: git URL (https://...) or local path`);return}let i=t.startsWith(`http`)||t.startsWith(`git@`)||t.endsWith(`.git`),a=l(t),s=r(a);if(!i&&!s){console.error(`Source not found: ${t}`),console.error(`Provide a git URL or existing local path.`);return}let c,u;if(i){console.log(`Cloning ${t}...`);let e=n.git.clone(t);if(!e.success||!e.data){console.error(e.error??`Failed to clone flow`);return}c=e.data,u=`git`}else{let r=e[2]||o(a);console.log(`Copying local flow from ${t}...`);let i=n.git.copyLocal(a,r);if(!i.success||!i.data){console.error(i.error??`Failed to copy local flow`);return}c=i.data,u=`local`}let d=await n.loader.load(c);if(!d.success||!d.data){console.error(d.error??`Failed to load flow`),n.git.remove(c);return}let{manifest:f,format:p}=d.data;if(Tt(c,f),f.install.length>0){console.log(`Installing ${f.install.length} dependencies...`);let e=n.git.runInstallDeps(f.install);if(!e.success){console.error(`Dependency install failed: ${e.error}`),n.git.remove(c);return}}let m=new 
Date().toISOString(),h=n.registry.register({name:f.name,version:f.version,source:t,sourceType:u,installPath:c,format:p,registeredAt:m,updatedAt:m,manifest:f});if(!h.success){console.error(h.error??`Failed to register flow`),n.git.remove(c);return}console.log(`✓ Flow "${f.name}" v${f.version} installed (${p} format)`),console.log(` Steps: ${f.steps.map(e=>e.id).join(` → `)}`);break}case`remove`:{let t=e[1];if(!t){console.error(`Usage: aikit flow remove <name>`);return}let r=n.registry.get(t);if(!r){console.error(`Flow "${t}" not found`);return}let i=n.git.remove(r.installPath);if(!i.success){console.error(i.error??`Failed to remove flow files`);return}let a=n.registry.unregister(t);if(!a.success){console.error(a.error??`Failed to unregister flow`);return}console.log(`✓ Flow "${t}" removed`);break}case`list`:{let e=n.registry.list();if(e.length===0){console.log("No flows installed. Use `aikit flow add <source>` to install one.");return}console.log(`Installed Flows:`),console.log(`─`.repeat(60));for(let t of e){let e=t.manifest.steps.map(e=>e.id).join(` → `);console.log(` ${t.name} v${t.version} (${t.sourceType}, ${t.format})`),console.log(` Steps: ${e}`)}let t=n.stateMachine.getStatus();t.success&&t.data&&(console.log(``),console.log(`Active: ${t.data.flow} (${t.data.status}, step: ${t.data.currentStep??`done`})`));break}case`info`:{let t=e[1];if(!t){console.error(`Usage: aikit flow info <name>`);return}let r=n.registry.get(t);if(!r){console.error(`Flow "${t}" not found`);return}console.log(`Flow: ${r.name}`),console.log(`Version: ${r.version}`),console.log(`Source: ${r.source} (${r.sourceType})`),console.log(`Format: ${r.format}`),console.log(`Path: ${r.installPath}`),console.log(`Registered: ${r.registeredAt}`),console.log(`Updated: ${r.updatedAt}`),console.log(``),console.log(`Steps:`);for(let e of r.manifest.steps){let t=e.requires.length?` (requires: ${e.requires.join(`, `)})`:``;console.log(` ${e.id}: ${e.name}${t}`),console.log(` Skill: 
${e.skill}`),console.log(` Produces: ${e.produces.join(`, `)}`)}if(r.manifest.install.length>0){console.log(``),console.log(`Dependencies:`);for(let e of r.manifest.install)console.log(` ${e}`)}break}case`use`:case`start`:{let r=e[1],i=r?n.registry.get(r):null;if(t===`use`&&!r){console.error(`Usage: aikit flow use <name>`);return}if(t===`start`&&!r){let e=n.stateMachine.getStatus();if(e.success&&e.data&&e.data.status===`active`){console.log(`Resuming flow: ${e.data.flow}`),console.log(`Current step: ${e.data.currentStep}`),console.log(`Completed: ${e.data.completedSteps.join(`, `)||`none`}`);return}console.error("No active flow. Use `aikit flow start <name>` to begin one.");return}if(!i){console.error(`Flow "${r}" not found. Use \`aikit flow list\` to see installed flows.`);return}let a=e[2],o=n.stateMachine.start(i.name,i.manifest,a);if(!o.success||!o.data){console.error(o.error??`Failed to start flow`);return}let s=o.data;console.log(`✓ Flow "${i.name}" started`),console.log(` Topic: ${s.topic}`),console.log(` Run directory: ${s.runDir}`),console.log(` Current step: ${s.currentStep}`),console.log(` Steps: ${i.manifest.steps.map(e=>e.id).join(` → `)}`);break}case`update`:{let t=e[1];if(!t){console.error(`Usage: aikit flow update <name>`);return}let r=n.registry.get(t);if(!r){console.error(`Flow "${t}" not found`);return}if(r.sourceType!==`git`){console.error(`Flow "${t}" is ${r.sourceType}, not updatable via git`);return}console.log(`Updating ${t}...`);let i=n.git.update(r.installPath);if(!i.success){console.error(i.error??`Failed to update flow`);return}let a=await n.loader.load(r.installPath);if(a.success&&a.data){let e=new Date().toISOString(),t=n.registry.register({...r,version:a.data.manifest.version,format:a.data.format,manifest:a.data.manifest,updatedAt:e});if(!t.success){console.error(t.error??`Failed to refresh flow registry entry`);return}}console.log(`✓ Flow "${t}" updated`);break}case`status`:{let 
e=n.stateMachine.getStatus();if(!e.success||!e.data){console.log(`No active flow.`);return}let t=e.data;if(console.log(`Flow: ${t.flow}`),console.log(`Topic: ${t.topic}`),console.log(`Slug: ${t.slug}`),console.log(`Run Dir: ${t.runDir}`),console.log(`Status: ${t.status}`),console.log(`Current Step: ${t.currentStep??`(completed)`}`),console.log(`Completed: ${t.completedSteps.join(`, `)||`none`}`),t.skippedSteps.length>0&&console.log(`Skipped: ${t.skippedSteps.join(`, `)}`),console.log(`Started: ${t.startedAt}`),console.log(`Updated: ${t.updatedAt}`),Object.keys(t.artifacts).length>0){console.log(`Artifacts:`);for(let[e,n]of Object.entries(t.artifacts))console.log(` ${e}: ${n}`)}break}case`reset`:{let e=n.stateMachine.reset();if(!e.success){console.error(e.error??`Failed to reset flow state`);return}console.log(`✓ Flow abandoned`);break}case`runs`:{let t=e[1],r=e[2],i=n.stateMachine.listRuns({flow:t,status:r});if(i.length===0){console.log(`No flow runs found.`);return}console.log(`Flow Runs:`),console.log(`─`.repeat(80));for(let e of i){let t=e.currentStep?` → ${e.currentStep}`:``;console.log(` ${e.id} [${e.status}] ${e.flow}${t}`),console.log(` Topic: ${e.topic}`),console.log(` Started: ${e.startedAt} | Updated: ${e.updatedAt}`)}break}default:console.error(`Unknown flow command: ${t}`),console.log("Use `aikit flow` for help.")}}}];let J=null;async function Y(){return J||=await Ct(),J}function Et(){return J}const Dt=[{name:`graph`,description:`Query the knowledge graph`,usage:`aikit graph <action> [options]
  Actions: stats, find-nodes, find-edges, neighbors, traverse, delete, clear
  Options: --type, --name, --node-id, --edge-type, --direction, --depth, --limit, --source-path`,run:async e=>{let t=e.shift()?.trim()??``;t||(console.error(`Usage: aikit graph <action>
  Actions: stats, find-nodes, find-edges, neighbors, traverse, delete, clear`),process.exit(1));let{graphStore:n}=await Y(),r=N(e,`--type`,``),i=N(e,`--name`,``),a=N(e,`--node-id`,``),o=N(e,`--edge-type`,``),s=N(e,`--direction`,`both`),c=M(e,`--depth`,2),l=M(e,`--limit`,50),u=N(e,`--source-path`,``),d={stats:`stats`,"find-nodes":`find_nodes`,"find-edges":`find_edges`,neighbors:`neighbors`,traverse:`traverse`,delete:`delete`,clear:`clear`}[t];d||(console.error(`Unknown graph action: ${t}`),console.error(`Actions: stats, find-nodes, find-edges, neighbors, traverse, delete, clear`),process.exit(1));let f=await ge(n,{action:d,nodeType:r||void 0,namePattern:i||void 0,sourcePath:u||void 0,nodeId:a||void 0,edgeType:o||void 0,direction:s,maxDepth:c,limit:l});if(console.log(f.summary),f.nodes&&f.nodes.length>0){console.log(`
@@ -79,8 +79,8 @@ interface FlowRegistry {
  flows: Record<string, FlowRegistryEntry>;
  }
  /** Flow execution state */
- type FlowStatus = 'idle' | 'active' | 'completed';
- /** Active flow state (persisted to state.json) */
+ type FlowStatus = 'idle' | 'active' | 'completed' | 'abandoned';
+ /** Flow execution state for the active run */
  interface FlowState {
  /** Which flow is active (name from registry) */
  flow: string;
@@ -98,6 +98,56 @@ interface FlowState {
  startedAt: string;
  /** ISO timestamp of last state change */
  updatedAt: string;
+ /** Slug identifier (directory name under .flows/) */
+ slug: string;
+ /** Absolute path to .flows/{slug}/ */
+ runDir: string;
+ /** Human-readable topic */
+ topic: string;
+ }
+ /** Persisted meta.json for a flow run under .flows/{slug}/ */
+ interface FlowRunMeta {
+ /** Slug identifier (directory name) */
+ id: string;
+ /** Flow name from registry (e.g. "aikit:advanced") */
+ flow: string;
+ /** Flow version at time of start */
+ flowVersion: string;
+ /** Human-readable topic */
+ topic: string;
+ /** Run status */
+ status: FlowStatus;
+ /** Current step ID (null if completed/abandoned) */
+ currentStep: string | null;
+ /** Completed step IDs in order */
+ completedSteps: string[];
+ /** Skipped step IDs */
+ skippedSteps: string[];
+ /** Artifacts subdirectory name from manifest (e.g. ".spec") */
+ artifactsDir: string;
+ /** Map of artifact name → relative file path */
+ artifacts: Record<string, string>;
+ /** ISO timestamp when started */
+ startedAt: string;
+ /** ISO timestamp of last change */
+ updatedAt: string;
+ }
+ /** Summary of a flow run for listing */
+ interface FlowRunSummary {
+ /** Slug identifier */
+ id: string;
+ /** Flow name */
+ flow: string;
+ /** Human-readable topic */
+ topic: string;
+ /** Run status */
+ status: FlowStatus;
+ /** Current step ID */
+ currentStep: string | null;
+ /** ISO timestamp when started */
+ startedAt: string;
+ /** ISO timestamp of last change */
+ updatedAt: string;
  }
  /** Step action for state machine transitions */
  type StepAction = 'next' | 'skip' | 'redo';
@@ -311,22 +361,37 @@ declare class FlowRegistryManager {
  //#endregion
  //#region packages/flows/src/state-machine.d.ts
  declare class FlowStateMachine {
- private readonly statePath;
- constructor(statePath: string);
- /** Load active flow state from disk, or null if no active flow */
- load(): FlowState | null;
- /** Save state to disk */
- private save;
+ private readonly flowsDir;
+ constructor(flowsDir: string);
+ /** Create a filesystem-safe slug from a human-readable topic */
+ private slugify;
+ /** Allocate a unique run slug for a topic */
+ private generateSlug;
+ /** Resolve the meta.json path for a run slug */
+ private getMetaPath;
+ /** Read a run meta.json from disk */
+ private readMeta;
+ /** Persist a run meta.json to disk */
+ private writeMeta;
+ /** Find the currently active run, if any */
+ private findActiveRun;
+ /** Convert persisted run metadata into the public FlowState shape */
+ private metaToState;
  /** Start a new flow */
- start(flowName: string, manifest: FlowManifest): FlowResult<FlowState>;
+ start(flowName: string, manifest: FlowManifest, topic?: string): FlowResult<FlowState>;
  /** Advance the flow: next, skip, or redo current step */
  step(action: StepAction, manifest: FlowManifest): FlowResult<FlowState>;
  /** Get current flow status */
  getStatus(): FlowResult<FlowState>;
- /** Reset flow state (clear active flow) */
+ /** Reset flow state by abandoning the active run */
  reset(): FlowResult;
  /** Record an artifact produced by a step */
  recordArtifact(name: string, path: string): FlowResult;
+ /** List flow runs, optionally filtered by flow name or status */
+ listRuns(filter?: {
+ flow?: string;
+ status?: string;
+ }): FlowRunSummary[];
  }
  //#endregion
  //#region packages/flows/src/symlinks.d.ts
@@ -343,4 +408,4 @@ declare class SymlinkManager {
  private getAgentStem;
  }
  //#endregion
- export { type BuiltinFlow, ClaudePluginAdapter, type ContentTransformFn, CopilotAdapter, type FlowFormat, type FlowFormatAdapter, FlowLoader, type FlowManifest, type FlowParseOptions, type FlowRegistry, type FlowRegistryEntry, FlowRegistryManager, type FlowResult, type FlowSourceType, type FlowState, FlowStateMachine, type FlowStatus, type FlowStep, FoundationIntegration, GitInstaller, NativeAdapter, OpenSpecAdapter, type StepAction, SymlinkManager, getBuiltinFlows };
+ export { type BuiltinFlow, ClaudePluginAdapter, type ContentTransformFn, CopilotAdapter, type FlowFormat, type FlowFormatAdapter, FlowLoader, type FlowManifest, type FlowParseOptions, type FlowRegistry, type FlowRegistryEntry, FlowRegistryManager, type FlowResult, type FlowRunMeta, type FlowRunSummary, type FlowSourceType, type FlowState, FlowStateMachine, type FlowStatus, type FlowStep, FoundationIntegration, GitInstaller, NativeAdapter, OpenSpecAdapter, type StepAction, SymlinkManager, getBuiltinFlows };
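The new run model introduced by these declarations can be sketched as a standalone TypeScript snippet. This is an illustrative model only: the sample runs and the in-memory `listRuns` helper below are hypothetical, mirroring the exported `FlowRunSummary` shape and the `listRuns(filter?)` signature; the package's actual `FlowStateMachine` reads per-run `meta.json` files under `.flows/{slug}/` rather than an in-memory array.

```typescript
// Run status now includes 'abandoned' (set by `aikit flow reset`).
type FlowStatus = "idle" | "active" | "completed" | "abandoned";

// Shape taken from the FlowRunSummary declaration above.
interface FlowRunSummary {
  id: string;
  flow: string;
  topic: string;
  status: FlowStatus;
  currentStep: string | null;
  startedAt: string;
  updatedAt: string;
}

// Mirrors FlowStateMachine.listRuns(filter?): both filter fields optional,
// an omitted field matches every run.
function listRuns(
  runs: FlowRunSummary[],
  filter?: { flow?: string; status?: string },
): FlowRunSummary[] {
  return runs.filter(
    (r) =>
      (!filter?.flow || r.flow === filter.flow) &&
      (!filter?.status || r.status === filter.status),
  );
}

// Hypothetical sample data illustrating two runs of the same flow.
const runs: FlowRunSummary[] = [
  {
    id: "auth-refactor",
    flow: "aikit:advanced",
    topic: "Auth refactor",
    status: "active",
    currentStep: "plan",
    startedAt: "2025-01-01T00:00:00Z",
    updatedAt: "2025-01-01T01:00:00Z",
  },
  {
    id: "docs-pass",
    flow: "aikit:advanced",
    topic: "Docs pass",
    status: "abandoned",
    currentStep: null,
    startedAt: "2025-01-02T00:00:00Z",
    updatedAt: "2025-01-02T00:30:00Z",
  },
];

console.log(listRuns(runs, { status: "active" }).map((r) => r.id));
// → [ 'auth-refactor' ]
```

This matches the CLI behavior where `aikit flow runs [flow] [status]` lists runs and optionally narrows by flow name and status.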
@@ -7,9 +7,9 @@ You are operating within the @vpxa/aikit framework.
  - Use built-in skills (brainstorming, requirements-clarity, etc.) as needed
  - Track state via the flow state machine (flow_status, flow_step)
  - Active flow: ${e.name} v${e.version}
- - Artifacts go in: ${t}/${e.name}/
+ - Artifacts go in: .flows/{topic-slug}/${t}/ (resolved at runtime via flow_start)
  ${j}`}var N=class{injectPreamble(e,t){let r=M(t);if(!n(e)){u(e,`${r}\n`,`utf-8`);return}let a=i(e,`utf-8`),o=a.indexOf(A),s=a.indexOf(j);if(o!==-1&&s!==-1){u(e,`${a.slice(0,o)}${r}${a.slice(s+31)}`,`utf-8`);return}u(e,`${r}\n\n${a}`,`utf-8`)}removePreamble(e){if(!n(e))return;let t=i(e,`utf-8`),r=t.indexOf(A),a=t.indexOf(j);r===-1||a===-1||u(e,(t.slice(0,r)+t.slice(a+31)).replace(/^\n+/,``).replace(/\n{3,}/g,`
 
  `),`utf-8`)}getTargetFiles(e){let t=[],r=p(e,`.github`,`copilot-instructions.md`);n(r)&&t.push(r);let i=p(e,`CLAUDE.md`);n(i)&&t.push(i);let a=p(e,`AGENTS.md`);return n(a)&&t.push(a),t}};function P(e){if(!e||typeof e!=`object`||!(`stderr`in e))return``;let t=e.stderr;return Buffer.isBuffer(t)?t.toString().trim():typeof t==`string`?t.trim():``}function F(e){try{return new URL(e).hostname}catch{}return e.match(/^[^@]+@([^:]+):/)?.[1]??`<host>`}function I(e,t,n){let r=[`Git operation failed for: ${e}`],i=F(e),a=t.includes(`ETIMEDOUT`)||t.includes(`SIGTERM`)||t.toLowerCase().includes(`timed out`),o=n.includes(`Authentication failed`)||n.includes(`could not read Username`)||n.includes(`terminal prompts disabled`)||n.includes(`401`)||n.includes(`403`)||n.includes(`Permission denied`),s=n.includes(`SSL certificate`)||n.includes(`unable to access`)&&n.includes(`SSL`),c=n.includes(`Could not resolve host`)||n.includes(`Name or service not known`),l=n.includes(`SAML`)||n.includes(`single sign-on`);return o||l?(r.push(``),r.push(`Cause: Authentication required or credentials not configured.`),r.push(``),r.push(`To fix this, ensure git can access this repository:`),r.push(``),r.push(` Option 1 - SSH key:`),r.push(` 1. Generate an SSH key: ssh-keygen -t ed25519`),r.push(` 2. Add the public key to your Git hosting account`),r.push(` 3. Use the SSH URL instead: git@<host>:<org>/<repo>.git`),r.push(``),r.push(` Option 2 - Personal Access Token (PAT):`),r.push(` 1. Create a PAT in your Git hosting Settings > Developer settings > Tokens`),r.push(` 2. For GitHub Enterprise with SAML SSO, authorize the token for your org`),r.push(` 3. Configure git credentials:`),r.push(` git config --global credential.helper store`),r.push(` git clone ${e} (enter PAT as password)`),r.push(``),r.push(` Option 3 - Git Credential Manager:`),r.push(` 1. Install: https://github.com/git-ecosystem/git-credential-manager`),r.push(` 2. Run: git clone ${e}`),r.push(` 3. 
Follow the browser-based auth prompt`),r.push(``),r.push(`After configuring credentials, retry: aikit flow add ${e}`)):a?(r.push(``),r.push(`Cause: Connection timed out - the server did not respond within 60 seconds.`),r.push(``),r.push(`Possible reasons:`),r.push(` - The repository requires VPN access. Ensure your VPN is connected`),r.push(` - The host is behind a firewall or corporate proxy`),r.push(` - The URL may be incorrect`),r.push(``),r.push(`Diagnostics:`),r.push(` 1. Verify the URL is correct: ${e}`),r.push(` 2. Test connectivity: git ls-remote ${e}`),r.push(` 3. If behind a proxy, configure git:`),r.push(` git config --global http.proxy http://proxy:port`),r.push(``),r.push(`If this is a corporate/internal host, you may need to:`),r.push(` - Connect to the corporate VPN`),r.push(` - Add the host to your SSH config or git config`),r.push(` - Ask your IT team to allowlist your machine`),r.push(``),r.push(`After resolving, retry: aikit flow add ${e}`)):s?(r.push(``),r.push(`Cause: SSL/TLS certificate verification failed.`),r.push(``),r.push(`This often happens with corporate proxies or self-signed certificates.`),r.push(``),r.push(`To fix:`),r.push(` 1. If your company uses a custom CA, add it:`),r.push(` git config --global http.sslCAInfo /path/to/ca-bundle.crt`),r.push(` 2. As a last resort (not recommended for production):`),r.push(` git config --global http.sslVerify false`),r.push(``),r.push(`After resolving, retry: aikit flow add ${e}`)):c?(r.push(``),r.push(`Cause: Cannot resolve hostname.`),r.push(``),r.push(`Check:`),r.push(` 1. Is the URL correct? ${e}`),r.push(` 2. Are you connected to the internet/VPN?`),r.push(` 3. Can you resolve the host? ping ${i}`),r.push(``),r.push(`After resolving, retry: aikit flow add ${e}`)):(r.push(``),r.push(`Error: ${t}`),n&&r.push(`Details: ${n}`),r.push(``),r.push(`Troubleshooting:`),r.push(` 1. Verify the URL is a valid git repository: git ls-remote ${e}`),r.push(` 2. 
Check your git credentials and network connectivity`),r.push(` 3. If the repo requires auth, configure credentials first (PAT or SSH key)`),r.push(``),r.push(`After resolving, retry: aikit flow add ${e}`)),r.join(`
  `)}function L(){try{return g(`git`,[`credential-manager`,`--version`],{stdio:`pipe`,timeout:5e3}).status===0}catch{return!1}}function R(e,t){let n=`${e}\n${t}`.toLowerCase();return n.includes(`authentication failed`)||n.includes(`could not read username`)||n.includes(`saml sso`)||n.includes(`terminal prompts disabled`)||n.includes(`host key verification failed`)||n.includes(`permission denied`)||n.includes(`403`)||n.includes(`401`)}var z=class{constructor(e){this.flowsDir=e}clone(e,t){let r=this.repoNameFromUrl(e),i=p(this.flowsDir,r);if(n(i))if(!n(p(i,`.git`)))o(i,{recursive:!0,force:!0});else return{success:!1,error:`Flow "${r}" already installed at ${i}. Use update instead.`};try{if(this.ensureFlowsDir(),t){let n=process.platform===`win32`?`bat`:`sh`,r=p(v(),`git-askpass-${Date.now()}.${n}`);process.platform===`win32`?u(r,`@echo ${t}`,{mode:448}):u(r,`#!/bin/sh\necho "${t}"`,{mode:448});try{let t=g(`git`,[`clone`,`--depth`,`1`,e,i],{stdio:`pipe`,timeout:6e4,env:{...process.env,GIT_TERMINAL_PROMPT:`0`,GIT_ASKPASS:r}});if(t.status!==0){let e=t.stderr?.toString().trim()??``;throw Error(e||t.error?.message||`git clone failed`)}}finally{try{l(r)}catch{}}}else{let t=g(`git`,[`clone`,`--depth`,`1`,e,i],{stdio:`pipe`,timeout:6e4,env:{...process.env,GIT_TERMINAL_PROMPT:`0`,GIT_ASKPASS:``}});if(t.status!==0){let e=t.stderr?.toString().trim()??``;throw Error(e||t.error?.message||`git clone failed`)}}return{success:!0,data:i}}catch(r){n(i)&&o(i,{recursive:!0,force:!0});let a=P(r),s=r instanceof Error?r.message:String(r);if(!t&&R(s,a)&&L()){let t=F(e)||e;console.log(`\nAuthentication required for ${t}.\nGit Credential Manager detected - opening browser for login...\n(If a browser window does not open, check your terminal for a device 
code.)\n`);try{if(this.ensureFlowsDir(),g(`git`,[`clone`,`--depth`,`1`,e,i],{stdio:`inherit`,timeout:12e4,env:{...process.env,GIT_TERMINAL_PROMPT:`1`}}).status===0)return{success:!0,data:i};n(i)&&o(i,{recursive:!0,force:!0})}catch{n(i)&&o(i,{recursive:!0,force:!0})}}return{success:!1,error:I(e,s,a)}}}update(e){if(!n(e))return{success:!1,error:`Install path not found: ${e}`};try{return h(`git pull --ff-only`,{cwd:e,stdio:`pipe`,timeout:6e4,env:{...process.env,GIT_TERMINAL_PROMPT:`0`,GIT_ASKPASS:``}}),{success:!0}}catch(t){let n=e;try{n=h(`git remote get-url origin`,{cwd:e,stdio:`pipe`,timeout:1e4}).toString().trim()}catch{}let r=P(t),i=t instanceof Error?t.message:String(t);if(R(i,r)&&L()){let t=F(n)||n;console.log(`\nAuthentication required for ${t}.\nGit Credential Manager detected - opening browser for login...\n`);try{if(g(`git`,[`pull`,`--ff-only`],{cwd:e,stdio:`inherit`,timeout:12e4,env:{...process.env,GIT_TERMINAL_PROMPT:`1`}}).status===0)return{success:!0}}catch{}}return{success:!1,error:I(n,i,r)}}}copyLocal(e,r){let i=p(this.flowsDir,r);if(n(i))return{success:!1,error:`Flow "${r}" already installed at ${i}`};try{return this.ensureFlowsDir(),t(e,i,{recursive:!0}),{success:!0,data:i}}catch(e){return{success:!1,error:`Copy failed: ${e instanceof Error?e.message:String(e)}`}}}remove(e){if(!n(e))return{success:!0};try{return o(e,{recursive:!0,force:!0}),{success:!0}}catch(e){return{success:!1,error:`Remove failed: ${e instanceof Error?e.message:String(e)}`}}}runInstallDeps(e){for(let t of e)try{if(t.startsWith(`npm:`)){h(`npx skills add ${t.slice(4)} -g`,{stdio:`pipe`,timeout:12e4});continue}if(t.endsWith(`.git`)||t.includes(`github.com`)){h(`npx skills add ${t} -g`,{stdio:`pipe`,timeout:12e4});continue}return{success:!1,error:`Unknown install entry format: ${t}`}}catch(e){return{success:!1,error:`Install dependency failed for "${t}": ${e instanceof Error?e.message:String(e)}`}}return{success:!0}}getLocalCommit(e){try{return h(`git rev-parse 
HEAD`,{cwd:e,stdio:`pipe`,timeout:1e4,env:{...process.env,GIT_TERMINAL_PROMPT:`0`}}).toString().trim()||null}catch{return null}}getRemoteCommit(e){try{return h(`git ls-remote origin HEAD`,{cwd:e,stdio:`pipe`,timeout:3e4,env:{...process.env,GIT_TERMINAL_PROMPT:`0`}}).toString().trim().split(/\s+/)[0]||null}catch{return null}}hasUpdates(e){let t=this.getLocalCommit(e);if(!t)return{success:!1,error:`Could not determine local commit (not a git repo?)`};let n=this.getRemoteCommit(e);return n?{success:!0,data:{localCommit:t,remoteCommit:n,hasUpdates:t!==n}}:{success:!1,error:`Could not reach remote to check for updates`}}repoNameFromUrl(e){return d(e).replace(/\.git$/,``)}ensureFlowsDir(){n(this.flowsDir)||r(this.flowsDir,{recursive:!0})}};const B=[new T,new C,new w,new E];function V(e){let t=B.find(t=>t.format===e);if(!t)throw Error(`No adapter for format: ${e}`);return t}function H(e){for(let t of B)if(t.detect(e))return t;return null}var U=class{async load(e,t){if(!n(e))return{success:!1,error:`Source directory not found: ${e}`};let r=H(e);if(!r)return{success:!1,error:`No format adapter matches source: ${e}`};try{let n=await r.parse(e,t),i=this.validate(n);return i.success?{success:!0,data:{manifest:n,format:r.format}}:i}catch(e){return{success:!1,error:`Failed to parse flow: ${e instanceof Error?e.message:String(e)}`}}}async loadWithFormat(e,t,n){let r=V(t);try{let t=await r.parse(e,n),i=this.validate(t);return i.success?{success:!0,data:t}:i}catch(e){return{success:!1,error:`Failed to parse flow: ${e instanceof Error?e.message:String(e)}`}}}validate(e){let t=[];e.name?.trim()||t.push(`Missing flow name`),e.version?.trim()||t.push(`Missing flow version`),e.steps?.length||t.push(`Flow must have at least one step`);let n=new Set(e.steps.map(e=>e.id));for(let r of e.steps??[]){r.id?.trim()||t.push(`Step missing id`),r.instruction?.trim()||t.push(`Step "${r.id}" missing instruction path`);for(let e of r.requires??[])n.has(e)||t.push(`Step "${r.id}" requires unknown step 
"${e}"`)}return n.size!==(e.steps?.length??0)&&t.push(`Duplicate step IDs found`),t.length>0?{success:!1,error:`Validation failed:\n${t.join(`
- `)}`}:{success:!0}}};function W(){return k().map(e=>({name:e.manifest.name,version:e.manifest.version,source:`builtin`,sourceType:`builtin`,installPath:e.scaffoldDir,format:`native`,registeredAt:`1970-01-01T00:00:00.000Z`,updatedAt:`1970-01-01T00:00:00.000Z`,manifest:e.manifest}))}var G=class{constructor(e){this.registryPath=e}load(){if(!n(this.registryPath))return{version:1,flows:{}};try{let e=i(this.registryPath,`utf-8`);return JSON.parse(e)}catch{return{version:1,flows:{}}}}save(e){let t=f(this.registryPath);n(t)||r(t,{recursive:!0}),u(this.registryPath,JSON.stringify(e,null,2),`utf-8`)}register(e){let t=this.load();return t.flows[e.name]=e,this.save(t),{success:!0}}unregister(e){let t=this.load();return t.flows[e]?(delete t.flows[e],this.save(t),{success:!0}):{success:!1,error:`Flow "${e}" not found in registry`}}get(e){return this.load().flows[e]||(W().find(t=>t.name===e)??null)}list(){let e=this.load(),t=new Set(Object.keys(e.flows)),n=Object.values(e.flows);for(let e of W())t.has(e.name)||n.push(e);return n}has(e){return e in this.load().flows?!0:W().some(t=>t.name===e)}},K=class{constructor(e){this.statePath=e}load(){if(!n(this.statePath))return null;try{let e=i(this.statePath,`utf-8`);return JSON.parse(e)}catch{return null}}save(e){let t=f(this.statePath);n(t)||r(t,{recursive:!0}),u(this.statePath,JSON.stringify(e,null,2),`utf-8`)}start(e,t){let n=this.load();if(n&&n.status===`active`)return{success:!1,error:`Flow "${n.flow}" is already active. 
Reset it first.`};if(!t.steps.length)return{success:!1,error:`Flow has no steps`};let r=new Date().toISOString(),i={flow:e,status:`active`,currentStep:t.steps[0].id,completedSteps:[],skippedSteps:[],artifacts:{},startedAt:r,updatedAt:r};return this.save(i),{success:!0,data:i}}step(e,t){let n=this.load();if(!n)return{success:!1,error:`No active flow`};if(n.status!==`active`)return{success:!1,error:`Flow is ${n.status}, not active`};if(!n.currentStep)return{success:!1,error:`No current step`};let r=t.steps.findIndex(e=>e.id===n.currentStep);if(r===-1)return{success:!1,error:`Current step "${n.currentStep}" not found in manifest`};let i=new Date().toISOString();switch(e){case`next`:{n.completedSteps.includes(n.currentStep)||n.completedSteps.push(n.currentStep);let e=r+1;e>=t.steps.length?(n.currentStep=null,n.status=`completed`):n.currentStep=t.steps[e].id;break}case`skip`:{n.skippedSteps.includes(n.currentStep)||n.skippedSteps.push(n.currentStep);let e=r+1;e>=t.steps.length?(n.currentStep=null,n.status=`completed`):n.currentStep=t.steps[e].id;break}case`redo`:n.completedSteps=n.completedSteps.filter(e=>e!==n.currentStep),n.skippedSteps=n.skippedSteps.filter(e=>e!==n.currentStep);break}return n.updatedAt=i,this.save(n),{success:!0,data:n}}getStatus(){let e=this.load();return e?{success:!0,data:e}:{success:!1,error:`No active flow`}}reset(){return n(this.statePath)&&l(this.statePath),{success:!0}}recordArtifact(e,t){let n=this.load();return n?(n.artifacts[e]=t,n.updatedAt=new Date().toISOString(),this.save(n),{success:!0}):{success:!1,error:`No active flow`}}},q=class{createSymlinks(t,i,a,o){let s=this.getTargets(t,i);for(let t of s){n(t.baseDir)||r(t.baseDir,{recursive:!0});for(let r of o.agents){let i=p(a,r);if(!n(i))continue;let o=this.getAgentStem(r),s=p(t.baseDir,`${o}${t.extension}`);n(s)&&l(s);let u=m(f(s),i);try{c(u,s,`file`)}catch{try{e(i,s)}catch(e){console.warn(`Failed to create symlink or copy fallback for ${i}: ${e instanceof 
Error?e.message:String(e)}`)}}}}}removeSymlinks(e,t){let r=this.getTargets(e,t);for(let e of r)if(n(e.baseDir))try{let t=a(e.baseDir,{withFileTypes:!0});for(let n of t)!n.isFile()&&!n.isSymbolicLink()||l(p(e.baseDir,n.name));a(e.baseDir).length===0&&s(e.baseDir)}catch{}}getTargets(e,t){return[{ide:`copilot`,baseDir:p(e,`.github`,`agents`,`flows`,t),extension:`.agent.md`},{ide:`claude-code`,baseDir:p(e,`.claude`,`agents`,`flows`,t),extension:`.md`}]}getAgentStem(e){return d(e).replace(/\.agent\.md$/,``).replace(/\.md$/,``)}};export{C as ClaudePluginAdapter,w as CopilotAdapter,U as FlowLoader,G as FlowRegistryManager,K as FlowStateMachine,N as FoundationIntegration,z as GitInstaller,T as NativeAdapter,E as OpenSpecAdapter,q as SymlinkManager,k as getBuiltinFlows};
+ `)}`}:{success:!0}}};function W(){return k().map(e=>({name:e.manifest.name,version:e.manifest.version,source:`builtin`,sourceType:`builtin`,installPath:e.scaffoldDir,format:`native`,registeredAt:`1970-01-01T00:00:00.000Z`,updatedAt:`1970-01-01T00:00:00.000Z`,manifest:e.manifest}))}var G=class{constructor(e){this.registryPath=e}load(){if(!n(this.registryPath))return{version:1,flows:{}};try{let e=i(this.registryPath,`utf-8`);return JSON.parse(e)}catch{return{version:1,flows:{}}}}save(e){let t=f(this.registryPath);n(t)||r(t,{recursive:!0}),u(this.registryPath,JSON.stringify(e,null,2),`utf-8`)}register(e){let t=this.load();return t.flows[e.name]=e,this.save(t),{success:!0}}unregister(e){let t=this.load();return t.flows[e]?(delete t.flows[e],this.save(t),{success:!0}):{success:!1,error:`Flow "${e}" not found in registry`}}get(e){return this.load().flows[e]||(W().find(t=>t.name===e)??null)}list(){let e=this.load(),t=new Set(Object.keys(e.flows)),n=Object.values(e.flows);for(let e of W())t.has(e.name)||n.push(e);return n}has(e){return e in this.load().flows?!0:W().some(t=>t.name===e)}},K=class{constructor(e){this.flowsDir=e}slugify(e){return e.toLowerCase().replace(/[^a-z0-9]+/g,`-`).replace(/-+/g,`-`).replace(/^-|-$/g,``)||`flow`}generateSlug(e){let t=this.slugify(e);if(!n(p(this.flowsDir,t)))return t;for(let e=2;e<=100;e+=1){let r=`${t}-${e}`;if(!n(p(this.flowsDir,r)))return r}throw Error(`Unable to allocate a flow run slug for topic "${e}"`)}getMetaPath(e){return p(this.flowsDir,e,`meta.json`)}readMeta(e){let t=this.getMetaPath(e);if(!n(t))return null;try{let e=i(t,`utf-8`);return JSON.parse(e)}catch{return null}}writeMeta(e,t){let i=p(this.flowsDir,e);n(i)||r(i,{recursive:!0}),u(this.getMetaPath(e),JSON.stringify(t,null,2),`utf-8`)}findActiveRun(){if(!n(this.flowsDir))return null;for(let e of a(this.flowsDir,{withFileTypes:!0})){if(!e.isDirectory())continue;let t=this.readMeta(e.name);if(t?.status===`active`)return{slug:e.name,meta:t}}return 
null}metaToState(e,t){return{flow:t.flow,status:t.status,currentStep:t.currentStep,completedSteps:t.completedSteps,skippedSteps:t.skippedSteps,artifacts:t.artifacts,startedAt:t.startedAt,updatedAt:t.updatedAt,slug:e,runDir:p(this.flowsDir,e),topic:t.topic}}start(e,t,n){let i=this.findActiveRun();if(i)return{success:!1,error:`Flow "${i.meta.flow}" is already active. Reset it first.`};if(!t.steps.length)return{success:!1,error:`Flow has no steps`};try{let i=(n??t.description)||e,a=this.generateSlug(i),o=p(this.flowsDir,a),s=new Date().toISOString(),c={id:a,flow:e,flowVersion:t.version,topic:i,status:`active`,currentStep:t.steps[0].id,completedSteps:[],skippedSteps:[],artifactsDir:t.artifacts_dir,artifacts:{},startedAt:s,updatedAt:s};return r(o,{recursive:!0}),r(p(o,t.artifacts_dir),{recursive:!0}),this.writeMeta(a,c),{success:!0,data:this.metaToState(a,c)}}catch(e){return{success:!1,error:e instanceof Error?e.message:String(e)}}}step(e,t){let n=this.findActiveRun();if(!n)return{success:!1,error:`No active flow`};if(n.meta.status!==`active`)return{success:!1,error:`Flow is ${n.meta.status}, not active`};if(!n.meta.currentStep)return{success:!1,error:`No current step`};let r=n.meta.currentStep,i=t.steps.findIndex(e=>e.id===r);if(i===-1)return{success:!1,error:`Current step "${n.meta.currentStep}" not found in manifest`};let a=new Date().toISOString(),o=n.meta;switch(e){case`next`:{o.completedSteps.includes(r)||o.completedSteps.push(r);let e=i+1;e>=t.steps.length?(o.currentStep=null,o.status=`completed`):o.currentStep=t.steps[e].id;break}case`skip`:{o.skippedSteps.includes(r)||o.skippedSteps.push(r);let e=i+1;e>=t.steps.length?(o.currentStep=null,o.status=`completed`):o.currentStep=t.steps[e].id;break}case`redo`:o.completedSteps=o.completedSteps.filter(e=>e!==r),o.skippedSteps=o.skippedSteps.filter(e=>e!==r);break}return o.updatedAt=a,this.writeMeta(n.slug,o),{success:!0,data:this.metaToState(n.slug,o)}}getStatus(){let e=this.findActiveRun();return 
e?{success:!0,data:this.metaToState(e.slug,e.meta)}:{success:!1,error:`No active flow`}}reset(){let e=this.findActiveRun();return e?(e.meta.status=`abandoned`,e.meta.currentStep=null,e.meta.updatedAt=new Date().toISOString(),this.writeMeta(e.slug,e.meta),{success:!0}):{success:!0}}recordArtifact(e,t){let n=this.findActiveRun();return n?(n.meta.artifacts[e]=t,n.meta.updatedAt=new Date().toISOString(),this.writeMeta(n.slug,n.meta),{success:!0}):{success:!1,error:`No active flow`}}listRuns(e){if(!n(this.flowsDir))return[];let t=[];for(let n of a(this.flowsDir,{withFileTypes:!0})){if(!n.isDirectory())continue;let r=this.readMeta(n.name);r&&(e?.flow&&r.flow!==e.flow||e?.status&&r.status!==e.status||t.push({id:n.name,flow:r.flow,topic:r.topic,status:r.status,currentStep:r.currentStep,startedAt:r.startedAt,updatedAt:r.updatedAt}))}return t.sort((e,t)=>t.updatedAt.localeCompare(e.updatedAt))}},q=class{createSymlinks(t,i,a,o){let s=this.getTargets(t,i);for(let t of s){n(t.baseDir)||r(t.baseDir,{recursive:!0});for(let r of o.agents){let i=p(a,r);if(!n(i))continue;let o=this.getAgentStem(r),s=p(t.baseDir,`${o}${t.extension}`);n(s)&&l(s);let u=m(f(s),i);try{c(u,s,`file`)}catch{try{e(i,s)}catch(e){console.warn(`Failed to create symlink or copy fallback for ${i}: ${e instanceof Error?e.message:String(e)}`)}}}}}removeSymlinks(e,t){let r=this.getTargets(e,t);for(let e of r)if(n(e.baseDir))try{let t=a(e.baseDir,{withFileTypes:!0});for(let n of t)!n.isFile()&&!n.isSymbolicLink()||l(p(e.baseDir,n.name));a(e.baseDir).length===0&&s(e.baseDir)}catch{}}getTargets(e,t){return[{ide:`copilot`,baseDir:p(e,`.github`,`agents`,`flows`,t),extension:`.agent.md`},{ide:`claude-code`,baseDir:p(e,`.claude`,`agents`,`flows`,t),extension:`.md`}]}getAgentStem(e){return d(e).replace(/\.agent\.md$/,``).replace(/\.md$/,``)}};export{C as ClaudePluginAdapter,w as CopilotAdapter,U as FlowLoader,G as FlowRegistryManager,K as FlowStateMachine,N as FoundationIntegration,z as GitInstaller,T as NativeAdapter,E as 
OpenSpecAdapter,q as SymlinkManager,k as getBuiltinFlows};
@@ -1 +1 @@
- import{t as e}from"./curated-manager-DX-_oJg0.js";import{readFileSync as t}from"node:fs";import{dirname as n,resolve as r}from"node:path";import{fileURLToPath as i}from"node:url";import{parseArgs as a}from"node:util";import{createLogger as o,serializeError as s}from"../../core/dist/index.js";const c=n(i(import.meta.url)),l=(()=>{try{let e=r(c,`..`,`..`,`..`,`package.json`);return JSON.parse(t(e,`utf-8`)).version??`0.0.0`}catch{return`0.0.0`}})(),u=o(`server`),{values:d}=a({options:{transport:{type:`string`,default:process.env.AIKIT_TRANSPORT??`stdio`},port:{type:`string`,default:process.env.AIKIT_PORT??`3210`}}});async function f(){if(process.on(`unhandledRejection`,e=>{u.error(`Unhandled rejection`,s(e))}),u.info(`Starting MCP AI Kit server`,{version:l}),d.transport===`http`){let[{default:e},{loadConfig:t,resolveIndexMode:n},{registerDashboardRoutes:r,resolveDashboardDir:i}]=await Promise.all([import(`express`),import(`./config-C5IU9Lau.js`),import(`./dashboard-static-BfIe0Si1.js`)]),a=t();u.info(`Config loaded`,{sourceCount:a.sources.length,storePath:a.store.path});let o=e();o.use(e.json());let c=Number(d.port);o.use((e,t,n)=>{if(t.setHeader(`Access-Control-Allow-Origin`,process.env.AIKIT_CORS_ORIGIN??`http://localhost:${c}`),t.setHeader(`Access-Control-Allow-Methods`,`GET, POST, DELETE, OPTIONS`),t.setHeader(`Access-Control-Allow-Headers`,`Content-Type, Authorization`),e.method===`OPTIONS`){t.status(204).end();return}n()}),r(o,i(),u),o.get(`/health`,(e,t)=>{t.json({status:`ok`})});let l=!1,f=null,p=null,m=null,h=Promise.resolve();o.post(`/mcp`,async(e,t)=>{if(!l||!p||!m){t.status(503).json({jsonrpc:`2.0`,error:{code:-32603,message:`Server initializing — please retry in a few seconds`},id:null});return}let n=h,r;h=new Promise(e=>{r=e}),await n;try{let n=new m({sessionIdGenerator:void 0});await p.connect(n),await n.handleRequest(e,t,e.body),n.close()}catch(e){if(u.error(`MCP handler error`,s(e)),!t.headersSent){let n=e instanceof 
Error?e.message:String(e),r=n.includes(`Not Acceptable`);t.status(r?406:500).json({jsonrpc:`2.0`,error:{code:r?-32e3:-32603,message:r?n:`Internal server error`},id:null})}}finally{r()}}),o.get(`/mcp`,(e,t)=>{t.writeHead(405).end(JSON.stringify({jsonrpc:`2.0`,error:{code:-32e3,message:`Method not allowed.`},id:null}))}),o.delete(`/mcp`,(e,t)=>{t.writeHead(405).end(JSON.stringify({jsonrpc:`2.0`,error:{code:-32e3,message:`Method not allowed.`},id:null}))});let g=o.listen(c,`127.0.0.1`,()=>{u.info(`MCP server listening`,{url:`http://127.0.0.1:${c}/mcp`,port:c}),setTimeout(async()=>{try{let[{createLazyServer:e,ALL_TOOL_NAMES:t},{StreamableHTTPServerTransport:r},{checkForUpdates:i,autoUpgradeScaffold:o}]=await Promise.all([import(`./server-BlBBxAQO.js`),import(`@modelcontextprotocol/sdk/server/streamableHttp.js`),import(`./version-check-D4j0Pykd.js`)]);i(),o();let c=n(a),d=e(a,c);p=d.server,m=r,l=!0,u.info(`MCP server configured (lazy — AI Kit initializing in background)`,{toolCount:t.length,resourceCount:2}),d.startInit(),c===`auto`?d.ready.then(async()=>{try{let e=a.sources.map(e=>e.path).join(`, `);u.info(`Running initial index`,{sourcePaths:e}),await d.runInitialIndex(),u.info(`Initial index complete`)}catch(e){u.error(`Initial index failed; will retry on aikit_reindex`,s(e))}}).catch(e=>u.error(`AI Kit init or indexing failed`,s(e))):c===`smart`?d.ready.then(async()=>{try{if(!d.kb)throw Error(`AI Kit components are not available after initialization`);let{SmartIndexScheduler:e}=await import(`../../indexer/dist/index.js`);f=new e(d.kb.indexer,a,d.kb.store),f.start(),d.setSmartScheduler(f),u.info(`Smart index scheduler started (HTTP mode)`)}catch(e){u.error(`Failed to start smart index scheduler`,s(e))}}).catch(e=>u.error(`AI Kit initialization failed`,s(e))):(d.ready.catch(e=>u.error(`AI Kit initialization failed`,s(e))),u.info(`Initial full indexing skipped in HTTP mode`,{indexMode:c}))}catch(e){u.error(`Failed to load server modules`,s(e))}},100)}),_=async 
e=>{u.info(`Shutdown signal received`,{signal:e}),f?.stop(),g.close(),p&&await p.close(),process.exit(0)};process.on(`SIGINT`,()=>_(`SIGINT`)),process.on(`SIGTERM`,()=>_(`SIGTERM`))}else{let[{loadConfig:e,reconfigureForWorkspace:t,resolveIndexMode:n},{createLazyServer:r},{checkForUpdates:a,autoUpgradeScaffold:o},{RootsListChangedNotificationSchema:c}]=await Promise.all([import(`./config-C5IU9Lau.js`),import(`./server-BlBBxAQO.js`),import(`./version-check-D4j0Pykd.js`),import(`@modelcontextprotocol/sdk/types.js`)]),l=e();u.info(`Config loaded`,{sourceCount:l.sources.length,storePath:l.store.path}),a(),o();let d=n(l),f=r(l,d),{server:p,startInit:m,ready:h,runInitialIndex:g}=f,{StdioServerTransport:_}=await import(`@modelcontextprotocol/sdk/server/stdio.js`),v=new _;await p.connect(v),u.info(`MCP server started`,{transport:`stdio`});let y=e=>{if(e.length===0)return!1;let n=e[0].uri,r=n.startsWith(`file://`)?i(n):n;return u.info(`MCP roots resolved`,{rootUri:n,rootPath:r,rootCount:e.length}),t(l,r),!0},b=!1;try{b=y((await p.server.listRoots()).roots),b||u.info(`No MCP roots yet; waiting for roots/list_changed notification`)}catch(e){u.warn(`MCP roots/list not supported by client; using cwd fallback`,{cwd:process.cwd(),...s(e)}),b=!0}b||=await new Promise(e=>{let t=setTimeout(()=>{u.warn(`Timed out waiting for MCP roots/list_changed; using cwd fallback`,{cwd:process.cwd()}),e(!1)},5e3);p.server.setNotificationHandler(c,async()=>{clearTimeout(t);try{e(y((await p.server.listRoots()).roots))}catch(t){u.warn(`roots/list retry failed after notification`,s(t)),e(!1)}})}),m();let x=null,S=()=>{x&&clearTimeout(x),x=setTimeout(()=>{u.info(`Auto-shutdown: no activity for 30 minutes — exiting`),process.exit(0)},18e5),x.unref&&x.unref()};S(),process.stdin.on(`data`,()=>S()),h.catch(e=>{u.error(`Initialization failed — server will continue with limited tools`,s(e))}),d===`auto`?g().catch(e=>u.error(`Initial index failed`,s(e))):d===`smart`?h.then(async()=>{try{if(!f.kb)throw 
Error(`AI Kit components are not available after initialization`);let{SmartIndexScheduler:e}=await import(`../../indexer/dist/index.js`),t=new e(f.kb.indexer,l,f.kb.store);t.start(),f.setSmartScheduler(t),u.info(`Smart index scheduler started (stdio mode)`)}catch(e){u.error(`Failed to start smart index scheduler`,s(e))}}).catch(e=>u.error(`AI Kit init failed for smart scheduler`,s(e))):u.warn(`Initial full indexing skipped; use aikit_reindex to index manually`,{indexMode:d})}}f().catch(e=>{u.error(`Fatal error`,s(e)),process.exit(1)});export{e as CuratedKnowledgeManager};
+ import{t as e}from"./curated-manager-DX-_oJg0.js";import{readFileSync as t}from"node:fs";import{dirname as n,resolve as r}from"node:path";import{fileURLToPath as i}from"node:url";import{parseArgs as a}from"node:util";import{createLogger as o,serializeError as s}from"../../core/dist/index.js";const c=n(i(import.meta.url)),l=(()=>{try{let e=r(c,`..`,`..`,`..`,`package.json`);return JSON.parse(t(e,`utf-8`)).version??`0.0.0`}catch{return`0.0.0`}})(),u=o(`server`),{values:d}=a({options:{transport:{type:`string`,default:process.env.AIKIT_TRANSPORT??`stdio`},port:{type:`string`,default:process.env.AIKIT_PORT??`3210`}}});async function f(){if(process.on(`unhandledRejection`,e=>{u.error(`Unhandled rejection`,s(e))}),u.info(`Starting MCP AI Kit server`,{version:l}),d.transport===`http`){let[{default:e},{loadConfig:t,resolveIndexMode:n},{registerDashboardRoutes:r,resolveDashboardDir:i}]=await Promise.all([import(`express`),import(`./config-C5IU9Lau.js`),import(`./dashboard-static-BfIe0Si1.js`)]),a=t();u.info(`Config loaded`,{sourceCount:a.sources.length,storePath:a.store.path});let o=e();o.use(e.json());let c=Number(d.port);o.use((e,t,n)=>{if(t.setHeader(`Access-Control-Allow-Origin`,process.env.AIKIT_CORS_ORIGIN??`http://localhost:${c}`),t.setHeader(`Access-Control-Allow-Methods`,`GET, POST, DELETE, OPTIONS`),t.setHeader(`Access-Control-Allow-Headers`,`Content-Type, Authorization`),e.method===`OPTIONS`){t.status(204).end();return}n()}),r(o,i(),u),o.get(`/health`,(e,t)=>{t.json({status:`ok`})});let l=!1,f=null,p=null,m=null,h=Promise.resolve();o.post(`/mcp`,async(e,t)=>{if(!l||!p||!m){t.status(503).json({jsonrpc:`2.0`,error:{code:-32603,message:`Server initializing — please retry in a few seconds`},id:null});return}let n=h,r;h=new Promise(e=>{r=e}),await n;try{let n=new m({sessionIdGenerator:void 0});await p.connect(n),await n.handleRequest(e,t,e.body),n.close()}catch(e){if(u.error(`MCP handler error`,s(e)),!t.headersSent){let n=e instanceof 
Error?e.message:String(e),r=n.includes(`Not Acceptable`);t.status(r?406:500).json({jsonrpc:`2.0`,error:{code:r?-32e3:-32603,message:r?n:`Internal server error`},id:null})}}finally{r()}}),o.get(`/mcp`,(e,t)=>{t.writeHead(405).end(JSON.stringify({jsonrpc:`2.0`,error:{code:-32e3,message:`Method not allowed.`},id:null}))}),o.delete(`/mcp`,(e,t)=>{t.writeHead(405).end(JSON.stringify({jsonrpc:`2.0`,error:{code:-32e3,message:`Method not allowed.`},id:null}))});let g=o.listen(c,`127.0.0.1`,()=>{u.info(`MCP server listening`,{url:`http://127.0.0.1:${c}/mcp`,port:c}),setTimeout(async()=>{try{let[{createLazyServer:e,ALL_TOOL_NAMES:t},{StreamableHTTPServerTransport:r},{checkForUpdates:i,autoUpgradeScaffold:o}]=await Promise.all([import(`./server-C7mYTlIU.js`),import(`@modelcontextprotocol/sdk/server/streamableHttp.js`),import(`./version-check-D4j0Pykd.js`)]);i(),o();let c=n(a),d=e(a,c);p=d.server,m=r,l=!0,u.info(`MCP server configured (lazy — AI Kit initializing in background)`,{toolCount:t.length,resourceCount:2}),d.startInit(),c===`auto`?d.ready.then(async()=>{try{let e=a.sources.map(e=>e.path).join(`, `);u.info(`Running initial index`,{sourcePaths:e}),await d.runInitialIndex(),u.info(`Initial index complete`)}catch(e){u.error(`Initial index failed; will retry on aikit_reindex`,s(e))}}).catch(e=>u.error(`AI Kit init or indexing failed`,s(e))):c===`smart`?d.ready.then(async()=>{try{if(!d.kb)throw Error(`AI Kit components are not available after initialization`);let{SmartIndexScheduler:e}=await import(`../../indexer/dist/index.js`);f=new e(d.kb.indexer,a,d.kb.store),f.start(),d.setSmartScheduler(f),u.info(`Smart index scheduler started (HTTP mode)`)}catch(e){u.error(`Failed to start smart index scheduler`,s(e))}}).catch(e=>u.error(`AI Kit initialization failed`,s(e))):(d.ready.catch(e=>u.error(`AI Kit initialization failed`,s(e))),u.info(`Initial full indexing skipped in HTTP mode`,{indexMode:c}))}catch(e){u.error(`Failed to load server modules`,s(e))}},100)}),_=async 
e=>{u.info(`Shutdown signal received`,{signal:e}),f?.stop(),g.close(),p&&await p.close(),process.exit(0)};process.on(`SIGINT`,()=>_(`SIGINT`)),process.on(`SIGTERM`,()=>_(`SIGTERM`))}else{let[{loadConfig:e,reconfigureForWorkspace:t,resolveIndexMode:n},{createLazyServer:r},{checkForUpdates:a,autoUpgradeScaffold:o},{RootsListChangedNotificationSchema:c}]=await Promise.all([import(`./config-C5IU9Lau.js`),import(`./server-C7mYTlIU.js`),import(`./version-check-D4j0Pykd.js`),import(`@modelcontextprotocol/sdk/types.js`)]),l=e();u.info(`Config loaded`,{sourceCount:l.sources.length,storePath:l.store.path}),a(),o();let d=n(l),f=r(l,d),{server:p,startInit:m,ready:h,runInitialIndex:g}=f,{StdioServerTransport:_}=await import(`@modelcontextprotocol/sdk/server/stdio.js`),v=new _;await p.connect(v),u.info(`MCP server started`,{transport:`stdio`});let y=e=>{if(e.length===0)return!1;let n=e[0].uri,r=n.startsWith(`file://`)?i(n):n;return u.info(`MCP roots resolved`,{rootUri:n,rootPath:r,rootCount:e.length}),t(l,r),!0},b=!1;try{b=y((await p.server.listRoots()).roots),b||u.info(`No MCP roots yet; waiting for roots/list_changed notification`)}catch(e){u.warn(`MCP roots/list not supported by client; using cwd fallback`,{cwd:process.cwd(),...s(e)}),b=!0}b||=await new Promise(e=>{let t=setTimeout(()=>{u.warn(`Timed out waiting for MCP roots/list_changed; using cwd fallback`,{cwd:process.cwd()}),e(!1)},5e3);p.server.setNotificationHandler(c,async()=>{clearTimeout(t);try{e(y((await p.server.listRoots()).roots))}catch(t){u.warn(`roots/list retry failed after notification`,s(t)),e(!1)}})}),m();let x=null,S=()=>{x&&clearTimeout(x),x=setTimeout(()=>{u.info(`Auto-shutdown: no activity for 30 minutes — exiting`),process.exit(0)},18e5),x.unref&&x.unref()};S(),process.stdin.on(`data`,()=>S()),h.catch(e=>{u.error(`Initialization failed — server will continue with limited tools`,s(e))}),d===`auto`?g().catch(e=>u.error(`Initial index failed`,s(e))):d===`smart`?h.then(async()=>{try{if(!f.kb)throw 
Error(`AI Kit components are not available after initialization`);let{SmartIndexScheduler:e}=await import(`../../indexer/dist/index.js`),t=new e(f.kb.indexer,l,f.kb.store);t.start(),f.setSmartScheduler(t),u.info(`Smart index scheduler started (stdio mode)`)}catch(e){u.error(`Failed to start smart index scheduler`,s(e))}}).catch(e=>u.error(`AI Kit init failed for smart scheduler`,s(e))):u.warn(`Initial full indexing skipped; use aikit_reindex to index manually`,{indexMode:d})}}f().catch(e=>{u.error(`Fatal error`,s(e)),process.exit(1)});export{e as CuratedKnowledgeManager};