@swarmvaultai/engine 0.1.18 → 0.1.21

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -19,29 +19,42 @@ If you only want to use SwarmVault as a tool, install `@swarmvaultai/cli` instea
 
  ```ts
  import {
+ addInput,
+ benchmarkVault,
  compileVault,
  createMcpServer,
  createWebSearchAdapter,
  defaultVaultConfig,
  defaultVaultSchema,
  exploreVault,
+ exportGraphFormat,
  exportGraphHtml,
+ explainGraphVault,
+ getWatchStatus,
+ getGitHookStatus,
  importInbox,
  ingestInput,
  initVault,
  installAgent,
+ installGitHooks,
  getWebSearchAdapterForTask,
  lintVault,
+ listGodNodes,
  listSchedules,
  loadVaultConfig,
  loadVaultSchema,
  loadVaultSchemas,
+ pathGraphVault,
+ queryGraphVault,
  queryVault,
+ runWatchCycle,
  runSchedule,
  searchVault,
  serveSchedules,
  startGraphServer,
  startMcpServer,
+ syncTrackedRepos,
+ uninstallGitHooks,
  watchVault,
  } from "@swarmvaultai/engine";
  ```
@@ -51,25 +64,48 @@ The engine also exports the main runtime types for providers, graph artifacts, p
  ## Example
 
  ```ts
- import { compileVault, exploreVault, exportGraphHtml, importInbox, initVault, loadVaultSchemas, queryVault, watchVault } from "@swarmvaultai/engine";
+ import {
+ addInput,
+ benchmarkVault,
+ compileVault,
+ exploreVault,
+ exportGraphFormat,
+ exportGraphHtml,
+ getWatchStatus,
+ importInbox,
+ initVault,
+ installGitHooks,
+ loadVaultSchemas,
+ queryGraphVault,
+ queryVault,
+ runWatchCycle,
+ watchVault
+ } from "@swarmvaultai/engine";
 
  const rootDir = process.cwd();
 
  await initVault(rootDir, { obsidian: true });
  const schemas = await loadVaultSchemas(rootDir);
  console.log(schemas.root.path);
+ await addInput(rootDir, "https://arxiv.org/abs/2401.12345");
  await importInbox(rootDir);
  await compileVault(rootDir, {});
+ const benchmark = await benchmarkVault(rootDir);
+ console.log(benchmark.avgQueryTokens);
 
  const saved = await queryVault(rootDir, { question: "What changed most recently?" });
  console.log(saved.savedPath);
 
+ const graphQuery = await queryGraphVault(rootDir, "Which nodes bridge the biggest communities?");
+ console.log(graphQuery.summary);
+
  const exploration = await exploreVault(rootDir, { question: "What should I investigate next?", steps: 3, format: "report" });
  console.log(exploration.hubPath);
 
  await exportGraphHtml(rootDir, "./exports/graph.html");
+ await exportGraphFormat(rootDir, "graphml", "./exports/graph.graphml");
+
+ await runWatchCycle(rootDir, { repo: true });
+ console.log(await getWatchStatus(rootDir));
+ await installGitHooks(rootDir);
 
- const watcher = await watchVault(rootDir, { lint: true });
+ const watcher = await watchVault(rootDir, { lint: true, repo: true });
  ```
 
  ## Schema Layer
@@ -123,25 +159,37 @@ This matters because many "OpenAI-compatible" backends only implement part of th
  ### Ingest
 
  - `ingestInput(rootDir, input, { includeAssets, maxAssetSize })` ingests a local file path or URL
+ - `addInput(rootDir, input, { author, contributor })` captures supported URLs into normalized markdown before ingesting them, or falls back to generic URL ingest
  - `ingestDirectory(rootDir, inputDir, { repoRoot, include, exclude, maxFiles, gitignore })` recursively ingests a local directory as a repo-aware code/content source tree
  - `importInbox(rootDir, inputDir?)` recursively imports supported inbox files and browser-clipper style bundles
- - JavaScript, TypeScript, Python, Go, Rust, Java, C#, C, C++, and PHP inputs are treated as code sources and compiled into both source pages and `wiki/code/` module pages
+ - JavaScript, TypeScript, Python, Go, Rust, Java, C#, C, C++, PHP, Ruby, and PowerShell inputs are treated as code sources and compiled into both source pages and `wiki/code/` module pages
  - code manifests can carry `repoRelativePath`, and compile writes `state/code-index.json` so local imports can resolve across an ingested repo tree
  - HTML and markdown URL ingests localize remote image references into `raw/assets/<sourceId>/` by default and rewrite the stored markdown to local relative paths
 
  ### Compile + Query
 
  - `compileVault(rootDir, { approve })` writes wiki pages, graph data, and search state using the vault schema as guidance, or stages a review bundle
+ - compile also writes graph orientation pages such as `wiki/graph/report.md` and `wiki/graph/communities/<community>.md`
+ - `benchmarkVault(rootDir, { questions })` writes `state/benchmark.json` and folds the latest benchmark summary into `wiki/graph/report.md`
  - `queryVault(rootDir, { question, save, format, review })` answers against the compiled vault using the same schema layer and saves by default
  - `exploreVault(rootDir, { question, steps, format, review })` runs a save-first multi-step exploration loop and writes a hub page plus step outputs
  - `searchVault(rootDir, query, limit)` searches compiled pages directly
+ - `queryGraphVault(rootDir, question, { traversal, budget })` runs deterministic local graph search without a model provider
+ - `pathGraphVault(rootDir, from, to)` returns the shortest graph path between two targets
+ - `explainGraphVault(rootDir, target)` returns node, community, neighbor, and provenance details
+ - `listGodNodes(rootDir, limit)` returns the most connected bridge-heavy graph nodes
  - project-aware compile also builds `wiki/projects/index.md` plus `wiki/projects/<project>/index.md` rollups without duplicating page trees
  - human-authored insight pages in `wiki/insights/` are indexed into search and available to query without being rewritten by compile
  - `chart` and `image` formats save wrapper markdown pages plus local output assets under `wiki/outputs/assets/<slug>/`
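The graph reads added above are described as deterministic and model-free, which suggests plain traversal over the compiled `state/graph.json`. As an illustration only, assuming a hypothetical node/edge shape (the engine's real graph schema is not documented in this README), a shortest-path lookup of the kind `pathGraphVault` describes reduces to breadth-first search:

```ts
// Hypothetical shapes; the real state/graph.json schema may differ.
interface GraphNode { id: string; label: string }
interface GraphEdge { source: string; target: string }
interface Graph { nodes: GraphNode[]; edges: GraphEdge[] }

// BFS shortest path between two node ids, treating edges as undirected.
function shortestPath(graph: Graph, from: string, to: string): string[] | null {
  const adjacency = new Map<string, string[]>();
  const link = (a: string, b: string): void => {
    if (!adjacency.has(a)) adjacency.set(a, []);
    adjacency.get(a)!.push(b);
  };
  for (const { source, target } of graph.edges) {
    link(source, target);
    link(target, source);
  }

  // BFS visits nodes in distance order, so the first arrival at `to` is shortest.
  const previous = new Map<string, string | null>([[from, null]]);
  const queue: string[] = [from];
  while (queue.length > 0) {
    const current = queue.shift()!;
    if (current === to) {
      const path: string[] = [];
      let node: string | null = current;
      while (node !== null) {
        path.unshift(node);
        node = previous.get(node) ?? null;
      }
      return path;
    }
    for (const next of adjacency.get(current) ?? []) {
      if (!previous.has(next)) {
        previous.set(next, current);
        queue.push(next);
      }
    }
  }
  return null; // no route between the two targets
}

const demo: Graph = {
  nodes: [{ id: "a", label: "A" }, { id: "b", label: "B" }, { id: "c", label: "C" }],
  edges: [{ source: "a", target: "b" }, { source: "b", target: "c" }],
};
console.log(shortestPath(demo, "a", "c")); // → [ 'a', 'b', 'c' ]
```

`listGodNodes` can be read as the degree-ranking counterpart over the same adjacency data: count edges per node and sort descending.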
 
  ### Automation
 
- - `watchVault(rootDir, options)` watches the inbox and appends run records to `state/jobs.ndjson`
+ - `watchVault(rootDir, options)` watches the inbox and optionally tracked repo roots, then appends run records to `state/jobs.ndjson`
+ - `runWatchCycle(rootDir, options)` runs the same inbox/repo refresh logic once without starting a watcher
+ - `getWatchStatus(rootDir)` reads the latest watch-status artifact plus pending semantic refresh entries
+ - `syncTrackedRepos(rootDir)` refreshes previously ingested repo roots, updates changed manifests, and removes deleted repo manifests
+ - `syncTrackedReposForWatch(rootDir)` is the repo-watch sync path that defers non-code semantic refresh into `state/watch/`
+ - `installGitHooks(rootDir)`, `uninstallGitHooks(rootDir)`, and `getGitHookStatus(rootDir)` manage local `post-commit` and `post-checkout` hook blocks for the nearest git repository
  - `lintVault(rootDir, options)` runs structural lint, optional deep lint, and optional web-augmented evidence gathering
  - `listSchedules(rootDir)`, `runSchedule(rootDir, jobId)`, and `serveSchedules(rootDir)` manage recurring local jobs from config
  - compile, query, explore, lint, and watch also write canonical markdown session artifacts to `state/sessions/`
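The git-hook helpers above are described as managing hook *blocks*, which implies they must coexist with user-authored hook content rather than owning the whole file. One common way to do that is a marker-delimited region; the sketch below uses hypothetical marker strings and a hypothetical hook command (the engine's real delimiters are not documented here):

```ts
// Hypothetical markers; the engine's actual delimiters may differ.
const BEGIN = "# >>> swarmvault hooks >>>";
const END = "# <<< swarmvault hooks <<<";

// Remove the managed block if present, leaving user content untouched.
function uninstallHookBlock(script: string): string {
  const lines = script.split("\n");
  const start = lines.indexOf(BEGIN);
  const end = lines.indexOf(END);
  if (start === -1 || end === -1 || end < start) return script;
  return [...lines.slice(0, start), ...lines.slice(end + 1)].join("\n");
}

// Insert or replace the managed block, so repeated installs stay idempotent.
function installHookBlock(script: string, body: string): string {
  const cleaned = uninstallHookBlock(script).replace(/\s+$/, "");
  const head = cleaned.length > 0 ? cleaned : "#!/bin/sh";
  return `${head}\n${BEGIN}\n${body}\n${END}\n`;
}

const installed = installHookBlock("#!/bin/sh\necho user hook\n", "swarmvault-sync");
console.log(installed);
```

Applying this to `.git/hooks/post-commit` and `.git/hooks/post-checkout`, and re-reading the file to report whether the block is present, matches the install/uninstall/status trio described above.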
@@ -158,8 +206,9 @@ This matters because many "OpenAI-compatible" backends only implement part of th
  - `createMcpServer(rootDir)` creates an MCP server instance
  - `startMcpServer(rootDir)` runs the MCP server over stdio
  - `exportGraphHtml(rootDir, outputPath)` exports the graph workspace as a standalone HTML file
+ - `exportGraphFormat(rootDir, "svg" | "graphml" | "cypher", outputPath)` exports the graph into interoperable file formats
 
- The MCP surface includes tools for workspace info, page search, page reads, source listing, querying, ingestion, compile, and lint, along with resources for config, graph, manifests, schema, page content, and session artifacts.
+ The MCP surface includes tools for workspace info, page search, page reads, source listing, querying, ingestion, compile, lint, and graph-native read operations such as graph query, node explain, neighbor lookup, shortest path, and god-node listing, along with resources for config, graph, manifests, schema, page content, and session artifacts.
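The `graphml` target of `exportGraphFormat` maps naturally onto GraphML's XML envelope. As an illustration only, assuming a hypothetical `VaultNode`/`VaultEdge` shape (the engine's real graph schema is not documented here), a minimal serializer looks like:

```ts
interface VaultNode { id: string }
interface VaultEdge { source: string; target: string }

// Escape the characters that are unsafe inside XML attribute values.
const escapeXml = (value: string): string =>
  value.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;").replace(/"/g, "&quot;");

// Serialize nodes and edges into a minimal GraphML document.
function toGraphml(nodes: VaultNode[], edges: VaultEdge[]): string {
  const nodeXml = nodes.map((n) => `    <node id="${escapeXml(n.id)}"/>`);
  const edgeXml = edges.map(
    (e, i) => `    <edge id="e${i}" source="${escapeXml(e.source)}" target="${escapeXml(e.target)}"/>`
  );
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<graphml xmlns="http://graphml.graphdrawing.org/xmlns">',
    '  <graph id="vault" edgedefault="directed">',
    ...nodeXml,
    ...edgeXml,
    "  </graph>",
    "</graphml>",
  ].join("\n");
}

console.log(toGraphml([{ id: "a" }, { id: "b" }], [{ source: "a", target: "b" }]));
```

Writing the result to the requested output path would complete the export; GraphML data attributes (`<key>`/`<data>` elements) are omitted here for brevity.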
 
  ## Artifacts
 
@@ -170,6 +219,7 @@ Running the engine produces a local workspace with these main areas:
  - `raw/sources/`: immutable source copies
  - `raw/assets/`: copied attachments referenced by ingested markdown bundles and remote URL ingests
  - `wiki/`: generated markdown pages, the append-only `log.md` activity trail, staged candidates, saved query outputs, exploration hub pages, and a human-only `insights/` area
+ - `wiki/graph/`: generated graph report pages and per-community summaries derived from `state/graph.json`
  - `wiki/outputs/assets/`: local chart/image artifacts and JSON manifests for saved visual outputs
  - `wiki/code/`: generated module pages for ingested code sources
  - `wiki/projects/`: generated project rollups over canonical pages
@@ -178,15 +228,17 @@ Running the engine produces a local workspace with these main areas:
  - `state/extracts/`: extracted text
  - `state/analyses/`: model analysis output
  - `state/code-index.json`: repo-aware code module aliases and local resolution data
+ - `state/benchmark.json`: latest benchmark/trust summary for the current vault
  - `state/graph.json`: compiled graph
  - `state/search.sqlite`: full-text index
  - `state/sessions/`: canonical session artifacts
  - `state/approvals/`: staged review bundles from `compileVault({ approve: true })`
  - `state/schedules/`: persisted schedule state and leases
+ - `state/watch/`: watch-status and pending semantic refresh artifacts for repo automation
  - `state/jobs.ndjson`: watch-mode automation logs
 
  Saved outputs are indexed immediately into the graph page registry and search index, then linked back into compiled source, concept, and entity pages through the lightweight artifact sync path. New concept and entity pages stage into `wiki/candidates/` first and promote to active pages on the next matching compile. Insight pages are indexed into search and page reads, but compile does not mutate them. Project-scoped pages receive `project_ids`, project tags, and layered root-plus-project schema hashes when all contributing sources resolve to the same configured project.
- Code sources also emit module and symbol nodes into `state/graph.json`, so local imports, exports, inheritance, and same-module call edges are queryable through the same viewer and search pipeline.
+ Code sources also emit module, symbol, and parser-backed rationale nodes into `state/graph.json`, so local imports, exports, inheritance, same-module call edges, and rationale links are queryable through the same viewer and search pipeline.
  Ingest, inbox import, compile, query, lint, review, and candidate operations also append human-readable entries to `wiki/log.md`.
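`state/jobs.ndjson` follows the NDJSON convention: one JSON object per line, appended rather than rewritten, so the log grows cheaply and each record parses independently. A self-contained sketch of that read/write pattern (the `RunRecord` fields here are assumptions, not the engine's actual record schema):

```ts
import { appendFileSync, mkdtempSync, readFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Hypothetical run record; the engine's real jobs.ndjson fields may differ.
interface RunRecord { kind: string; at: string; ok: boolean }

// NDJSON append: one JSON object per line, no rewriting of earlier records.
function appendRun(logPath: string, record: RunRecord): void {
  appendFileSync(logPath, JSON.stringify(record) + "\n", "utf8");
}

// Read the log back by parsing each non-empty line independently.
function readRuns(logPath: string): RunRecord[] {
  return readFileSync(logPath, "utf8")
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as RunRecord);
}

const logPath = join(mkdtempSync(join(tmpdir(), "swarmvault-demo-")), "jobs.ndjson");
appendRun(logPath, { kind: "inbox", at: new Date().toISOString(), ok: true });
appendRun(logPath, { kind: "repo", at: new Date().toISOString(), ok: true });
console.log(readRuns(logPath).length); // → 2
```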
 
  ## Notes