@swarmvaultai/cli 0.1.3 → 0.1.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +74 -3
  2. package/dist/index.js +95 -31
  3. package/package.json +3 -3
package/README.md CHANGED
@@ -2,7 +2,7 @@
 
 `@swarmvaultai/cli` is the global command-line entry point for SwarmVault.
 
- It gives you the `swarmvault` command for building a local-first knowledge vault from files, URLs, browser clips, and saved query outputs.
+ It gives you the `swarmvault` command for building a local-first knowledge vault from files, URLs, browser clips, saved query outputs, and guided exploration runs.
 
 ## Install
 
@@ -23,9 +23,12 @@ Installed commands:
 mkdir my-vault
 cd my-vault
 swarmvault init
+ sed -n '1,120p' swarmvault.schema.md
 swarmvault ingest ./notes.md
 swarmvault compile
 swarmvault query "What keeps recurring?" --save
+ swarmvault explore "What should I research next?" --steps 3
+ swarmvault lint --deep
 swarmvault graph serve
 ```
 
@@ -41,6 +44,9 @@ Create a workspace with:
 - `state/`
 - `agent/`
 - `swarmvault.config.json`
+ - `swarmvault.schema.md`
+
+ The schema file is the vault-specific instruction layer. Edit it to define naming rules, categories, grounding expectations, and exclusions before a serious compile.
 
 ### `swarmvault ingest <path-or-url>`
 
@@ -58,14 +64,48 @@ Compile the current manifests into:
 - structured graph data in `state/graph.json`
 - local search data in `state/search.sqlite`
 
+ The compiler also reads `swarmvault.schema.md` and records a `schema_hash` in generated pages so schema edits can mark pages stale.
+
 ### `swarmvault query "<question>" [--save]`
 
- Query the compiled vault. With `--save`, the answer is written into `wiki/outputs/` so the result becomes part of the vault.
+ Query the compiled vault. The query layer also reads `swarmvault.schema.md`, so answers follow the vault’s own structure and grounding rules.
+
+ With `--save`, the answer is written into `wiki/outputs/` and immediately registered in:
+
+ - `wiki/index.md`
+ - `wiki/outputs/index.md`
+ - `state/graph.json`
+ - `state/search.sqlite`
+
+ Saved outputs also carry related page, node, and source metadata so later compiles can link them back into the wiki.
+
+ ### `swarmvault explore "<question>" [--steps <n>]`
 
- ### `swarmvault lint`
+ Run a save-first multi-step research loop.
+
+ Each step:
+
+ - queries the vault
+ - saves the answer into `wiki/outputs/`
+ - generates follow-up questions
+ - chooses the next follow-up deterministically
+
+ The command also writes a hub page linking the root question, saved step pages, and generated follow-up questions.
+
+ ### `swarmvault lint [--deep] [--web]`
 
 Run anti-drift and vault health checks such as stale pages, missing graph artifacts, and other structural issues.
 
+ `--deep` adds an LLM-powered advisory pass that can report:
+
+ - `coverage_gap`
+ - `contradiction_candidate`
+ - `missing_citation`
+ - `candidate_page`
+ - `follow_up_question`
+
+ `--web` can only be used with `--deep`. It enriches deep-lint findings with external evidence snippets and URLs from a configured web-search provider.
+
 ### `swarmvault watch [--lint] [--debounce <ms>]`
 
 Watch the inbox directory and trigger import and compile cycles when files change. With `--lint`, each cycle also runs linting. Run metadata is appended to `state/jobs.ndjson`.
@@ -83,6 +123,8 @@ Run SwarmVault as a local MCP server over stdio. This exposes the vault to compa
 - `compile_vault`
 - `lint_vault`
 
+ The MCP surface also exposes `swarmvault://schema` and includes `schemaPath` in `workspace_info`.
+
 ### `swarmvault graph serve`
 
 Start the local graph UI backed by `state/graph.json`.
@@ -118,10 +160,39 @@ Example:
 
 Generic OpenAI-compatible APIs are supported through config when the provider follows the OpenAI request shape closely enough.
 
+ Deep lint web augmentation uses a separate `webSearch` config block. Example:
+
+ ```json
+ {
+   "webSearch": {
+     "providers": {
+       "evidence": {
+         "type": "http-json",
+         "endpoint": "https://search.example/api/search",
+         "method": "GET",
+         "apiKeyEnv": "SEARCH_API_KEY",
+         "apiKeyHeader": "Authorization",
+         "apiKeyPrefix": "Bearer ",
+         "queryParam": "q",
+         "limitParam": "limit",
+         "resultsPath": "results",
+         "titleField": "title",
+         "urlField": "url",
+         "snippetField": "snippet"
+       }
+     },
+     "tasks": {
+       "deepLintProvider": "evidence"
+     }
+   }
+ }
+ ```
+
 ## Troubleshooting
 
 - If you are running from a source checkout and `graph serve` says the viewer build is missing, run `pnpm build` in the repository first
 - If a provider claims OpenAI compatibility but fails structured generation, declare only the capabilities it actually supports
+ - If `lint --deep --web` fails immediately, make sure a `webSearch` provider is configured and mapped to `tasks.deepLintProvider`
 - Node 24 may emit an experimental warning for `node:sqlite`; that is expected in the current release
 
 ## Links
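The save-first explore loop documented in the README changes above can be sketched in miniature. This is an illustration only, not the shipped engine code: `queryFn` is a hypothetical stand-in for the vault query layer, and the lexicographic pick of the next follow-up is an assumed interpretation of "deterministically".

```javascript
// Sketch of a save-first explore loop: each step queries, records the
// answer (the real CLI writes it into wiki/outputs/), collects
// follow-up questions, and picks the next one deterministically.
// The sort-based choice below is an assumed rule for illustration.
function explore(rootQuestion, maxSteps, queryFn) {
  const steps = [];
  let question = rootQuestion;
  for (let i = 0; i < maxSteps; i += 1) {
    const { answer, followUps } = queryFn(question);
    steps.push({ step: i + 1, question, answer }); // the "saved" step page
    if (followUps.length === 0) break;
    question = [...followUps].sort()[0]; // deterministic next question
  }
  // The real command additionally writes a hub page linking root,
  // saved step pages, and generated follow-up questions.
  return { root: rootQuestion, stepCount: steps.length, steps };
}
```

With a stubbed `queryFn`, the same root question always yields the same sequence of step pages, which is what makes explore runs reproducible and safe to save.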
package/dist/index.js CHANGED
@@ -1,74 +1,122 @@
 #!/usr/bin/env node
 
 // src/index.ts
- import { Command } from "commander";
 import process from "process";
 import {
   compileVault,
+ exploreVault,
   importInbox,
   ingestInput,
   initVault,
   installAgent,
   lintVault,
+ loadVaultConfig,
   queryVault,
   startGraphServer,
   startMcpServer,
   watchVault
 } from "@swarmvaultai/engine";
+ import { Command } from "commander";
 var program = new Command();
- program.name("swarmvault").description("SwarmVault is a local-first LLM wiki compiler with graph outputs and pluggable providers.").version("0.1.3");
+ program.name("swarmvault").description("SwarmVault is a local-first LLM wiki compiler with graph outputs and pluggable providers.").version("0.1.5").option("--json", "Emit structured JSON output", false);
+ function isJson() {
+   return program.opts().json === true;
+ }
+ function emitJson(data) {
+   process.stdout.write(`${JSON.stringify(data)}
+ `);
+ }
+ function log(message) {
+   if (isJson()) {
+     process.stderr.write(`${message}
+ `);
+   } else {
+     process.stdout.write(`${message}
+ `);
+   }
+ }
 program.command("init").description("Initialize a SwarmVault workspace in the current directory.").action(async () => {
   await initVault(process.cwd());
-   process.stdout.write("Initialized SwarmVault workspace.\n");
+   if (isJson()) {
+     emitJson({ status: "initialized", rootDir: process.cwd() });
+   } else {
+     log("Initialized SwarmVault workspace.");
+   }
 });
 program.command("ingest").description("Ingest a local file path or URL into the raw SwarmVault workspace.").argument("<input>", "Local file path or URL").action(async (input) => {
   const manifest = await ingestInput(process.cwd(), input);
-   process.stdout.write(`${manifest.sourceId}
- `);
+   if (isJson()) {
+     emitJson(manifest);
+   } else {
+     log(manifest.sourceId);
+   }
 });
 var inbox = program.command("inbox").description("Inbox and capture workflows.");
 inbox.command("import").description("Import supported files from the configured inbox directory.").argument("[dir]", "Optional inbox directory override").action(async (dir) => {
   const result = await importInbox(process.cwd(), dir);
-   process.stdout.write(
-     `Imported ${result.imported.length} source(s) from ${result.inputDir}. Scanned: ${result.scannedCount}. Attachments: ${result.attachmentCount}. Skipped: ${result.skipped.length}.
- `
-   );
+   if (isJson()) {
+     emitJson(result);
+   } else {
+     log(
+       `Imported ${result.imported.length} source(s) from ${result.inputDir}. Scanned: ${result.scannedCount}. Attachments: ${result.attachmentCount}. Skipped: ${result.skipped.length}.`
+     );
+   }
 });
 program.command("compile").description("Compile manifests into wiki pages, graph JSON, and search index.").action(async () => {
   const result = await compileVault(process.cwd());
-   process.stdout.write(
-     `Compiled ${result.sourceCount} source(s), ${result.pageCount} page(s). Changed: ${result.changedPages.length}.
- `
-   );
+   if (isJson()) {
+     emitJson(result);
+   } else {
+     log(`Compiled ${result.sourceCount} source(s), ${result.pageCount} page(s). Changed: ${result.changedPages.length}.`);
+   }
 });
 program.command("query").description("Query the compiled SwarmVault wiki.").argument("<question>", "Question to ask SwarmVault").option("--save", "Persist the answer to wiki/outputs", false).action(async (question, options) => {
   const result = await queryVault(process.cwd(), question, options.save ?? false);
-   process.stdout.write(`${result.answer}
- `);
-   if (result.savedTo) {
-     process.stdout.write(`Saved to ${result.savedTo}
- `);
+   if (isJson()) {
+     emitJson(result);
+   } else {
+     log(result.answer);
+     if (result.savedTo) {
+       log(`Saved to ${result.savedTo}`);
+     }
   }
 });
- program.command("lint").description("Run anti-drift and wiki-health checks.").action(async () => {
-   const findings = await lintVault(process.cwd());
+ program.command("explore").description("Run a save-first multi-step exploration loop against the vault.").argument("<question>", "Root question to explore").option("--steps <n>", "Maximum number of exploration steps", "3").action(async (question, options) => {
+   const stepCount = Number.parseInt(options.steps ?? "3", 10);
+   const result = await exploreVault(process.cwd(), question, Number.isFinite(stepCount) ? stepCount : 3);
+   if (isJson()) {
+     emitJson(result);
+   } else {
+     log(`Exploration hub saved to ${result.hubPath}`);
+     log(`Completed ${result.stepCount} step(s).`);
+   }
+ });
+ program.command("lint").description("Run anti-drift and wiki-health checks.").option("--deep", "Run LLM-powered advisory lint", false).option("--web", "Augment deep lint with configured web search", false).action(async (options) => {
+   const findings = await lintVault(process.cwd(), {
+     deep: options.deep ?? false,
+     web: options.web ?? false
+   });
+   if (isJson()) {
+     emitJson(findings);
+     return;
+   }
   if (!findings.length) {
-     process.stdout.write("No findings.\n");
+     log("No findings.");
     return;
   }
   for (const finding of findings) {
-     process.stdout.write(
-       `[${finding.severity}] ${finding.code}: ${finding.message}${finding.pagePath ? ` (${finding.pagePath})` : ""}
- `
-     );
+     log(`[${finding.severity}] ${finding.code}: ${finding.message}${finding.pagePath ? ` (${finding.pagePath})` : ""}`);
   }
 });
 var graph = program.command("graph").description("Graph-related commands.");
 graph.command("serve").description("Serve the local graph viewer.").option("--port <port>", "Port override").action(async (options) => {
   const port = options.port ? Number.parseInt(options.port, 10) : void 0;
   const server = await startGraphServer(process.cwd(), port);
-   process.stdout.write(`Graph viewer running at http://localhost:${server.port}
- `);
+   if (isJson()) {
+     emitJson({ port: server.port, url: `http://localhost:${server.port}` });
+   } else {
+     log(`Graph viewer running at http://localhost:${server.port}`);
+   }
   process.on("SIGINT", async () => {
     await server.close();
     process.exit(0);
@@ -76,17 +124,26 @@ graph.command("serve").description("Serve the local graph viewer.").option("--po
 });
 program.command("watch").description("Watch the inbox directory and run import/compile cycles on changes.").option("--lint", "Run lint after each compile cycle", false).option("--debounce <ms>", "Debounce window in milliseconds", "900").action(async (options) => {
   const debounceMs = Number.parseInt(options.debounce ?? "900", 10);
+   const { paths } = await loadVaultConfig(process.cwd());
   const controller = await watchVault(process.cwd(), {
     lint: options.lint ?? false,
     debounceMs: Number.isFinite(debounceMs) ? debounceMs : 900
   });
-   process.stdout.write("Watching inbox for changes. Press Ctrl+C to stop.\n");
+   if (isJson()) {
+     emitJson({ status: "watching", inboxDir: paths.inboxDir });
+   } else {
+     log("Watching inbox for changes. Press Ctrl+C to stop.");
+   }
   process.on("SIGINT", async () => {
     await controller.close();
     process.exit(0);
   });
 });
 program.command("mcp").description("Run SwarmVault as a local MCP server over stdio.").action(async () => {
+   if (isJson()) {
+     process.stderr.write(`${JSON.stringify({ status: "running", transport: "stdio" })}
+ `);
+   }
   const controller = await startMcpServer(process.cwd());
   process.on("SIGINT", async () => {
     await controller.close();
@@ -95,12 +152,19 @@ program.command("mcp").description("Run SwarmVault as a local MCP server over st
 });
 program.command("install").description("Install SwarmVault instructions for an agent in the current project.").requiredOption("--agent <agent>", "codex, claude, or cursor").action(async (options) => {
   const target = await installAgent(process.cwd(), options.agent);
-   process.stdout.write(`Installed rules into ${target}
- `);
+   if (isJson()) {
+     emitJson({ agent: options.agent, target });
+   } else {
+     log(`Installed rules into ${target}`);
+   }
 });
 program.parseAsync(process.argv).catch((error) => {
   const message = error instanceof Error ? error.message : String(error);
-   process.stderr.write(`${message}
+   if (isJson()) {
+     emitJson({ error: message });
+   } else {
+     process.stderr.write(`${message}
 `);
+   }
   process.exit(1);
 });
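The `--json` plumbing threaded through `dist/index.js` above follows one pattern: structured results go to stdout as a single JSON line, while human-readable messages are diverted to stderr so pipelines stay parseable. A standalone sketch of that routing follows; the injectable stream parameters are added here for testability and are not part of the shipped code, which writes to `process.stdout` and `process.stderr` directly.

```javascript
// Stdout/stderr routing behind a --json flag, as in the diff above.
// In JSON mode, machine-readable data is the only thing on stdout;
// progress messages move to stderr. Streams are injectable so the
// behavior can be exercised without a real terminal.
function makeEmitter(jsonMode, out = process.stdout, err = process.stderr) {
  return {
    emitJson(data) {
      out.write(`${JSON.stringify(data)}\n`);
    },
    log(message) {
      // Human text: stderr in JSON mode, stdout otherwise.
      (jsonMode ? err : out).write(`${message}\n`);
    },
  };
}
```

With this split, a pipeline such as `swarmvault compile --json | jq` sees only the JSON object on stdout while progress text still reaches the terminal via stderr.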
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@swarmvaultai/cli",
-   "version": "0.1.3",
+   "version": "0.1.5",
   "description": "Global CLI for SwarmVault.",
   "type": "module",
   "main": "dist/index.js",
@@ -39,7 +39,7 @@
 },
 "dependencies": {
   "commander": "^14.0.1",
-   "@swarmvaultai/engine": "0.1.3"
+   "@swarmvaultai/engine": "0.1.5"
 },
 "devDependencies": {
   "@types/node": "^24.6.0",
@@ -48,6 +48,6 @@
 "scripts": {
   "build": "tsup src/index.ts --format esm --dts",
   "test": "node -e \"process.exit(0)\"",
-   "lint": "tsc --noEmit"
+   "typecheck": "tsc --noEmit"
 }
 }