@swarmvaultai/cli 0.1.5 → 0.1.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +47 -11
  2. package/dist/index.js +160 -14
  3. package/package.json +2 -2
package/README.md CHANGED
@@ -22,29 +22,34 @@ Installed commands:
  ```bash
  mkdir my-vault
  cd my-vault
- swarmvault init
+ swarmvault init --obsidian
  sed -n '1,120p' swarmvault.schema.md
  swarmvault ingest ./notes.md
  swarmvault compile
- swarmvault query "What keeps recurring?" --save
+ swarmvault query "What keeps recurring?"
+ swarmvault query "Turn this into slides" --format slides
  swarmvault explore "What should I research next?" --steps 3
  swarmvault lint --deep
  swarmvault graph serve
+ swarmvault graph export --html ./exports/graph.html
  ```
 
  ## Commands
 
- ### `swarmvault init`
+ ### `swarmvault init [--obsidian]`
 
  Create a workspace with:
 
  - `inbox/`
  - `raw/`
  - `wiki/`
+ - `wiki/insights/`
  - `state/`
+ - `state/sessions/`
  - `agent/`
  - `swarmvault.config.json`
  - `swarmvault.schema.md`
+ - optional `.obsidian/` workspace files when `--obsidian` is passed
 
  The schema file is the vault-specific instruction layer. Edit it to define naming rules, categories, grounding expectations, and exclusions before a serious compile.
 
@@ -56,7 +61,7 @@ Ingest a local file path or URL into immutable source storage and write a manife
 
  Import supported files from the configured inbox directory. This is meant for browser-clipper style markdown bundles and other capture workflows. Local image and asset references are preserved and copied into canonical storage under `raw/assets/`.
 
- ### `swarmvault compile`
+ ### `swarmvault compile [--approve]`
 
  Compile the current manifests into:
 
@@ -64,20 +69,47 @@ Compile the current manifests into:
  - structured graph data in `state/graph.json`
  - local search data in `state/search.sqlite`
 
- The compiler also reads `swarmvault.schema.md` and records a `schema_hash` in generated pages so schema edits can mark pages stale.
+ The compiler also reads `swarmvault.schema.md` and records a `schema_hash` plus lifecycle metadata such as `status`, `created_at`, `updated_at`, `compiled_from`, and `managed_by` in generated pages so schema edits can mark pages stale without losing lifecycle state.
 
- ### `swarmvault query "<question>" [--save]`
+ New concept and entity pages are staged into `wiki/candidates/` first. A later matching compile promotes them into `wiki/concepts/` or `wiki/entities/`.
+
+ With `--approve`, compile writes a staged review bundle into `state/approvals/` without applying active wiki changes.
+
+ ### `swarmvault review list|show|accept|reject`
+
+ Inspect and resolve staged approval bundles created by `swarmvault compile --approve`.
+
+ - `review list` shows pending, accepted, and rejected entry counts per bundle
+ - `review show <approvalId>` shows each staged entry plus its current and staged content
+ - `review accept <approvalId> [targets...]` applies pending entries to the live wiki
+ - `review reject <approvalId> [targets...]` marks pending entries as rejected without mutating active wiki paths
+
+ Targets can be page ids such as `concept:approval-concept` or relative wiki paths such as `concepts/approval-concept.md`.
+
+ ### `swarmvault candidate list|promote|archive`
+
+ Inspect and resolve staged concept and entity candidates.
+
+ - `candidate list` shows every current candidate plus its active destination path
+ - `candidate promote <target>` promotes a candidate immediately into `wiki/concepts/` or `wiki/entities/`
+ - `candidate archive <target>` removes a candidate from the staged set
+
+ Targets can be page ids or relative paths under `wiki/candidates/`.
+
+ ### `swarmvault query "<question>" [--no-save] [--format markdown|report|slides]`
 
  Query the compiled vault. The query layer also reads `swarmvault.schema.md`, so answers follow the vault’s own structure and grounding rules.
 
- With `--save`, the answer is written into `wiki/outputs/` and immediately registered in:
+ By default, the answer is written into `wiki/outputs/` and immediately registered in:
 
  - `wiki/index.md`
  - `wiki/outputs/index.md`
  - `state/graph.json`
  - `state/search.sqlite`
 
- Saved outputs also carry related page, node, and source metadata so later compiles can link them back into the wiki.
+ Saved outputs also carry related page, node, and source metadata so SwarmVault can refresh related source, concept, and entity pages immediately.
+
+ Human-authored pages in `wiki/insights/` are also indexed into search and query context, but SwarmVault does not rewrite them after initialization.
 
  ### `swarmvault explore "<question>" [--steps <n>]`
 
@@ -108,7 +140,7 @@ Run anti-drift and vault health checks such as stale pages, missing graph artifa
 
  ### `swarmvault watch [--lint] [--debounce <ms>]`
 
- Watch the inbox directory and trigger import and compile cycles when files change. With `--lint`, each cycle also runs linting. Run metadata is appended to `state/jobs.ndjson`.
+ Watch the inbox directory and trigger import and compile cycles when files change. With `--lint`, each cycle also runs linting. Each cycle writes a canonical session artifact to `state/sessions/`, and compatibility run metadata is still appended to `state/jobs.ndjson`.
 
  ### `swarmvault mcp`
 
@@ -123,11 +155,15 @@ Run SwarmVault as a local MCP server over stdio. This exposes the vault to compa
  - `compile_vault`
  - `lint_vault`
 
- The MCP surface also exposes `swarmvault://schema` and includes `schemaPath` in `workspace_info`.
+ The MCP surface also exposes `swarmvault://schema`, `swarmvault://sessions`, and `swarmvault://sessions/{path}`, and includes `schemaPath` in `workspace_info`.
 
  ### `swarmvault graph serve`
 
- Start the local graph UI backed by `state/graph.json`.
+ Start the local graph workspace backed by `state/graph.json`, `/api/search`, and `/api/page`.
+
+ ### `swarmvault graph export --html <output>`
+
+ Export the graph workspace as a standalone HTML file with embedded graph and page data for offline sharing.
 
  ### `swarmvault install --agent <codex|claude|cursor>`
 
package/dist/index.js CHANGED
@@ -3,22 +3,33 @@
  // src/index.ts
  import process from "process";
  import {
+ acceptApproval,
+ archiveCandidate,
  compileVault,
  exploreVault,
+ exportGraphHtml,
  importInbox,
  ingestInput,
  initVault,
  installAgent,
  lintVault,
+ listApprovals,
+ listCandidates,
  loadVaultConfig,
+ promoteCandidate,
  queryVault,
+ readApproval,
+ rejectApproval,
+ listSchedules,
+ runSchedule,
+ serveSchedules,
  startGraphServer,
  startMcpServer,
  watchVault
  } from "@swarmvaultai/engine";
- import { Command } from "commander";
+ import { Command, Option } from "commander";
  var program = new Command();
- program.name("swarmvault").description("SwarmVault is a local-first LLM wiki compiler with graph outputs and pluggable providers.").version("0.1.5").option("--json", "Emit structured JSON output", false);
+ program.name("swarmvault").description("SwarmVault is a local-first LLM wiki compiler with graph outputs and pluggable providers.").version("0.1.7").option("--json", "Emit structured JSON output", false);
  function isJson() {
  return program.opts().json === true;
  }
@@ -35,10 +46,10 @@ function log(message) {
  `);
  }
  }
- program.command("init").description("Initialize a SwarmVault workspace in the current directory.").action(async () => {
- await initVault(process.cwd());
+ program.command("init").description("Initialize a SwarmVault workspace in the current directory.").option("--obsidian", "Generate a minimal .obsidian workspace alongside the vault", false).action(async (options) => {
+ await initVault(process.cwd(), { obsidian: options.obsidian ?? false });
  if (isJson()) {
- emitJson({ status: "initialized", rootDir: process.cwd() });
+ emitJson({ status: "initialized", rootDir: process.cwd(), obsidian: options.obsidian ?? false });
  } else {
  log("Initialized SwarmVault workspace.");
  }
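Every command handler in this diff follows the same dual-output convention: emit one JSON document when the global `--json` flag is set, otherwise print a human-readable line via `log`. A minimal sketch of that convention with an injected writer so it can be exercised outside the CLI; the `emit` helper and its argument shape are illustrative assumptions, not exports of the package:

```javascript
// Sketch of the --json/plain-text output convention used by each command.
// `write` is injected for testability; the real CLI writes to stdout/stderr.
function emit(result, { json, write }) {
  if (json) {
    // structured output for tooling: a single JSON document
    write(JSON.stringify(result));
  } else {
    // human output: one log line
    write(result.message);
  }
}

const lines = [];
const result = { status: "initialized", message: "Initialized SwarmVault workspace." };
emit(result, { json: false, write: (s) => lines.push(s) });
emit(result, { json: true, write: (s) => lines.push(s) });
// lines[0] is the plain message; lines[1] is the JSON document
```

Keeping the branch in one place like this is the usual way to guarantee that `--json` mode never mixes prose into machine-readable output.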
@@ -62,28 +73,40 @@ inbox.command("import").description("Import supported files from the configured
  );
  }
  });
- program.command("compile").description("Compile manifests into wiki pages, graph JSON, and search index.").action(async () => {
- const result = await compileVault(process.cwd());
+ program.command("compile").description("Compile manifests into wiki pages, graph JSON, and search index.").option("--approve", "Stage a review bundle without applying active page changes", false).action(async (options) => {
+ const result = await compileVault(process.cwd(), { approve: options.approve ?? false });
  if (isJson()) {
  emitJson(result);
  } else {
- log(`Compiled ${result.sourceCount} source(s), ${result.pageCount} page(s). Changed: ${result.changedPages.length}.`);
+ if (result.staged) {
+ log(`Staged ${result.changedPages.length} change(s) for review at ${result.approvalDir}.`);
+ } else {
+ log(`Compiled ${result.sourceCount} source(s), ${result.pageCount} page(s). Changed: ${result.changedPages.length}.`);
+ }
  }
  });
- program.command("query").description("Query the compiled SwarmVault wiki.").argument("<question>", "Question to ask SwarmVault").option("--save", "Persist the answer to wiki/outputs", false).action(async (question, options) => {
- const result = await queryVault(process.cwd(), question, options.save ?? false);
+ program.command("query").description("Query the compiled SwarmVault wiki.").argument("<question>", "Question to ask SwarmVault").option("--no-save", "Do not persist the answer to wiki/outputs").addOption(new Option("--format <format>", "Output format").choices(["markdown", "report", "slides", "chart", "image"]).default("markdown")).action(async (question, options) => {
+ const result = await queryVault(process.cwd(), {
+ question,
+ save: options.save ?? true,
+ format: options.format
+ });
  if (isJson()) {
  emitJson(result);
  } else {
  log(result.answer);
- if (result.savedTo) {
- log(`Saved to ${result.savedTo}`);
+ if (result.savedPath) {
+ log(`Saved to ${result.savedPath}`);
  }
  }
  });
- program.command("explore").description("Run a save-first multi-step exploration loop against the vault.").argument("<question>", "Root question to explore").option("--steps <n>", "Maximum number of exploration steps", "3").action(async (question, options) => {
+ program.command("explore").description("Run a save-first multi-step exploration loop against the vault.").argument("<question>", "Root question to explore").option("--steps <n>", "Maximum number of exploration steps", "3").addOption(new Option("--format <format>", "Output format for step pages").choices(["markdown", "report", "slides", "chart", "image"]).default("markdown")).action(async (question, options) => {
  const stepCount = Number.parseInt(options.steps ?? "3", 10);
- const result = await exploreVault(process.cwd(), question, Number.isFinite(stepCount) ? stepCount : 3);
+ const result = await exploreVault(process.cwd(), {
+ question,
+ steps: Number.isFinite(stepCount) ? stepCount : 3,
+ format: options.format
+ });
  if (isJson()) {
  emitJson(result);
  } else {
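The `explore` handler above, like `watch` and `schedule serve` later in this diff, parses a string option with `Number.parseInt` and falls back to a default when the result is not finite. That pattern, pulled out as a standalone sketch; the `parseCount` name is mine, not the package's:

```javascript
// Numeric CLI options arrive as strings; parse them and fall back on bad
// input, mirroring the Number.parseInt + Number.isFinite pattern in the diff.
function parseCount(raw, fallback) {
  const parsed = Number.parseInt(raw ?? String(fallback), 10);
  return Number.isFinite(parsed) ? parsed : fallback;
}

parseCount("5", 3);       // → 5
parseCount(undefined, 3); // → 3 (option omitted)
parseCount("abc", 3);     // → 3 (unparseable input)
```

The `Number.isFinite` guard is what catches `NaN` from unparseable input, since `Number.parseInt("abc", 10)` returns `NaN` rather than throwing.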
@@ -122,6 +145,89 @@ graph.command("serve").description("Serve the local graph viewer.").option("--po
  process.exit(0);
  });
  });
+ graph.command("export").description("Export the graph viewer as a single self-contained HTML file.").requiredOption("--html <output>", "Output HTML file path").action(async (options) => {
+ const outputPath = await exportGraphHtml(process.cwd(), options.html);
+ if (isJson()) {
+ emitJson({ outputPath });
+ } else {
+ log(`Exported graph HTML to ${outputPath}`);
+ }
+ });
+ var review = program.command("review").description("Review staged compile approval bundles.");
+ review.command("list").description("List staged approval bundles and their resolution status.").action(async () => {
+ const approvals = await listApprovals(process.cwd());
+ if (isJson()) {
+ emitJson(approvals);
+ return;
+ }
+ if (!approvals.length) {
+ log("No approval bundles.");
+ return;
+ }
+ for (const approval of approvals) {
+ log(
+ `${approval.approvalId} pending=${approval.pendingCount} accepted=${approval.acceptedCount} rejected=${approval.rejectedCount} created=${approval.createdAt}`
+ );
+ }
+ });
+ review.command("show").description("Show the entries inside a staged approval bundle.").argument("<approvalId>", "Approval bundle identifier").action(async (approvalId) => {
+ const approval = await readApproval(process.cwd(), approvalId);
+ if (isJson()) {
+ emitJson(approval);
+ return;
+ }
+ log(`${approval.approvalId} pending=${approval.pendingCount} accepted=${approval.acceptedCount} rejected=${approval.rejectedCount}`);
+ for (const entry of approval.entries) {
+ log(`- ${entry.status} ${entry.changeType} ${entry.pageId} ${entry.nextPath ?? entry.previousPath ?? ""}`.trim());
+ }
+ });
+ review.command("accept").description("Accept all pending entries, or selected entries, from a staged approval bundle.").argument("<approvalId>", "Approval bundle identifier").argument("[targets...]", "Optional page ids or paths to accept").action(async (approvalId, targets) => {
+ const result = await acceptApproval(process.cwd(), approvalId, targets);
+ if (isJson()) {
+ emitJson(result);
+ } else {
+ log(`Accepted ${result.updatedEntries.length} entr${result.updatedEntries.length === 1 ? "y" : "ies"} from ${approvalId}.`);
+ }
+ });
+ review.command("reject").description("Reject all pending entries, or selected entries, from a staged approval bundle.").argument("<approvalId>", "Approval bundle identifier").argument("[targets...]", "Optional page ids or paths to reject").action(async (approvalId, targets) => {
+ const result = await rejectApproval(process.cwd(), approvalId, targets);
+ if (isJson()) {
+ emitJson(result);
+ } else {
+ log(`Rejected ${result.updatedEntries.length} entr${result.updatedEntries.length === 1 ? "y" : "ies"} from ${approvalId}.`);
+ }
+ });
+ var candidate = program.command("candidate").description("Candidate page workflows.");
+ candidate.command("list").description("List staged concept and entity candidates.").action(async () => {
+ const candidates = await listCandidates(process.cwd());
+ if (isJson()) {
+ emitJson(candidates);
+ return;
+ }
+ if (!candidates.length) {
+ log("No candidates.");
+ return;
+ }
+ for (const entry of candidates) {
+ log(`${entry.pageId} ${entry.path} -> ${entry.activePath}`);
+ }
+ });
+ candidate.command("promote").description("Promote a candidate into its active concept or entity path.").argument("<target>", "Candidate page id or path").action(async (target) => {
+ const result = await promoteCandidate(process.cwd(), target);
+ if (isJson()) {
+ emitJson(result);
+ } else {
+ log(`Promoted ${result.pageId} to ${result.path}`);
+ }
+ });
+ candidate.command("archive").description("Archive a candidate by removing it from the active candidate set.").argument("<target>", "Candidate page id or path").action(async (target) => {
+ const result = await archiveCandidate(process.cwd(), target);
+ if (isJson()) {
+ emitJson(result);
+ } else {
+ log(`Archived ${result.pageId}`);
+ }
+ });
  program.command("watch").description("Watch the inbox directory and run import/compile cycles on changes.").option("--lint", "Run lint after each compile cycle", false).option("--debounce <ms>", "Debounce window in milliseconds", "900").action(async (options) => {
  const debounceMs = Number.parseInt(options.debounce ?? "900", 10);
  const { paths } = await loadVaultConfig(process.cwd());
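The `review accept` and `review reject` handlers in the hunk above build their summary lines with an inline `entr${n === 1 ? "y" : "ies"}` pluralization. Extracted as a tiny helper for illustration only; the `entryWord` name and the sample bundle id are mine, not the package's:

```javascript
// Irregular pluralization of "entry"/"entries", as done inline in the
// accept/reject log messages of this diff.
const entryWord = (n) => `entr${n === 1 ? "y" : "ies"}`;

// Hypothetical bundle id for demonstration:
const one = `Accepted 1 ${entryWord(1)} from bundle-001.`;
const many = `Accepted 2 ${entryWord(2)} from bundle-001.`;
// one  → "Accepted 1 entry from bundle-001."
// many → "Accepted 2 entries from bundle-001."
```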
@@ -139,6 +245,46 @@ program.command("watch").description("Watch the inbox directory and run import/c
  process.exit(0);
  });
  });
+ var schedule = program.command("schedule").description("Run scheduled vault maintenance jobs.");
+ schedule.command("list").description("List configured schedule jobs and their next run state.").action(async () => {
+ const schedules = await listSchedules(process.cwd());
+ if (isJson()) {
+ emitJson(schedules);
+ return;
+ }
+ if (!schedules.length) {
+ log("No schedules configured.");
+ return;
+ }
+ for (const entry of schedules) {
+ log(
+ `${entry.jobId} enabled=${entry.enabled} task=${entry.taskType} next=${entry.nextRunAt ?? "n/a"} last=${entry.lastRunAt ?? "never"} status=${entry.lastStatus ?? "n/a"} approval=${entry.lastApprovalId ?? "none"}`
+ );
+ }
+ });
+ schedule.command("run").description("Run one configured schedule job immediately.").argument("<jobId>", "Schedule identifier").action(async (jobId) => {
+ const result = await runSchedule(process.cwd(), jobId);
+ if (isJson()) {
+ emitJson(result);
+ return;
+ }
+ log(
+ `${jobId} ${result.success ? "completed" : "failed"} (${result.taskType})${result.approvalId ? ` approval=${result.approvalId}` : ""}${result.error ? ` error=${result.error}` : ""}`
+ );
+ });
+ schedule.command("serve").description("Run the local schedule loop.").option("--poll <ms>", "Polling interval in milliseconds", "30000").action(async (options) => {
+ const pollMs = Number.parseInt(options.poll ?? "30000", 10);
+ const controller = await serveSchedules(process.cwd(), Number.isFinite(pollMs) ? pollMs : 3e4);
+ if (isJson()) {
+ emitJson({ status: "serving", pollMs: Number.isFinite(pollMs) ? pollMs : 3e4 });
+ } else {
+ log("Serving schedules. Press Ctrl+C to stop.");
+ }
+ process.on("SIGINT", async () => {
+ await controller.close();
+ process.exit(0);
+ });
+ });
  program.command("mcp").description("Run SwarmVault as a local MCP server over stdio.").action(async () => {
  if (isJson()) {
  process.stderr.write(`${JSON.stringify({ status: "running", transport: "stdio" })}
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@swarmvaultai/cli",
- "version": "0.1.5",
+ "version": "0.1.7",
  "description": "Global CLI for SwarmVault.",
  "type": "module",
  "main": "dist/index.js",
@@ -39,7 +39,7 @@
  },
  "dependencies": {
  "commander": "^14.0.1",
- "@swarmvaultai/engine": "0.1.5"
+ "@swarmvaultai/engine": "0.1.7"
  },
  "devDependencies": {
  "@types/node": "^24.6.0",