@swarmvaultai/cli 0.1.4 → 0.1.7
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +111 -12
- package/dist/index.js +247 -37
- package/package.json +3 -3
package/README.md
CHANGED

@@ -2,7 +2,7 @@

 `@swarmvaultai/cli` is the global command-line entry point for SwarmVault.

-It gives you the `swarmvault` command for building a local-first knowledge vault from files, URLs, browser clips,
+It gives you the `swarmvault` command for building a local-first knowledge vault from files, URLs, browser clips, saved query outputs, and guided exploration runs.

 ## Install

@@ -22,27 +22,34 @@ Installed commands:
 ```bash
 mkdir my-vault
 cd my-vault
-swarmvault init
+swarmvault init --obsidian
 sed -n '1,120p' swarmvault.schema.md
 swarmvault ingest ./notes.md
 swarmvault compile
-swarmvault query "What keeps recurring?"
+swarmvault query "What keeps recurring?"
+swarmvault query "Turn this into slides" --format slides
+swarmvault explore "What should I research next?" --steps 3
+swarmvault lint --deep
 swarmvault graph serve
+swarmvault graph export --html ./exports/graph.html
 ```

 ## Commands

-### `swarmvault init`
+### `swarmvault init [--obsidian]`

 Create a workspace with:

 - `inbox/`
 - `raw/`
 - `wiki/`
+- `wiki/insights/`
 - `state/`
+- `state/sessions/`
 - `agent/`
 - `swarmvault.config.json`
 - `swarmvault.schema.md`
+- optional `.obsidian/` workspace files when `--obsidian` is passed

 The schema file is the vault-specific instruction layer. Edit it to define naming rules, categories, grounding expectations, and exclusions before a serious compile.

@@ -54,7 +61,7 @@ Ingest a local file path or URL into immutable source storage and write a manife

 Import supported files from the configured inbox directory. This is meant for browser-clipper style markdown bundles and other capture workflows. Local image and asset references are preserved and copied into canonical storage under `raw/assets/`.

-### `swarmvault compile`
+### `swarmvault compile [--approve]`

 Compile the current manifests into:

@@ -62,19 +69,78 @@ Compile the current manifests into:
 - structured graph data in `state/graph.json`
 - local search data in `state/search.sqlite`

-The compiler also reads `swarmvault.schema.md` and records a `schema_hash` in generated pages so schema edits can mark pages stale.
+The compiler also reads `swarmvault.schema.md` and records a `schema_hash` plus lifecycle metadata such as `status`, `created_at`, `updated_at`, `compiled_from`, and `managed_by` in generated pages so schema edits can mark pages stale without losing lifecycle state.

-
+New concept and entity pages are staged into `wiki/candidates/` first. A later matching compile promotes them into `wiki/concepts/` or `wiki/entities/`.

-
+With `--approve`, compile writes a staged review bundle into `state/approvals/` without applying active wiki changes.

-### `swarmvault
+### `swarmvault review list|show|accept|reject`
+
+Inspect and resolve staged approval bundles created by `swarmvault compile --approve`.
+
+- `review list` shows pending, accepted, and rejected entry counts per bundle
+- `review show <approvalId>` shows each staged entry plus its current and staged content
+- `review accept <approvalId> [targets...]` applies pending entries to the live wiki
+- `review reject <approvalId> [targets...]` marks pending entries as rejected without mutating active wiki paths
+
+Targets can be page ids such as `concept:approval-concept` or relative wiki paths such as `concepts/approval-concept.md`.
+
+### `swarmvault candidate list|promote|archive`
+
+Inspect and resolve staged concept and entity candidates.
+
+- `candidate list` shows every current candidate plus its active destination path
+- `candidate promote <target>` promotes a candidate immediately into `wiki/concepts/` or `wiki/entities/`
+- `candidate archive <target>` removes a candidate from the staged set
+
+Targets can be page ids or relative paths under `wiki/candidates/`.
+
+### `swarmvault query "<question>" [--no-save] [--format markdown|report|slides]`
+
+Query the compiled vault. The query layer also reads `swarmvault.schema.md`, so answers follow the vault’s own structure and grounding rules.
+
+By default, the answer is written into `wiki/outputs/` and immediately registered in:
+
+- `wiki/index.md`
+- `wiki/outputs/index.md`
+- `state/graph.json`
+- `state/search.sqlite`
+
+Saved outputs also carry related page, node, and source metadata so SwarmVault can refresh related source, concept, and entity pages immediately.
+
+Human-authored pages in `wiki/insights/` are also indexed into search and query context, but SwarmVault does not rewrite them after initialization.
+
+### `swarmvault explore "<question>" [--steps <n>]`
+
+Run a save-first multi-step research loop.
+
+Each step:
+
+- queries the vault
+- saves the answer into `wiki/outputs/`
+- generates follow-up questions
+- chooses the next follow-up deterministically
+
+The command also writes a hub page linking the root question, saved step pages, and generated follow-up questions.
+
+### `swarmvault lint [--deep] [--web]`

 Run anti-drift and vault health checks such as stale pages, missing graph artifacts, and other structural issues.

+`--deep` adds an LLM-powered advisory pass that can report:
+
+- `coverage_gap`
+- `contradiction_candidate`
+- `missing_citation`
+- `candidate_page`
+- `follow_up_question`
+
+`--web` can only be used with `--deep`. It enriches deep-lint findings with external evidence snippets and URLs from a configured web-search provider.
+
 ### `swarmvault watch [--lint] [--debounce <ms>]`

-Watch the inbox directory and trigger import and compile cycles when files change. With `--lint`, each cycle also runs linting.
+Watch the inbox directory and trigger import and compile cycles when files change. With `--lint`, each cycle also runs linting. Each cycle writes a canonical session artifact to `state/sessions/`, and compatibility run metadata is still appended to `state/jobs.ndjson`.

 ### `swarmvault mcp`

@@ -89,11 +155,15 @@ Run SwarmVault as a local MCP server over stdio. This exposes the vault to compa
 - `compile_vault`
 - `lint_vault`

-The MCP surface also exposes `swarmvault://schema` and includes `schemaPath` in `workspace_info`.
+The MCP surface also exposes `swarmvault://schema`, `swarmvault://sessions`, `swarmvault://sessions/{path}`, and includes `schemaPath` in `workspace_info`.

 ### `swarmvault graph serve`

-Start the local graph
+Start the local graph workspace backed by `state/graph.json`, `/api/search`, and `/api/page`.
+
+### `swarmvault graph export --html <output>`
+
+Export the graph workspace as a standalone HTML file with embedded graph and page data for offline sharing.

 ### `swarmvault install --agent <codex|claude|cursor>`

@@ -126,10 +196,39 @@ Example:

 Generic OpenAI-compatible APIs are supported through config when the provider follows the OpenAI request shape closely enough.

+Deep lint web augmentation uses a separate `webSearch` config block. Example:
+
+```json
+{
+  "webSearch": {
+    "providers": {
+      "evidence": {
+        "type": "http-json",
+        "endpoint": "https://search.example/api/search",
+        "method": "GET",
+        "apiKeyEnv": "SEARCH_API_KEY",
+        "apiKeyHeader": "Authorization",
+        "apiKeyPrefix": "Bearer ",
+        "queryParam": "q",
+        "limitParam": "limit",
+        "resultsPath": "results",
+        "titleField": "title",
+        "urlField": "url",
+        "snippetField": "snippet"
+      }
+    },
+    "tasks": {
+      "deepLintProvider": "evidence"
+    }
+  }
+}
+```
+
 ## Troubleshooting

 - If you are running from a source checkout and `graph serve` says the viewer build is missing, run `pnpm build` in the repository first
 - If a provider claims OpenAI compatibility but fails structured generation, declare only the capabilities it actually supports
+- If `lint --deep --web` fails immediately, make sure a `webSearch` provider is configured and mapped to `tasks.deepLintProvider`
 - Node 24 may emit an experimental warning for `node:sqlite`; that is expected in the current release

 ## Links
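A note on machine-readable output: the updated `dist/index.js` in this diff adds a global `--json` flag, and its deep-lint findings carry `severity`, `code`, `message`, and an optional `pagePath`. As a hedged sketch — the field names come from this diff, but the concrete values are invented for illustration — a `swarmvault lint --deep --json` run might emit a document shaped like:

```json
[
  {
    "severity": "warn",
    "code": "coverage_gap",
    "message": "Several sources mention a topic with no dedicated concept page.",
    "pagePath": "wiki/concepts/example-topic.md"
  }
]
```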
package/dist/index.js
CHANGED

@@ -1,92 +1,295 @@
 #!/usr/bin/env node

 // src/index.ts
-import { Command } from "commander";
 import process from "process";
 import {
+  acceptApproval,
+  archiveCandidate,
   compileVault,
+  exploreVault,
+  exportGraphHtml,
   importInbox,
   ingestInput,
   initVault,
   installAgent,
   lintVault,
+  listApprovals,
+  listCandidates,
+  loadVaultConfig,
+  promoteCandidate,
   queryVault,
+  readApproval,
+  rejectApproval,
+  listSchedules,
+  runSchedule,
+  serveSchedules,
   startGraphServer,
   startMcpServer,
   watchVault
 } from "@swarmvaultai/engine";
+import { Command, Option } from "commander";
 var program = new Command();
-program.name("swarmvault").description("SwarmVault is a local-first LLM wiki compiler with graph outputs and pluggable providers.").version("0.1.
-
-
-
+program.name("swarmvault").description("SwarmVault is a local-first LLM wiki compiler with graph outputs and pluggable providers.").version("0.1.7").option("--json", "Emit structured JSON output", false);
+function isJson() {
+  return program.opts().json === true;
+}
+function emitJson(data) {
+  process.stdout.write(`${JSON.stringify(data)}
+`);
+}
+function log(message) {
+  if (isJson()) {
+    process.stderr.write(`${message}
+`);
+  } else {
+    process.stdout.write(`${message}
+`);
+  }
+}
+program.command("init").description("Initialize a SwarmVault workspace in the current directory.").option("--obsidian", "Generate a minimal .obsidian workspace alongside the vault", false).action(async (options) => {
+  await initVault(process.cwd(), { obsidian: options.obsidian ?? false });
+  if (isJson()) {
+    emitJson({ status: "initialized", rootDir: process.cwd(), obsidian: options.obsidian ?? false });
+  } else {
+    log("Initialized SwarmVault workspace.");
+  }
 });
 program.command("ingest").description("Ingest a local file path or URL into the raw SwarmVault workspace.").argument("<input>", "Local file path or URL").action(async (input) => {
   const manifest = await ingestInput(process.cwd(), input);
-
-
+  if (isJson()) {
+    emitJson(manifest);
+  } else {
+    log(manifest.sourceId);
+  }
 });
 var inbox = program.command("inbox").description("Inbox and capture workflows.");
 inbox.command("import").description("Import supported files from the configured inbox directory.").argument("[dir]", "Optional inbox directory override").action(async (dir) => {
   const result = await importInbox(process.cwd(), dir);
-
-
-
-
+  if (isJson()) {
+    emitJson(result);
+  } else {
+    log(
+      `Imported ${result.imported.length} source(s) from ${result.inputDir}. Scanned: ${result.scannedCount}. Attachments: ${result.attachmentCount}. Skipped: ${result.skipped.length}.`
+    );
+  }
 });
-program.command("compile").description("Compile manifests into wiki pages, graph JSON, and search index.").action(async () => {
-  const result = await compileVault(process.cwd());
-
-
-
-
+program.command("compile").description("Compile manifests into wiki pages, graph JSON, and search index.").option("--approve", "Stage a review bundle without applying active page changes", false).action(async (options) => {
+  const result = await compileVault(process.cwd(), { approve: options.approve ?? false });
+  if (isJson()) {
+    emitJson(result);
+  } else {
+    if (result.staged) {
+      log(`Staged ${result.changedPages.length} change(s) for review at ${result.approvalDir}.`);
+    } else {
+      log(`Compiled ${result.sourceCount} source(s), ${result.pageCount} page(s). Changed: ${result.changedPages.length}.`);
+    }
+  }
 });
-program.command("query").description("Query the compiled SwarmVault wiki.").argument("<question>", "Question to ask SwarmVault").option("--save", "
-  const result = await queryVault(process.cwd(),
-
-
-
-
-
+program.command("query").description("Query the compiled SwarmVault wiki.").argument("<question>", "Question to ask SwarmVault").option("--no-save", "Do not persist the answer to wiki/outputs").addOption(new Option("--format <format>", "Output format").choices(["markdown", "report", "slides", "chart", "image"]).default("markdown")).action(async (question, options) => {
+  const result = await queryVault(process.cwd(), {
+    question,
+    save: options.save ?? true,
+    format: options.format
+  });
+  if (isJson()) {
+    emitJson(result);
+  } else {
+    log(result.answer);
+    if (result.savedPath) {
+      log(`Saved to ${result.savedPath}`);
+    }
   }
 });
-program.command("
-const
+program.command("explore").description("Run a save-first multi-step exploration loop against the vault.").argument("<question>", "Root question to explore").option("--steps <n>", "Maximum number of exploration steps", "3").addOption(new Option("--format <format>", "Output format for step pages").choices(["markdown", "report", "slides", "chart", "image"]).default("markdown")).action(async (question, options) => {
+  const stepCount = Number.parseInt(options.steps ?? "3", 10);
+  const result = await exploreVault(process.cwd(), {
+    question,
+    steps: Number.isFinite(stepCount) ? stepCount : 3,
+    format: options.format
+  });
+  if (isJson()) {
+    emitJson(result);
+  } else {
+    log(`Exploration hub saved to ${result.hubPath}`);
+    log(`Completed ${result.stepCount} step(s).`);
+  }
+});
+program.command("lint").description("Run anti-drift and wiki-health checks.").option("--deep", "Run LLM-powered advisory lint", false).option("--web", "Augment deep lint with configured web search", false).action(async (options) => {
+  const findings = await lintVault(process.cwd(), {
+    deep: options.deep ?? false,
+    web: options.web ?? false
+  });
+  if (isJson()) {
+    emitJson(findings);
+    return;
+  }
   if (!findings.length) {
-
+    log("No findings.");
     return;
   }
   for (const finding of findings) {
-
-      `[${finding.severity}] ${finding.code}: ${finding.message}${finding.pagePath ? ` (${finding.pagePath})` : ""}
-`
-    );
+    log(`[${finding.severity}] ${finding.code}: ${finding.message}${finding.pagePath ? ` (${finding.pagePath})` : ""}`);
   }
 });
 var graph = program.command("graph").description("Graph-related commands.");
 graph.command("serve").description("Serve the local graph viewer.").option("--port <port>", "Port override").action(async (options) => {
   const port = options.port ? Number.parseInt(options.port, 10) : void 0;
   const server = await startGraphServer(process.cwd(), port);
-
-`);
+  if (isJson()) {
+    emitJson({ port: server.port, url: `http://localhost:${server.port}` });
+  } else {
+    log(`Graph viewer running at http://localhost:${server.port}`);
+  }
   process.on("SIGINT", async () => {
     await server.close();
     process.exit(0);
   });
 });
+graph.command("export").description("Export the graph viewer as a single self-contained HTML file.").requiredOption("--html <output>", "Output HTML file path").action(async (options) => {
+  const outputPath = await exportGraphHtml(process.cwd(), options.html);
+  if (isJson()) {
+    emitJson({ outputPath });
+  } else {
+    log(`Exported graph HTML to ${outputPath}`);
+  }
+});
+var review = program.command("review").description("Review staged compile approval bundles.");
+review.command("list").description("List staged approval bundles and their resolution status.").action(async () => {
+  const approvals = await listApprovals(process.cwd());
+  if (isJson()) {
+    emitJson(approvals);
+    return;
+  }
+  if (!approvals.length) {
+    log("No approval bundles.");
+    return;
+  }
+  for (const approval of approvals) {
+    log(
+      `${approval.approvalId} pending=${approval.pendingCount} accepted=${approval.acceptedCount} rejected=${approval.rejectedCount} created=${approval.createdAt}`
+    );
+  }
+});
+review.command("show").description("Show the entries inside a staged approval bundle.").argument("<approvalId>", "Approval bundle identifier").action(async (approvalId) => {
+  const approval = await readApproval(process.cwd(), approvalId);
+  if (isJson()) {
+    emitJson(approval);
+    return;
+  }
+  log(`${approval.approvalId} pending=${approval.pendingCount} accepted=${approval.acceptedCount} rejected=${approval.rejectedCount}`);
+  for (const entry of approval.entries) {
+    log(`- ${entry.status} ${entry.changeType} ${entry.pageId} ${entry.nextPath ?? entry.previousPath ?? ""}`.trim());
+  }
+});
+review.command("accept").description("Accept all pending entries, or selected entries, from a staged approval bundle.").argument("<approvalId>", "Approval bundle identifier").argument("[targets...]", "Optional page ids or paths to accept").action(async (approvalId, targets) => {
+  const result = await acceptApproval(process.cwd(), approvalId, targets);
+  if (isJson()) {
+    emitJson(result);
+  } else {
+    log(`Accepted ${result.updatedEntries.length} entr${result.updatedEntries.length === 1 ? "y" : "ies"} from ${approvalId}.`);
+  }
+});
+review.command("reject").description("Reject all pending entries, or selected entries, from a staged approval bundle.").argument("<approvalId>", "Approval bundle identifier").argument("[targets...]", "Optional page ids or paths to reject").action(async (approvalId, targets) => {
+  const result = await rejectApproval(process.cwd(), approvalId, targets);
+  if (isJson()) {
+    emitJson(result);
+  } else {
+    log(`Rejected ${result.updatedEntries.length} entr${result.updatedEntries.length === 1 ? "y" : "ies"} from ${approvalId}.`);
+  }
+});
+var candidate = program.command("candidate").description("Candidate page workflows.");
+candidate.command("list").description("List staged concept and entity candidates.").action(async () => {
+  const candidates = await listCandidates(process.cwd());
+  if (isJson()) {
+    emitJson(candidates);
+    return;
+  }
+  if (!candidates.length) {
+    log("No candidates.");
+    return;
+  }
+  for (const entry of candidates) {
+    log(`${entry.pageId} ${entry.path} -> ${entry.activePath}`);
+  }
+});
+candidate.command("promote").description("Promote a candidate into its active concept or entity path.").argument("<target>", "Candidate page id or path").action(async (target) => {
+  const result = await promoteCandidate(process.cwd(), target);
+  if (isJson()) {
+    emitJson(result);
+  } else {
+    log(`Promoted ${result.pageId} to ${result.path}`);
+  }
+});
+candidate.command("archive").description("Archive a candidate by removing it from the active candidate set.").argument("<target>", "Candidate page id or path").action(async (target) => {
+  const result = await archiveCandidate(process.cwd(), target);
+  if (isJson()) {
+    emitJson(result);
+  } else {
+    log(`Archived ${result.pageId}`);
+  }
+});
 program.command("watch").description("Watch the inbox directory and run import/compile cycles on changes.").option("--lint", "Run lint after each compile cycle", false).option("--debounce <ms>", "Debounce window in milliseconds", "900").action(async (options) => {
   const debounceMs = Number.parseInt(options.debounce ?? "900", 10);
+  const { paths } = await loadVaultConfig(process.cwd());
   const controller = await watchVault(process.cwd(), {
     lint: options.lint ?? false,
     debounceMs: Number.isFinite(debounceMs) ? debounceMs : 900
   });
-
+  if (isJson()) {
+    emitJson({ status: "watching", inboxDir: paths.inboxDir });
+  } else {
+    log("Watching inbox for changes. Press Ctrl+C to stop.");
+  }
+  process.on("SIGINT", async () => {
+    await controller.close();
+    process.exit(0);
+  });
+});
+var schedule = program.command("schedule").description("Run scheduled vault maintenance jobs.");
+schedule.command("list").description("List configured schedule jobs and their next run state.").action(async () => {
+  const schedules = await listSchedules(process.cwd());
+  if (isJson()) {
+    emitJson(schedules);
+    return;
+  }
+  if (!schedules.length) {
+    log("No schedules configured.");
+    return;
+  }
+  for (const entry of schedules) {
+    log(
+      `${entry.jobId} enabled=${entry.enabled} task=${entry.taskType} next=${entry.nextRunAt ?? "n/a"} last=${entry.lastRunAt ?? "never"} status=${entry.lastStatus ?? "n/a"} approval=${entry.lastApprovalId ?? "none"}`
+    );
+  }
+});
+schedule.command("run").description("Run one configured schedule job immediately.").argument("<jobId>", "Schedule identifier").action(async (jobId) => {
+  const result = await runSchedule(process.cwd(), jobId);
+  if (isJson()) {
+    emitJson(result);
+    return;
+  }
+  log(
+    `${jobId} ${result.success ? "completed" : "failed"} (${result.taskType})${result.approvalId ? ` approval=${result.approvalId}` : ""}${result.error ? ` error=${result.error}` : ""}`
+  );
+});
+schedule.command("serve").description("Run the local schedule loop.").option("--poll <ms>", "Polling interval in milliseconds", "30000").action(async (options) => {
+  const pollMs = Number.parseInt(options.poll ?? "30000", 10);
+  const controller = await serveSchedules(process.cwd(), Number.isFinite(pollMs) ? pollMs : 3e4);
+  if (isJson()) {
+    emitJson({ status: "serving", pollMs: Number.isFinite(pollMs) ? pollMs : 3e4 });
+  } else {
+    log("Serving schedules. Press Ctrl+C to stop.");
+  }
   process.on("SIGINT", async () => {
     await controller.close();
     process.exit(0);
   });
 });
 program.command("mcp").description("Run SwarmVault as a local MCP server over stdio.").action(async () => {
+  if (isJson()) {
+    process.stderr.write(`${JSON.stringify({ status: "running", transport: "stdio" })}
+`);
+  }
   const controller = await startMcpServer(process.cwd());
   process.on("SIGINT", async () => {
     await controller.close();
@@ -95,12 +298,19 @@ program.command("mcp").description("Run SwarmVault as a local MCP server over st
 });
 program.command("install").description("Install SwarmVault instructions for an agent in the current project.").requiredOption("--agent <agent>", "codex, claude, or cursor").action(async (options) => {
   const target = await installAgent(process.cwd(), options.agent);
-
-
+  if (isJson()) {
+    emitJson({ agent: options.agent, target });
+  } else {
+    log(`Installed rules into ${target}`);
+  }
 });
 program.parseAsync(process.argv).catch((error) => {
   const message = error instanceof Error ? error.message : String(error);
-
+  if (isJson()) {
+    emitJson({ error: message });
+  } else {
+    process.stderr.write(`${message}
 `);
+  }
   process.exit(1);
 });
package/package.json
CHANGED

@@ -1,6 +1,6 @@
 {
   "name": "@swarmvaultai/cli",
-  "version": "0.1.
+  "version": "0.1.7",
   "description": "Global CLI for SwarmVault.",
   "type": "module",
   "main": "dist/index.js",
@@ -39,7 +39,7 @@
   },
   "dependencies": {
     "commander": "^14.0.1",
-    "@swarmvaultai/engine": "0.1.
+    "@swarmvaultai/engine": "0.1.7"
   },
   "devDependencies": {
     "@types/node": "^24.6.0",
@@ -48,6 +48,6 @@
   "scripts": {
     "build": "tsup src/index.ts --format esm --dts",
     "test": "node -e \"process.exit(0)\"",
-    "
+    "typecheck": "tsc --noEmit"
   }
 }