vericontext 0.1.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2026 vericontext contributors
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
package/README.md ADDED
@@ -0,0 +1,268 @@
+ <p align="center">
+ <picture>
+ <source media="(prefers-color-scheme: dark)" srcset="./assets/logo-dark.png">
+ <source media="(prefers-color-scheme: light)" srcset="./assets/logo-light.png">
+ <img alt="vericontext" src="./assets/logo-light.png" width="360">
+ </picture>
+ </p>
+
+ <h3 align="center">Docs lie. Hashes don't.</h3>
+
+ <p align="center">
+ Deterministic, hash-based verification for docs that reference code.<br>
+ Fail-closed. Zero fuzzy matching. One stale citation → entire doc fails.
+ </p>
+
+ <!-- BADGES:START -->
+ <!-- generated by add-badges 2026-02-23 -->
+ <p align="center">
+ <a href="https://www.npmjs.com/package/vericontext"><img src="https://img.shields.io/npm/v/vericontext?style=flat-square&logo=npm&logoColor=white" alt="npm version"></a>
+ <a href="LICENSE"><img src="https://img.shields.io/badge/license-MIT-blue?style=flat-square" alt="License: MIT"></a>
+ <a href="https://www.typescriptlang.org/"><img src="https://img.shields.io/badge/TypeScript-strict-3178C6?style=flat-square&logo=typescript&logoColor=white" alt="TypeScript"></a>
+ <a href="https://modelcontextprotocol.io"><img src="https://img.shields.io/badge/MCP-compatible-8A2BE2?style=flat-square" alt="MCP compatible"></a>
+ <a href="package.json"><img src="https://img.shields.io/badge/node-%3E%3D20-339933?style=flat-square&logo=nodedotjs&logoColor=white" alt="Node >= 20"></a>
+ </p>
+ <!-- BADGES:END -->
+
+ <p align="center">
+ <b>English</b> ·
+ <a href="README-ko.md">한국어</a>
+ </p>
+
+ <p align="center">
+ <a href="#the-problem">Why</a> ·
+ <a href="#setup">Setup</a> ·
+ <a href="#github-actions">CI</a> ·
+ <a href="#how-it-works">How It Works</a> ·
+ <a href="#cli-reference">CLI</a>
+ </p>
+
+ ---
+
+ ## The Problem
+
+ AI coding agents now scatter `README.md` and `AGENTS.md` across every directory — `deep init` style. It works great until the code changes and the docs don't. Files move, functions get rewritten, but the docs still say *"this logic lives in `src/handler.ts` at L30-L45."*
+
+ Over time, **your docs become liars.** And agents that trust those docs make worse decisions.
+
+ VeriContext embeds a **SHA-256 content hash** into every code citation at write time. When you verify later, either the hash matches or it doesn't. No fuzzy recovery. No "close enough."
+
+ ```
+ Write time: "handler logic" [[vctx:src/handler.ts#L30-L45@a1b2c3d4]]
+ After change: vericontext verify → ❌ hash_mismatch (a1b2c3d4 ≠ f5e6d7c8)
+ ```
+
+ **One broken citation → entire doc fails.** Fail-closed by design.
+
+ ## Setup
+
+ ```bash
+ npx skills add amsminn/vericontext --skill vericontext-enforcer
+ ```
+
+ Detects your agent (Claude Code, Codex, Antigravity, Cursor, Windsurf, OpenCode, …) and installs the skill to the right path. Supports [40+ agents](https://github.com/vercel-labs/skills).
+
+ To uninstall:
+
+ ```bash
+ npx skills remove vericontext-enforcer
+ ```
+
+ Once installed, the agent automatically:
+ - Inserts `[[vctx:...]]` citations when referencing code in docs
+ - Runs `vctx_verify_workspace` before commits and plan finalization
+ - Flags stale citations for update
+
+ <details>
+ <summary>Manual install per agent</summary>
+
+ The skill lives at `skills/vericontext-enforcer/SKILL.md` in this repo. Copy it to the right place for your agent:
+
+ | Agent | Skill path | Also needs MCP? |
+ |-------|-----------|-----------------|
+ | **Claude Code** | `.claude/skills/vericontext-enforcer/` | No (skill handles it) |
+ | **Antigravity** | `~/.gemini/antigravity/skills/vericontext-enforcer/` | No |
+ | **Codex** | `.codex/skills/vericontext-enforcer/` | Yes |
+ | **Cursor** | `.cursor/rules/vericontext.mdc` | Yes |
+ | **Windsurf** | `.windsurf/rules/vericontext.md` | Yes |
+ | **OpenCode** | Project rules | Yes |
+
+ For agents that need MCP, add the server to your config:
+
+ ```json
+ { "mcpServers": { "vericontext": { "command": "npx", "args": ["-y", "vericontext", "mcp"] } } }
+ ```
+
+ MCP tools exposed: `vctx_cite`, `vctx_claim`, `vctx_verify_workspace`
+
+ </details>
+
+ ## GitHub Actions
+
+ Add a step to any workflow:
+
+ ```yaml
+ - name: Verify docs
+   run: npx -y vericontext verify workspace --root . --in-path README.md --json
+ ```
+
+ Stale citation → CI red.
+
+ ### Pre-commit Hook
+
+ ```bash
+ # .husky/pre-commit
+ npx vericontext verify workspace --root . --in-path README.md --json
+ ```
+
+ ## How It Works
+
+ ```
+ ┌──────────┐    cite    ┌────────────────────────────────────┐
+ │   file   │ ─────────→ │   [[vctx:path#L1-L2@<sha256-8>]]   │
+ └──────────┘            └────────────────────────────────────┘
+
+ ┌──────────┐   verify   ┌─────────────▼──────────────────────┐
+ │   file   │ ─────────→ │  hash match? → ok : hash_mismatch  │
+ │(changed?)│            └────────────────────────────────────┘
+ └──────────┘
+ ```
+
+ 1. **Cite** — snapshot a file's line range with its SHA-256 hash
+ 2. **Claim** — declare structure facts: `exists`, `exists-file`, `exists-dir`, `missing`
+ 3. **Verify** — check every citation and claim in a doc. One failure → whole doc fails
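The fail-closed rule in step 3 is a plain aggregation over per-claim results, as in `verifyWorkspace` in the bundled CLI; a condensed sketch (`summarize` is an illustrative helper, not a vericontext API):

```javascript
// Every citation and structure claim yields one result; the document
// passes only when no result failed.
function summarize(results) {
  const fail_count = results.filter((r) => !r.ok).length;
  return {
    ok: fail_count === 0,
    total: results.length,
    ok_count: results.length - fail_count,
    fail_count,
  };
}

console.log(summarize([{ ok: true }, { ok: false, reason: "hash_mismatch" }]));
// → { ok: false, total: 2, ok_count: 1, fail_count: 1 }
```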
+
+ ### Claim Syntax
+
+ ```
+ Citation:    [[vctx:<path>#L<start>-L<end>@<hash8>]]
+ Exists:      [[vctx-exists:<path>]]
+ Exists-file: [[vctx-exists-file:<path>]]
+ Exists-dir:  [[vctx-exists-dir:<path>/]]
+ Missing:     [[vctx-missing:<path>]]
+ ```
+
+ Claims can be hidden in HTML comments so they don't clutter the doc:
+
+ ```html
+ The auth module handles login.
+ <!-- [[vctx:src/auth.ts#L1-L50@abcd1234]] -->
+ ```
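Hidden or visible, citation tokens are extracted with the same regular expression, `CITATION_REGEX` in the bundled `dist/cli.js`; a sketch of the parse step:

```javascript
// [[vctx:<path>#L<start>-L<end>@<hash8>]] matches anywhere in the text,
// including inside HTML comments.
const CITATION_REGEX = /\[\[vctx:([^#\]]+)#L([0-9]+)-L([0-9]+)@([a-f0-9]{8})\]\]/g;

const doc = `The auth module handles login.
<!-- [[vctx:src/auth.ts#L1-L50@abcd1234]] -->`;

const tokens = [...doc.matchAll(CITATION_REGEX)].map((m) => ({
  path: m[1],
  startLine: Number.parseInt(m[2], 10),
  endLine: Number.parseInt(m[3], 10),
  hash8: m[4],
}));

console.log(tokens[0]);
// → { path: 'src/auth.ts', startLine: 1, endLine: 50, hash8: 'abcd1234' }
```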
+
+ ### Features
+
+ | | Feature | Description |
+ |---|---|---|
+ | # | **Hash citations** | Cite file line ranges with SHA-256 content hashes |
+ | 📁 | **Structure claims** | Assert `exists`, `exists-file`, `exists-dir`, `missing` |
+ | 🔒 | **Fail-closed** | One broken claim → entire verification fails |
+ | 🚧 | **Root jail** | Repo-relative paths only. `../` traversal blocked |
+ | 📟 | **CLI + MCP** | The same verification logic exposed as both a CLI and an MCP stdio server |
+ | 🔤 | **Deterministic** | UTF-8 only, LF-normalized, symlinks and binaries skipped |
+
+ ## CLI Reference
+
+ <details>
+ <summary><b>Install</b></summary>
+
+ ```bash
+ npm install -g vericontext
+ ```
+
+ Or use directly via `npx`:
+
+ ```bash
+ npx vericontext --help
+ ```
+
+ </details>
+
+ <details>
+ <summary><code>cite</code> — generate a code citation</summary>
+
+ ```bash
+ npx vericontext cite --root <dir> --path <file> --start-line <n> --end-line <n> [--json]
+ ```
+
+ | Option | Required | Description |
+ |--------|----------|-------------|
+ | `--root` | Yes | Repository root directory |
+ | `--path` | Yes | File path relative to root |
+ | `--start-line` | Yes | Start line (1-based) |
+ | `--end-line` | Yes | End line (1-based) |
+ | `--json` | No | JSON output |
+
+ </details>
+
+ <details>
+ <summary><code>claim</code> — generate a structure claim</summary>
+
+ ```bash
+ npx vericontext claim --root <dir> --kind <kind> --path <path> [--json]
+ ```
+
+ `--kind`: `exists` | `exists-file` | `exists-dir` | `missing`
+
+ </details>
+
+ <details>
+ <summary><code>verify workspace</code> — verify all claims in a document</summary>
+
+ ```bash
+ npx vericontext verify workspace --root <dir> (--in-path <doc> | --text <text>) [--json]
+ ```
+
+ | Exit Code | Meaning |
+ |-----------|---------|
+ | `0` | All citations and claims valid |
+ | `1` | One or more failures, or input error |
+
+ ```json
+ {
+   "ok": false,
+   "total": 5,
+   "ok_count": 4,
+   "fail_count": 1,
+   "results": [
+     { "claim": "[[vctx:src/foo.ts#L1-L3@deadbeef]]", "ok": false, "reason": "hash_mismatch" }
+   ]
+ }
+ ```
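In scripts, the `--json` report is often more useful than the exit code alone. A sketch that extracts the failing claims from a report shaped like the one above (`failures` is an illustrative helper, not part of the CLI):

```javascript
// Pull the failing claims out of a verify-workspace JSON report.
function failures(report) {
  return report.results.filter((r) => !r.ok).map((r) => `${r.claim} (${r.reason})`);
}

const report = {
  ok: false,
  total: 5,
  ok_count: 4,
  fail_count: 1,
  results: [{ claim: "[[vctx:src/foo.ts#L1-L3@deadbeef]]", ok: false, reason: "hash_mismatch" }],
};

// Logs each stale claim alongside its failure reason.
console.log(failures(report));
```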
+
+ </details>
+
+ <details>
+ <summary><b>Error reasons</b></summary>
+
+ | Reason | Meaning |
+ |--------|---------|
+ | `hash_mismatch` | Cited content has changed |
+ | `file_missing` | File does not exist |
+ | `path_escape` | Path escapes root jail |
+ | `range_invalid` | Line range out of bounds |
+ | `binary_file` | Binary file (null byte detected) |
+ | `symlink_skipped` | Symlink (skipped for determinism) |
+ | `invalid_input` | Malformed input |
+
+ </details>
+
+ ## Self-Verifying Docs
+
+ This README itself contains hidden VeriContext claims and is verifiable:
+
+ <!-- [[vctx-exists-dir:src/]] -->
+ <!-- [[vctx-exists-file:src/cli.ts]] -->
+ <!-- [[vctx-exists-file:src/mcp/server.ts]] -->
+ <!-- [[vctx-missing:tmp-output/]] -->
+
+ ```bash
+ npx vericontext verify workspace --root . --in-path README.md --json
+ ```
+
+ ## Contributing
+
+ See [CONTRIBUTING.md](CONTRIBUTING.md).
+
+ ## License
+
+ [MIT](LICENSE)
package/dist/cli.js ADDED
@@ -0,0 +1,406 @@
+ #!/usr/bin/env node
+
+ // src/cli.ts
+ import { Command } from "commander";
+
+ // src/core/file.ts
+ import fs from "fs/promises";
+ import { createHash } from "crypto";
+ var UTF8_DECODER = new TextDecoder("utf-8", { fatal: true });
+ function normalizeEol(text) {
+   return text.replace(/\r\n?/g, "\n");
+ }
+ async function readCanonicalText(absolutePath) {
+   let stats;
+   try {
+     stats = await fs.lstat(absolutePath);
+   } catch {
+     return { ok: false, reason: "file_missing" };
+   }
+   if (stats.isSymbolicLink()) {
+     return { ok: false, reason: "symlink_skipped" };
+   }
+   if (!stats.isFile()) {
+     return { ok: false, reason: "not_a_file" };
+   }
+   const bytes = await fs.readFile(absolutePath);
+   const probe = bytes.subarray(0, 8192);
+   if (probe.includes(0)) {
+     return { ok: false, reason: "binary_file" };
+   }
+   let decoded;
+   try {
+     decoded = UTF8_DECODER.decode(bytes);
+   } catch {
+     return { ok: false, reason: "invalid_utf8" };
+   }
+   const canonical = normalizeEol(decoded);
+   const lines = canonical.split("\n");
+   return { ok: true, text: canonical, lines };
+ }
+ function hashSha256Hex(value) {
+   return createHash("sha256").update(value, "utf8").digest("hex");
+ }
+ function hashLineSpan(lines, startLine, endLine) {
+   if (!Number.isInteger(startLine) || !Number.isInteger(endLine) || startLine < 1 || endLine < startLine) {
+     return { ok: false, reason: "range_invalid" };
+   }
+   if (endLine > lines.length) {
+     return { ok: false, reason: "range_invalid" };
+   }
+   const startIdx = startLine - 1;
+   const endIdx = endLine;
+   const span = lines.slice(startIdx, endIdx).join("\n");
+   return { ok: true, sha256_full: hashSha256Hex(span) };
+ }
+
+ // src/core/pathing.ts
+ import path from "path";
+ function normalizePathForClaim(inputPath) {
+   const normalized = inputPath.replaceAll("\\", "/").replace(/^\.\//, "");
+   if (normalized.length > 0 && normalized.endsWith("/")) {
+     return normalized;
+   }
+   return normalized;
+ }
+ function resolveUnderRoot(root, inputPath) {
+   if (path.isAbsolute(inputPath)) {
+     return { ok: false, reason: "invalid_path" };
+   }
+   const cleaned = normalizePathForClaim(inputPath);
+   if (cleaned.length === 0 || cleaned === ".") {
+     return { ok: false, reason: "invalid_path" };
+   }
+   const rootAbs = path.resolve(root);
+   const absolutePath = path.resolve(rootAbs, cleaned);
+   const rel = path.relative(rootAbs, absolutePath);
+   if (rel.startsWith("..") || path.isAbsolute(rel)) {
+     return { ok: false, reason: "path_escape" };
+   }
+   const normalizedPath = rel.split(path.sep).join("/");
+   if (normalizedPath.length === 0 || normalizedPath.startsWith("../") || normalizedPath === "..") {
+     return { ok: false, reason: "path_escape" };
+   }
+   return { ok: true, normalizedPath, absolutePath };
+ }
+
+ // src/cite/citation.ts
+ var CITATION_REGEX = /\[\[vctx:([^#\]]+)#L([0-9]+)-L([0-9]+)@([a-f0-9]{8})\]\]/g;
+ function renderCitation(path2, startLine, endLine, sha256Full) {
+   const hash8 = sha256Full.slice(0, 8);
+   return `[[vctx:${path2}#L${startLine}-L${endLine}@${hash8}]]`;
+ }
+ function parseCitations(text) {
+   const output = [];
+   for (const match of text.matchAll(CITATION_REGEX)) {
+     const path2 = match[1];
+     const startLine = Number.parseInt(match[2], 10);
+     const endLine = Number.parseInt(match[3], 10);
+     const hash8 = match[4];
+     output.push({ path: path2, startLine, endLine, hash8, raw: match[0] });
+   }
+   return output;
+ }
+ async function generateCitation(input) {
+   const resolved = resolveUnderRoot(input.root, input.path);
+   if (!resolved.ok) {
+     return { ok: false, reason: resolved.reason };
+   }
+   const file = await readCanonicalText(resolved.absolutePath);
+   if (!file.ok) {
+     return { ok: false, reason: file.reason };
+   }
+   const span = hashLineSpan(file.lines, input.startLine, input.endLine);
+   if (!span.ok) {
+     return { ok: false, reason: span.reason };
+   }
+   const citation = renderCitation(resolved.normalizedPath, input.startLine, input.endLine, span.sha256_full);
+   return {
+     ok: true,
+     citation,
+     sha256_full: span.sha256_full
+   };
+ }
+
+ // src/cite/claim.ts
+ import fs2 from "fs/promises";
+ var STRUCTURE_REGEX = /\[\[vctx-(exists|exists-file|exists-dir|missing):([^\]]+)\]\]/g;
+ function parseStructureClaims(text) {
+   const output = [];
+   for (const match of text.matchAll(STRUCTURE_REGEX)) {
+     output.push({ kind: match[1], path: match[2], raw: match[0] });
+   }
+   return output;
+ }
+ function renderStructureClaim(kind, normalizedPath) {
+   return `[[vctx-${kind}:${normalizedPath}]]`;
+ }
+ function generateStructureClaim(input) {
+   const resolved = resolveUnderRoot(input.root, input.path);
+   if (!resolved.ok) {
+     return { ok: false, reason: resolved.reason };
+   }
+   let normalizedPath = resolved.normalizedPath;
+   if (input.kind === "exists-dir" && input.path.endsWith("/") && !normalizedPath.endsWith("/")) {
+     normalizedPath = `${normalizedPath}/`;
+   }
+   return {
+     ok: true,
+     claim: renderStructureClaim(input.kind, normalizedPath),
+     kind: input.kind,
+     normalized_path: normalizedPath
+   };
+ }
+ async function verifyStructureKind(input) {
+   const resolved = resolveUnderRoot(input.root, input.path);
+   if (!resolved.ok) {
+     return { ok: false, reason: resolved.reason };
+   }
+   let stats = null;
+   try {
+     stats = await fs2.lstat(resolved.absolutePath);
+   } catch {
+     stats = null;
+   }
+   if (input.kind === "missing") {
+     return stats ? { ok: false, reason: "exists" } : { ok: true };
+   }
+   if (!stats) {
+     return { ok: false, reason: "missing" };
+   }
+   if (input.kind === "exists") {
+     return { ok: true };
+   }
+   if (input.kind === "exists-file") {
+     return stats.isFile() ? { ok: true } : { ok: false, reason: "not_file" };
+   }
+   return stats.isDirectory() ? { ok: true } : { ok: false, reason: "not_dir" };
+ }
+
+ // src/mcp/server.ts
+ import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
+ import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
+ import { z } from "zod";
+
+ // src/verify/workspace.ts
+ import fs3 from "fs/promises";
+ async function readVerifyText(input) {
+   const hasInPath = typeof input.inPath === "string";
+   const hasText = typeof input.text === "string";
+   if (hasInPath === hasText) {
+     return { reason: "invalid_input" };
+   }
+   if (hasText) {
+     return input.text;
+   }
+   const resolved = resolveUnderRoot(input.root, input.inPath);
+   if (!resolved.ok) {
+     return { reason: resolved.reason };
+   }
+   let stats;
+   try {
+     stats = await fs3.lstat(resolved.absolutePath);
+   } catch {
+     return { reason: "file_missing" };
+   }
+   if (!stats.isFile()) {
+     return { reason: "not_a_file" };
+   }
+   const raw = await fs3.readFile(resolved.absolutePath, "utf8");
+   return raw.replace(/\r\n?/g, "\n");
+ }
+ async function verifyWorkspace(input) {
+   const doc = await readVerifyText(input);
+   if (typeof doc !== "string") {
+     return { ok: false, reason: doc.reason };
+   }
+   const citationTokens = parseCitations(doc);
+   const structureTokens = parseStructureClaims(doc);
+   const results = [];
+   for (const token of citationTokens) {
+     const resolved = resolveUnderRoot(input.root, token.path);
+     if (!resolved.ok) {
+       results.push({ claim: token.raw, kind: "citation", ok: false, reason: resolved.reason });
+       continue;
+     }
+     const file = await readCanonicalText(resolved.absolutePath);
+     if (!file.ok) {
+       results.push({ claim: token.raw, kind: "citation", ok: false, reason: file.reason });
+       continue;
+     }
+     const span = hashLineSpan(file.lines, token.startLine, token.endLine);
+     if (!span.ok) {
+       results.push({ claim: token.raw, kind: "citation", ok: false, reason: span.reason });
+       continue;
+     }
+     const hash8 = span.sha256_full.slice(0, 8);
+     if (hash8 !== token.hash8) {
+       results.push({ claim: token.raw, kind: "citation", ok: false, reason: "hash_mismatch" });
+       continue;
+     }
+     results.push({ claim: token.raw, kind: "citation", ok: true });
+   }
+   for (const token of structureTokens) {
+     const structure = await verifyStructureKind({
+       root: input.root,
+       kind: token.kind,
+       path: token.path
+     });
+     if (!structure.ok) {
+       results.push({ claim: token.raw, kind: "structure", ok: false, reason: structure.reason });
+       continue;
+     }
+     results.push({ claim: token.raw, kind: "structure", ok: true });
+   }
+   const total = results.length;
+   const ok_count = results.filter((r) => r.ok).length;
+   const fail_count = total - ok_count;
+   return {
+     ok: fail_count === 0,
+     total,
+     ok_count,
+     fail_count,
+     results
+   };
+ }
+
+ // src/mcp/server.ts
+ function textJson(value) {
+   return {
+     content: [{ type: "text", text: JSON.stringify(value) }]
+   };
+ }
+ async function runMcpServer() {
+   const server = new McpServer({
+     name: "vericontext",
+     version: "0.1.0"
+   });
+   server.registerTool(
+     "vctx_cite",
+     {
+       description: "Generate a verifiable citation for a file line-span.",
+       inputSchema: {
+         root: z.string(),
+         path: z.string(),
+         start_line: z.number().int().positive(),
+         end_line: z.number().int().positive()
+       }
+     },
+     async ({ root, path: path2, start_line, end_line }) => {
+       const result = await generateCitation({ root, path: path2, startLine: start_line, endLine: end_line });
+       return textJson(result);
+     }
+   );
+   server.registerTool(
+     "vctx_claim",
+     {
+       description: "Generate a structure claim string for a path.",
+       inputSchema: {
+         root: z.string(),
+         kind: z.enum(["exists", "exists-file", "exists-dir", "missing"]),
+         path: z.string()
+       }
+     },
+     async ({ root, kind, path: path2 }) => {
+       const result = generateStructureClaim({ root, kind, path: path2 });
+       return textJson(result);
+     }
+   );
+   server.registerTool(
+     "vctx_verify_workspace",
+     {
+       description: "Verify all in-document VCC claims against current workspace.",
+       inputSchema: {
+         root: z.string(),
+         in_path: z.string().optional(),
+         text: z.string().optional()
+       }
+     },
+     async ({ root, in_path, text }) => {
+       const result = await verifyWorkspace({ root, inPath: in_path, text });
+       return textJson(result);
+     }
+   );
+   const transport = new StdioServerTransport();
+   await server.connect(transport);
+ }
+
+ // src/cli.ts
+ function printResult(jsonMode, result, fallbackText) {
+   if (jsonMode) {
+     process.stdout.write(`${JSON.stringify(result)}
+ `);
+     return;
+   }
+   if (fallbackText) {
+     process.stdout.write(`${fallbackText}
+ `);
+     return;
+   }
+   process.stdout.write(`${JSON.stringify(result, null, 2)}
+ `);
+ }
+ function withRootAndJson(command) {
+   return command.requiredOption("--root <root>", "Repository root").option("--json", "Output as JSON", false);
+ }
+ async function main() {
+   const program = new Command();
+   program.name("vericontext").description("Verifiable Context Compactor").version("0.1.0");
+   withRootAndJson(
+     program.command("cite").description("Generate a verifiable citation for a file line span").requiredOption("--path <path>", "File path under root").requiredOption("--start-line <startLine>", "1-based start line", Number.parseInt).requiredOption("--end-line <endLine>", "1-based end line", Number.parseInt)
+   ).action(async (opts) => {
+     const result = await generateCitation({
+       root: opts.root,
+       path: opts.path,
+       startLine: opts.startLine,
+       endLine: opts.endLine
+     });
+     const text = result.ok ? result.citation : void 0;
+     printResult(Boolean(opts.json), result, text);
+     if (!result.ok) {
+       process.exitCode = 1;
+     }
+   });
+   withRootAndJson(
+     program.command("claim").description("Generate a structure claim").requiredOption("--kind <kind>", "exists|exists-file|exists-dir|missing").requiredOption("--path <path>", "Path under root")
+   ).action(async (opts) => {
+     const kinds = /* @__PURE__ */ new Set(["exists", "exists-file", "exists-dir", "missing"]);
+     if (!kinds.has(opts.kind)) {
+       const err = { ok: false, reason: "invalid_input" };
+       printResult(Boolean(opts.json), err);
+       process.exitCode = 1;
+       return;
+     }
+     const result = generateStructureClaim({ root: opts.root, kind: opts.kind, path: opts.path });
+     const text = result.ok ? result.claim : void 0;
+     printResult(Boolean(opts.json), result, text);
+     if (!result.ok) {
+       process.exitCode = 1;
+     }
+   });
+   const verify = program.command("verify").description("Verification commands");
+   withRootAndJson(
+     verify.command("workspace").description("Verify all VCC claims from an input document or inline text").option("--in-path <inPath>", "Input document path under root").option("--text <text>", "Inline document text")
+   ).action(async (opts) => {
+     const result = await verifyWorkspace({
+       root: opts.root,
+       inPath: opts.inPath,
+       text: opts.text
+     });
+     printResult(Boolean(opts.json), result);
+     if (!result.ok) {
+       process.exitCode = 1;
+     }
+   });
+   program.command("mcp").description("Run MCP server over stdio").action(async () => {
+     await runMcpServer();
+   });
+   await program.parseAsync(process.argv);
+ }
+ main().catch((error) => {
+   const message = error instanceof Error ? error.message : "unknown_error";
+   process.stderr.write(`${message}
+ `);
+   process.exit(1);
+ });
+ //# sourceMappingURL=cli.js.map
{ ok: false, reason: \"exists\" } : { ok: true };\n }\n\n if (!stats) {\n return { ok: false, reason: \"missing\" };\n }\n\n if (input.kind === \"exists\") {\n return { ok: true };\n }\n if (input.kind === \"exists-file\") {\n return stats.isFile() ? { ok: true } : { ok: false, reason: \"not_file\" };\n }\n return stats.isDirectory() ? { ok: true } : { ok: false, reason: \"not_dir\" };\n}\n","import { McpServer } from \"@modelcontextprotocol/sdk/server/mcp.js\";\nimport { StdioServerTransport } from \"@modelcontextprotocol/sdk/server/stdio.js\";\nimport { z } from \"zod\";\n\nimport { generateCitation } from \"../cite/citation.js\";\nimport { generateStructureClaim } from \"../cite/claim.js\";\nimport { verifyWorkspace } from \"../verify/workspace.js\";\n\nfunction textJson(value: unknown): { content: Array<{ type: \"text\"; text: string }> } {\n return {\n content: [{ type: \"text\", text: JSON.stringify(value) }],\n };\n}\n\nexport async function runMcpServer(): Promise<void> {\n const server = new McpServer({\n name: \"vericontext\",\n version: \"0.1.0\",\n });\n\n server.registerTool(\n \"vctx_cite\",\n {\n description: \"Generate a verifiable citation for a file line-span.\",\n inputSchema: {\n root: z.string(),\n path: z.string(),\n start_line: z.number().int().positive(),\n end_line: z.number().int().positive(),\n },\n },\n async ({ root, path, start_line, end_line }) => {\n const result = await generateCitation({ root, path, startLine: start_line, endLine: end_line });\n return textJson(result);\n },\n );\n\n server.registerTool(\n \"vctx_claim\",\n {\n description: \"Generate a structure claim string for a path.\",\n inputSchema: {\n root: z.string(),\n kind: z.enum([\"exists\", \"exists-file\", \"exists-dir\", \"missing\"]),\n path: z.string(),\n },\n },\n async ({ root, kind, path }) => {\n const result = generateStructureClaim({ root, kind, path });\n return textJson(result);\n },\n );\n\n server.registerTool(\n \"vctx_verify_workspace\",\n {\n 
description: \"Verify all in-document VCC claims against current workspace.\",\n inputSchema: {\n root: z.string(),\n in_path: z.string().optional(),\n text: z.string().optional(),\n },\n },\n async ({ root, in_path, text }) => {\n const result = await verifyWorkspace({ root, inPath: in_path, text });\n return textJson(result);\n },\n );\n\n const transport = new StdioServerTransport();\n await server.connect(transport);\n}\n","import fs from \"node:fs/promises\";\n\nimport { parseCitations } from \"../cite/citation.js\";\nimport { parseStructureClaims, verifyStructureKind } from \"../cite/claim.js\";\nimport { readCanonicalText, hashLineSpan } from \"../core/file.js\";\nimport { resolveUnderRoot } from \"../core/pathing.js\";\nimport type { VerifyWorkspaceResult } from \"../types.js\";\nimport type { ErrorReason } from \"../types.js\";\n\nexport interface VerifyInput {\n root: string;\n inPath?: string;\n text?: string;\n}\n\nexport async function readVerifyText(input: VerifyInput): Promise<string | { reason: ErrorReason }> {\n const hasInPath = typeof input.inPath === \"string\";\n const hasText = typeof input.text === \"string\";\n if (hasInPath === hasText) {\n return { reason: \"invalid_input\" };\n }\n\n if (hasText) {\n return input.text as string;\n }\n\n const resolved = resolveUnderRoot(input.root, input.inPath as string);\n if (!resolved.ok) {\n return { reason: resolved.reason };\n }\n\n let stats;\n try {\n stats = await fs.lstat(resolved.absolutePath);\n } catch {\n return { reason: \"file_missing\" };\n }\n if (!stats.isFile()) {\n return { reason: \"not_a_file\" };\n }\n\n const raw = await fs.readFile(resolved.absolutePath, \"utf8\");\n return raw.replace(/\\r\\n?/g, \"\\n\");\n}\n\nexport async function verifyWorkspace(input: VerifyInput): Promise<VerifyWorkspaceResult | { ok: false; reason: ErrorReason }> {\n const doc = await readVerifyText(input);\n if (typeof doc !== \"string\") {\n return { ok: false, reason: doc.reason };\n }\n\n const 
citationTokens = parseCitations(doc);\n const structureTokens = parseStructureClaims(doc);\n const results: VerifyWorkspaceResult[\"results\"] = [];\n\n for (const token of citationTokens) {\n const resolved = resolveUnderRoot(input.root, token.path);\n if (!resolved.ok) {\n results.push({ claim: token.raw, kind: \"citation\", ok: false, reason: resolved.reason });\n continue;\n }\n\n const file = await readCanonicalText(resolved.absolutePath);\n if (!file.ok) {\n results.push({ claim: token.raw, kind: \"citation\", ok: false, reason: file.reason });\n continue;\n }\n\n const span = hashLineSpan(file.lines, token.startLine, token.endLine);\n if (!span.ok) {\n results.push({ claim: token.raw, kind: \"citation\", ok: false, reason: span.reason });\n continue;\n }\n\n const hash8 = span.sha256_full.slice(0, 8);\n if (hash8 !== token.hash8) {\n results.push({ claim: token.raw, kind: \"citation\", ok: false, reason: \"hash_mismatch\" });\n continue;\n }\n\n results.push({ claim: token.raw, kind: \"citation\", ok: true });\n }\n\n for (const token of structureTokens) {\n const structure = await verifyStructureKind({\n root: input.root,\n kind: token.kind,\n path: token.path,\n });\n\n if (!structure.ok) {\n results.push({ claim: token.raw, kind: \"structure\", ok: false, reason: structure.reason });\n continue;\n }\n results.push({ claim: token.raw, kind: \"structure\", ok: true });\n }\n\n const total = results.length;\n const ok_count = results.filter((r) => r.ok).length;\n const fail_count = total - ok_count;\n\n return {\n ok: fail_count === 0,\n total,\n ok_count,\n fail_count,\n results,\n 
};\n}\n"],"mappings":";;;AAEA,SAAS,eAAe;;;ACFxB,OAAO,QAAQ;AACf,SAAS,kBAAkB;AAe3B,IAAM,eAAe,IAAI,YAAY,SAAS,EAAE,OAAO,KAAK,CAAC;AAEtD,SAAS,aAAa,MAAsB;AACjD,SAAO,KAAK,QAAQ,UAAU,IAAI;AACpC;AAEA,eAAsB,kBAAkB,cAAyD;AAC/F,MAAI;AACJ,MAAI;AACF,YAAQ,MAAM,GAAG,MAAM,YAAY;AAAA,EACrC,QAAQ;AACN,WAAO,EAAE,IAAI,OAAO,QAAQ,eAAe;AAAA,EAC7C;AAEA,MAAI,MAAM,eAAe,GAAG;AAC1B,WAAO,EAAE,IAAI,OAAO,QAAQ,kBAAkB;AAAA,EAChD;AAEA,MAAI,CAAC,MAAM,OAAO,GAAG;AACnB,WAAO,EAAE,IAAI,OAAO,QAAQ,aAAa;AAAA,EAC3C;AAEA,QAAM,QAAQ,MAAM,GAAG,SAAS,YAAY;AAC5C,QAAM,QAAQ,MAAM,SAAS,GAAG,IAAI;AACpC,MAAI,MAAM,SAAS,CAAC,GAAG;AACrB,WAAO,EAAE,IAAI,OAAO,QAAQ,cAAc;AAAA,EAC5C;AAEA,MAAI;AACJ,MAAI;AACF,cAAU,aAAa,OAAO,KAAK;AAAA,EACrC,QAAQ;AACN,WAAO,EAAE,IAAI,OAAO,QAAQ,eAAe;AAAA,EAC7C;AAEA,QAAM,YAAY,aAAa,OAAO;AACtC,QAAM,QAAQ,UAAU,MAAM,IAAI;AAClC,SAAO,EAAE,IAAI,MAAM,MAAM,WAAW,MAAM;AAC5C;AAEO,SAAS,cAAc,OAAuB;AACnD,SAAO,WAAW,QAAQ,EAAE,OAAO,OAAO,MAAM,EAAE,OAAO,KAAK;AAChE;AAYO,SAAS,aAAa,OAAiB,WAAmB,SAA2C;AAC1G,MAAI,CAAC,OAAO,UAAU,SAAS,KAAK,CAAC,OAAO,UAAU,OAAO,KAAK,YAAY,KAAK,UAAU,WAAW;AACtG,WAAO,EAAE,IAAI,OAAO,QAAQ,gBAAgB;AAAA,EAC9C;AACA,MAAI,UAAU,MAAM,QAAQ;AAC1B,WAAO,EAAE,IAAI,OAAO,QAAQ,gBAAgB;AAAA,EAC9C;AAEA,QAAM,WAAW,YAAY;AAC7B,QAAM,SAAS;AACf,QAAM,OAAO,MAAM,MAAM,UAAU,MAAM,EAAE,KAAK,IAAI;AACpD,SAAO,EAAE,IAAI,MAAM,aAAa,cAAc,IAAI,EAAE;AACtD;;;AClFA,OAAO,UAAU;AAiBV,SAAS,sBAAsB,WAA2B;AAC/D,QAAM,aAAa,UAAU,WAAW,MAAM,GAAG,EAAE,QAAQ,SAAS,EAAE;AACtE,MAAI,WAAW,SAAS,KAAK,WAAW,SAAS,GAAG,GAAG;AACrD,WAAO;AAAA,EACT;AACA,SAAO;AACT;AAEO,SAAS,iBAAiB,MAAc,WAAqC;AAClF,MAAI,KAAK,WAAW,SAAS,GAAG;AAC9B,WAAO,EAAE,IAAI,OAAO,QAAQ,eAAe;AAAA,EAC7C;AAEA,QAAM,UAAU,sBAAsB,SAAS;AAC/C,MAAI,QAAQ,WAAW,KAAK,YAAY,KAAK;AAC3C,WAAO,EAAE,IAAI,OAAO,QAAQ,eAAe;AAAA,EAC7C;AAEA,QAAM,UAAU,KAAK,QAAQ,IAAI;AACjC,QAAM,eAAe,KAAK,QAAQ,SAAS,OAAO;AAClD,QAAM,MAAM,KAAK,SAAS,SAAS,YAAY;AAE/C,MAAI,IAAI,WAAW,IAAI,KAAK,KAAK,WAAW,GAAG,GAAG;AAChD,WAAO,EAAE,IAAI,OAAO,QAAQ,cAAc;AAAA,EAC5C;AAEA,QAAM,iBAAiB,IAAI,MAAM,KAAK,GAAG,EAAE,KAAK,GAAG;AACnD,MAAI,eAAe,WAAW,KAAK,eAAe,WAAW,KAAK,KAAK,mBAAmB,MAAM;AAC9
F,WAAO,EAAE,IAAI,OAAO,QAAQ,cAAc;AAAA,EAC5C;AAEA,SAAO,EAAE,IAAI,MAAM,gBAAgB,aAAa;AAClD;;;AChCO,IAAM,iBAAiB;AAEvB,SAAS,eAAeA,OAAc,WAAmB,SAAiB,YAA4B;AAC3G,QAAM,QAAQ,WAAW,MAAM,GAAG,CAAC;AACnC,SAAO,UAAUA,KAAI,KAAK,SAAS,KAAK,OAAO,IAAI,KAAK;AAC1D;AAEO,SAAS,eAAe,MAA+B;AAC5D,QAAM,SAA0B,CAAC;AACjC,aAAW,SAAS,KAAK,SAAS,cAAc,GAAG;AACjD,UAAMA,QAAO,MAAM,CAAC;AACpB,UAAM,YAAY,OAAO,SAAS,MAAM,CAAC,GAAG,EAAE;AAC9C,UAAM,UAAU,OAAO,SAAS,MAAM,CAAC,GAAG,EAAE;AAC5C,UAAM,QAAQ,MAAM,CAAC;AACrB,WAAO,KAAK,EAAE,MAAAA,OAAM,WAAW,SAAS,OAAO,KAAK,MAAM,CAAC,EAAE,CAAC;AAAA,EAChE;AACA,SAAO;AACT;AAEA,eAAsB,iBAAiB,OAKE;AACvC,QAAM,WAAW,iBAAiB,MAAM,MAAM,MAAM,IAAI;AACxD,MAAI,CAAC,SAAS,IAAI;AAChB,WAAO,EAAE,IAAI,OAAO,QAAQ,SAAS,OAAO;AAAA,EAC9C;AAEA,QAAM,OAAO,MAAM,kBAAkB,SAAS,YAAY;AAC1D,MAAI,CAAC,KAAK,IAAI;AACZ,WAAO,EAAE,IAAI,OAAO,QAAQ,KAAK,OAAO;AAAA,EAC1C;AAEA,QAAM,OAAO,aAAa,KAAK,OAAO,MAAM,WAAW,MAAM,OAAO;AACpE,MAAI,CAAC,KAAK,IAAI;AACZ,WAAO,EAAE,IAAI,OAAO,QAAQ,KAAK,OAAO;AAAA,EAC1C;AAEA,QAAM,WAAW,eAAe,SAAS,gBAAgB,MAAM,WAAW,MAAM,SAAS,KAAK,WAAW;AACzG,SAAO;AAAA,IACL,IAAI;AAAA,IACJ;AAAA,IACA,aAAa,KAAK;AAAA,EACpB;AACF;;;AC/DA,OAAOC,SAAQ;AAUR,IAAM,kBAAkB;AAQxB,SAAS,qBAAqB,MAAgC;AACnE,QAAM,SAA2B,CAAC;AAClC,aAAW,SAAS,KAAK,SAAS,eAAe,GAAG;AAClD,WAAO,KAAK,EAAE,MAAM,MAAM,CAAC,GAAoB,MAAM,MAAM,CAAC,GAAG,KAAK,MAAM,CAAC,EAAE,CAAC;AAAA,EAChF;AACA,SAAO;AACT;AAEO,SAAS,qBAAqB,MAAqB,gBAAgC;AACxF,SAAO,UAAU,IAAI,IAAI,cAAc;AACzC;AAEO,SAAS,uBAAuB,OAIT;AAC5B,QAAM,WAAW,iBAAiB,MAAM,MAAM,MAAM,IAAI;AACxD,MAAI,CAAC,SAAS,IAAI;AAChB,WAAO,EAAE,IAAI,OAAO,QAAQ,SAAS,OAAO;AAAA,EAC9C;AAEA,MAAI,iBAAiB,SAAS;AAC9B,MAAI,MAAM,SAAS,gBAAgB,MAAM,KAAK,SAAS,GAAG,KAAK,CAAC,eAAe,SAAS,GAAG,GAAG;AAC5F,qBAAiB,GAAG,cAAc;AAAA,EACpC;AAEA,SAAO;AAAA,IACL,IAAI;AAAA,IACJ,OAAO,qBAAqB,MAAM,MAAM,cAAc;AAAA,IACtD,MAAM,MAAM;AAAA,IACZ,iBAAiB;AAAA,EACnB;AACF;AAEA,eAAsB,oBAAoB,OAIqB;AAC7D,QAAM,WAAW,iBAAiB,MAAM,MAAM,MAAM,IAAI;AACxD,MAAI,CAAC,SAAS,IAAI;AAChB,WAAO,EAAE,IAAI,OAAO,QAAQ,SAAS,OAAO;AAAA,EAC9C;AAEA,MAAI,QAAqD;AACzD,MAAI;AACF,YAAQ,MAAMC,IAAG,MAAM,SAAS,YAAY;AAAA,EAC9C,QAAQ;AACN
,YAAQ;AAAA,EACV;AAEA,MAAI,MAAM,SAAS,WAAW;AAC5B,WAAO,QAAQ,EAAE,IAAI,OAAO,QAAQ,SAAS,IAAI,EAAE,IAAI,KAAK;AAAA,EAC9D;AAEA,MAAI,CAAC,OAAO;AACV,WAAO,EAAE,IAAI,OAAO,QAAQ,UAAU;AAAA,EACxC;AAEA,MAAI,MAAM,SAAS,UAAU;AAC3B,WAAO,EAAE,IAAI,KAAK;AAAA,EACpB;AACA,MAAI,MAAM,SAAS,eAAe;AAChC,WAAO,MAAM,OAAO,IAAI,EAAE,IAAI,KAAK,IAAI,EAAE,IAAI,OAAO,QAAQ,WAAW;AAAA,EACzE;AACA,SAAO,MAAM,YAAY,IAAI,EAAE,IAAI,KAAK,IAAI,EAAE,IAAI,OAAO,QAAQ,UAAU;AAC7E;;;ACrFA,SAAS,iBAAiB;AAC1B,SAAS,4BAA4B;AACrC,SAAS,SAAS;;;ACFlB,OAAOC,SAAQ;AAef,eAAsB,eAAe,OAA+D;AAClG,QAAM,YAAY,OAAO,MAAM,WAAW;AAC1C,QAAM,UAAU,OAAO,MAAM,SAAS;AACtC,MAAI,cAAc,SAAS;AACzB,WAAO,EAAE,QAAQ,gBAAgB;AAAA,EACnC;AAEA,MAAI,SAAS;AACX,WAAO,MAAM;AAAA,EACf;AAEA,QAAM,WAAW,iBAAiB,MAAM,MAAM,MAAM,MAAgB;AACpE,MAAI,CAAC,SAAS,IAAI;AAChB,WAAO,EAAE,QAAQ,SAAS,OAAO;AAAA,EACnC;AAEA,MAAI;AACJ,MAAI;AACF,YAAQ,MAAMC,IAAG,MAAM,SAAS,YAAY;AAAA,EAC9C,QAAQ;AACN,WAAO,EAAE,QAAQ,eAAe;AAAA,EAClC;AACA,MAAI,CAAC,MAAM,OAAO,GAAG;AACnB,WAAO,EAAE,QAAQ,aAAa;AAAA,EAChC;AAEA,QAAM,MAAM,MAAMA,IAAG,SAAS,SAAS,cAAc,MAAM;AAC3D,SAAO,IAAI,QAAQ,UAAU,IAAI;AACnC;AAEA,eAAsB,gBAAgB,OAAyF;AAC7H,QAAM,MAAM,MAAM,eAAe,KAAK;AACtC,MAAI,OAAO,QAAQ,UAAU;AAC3B,WAAO,EAAE,IAAI,OAAO,QAAQ,IAAI,OAAO;AAAA,EACzC;AAEA,QAAM,iBAAiB,eAAe,GAAG;AACzC,QAAM,kBAAkB,qBAAqB,GAAG;AAChD,QAAM,UAA4C,CAAC;AAEnD,aAAW,SAAS,gBAAgB;AAClC,UAAM,WAAW,iBAAiB,MAAM,MAAM,MAAM,IAAI;AACxD,QAAI,CAAC,SAAS,IAAI;AAChB,cAAQ,KAAK,EAAE,OAAO,MAAM,KAAK,MAAM,YAAY,IAAI,OAAO,QAAQ,SAAS,OAAO,CAAC;AACvF;AAAA,IACF;AAEA,UAAM,OAAO,MAAM,kBAAkB,SAAS,YAAY;AAC1D,QAAI,CAAC,KAAK,IAAI;AACZ,cAAQ,KAAK,EAAE,OAAO,MAAM,KAAK,MAAM,YAAY,IAAI,OAAO,QAAQ,KAAK,OAAO,CAAC;AACnF;AAAA,IACF;AAEA,UAAM,OAAO,aAAa,KAAK,OAAO,MAAM,WAAW,MAAM,OAAO;AACpE,QAAI,CAAC,KAAK,IAAI;AACZ,cAAQ,KAAK,EAAE,OAAO,MAAM,KAAK,MAAM,YAAY,IAAI,OAAO,QAAQ,KAAK,OAAO,CAAC;AACnF;AAAA,IACF;AAEA,UAAM,QAAQ,KAAK,YAAY,MAAM,GAAG,CAAC;AACzC,QAAI,UAAU,MAAM,OAAO;AACzB,cAAQ,KAAK,EAAE,OAAO,MAAM,KAAK,MAAM,YAAY,IAAI,OAAO,QAAQ,gBAAgB,CAAC;AACvF;AAAA,IACF;AAEA,YAAQ,KAAK,EAAE,OAAO,MAAM,KAAK,MAAM,YAAY,IAAI,KAAK,CAAC;AAAA,EAC/D;AAEA,aA
AW,SAAS,iBAAiB;AACnC,UAAM,YAAY,MAAM,oBAAoB;AAAA,MAC1C,MAAM,MAAM;AAAA,MACZ,MAAM,MAAM;AAAA,MACZ,MAAM,MAAM;AAAA,IACd,CAAC;AAED,QAAI,CAAC,UAAU,IAAI;AACjB,cAAQ,KAAK,EAAE,OAAO,MAAM,KAAK,MAAM,aAAa,IAAI,OAAO,QAAQ,UAAU,OAAO,CAAC;AACzF;AAAA,IACF;AACA,YAAQ,KAAK,EAAE,OAAO,MAAM,KAAK,MAAM,aAAa,IAAI,KAAK,CAAC;AAAA,EAChE;AAEA,QAAM,QAAQ,QAAQ;AACtB,QAAM,WAAW,QAAQ,OAAO,CAAC,MAAM,EAAE,EAAE,EAAE;AAC7C,QAAM,aAAa,QAAQ;AAE3B,SAAO;AAAA,IACL,IAAI,eAAe;AAAA,IACnB;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,EACF;AACF;;;ADpGA,SAAS,SAAS,OAAoE;AACpF,SAAO;AAAA,IACL,SAAS,CAAC,EAAE,MAAM,QAAQ,MAAM,KAAK,UAAU,KAAK,EAAE,CAAC;AAAA,EACzD;AACF;AAEA,eAAsB,eAA8B;AAClD,QAAM,SAAS,IAAI,UAAU;AAAA,IAC3B,MAAM;AAAA,IACN,SAAS;AAAA,EACX,CAAC;AAED,SAAO;AAAA,IACL;AAAA,IACA;AAAA,MACE,aAAa;AAAA,MACb,aAAa;AAAA,QACX,MAAM,EAAE,OAAO;AAAA,QACf,MAAM,EAAE,OAAO;AAAA,QACf,YAAY,EAAE,OAAO,EAAE,IAAI,EAAE,SAAS;AAAA,QACtC,UAAU,EAAE,OAAO,EAAE,IAAI,EAAE,SAAS;AAAA,MACtC;AAAA,IACF;AAAA,IACA,OAAO,EAAE,MAAM,MAAAC,OAAM,YAAY,SAAS,MAAM;AAC9C,YAAM,SAAS,MAAM,iBAAiB,EAAE,MAAM,MAAAA,OAAM,WAAW,YAAY,SAAS,SAAS,CAAC;AAC9F,aAAO,SAAS,MAAM;AAAA,IACxB;AAAA,EACF;AAEA,SAAO;AAAA,IACL;AAAA,IACA;AAAA,MACE,aAAa;AAAA,MACb,aAAa;AAAA,QACX,MAAM,EAAE,OAAO;AAAA,QACf,MAAM,EAAE,KAAK,CAAC,UAAU,eAAe,cAAc,SAAS,CAAC;AAAA,QAC/D,MAAM,EAAE,OAAO;AAAA,MACjB;AAAA,IACF;AAAA,IACA,OAAO,EAAE,MAAM,MAAM,MAAAA,MAAK,MAAM;AAC9B,YAAM,SAAS,uBAAuB,EAAE,MAAM,MAAM,MAAAA,MAAK,CAAC;AAC1D,aAAO,SAAS,MAAM;AAAA,IACxB;AAAA,EACF;AAEA,SAAO;AAAA,IACL;AAAA,IACA;AAAA,MACE,aAAa;AAAA,MACb,aAAa;AAAA,QACX,MAAM,EAAE,OAAO;AAAA,QACf,SAAS,EAAE,OAAO,EAAE,SAAS;AAAA,QAC7B,MAAM,EAAE,OAAO,EAAE,SAAS;AAAA,MAC5B;AAAA,IACF;AAAA,IACA,OAAO,EAAE,MAAM,SAAS,KAAK,MAAM;AACjC,YAAM,SAAS,MAAM,gBAAgB,EAAE,MAAM,QAAQ,SAAS,KAAK,CAAC;AACpE,aAAO,SAAS,MAAM;AAAA,IACxB;AAAA,EACF;AAEA,QAAM,YAAY,IAAI,qBAAqB;AAC3C,QAAM,OAAO,QAAQ,SAAS;AAChC;;;AL9DA,SAAS,YAAY,UAAmB,QAAiB,cAA6B;AACpF,MAAI,UAAU;AACZ,YAAQ,OAAO,MAAM,GAAG,KAAK,UAAU,MAAM,CAAC;AAAA,CAAI;AAClD;AAAA,EACF;AACA,MAAI,cAAc;AAChB,YAAQ,OAAO,MAAM,GAAG,YAAY;AAAA,CAAI;AACxC;AAAA,EACF;AACA,UAAQ,OAAO,MAAM,
GAAG,KAAK,UAAU,QAAQ,MAAM,CAAC,CAAC;AAAA,CAAI;AAC7D;AAEA,SAAS,gBAAgB,SAA2B;AAClD,SAAO,QACJ,eAAe,iBAAiB,iBAAiB,EACjD,OAAO,UAAU,kBAAkB,KAAK;AAC7C;AAEA,eAAe,OAAsB;AACnC,QAAM,UAAU,IAAI,QAAQ;AAE5B,UACG,KAAK,aAAa,EAClB,YAAY,8BAA8B,EAC1C,QAAQ,OAAO;AAElB;AAAA,IACE,QACG,QAAQ,MAAM,EACd,YAAY,qDAAqD,EACjE,eAAe,iBAAiB,sBAAsB,EACtD,eAAe,4BAA4B,sBAAsB,OAAO,QAAQ,EAChF,eAAe,wBAAwB,oBAAoB,OAAO,QAAQ;AAAA,EAC/E,EAAE,OAAO,OAAO,SAAS;AACvB,UAAM,SAAS,MAAM,iBAAiB;AAAA,MACpC,MAAM,KAAK;AAAA,MACX,MAAM,KAAK;AAAA,MACX,WAAW,KAAK;AAAA,MAChB,SAAS,KAAK;AAAA,IAChB,CAAC;AACD,UAAM,OAAO,OAAO,KAAK,OAAO,WAAW;AAC3C,gBAAY,QAAQ,KAAK,IAAI,GAAG,QAAQ,IAAI;AAC5C,QAAI,CAAC,OAAO,IAAI;AACd,cAAQ,WAAW;AAAA,IACrB;AAAA,EACF,CAAC;AAED;AAAA,IACE,QACG,QAAQ,OAAO,EACf,YAAY,4BAA4B,EACxC,eAAe,iBAAiB,uCAAuC,EACvE,eAAe,iBAAiB,iBAAiB;AAAA,EACtD,EAAE,OAAO,OAAO,SAAS;AACvB,UAAM,QAAQ,oBAAI,IAAI,CAAC,UAAU,eAAe,cAAc,SAAS,CAAC;AACxE,QAAI,CAAC,MAAM,IAAI,KAAK,IAAI,GAAG;AACzB,YAAM,MAAM,EAAE,IAAI,OAAO,QAAQ,gBAAgB;AACjD,kBAAY,QAAQ,KAAK,IAAI,GAAG,GAAG;AACnC,cAAQ,WAAW;AACnB;AAAA,IACF;AAEA,UAAM,SAAS,uBAAuB,EAAE,MAAM,KAAK,MAAM,MAAM,KAAK,MAAM,MAAM,KAAK,KAAK,CAAC;AAC3F,UAAM,OAAO,OAAO,KAAK,OAAO,QAAQ;AACxC,gBAAY,QAAQ,KAAK,IAAI,GAAG,QAAQ,IAAI;AAC5C,QAAI,CAAC,OAAO,IAAI;AACd,cAAQ,WAAW;AAAA,IACrB;AAAA,EACF,CAAC;AAED,QAAM,SAAS,QAAQ,QAAQ,QAAQ,EAAE,YAAY,uBAAuB;AAE5E;AAAA,IACE,OACG,QAAQ,WAAW,EACnB,YAAY,6DAA6D,EACzE,OAAO,sBAAsB,gCAAgC,EAC7D,OAAO,iBAAiB,sBAAsB;AAAA,EACnD,EAAE,OAAO,OAAO,SAAS;AACvB,UAAM,SAAS,MAAM,gBAAgB;AAAA,MACnC,MAAM,KAAK;AAAA,MACX,QAAQ,KAAK;AAAA,MACb,MAAM,KAAK;AAAA,IACb,CAAC;AACD,gBAAY,QAAQ,KAAK,IAAI,GAAG,MAAM;AACtC,QAAI,CAAC,OAAO,IAAI;AACd,cAAQ,WAAW;AAAA,IACrB;AAAA,EACF,CAAC;AAED,UAAQ,QAAQ,KAAK,EAAE,YAAY,2BAA2B,EAAE,OAAO,YAAY;AACjF,UAAM,aAAa;AAAA,EACrB,CAAC;AAED,QAAM,QAAQ,WAAW,QAAQ,IAAI;AACvC;AAEA,KAAK,EAAE,MAAM,CAAC,UAAmB;AAC/B,QAAM,UAAU,iBAAiB,QAAQ,MAAM,UAAU;AACzD,UAAQ,OAAO,MAAM,GAAG,OAAO;AAAA,CAAI;AACnC,UAAQ,KAAK,CAAC;AAChB,CAAC;","names":["path","fs","fs","fs","fs","path"]}
package/package.json ADDED
@@ -0,0 +1,53 @@
1
+ {
2
+ "name": "vericontext",
3
+ "version": "0.1.1",
4
+ "description": "Deterministic, hash-based verification for docs that reference code",
5
+ "type": "module",
6
+ "bin": {
7
+ "vericontext": "dist/cli.js",
8
+ "vcc": "dist/cli.js"
9
+ },
10
+ "files": [
11
+ "dist",
12
+ "skills"
13
+ ],
14
+ "keywords": [
15
+ "documentation",
16
+ "verification",
17
+ "citation",
18
+ "hash",
19
+ "mcp",
20
+ "agents",
21
+ "cli"
22
+ ],
23
+ "repository": {
24
+ "type": "git",
25
+ "url": "https://github.com/amsminn/vericontext.git"
26
+ },
27
+ "homepage": "https://github.com/amsminn/vericontext",
28
+ "license": "MIT",
29
+ "author": "amsminn",
30
+ "engines": {
31
+ "node": ">=20"
32
+ },
33
+ "scripts": {
34
+ "build": "tsup",
35
+ "dev": "tsx src/cli.ts",
36
+ "test": "vitest run",
37
+ "typecheck": "tsc --noEmit",
38
+ "prepublishOnly": "npm run build",
39
+ "mcp:smoke": "tsx scripts/mcp-smoke.ts"
40
+ },
41
+ "dependencies": {
42
+ "@modelcontextprotocol/sdk": "^1.18.2",
43
+ "commander": "^14.0.1",
44
+ "zod": "^3.24.2"
45
+ },
46
+ "devDependencies": {
47
+ "@types/node": "^24.4.0",
48
+ "tsup": "^8.5.0",
49
+ "tsx": "^4.20.5",
50
+ "typescript": "^5.9.2",
51
+ "vitest": "^3.2.4"
52
+ }
53
+ }
@@ -0,0 +1,172 @@
1
+ ---
2
+ name: vericontext-enforcer
3
+ description: >-
4
+ Forces AI agents to generate and insert VeriContext
5
+ citations/claims automatically when writing or editing documents (README,
6
+ AGENTS.md, plans, etc.), and runs vctx_verify_workspace at commit or
7
+ plan-completion time to verify that every claim is current.
8
+ Use for any documentation work that references code. NOT for: modifying code itself, writing tests, or build tasks.
9
+ hooks:
10
+ PreToolUse:
11
+ - matcher: Edit
12
+ hooks:
13
+ - command: >-
14
+ echo '[vericontext-enforcer] Editing a file — remember to update
15
+ any vctx citations referencing changed lines.'
16
+ ---
17
+
18
+ ## Phase 1 — Identify Reference Targets (Detect & Plan)
19
+
20
+ Before writing or editing a document, first identify the code and structure it will reference.
21
+
22
+ ### 1.1 Classify Reference Types
23
+
24
+ | Reference type | VeriContext tool | Real example |
25
+ |-----------|-----------------|-----------|
26
+ | Code line range | `vctx_cite` | <!-- [[vctx:src/cli.ts#L28-L34@2b4ed3f8]] --> |
27
+ | File existence | `vctx_claim --kind exists-file` | <!-- [[vctx-exists-file:src/cli.ts]] --> |
28
+ | Directory existence | `vctx_claim --kind exists-dir` | <!-- [[vctx-exists-dir:src/]] --> |
29
+ | File/directory absence | `vctx_claim --kind missing` | <!-- [[vctx-missing:tmp-output]] --> |
30
+
31
+ > Every example above is a claim that is actually valid in this project. Verifying with `vctx_verify_workspace` returns `ok: true` for each.
32
+
33
+ ### 1.2 Behavior Rules
34
+
35
+ 1. **Whenever you mention code, generate a citation.** To write "this function does X", cite the function's line range with `vctx_cite` and embed the token in the document.
36
+ 2. **Whenever you mention project structure, generate a claim.** To write "the src/ directory contains...", create the claim with `vctx_claim --kind exists-dir`.
37
+ 3. **Even if you read the code with another tool (read_file, etc.)**, call `vctx_cite` to obtain a citation token before describing the code in the document.
38
+ 4. Place citation/claim tokens inside **HTML comments (`<!-- ... -->`)** or inline immediately after the related text.
39
+
40
+ ## Phase 2 — Generate and Embed Citations/Claims (Generate & Embed)
41
+
42
+ ### 2.1 CLI Usage (common to all agent environments)
43
+
44
+ **Generating a citation:**
45
+ ```bash
46
+ vericontext cite --root <project-root> --path <file> --start-line <N> --end-line <M> --json
47
+ ```
48
+
49
+ Example response:
50
+ ```json
51
+ { "ok": true, "citation": "[[vctx:src/cli.ts#L28-L34@2b4ed3f8]]", "sha256_full": "2b4ed3f8..." }
52
+ ```
53
+
54
+ **Generating a structure claim:**
55
+ ```bash
56
+ vericontext claim --root <project-root> --kind exists-dir --path src/ --json
57
+ ```
58
+
59
+ Example response:
60
+ ```json
61
+ { "ok": true, "claim": "[[vctx-exists-dir:src/]]", "kind": "exists-dir", "normalized_path": "src/" }
62
+ ```
63
+
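
The token grammar shown in the responses above can also be matched outside the tool. A minimal sketch, assuming a POSIX-ish `grep -E` rendering of the citation pattern (the sample document line and its token values are taken from the examples above):

```shell
# Extract vctx citation tokens from a document line.
# The pattern mirrors the token shape [[vctx:<path>#L<start>-L<end>@<hash8>]].
doc='See [[vctx:src/cli.ts#L28-L34@2b4ed3f8]] and <!-- [[vctx-exists-dir:src/]] -->'
printf '%s\n' "$doc" | grep -oE '\[\[vctx:[^]#]+#L[0-9]+-L[0-9]+@[a-f0-9]{8}\]\]'
# → [[vctx:src/cli.ts#L28-L34@2b4ed3f8]]
```

Note that structure-claim tokens like `[[vctx-exists-dir:src/]]` deliberately do not match this pattern; they follow the separate claim grammar.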
64
+ ### 2.2 MCP Usage (MCP-capable agents)
65
+
66
+ Call the MCP tools `vctx_cite` and `vctx_claim` directly. Parameters are identical to the CLI.
67
+
68
+ ### 2.3 Embedding Patterns
69
+
70
+ **Pattern A — inline (for tables and code maps):**
71
+
72
+ | CLI entry point | `src/cli.ts` | <!-- [[vctx:src/cli.ts#L28-L34@2b4ed3f8]] --> |
73
+
74
+ **Pattern B — end of section (after an explanatory paragraph):**
75
+
76
+ Verification operates atomically per claim. If any single claim fails, the overall result is `ok: false`.
77
+ <!-- [[vctx:src/verify/workspace.ts#L46-L109@39dad95f]] -->
78
+
79
+ **Pattern C — structure declarations (for project-structure descriptions):**
80
+
81
+ ├── src/ # production logic
82
+ <!-- [[vctx-exists-dir:src/]] -->
83
+ ├── tests/ # unit + e2e tests
84
+ <!-- [[vctx-exists-dir:tests/]] -->
85
+ <!-- [[vctx-exists-file:package.json]] -->
86
+
87
+ ### 2.4 Absolute Prohibitions
88
+
89
+ - Never type citation tokens by hand. Always generate them with the tools.
90
+ - Never guess hash values or copy previous ones.
91
+ - Never create citations for files that do not exist.
92
+ - Never assert "this file exists/is missing" without a claim.
93
+
94
+ ## Phase 3 — Verification and Enforcement (Verify & Enforce)
95
+
96
+ ### 3.1 When to Verify
97
+
98
+ Run verification **without fail** at these points:
99
+
100
+ 1. **Right after finishing writing/editing a document** — always run before committing
101
+ 2. **When wrapping up a plan** — verify every claim in the plan document
102
+ 3. **After modifying code** — if you changed code, verify the claims in documents that reference it
103
+
104
+ ### 3.2 Running Verification
105
+
106
+ **CLI:**
107
+ ```bash
108
+ vericontext verify workspace --root <project-root> --in-path <document.md> --json
109
+ ```
110
+
111
+ **MCP:**
112
+ Call the `vctx_verify_workspace` tool with the `root` and `in_path` parameters.
113
+
114
+ **Example response (success):**
115
+ ```json
116
+ { "ok": true, "total": 3, "ok_count": 3, "fail_count": 0, "results": [...] }
117
+ ```
118
+
119
+ **Response structure on failure:**
120
+ ```json
121
+ { "ok": false, "total": N, "ok_count": M, "fail_count": N-M,
122
+ "results": [{ "claim": "<token>", "ok": false, "reason": "hash_mismatch|missing|range_invalid" }] }
123
+ ```
124
+ Follow the remediation steps in 3.3 according to the value of the `reason` field.
125
+
126
+ ### 3.3 Failure Remediation
127
+
128
+ 1. `hash_mismatch` → re-read the file's current line range, generate a fresh citation with `vctx_cite`, and swap it in.
129
+ 2. `missing` / `not_file` / `not_dir` → check whether the file/directory was moved or deleted, then fix the document or remove the claim.
130
+ 3. `range_invalid` → if the file shrank, re-cite with a valid range.
131
+ 4. **After resolving every failure, run verification again.** Repeat until `ok: true`.
132
+
133
+ ### 3.4 Pass Criteria
134
+
135
+ - `ok: true` — safe to commit or submit the plan
136
+ - `ok: false` — **no commit/submission**. Every failed claim must be fixed first.
137
+
138
+ > **Principle: Fail Closed.** Skipping verification, or committing with failures, violates this skill.
139
+
140
+ ## Phase 4 — Per-Agent Setup Guide
141
+
142
+ ### Claude Code
143
+ - Register `vericontext mcp` as an MCP server to use the `vctx_cite`, `vctx_claim`, and `vctx_verify_workspace` tools directly.
144
+ - This skill is symlinked at `.claude/skills/vericontext-enforcer` and loads automatically.
145
+
146
+ ### Codex (OpenAI)
147
+ - Run the CLI (`vericontext cite`, `vericontext claim`, `vericontext verify workspace`) through the Bash/shell tool.
148
+ - AGENTS.md summarizes this skill's core rules, so Codex picks them up automatically.
149
+
150
+ ### OpenCode
151
+ - If MCP is supported, register the MCP server exactly as for Claude Code.
152
+ - Without MCP support, use the CLI + AGENTS.md approach, as for Codex.
153
+
154
+ ### Antigravity / Others
155
+ - In environments limited to CLI execution, run `scripts/verify-modified-docs.sh` to batch-verify modified documents.
156
+ - Copy the core rules of Phases 1-3 into INSTRUCTIONS.md or the system prompt.
157
+
158
+ ## Reference File Index
159
+
160
+ | File | Read When |
161
+ |------|-----------|
162
+ | `references/citation-format-guide.md` | when you need the exact citation/claim syntax |
163
+ | `references/verification-playbook.md` | when you need concrete steps to resolve verification failures |
164
+ | `references/cross-agent-setup.md` | when setting up VeriContext in a new agent environment |
165
+
166
+ ## Critical Rules
167
+
168
+ 1. **Whenever a document references code, generate a citation with `vctx_cite`.** No exceptions.
169
+ 2. **Always run `vctx_verify_workspace` before committing.** Do not commit unless `ok: true`.
170
+ 3. **Never write citation tokens by hand.** Always generate them with the tools.
171
+ 4. **When you modify code, refresh the citations in documents that reference it.**
172
+ 5. **Never ignore verification failures.** Fail closed — a failure must always be fixed.
@@ -0,0 +1,27 @@
1
+ [
2
+ {
3
+ "skills": ["vericontext-enforcer"],
4
+ "query": "Add a description of the main function in src/cli.ts to README.md",
5
+ "files": ["src/cli.ts", "README.md"],
6
+ "expected_behavior": [
7
+ "Agent begins Phase 1: identifies src/cli.ts as a code reference target",
8
+ "Agent calls vctx_cite (or vericontext cite CLI) with path=src/cli.ts and appropriate line range",
9
+ "Agent receives citation token with hash",
10
+ "Agent writes the description into README.md with the citation token embedded",
11
+ "Agent runs vctx_verify_workspace on README.md",
12
+ "Agent confirms ok: true before considering the task complete"
13
+ ]
14
+ },
15
+ {
16
+ "skills": ["vericontext-enforcer"],
17
+ "query": "Bring the project structure section of AGENTS.md up to date",
18
+ "files": ["AGENTS.md"],
19
+ "expected_behavior": [
20
+ "Agent begins Phase 1: identifies directory/file references in structure section",
21
+ "Agent calls vctx_claim for each directory and file mentioned",
22
+ "Agent embeds claim tokens as HTML comments in the structure section",
23
+ "Agent runs vctx_verify_workspace on AGENTS.md",
24
+ "Agent confirms ok: true before considering the task complete"
25
+ ]
26
+ }
27
+ ]
@@ -0,0 +1,15 @@
1
+ [
2
+ {
3
+ "skills": ["vericontext-enforcer"],
4
+ "query": "I modified src/cli.ts. Please update the related docs too",
5
+ "files": ["src/cli.ts", "README.md", "AGENTS.md"],
6
+ "expected_behavior": [
7
+ "Agent identifies documents that reference src/cli.ts by scanning for vctx citations",
8
+ "Agent runs vctx_verify_workspace on each document to find stale claims",
9
+ "For each failed claim, agent regenerates citation with updated content",
10
+ "Agent replaces stale citation tokens in the documents",
11
+ "Agent re-runs verification and confirms ok: true for all documents",
12
+ "Agent reports which documents and citations were updated"
13
+ ]
14
+ }
15
+ ]
@@ -0,0 +1,52 @@
1
+ # VeriContext Citation & Claim Format Guide
2
+
3
+ ## Citation (code line reference)
4
+
5
+ ### Format
6
+ ````
7
+ [[vctx:<relative-path>#L<start>-L<end>@<hash8>]]
8
+ ````
9
+
10
+ ### Components
11
+ - `relative-path`: path relative to the project root (e.g. `src/cli.ts`)
12
+ - `L<start>-L<end>`: 1-based line range (e.g. `L36-L103`)
13
+ - `hash8`: first 8 characters of the SHA-256 hash (generated automatically by the tool)
14
+
15
+ ### Examples
16
+ ````
17
+ [[vctx:src/cli.ts#L36-L103@4db34b59]]
18
+ [[vctx:src/verify/workspace.ts#L46-L109@39dad95f]]
19
+ ````
20
+
21
+ ### Verification Behavior
22
+ - Reads the specified line range of the file and recomputes the SHA-256 hash.
23
+ - EOL normalization is applied (CRLF/CR → LF).
24
+ - A matching hash yields `ok: true`; a mismatch yields `hash_mismatch`.
25
+
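
The recomputation described above can be approximated in plain shell. A sketch under the stated assumptions (LF-only content, SHA-256 over the span joined with newlines, first 8 hex chars kept); the demo file and its contents are hypothetical, and `sha256sum` is assumed available (GNU coreutils):

```shell
# Recompute a hash8 for lines 1-2 of a sample file, following the
# verification behavior above: hash the LF-joined span, keep 8 hex chars.
printf 'alpha\nbravo\ncharlie\n' > /tmp/vctx-demo.txt
span=$(sed -n '1,2p' /tmp/vctx-demo.txt)   # command substitution strips the trailing newline
hash8=$(printf '%s' "$span" | sha256sum | cut -c1-8)
echo "[[vctx:demo.txt#L1-L2@${hash8}]]"
```

This is an illustration of the hashing rule, not a replacement for `vctx_cite`, which also performs path normalization and binary/UTF-8 checks.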
26
+ ## Structure Claim (structure declaration)
27
+
28
+ ### Format
29
+ | Kind | Format | Meaning |
30
+ |------|------|------|
31
+ | `exists` | `[[vctx-exists:<path>]]` | exists as a file or directory |
32
+ | `exists-file` | `[[vctx-exists-file:<path>]]` | exists as a file |
33
+ | `exists-dir` | `[[vctx-exists-dir:<path>]]` | exists as a directory |
34
+ | `missing` | `[[vctx-missing:<path>]]` | does not exist |
35
+
36
+ ### Examples
37
+ ````
38
+ <!-- [[vctx-exists-dir:src/]] -->
39
+ <!-- [[vctx-exists-file:package.json]] -->
40
+ <!-- [[vctx-missing:tmp-output/]] -->
41
+ ````
42
+
43
+ ## Error Codes
44
+
45
+ | Reason | Cause | Remedy |
46
+ |--------|------|------|
47
+ | `hash_mismatch` | file content changed | re-cite (`vctx_cite`) |
48
+ | `file_missing` | file deleted or moved | fix the document or remove the claim |
49
+ | `range_invalid` | line range out of bounds | re-cite with a valid range |
50
+ | `path_escape` | path escapes the root (e.g. `../`) | fix the relative path |
51
+ | `not_file` | file claim on a directory | change kind to `exists-dir` |
52
+ | `not_dir` | dir claim on a file | change kind to `exists-file` |
@@ -0,0 +1,72 @@
1
+ # VeriContext Cross-Agent Setup Guide
2
+
3
+ ## Prerequisites
4
+
5
+ - Node.js >= 20
6
+ - VeriContext CLI installed:
7
+ ```bash
8
+ npm install # (inside the vericontext project)
9
+ npm run build # produces dist/cli.js
10
+ ```
11
+ - Or install globally: `npm install -g vericontext`
12
+
13
+ ## Claude Code
14
+
15
+ ### Registering the MCP Server
16
+
17
+ In `~/.claude/settings.json` or the project's `.claude/settings.json`:
18
+ ```json
19
+ {
20
+ "mcpServers": {
21
+ "vericontext": {
22
+ "command": "node",
23
+ "args": ["<path-to>/dist/cli.js", "mcp"]
24
+ }
25
+ }
26
+ }
27
+ ```
28
+
29
+ ### Automatic Skill Loading
30
+ If the `.claude/skills/vericontext-enforcer` symlink is present, the skill loads automatically.
31
+
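
Creating that symlink can be sketched as follows; the source path `skills/vericontext-enforcer` is an assumption, so adjust it to wherever the skill lives in your checkout:

```shell
# Link the skill into .claude/skills so Claude Code picks it up automatically.
mkdir -p .claude/skills
ln -sfn "$PWD/skills/vericontext-enforcer" .claude/skills/vericontext-enforcer
```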
32
+ ## Codex (OpenAI)
33
+
34
+ ### The AGENTS.md Approach
35
+ Add the following to `AGENTS.md` in the project root:
36
+
37
+ ````markdown
38
+ ## DOCUMENT VERIFICATION
39
+ When writing or editing documents in this project:
40
+ 1. When referencing code, generate a citation with `vericontext cite --root . --path <file> --start-line <N> --end-line <M> --json`.
41
+ 2. When stating that a file/directory exists, generate a claim with `vericontext claim --root . --kind <kind> --path <path> --json`.
42
+ 3. Before committing, run `vericontext verify workspace --root . --in-path <document.md> --json` to confirm every claim is valid.
43
+ 4. If `ok: false`, do not commit; fix the failed claims and re-verify.
44
+ ````
45
+
46
+ ## OpenCode
47
+
48
+ ### Custom Rules or System Prompt
49
+ Add the same instructions as for Codex to `.opencode/instructions.md` or the system prompt.
50
+ If MCP is supported, register the MCP server exactly as for Claude Code.
51
+
52
+ ## Antigravity
53
+
54
+ ### The INSTRUCTIONS.md Approach
55
+ Add the same instructions as the AGENTS.md content above to `INSTRUCTIONS.md` in the project root.
56
+
57
+ ## CI/CD Gating (optional)
58
+
59
+ ### GitHub Actions Example
60
+ ```yaml
61
+ - name: Verify documentation freshness
62
+ run: |
63
+ for doc in README.md AGENTS.md; do
64
+ result=$(npx vericontext verify workspace --root . --in-path "$doc" --json)
65
+ ok=$(echo "$result" | jq -r '.ok')
66
+ if [ "$ok" != "true" ]; then
67
+ echo "::error::Stale claims in $doc"
68
+ echo "$result" | jq '.results[] | select(.ok == false)'
69
+ exit 1
70
+ fi
71
+ done
72
+ ```
@@ -0,0 +1,79 @@
1
+ # VeriContext Verification Playbook
2
+
3
+ ## Running verification
4
+
5
+ ### Verifying a single document
6
+ ```bash
7
+ vericontext verify workspace --root . --in-path README.md --json
8
+ ```
9
+
10
+ ### Response structure
11
+ ```json
12
+ {
13
+ "ok": true|false,
14
+ "total": <number>,
15
+ "ok_count": <number>,
16
+ "fail_count": <number>,
17
+ "results": [
18
+ {
19
+ "claim": "<full claim string>",
20
+ "ok": true|false,
21
+ "reason": "<error code if failed>"
22
+ }
23
+ ]
24
+ }
25
+ ```
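A quick way to read that payload from a shell, using the same no-dependency grep trick the bundled verify-docs.sh script relies on. The sample payload below is illustrative, not real tool output:

```shell
# Illustrative sample of a failing verify response (values are made up).
RESULT='{"ok":false,"total":2,"ok_count":1,"fail_count":1,"results":[{"claim":"example claim string","ok":false,"reason":"file_missing"}]}'
# Pull the top-level ok flag without a JSON parser.
OK=$(echo "$RESULT" | grep -o '"ok":[a-z]*' | head -1 | cut -d: -f2)
echo "$OK"   # prints: false
```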
26
+
27
+ ## Fix procedures by failure type
28
+
29
+ ### 1. `hash_mismatch` — file content has changed
30
+
31
+ ```
32
+ Cause: the content of the lines the citation points to has changed.
33
+ Fix:
34
+ 1. Check the file's current content.
35
+ 2. Confirm the referenced code is still in the same place.
36
+ - Same place: re-run vctx_cite and swap in the new hash.
37
+ - Moved: find the correct line range and re-cite.
38
+ - Deleted: remove or rewrite the reference in the document.
39
+ 3. Replace the citation token in the document with the new one.
40
+ ```
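For steps 1 and 2, printing the currently cited range is a quick way to see what changed. A sketch with a stand-in file; the real path and line range come from the failing claim:

```shell
# Stand-in for the cited file; substitute the path from the failing claim.
printf 'one\ntwo\nthree\nfour\n' > /tmp/vctx-hash-demo.txt
# Print the cited range (lines 2-3 here) and compare it with the document.
sed -n '2,3p' /tmp/vctx-hash-demo.txt   # prints: two, three
```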
41
+
42
+ ### 2. `file_missing` — file does not exist
43
+
44
+ ```
45
+ Cause: the file was deleted or renamed.
46
+ Fix:
47
+ 1. Check whether the file was moved or renamed.
48
+ - Moved: regenerate the citation with the new path.
49
+ - Deleted: remove the reference from the document.
50
+ 2. Fix any related structure claims (exists-file, etc.) as well.
51
+ ```
52
+
53
+ ### 3. `range_invalid` — line range out of bounds
54
+
55
+ ```
56
+ Cause: the file shrank, so the specified end_line now exceeds the file length.
57
+ Fix:
58
+ 1. Check the file's current line count.
59
+ 2. Find the current line range of the referenced code.
60
+ 3. Re-cite with the correct range.
61
+ ```
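Steps 1 and 2 are one-liners. For example, with a stand-in file (the real path comes from the failing claim):

```shell
# Stand-in file; replace with the file the citation points at.
printf 'alpha\nbeta\ngamma\n' > /tmp/vctx-range-demo.txt
# Current line count: the upper bound for --end-line (3 here).
wc -l < /tmp/vctx-range-demo.txt
# Locate the code you meant to cite to recover the new line range.
grep -n 'beta' /tmp/vctx-range-demo.txt   # prints: 2:beta
```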
62
+
63
+ ### 4. `missing` / `not_file` / `not_dir` — structure claim failed
64
+
65
+ ```
66
+ Cause: the file or directory is not in the state the claim expects.
67
+ Fix:
68
+ 1. Check the actual state (file, directory, or missing).
69
+ 2. Change the claim's kind to match the actual state, or fix the document text.
70
+ ```
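Step 1 can be a simple shell test. This sketch maps the actual state of a path to the claim kind it should carry; the demo path is hypothetical:

```shell
# Demo path; substitute the path from the failing structure claim.
p=/tmp/vctx-kind-demo
mkdir -p "$p"
# Report which claim kind matches reality.
if [ -d "$p" ]; then echo exists-dir
elif [ -f "$p" ]; then echo exists-file
else echo missing
fi
```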
71
+
72
+ ## Re-verifying after fixes
73
+
74
+ After making fixes, always run verification again:
75
+ ```bash
76
+ vericontext verify workspace --root . --in-path <edited-document.md> --json
77
+ ```
78
+
79
+ Repeat until the result is `ok: true`. Do not commit until verification passes.
@@ -0,0 +1,58 @@
1
+ #!/usr/bin/env bash
2
+ # verify-docs.sh — verify every project document containing VeriContext claims
3
+ # Even a code-only change can break citation hashes, so verify every .md file
4
+ # containing claims, regardless of what the document diff touches.
5
+ # Dependencies: node >= 20, vericontext CLI (dist/cli.js)
6
+ # Exit codes: 0 = all passed, 1 = failures found
7
+
8
+ set -euo pipefail
9
+
10
+ ROOT="$(cd "${1:-.}" && pwd)"
11
+ VCTX="${VERICONTEXT_BIN:-npx vericontext}"
12
+
13
+ # Find .md files containing claim tokens ([[vctx),
14
+ # excluding internal tooling/plan/skill directories.
15
+ DOCS=$(grep -rl '\[\[vctx' "$ROOT" --include='*.md' \
16
+ | grep -v node_modules \
17
+ | grep -v '\.git/' \
18
+ | grep -v '\.agents/skills/' \
19
+ | grep -v '\.claude/skills/' \
20
+ | grep -v '\.sisyphus/' \
21
+ | sed "s|^$ROOT/||" \
22
+ | sort || true)
23
+
24
+ if [ -z "$DOCS" ]; then
25
+ echo "[vericontext-enforcer] No documents with vctx claims found."
26
+ exit 0
27
+ fi
28
+
29
+ FAILED=0
30
+ TOTAL=0
31
+
32
+ while IFS= read -r doc; do
33
+ [ -z "$doc" ] && continue
34
+ TOTAL=$((TOTAL + 1))
35
+ echo "[vericontext-enforcer] Verifying: $doc"
36
+
37
+ RESULT=$($VCTX verify workspace --root "$ROOT" --in-path "$doc" --json 2>/dev/null || echo '{"ok":false,"error":"command_failed"}')
38
+ OK=$(echo "$RESULT" | grep -o '"ok":[a-z]*' | head -1 | cut -d: -f2)
39
+
40
+ if [ "$OK" = "true" ]; then
41
+ echo " PASS"
42
+ else
43
+ echo " FAIL"
44
+ echo "$RESULT" | python3 -m json.tool 2>/dev/null || echo "$RESULT"
45
+ FAILED=$((FAILED + 1))
46
+ fi
47
+ done <<< "$DOCS"
48
+
49
+ echo ""
50
+ echo "[vericontext-enforcer] Results: $((TOTAL - FAILED))/$TOTAL passed"
51
+
52
+ if [ "$FAILED" -gt 0 ]; then
53
+ echo "[vericontext-enforcer] $FAILED document(s) have stale claims. Fix before committing."
54
+ exit 1
55
+ fi
56
+
57
+ echo "[vericontext-enforcer] All documents verified."
58
+ exit 0
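To run the script as a local commit gate, it can be installed as a git pre-commit hook. The sketch below assumes verify-docs.sh is checked in at the repo root; adjust the path if yours lives elsewhere:

```shell
# Install a pre-commit hook that runs verify-docs.sh from the repo root.
# Assumes a git checkout with verify-docs.sh at the top level.
mkdir -p .git/hooks
printf '%s\n' '#!/usr/bin/env bash' \
  'exec "$(git rev-parse --show-toplevel)/verify-docs.sh"' \
  > .git/hooks/pre-commit
chmod +x .git/hooks/pre-commit
```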