@levnikolaevich/hex-line-mcp 1.15.0 → 1.16.0

This diff shows the changes between publicly released versions of this package, as they appear in their public registry. It is provided for informational purposes only.
Files changed (3)
  1. package/README.md +27 -3
  2. package/dist/server.mjs +376 -82
  3. package/package.json +1 -1
package/README.md CHANGED
@@ -39,7 +39,7 @@ Advanced / occasional:
  | `outline` | AST-based structural overview with hash anchors via tree-sitter WASM. Supports JavaScript/TypeScript, Python, C#, PHP, and fence-aware markdown headings | 95% token reduction, direct edit anchors |
  | `verify` | Check if held checksums / revision are still current | Staleness check without full re-read |
  | `inspect_path` | Unified file-or-directory inspection | Minimal tree discovery by default, deeper traversal on demand |
- | `changes` | Compare file against git ref, shows added/removed/modified symbols | AST-level semantic diff |
+ | `changes` | Compare file against git ref, shows added/removed/modified symbols | AST-level semantic diff with risk/provenance preview |
  | `bulk_replace` | Search-and-replace across multiple files inside an explicit root path | Compact summary (default) or capped diffs via `format`, dry_run, max_files |

  ### Hooks (SessionStart + PreToolUse + PostToolUse)
@@ -86,6 +86,15 @@ npm run lint
  npm run check
  ```

+ For the full MCP workspace regression pass from the repository root, run the package test scripts explicitly:
+
+ ```bash
+ npm --prefix mcp --workspace hex-common test
+ npm --prefix mcp --workspace hex-line-mcp test
+ npm --prefix mcp --workspace hex-graph-mcp test
+ npm --prefix mcp --workspace hex-ssh-mcp test:all
+ ```
+
  Maintainers can also run the internal scenario harness when they want reproducible repo-local workflow regressions:

  ```bash
@@ -93,7 +102,9 @@ npm run scenarios -- --repo /path/to/repo
  npm run scenarios:diagnostic -- --repo /path/to/repo
  ```

- Comparative built-in vs hex-line benchmarks are maintained outside this package.
+ The diagnostics run reports graph payload overhead and auto-refresh telemetry. Read those rows as engineering diagnostics, not as a compression score.
+
+ Comparative built-in vs hex-line benchmarks are maintained outside this package. External-baseline comparisons must reuse the same scenario suite and correctness contract before making broader claims.

  ### Optional Graph Enrichment
 
@@ -103,8 +114,16 @@ If a project already has `.hex-skills/codegraph/index.db`, `hex-line` automatically
  - Graph enrichment is project-deterministic. `hex-line` only uses the graph database that belongs to the resolved current project scope.
  - Nested projects do not inherit graph hints from a parent repo index once a nested project boundary is detected.
  - `better-sqlite3` is optional. If it is unavailable, `hex-line` still works without graph hints.
+ - `read_file` only emits the top-line `Graph:` header in `verbosity=full`.
+ - `grep_search` only emits line-level graph annotations in `output=content` when `editReady=true`. `summary`, `files`, `count`, and discovery-only content search stay graph-free.
+ - Grep annotations are capped and count-suffixes are deduplicated so the first label keeps the counts and later labels stay short.
  - `read_file`, `outline`, and `grep_search` stay compact: they only surface high-signal local facts such as `api`, framework entrypoints, callers, flow, and clone hints.
  - `edit_file` and `changes` surface the deeper review layer: external callers, downstream return/property flow, clone peers, public API risk, framework entrypoint risk, and same-name sibling warnings when present.
+ - Both tools now expose a short top-level preview before the detailed sections: `payload_sections`, `graph_enrichment`, count fields, and `provenance_summary`.
+ - Stale graph data never changes correctness. `hex-line` suppresses stale hints, schedules a best-effort `reindexFile(...)` for a stale file, and escalates to one background `indexProject(...)` when several files go stale inside the burst window.
+ - The threshold-crossing file still keeps its point refresh so a short stale burst does not create a blind gap before the project refresh finishes.
+ - `edit_file` may still use stale graph context as same-response advisory metadata when the file was just edited, but stale graph never gates the edit protocol itself.
+ - `npm run scenarios:diagnostic -- --with-graph` prints graph payload overhead plus an auto-refresh probe line with `suppressions`, `file_refresh`, `project_refresh`, and `threshold_hits`.

  `hex-line` does not read `hex-graph` internals directly anymore. The integration uses a small read-only contract exposed by `hex-graph-mcp`:
 
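The stale-burst escalation described in the bullets above can be sketched as a standalone helper. This mirrors the `recordStaleFile` function added to `dist/server.mjs` in this release (constants and behavior come from the diff itself; the standalone module framing and the injectable `now` parameter are illustrative):

```javascript
// Sketch of the stale-burst escalation rule (mirrors recordStaleFile in
// dist/server.mjs; the standalone framing and `now` parameter are illustrative).
// A whole-project refresh is triggered once 3 distinct files go stale
// within a single 2-second window; below that, each file gets a point refresh.
const PROJECT_REFRESH_THRESHOLD = 3;
const PROJECT_REFRESH_WINDOW_MS = 2e3;
const windows = new Map(); // projectRoot -> { startedAt, files: Set }

function recordStaleFile(projectRoot, relativeFile, now = Date.now()) {
  let windowState = windows.get(projectRoot);
  if (!windowState || now - windowState.startedAt > PROJECT_REFRESH_WINDOW_MS) {
    // A stale event outside the window starts a fresh counting window.
    windowState = { startedAt: now, files: new Set() };
    windows.set(projectRoot, windowState);
  }
  windowState.files.add(relativeFile);
  // true => escalate from per-file reindexFile(...) to indexProject(...)
  return windowState.files.size >= PROJECT_REFRESH_THRESHOLD;
}
```

Note that the set counts *distinct* files, so repeatedly re-checking the same stale file never escalates on its own.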
@@ -224,6 +243,11 @@ Result footer includes:
  - `reason: ...` as the canonical machine-readable cause for the current status
  - `revision: ...`
  - `file: ...`
+ - `summary: ...` with edited line span / diff counts on successful edits
+ - `payload_sections: ...` so callers know which detailed sections follow
+ - `graph_enrichment: available | unavailable`
+ - `semantic_impact_count: ...`, `semantic_fact_count: ...`, and `clone_warning_count: ...` when graph-backed review data is available
+ - `provenance_summary: ...` for the edit protocol + optional graph source
  - `changed_ranges: ...` when relevant
  - `recovery_ranges: ...` with the narrowest recommended `read_file` ranges for retry
  - `next_action: ...` as the canonical immediate choice: `apply_retry_edit`, `apply_retry_batch`, or `reread_then_retry`
@@ -233,7 +257,7 @@ Result footer includes:
  - `retry_edits: ...` on conservative batch conflicts when every conflicted edit can be retried directly
  - `suggested_read_call: ...` when rereading is the safest next step
  - `retry_plan: ...` with a compact machine-readable next-call plan
- - `summary: ...` and `snippet: ...` instead of long prose blocks
+ - `summary: ...` and `snippet: ...` instead of long prose blocks on conflicts
  - `edit_conflicts: N` on conservative multi-edit preflight conflicts

  ### write_file
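The footer fields listed above combine into a compact machine-readable block. An illustrative successful `edit_file` footer follows; the field names and the `summary`/`provenance_summary` formats come from this diff, while the concrete values, the angle-bracket placeholders, and the lowercase `status`/`reason` strings are invented for illustration:

```
status: ok
reason: edit_applied
revision: <revision>
file: <file-checksum>
summary: lines_changed=118-126 diff_entries=9 lines_after=412
payload_sections: post_edit,semantic_impact,diff
graph_enrichment: available
semantic_impact_count: 1
semantic_fact_count: 4
clone_warning_count: 0
provenance_summary: edit_protocol=snapshot+anchors graph=hex_line_contract
```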
package/dist/server.mjs CHANGED
@@ -103,7 +103,7 @@ async function checkForUpdates(packageName, currentVersion) {
  }

  // lib/read.mjs
- import { statSync as statSync4 } from "node:fs";
+ import { statSync as statSync5 } from "node:fs";

  // lib/security.mjs
  import { realpathSync, statSync as statSync2, existsSync, openSync, readSync, closeSync } from "node:fs";
@@ -307,7 +307,7 @@ function validateWritePath(filePath) {
  }

  // lib/graph-enrich.mjs
- import { existsSync as existsSync2 } from "node:fs";
+ import { existsSync as existsSync2, statSync as statSync3 } from "node:fs";
  import { join as join3, dirname as dirname2, relative } from "node:path";
  import { createRequire } from "node:module";
  var REQUIRED_VIEWS = [
@@ -331,6 +331,20 @@ var FACT_PRIORITY = /* @__PURE__ */ new Map([
  ]);
  var _dbs = /* @__PURE__ */ new Map();
  var _driverUnavailable = false;
+ var _reindexUnavailable = false;
+ var _reindexLoader = null;
+ var _pendingRefreshes = /* @__PURE__ */ new Map();
+ var _pendingProjectRefreshes = /* @__PURE__ */ new Map();
+ var _freshnessCache = /* @__PURE__ */ new Map();
+ var _projectRefreshWindows = /* @__PURE__ */ new Map();
+ var _graphRefreshStats = {
+ staleSuppressions: 0,
+ fileRefreshScheduled: 0,
+ fileRefreshCompleted: 0,
+ projectRefreshThresholdHits: 0,
+ projectRefreshScheduled: 0,
+ projectRefreshCompleted: 0
+ };
  var MAX_PROJECT_ROOT_ASCENT = 25;
  var PROJECT_BOUNDARY_MARKERS = [
  "package.json",
@@ -343,14 +357,24 @@ var PROJECT_BOUNDARY_MARKERS = [
  "deno.jsonc",
  ".git"
  ];
- function getGraphDB(filePath) {
+ var FRESHNESS_TOLERANCE_MS = 1;
+ var FRESHNESS_CACHE_TTL_MS = 1e3;
+ var PROJECT_REFRESH_THRESHOLD = 3;
+ var PROJECT_REFRESH_WINDOW_MS = 2e3;
+ function getGraphDB(filePath, { allowStale = false } = {}) {
  if (_driverUnavailable) return null;
  try {
  const projectRoot = findProjectRoot(filePath);
  if (!projectRoot) return null;
  const dbPath = join3(projectRoot, ".hex-skills/codegraph", "index.db");
  if (!existsSync2(dbPath)) return null;
- if (_dbs.has(dbPath)) return _dbs.get(dbPath);
+ if (_dbs.has(dbPath)) {
+ const cached = _dbs.get(dbPath);
+ if (isFilePathFresh(cached, projectRoot, filePath)) return cached;
+ if (allowStale) return cached;
+ _graphRefreshStats.staleSuppressions++;
+ return null;
+ }
  const require2 = createRequire(import.meta.url);
  const Database = require2("better-sqlite3");
  const db = new Database(dbPath, { readonly: true });
@@ -359,7 +383,10 @@ function getGraphDB(filePath) {
  return null;
  }
  _dbs.set(dbPath, db);
- return db;
+ if (isFilePathFresh(db, projectRoot, filePath)) return db;
+ if (allowStale) return db;
+ _graphRefreshStats.staleSuppressions++;
+ return null;
  } catch {
  _driverUnavailable = true;
  return null;
@@ -402,6 +429,37 @@ function compactSymbolCounts(node) {
  if ((node.clone_sibling_count || 0) > 0) parts.push(`clone ${node.clone_sibling_count}`);
  return parts;
  }
+ function lineFactLabel(fact) {
+ switch (fact.fact_kind) {
+ case "public_api":
+ return "api";
+ case "framework_entrypoint":
+ return fact.related_display_name ? `entry:${fact.related_display_name}` : "entry";
+ case "definition":
+ return shortKind(fact.kind);
+ case "callee":
+ return fact.related_display_name ? `callee:${fact.related_display_name}` : "callee";
+ case "caller":
+ return fact.related_display_name ? `caller:${fact.related_display_name}` : "caller";
+ case "outgoing_flow":
+ return `flow-out:${fact.target_anchor_kind || "?"}`;
+ case "incoming_flow":
+ return `flow-in:${fact.target_anchor_kind || "?"}`;
+ case "through_flow":
+ return "flow-through";
+ case "clone":
+ return "clone";
+ case "hotspot":
+ return "hotspot";
+ default:
+ return fact.fact_kind;
+ }
+ }
+ function formatLineFact(fact, { includeCounts = true } = {}) {
+ const countParts = includeCounts ? compactSymbolCounts(fact) : [];
+ const suffix = countParts.length > 0 ? ` | ${countParts.join(" | ")}` : "";
+ return `[${lineFactLabel(fact)}${suffix}]`;
+ }
  function symbolAnnotation(db, file, name) {
  try {
  const node = db.prepare(
@@ -419,6 +477,18 @@ function symbolAnnotation(db, file, name) {
  return null;
  }
  }
+ function ensureGraphFreshForFile(db, absoluteFilePath) {
+ if (!db) return false;
+ try {
+ const projectRoot = findProjectRoot(absoluteFilePath);
+ if (!projectRoot) return true;
+ const fresh = isFilePathFresh(db, projectRoot, absoluteFilePath);
+ if (!fresh) _graphRefreshStats.staleSuppressions++;
+ return fresh;
+ } catch {
+ return true;
+ }
+ }
  function fileAnnotations(db, file, { startLine = null, endLine = null, limit = 8 } = {}) {
  try {
  const hasRange = Number.isInteger(startLine) && Number.isInteger(endLine);
@@ -457,34 +527,6 @@ function fileAnnotations(db, file, { startLine = null, endLine = null, limit = 8
  return [];
  }
  }
- function formatLineFact(fact) {
- const countParts = compactSymbolCounts(fact);
- const suffix = countParts.length > 0 ? ` | ${countParts.join(" | ")}` : "";
- switch (fact.fact_kind) {
- case "public_api":
- return `[api${suffix}]`;
- case "framework_entrypoint":
- return fact.related_display_name ? `[entry:${fact.related_display_name}${suffix}]` : `[entry${suffix}]`;
- case "definition":
- return `[${shortKind(fact.kind)}${suffix}]`;
- case "callee":
- return fact.related_display_name ? `[callee:${fact.related_display_name}${suffix}]` : `[callee${suffix}]`;
- case "caller":
- return fact.related_display_name ? `[caller:${fact.related_display_name}${suffix}]` : `[caller${suffix}]`;
- case "outgoing_flow":
- return `[flow-out:${fact.target_anchor_kind || "?"}${suffix}]`;
- case "incoming_flow":
- return `[flow-in:${fact.target_anchor_kind || "?"}${suffix}]`;
- case "through_flow":
- return `[flow-through${suffix}]`;
- case "clone":
- return `[clone${suffix}]`;
- case "hotspot":
- return `[hotspot${suffix}]`;
- default:
- return `[${fact.fact_kind}${suffix}]`;
- }
- }
  function priorityForFact(factKind) {
  return FACT_PRIORITY.get(factKind) ?? 99;
  }
@@ -513,10 +555,14 @@ function matchAnnotation(db, file, line) {
  facts.sort((left, right) => priorityForFact(left.fact_kind) - priorityForFact(right.fact_kind));
  const labels = [];
  const seenKinds = /* @__PURE__ */ new Set();
+ const seenCountKeys = /* @__PURE__ */ new Set();
  for (const fact of facts) {
  if (seenKinds.has(fact.fact_kind)) continue;
  seenKinds.add(fact.fact_kind);
- labels.push(formatLineFact(fact));
+ const countKey = compactSymbolCounts(fact).join("|");
+ const includeCounts = countKey.length > 0 && !seenCountKeys.has(countKey);
+ if (includeCounts) seenCountKeys.add(countKey);
+ labels.push(formatLineFact(fact, { includeCounts }));
  if (labels.length >= 3) break;
  }
  return labels.join(" ");
@@ -639,9 +685,140 @@ function findProjectRoot(filePath) {
  }
  return null;
  }
+ function freshnessCacheKey(projectRoot, relativeFile) {
+ return `${projectRoot}:${relativeFile}`;
+ }
+ function normalizeRelativeFile(projectRoot, filePath) {
+ const relFile = relative(projectRoot, filePath).replace(/\\/g, "/");
+ if (!relFile || relFile.startsWith("..")) return null;
+ return relFile;
+ }
+ function lookupIndexedMtime(db, relativeFile) {
+ try {
+ const row = db.prepare("SELECT mtime FROM files WHERE path = ? LIMIT 1").get(relativeFile);
+ return row?.mtime ?? null;
+ } catch {
+ return null;
+ }
+ }
+ function isFilePathFresh(db, projectRoot, filePath) {
+ let stat;
+ try {
+ stat = statSync3(filePath);
+ } catch {
+ return false;
+ }
+ if (!stat.isFile()) return true;
+ const relativeFile = normalizeRelativeFile(projectRoot, filePath);
+ if (!relativeFile) return true;
+ const cacheKey = freshnessCacheKey(projectRoot, relativeFile);
+ const cached = _freshnessCache.get(cacheKey);
+ if (cached && Date.now() - cached.checkedAt < FRESHNESS_CACHE_TTL_MS && Math.abs(cached.actualMtime - stat.mtimeMs) < FRESHNESS_TOLERANCE_MS) {
+ return cached.fresh;
+ }
+ const indexedMtime = lookupIndexedMtime(db, relativeFile);
+ const fresh = indexedMtime !== null && Math.abs(indexedMtime - stat.mtimeMs) < FRESHNESS_TOLERANCE_MS;
+ _freshnessCache.set(cacheKey, {
+ checkedAt: Date.now(),
+ actualMtime: stat.mtimeMs,
+ indexedMtime,
+ fresh
+ });
+ if (!fresh) scheduleGraphRefresh(projectRoot, relativeFile, cacheKey);
+ return fresh;
+ }
+ function scheduleGraphRefresh(projectRoot, relativeFile, cacheKey) {
+ if (_reindexUnavailable || _pendingRefreshes.has(cacheKey)) return;
+ const triggeredProjectRefresh = recordStaleFile(projectRoot, relativeFile);
+ if (triggeredProjectRefresh) {
+ _graphRefreshStats.projectRefreshThresholdHits++;
+ scheduleProjectGraphRefresh(projectRoot);
+ }
+ if (_pendingProjectRefreshes.has(projectRoot) && !triggeredProjectRefresh) return;
+ const refresh = (async () => {
+ const indexer = await loadGraphIndexer();
+ if (!indexer?.reindexFile) return;
+ try {
+ await indexer.reindexFile(projectRoot, relativeFile);
+ } catch {
+ } finally {
+ clearProjectDBCache(projectRoot);
+ _freshnessCache.delete(cacheKey);
+ _graphRefreshStats.fileRefreshCompleted++;
+ }
+ })();
+ _graphRefreshStats.fileRefreshScheduled++;
+ _pendingRefreshes.set(cacheKey, refresh);
+ void refresh.finally(() => {
+ _pendingRefreshes.delete(cacheKey);
+ });
+ }
+ function recordStaleFile(projectRoot, relativeFile) {
+ const now = Date.now();
+ let windowState = _projectRefreshWindows.get(projectRoot);
+ if (!windowState || now - windowState.startedAt > PROJECT_REFRESH_WINDOW_MS) {
+ windowState = { startedAt: now, files: /* @__PURE__ */ new Set() };
+ _projectRefreshWindows.set(projectRoot, windowState);
+ }
+ windowState.files.add(relativeFile);
+ return windowState.files.size >= PROJECT_REFRESH_THRESHOLD;
+ }
+ function clearProjectFreshness(projectRoot) {
+ const prefix = `${projectRoot}::`;
+ for (const key of _freshnessCache.keys()) {
+ if (key.startsWith(prefix)) _freshnessCache.delete(key);
+ }
+ }
+ function clearProjectDBCache(projectRoot) {
+ const dbPath = join3(projectRoot, ".hex-skills/codegraph", "index.db");
+ const db = _dbs.get(dbPath);
+ if (!db) return;
+ try {
+ db.close();
+ } catch {
+ }
+ _dbs.delete(dbPath);
+ }
+ function scheduleProjectGraphRefresh(projectRoot) {
+ if (_reindexUnavailable || _pendingProjectRefreshes.has(projectRoot)) return;
+ const refresh = (async () => {
+ const indexer = await loadGraphIndexer();
+ if (!indexer?.indexProject) return;
+ try {
+ await indexer.indexProject(projectRoot);
+ } catch {
+ } finally {
+ clearProjectDBCache(projectRoot);
+ clearProjectFreshness(projectRoot);
+ _projectRefreshWindows.delete(projectRoot);
+ _graphRefreshStats.projectRefreshCompleted++;
+ }
+ })();
+ _graphRefreshStats.projectRefreshScheduled++;
+ _pendingProjectRefreshes.set(projectRoot, refresh);
+ void refresh.finally(() => {
+ _pendingProjectRefreshes.delete(projectRoot);
+ });
+ }
+ async function loadGraphIndexer() {
+ if (_reindexUnavailable) return null;
+ if (_reindexLoader) return _reindexLoader;
+ _reindexLoader = (async () => {
+ try {
+ const mod = await import(new URL("../../hex-graph-mcp/lib/indexer.mjs", import.meta.url));
+ const reindexFile = typeof mod.reindexFile === "function" ? mod.reindexFile : null;
+ const indexProject = typeof mod.indexProject === "function" ? mod.indexProject : null;
+ if (reindexFile || indexProject) return { reindexFile, indexProject };
+ } catch {
+ }
+ _reindexUnavailable = true;
+ return null;
+ })();
+ return _reindexLoader;
+ }

  // lib/snapshot.mjs
- import { statSync as statSync3 } from "node:fs";
+ import { statSync as statSync4 } from "node:fs";
  import { diffLines } from "diff";

  // ../hex-common/src/text-protocol/hash.mjs
@@ -942,7 +1119,7 @@ function rememberSnapshot(filePath, input, meta = {}) {
  }
  function readSnapshot(filePath) {
  pruneExpired();
- const stat = statSync3(filePath);
+ const stat = statSync4(filePath);
  const latest = latestByFile.get(filePath);
  if (latest && latest.mtimeMs === stat.mtimeMs && latest.size === stat.size) {
  touchFile(filePath);
@@ -1219,7 +1396,7 @@ function buildContinuation({ nextOffset, limit }) {
  function readFile2(filePath, opts = {}) {
  filePath = normalizePath(filePath);
  const real = validatePath(filePath);
- const stat = statSync4(real);
+ const stat = statSync5(real);
  if (stat.isDirectory()) {
  throw new Error(`READ_FILE_EXPECTS_FILE: ${filePath} is a directory. Use inspect_path for directory trees and path discovery.`);
  }
@@ -1256,7 +1433,7 @@ function readFile2(filePath, opts = {}) {
  if (db && relFile) {
  const visibleStart = normalizedRanges.length > 0 ? Math.min(...normalizedRanges.map((range) => range.startLine)) : 1;
  const visibleEnd = normalizedRanges.length > 0 ? Math.max(...normalizedRanges.map((range) => range.endLine)) : total;
- const annos = fileAnnotations(db, relFile, { startLine: visibleStart, endLine: visibleEnd, limit: 6 });
+ const annos = fileAnnotations(db, relFile, { startLine: visibleStart, endLine: visibleEnd, limit: 4 });
  if (annos.length > 0) {
  const items = annos.map((a) => {
  const parts = [];
@@ -1346,7 +1523,7 @@ ${serializedBlocks.join("\n\n")}`.trim();
  }

  // lib/edit.mjs
- import { statSync as statSync5, writeFileSync } from "node:fs";
+ import { statSync as statSync6, writeFileSync } from "node:fs";
  import { diffLines as diffLines2 } from "diff";

  // lib/edit-resolution.mjs
@@ -1870,6 +2047,16 @@ function deriveNextAction({ retryEdit = null, retryEdits = null, suggestedReadCa
  if (suggestedReadCall) return ACTION.REREAD_THEN_RETRY;
  return ACTION.INSPECT_SNIPPET;
  }
+ function summarizeChangedSpan(minLine, maxLine) {
+ return Number.isFinite(minLine) && Number.isFinite(maxLine) && minLine !== Infinity && maxLine > 0 ? `${minLine}-${maxLine}` : "unknown";
+ }
+ function countDiffEntries(fullDiff) {
+ if (!fullDiff) return 0;
+ return fullDiff.split("\n").filter((line) => /^[+-]\d+\|/.test(line)).length;
+ }
+ function payloadSections(sections) {
+ return sections.length > 0 ? sections.join(",") : "summary_only";
+ }
  function buildRetryEdit(edit, lines, options = {}) {
  const retryChecksum = options.retryChecksum || null;
  if (edit.set_line) {
@@ -2581,14 +2768,26 @@ ${snip.text}`;
  }
  }
  }
+ const diffEntryCount = countDiffEntries(fullDiff);
+ const changedSpan = summarizeChangedSpan(minLine, maxLine);
  if (opts.dryRun) {
  let msg2 = `status: ${autoRebased ? STATUS.AUTO_REBASED : STATUS.OK}
  reason: ${REASON.DRY_RUN_PREVIEW}
  revision: ${currentSnapshot.revision}
- file: ${currentSnapshot.fileChecksum}
- Dry run: ${filePath} would change (${lines.length} lines)`;
+ file: ${currentSnapshot.fileChecksum}`;
  if (staleRevision && hasBaseSnapshot) msg2 += `
  changed_ranges: ${describeChangedRanges(changedRanges)}`;
+ msg2 += `
+ summary: lines_changed=${changedSpan} diff_entries=${diffEntryCount} lines_after=${lines.length}`;
+ msg2 += `
+ payload_sections: ${payloadSections(displayDiff ? ["diff"] : [])}`;
+ msg2 += "\ngraph_enrichment: unavailable";
+ msg2 += "\nsemantic_impact_count: 0";
+ msg2 += "\nsemantic_fact_count: 0";
+ msg2 += "\nclone_warning_count: 0";
+ msg2 += "\nprovenance_summary: edit_protocol=snapshot+anchors graph=unavailable";
+ msg2 += `
+ Dry run: ${filePath} would change (${lines.length} lines)`;
  if (displayDiff) msg2 += `

  Diff:
@@ -2598,7 +2797,7 @@ ${displayDiff}
  return msg2;
  }
  writeFileSync(real, content, "utf-8");
- const nextStat = statSync5(real);
+ const nextStat = statSync6(real);
  const nextSnapshot = rememberSnapshot(real, content, { mtimeMs: nextStat.mtimeMs, size: nextStat.size });
  let msg = `status: ${autoRebased ? STATUS.AUTO_REBASED : STATUS.OK}
  reason: ${autoRebased ? REASON.EDIT_AUTO_REBASED : REASON.EDIT_APPLIED}
@@ -2613,6 +2812,10 @@ changed_ranges: ${describeChangedRanges(changedRanges)}`;
  remapped_refs:
  ${remaps.map(({ from, to }) => `${from} -> ${to}`).join("\n")}`;
  }
+ let hasPostEditBlock = false;
+ let graphEnrichment = "unavailable";
+ let semanticImpacts = [];
+ let cloneWarnings = [];
  msg += `
  Updated ${filePath} (${lines.length} lines)`;
  if (fullDiff && minLine <= maxLine) {
@@ -2629,18 +2832,20 @@ Updated ${filePath} (${lines.length} lines)`;
  trailing_newline: nextSnapshot.trailingNewline
  }
  });
+ hasPostEditBlock = true;
  msg += `

  ${serializeReadBlock(block)}`;
  }
  }
  try {
- const db = getGraphDB(real);
+ const db = getGraphDB(real, { allowStale: true });
  const relFile = db ? getRelativePath(real) : null;
  if (db && relFile && fullDiff && minLine <= maxLine) {
- const impacts = semanticImpact(db, relFile, minLine, maxLine);
- if (impacts.length > 0) {
- const sections = impacts.map((impact) => {
+ graphEnrichment = "available";
+ semanticImpacts = semanticImpact(db, relFile, minLine, maxLine);
+ if (semanticImpacts.length > 0) {
+ const sections = semanticImpacts.map((impact) => {
  const totals = [];
  if (impact.counts.publicApi > 0) totals.push("public API");
  if (impact.counts.frameworkEntrypoints > 0) totals.push(`${impact.counts.frameworkEntrypoints} framework entrypoint`);
@@ -2668,16 +2873,35 @@ ${serializeReadBlock(block)}`;
  \u26A0 Semantic impact:
  ${sections.join("\n")}`;
  }
- const clones = cloneWarning(db, relFile, minLine, maxLine);
- if (clones.length > 0) {
- const list = clones.map((c) => `${c.file}:${c.line}`).join(", ");
+ cloneWarnings = cloneWarning(db, relFile, minLine, maxLine);
+ if (cloneWarnings.length > 0) {
+ const list = cloneWarnings.map((c) => `${c.file}:${c.line}`).join(", ");
  msg += `

- \u26A0 ${clones.length} clone(s): ${list}`;
+ \u26A0 ${cloneWarnings.length} clone(s): ${list}`;
  }
  }
  } catch {
  }
+ const payloadKinds = [];
+ if (hasPostEditBlock) payloadKinds.push("post_edit");
+ if (semanticImpacts.length > 0) payloadKinds.push("semantic_impact");
+ if (cloneWarnings.length > 0) payloadKinds.push("clone_warning");
+ if (displayDiff) payloadKinds.push("diff");
+ const semanticFactCount = semanticImpacts.reduce((sum, impact) => sum + (impact.facts?.length || 0), 0);
+ const summaryLines = [
+ `summary: lines_changed=${changedSpan} diff_entries=${diffEntryCount} lines_after=${lines.length}`,
+ `payload_sections: ${payloadSections(payloadKinds)}`,
+ `graph_enrichment: ${graphEnrichment}`,
+ `semantic_impact_count: ${semanticImpacts.length}`,
+ `semantic_fact_count: ${semanticFactCount}`,
+ `clone_warning_count: ${cloneWarnings.length}`,
+ `provenance_summary: edit_protocol=snapshot+anchors graph=${graphEnrichment === "available" ? "hex_line_contract" : "unavailable"}`
+ ].join("\n");
+ msg = msg.replace(`
+ Updated ${filePath} (${lines.length} lines)`, `
+ ${summaryLines}
+ Updated ${filePath} (${lines.length} lines)`);
  if (displayDiff) msg += `

  Diff:
@@ -2703,6 +2927,7 @@ var DEFAULT_TOTAL_LIMIT_LIST = 1e3;
  var MAX_OUTPUT = 10 * 1024 * 1024;
  var TIMEOUT2 = 3e4;
  var MAX_SEARCH_OUTPUT_CHARS = 8e4;
+ var GRAPH_MATCH_ANNOTATION_BUDGET = 12;
  function spawnRg(args) {
  return new Promise((resolve_, reject) => {
  let stdout = "";
@@ -2832,6 +3057,7 @@ async function summaryMode(pattern, target, opts, totalLimit) {
  async function contentMode(pattern, target, opts, plain, editReady, totalLimit) {
  const realArgs = ["--json"];
  const plainOutput = plain || !editReady;
+ const shouldUseGraph = editReady && !plain;
  if (opts.caseInsensitive) realArgs.push("-i");
  else if (opts.smartCase) realArgs.push("-S");
  if (opts.literal) realArgs.push("-F");
@@ -2850,8 +3076,9 @@ async function contentMode(pattern, target, opts, plain, editReady, totalLimit)
  if (code !== 0 && code !== null) throw new Error(`GREP_ERROR: rg exit ${code} \u2014 ${stderr.trim() || "unknown error"}`);
  const jsonLines = stdout.trimEnd().split("\n").filter(Boolean);
  const blocks = [];
- const db = getGraphDB(target);
+ const db = shouldUseGraph ? getGraphDB(target) : null;
  const relCache = /* @__PURE__ */ new Map();
+ let annotationBudget = GRAPH_MATCH_ANNOTATION_BUDGET;
  let groupFile = null;
  let groupEntries = [];
  let matchCount = 0;
@@ -2916,15 +3143,20 @@ async function contentMode(pattern, target, opts, plain, editReady, totalLimit)
  }
  const isMatch = msg.type === "match";
  let annotation = "";
- if (db && isMatch) {
- let rel = relCache.get(filePath);
- if (rel === void 0) {
- rel = getRelativePath(resolve2(filePath)) || "";
- relCache.set(filePath, rel);
- }
- if (rel) {
- const a = matchAnnotation(db, rel, ln);
- if (a) annotation = a;
+ if (db && isMatch && annotationBudget > 0) {
+ if (ensureGraphFreshForFile(db, resolve2(filePath))) {
+ let rel = relCache.get(filePath);
+ if (rel === void 0) {
+ rel = getRelativePath(resolve2(filePath)) || "";
+ relCache.set(filePath, rel);
+ }
+ if (rel) {
+ const a = matchAnnotation(db, rel, ln);
+ if (a) {
+ annotation = a;
+ annotationBudget--;
+ }
+ }
  }
  }
  groupEntries.push({
@@ -3390,11 +3622,11 @@ function verifyChecksums(filePath, checksums, opts = {}) {
  }

  // lib/inspect-path.mjs
- import { statSync as statSync8 } from "node:fs";
+ import { statSync as statSync9 } from "node:fs";
  import { resolve as resolve6 } from "node:path";

  // lib/tree.mjs
- import { readdirSync as readdirSync2, readFileSync as readFileSync4, statSync as statSync6, existsSync as existsSync5 } from "node:fs";
+ import { readdirSync as readdirSync2, readFileSync as readFileSync4, statSync as statSync7, existsSync as existsSync5 } from "node:fs";
  import { resolve as resolve4, basename, join as join4, relative as relative2 } from "node:path";
  import ignore from "ignore";
  var SKIP_DIRS = /* @__PURE__ */ new Set([
@@ -3431,7 +3663,7 @@ function findByPattern(dirPath, opts) {
  const maxDepth = opts.max_depth ?? 20;
  const abs = resolve4(normalizePath(dirPath));
  if (!existsSync5(abs)) throw new Error(`DIRECTORY_NOT_FOUND: ${abs}`);
- if (!statSync6(abs).isDirectory()) throw new Error(`Not a directory: ${abs}`);
+ if (!statSync7(abs).isDirectory()) throw new Error(`Not a directory: ${abs}`);
  const ig = opts.gitignore ?? true ? loadGitignore(abs) : null;
  const matches = [];
  function walk(dir, depth) {
@@ -3472,7 +3704,7 @@ function directoryTree(dirPath, opts = {}) {
  const maxDepth = compact ? 1 : opts.max_depth ?? 3;
  const abs = resolve4(normalizePath(dirPath));
  if (!existsSync5(abs)) throw new Error(`DIRECTORY_NOT_FOUND: ${abs}. Check path or use inspect_path on the parent directory.`);
- const rootStat = statSync6(abs);
+ const rootStat = statSync7(abs);
  if (!rootStat.isDirectory()) throw new Error(`Not a directory: ${abs}`);
  const ig = opts.gitignore ?? true ? loadGitignore(abs) : null;
  let totalFiles = 0;
@@ -3518,7 +3750,7 @@ function directoryTree(dirPath, opts = {}) {
  } else {
  let size = 0, mtime = null, lineCount = null;
  try {
- const st = statSync6(full);
+ const st = statSync7(full);
  size = st.size;
  mtime = st.mtime;
  } catch {
@@ -3567,7 +3799,7 @@ ${lines.join("\n")}`;
  }

  // lib/info.mjs
- import { statSync as statSync7, openSync as openSync2, readSync as readSync2, closeSync as closeSync2 } from "node:fs";
+ import { statSync as statSync8, openSync as openSync2, readSync as readSync2, closeSync as closeSync2 } from "node:fs";
  import { resolve as resolve5, isAbsolute as isAbsolute3, extname as extname2, basename as basename2 } from "node:path";
  var MAX_LINE_COUNT_SIZE = 10 * 1024 * 1024;
  var EXT_NAMES = {
@@ -3638,7 +3870,7 @@ function fileInfo(filePath) {
  if (!filePath) throw new Error("Empty file path");
  const normalized = normalizePath(filePath);
  const abs = isAbsolute3(normalized) ? normalized : resolve5(process.cwd(), normalized);
- const stat = statSync7(abs);
+ const stat = statSync8(abs);
  if (!stat.isFile()) throw new Error(`Not a regular file: ${abs}`);
  const size = stat.size;
  const mtime = stat.mtime;
@@ -3663,7 +3895,7 @@ function fileInfo(filePath) {
  // lib/inspect-path.mjs
  function inspectPath(inputPath, opts = {}) {
  const abs = resolve6(normalizePath(inputPath));
- const stat = statSync8(abs);
+ const stat = statSync9(abs);
  if (stat.isFile()) return fileInfo(abs);
  if (stat.isDirectory()) return directoryTree(abs, opts);
  throw new Error(`Unsupported path type: ${abs}`);
@@ -3814,12 +4046,12 @@ function autoSync() {
  }

  // lib/changes.mjs
- import { statSync as statSync10 } from "node:fs";
+ import { statSync as statSync11 } from "node:fs";
  import { join as join6 } from "node:path";

  // ../hex-common/src/git/semantic-diff.mjs
  import { execFileSync } from "node:child_process";
- import { existsSync as existsSync7, readFileSync as readFileSync6, statSync as statSync9 } from "node:fs";
+ import { existsSync as existsSync7, readFileSync as readFileSync6, statSync as statSync10 } from "node:fs";
  import { dirname as dirname5, extname as extname3, relative as relative3, resolve as resolve8 } from "node:path";
  function normalizePath2(value) {
  return value.replace(/\\/g, "/");
@@ -3828,7 +4060,7 @@ function repoCwd(targetPath) {
  if (!existsSync7(targetPath)) {
  throw new Error(`Path does not exist: ${targetPath}`);
  }
- return statSync9(targetPath).isDirectory() ? targetPath : dirname5(targetPath);
+ return statSync10(targetPath).isDirectory() ? targetPath : dirname5(targetPath);
  }
  function runGit(args, cwd, allowFailure = false) {
  try {
@@ -4076,6 +4308,15 @@ async function semanticGitDiff(targetPath, { baseRef = "HEAD", headRef = null }
  }

  // lib/changes.mjs
+ function graphEnrichmentState(db) {
+ return db ? "available" : "unavailable";
+ }
+ function provenanceSummary(semanticDiffState, db) {
+ return `semantic_diff=${semanticDiffState} graph=${db ? "hex_line_contract" : "unavailable"}`;
+ }
+ function payloadSections2(sections) {
+ return sections.length > 0 ? sections.join(",") : "summary_only";
+ }
  function exportedLooking(symbol) {
  return /^\s*(export|public)\b/.test(symbol.text || "");
  }
@@ -4114,9 +4355,10 @@ function summarizeRiskLine(line) {
  async function fileChanges(filePath, compareAgainst = "HEAD") {
  filePath = normalizePath(filePath);
  const real = validatePath(filePath);
- if (statSync10(real).isDirectory()) {
+ if (statSync11(real).isDirectory()) {
  const db2 = getGraphDB(join6(real, "__hex-line_probe__"));
  const diff2 = await semanticGitDiff(real, { baseRef: compareAgainst });
+ const graphEnrichment2 = graphEnrichmentState(db2);
  if (diff2.summary.changed_file_count === 0) {
  return [
  "status: NO_CHANGES",
@@ -4125,9 +4367,19 @@ async function fileChanges(filePath, compareAgainst = "HEAD") {
  `compare_against: ${compareAgainst}`,
  "scope: directory",
  "summary: changed_files=0",
- `next_action: ${ACTION.NO_ACTION}`
+ `next_action: ${ACTION.NO_ACTION}`,
+ `graph_enrichment: ${graphEnrichment2}`,
+ "risk_summary_count: 0",
+ "removed_api_warning_count: 0",
+ `payload_sections: ${payloadSections2([])}`,
+ `provenance_summary: ${provenanceSummary("clean", db2)}`
  ].join("\n");
  }
+ const supportedCount = diff2.changed_files.filter((entry) => entry.semantic_supported).length;
+ const semanticDiffState = supportedCount === 0 ? "unsupported" : supportedCount === diff2.changed_files.length ? "semantic" : "mixed";
+ let emittedRiskCount = 0;
+ let emittedRemovedApiWarnings = 0;
+ const sectionKinds2 = ["files"];
  const sections = [
  "status: CHANGED",
  `reason: ${REASON.DIRECTORY_CHANGED}`,
@@ -4136,6 +4388,7 @@ async function fileChanges(filePath, compareAgainst = "HEAD") {
  "scope: directory",
  `summary: changed_files=${diff2.summary.changed_file_count}`,
  `next_action: ${ACTION.INSPECT_FILE}`,
+ `graph_enrichment: ${graphEnrichment2}`,
  ""
  ];
  for (const file2 of diff2.changed_files) {
@@ -4145,16 +4398,32 @@ async function fileChanges(filePath, compareAgainst = "HEAD") {
  ];
  sections.push(parts2.join(" | "));
  const riskLines2 = summarizeGraphRisk(db2, file2.path.replace(/\\/g, "/"), file2);
- for (const line of riskLines2.slice(0, 2)) sections.push(`risk_summary: ${summarizeRiskLine(line)}`);
+ const visibleRiskLines = riskLines2.slice(0, 2);
+ emittedRiskCount += visibleRiskLines.length;
+ for (const line of visibleRiskLines) sections.push(`risk_summary: ${summarizeRiskLine(line)}`);
  for (const symbol of file2.removed_symbols.slice(0, 2)) {
- if (exportedLooking(symbol)) sections.push(`removed_api_warning: ${symbol.text}`);
+ if (exportedLooking(symbol)) {
+ emittedRemovedApiWarnings += 1;
+ sections.push(`removed_api_warning: ${symbol.text}`);
+ }
  }
  }
+ if (emittedRiskCount > 0) sectionKinds2.push("risk_summary");
+ if (emittedRemovedApiWarnings > 0) sectionKinds2.push("removed_api_warning");
+ sections.splice(
+ 8,
+ 0,
+ `risk_summary_count: ${emittedRiskCount}`,
+ `removed_api_warning_count: ${emittedRemovedApiWarnings}`,
+ `payload_sections: ${payloadSections2(sectionKinds2)}`,
+ `provenance_summary: ${provenanceSummary(semanticDiffState, db2)}`
+ );
  return sections.join("\n");
  }
  const db = getGraphDB(real);
  const diff = await semanticGitDiff(real, { baseRef: compareAgainst });
  const file = diff.changed_files[0];
+ const graphEnrichment = graphEnrichmentState(db);
  if (!file) {
  return [
  "status: NO_CHANGES",
@@ -4163,7 +4432,12 @@ async function fileChanges(filePath, compareAgainst = "HEAD") {
  `compare_against: ${compareAgainst}`,
  "scope: file",
  "summary: added=0 removed=0 modified=0",
- `next_action: ${ACTION.NO_ACTION}`
+ `next_action: ${ACTION.NO_ACTION}`,
+ `graph_enrichment: ${graphEnrichment}`,
+ "risk_summary_count: 0",
+ "removed_api_warning_count: 0",
+ `payload_sections: ${payloadSections2([])}`,
+ `provenance_summary: ${provenanceSummary("clean", db)}`
  ].join("\n");
  }
  if (!file.semantic_supported) {
@@ -4174,30 +4448,45 @@ async function fileChanges(filePath, compareAgainst = "HEAD") {
  `compare_against: ${compareAgainst}`,
  "scope: file",
  `summary: semantic diff unavailable for ${file.extension} files`,
- `next_action: ${ACTION.INSPECT_RAW_DIFF}`
+ `next_action: ${ACTION.INSPECT_RAW_DIFF}`,
+ `graph_enrichment: ${graphEnrichment}`,
+ "risk_summary_count: 0",
+ "removed_api_warning_count: 0",
+ `payload_sections: ${payloadSections2([])}`,
+ `provenance_summary: ${provenanceSummary("unsupported", db)}`
  ].join("\n");
  }
+ const relFile = getRelativePath(real) || file.path?.replace(/\\/g, "/");
+ const riskLines = summarizeGraphRisk(db, relFile, file);
+ const removedApiWarnings = file.removed_symbols.filter(exportedLooking).slice(0, 4);
+ const sectionKinds = [];
  const parts = [
  "status: CHANGED",
  `reason: ${REASON.FILE_CHANGED}`,
  `path: ${filePath}`,
  `compare_against: ${compareAgainst}`,
  "scope: file",
- `summary: ${symbolCountSummary(file)}`
+ `summary: ${symbolCountSummary(file)}`,
+ `graph_enrichment: ${graphEnrichment}`,
+ `risk_summary_count: ${riskLines.length}`,
+ `removed_api_warning_count: ${removedApiWarnings.length}`
  ];
  if (file.added_symbols.length) {
+ sectionKinds.push("added");
  parts.push(`next_action: ${ACTION.REVIEW_RISKS}`);
  parts.push("");
  parts.push("added:");
  for (const symbol of file.added_symbols) parts.push(` + ${symbol.start}-${symbol.end}: ${symbol.text}`);
  }
  if (file.removed_symbols.length) {
+ sectionKinds.push("removed");
  if (!parts.includes(`next_action: ${ACTION.REVIEW_RISKS}`)) parts.push(`next_action: ${ACTION.REVIEW_RISKS}`);
  parts.push("");
  parts.push("removed:");
  for (const symbol of file.removed_symbols) parts.push(` - ${symbol.start}-${symbol.end}: ${symbol.text}`);
  }
  if (file.modified_symbols.length) {
+ sectionKinds.push("modified");
  if (!parts.includes(`next_action: ${ACTION.REVIEW_RISKS}`)) parts.push(`next_action: ${ACTION.REVIEW_RISKS}`);
  parts.push("");
  parts.push("modified:");
@@ -4212,9 +4501,14 @@ async function fileChanges(filePath, compareAgainst = "HEAD") {
  parts.push("");
  parts.push("summary_detail: no symbol changes detected");
  }
- const relFile = getRelativePath(real) || file.path?.replace(/\\/g, "/");
- const riskLines = summarizeGraphRisk(db, relFile, file);
- const removedApiWarnings = file.removed_symbols.filter(exportedLooking).slice(0, 4);
+ if (riskLines.length > 0) sectionKinds.push("risk_summary");
+ if (removedApiWarnings.length > 0) sectionKinds.push("removed_api_warning");
+ parts.splice(
+ 9,
+ 0,
+ `payload_sections: ${payloadSections2(sectionKinds)}`,
+ `provenance_summary: ${provenanceSummary("semantic", db)}`
+ );
  if (riskLines.length || removedApiWarnings.length) {
  parts.push("");
  parts.push("risk_summary:");
@@ -4344,7 +4638,7 @@ OUTPUT_CAPPED: Output exceeded ${MAX_BULK_OUTPUT_CHARS} chars.`;
  }

  // server.mjs
- var version = true ? "1.15.0" : (await null).createRequire(import.meta.url)("./package.json").version;
+ var version = true ? "1.16.0" : (await null).createRequire(import.meta.url)("./package.json").version;
  var { server, StdioServerTransport } = await createServerRuntime({
  name: "hex-line-mcp",
  version
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@levnikolaevich/hex-line-mcp",
- "version": "1.15.0",
+ "version": "1.16.0",
  "mcpName": "io.github.levnikolaevich/hex-line-mcp",
  "type": "module",
  "description": "Hash-verified file editing MCP + token efficiency hook for AI coding agents. 9 tools: inspect_path, read, edit, write, grep, outline, verify, changes, bulk_replace.",