@levnikolaevich/hex-line-mcp 1.25.1 → 1.27.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -46,7 +46,7 @@ Advanced / occasional:

  | Event | Trigger | Action |
  |-------|---------|--------|
- | **PreToolUse** | Read/Edit/Write/Grep/Glob on project text scope | Hard redirect to hex-line for project-scoped text files and file discovery; built-in tools stay available for binary/media and text paths outside the current project root |
+ | **PreToolUse** | Read/Edit/Write/Grep/Glob on project text scope | Advises hex-line by default for project-scoped text files and file discovery; explicit `hooks.mode: "blocking"` hard redirects. Built-in tools stay available for binary/media, plan files in Plan Mode, and text paths outside the current project root |
  | **PreToolUse** | Bash with dangerous commands | Blocks `rm -rf /`, `git push --force`, etc. Agent must confirm with user |
  | **PostToolUse** | Bash with 50+ lines output | RTK: deduplicates, truncates, shows filtered summary to Claude as feedback |
  | **SessionStart** | Session begins | Injects a short bootstrap hint; defers to the active output style when `hex-line` style is enabled |
@@ -54,7 +54,7 @@ Advanced / occasional:

  ### Bash Redirects

- PreToolUse also intercepts project-scoped file inspection Bash commands: `cat`, `type`, `Get-Content`, `head`, `tail`, `ls`, `dir`, `tree`, `find`, `Get-ChildItem`, `stat`, `wc`, `Get-Item`, `grep`, `rg`, `findstr`, `Select-String`, `sed -i`. Targeted inspection pipelines are redirected too; Git/build/test/docker/network and stdin-filter pipelines remain allowed.
+ PreToolUse also intercepts project-scoped file inspection Bash commands: `cat`, `type`, `Get-Content`, `head`, `tail`, `ls`, `dir`, `tree`, `find`, `Get-ChildItem`, `stat`, `wc`, `Get-Item`, `grep`, `rg`, `findstr`, `Select-String`, `sed -i`. Targeted inspection pipelines receive advisory hints by default and are hard redirected only in explicit blocking mode; Git/build/test/docker/network and stdin-filter pipelines remain allowed.
  ## Install

  ### MCP Server
@@ -119,7 +119,7 @@ If a project already has `.hex-skills/codegraph/index.db`, `hex-line` automatica
  - Grep annotations are capped and count-suffixes are deduplicated so the first label keeps the counts and later labels stay short.
  - `read_file`, `outline`, and `grep_search` stay compact: they only surface high-signal local facts such as `api`, framework entrypoints, callers, flow, and clone hints.
  - `edit_file` and `changes` surface the deeper review layer: external callers, downstream return/property flow, clone peers, public API risk, framework entrypoint risk, and same-name sibling warnings when present.
- - Both tools now expose a short top-level preview before the detailed sections: `payload_sections`, `graph_enrichment`, count fields, and `provenance_summary`.
+ - Both tools surface structured sections directly; debug markers (`payload_sections`, `graph_enrichment` preview, count fields, `provenance_summary`) were removed per PROTOCOL.md §Response grammar.
  - Stale graph data never changes correctness. `hex-line` suppresses stale hints, schedules a best-effort `reindexFile(...)` for a stale file, and escalates to one background `indexProject(...)` when several files go stale inside the burst window.
  - The threshold-crossing file still keeps its point refresh so a short stale burst does not create a blind gap before the project refresh finishes.
  - `edit_file` may still use stale graph context as same-response advisory metadata when the file was just edited, but stale graph never gates the edit protocol itself.
@@ -176,8 +176,7 @@ Read a file with progressive disclosure. Default mode is discovery-first: plain
  Default output is discovery-first:

  ```text
- File: lib/search.mjs
- meta: 282 lines, 10.2KB, 2 hours ago
+ meta: lines=282 size=10240 mtime=2026-04-20T15:00:00.000Z
  continuation: {"kind":"offset","offset":4,"limit":200}

  1|import { resolve } from "node:path";
@@ -188,23 +187,23 @@ continuation: {"kind":"offset","offset":4,"limit":200}
  Explicit edit-ready output:

  ```
- File: lib/search.mjs
- meta: 282 lines, 10.2KB, 2 hours ago
+ meta: lines=282 size=10240 mtime=2026-04-20T15:00:00.000Z
  revision: rev-12-a1b2c3d4
  file: 1-282:beefcafe
- eol: lf
- trailing_newline: true

  block: read_range
  span: 1-3
- eol: lf
- trailing_newline: true
- ab.1 import { resolve } from "node:path";
- cd.2 import { readFileSync } from "node:fs";
- ef.3 ...
+ ab.1\timport { resolve } from "node:path";
+ cd.2\timport { readFileSync } from "node:fs";
+ ef.3\t...
  checksum: 1-3:f7e2a1b0
  ```

+ Note: default `eol=lf` and `trailing_newline=true` are assumed when those keys
+ are absent from block meta. Non-default values (e.g. `eol: crlf`) are still
+ emitted explicitly. The `File: <path>` header was dropped — path is already in
+ the caller's input.
+
  ### edit_file

  Edit using revision-aware hash-verified anchors. Prefer one batched call per file. For text rename use bulk_replace.
@@ -243,9 +242,7 @@ Result footer includes:
  - `reason: ...` as the canonical machine-readable cause for the current status
  - `revision: ...`
  - `summary: ...` with edited line span / diff counts on successful edits
- - `payload_sections: ...` so callers know which detailed sections follow
- - `graph_enrichment: unavailable` + `graph_fix: ...` when the project graph is missing or out of sync (silent when graph is healthy)
- - `⚠ Semantic impact:` and `⚠ N clone(s):` blocks emitted directly when graph-backed review data is available
+ - `#semantic_impact` sections and `!clone_siblings count=N list=...` lines emitted directly when graph-backed review data is available
  - `changed_ranges: ...` when relevant
  - `recovery_ranges: ...` with the narrowest recommended `read_file` ranges for retry
  - `next_action: ...` as the canonical immediate choice: `apply_retry_edit`, `apply_retry_batch`, or `reread_then_retry`
@@ -323,15 +320,14 @@ Example output:

  ```text
  status: STALE
- reason: checksums_stale
  revision: rev-17-deadbeef
- summary: valid=0 stale=1 invalid=0
+ summary: valid=0 stale=1
  next_action: reread_ranges
  base_revision: rev-16-feedcafe
  changed_ranges: 10-12(replace)
  suggested_read_call: {"tool":"mcp__hex-line__read_file","arguments":{"file_path":"/repo/file.ts","ranges":["10-12"]}}

- entry: 1/1 | status: STALE | span: 10-12 | checksum: 10-12:oldc0de0 | current_checksum: 10-12:newc0de0 | next_action: reread_range | summary: content changed since checksum capture
+ entry: 1/1 STALE span: 10-12 checksum: 10-12:oldc0de0 current_checksum: 10-12:newc0de0
  ```

  ### inspect_path
@@ -360,11 +356,11 @@ The unified hook (`hook.mjs`) handles five Claude hook events:

  ### PreToolUse: Tool Redirect

- Hard-routes built-in `Read`, `Edit`, `Write`, `Grep`, and `Glob` on project-scoped text workflows to hex-line. Binary/media files (images, PDFs, notebooks, archives, executables, fonts, media) and text paths outside the current project root stay on built-in tools.
+ Advises built-in `Read`, `Edit`, `Write`, `Grep`, and `Glob` on project-scoped text workflows to use hex-line. Missing or malformed hook state defaults to advisory mode; set `.hex-skills/environment_state.json` to `hooks.mode: "blocking"` to hard-route covered calls. Binary/media files (images, PDFs, notebooks, archives, executables, fonts, media), plan markdown files in Plan Mode, and text paths outside the current project root stay on built-in tools.

  ### PreToolUse: Bash Redirect + Dangerous Blocker

- Intercepts project-scoped Bash file inspection commands (`cat`, `type`, `Get-Content`, `head`, `tail`, `ls`, `dir`, `tree`, `find`, `Get-ChildItem`, `stat`, `wc`, `Get-Item`, `grep`, `rg`, `findstr`, `Select-String`, `sed -i`, etc.) and redirects covered cases to hex-line tools. Targeted inspection pipelines are redirected. Dangerous commands (`rm -rf /`, `git push --force`, `git reset --hard`, `DROP TABLE`, `chmod 777`, `mkfs`, `dd`) are blocked.
+ Intercepts project-scoped Bash file inspection commands (`cat`, `type`, `Get-Content`, `head`, `tail`, `ls`, `dir`, `tree`, `find`, `Get-ChildItem`, `stat`, `wc`, `Get-Item`, `grep`, `rg`, `findstr`, `Select-String`, `sed -i`, etc.) and advises hex-line tools by default. Explicit blocking mode hard redirects covered cases. Dangerous commands (`rm -rf /`, `git push --force`, `git reset --hard`, `DROP TABLE`, `chmod 777`, `mkfs`, `dd`) are always blocked.

  ### PostToolUse: RTK Output Filter

package/dist/hook.mjs CHANGED
@@ -207,6 +207,15 @@ function extOf(filePath) {
  function getFilePath(toolInput) {
  return toolInput.file_path || toolInput.path || "";
  }
+ function getMutatingTargetPath(toolInput) {
+ return toolInput.file_path || toolInput.path || "";
+ }
+ function isPlanMarkdownFile(filePath) {
+ if (!filePath) return false;
+ const normalized = filePath.replace(/\\/g, "/");
+ const name = normalized.slice(normalized.lastIndexOf("/") + 1);
+ return /^plan_.*\.md$/i.test(name);
+ }
  function getGlobScopePath(toolInput) {
  if (toolInput.path) return toolInput.path;
  const pattern = typeof toolInput.pattern === "string" ? toolInput.pattern : "";
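The new plan-file exemption hinges on the basename check above. A standalone restatement of `isPlanMarkdownFile`, for illustration only: after normalizing Windows separators, a file counts as a plan file when its basename matches `plan_*.md`, case-insensitively.

```javascript
// Illustrative restatement of the helper added in this hunk (not the
// shipped module): strips the directory part, then applies the same
// anchored, case-insensitive plan_*.md test.
function isPlanMarkdownFile(filePath) {
  if (!filePath) return false;
  const normalized = filePath.replace(/\\/g, "/"); // Windows paths too
  const name = normalized.slice(normalized.lastIndexOf("/") + 1);
  return /^plan_.*\.md$/i.test(name);
}
```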
@@ -390,11 +399,11 @@ function _resetHexLineDisabledCache() {
  var _hookMode;
  function getHookMode() {
  if (_hookMode !== void 0) return _hookMode;
- _hookMode = "blocking";
+ _hookMode = "advisory";
  try {
  const stateFile = resolve(process.cwd(), ".hex-skills/environment_state.json");
  const data = JSON.parse(readFileSync(stateFile, "utf-8"));
- if (data.hooks?.mode === "advisory") _hookMode = "advisory";
+ if (data.hooks?.mode === "blocking") _hookMode = "blocking";
  } catch {
  }
  return _hookMode;
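The mode flip above can be sketched as a small standalone function. This is an illustrative mirror of the changed `getHookMode` logic, not the shipped implementation; the state-file path and `hooks.mode` key are taken from the diff, and the caching variable is dropped for clarity.

```javascript
import { readFileSync } from "node:fs";
import { resolve } from "node:path";

// Sketch: the default is now "advisory"; only an explicit
// hooks.mode of "blocking" in .hex-skills/environment_state.json
// switches to hard redirects. Any read or parse failure keeps the
// advisory default, so a missing or malformed state file never blocks.
function resolveHookMode(cwd = process.cwd()) {
  let mode = "advisory";
  try {
    const stateFile = resolve(cwd, ".hex-skills/environment_state.json");
    const data = JSON.parse(readFileSync(stateFile, "utf-8"));
    if (data.hooks?.mode === "blocking") mode = "blocking";
  } catch {
    // stay advisory
  }
  return mode;
}
```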
@@ -443,7 +452,7 @@ function handlePreToolUse(data) {
  const toolName = data.tool_name || "";
  const toolInput = data.tool_input || {};
  if (data.permission_mode === "plan" && HEX_LINE_MUTATING.has(toolName)) {
- const targetPath = (toolInput.path || "").replace(/\\/g, "/");
+ const targetPath = getMutatingTargetPath(toolInput).replace(/\\/g, "/");
  const isPlanSafe = PLAN_SAFE_FOLDERS.some((folder) => targetPath.includes(folder));
  if (!isPlanSafe) {
  block(
@@ -467,14 +476,22 @@ function handlePreToolUse(data) {
  }
  if (toolName === "Read") {
  const ext = filePath ? extOf(filePath) : "";
+ if (data.permission_mode === "plan" && isPlanMarkdownFile(filePath)) {
+ process.exit(0);
+ }
  const rangeHint = isPartialRead(toolInput) ? " Preserve the same offset/limit or ranges." : "";
  const outlineTip = filePath && OUTLINEABLE_EXT.has(ext) ? ` For structure-first discovery: mcp__hex-line__outline then mcp__hex-line__read_file with ranges.` : "";
- const target = filePath ? `Use mcp__hex-line__read_file(file_path="${filePath}").${rangeHint}${outlineTip}` : "Use mcp__hex-line__read_file or mcp__hex-line__inspect_path.";
+ const editReadyTip = ' If preparing an edit, call mcp__hex-line__read_file with edit_ready=true and verbosity="full" to get anchors and checksums.';
+ const target = filePath ? `Use mcp__hex-line__read_file(file_path="${filePath}").${rangeHint}${outlineTip}${editReadyTip}` : "Use mcp__hex-line__read_file or mcp__hex-line__inspect_path.";
  redirect(target, DEFERRED_HINT);
  }
  if (toolName === "Edit") {
- const target = filePath ? `Use mcp__hex-line__edit_file(file_path="${filePath}"). If you need hash anchors first: mcp__hex-line__grep_search(output_mode="content", edit_ready=true).` : 'Use mcp__hex-line__edit_file. If you need hash anchors first: mcp__hex-line__grep_search(output_mode="content", edit_ready=true).';
- redirect(target, "Hash-verified edits for project text files.\n" + DEFERRED_HINT);
+ if (data.permission_mode === "plan" && isPlanMarkdownFile(filePath)) {
+ process.exit(0);
+ }
+ const editShape = 'edits=[{"set_line":{"anchor":"ab.12","new_text":"replacement"}}]';
+ const target = filePath ? `Use mcp__hex-line__edit_file(file_path="${filePath}", ${editShape}). If you need anchors first: mcp__hex-line__read_file(file_path="${filePath}", edit_ready=true, verbosity="full") or mcp__hex-line__grep_search(output_mode="content", edit_ready=true).` : `Use mcp__hex-line__edit_file(${editShape}). If you need anchors first: mcp__hex-line__grep_search(output_mode="content", edit_ready=true).`;
+ redirect(target, "Hash-verified edits for project text files. In default advisory mode, built-in Edit remains available as fallback.\n" + DEFERRED_HINT);
  }
  if (toolName === "Write") {
  const pathNote = filePath ? ` with file_path="${filePath}"` : "";
package/dist/server.mjs CHANGED
@@ -193,7 +193,6 @@ function listDirectory(dirPath, opts = {}) {
  return { text: lines.join("\n"), total };
  }
  var MAX_OUTPUT_CHARS = 8e4;
- var MAX_DIFF_CHARS = 3e4;
  var MAX_BULK_OUTPUT_CHARS = 3e4;
  var MAX_PER_FILE_DIFF_LINES = 50;
  function readText(filePath) {
@@ -470,40 +469,47 @@ function getGraphDBForProject(directoryPath) {
  const dbPath = join3(projectRoot, ".hex-skills/codegraph", "index.db");
  return _dbs.get(dbPath) || null;
  }
+ var _graphFixSeen = /* @__PURE__ */ new Set();
  function graphUnavailableHint(filePath) {
  const { reason, projectRoot } = diagnoseGraph(filePath);
  if (reason === "ok" || reason === "file_not_indexed") return [];
+ const throttleKey = `${projectRoot || "_"}:${reason}`;
+ if (_graphFixSeen.has(throttleKey)) return [];
+ _graphFixSeen.add(throttleKey);
  const at = projectRoot ? ` at ${projectRoot.replace(/\\/g, "/")}` : "";
  switch (reason) {
  case "driver_missing":
- return ["graph_enrichment: unavailable", "graph_fix: install better-sqlite3 in hex-line-mcp package"];
+ return ["graph_fix: install better-sqlite3 in hex-line-mcp package"];
  case "no_project_root":
- return ["graph_enrichment: unavailable", "graph_fix: file is outside any project root (no package.json / pyproject.toml / .git marker)"];
+ return ["graph_fix: file is outside any project root (no package.json / pyproject.toml / .git marker)"];
  case "index_missing":
- return ["graph_enrichment: unavailable", `graph_fix: run mcp__hex-graph__index_project${at}`];
+ return [`graph_fix: run mcp__hex-graph__index_project${at}`];
  case "contract_mismatch":
- return ["graph_enrichment: unavailable", `graph_fix: index built by incompatible hex-graph version; re-run mcp__hex-graph__index_project${at}`];
+ return [`graph_fix: index built by incompatible hex-graph version; re-run mcp__hex-graph__index_project${at}`];
  case "stale":
- return ["graph_enrichment: unavailable", `graph_fix: file modified after last index; re-run mcp__hex-graph__index_project${at} or wait for background refresh`];
+ return [`graph_fix: file modified after last index; re-run mcp__hex-graph__index_project${at} or wait for background refresh`];
  default:
- return ["graph_enrichment: unavailable"];
+ return [];
  }
  }
  function graphUnavailableHintForProject(projectRoot) {
  const { reason } = diagnoseGraphForProject(projectRoot);
  if (reason === "ok") return [];
+ const throttleKey = `${projectRoot || "_"}:${reason}`;
+ if (_graphFixSeen.has(throttleKey)) return [];
+ _graphFixSeen.add(throttleKey);
  const at = projectRoot ? ` at ${projectRoot.replace(/\\/g, "/")}` : "";
  switch (reason) {
  case "driver_missing":
- return ["graph_enrichment: unavailable", "graph_fix: install better-sqlite3 in hex-line-mcp package"];
+ return ["graph_fix: install better-sqlite3 in hex-line-mcp package"];
  case "no_project_root":
- return ["graph_enrichment: unavailable", "graph_fix: directory is outside any project root"];
+ return ["graph_fix: directory is outside any project root"];
  case "index_missing":
- return ["graph_enrichment: unavailable", `graph_fix: run mcp__hex-graph__index_project${at}`];
+ return [`graph_fix: run mcp__hex-graph__index_project${at}`];
  case "contract_mismatch":
- return ["graph_enrichment: unavailable", `graph_fix: index built by incompatible hex-graph version; re-run mcp__hex-graph__index_project${at}`];
+ return [`graph_fix: index built by incompatible hex-graph version; re-run mcp__hex-graph__index_project${at}`];
  default:
- return ["graph_enrichment: unavailable"];
+ return [];
  }
  }
  function validateContract(db) {
@@ -603,20 +609,6 @@ function ensureGraphFreshForFile(db, absoluteFilePath) {
  return true;
  }
  }
- function isGraphFreshAtMtime(db, absoluteFilePath, mtimeMs) {
- if (!db) return false;
- try {
- const projectRoot = findProjectRoot(absoluteFilePath);
- if (!projectRoot) return true;
- const relativeFile = normalizeRelativeFile(projectRoot, absoluteFilePath);
- if (!relativeFile) return true;
- const indexedMtime = lookupIndexedMtime(db, relativeFile);
- if (indexedMtime == null) return false;
- return mtimeMs <= indexedMtime + FRESHNESS_TOLERANCE_MS;
- } catch {
- return true;
- }
- }
  function fileAnnotations(db, file, { startLine = null, endLine = null, limit = 8 } = {}) {
  try {
  const hasRange = Number.isInteger(startLine) && Number.isInteger(endLine);
@@ -1290,7 +1282,12 @@ function renderRequestedSpan(block) {
  return `requested_span: ${block.requestedStartLine}-${block.requestedEndLine}`;
  }
  function renderMetaLines(meta = {}) {
- return Object.entries(meta).filter(([, value]) => value !== void 0 && value !== null && value !== "").map(([key, value]) => `${key}: ${value}`);
+ return Object.entries(meta).filter(([key, value]) => {
+ if (value === void 0 || value === null || value === "") return false;
+ if (key === "eol" && value === "lf") return false;
+ if (key === "trailing_newline" && value === true) return false;
+ return true;
+ }).map(([key, value]) => `${key}: ${value}`);
  }
  function renderBaseEntry(entry, plain = false) {
  return plain ? `${entry.lineNumber}|${entry.text}` : `${entry.tag}.${entry.lineNumber} ${entry.text}`;
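The `renderMetaLines` change above implements the default-suppression rule described in the README hunk: `eol=lf` and `trailing_newline=true` are treated as assumed defaults and dropped from block meta. A standalone restatement of that filter, for illustration:

```javascript
// Sketch mirroring the changed renderMetaLines: empty/null values are
// dropped as before, and the two default values (eol=lf,
// trailing_newline=true) are now suppressed so meta only carries
// non-default facts such as eol=crlf.
function renderMetaLines(meta = {}) {
  return Object.entries(meta)
    .filter(([key, value]) => {
      if (value === undefined || value === null || value === "") return false;
      if (key === "eol" && value === "lf") return false;            // default
      if (key === "trailing_newline" && value === true) return false; // default
      return true;
    })
    .map(([key, value]) => `${key}: ${value}`);
}
```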
@@ -1376,10 +1373,10 @@ function serializeReadBlock(block, opts = {}) {
  function serializeSearchBlock(block, opts = {}) {
  if (block.type !== "edit_ready_block") return serializeDiagnosticBlock(block);
  const lines = [
- `block: ${block.kind}`,
- `file: ${block.path}`,
- `span: ${block.startLine}-${block.endLine}`
+ `block: ${block.kind}`
  ];
+ if (!opts.skipFile) lines.push(`file: ${block.path}`);
+ lines.push(`span: ${block.startLine}-${block.endLine}`);
  const requestedSpan = renderRequestedSpan(block);
  if (requestedSpan) lines.push(requestedSpan);
  if (Array.isArray(block.meta.matchLines) && block.meta.matchLines.length > 0) {
@@ -1558,9 +1555,7 @@ function readFile2(filePath, opts = {}) {
  if (normalized.type === "diagnostic_block") diagnostics.push(normalized);
  else normalizedRanges.push(normalized);
  }
- const sizeText = formatSize(stat.size);
- const ago = relativeTime(stat.mtime);
- let meta = `${total} lines, ${sizeText}, ${ago}`;
+ let meta = `lines=${total} size=${stat.size} mtime=${stat.mtime.toISOString()}`;
  const rangeForMeta = normalizedRanges.length === 1 ? normalizedRanges[0] : null;
  if (rangeForMeta) {
  if (rangeForMeta.startLine > 1 || rangeForMeta.endLine < total) meta += `, showing ${rangeForMeta.startLine}-${rangeForMeta.endLine}`;
@@ -1615,8 +1610,7 @@ Graph: ${items.join(" | ")}`;
  const body = [sections.join("\n\n"), diagnosticsText].filter(Boolean).join("\n\n");
  const revisionLine = verbosity === "compact" ? `revision: ${snapshot.revision}
  ` : "";
- return `File: ${filePath}${graphLine}
- meta: ${meta}
+ return `meta: ${meta}${graphLine}
  ${revisionLine}${buildContinuation({ nextOffset, limit: opts.limit && opts.limit > 0 ? opts.limit : DEFAULT_LIMIT })}
  ${body}`.trim();
  }
@@ -1652,12 +1646,9 @@ ${body}`.trim();
  requestedEndLine: cappedAtLine
  })));
  }
- return `File: ${filePath}${graphLine}
- meta: ${meta}
+ return `meta: ${meta}${graphLine}
  revision: ${snapshot.revision}
  file: ${snapshot.fileChecksum}
- eol: ${snapshot.eol}
- trailing_newline: ${snapshot.trailingNewline}

  ${serializedBlocks.join("\n\n")}`.trim();
  }
@@ -1746,15 +1737,64 @@ function validateChecksumCoverage(rangeChecksum2, actualStart, actualEnd) {
  }
  return { ok: true, start, end, reason: null };
  }
+ var AMBIGUOUS_DELIMITER_RE = /^[)\]}](?:[;,])?$|^[)\]}]{2,}[;,]?$|^\}\);?$|^\}\)\);?$/;
+ var KNOWN_BLOCK_CLOSERS = /* @__PURE__ */ new Set(["end", "esac", "loop", "endif", "endfor", "endwhile"]);
+ function isAmbiguousDelimiter(line) {
+ const trimmed = String(line ?? "").trim();
+ if (!trimmed) return false;
+ if (AMBIGUOUS_DELIMITER_RE.test(trimmed)) return true;
+ if (KNOWN_BLOCK_CLOSERS.has(trimmed)) return true;
+ return false;
+ }
+ function countDelimChars(lines) {
+ let open = 0, close = 0, paren = 0, cparen = 0, bracket = 0, cbracket = 0;
+ for (const line of lines) {
+ const s = String(line ?? "");
+ for (let i = 0; i < s.length; i++) {
+ const ch = s.charCodeAt(i);
+ if (ch === 123) open++;
+ else if (ch === 125) close++;
+ else if (ch === 40) paren++;
+ else if (ch === 41) cparen++;
+ else if (ch === 91) bracket++;
+ else if (ch === 93) cbracket++;
+ }
+ }
+ return { open, close, paren, cparen, bracket, cbracket };
+ }
+ function lexicalBraceCounts(lines) {
+ return countDelimChars(lines);
+ }
+ function isFileGloballyBalanced(counts) {
+ return counts.open === counts.close && counts.paren === counts.cparen && counts.bracket === counts.cbracket;
+ }
+ function braceDelta(origLines, newLines) {
+ const before = countDelimChars(origLines);
+ const after = countDelimChars(newLines);
+ const bd = after.open - after.close - (before.open - before.close);
+ const pd = after.paren - after.cparen - (before.paren - before.cparen);
+ const kd = after.bracket - after.cbracket - (before.bracket - before.cbracket);
+ return {
+ brace: bd,
+ paren: pd,
+ bracket: kd,
+ totalAbs: Math.abs(bd) + Math.abs(pd) + Math.abs(kd)
+ };
+ }

  // lib/output-contract.mjs
  var STATUS = {
  OK: "OK",
+ ERROR: "ERROR",
  AUTO_REBASED: "AUTO_REBASED",
  CONFLICT: "CONFLICT",
  STALE: "STALE",
- INVALID: "INVALID"
+ INVALID: "INVALID",
+ NO_CHANGES: "NO_CHANGES",
+ CHANGED: "CHANGED",
+ UNSUPPORTED: "UNSUPPORTED"
  };
+ var STATUS_VALUES = Object.freeze(Object.values(STATUS));
  var ACTION = {
  APPLY_RETRY_EDIT: "apply_retry_edit",
  APPLY_RETRY_BATCH: "apply_retry_batch",
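The delimiter-ambiguity machinery added in this hunk keys off one predicate: a line whose entire trimmed content is a closing-delimiter run (`)`, `]`, `});`, ...) or a known keyword closer (`end`, `esac`, ...) hashes ambiguously, because many sibling lines share that exact content. A standalone restatement of `isAmbiguousDelimiter` with the regex copied from the diff:

```javascript
// Illustrative restatement of the classifier added above: true means the
// line is a "lone delimiter" whose content hash could match a sibling
// closer elsewhere in the file.
const AMBIGUOUS_DELIMITER_RE = /^[)\]}](?:[;,])?$|^[)\]}]{2,}[;,]?$|^\}\);?$|^\}\)\);?$/;
const KNOWN_BLOCK_CLOSERS = new Set(["end", "esac", "loop", "endif", "endfor", "endwhile"]);

function isAmbiguousDelimiter(line) {
  const trimmed = String(line ?? "").trim();
  if (!trimmed) return false;
  return AMBIGUOUS_DELIMITER_RE.test(trimmed) || KNOWN_BLOCK_CLOSERS.has(trimmed);
}
```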
@@ -1772,6 +1812,7 @@ var ACTION = {
  INSPECT_RAW_DIFF: "inspect_raw_diff",
  REVIEW_RISKS: "review_risks"
  };
+ var ACTION_VALUES = Object.freeze(Object.values(ACTION));
  var REASON = {
  EDIT_APPLIED: "edit_applied",
  EDIT_AUTO_REBASED: "edit_auto_rebased",
@@ -1784,7 +1825,10 @@ var REASON = {
  FILE_CHANGED: "file_changed",
  FILE_UNCHANGED: "file_unchanged",
  SEMANTIC_DIFF_UNSUPPORTED: "semantic_diff_unsupported",
- BATCH_CONFLICT: "batch_conflict"
+ BATCH_CONFLICT: "batch_conflict",
+ MISSING_ANCHOR: "missing_anchor",
+ CHECKSUM_COVERAGE_MISMATCH: "checksum_coverage_mismatch",
+ OVERLAP_DETECTED: "overlap_detected"
  };

  // lib/edit.mjs
@@ -2128,7 +2172,7 @@ function buildSuggestedReadCall(filePath, recoveryRanges) {
  return JSON.stringify({
  tool: "mcp__hex-line__read_file",
  arguments: {
- path: filePath,
+ file_path: filePath,
  ranges: dedupeRanges(recoveryRanges)
  }
  });
@@ -2182,13 +2226,6 @@ function deriveNextAction({ retryEdit = null, retryEdits = null, suggestedReadCa
  function summarizeChangedSpan(minLine, maxLine) {
  return Number.isFinite(minLine) && Number.isFinite(maxLine) && minLine !== Infinity && maxLine > 0 ? `${minLine}-${maxLine}` : "unknown";
  }
- function countDiffEntries(fullDiff) {
- if (!fullDiff) return 0;
- return fullDiff.split("\n").filter((line) => /^[+-]\d+\|/.test(line)).length;
- }
- function payloadSections(sections) {
- return sections.length > 0 ? sections.join(",") : "summary_only";
- }
  function buildRetryEdit(edit, lines, options = {}) {
  const retryChecksum = options.retryChecksum || null;
  if (edit.set_line) {
@@ -2663,7 +2700,18 @@ Recovery: read_file path ranges=["${csStart}-${csEnd}"], then retry edit with fr
  return null;
  }
  function applyReplaceBetweenEdit(edit, ctx) {
- const { lines, lineEndings, defaultEol, opts, locateOrConflict, ensureRevisionContext } = ctx;
+ const {
+ lines,
+ lineEndings,
+ defaultEol,
+ opts,
+ locateOrConflict,
+ ensureRevisionContext,
+ currentSnapshot,
+ buildStrictChecksumMismatchError,
+ conflictIfNeeded,
+ origLines
+ } = ctx;
  const boundaryMode = edit.replace_between.boundary_mode || "inclusive";
  if (boundaryMode !== "inclusive" && boundaryMode !== "exclusive") {
  throw new Error(`BAD_INPUT: replace_between boundary_mode must be inclusive or exclusive, got ${boundaryMode}`);
@@ -2685,6 +2733,73 @@ function applyReplaceBetweenEdit(edit, ctx) {
  endAnchor: `${lineTag(fnv1a(lines[endIdx]))}.${endIdx + 1}`
  }));
  if (conflict) return conflict;
+ const rangeChecksum2 = edit.replace_between.range_checksum;
+ const startAmbiguous = isAmbiguousDelimiter(lines[startIdx]);
+ const endAmbiguous = isAmbiguousDelimiter(lines[endIdx]);
+ const bothAmbiguous = startAmbiguous && endAmbiguous;
+ const actualStart = targetRange.start;
+ const actualEnd = targetRange.end;
+ if (rangeChecksum2) {
+ const coverage = validateChecksumCoverage(rangeChecksum2, actualStart, actualEnd);
+ if (!coverage.ok) {
+ const snip = buildErrorSnippet(origLines, actualStart - 1);
+ const retryChecksum = buildRangeChecksum(currentSnapshot, actualStart, actualEnd);
+ throw new Error(
+ `${coverage.reason}
+
+ Current content (lines ${snip.start}-${snip.end}):
+ ${snip.text}
+
+ ` + (retryChecksum ? `Retry checksum: ${retryChecksum}` : "Tip: Use updated hashes above for retry.")
+ );
+ }
+ const { start: csStart, end: csEnd, hex: csHex } = parseChecksum(rangeChecksum2);
+ const actual = buildRangeChecksum(currentSnapshot, csStart, csEnd);
+ const actualHex = actual?.split(":")[1];
+ if (!actual || csHex !== actualHex) {
+ const exactRetryChecksum = buildRangeChecksum(currentSnapshot, actualStart, actualEnd);
+ const details = `CHECKSUM_MISMATCH: expected ${rangeChecksum2}, got ${actual}. Content at lines ${csStart}-${csEnd} differs from when you read it.
+ Recovery: read_file path ranges=["${csStart}-${csEnd}"], then retry edit with fresh checksum.`;
+ if (opts.conflictPolicy !== "strict") {
+ return conflictIfNeeded(
+ "stale_checksum",
+ csStart - 1,
+ exactRetryChecksum || actual,
+ details,
+ [`${actualStart}-${actualEnd}`],
+ buildRetryEdit(edit, lines, {
+ startAnchor: `${lineTag(fnv1a(lines[startIdx]))}.${actualStart}`,
+ endAnchor: `${lineTag(fnv1a(lines[endIdx]))}.${actualEnd}`,
+ retryChecksum: exactRetryChecksum || actual
+ })
+ );
+ }
+ throw buildStrictChecksumMismatchError(details, csStart, actual);
+ }
+ } else if (bothAmbiguous) {
+ if (opts.conflictPolicy !== "strict") {
+ ctx.warnings.push({
+ code: "lone_delimiter_anchors",
+ suggestion: "provide range_checksum or use replace_lines",
+ start_anchor_line: startIdx + 1,
+ end_anchor_line: endIdx + 1
+ });
+ } else {
+ const retryChecksum = buildRangeChecksum(currentSnapshot, actualStart, actualEnd);
+ return conflictIfNeeded(
+ "lone_delimiter_anchors",
+ startIdx,
+ retryChecksum,
+ `Both replace_between anchors are lone delimiters (line-content hashes are ambiguous and may match a sibling closing delimiter). Provide range_checksum to disambiguate, or use replace_lines with range_checksum.`,
+ [`${actualStart}-${actualEnd}`],
+ buildRetryEdit(edit, lines, {
+ startAnchor: `${lineTag(fnv1a(lines[startIdx]))}.${startIdx + 1}`,
+ endAnchor: `${lineTag(fnv1a(lines[endIdx]))}.${endIdx + 1}`,
+ retryChecksum
+ })
+ );
+ }
+ }
  const txt = edit.replace_between.new_text;
  let newLines = sanitizeEditText(txt ?? "").split("\n");
  const sliceStart = boundaryMode === "exclusive" ? startIdx + 1 : startIdx;
@@ -2697,23 +2812,61 @@ function applyReplaceBetweenEdit(edit, ctx) {
2697
2812
  return null;
2698
2813
  }
2699
2814
  replaceLogicalRange(lines, lineEndings, sliceStart, sliceStart + removeCount - 1, newLines, defaultEol);
2815
+ let boundaryEchoSkipped = false;
2700
2816
  const insertEnd = sliceStart + newLines.length;
2701
2817
  if (newLines.length > 0 && insertEnd < lines.length) {
2702
2818
  const lastNew = newLines[newLines.length - 1].trim();
2703
2819
  const firstAfter = lines[insertEnd].trim();
2704
2820
  if (lastNew && lastNew === firstAfter) {
2705
- lines.splice(insertEnd, 1);
2706
- lineEndings.splice(insertEnd, 1);
2707
- ctx.corrections.push({ type: "tail_echo", line: insertEnd + 1, content: firstAfter });
2821
+ if (isAmbiguousDelimiter(firstAfter)) {
2822
+ boundaryEchoSkipped = true;
2823
+ ctx.warnings.push({
2824
+ code: "boundary_echo_skipped",
2825
+ position: "tail",
2826
+ line: insertEnd + 1,
2827
+ content: firstAfter
2828
+ });
2829
+ } else {
2830
+ lines.splice(insertEnd, 1);
2831
+ lineEndings.splice(insertEnd, 1);
2832
+ ctx.corrections.push({ type: "tail_echo", line: insertEnd + 1, content: firstAfter });
2833
+ }
2708
2834
  }
2709
2835
  }
2710
2836
  if (newLines.length > 0 && sliceStart > 0) {
2711
2837
  const firstNew = newLines[0].trim();
2712
2838
  const lineBefore = lines[sliceStart - 1].trim();
2713
2839
  if (firstNew && firstNew === lineBefore) {
2714
- lines.splice(sliceStart - 1, 1);
2715
- lineEndings.splice(sliceStart - 1, 1);
2716
- ctx.corrections.push({ type: "head_echo", line: sliceStart, content: lineBefore });
2840
+ if (isAmbiguousDelimiter(lineBefore)) {
2841
+ boundaryEchoSkipped = true;
2842
+ ctx.warnings.push({
2843
+ code: "boundary_echo_skipped",
2844
+ position: "head",
2845
+ line: sliceStart,
2846
+ content: lineBefore
2847
+ });
2848
+ } else {
2849
+ lines.splice(sliceStart - 1, 1);
2850
+ lineEndings.splice(sliceStart - 1, 1);
2851
+ ctx.corrections.push({ type: "head_echo", line: sliceStart, content: lineBefore });
2852
+ }
2853
+ }
2854
+ }
2855
+ const delta = braceDelta(origRange, newLines);
2856
+ if (delta.totalAbs >= 2) {
2857
+ const preCounts = lexicalBraceCounts(origLines);
2858
+ if (isFileGloballyBalanced(preCounts)) {
2859
+ const postCounts = lexicalBraceCounts(lines);
2860
+ if (!isFileGloballyBalanced(postCounts)) {
2861
+ const hasStructuralSignal = startAmbiguous || endAmbiguous || boundaryEchoSkipped || newLines.some((l) => isAmbiguousDelimiter(l)) || origRange.some((l) => isAmbiguousDelimiter(l));
2862
+ if (hasStructuralSignal) {
2863
+ ctx.warnings.push({
2864
+ code: "brace_imbalance",
2865
+ delta: { brace: delta.brace, paren: delta.paren, bracket: delta.bracket },
2866
+ range: [sliceStart + 1, sliceStart + newLines.length]
2867
+ });
2868
+ }
2869
+ }
2717
2870
  }
2718
2871
  }
2719
2872
  return null;
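The `brace_imbalance` warning above fires when a replacement changes the net delimiter count by two or more and turns a globally balanced file into an unbalanced one. A simplified sketch of that guard — the real `lexicalBraceCounts` is presumably lexical (string/comment aware), whereas this version just counts raw characters, which is the stated assumption:

```javascript
// Net {, (, [ counts over an array of lines (naive, not lexer-aware).
function counts(lines) {
  const c = { brace: 0, paren: 0, bracket: 0 };
  for (const line of lines) {
    for (const ch of line) {
      if (ch === "{") c.brace++; else if (ch === "}") c.brace--;
      else if (ch === "(") c.paren++; else if (ch === ")") c.paren--;
      else if (ch === "[") c.bracket++; else if (ch === "]") c.bracket--;
    }
  }
  return c;
}
const balanced = (c) => c.brace === 0 && c.paren === 0 && c.bracket === 0;

// Delta between the removed range and its replacement.
function braceDelta(oldRange, newRange) {
  const a = counts(oldRange), b = counts(newRange);
  const d = { brace: b.brace - a.brace, paren: b.paren - a.paren, bracket: b.bracket - a.bracket };
  d.totalAbs = Math.abs(d.brace) + Math.abs(d.paren) + Math.abs(d.bracket);
  return d;
}

const before = ["function f() {", "  if (x) {", "    run();", "  }", "}"];
const oldRange = ["  }", "}"];              // the two closers being replaced
const newRange = ["  return run();"];       // replacement silently drops both
const d = braceDelta(oldRange, newRange);   // d.brace === 2, totalAbs === 2
const after = ["function f() {", "  if (x) {", "    run();", "  return run();"];
// This is exactly the warning condition: big delta, file was balanced, now isn't.
console.log(d.totalAbs >= 2 && balanced(counts(before)) && !balanced(counts(after))); // -> true
```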
@@ -2856,7 +3009,8 @@ ${snip.text}`;
2856
3009
  opts,
2857
3010
  origLines,
2858
3011
  staleRevision,
2859
- corrections: []
3012
+ corrections: [],
3013
+ warnings: []
2860
3014
  };
2861
3015
  const batchConflicts = collectBatchConflicts({
2862
3016
  edits: anchored,
@@ -2897,11 +3051,6 @@ ${snip.text}`;
2897
3051
  throw new Error("NOOP_EDIT: File already contains the desired content. No changes needed.");
2898
3052
  }
2899
3053
  const fullDiff = simpleDiff(origLines, lines);
2900
- let displayDiff = fullDiff;
2901
- if (displayDiff && displayDiff.length > MAX_DIFF_CHARS) {
2902
- displayDiff = displayDiff.slice(0, MAX_DIFF_CHARS) + `
2903
- ... (diff truncated, ${displayDiff.length} chars total)`;
2904
- }
2905
3054
  const newLinesAll = lines;
2906
3055
  let minLine = Infinity, maxLine = 0;
2907
3056
  if (fullDiff) {
@@ -2914,7 +3063,6 @@ ${snip.text}`;
2914
3063
  }
2915
3064
  }
2916
3065
  }
2917
- const diffEntryCount = countDiffEntries(fullDiff);
2918
3066
  const changedSpan = summarizeChangedSpan(minLine, maxLine);
2919
3067
  if (opts.dryRun) {
2920
3068
  let msg2 = `status: ${autoRebased ? STATUS.AUTO_REBASED : STATUS.OK}
@@ -2923,49 +3071,50 @@ revision: ${currentSnapshot.revision}`;
2923
3071
  if (staleRevision && hasBaseSnapshot) msg2 += `
2924
3072
  changed_ranges: ${describeChangedRanges(changedRanges)}`;
2925
3073
  msg2 += `
2926
- summary: lines_changed=${changedSpan} diff_entries=${diffEntryCount} lines_after=${lines.length}`;
2927
- msg2 += `
2928
- payload_sections: ${payloadSections(displayDiff ? ["diff"] : [])}`;
3074
+ summary: lines_changed=${changedSpan} lines_after=${lines.length}`;
2929
3075
  const hint = graphUnavailableHint(real);
2930
3076
  if (hint.length > 0) msg2 += `
2931
3077
  ${hint.join("\n")}`;
2932
3078
  msg2 += `
2933
3079
  Dry run: ${filePath} would change (${lines.length} lines)`;
2934
- if (displayDiff) msg2 += `
2935
-
2936
- Diff:
2937
- \`\`\`diff
2938
- ${displayDiff}
2939
- \`\`\``;
2940
3080
  return msg2;
2941
3081
  }
2942
3082
  writeFileSync(real, content, "utf-8");
2943
3083
  const nextStat = statSync6(real);
2944
3084
  const nextSnapshot = rememberSnapshot(real, content, { mtimeMs: nextStat.mtimeMs, size: nextStat.size });
3085
+ const warnings = editContext.warnings;
3086
+ let warningsNextAction = null;
3087
+ if (warnings.length > 0) {
3088
+ const codes = new Set(warnings.map((w) => w.code));
3089
+ if (codes.has("lone_delimiter_anchors")) warningsNextAction = ACTION.REREAD_RANGE;
3090
+ else if (codes.has("brace_imbalance") || codes.has("boundary_echo_skipped")) warningsNextAction = ACTION.REVIEW_RISKS;
3091
+ }
3092
+ const nextActionValue = warningsNextAction || (autoRebased && staleRevision && hasBaseSnapshot ? ACTION.KEEP_USING : null);
2945
3093
  let msg = `status: ${autoRebased ? STATUS.AUTO_REBASED : STATUS.OK}
2946
- reason: ${autoRebased ? REASON.EDIT_AUTO_REBASED : REASON.EDIT_APPLIED}
2947
3094
  revision: ${nextSnapshot.revision}`;
2948
3095
  if (autoRebased && staleRevision && hasBaseSnapshot) {
2949
3096
  msg += `
2950
3097
  changed_ranges: ${describeChangedRanges(changedRanges)}`;
3098
+ }
3099
+ if (nextActionValue) {
2951
3100
  msg += `
2952
- next_action: ${ACTION.KEEP_USING}`;
3101
+ next_action: ${nextActionValue}`;
2953
3102
  }
2954
3103
  if (remaps.length > 0) {
2955
3104
  msg += `
2956
3105
  remapped_refs:
2957
3106
  ${remaps.map(({ from, to }) => `${from} -> ${to}`).join("\n")}`;
2958
3107
  }
2959
- let hasPostEditBlock = false;
3108
+ if (warnings.length > 0) {
3109
+ msg += `
3110
+ warnings: ${JSON.stringify(warnings)}`;
3111
+ }
2960
3112
  let semanticImpacts = [];
2961
3113
  let cloneWarnings = [];
2962
3114
  let graphDbAvailable = false;
2963
- let graphFresh = true;
2964
- msg += `
2965
- Updated ${filePath} (${lines.length} lines)`;
2966
3115
  if (fullDiff && minLine <= maxLine) {
2967
- const ctxStart = Math.max(0, minLine - 6) + 1;
2968
- const ctxEnd = Math.min(newLinesAll.length, maxLine + 5);
3116
+ const ctxStart = Math.max(0, minLine - 4) + 1;
3117
+ const ctxEnd = Math.min(newLinesAll.length, maxLine + 3);
2969
3118
  const entries = createSnapshotEntries(nextSnapshot, ctxStart, ctxEnd);
2970
3119
  if (entries.length > 0) {
2971
3120
  const block = buildEditReadyBlock({
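The warning-to-`next_action` escalation added in this hunk can be sketched as a small priority function. The `ACTION` string values below are placeholders (the real constants are defined elsewhere in the bundle), and the rebase conditions are collapsed into a single flag for illustration:

```javascript
// Placeholder values; the real ACTION constants live elsewhere in the bundle.
const ACTION = { REREAD_RANGE: "REREAD_RANGE", REVIEW_RISKS: "REVIEW_RISKS", KEEP_USING: "KEEP_USING" };

function nextActionFor(warnings, autoRebased) {
  const codes = new Set(warnings.map((w) => w.code));
  // Ambiguous anchors outrank everything: the caller must re-read first.
  if (codes.has("lone_delimiter_anchors")) return ACTION.REREAD_RANGE;
  // Structural risk signals ask for a review pass.
  if (codes.has("brace_imbalance") || codes.has("boundary_echo_skipped")) return ACTION.REVIEW_RISKS;
  // No warnings: fall back to the auto-rebase status (simplified here).
  return autoRebased ? ACTION.KEEP_USING : null;
}

console.log(nextActionFor([{ code: "brace_imbalance" }], false)); // -> "REVIEW_RISKS"
```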
@@ -2977,7 +3126,6 @@ Updated ${filePath} (${lines.length} lines)`;
2977
3126
  trailing_newline: nextSnapshot.trailingNewline
2978
3127
  }
2979
3128
  });
2980
- hasPostEditBlock = true;
2981
3129
  msg += `
2982
3130
 
2983
3131
  ${serializeReadBlock(block)}`;
@@ -2988,7 +3136,6 @@ ${serializeReadBlock(block)}`;
2988
3136
  const relFile = db ? getRelativePath(real) : null;
2989
3137
  if (db && relFile && fullDiff && minLine <= maxLine) {
2990
3138
  graphDbAvailable = true;
2991
- graphFresh = isGraphFreshAtMtime(db, real, currentSnapshot.mtimeMs);
2992
3139
  semanticImpacts = semanticImpact(db, relFile, minLine, maxLine);
2993
3140
  if (semanticImpacts.length > 0) {
2994
3141
  const sections = semanticImpacts.map((impact) => {
@@ -3011,12 +3158,12 @@ ${serializeReadBlock(block)}`;
3011
3158
  }).filter(Boolean);
3012
3159
  return [
3013
3160
  `${impact.symbol}: ${headline}`,
3014
- ...factLines.map((line) => ` ${line}`)
3161
+ ...factLines.map((line) => `.${line}`)
3015
3162
  ].join("\n");
3016
3163
  });
3017
3164
  msg += `
3018
3165
 
3019
- \u26A0 Semantic impact:
3166
+ #semantic_impact
3020
3167
  ${sections.join("\n")}`;
3021
3168
  }
3022
3169
  cloneWarnings = cloneWarning(db, relFile, minLine, maxLine);
@@ -3024,33 +3171,19 @@ ${sections.join("\n")}`;
3024
3171
  const list = cloneWarnings.map((c) => `${c.file}:${c.line}${c.cloneType ? ` (${c.cloneType})` : ""}`).join(", ");
3025
3172
  msg += `
3026
3173
 
3027
- \u26A0 ${cloneWarnings.length} clone(s): ${list}`;
3174
+ !clone_siblings count=${cloneWarnings.length} list=${list}`;
3028
3175
  }
3029
3176
  }
3030
3177
  } catch {
3031
3178
  }
3032
- if (!graphFresh) msg += "\ngraph_fresh: stale";
3033
- const payloadKinds = [];
3034
- if (hasPostEditBlock) payloadKinds.push("post_edit");
3035
- if (semanticImpacts.length > 0) payloadKinds.push("semantic_impact");
3036
- if (cloneWarnings.length > 0) payloadKinds.push("clone_warning");
3037
- if (displayDiff) payloadKinds.push("diff");
3038
3179
  const summaryLineParts = [
3039
- `summary: lines_changed=${changedSpan} diff_entries=${diffEntryCount} lines_after=${lines.length}${editContext.corrections.length > 0 ? ` boundary_echo_stripped=${editContext.corrections.length}` : ``}`,
3040
- `payload_sections: ${payloadSections(payloadKinds)}`
3180
+ `summary: lines_changed=${changedSpan} lines_after=${lines.length}`
3041
3181
  ];
3042
3182
  if (!graphDbAvailable) summaryLineParts.push(...graphUnavailableHint(real));
3043
3183
  const summaryLines = summaryLineParts.join("\n");
3044
- msg = msg.replace(`
3045
- Updated ${filePath} (${lines.length} lines)`, `
3046
- ${summaryLines}
3047
- Updated ${filePath} (${lines.length} lines)`);
3048
- if (displayDiff) msg += `
3049
-
3050
- Diff:
3051
- \`\`\`diff
3052
- ${displayDiff}
3053
- \`\`\``;
3184
+ msg = msg.replace(/\nrevision: ([^\n]+)/, `
3185
+ revision: $1
3186
+ ${summaryLines}`);
3054
3187
  return msg;
3055
3188
  }
3056
3189
 
@@ -3193,12 +3326,8 @@ async function summaryMode(pattern, target, opts, totalLimit) {
3193
3326
  if (fileHits.size > 1 && topFiles.length) {
3194
3327
  lines.push(`top_files: ${topFiles.map(([file, count]) => `${file} (${count})`).join(", ")}`);
3195
3328
  }
3196
- if (snippets.length > 0) {
3197
- lines.push("snippets:");
3198
- lines.push(...snippets.map((snippet) => `- ${snippet}`));
3199
- }
3200
3329
  if (totalLimit > 0 && rawLines.length > totalLimit) {
3201
- lines.push(`continuation_hint: rerun grep_search with a higher head_limit or narrower path/glob to inspect ${rawLines.length - totalLimit} additional match event(s)`);
3330
+ lines.push(`truncated: ${rawLines.length - totalLimit}`);
3202
3331
  }
3203
3332
  return lines.join("\n");
3204
3333
  }
@@ -3361,12 +3490,14 @@ async function contentMode(pattern, target, opts, plain, editReady, totalLimit,
3361
3490
  let capped = false;
3362
3491
  let shownBlocks = 0;
3363
3492
  let shownMatches = 0;
3493
+ let lastPath = null;
3364
3494
  for (const block of blocks) {
3365
3495
  if (block.type === "edit_ready_block" && shownBlocks >= contentBlockLimit) {
3366
3496
  capped = true;
3367
3497
  break;
3368
3498
  }
3369
- const serialized = block.type === "edit_ready_block" ? serializeSearchBlock(block, { plain: plainOutput }) : serializeDiagnosticBlock(block);
3499
+ const sameFile = block.type === "edit_ready_block" && block.path === lastPath;
3500
+ const serialized = block.type === "edit_ready_block" ? serializeSearchBlock(block, { plain: plainOutput, skipFile: sameFile }) : serializeDiagnosticBlock(block);
3370
3501
  if (parts.length > 0 && budget - serialized.length < 0) {
3371
3502
  capped = true;
3372
3503
  break;
@@ -3376,6 +3507,9 @@ async function contentMode(pattern, target, opts, plain, editReady, totalLimit,
3376
3507
  if (block.type === "edit_ready_block") {
3377
3508
  shownBlocks++;
3378
3509
  shownMatches += Array.isArray(block.meta.matchLines) ? block.meta.matchLines.length : 0;
3510
+ lastPath = block.path;
3511
+ } else {
3512
+ lastPath = null;
3379
3513
  }
3380
3514
  }
3381
3515
  if (capped) {
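The `lastPath` tracking added to `contentMode` suppresses repeated file headers when consecutive edit-ready blocks come from the same file. A minimal sketch of the idea, with an assumed `file:` header format standing in for the real serializer:

```javascript
// Emit the file header once per run of same-file blocks; later blocks
// in the run are serialized body-only (the assumed skipFile behavior).
function serializeBlocks(blocks) {
  const out = [];
  let lastPath = null;
  for (const block of blocks) {
    const sameFile = block.path === lastPath;
    out.push(sameFile ? block.body : `file: ${block.path}\n${block.body}`);
    lastPath = block.path;
  }
  return out;
}

const rendered = serializeBlocks([
  { path: "a.js", body: "hunk1" },
  { path: "a.js", body: "hunk2" }, // same file: header skipped
  { path: "b.js", body: "hunk3" }, // new file: header resumes
]);
console.log(rendered[1]); // -> "hunk2"
```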
@@ -3663,9 +3797,7 @@ async function fileOutline(filePath) {
3663
3797
  }
3664
3798
  const db = getGraphDB(real);
3665
3799
  const relFile = db ? getRelativePath(real) : null;
3666
- return `File: ${filePath}
3667
-
3668
- ${formatOutline(entries, skippedRanges, snapshot.lines.length, snapshot, db, relFile, note)}`;
3800
+ return formatOutline(entries, skippedRanges, snapshot.lines.length, snapshot, db, relFile, note);
3669
3801
  }
3670
3802
 
3671
3803
  // lib/verify.mjs
@@ -3740,7 +3872,7 @@ function buildSuggestedReadCall2(filePath, ranges) {
3740
3872
  return JSON.stringify({
3741
3873
  tool: "mcp__hex-line__read_file",
3742
3874
  arguments: {
3743
- path: filePath,
3875
+ file_path: filePath,
3744
3876
  ranges: deduped
3745
3877
  }
3746
3878
  });
@@ -3757,28 +3889,20 @@ function overallNextAction(summary) {
3757
3889
  if (summary.stale > 0) return ACTION.REREAD_RANGES;
3758
3890
  return ACTION.KEEP_USING;
3759
3891
  }
3760
- function overallReason(status) {
3761
- if (status === STATUS.OK) return REASON.CHECKSUMS_CURRENT;
3762
- if (status === STATUS.STALE) return REASON.CHECKSUMS_STALE;
3763
- return REASON.CHECKSUMS_INVALID;
3764
- }
3765
3892
  function entrySummary(entry) {
3766
- if (entry.status === "VALID") return "checksum still current";
3767
- if (entry.status === "STALE") return "content changed since checksum capture";
3768
3893
  return entry.reason;
3769
3894
  }
3770
- function renderEntry(entry, index, total, topLevelNextAction) {
3895
+ function renderEntry(entry, index, total) {
3771
3896
  const parts = [
3772
3897
  `entry: ${index}/${total}`,
3773
- `status: ${entry.status}`,
3898
+ entry.status,
3774
3899
  entry.span ? `span: ${entry.span}` : null,
3775
3900
  `checksum: ${entry.checksum}`,
3776
3901
  entry.currentChecksum && entry.currentChecksum !== entry.checksum ? `current_checksum: ${entry.currentChecksum}` : null
3777
3902
  ].filter(Boolean);
3778
- const action = entryNextAction(entry);
3779
- if (action !== topLevelNextAction) parts.push(`next_action: ${action}`);
3780
- if (entry.status !== "VALID") parts.push(`summary: ${entrySummary(entry)}`);
3781
- return parts.join(" | ");
3903
+ if (entry.status === "INVALID") parts.push(`next_action: ${entryNextAction(entry)}`);
3904
+ if (entry.status === "INVALID") parts.push(`summary: ${entrySummary(entry)}`);
3905
+ return parts.join(" ");
3782
3906
  }
3783
3907
  function verifyChecksums(filePath, checksums, opts = {}) {
3784
3908
  filePath = normalizePath(filePath);
@@ -3795,10 +3919,14 @@ function verifyChecksums(filePath, checksums, opts = {}) {
3795
3919
  const verboseSummary = results.length > 1 || summary.stale > 0 || summary.invalid > 0;
3796
3920
  const lines = [
3797
3921
  `status: ${status}`,
3798
- `reason: ${overallReason(status)}`,
3799
3922
  `revision: ${currentSnapshot.revision}`
3800
3923
  ];
3801
- if (verboseSummary) lines.push(`summary: valid=${summary.valid} stale=${summary.stale} invalid=${summary.invalid}`);
3924
+ if (verboseSummary) {
3925
+ const sumParts = [`valid=${summary.valid}`];
3926
+ if (summary.stale > 0) sumParts.push(`stale=${summary.stale}`);
3927
+ if (summary.invalid > 0) sumParts.push(`invalid=${summary.invalid}`);
3928
+ lines.push(`summary: ${sumParts.join(" ")}`);
3929
+ }
3802
3930
  lines.push(`next_action: ${topLevelNextAction}`);
3803
3931
  if (opts.baseRevision && opts.baseRevision !== currentSnapshot.revision) {
3804
3932
  lines.push(`base_revision: ${opts.baseRevision}`);
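The compact summary line built in this hunk omits `stale=` and `invalid=` when those counts are zero. A self-contained sketch of that formatting rule:

```javascript
// Only non-zero failure counts appear; valid= is always present.
function verifySummaryLine(summary) {
  const parts = [`valid=${summary.valid}`];
  if (summary.stale > 0) parts.push(`stale=${summary.stale}`);
  if (summary.invalid > 0) parts.push(`invalid=${summary.invalid}`);
  return `summary: ${parts.join(" ")}`;
}

console.log(verifySummaryLine({ valid: 3, stale: 0, invalid: 1 })); // -> "summary: valid=3 invalid=1"
```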
@@ -3812,7 +3940,7 @@ function verifyChecksums(filePath, checksums, opts = {}) {
3812
3940
  const suggestedReadCall = buildSuggestedReadCall2(filePath, staleRanges);
3813
3941
  if (suggestedReadCall) lines.push(`suggested_read_call: ${suggestedReadCall}`);
3814
3942
  if (results.length > 0) {
3815
- lines.push("", ...results.map((entry, index) => renderEntry(entry, index + 1, results.length, topLevelNextAction)));
3943
+ lines.push("", ...results.map((entry, index) => renderEntry(entry, index + 1, results.length)));
3816
3944
  }
3817
3945
  return lines.join("\n");
3818
3946
  }
@@ -3981,19 +4109,17 @@ function directoryTree(dirPath, opts = {}) {
3981
4109
  if (compact) {
3982
4110
  lines.push(`${prefix}${name}`);
3983
4111
  } else {
3984
- let size = 0, mtime = null, lineCount = null;
4112
+ let size = 0, lineCount = null;
3985
4113
  try {
3986
4114
  const st = statSync7(full);
3987
4115
  size = st.size;
3988
- mtime = st.mtime;
3989
4116
  } catch {
3990
4117
  }
3991
4118
  totalSize += size;
3992
4119
  lineCount = countFileLines(full, size);
3993
4120
  const parts = [];
3994
4121
  if (lineCount !== null) parts.push(`${lineCount}L`);
3995
- parts.push(formatSize(size));
3996
- if (mtime) parts.push(relativeTime(mtime, true));
4122
+ parts.push(`size=${size}`);
3997
4123
  lines.push(`${prefix}${name} (${parts.join(", ")})`);
3998
4124
  }
3999
4125
  }
@@ -4024,10 +4150,8 @@ function directoryTree(dirPath, opts = {}) {
4024
4150
  }
4025
4151
  const rootName = basename(abs);
4026
4152
  walk(abs, " ", 1);
4027
- const header = compact ? `Directory: ${rootName}/ (${totalFiles} files)` : `Directory: ${rootName}/ (${totalFiles} files, ${formatSize(totalSize)})`;
4153
+ const header = compact ? `dir=${rootName}/ files=${totalFiles}` : `dir=${rootName}/ files=${totalFiles} total_bytes=${totalSize}`;
4028
4154
  return `${header}
4029
-
4030
- ${rootName}/
4031
4155
  ${lines.join("\n")}`;
4032
4156
  }
4033
4157
 
@@ -4114,16 +4238,12 @@ function fileInfo(filePath) {
4114
4238
  if (name === "Makefile") typeName = "Makefile";
4115
4239
  const isBinary = size > 0 ? detectBinary(abs, size) : false;
4116
4240
  const lineCount = !isBinary && size > 0 ? countFileLines(abs, size, MAX_LINE_COUNT_SIZE) : null;
4117
- const sizeStr = lineCount !== null ? `Size: ${formatSize(size)} (${lineCount} lines)` : `Size: ${formatSize(size)}`;
4118
- const timeStr = `Modified: ${mtime.toISOString().replace("T", " ").slice(0, 19)} (${relativeTime(mtime)})`;
4119
- const lines = [
4120
- `File: ${normalized}`,
4121
- sizeStr,
4122
- timeStr,
4123
- `Type: ${typeName}`
4124
- ];
4125
- if (isBinary) lines.push(`Binary: yes`);
4126
- return lines.join("\n");
4241
+ const parts = [`size=${size}`];
4242
+ if (lineCount !== null) parts.push(`lines=${lineCount}`);
4243
+ parts.push(`mtime=${mtime.toISOString()}`);
4244
+ parts.push(`type=${typeName}`);
4245
+ if (isBinary) parts.push(`binary=yes`);
4246
+ return parts.join(" ");
4127
4247
  }
4128
4248
 
4129
4249
  // lib/inspect-path.mjs
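The rewritten `fileInfo` output above replaces the multi-line human-readable report with one `key=value` line, appending optional fields only when meaningful. A sketch of that assembly (the parameter shape is an assumption matching the fields in the diff):

```javascript
// One-line compact file report: size is always present, lines only for
// countable text files, binary=yes only for binaries.
function fileInfoLine({ size, lineCount, mtime, typeName, isBinary }) {
  const parts = [`size=${size}`];
  if (lineCount !== null) parts.push(`lines=${lineCount}`);
  parts.push(`mtime=${mtime.toISOString()}`);
  parts.push(`type=${typeName}`);
  if (isBinary) parts.push("binary=yes");
  return parts.join(" ");
}

const line = fileInfoLine({
  size: 2048,
  lineCount: 64,
  mtime: new Date("2024-01-02T03:04:05Z"),
  typeName: "JavaScript",
  isBinary: false,
});
console.log(line); // -> "size=2048 lines=64 mtime=2024-01-02T03:04:05.000Z type=JavaScript"
```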
@@ -4550,9 +4670,6 @@ async function semanticGitDiff(targetPath, { baseRef = "HEAD", headRef = null }
4550
4670
  }
4551
4671
 
4552
4672
  // lib/changes.mjs
4553
- function payloadSections2(sections) {
4554
- return sections.length > 0 ? sections.join(",") : "summary_only";
4555
- }
4556
4673
  function exportedLooking(symbol) {
4557
4674
  return /^\s*(export|public)\b/.test(symbol.text || "");
4558
4675
  }
@@ -4605,9 +4722,6 @@ async function fileChanges(filePath, compareAgainst = "HEAD") {
4605
4722
  ...graphHint2
4606
4723
  ].join("\n");
4607
4724
  }
4608
- let emittedRiskCount = 0;
4609
- let emittedRemovedApiWarnings = 0;
4610
- const sectionKinds2 = ["files"];
4611
4725
  const sections = [
4612
4726
  "status: CHANGED",
4613
4727
  `reason: ${REASON.DIRECTORY_CHANGED}`,
@@ -4627,20 +4741,13 @@ async function fileChanges(filePath, compareAgainst = "HEAD") {
4627
4741
  sections.push(parts2.join(" | "));
4628
4742
  const riskLines2 = summarizeGraphRisk(db2, file2.path.replace(/\\/g, "/"), file2);
4629
4743
  const visibleRiskLines = riskLines2.slice(0, 2);
4630
- emittedRiskCount += visibleRiskLines.length;
4631
4744
  for (const line of visibleRiskLines) sections.push(`risk_summary: ${summarizeRiskLine(line)}`);
4632
4745
  for (const symbol of file2.removed_symbols.slice(0, 2)) {
4633
4746
  if (exportedLooking(symbol)) {
4634
- emittedRemovedApiWarnings += 1;
4635
4747
  sections.push(`removed_api_warning: ${symbol.text}`);
4636
4748
  }
4637
4749
  }
4638
4750
  }
4639
- if (emittedRiskCount > 0) sectionKinds2.push("risk_summary");
4640
- if (emittedRemovedApiWarnings > 0) sectionKinds2.push("removed_api_warning");
4641
- const spliceLines = [];
4642
- if (sectionKinds2.length > 0) spliceLines.push(`payload_sections: ${payloadSections2(sectionKinds2)}`);
4643
- sections.splice(7 + graphHint2.length, 0, ...spliceLines);
4644
4751
  return sections.join("\n");
4645
4752
  }
4646
4753
  const db = getGraphDB(real);
@@ -4672,7 +4779,6 @@ async function fileChanges(filePath, compareAgainst = "HEAD") {
4672
4779
  const relFile = getRelativePath(real) || file.path?.replace(/\\/g, "/");
4673
4780
  const riskLines = summarizeGraphRisk(db, relFile, file);
4674
4781
  const removedApiWarnings = file.removed_symbols.filter(exportedLooking).slice(0, 4);
4675
- const sectionKinds = [];
4676
4782
  const parts = [
4677
4783
  "status: CHANGED",
4678
4784
  `reason: ${REASON.FILE_CHANGED}`,
@@ -4683,21 +4789,18 @@ async function fileChanges(filePath, compareAgainst = "HEAD") {
4683
4789
  ...graphHint
4684
4790
  ];
4685
4791
  if (file.added_symbols.length) {
4686
- sectionKinds.push("added");
4687
4792
  parts.push(`next_action: ${ACTION.REVIEW_RISKS}`);
4688
4793
  parts.push("");
4689
4794
  parts.push("added:");
4690
4795
  for (const symbol of file.added_symbols) parts.push(` + ${symbol.start}-${symbol.end}: ${symbol.text}`);
4691
4796
  }
4692
4797
  if (file.removed_symbols.length) {
4693
- sectionKinds.push("removed");
4694
4798
  if (!parts.includes(`next_action: ${ACTION.REVIEW_RISKS}`)) parts.push(`next_action: ${ACTION.REVIEW_RISKS}`);
4695
4799
  parts.push("");
4696
4800
  parts.push("removed:");
4697
4801
  for (const symbol of file.removed_symbols) parts.push(` - ${symbol.start}-${symbol.end}: ${symbol.text}`);
4698
4802
  }
4699
4803
  if (file.modified_symbols.length) {
4700
- sectionKinds.push("modified");
4701
4804
  if (!parts.includes(`next_action: ${ACTION.REVIEW_RISKS}`)) parts.push(`next_action: ${ACTION.REVIEW_RISKS}`);
4702
4805
  parts.push("");
4703
4806
  parts.push("modified:");
@@ -4712,12 +4815,6 @@ async function fileChanges(filePath, compareAgainst = "HEAD") {
4712
4815
  parts.push("");
4713
4816
  parts.push("summary_detail: no symbol changes detected");
4714
4817
  }
4715
- if (riskLines.length > 0) sectionKinds.push("risk_summary");
4716
- if (removedApiWarnings.length > 0) sectionKinds.push("removed_api_warning");
4717
- if (sectionKinds.length > 0) {
4718
- const insertIdx = 6 + graphHint.length;
4719
- parts.splice(insertIdx, 0, `payload_sections: ${payloadSections2(sectionKinds)}`);
4720
- }
4721
4818
  if (riskLines.length || removedApiWarnings.length) {
4722
4819
  parts.push("");
4723
4820
  parts.push("risk_summary:");
@@ -4870,8 +4967,8 @@ function errorResult(code, message, recovery, { large = false, extra = null } =
4870
4967
  }
4871
4968
 
4872
4969
  // server.mjs
4873
- var version = true ? "1.25.1" : (await null).createRequire(import.meta.url)("./package.json").version;
4874
- var STATUS_ENUM = z2.enum(["OK", "ERROR", "AUTO_REBASED", "CONFLICT", "STALE", "INVALID", "NO_CHANGES", "CHANGED", "UNSUPPORTED"]);
4970
+ var version = true ? "1.27.0" : (await null).createRequire(import.meta.url)("./package.json").version;
4971
+ var STATUS_ENUM = z2.enum(STATUS_VALUES);
4875
4972
  var ERROR_SHAPE = z2.object({ code: z2.string(), message: z2.string(), recovery: z2.string() }).optional();
4876
4973
  var LINE_REPORT_KEYS = /* @__PURE__ */ new Set([
4877
4974
  "status",
@@ -4890,7 +4987,8 @@ var LINE_REPORT_KEYS = /* @__PURE__ */ new Set([
4890
4987
  "retry_edits",
4891
4988
  "suggested_read_call",
4892
4989
  "retry_plan",
4893
- "remapped_refs"
4990
+ "remapped_refs",
4991
+ "warnings"
4894
4992
  ]);
4895
4993
  var { server, StdioServerTransport } = await createServerRuntime({
4896
4994
  name: "hex-line-mcp",
@@ -4922,7 +5020,15 @@ function parseLineReport(content) {
4922
5020
  if (!match) continue;
4923
5021
  const [, key, value] = match;
4924
5022
  if (!LINE_REPORT_KEYS.has(key) || parsed[key] !== void 0) continue;
4925
- parsed[key] = value;
5023
+ if (key === "warnings") {
5024
+ try {
5025
+ parsed[key] = JSON.parse(value);
5026
+ } catch {
5027
+ parsed[key] = value;
5028
+ }
5029
+ } else {
5030
+ parsed[key] = value;
5031
+ }
4926
5032
  }
4927
5033
  return parsed;
4928
5034
  }
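The `warnings` branch added to `parseLineReport` above decodes the report line's JSON payload but deliberately degrades to the raw string on malformed input rather than throwing. Isolated as a sketch:

```javascript
// Parse a "warnings:" report value; keep the raw text if it is not JSON
// so a garbled line never aborts report parsing.
function parseWarningsValue(value) {
  try {
    return JSON.parse(value);
  } catch {
    return value;
  }
}

const good = parseWarningsValue('[{"code":"brace_imbalance"}]');
const bad = parseWarningsValue("not-json");
console.log(good[0].code, typeof bad); // -> "brace_imbalance string"
```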
@@ -5007,11 +5113,11 @@ ERROR: ${e.message}`);
5007
5113
  });
5008
5114
  server.registerTool("edit_file", {
5009
5115
  title: "Edit File",
5010
- description: "Apply hash-verified partial edits to one file. Carry base_revision on same-file follow-ups. Preserves existing line endings and trailing-newline shape; conservative conflicts return retry helpers.",
5116
+ description: "Apply hash-verified partial edits to one file. Carry base_revision on same-file follow-ups. Preserves existing line endings and trailing-newline shape; conservative conflicts return retry helpers. boundary_mode=inclusive deletes the anchor lines themselves; new_text must close any delimiter whose opening falls inside the replaced range.",
5011
5117
  inputSchema: z2.object({
5012
5118
  file_path: z2.string().describe("File to edit"),
5013
5119
  edits: z2.union([z2.string(), z2.array(z2.any())]).describe(
5014
- 'JSON array of canonical edits.\n[{"set_line":{"anchor":"ab.12","new_text":"x"}}]\n[{"replace_lines":{"start_anchor":"ab.10","end_anchor":"cd.15","new_text":"x","range_checksum":"10-15:a1b2"}}]\n[{"insert_after":{"anchor":"ab.20","text":"x"}}]\n[{"replace_between":{"start_anchor":"ab.10","end_anchor":"cd.40","new_text":"x","boundary_mode":"inclusive"}}]'
5120
+ 'JSON array of canonical edits.\n[{"set_line":{"anchor":"ab.12","new_text":"x"}}]\n[{"replace_lines":{"start_anchor":"ab.10","end_anchor":"cd.15","new_text":"x","range_checksum":"10-15:a1b2"}}]\n[{"insert_after":{"anchor":"ab.20","text":"x"}}]\n[{"replace_between":{"start_anchor":"ab.10","end_anchor":"cd.40","new_text":"x","boundary_mode":"inclusive","range_checksum":"10-40:a1b2"}}]\nPrefer replace_lines with range_checksum when either anchor is a lone delimiter (}, ), ]) \u2014 replace_between anchors use short line-content hashes and may fuzzy-match a sibling delimiter.'
5015
5121
  ),
5016
5122
  dry_run: flexBool().describe("Preview changes without writing"),
5017
5123
  restore_indent: flexBool().describe("Auto-fix indentation to match anchor (default: false)"),
@@ -5019,7 +5125,7 @@ server.registerTool("edit_file", {
5019
5125
  conflict_policy: z2.enum(["strict", "conservative"]).optional().describe('Conflict handling (default: "conservative"). "conservative" returns structured CONFLICT output with recovery_ranges, retry_edit/retry_edits, suggested_read_call, and retry_plan when available.'),
5020
5126
  allow_external: flexBool().describe("Allow editing a path outside the current project root. Use only when you intentionally target a temp or external file.")
5021
5127
  }),
5022
- outputSchema: z2.object({ status: STATUS_ENUM, file_path: z2.string().optional(), content: z2.string().optional(), reason: z2.string().optional(), summary: z2.string().optional(), next_action: z2.string().optional(), error: ERROR_SHAPE }),
5128
+ outputSchema: z2.object({ status: STATUS_ENUM, file_path: z2.string().optional(), content: z2.string().optional(), reason: z2.string().optional(), summary: z2.string().optional(), next_action: z2.string().optional(), warnings: z2.array(z2.object({ code: z2.string() }).passthrough()).optional(), error: ERROR_SHAPE }),
5023
5129
  annotations: { readOnlyHint: false, destructiveHint: false, idempotentHint: false, openWorldHint: false }
5024
5130
  }, async (rawParams) => {
5025
5131
  const { file_path: p, edits: json, dry_run, restore_indent, base_revision, conflict_policy, allow_external } = rawParams ?? {};
package/output-style.md CHANGED
@@ -47,6 +47,9 @@ Prefer `hex-line` for text files you may inspect or modify. Hash-annotated reads
47
47
  - First mutation in a file: use `grep_search(output_mode="summary")` for narrow targets, or `outline -> read_file(ranges)` for structural edits. Escalate to `grep_search(output_mode="content", edit_ready=true)` only when the next edit needs canonical hunks.
48
48
  - Preserve file conventions mentally: `hex-line` hashes normalized logical text, but `edit_file` preserves the file's existing line endings and trailing-newline shape on write.
49
49
  - Prefer `set_line` or `insert_after` for small local changes. Prefer `replace_between` for larger bounded block rewrites.
50
+ - When either anchor of `replace_between` is a lone delimiter (`}`, `)`, `]`, `});`), switch to `replace_lines` with `range_checksum`, or pass `range_checksum` to `replace_between` directly. `replace_between` anchors use short line-content hashes and may fuzzy-match a sibling closing delimiter.
51
+ - For inclusive `replace_between`: enumerate every `{`, `(`, `[` opened inside the replaced range and ensure `new_text` closes them all. If the range crosses a method/class/namespace boundary, prefer `set_line` + `insert_after` for each hunk.
52
+ - After `replace_between` on C#/Java/Go/C++/Rust files, run the language build or type-check once before proceeding. Brace drift is invisible at edit time.
50
53
  - Use `replace_lines` only when you already hold the exact inclusive range checksum for that block.
51
54
  - Avoid large first-pass edit batches. Start with 1-2 hunks, then continue from the returned `revision` as `base_revision`.
52
55
  - Before a delayed follow-up edit, a formatter pass, or any mixed-tool workflow on the same file, run `verify` with the last checksums and `base_revision`.
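The lone-delimiter guidance above can be made concrete with the canonical edit shapes from the `edit_file` tool description. The anchor tags and checksum below are placeholder values; real ones come from a prior hash-annotated read:

```javascript
// Risky: if line 40 is a bare "}", the "cd" content hash matches every
// sibling "}" and fuzzy re-anchoring may pick the wrong one.
const riskyEdit = [{
  replace_between: {
    start_anchor: "ab.10",
    end_anchor: "cd.40",
    new_text: "x",
    boundary_mode: "inclusive",
  },
}];

// Safer: replace_lines pins the exact inclusive range with a checksum,
// so a drifted or sibling delimiter cannot be silently matched.
const saferEdit = [{
  replace_lines: {
    start_anchor: "ab.10",
    end_anchor: "cd.40",
    new_text: "x",
    range_checksum: "10-40:a1b2", // span + range hash from the last read
  },
}];

console.log(Object.keys(saferEdit[0])[0]); // -> "replace_lines"
```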
@@ -58,8 +61,8 @@ Prefer `hex-line` for text files you may inspect or modify. Hash-annotated reads
58
61
 
59
62
  ## Exceptions
60
63
 
61
- - Built-in `Read`/`Edit`/`Write`/`Grep`/`Glob` are fallback only. With the hook active, project-scoped text calls and file discovery route to hex-line. Built-in OK for images, PDFs, notebooks, and text paths outside the current project root.
62
- - Bash is still fine for npm, node, git, docker, curl, non-inspection pipelines, and other runtime workflows. Project file inspection commands route to hex-line, including Windows-native readers/searchers/listing commands.
64
+ - Built-in `Read`/`Edit`/`Write`/`Grep`/`Glob` remain fallback tools by preference, not by hard rule. With the hook active, project-scoped text calls and file discovery receive hex-line guidance by default and hard-route only when `hooks.mode` is `blocking`. Built-in OK for images, PDFs, notebooks, plan files in Plan Mode, and text paths outside the current project root.
65
+ - Bash is still fine for npm, node, git, docker, curl, non-inspection pipelines, and other runtime workflows. Project file inspection commands receive hex-line guidance, including Windows-native readers/searchers/listing commands.
63
66
 
64
67
  ## hex-graph
65
68
 
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "@levnikolaevich/hex-line-mcp",
3
- "version": "1.25.1",
3
+ "version": "1.27.0",
4
4
  "mcpName": "io.github.levnikolaevich/hex-line-mcp",
5
5
  "type": "module",
6
6
  "description": "Hash-verified file editing MCP + token efficiency hook for AI coding agents. 9 tools: inspect_path, read, edit, write, grep, outline, verify, changes, bulk_replace.",