opencode-hashline 1.1.2 → 1.2.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.en.md CHANGED
@@ -44,7 +44,12 @@ The AI model can then reference lines by their hash tags for precise editing:
44
44
 
45
45
  ### 🤔 Why does this help?
46
46
 
47
- Traditional line numbers shift as edits are made, causing off-by-one errors and stale references. Hashline tags are **content-addressable** — they're derived from both the line index and the line's content, so they serve as a stable, verifiable reference that the AI can use to communicate about code locations with precision.
47
+ Hashline solves the fundamental problems of the two existing AI file-editing approaches:
48
+
49
+ - **`str_replace`** requires an absolutely exact match of `old_string`. Any extra whitespace, wrong indentation, or duplicate lines in the file — and the edit fails with "String to replace not found". This is so common it has a [mega-thread of 27+ related issues on GitHub](https://github.com/anthropics/claude-code/issues).
50
+ - **`apply_patch`** (unified diff) only works on models specifically trained for this format. On other models the results are catastrophic: Grok 4 fails **50.7%** of patches, GLM-4.7 fails **46.2%** ([source](https://habr.com/ru/companies/bothub/news/995986/)).
51
+
52
+ Hashline addresses each line with a unique `lineNumber:hash`. No string matching, no model-specific training dependency — just precise, verifiable line addressing.
48
53
 
49
54
  ---
50
55
 
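The tag format introduced in this README change can be sketched in a few lines. This is an illustrative sketch only — the hash function below (SHA-256 over the line index plus content, truncated to a few hex characters) is an assumption, not necessarily the package's actual `computeLineHash`:

```typescript
import { createHash } from "node:crypto";

// Content-addressable line tag: derived from both the line index and
// the line's content, so a stale reference is detectable.
function lineHash(index: number, line: string, len = 3): string {
  return createHash("sha256")
    .update(`${index}:${line}`)
    .digest("hex")
    .slice(0, len);
}

// Annotate each line as "#HL <line>:<hash>|<content>" so a model can
// address it as e.g. "2:a3f" instead of matching strings.
function annotate(content: string): string {
  return content
    .split("\n")
    .map((line, i) => `#HL ${i + 1}:${lineHash(i, line)}|${line}`)
    .join("\n");
}

console.log(annotate("const a = 1;\nconst b = 2;"));
```

If the file changes under the model, the hash no longer matches and the edit can be rejected instead of landing on the wrong line.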
@@ -355,22 +360,25 @@ const hl = createHashline({ cacheSize: 50, hashLength: 3 });
355
360
 
356
361
  ## 📊 Benchmark
357
362
 
358
- ### Correctness: hashline vs str_replace
363
+ ### Correctness: hashline vs str_replace vs apply_patch
364
+
365
+ We tested all three approaches on **60 fixtures from [react-edit-benchmark](https://github.com/can1357/oh-my-pi/tree/main/packages/react-edit-benchmark)** — mutated React source files with known bugs (flipped booleans, swapped operators, removed guard clauses, etc.):
359
366
 
360
- We tested both approaches on **60 fixtures from [react-edit-benchmark](https://github.com/can1357/oh-my-pi/tree/main/packages/react-edit-benchmark)** — mutated React source files with known bugs (flipped booleans, swapped operators, removed guard clauses, etc.):
367
+ | | hashline | str_replace | apply_patch |
368
+ |---|:---:|:---:|:---:|
369
+ | **Passed** | **60/60 (100%)** | 58/60 (96.7%) | **60/60 (100%)** |
370
+ | **Failed** | 0 | 2 | 0 |
371
+ | **Ambiguous edits** | 0 | 4 | 0 |
361
372
 
362
- | | hashline | str_replace |
363
- |---|:---:|:---:|
364
- | **Passed** | **60/60 (100%)** | 58/60 (96.7%) |
365
- | **Failed** | 0 | 2 |
366
- | **Ambiguous edits** | 0 | 4 |
373
+ `apply_patch` with context lines matches hashline's reliability **when the model generates the patch correctly**. The key weakness of `apply_patch` is its dependency on model-specific training: models not trained on this format produce malformed diffs (missing context lines, wrong indentation), causing patch application to fail.
367
374
 
368
- str_replace fails when the `old_string` appears multiple times in the file (e.g. repeated guard clauses, similar code blocks). Hashline addresses each line uniquely via `lineNumber:hash`, so ambiguity is impossible.
375
+ `str_replace` fails when `old_string` appears multiple times in the file (repeated guard clauses, similar code blocks). Hashline addresses each line uniquely via `lineNumber:hash`, so ambiguity is impossible and no model-specific format is required.
369
376
 
370
377
  ```bash
371
378
  # Run yourself:
372
- npx tsx benchmark/run.ts # hashline mode
373
- npx tsx benchmark/run.ts --no-hash # str_replace mode
379
+ npx tsx benchmark/run.ts # hashline mode
380
+ npx tsx benchmark/run.ts --no-hash # str_replace mode
381
+ npx tsx benchmark/run.ts --apply-patch # apply_patch mode
374
382
  ```
375
383
 
376
384
  <details>
package/README.md CHANGED
@@ -44,7 +44,12 @@ The AI model can reference lines by their hash
44
44
 
45
45
  ### 🤔 Why does this help?
46
46
 
47
- Traditional line numbers shift as edits are made, causing off-by-one errors and stale references. Hashline hash tags are **content-addressable** — they're computed from the line index and its content, making them a stable, verifiable reference for precise communication about code locations.
47
+ Hashline solves the fundamental problems of the two existing approaches to AI file editing:
48
+
49
+ - **`str_replace`** requires an absolutely exact match of `old_string`. Any extra whitespace, wrong indentation, or duplicate lines in the file — and the edit fails with "String to replace not found". This is such a common problem that it has a [mega-thread of 27+ issues on GitHub](https://github.com/anthropics/claude-code/issues).
50
+ - **`apply_patch`** (unified diff) only works on models specifically trained for this format. On other models the results are catastrophic: Grok 4 fails **50.7%** of patches, GLM-4.7 fails **46.2%** ([source](https://habr.com/ru/companies/bothub/news/995986/)).
51
+
52
+ Hashline addresses each line with a unique `lineNumber:hash` tag. No string matching, no dependency on model-specific training — just precise, verifiable addressing.
48
53
 
49
54
  ---
50
55
 
@@ -330,22 +335,25 @@ const hl = createHashline({ cacheSize: 50, hashLength: 3 });
330
335
 
331
336
  ## 📊 Benchmark
332
337
 
333
- ### Correctness: hashline vs str_replace
338
+ ### Correctness: hashline vs str_replace vs apply_patch
339
+
340
+ All three approaches were tested on **60 fixtures from [react-edit-benchmark](https://github.com/can1357/oh-my-pi/tree/main/packages/react-edit-benchmark)** — mutated React source files with known bugs (flipped booleans, swapped operators, removed guard clauses, etc.):
334
341
 
335
- Both approaches were tested on **60 fixtures from [react-edit-benchmark](https://github.com/can1357/oh-my-pi/tree/main/packages/react-edit-benchmark)** — mutated React source files with known bugs (flipped booleans, swapped operators, removed guard clauses, etc.):
342
+ | | hashline | str_replace | apply_patch |
343
+ |---|:---:|:---:|:---:|
344
+ | **Passed** | **60/60 (100%)** | 58/60 (96.7%) | **60/60 (100%)** |
345
+ | **Failed** | 0 | 2 | 0 |
346
+ | **Ambiguous edits** | 0 | 4 | 0 |
336
347
 
337
- | | hashline | str_replace |
338
- |---|:---:|:---:|
339
- | **Passed** | **60/60 (100%)** | 58/60 (96.7%) |
340
- | **Failed** | 0 | 2 |
341
- | **Ambiguous edits** | 0 | 4 |
348
+ `apply_patch` with context lines is as reliable as hashline **provided the model generates the patch correctly**. The weak point of `apply_patch` is its dependency on model-specific training: models not trained on this format produce malformed diffs (dropped context lines, wrong indentation), causing patch application to fail.
342
349
 
343
- str_replace fails when `old_string` appears multiple times in the file (e.g. repeated guard clauses, similar code blocks). Hashline addresses each line uniquely via `lineNumber:hash`, so ambiguity is impossible.
350
+ `str_replace` fails when `old_string` appears multiple times in the file (repeated guard clauses, similar code blocks). Hashline addresses each line uniquely via `lineNumber:hash`, so ambiguity is impossible and no model-specific format is required.
344
351
 
345
352
  ```bash
346
353
  # Run yourself:
347
- npx tsx benchmark/run.ts # hashline mode
348
- npx tsx benchmark/run.ts --no-hash # str_replace mode
354
+ npx tsx benchmark/run.ts # hashline mode
355
+ npx tsx benchmark/run.ts --no-hash # str_replace mode
356
+ npx tsx benchmark/run.ts --apply-patch # apply_patch mode
349
357
  ```
350
358
 
351
359
  <details>
@@ -28,7 +28,24 @@ var DEFAULT_EXCLUDE_PATTERNS = [
28
28
  "**/*.exe",
29
29
  "**/*.dll",
30
30
  "**/*.so",
31
- "**/*.dylib"
31
+ "**/*.dylib",
32
+ // Sensitive credential and secret files
33
+ "**/.env",
34
+ "**/.env.*",
35
+ "**/*.pem",
36
+ "**/*.key",
37
+ "**/*.p12",
38
+ "**/*.pfx",
39
+ "**/id_rsa",
40
+ "**/id_rsa.pub",
41
+ "**/id_ed25519",
42
+ "**/id_ed25519.pub",
43
+ "**/id_ecdsa",
44
+ "**/id_ecdsa.pub",
45
+ "**/.npmrc",
46
+ "**/.netrc",
47
+ "**/credentials",
48
+ "**/credentials.json"
32
49
  ];
33
50
  var DEFAULT_PREFIX = "#HL ";
34
51
  var DEFAULT_CONFIG = {
@@ -94,11 +111,18 @@ function formatFileWithHashes(content, hashLen, prefix) {
94
111
  const effectivePrefix = prefix === void 0 ? DEFAULT_PREFIX : prefix === false ? "" : prefix;
95
112
  const hashes = new Array(lines.length);
96
113
  const seen = /* @__PURE__ */ new Map();
114
+ const upgraded = /* @__PURE__ */ new Set();
97
115
  for (let idx = 0; idx < lines.length; idx++) {
98
116
  const hash = computeLineHash(idx, lines[idx], effectiveLen);
99
117
  if (seen.has(hash)) {
100
118
  const longerLen = Math.min(effectiveLen + 1, 8);
119
+ const prevIdx = seen.get(hash);
120
+ if (!upgraded.has(prevIdx)) {
121
+ hashes[prevIdx] = computeLineHash(prevIdx, lines[prevIdx], longerLen);
122
+ upgraded.add(prevIdx);
123
+ }
101
124
  hashes[idx] = computeLineHash(idx, lines[idx], longerLen);
125
+ upgraded.add(idx);
102
126
  } else {
103
127
  seen.set(hash, idx);
104
128
  hashes[idx] = hash;
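The collision-handling change in this hunk can be isolated into a small sketch: when two lines collide at the short hash length, both are re-hashed one character longer, and the new `upgraded` set prevents re-upgrading the earlier line on later collisions. The hash function below is an assumed stand-in, not the package's `computeLineHash`:

```typescript
import { createHash } from "node:crypto";

// Assumed stand-in hash: index + content, truncated to `len` hex chars.
const h = (idx: number, line: string, len: number): string =>
  createHash("sha256").update(`${idx}\n${line}`).digest("hex").slice(0, len);

function hashLines(lines: string[], len = 2): string[] {
  const hashes = new Array<string>(lines.length);
  const seen = new Map<string, number>();
  const upgraded = new Set<number>();
  for (let idx = 0; idx < lines.length; idx++) {
    const hash = h(idx, lines[idx], len);
    if (seen.has(hash)) {
      const longer = Math.min(len + 1, 8);
      const prevIdx = seen.get(hash)!;
      if (!upgraded.has(prevIdx)) {
        // Upgrade the earlier colliding line too, so neither side of
        // the collision keeps the ambiguous short hash.
        hashes[prevIdx] = h(prevIdx, lines[prevIdx], longer);
        upgraded.add(prevIdx);
      }
      hashes[idx] = h(idx, lines[idx], longer);
      upgraded.add(idx);
    } else {
      seen.set(hash, idx);
      hashes[idx] = hash;
    }
  }
  return hashes;
}
```

Before this change, only the later of the two colliding lines was upgraded, leaving the earlier one holding the ambiguous short hash.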
@@ -132,7 +156,8 @@ function stripHashes(content, prefix) {
132
156
  function parseHashRef(ref) {
133
157
  const match = ref.match(/^(\d+):([0-9a-f]{2,8})$/);
134
158
  if (!match) {
135
- throw new Error(`Invalid hash reference: "${ref}". Expected format: "<line>:<2-8 char hex>"`);
159
+ const display = ref.length > 100 ? `${ref.slice(0, 100)}\u2026` : ref;
160
+ throw new Error(`Invalid hash reference: "${display}". Expected format: "<line>:<2-8 char hex>"`);
136
161
  }
137
162
  return {
138
163
  line: parseInt(match[1], 10),
@@ -149,8 +174,9 @@ function normalizeHashRef(ref) {
149
174
  if (annotated) {
150
175
  return `${parseInt(annotated[1], 10)}:${annotated[2].toLowerCase()}`;
151
176
  }
177
+ const display = ref.length > 100 ? `${ref.slice(0, 100)}\u2026` : ref;
152
178
  throw new Error(
153
- `Invalid hash reference: "${ref}". Expected "<line>:<hash>" or an annotated line like "#HL <line>:<hash>|..."`
179
+ `Invalid hash reference: "${display}". Expected "<line>:<hash>" or an annotated line like "#HL <line>:<hash>|..."`
154
180
  );
155
181
  }
156
182
  function buildHashMap(content, hashLen) {
@@ -336,10 +362,15 @@ var HashlineCache = class {
336
362
  return this.cache.size;
337
363
  }
338
364
  };
365
+ var globMatcherCache = /* @__PURE__ */ new Map();
339
366
  function matchesGlob(filePath, pattern) {
340
367
  const normalizedPath = filePath.replace(/\\/g, "/");
341
368
  const normalizedPattern = pattern.replace(/\\/g, "/");
342
- const isMatch = picomatch(normalizedPattern, { dot: true });
369
+ let isMatch = globMatcherCache.get(normalizedPattern);
370
+ if (!isMatch) {
371
+ isMatch = picomatch(normalizedPattern, { dot: true });
372
+ globMatcherCache.set(normalizedPattern, isMatch);
373
+ }
343
374
  return isMatch(normalizedPath);
344
375
  }
345
376
  function shouldExclude(filePath, patterns) {
@@ -4,7 +4,7 @@ import {
4
4
  resolveConfig,
5
5
  shouldExclude,
6
6
  stripHashes
7
- } from "./chunk-X4NVISKE.js";
7
+ } from "./chunk-I6RACR3D.js";
8
8
 
9
9
  // src/hooks.ts
10
10
  import { appendFileSync } from "fs";
@@ -20,7 +20,7 @@ import {
20
20
  shouldExclude,
21
21
  stripHashes,
22
22
  verifyHash
23
- } from "./chunk-X4NVISKE.js";
23
+ } from "./chunk-I6RACR3D.js";
24
24
  export {
25
25
  DEFAULT_CONFIG,
26
26
  DEFAULT_EXCLUDE_PATTERNS,
@@ -107,11 +107,18 @@ function formatFileWithHashes(content, hashLen, prefix) {
107
107
  const effectivePrefix = prefix === void 0 ? DEFAULT_PREFIX : prefix === false ? "" : prefix;
108
108
  const hashes = new Array(lines.length);
109
109
  const seen = /* @__PURE__ */ new Map();
110
+ const upgraded = /* @__PURE__ */ new Set();
110
111
  for (let idx = 0; idx < lines.length; idx++) {
111
112
  const hash = computeLineHash(idx, lines[idx], effectiveLen);
112
113
  if (seen.has(hash)) {
113
114
  const longerLen = Math.min(effectiveLen + 1, 8);
115
+ const prevIdx = seen.get(hash);
116
+ if (!upgraded.has(prevIdx)) {
117
+ hashes[prevIdx] = computeLineHash(prevIdx, lines[prevIdx], longerLen);
118
+ upgraded.add(prevIdx);
119
+ }
114
120
  hashes[idx] = computeLineHash(idx, lines[idx], longerLen);
121
+ upgraded.add(idx);
115
122
  } else {
116
123
  seen.set(hash, idx);
117
124
  hashes[idx] = hash;
@@ -144,7 +151,8 @@ function stripHashes(content, prefix) {
144
151
  function parseHashRef(ref) {
145
152
  const match = ref.match(/^(\d+):([0-9a-f]{2,8})$/);
146
153
  if (!match) {
147
- throw new Error(`Invalid hash reference: "${ref}". Expected format: "<line>:<2-8 char hex>"`);
154
+ const display = ref.length > 100 ? `${ref.slice(0, 100)}\u2026` : ref;
155
+ throw new Error(`Invalid hash reference: "${display}". Expected format: "<line>:<2-8 char hex>"`);
148
156
  }
149
157
  return {
150
158
  line: parseInt(match[1], 10),
@@ -161,8 +169,9 @@ function normalizeHashRef(ref) {
161
169
  if (annotated) {
162
170
  return `${parseInt(annotated[1], 10)}:${annotated[2].toLowerCase()}`;
163
171
  }
172
+ const display = ref.length > 100 ? `${ref.slice(0, 100)}\u2026` : ref;
164
173
  throw new Error(
165
- `Invalid hash reference: "${ref}". Expected "<line>:<hash>" or an annotated line like "#HL <line>:<hash>|..."`
174
+ `Invalid hash reference: "${display}". Expected "<line>:<hash>" or an annotated line like "#HL <line>:<hash>|..."`
166
175
  );
167
176
  }
168
177
  function buildHashMap(content, hashLen) {
@@ -292,7 +301,11 @@ function applyHashEdit(input, content, hashLen) {
292
301
  function matchesGlob(filePath, pattern) {
293
302
  const normalizedPath = filePath.replace(/\\/g, "/");
294
303
  const normalizedPattern = pattern.replace(/\\/g, "/");
295
- const isMatch = (0, import_picomatch.default)(normalizedPattern, { dot: true });
304
+ let isMatch = globMatcherCache.get(normalizedPattern);
305
+ if (!isMatch) {
306
+ isMatch = (0, import_picomatch.default)(normalizedPattern, { dot: true });
307
+ globMatcherCache.set(normalizedPattern, isMatch);
308
+ }
296
309
  return isMatch(normalizedPath);
297
310
  }
298
311
  function shouldExclude(filePath, patterns) {
@@ -355,7 +368,7 @@ function createHashline(config) {
355
368
  }
356
369
  };
357
370
  }
358
- var import_picomatch, DEFAULT_EXCLUDE_PATTERNS, DEFAULT_PREFIX, DEFAULT_CONFIG, modulusCache, stripRegexCache, HashlineCache, textEncoder;
371
+ var import_picomatch, DEFAULT_EXCLUDE_PATTERNS, DEFAULT_PREFIX, DEFAULT_CONFIG, modulusCache, stripRegexCache, HashlineCache, globMatcherCache, textEncoder;
359
372
  var init_hashline = __esm({
360
373
  "src/hashline.ts"() {
361
374
  "use strict";
@@ -388,7 +401,24 @@ var init_hashline = __esm({
388
401
  "**/*.exe",
389
402
  "**/*.dll",
390
403
  "**/*.so",
391
- "**/*.dylib"
404
+ "**/*.dylib",
405
+ // Sensitive credential and secret files
406
+ "**/.env",
407
+ "**/.env.*",
408
+ "**/*.pem",
409
+ "**/*.key",
410
+ "**/*.p12",
411
+ "**/*.pfx",
412
+ "**/id_rsa",
413
+ "**/id_rsa.pub",
414
+ "**/id_ed25519",
415
+ "**/id_ed25519.pub",
416
+ "**/id_ecdsa",
417
+ "**/id_ecdsa.pub",
418
+ "**/.npmrc",
419
+ "**/.netrc",
420
+ "**/credentials",
421
+ "**/credentials.json"
392
422
  ];
393
423
  DEFAULT_PREFIX = "#HL ";
394
424
  DEFAULT_CONFIG = {
@@ -462,6 +492,7 @@ var init_hashline = __esm({
462
492
  return this.cache.size;
463
493
  }
464
494
  };
495
+ globMatcherCache = /* @__PURE__ */ new Map();
465
496
  textEncoder = new TextEncoder();
466
497
  }
467
498
  });
@@ -705,23 +736,37 @@ function createHashlineEditTool(config, cache) {
705
736
  operation: import_zod.z.enum(["replace", "delete", "insert_before", "insert_after"]).describe("Edit operation"),
706
737
  startRef: import_zod.z.string().describe('Start hash reference, e.g. "5:a3f" or "#HL 5:a3f|const x = 1;"'),
707
738
  endRef: import_zod.z.string().optional().describe("End hash reference for range operations. Defaults to startRef when omitted."),
708
- replacement: import_zod.z.string().optional().describe("Replacement/inserted content. Required for replace/insert operations.")
739
+ replacement: import_zod.z.string().max(1e7).optional().describe("Replacement/inserted content. Required for replace/insert operations.")
709
740
  },
710
741
  async execute(args, context) {
711
742
  const { path, operation, startRef, endRef, replacement } = args;
712
743
  const absPath = (0, import_path2.isAbsolute)(path) ? path : (0, import_path2.resolve)(context.directory, path);
713
- let realAbs;
714
- try {
715
- realAbs = (0, import_fs2.realpathSync)(absPath);
716
- } catch {
717
- realAbs = (0, import_path2.resolve)(absPath);
718
- }
719
744
  const realDirectory = (0, import_fs2.realpathSync)((0, import_path2.resolve)(context.directory));
720
745
  const realWorktree = (0, import_fs2.realpathSync)((0, import_path2.resolve)(context.worktree));
721
746
  function isWithin(filePath, dir) {
722
747
  if (dir === import_path2.sep) return false;
748
+ if (process.platform === "win32") {
749
+ if (/^[A-Za-z]:\\$/.test(dir)) return false;
750
+ if (/^\\\\[^\\]+\\[^\\]+$/.test(dir)) return false;
751
+ }
723
752
  return filePath === dir || filePath.startsWith(dir + import_path2.sep);
724
753
  }
754
+ let realAbs;
755
+ try {
756
+ realAbs = (0, import_fs2.realpathSync)(absPath);
757
+ } catch {
758
+ const parentDir = (0, import_path2.dirname)(absPath);
759
+ let realParent;
760
+ try {
761
+ realParent = (0, import_fs2.realpathSync)(parentDir);
762
+ } catch {
763
+ throw new Error(`Access denied: cannot verify parent directory for "${path}"`);
764
+ }
765
+ if (!isWithin(realParent, realDirectory) && !isWithin(realParent, realWorktree)) {
766
+ throw new Error(`Access denied: "${path}" resolves outside the project directory`);
767
+ }
768
+ realAbs = (0, import_path2.resolve)(absPath);
769
+ }
725
770
  if (!isWithin(realAbs, realDirectory) && !isWithin(realAbs, realWorktree)) {
726
771
  throw new Error(`Access denied: "${path}" resolves outside the project directory`);
727
772
  }
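The containment check introduced in this hunk is worth illustrating: comparing against `dir + sep` avoids the classic prefix trap (`/projects` is not inside `/proj`), and filesystem roots are rejected outright so "contained in `/`" never passes. A minimal sketch of the same `isWithin` logic:

```typescript
import { sep } from "node:path";

// A path is "within" dir only if it equals dir or lives under
// dir + separator; roots (/, C:\, \\host\share) are rejected.
function isWithin(filePath: string, dir: string): boolean {
  if (dir === sep) return false;
  if (process.platform === "win32") {
    if (/^[A-Za-z]:\\$/.test(dir)) return false;        // drive root, e.g. C:\
    if (/^\\\\[^\\]+\\[^\\]+$/.test(dir)) return false; // UNC share root
  }
  return filePath === dir || filePath.startsWith(dir + sep);
}
```

The second part of the hunk extends this to not-yet-existing files: when `realpathSync` fails on the target, the parent directory is resolved and checked instead, so a symlinked parent cannot smuggle writes outside the project.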
@@ -734,6 +779,11 @@ function createHashlineEditTool(config, cache) {
734
779
  const reason = error instanceof Error ? error.message : String(error);
735
780
  throw new Error(`Failed to read "${displayPath}": ${reason}`);
736
781
  }
782
+ if (config.maxFileSize > 0 && getByteLength(current) > config.maxFileSize) {
783
+ throw new Error(
784
+ `File "${displayPath}" exceeds the configured maximum size (${config.maxFileSize} bytes)`
785
+ );
786
+ }
737
787
  let nextContent;
738
788
  let startLine;
739
789
  let endLine;
@@ -788,10 +838,40 @@ function createHashlineEditTool(config, cache) {
788
838
 
789
839
  // src/index.ts
790
840
  var CONFIG_FILENAME = "opencode-hashline.json";
841
+ function sanitizeConfig(raw) {
842
+ if (typeof raw !== "object" || raw === null || Array.isArray(raw)) return {};
843
+ const r = raw;
844
+ const result = {};
845
+ if (Array.isArray(r.exclude)) {
846
+ result.exclude = r.exclude.filter(
847
+ (p) => typeof p === "string" && p.length <= 512
848
+ );
849
+ }
850
+ if (typeof r.maxFileSize === "number" && Number.isFinite(r.maxFileSize) && r.maxFileSize >= 0) {
851
+ result.maxFileSize = r.maxFileSize;
852
+ }
853
+ if (typeof r.hashLength === "number" && Number.isFinite(r.hashLength)) {
854
+ result.hashLength = Math.max(0, Math.min(8, Math.floor(r.hashLength)));
855
+ }
856
+ if (typeof r.cacheSize === "number" && Number.isFinite(r.cacheSize) && r.cacheSize > 0) {
857
+ result.cacheSize = Math.min(Math.floor(r.cacheSize), 1e4);
858
+ }
859
+ if (r.prefix === false) {
860
+ result.prefix = false;
861
+ } else if (typeof r.prefix === "string") {
862
+ if (/^[\x20-\x7E]{0,20}$/.test(r.prefix)) {
863
+ result.prefix = r.prefix;
864
+ }
865
+ }
866
+ if (typeof r.debug === "boolean") {
867
+ result.debug = r.debug;
868
+ }
869
+ return result;
870
+ }
791
871
  function loadConfigFile(filePath) {
792
872
  try {
793
873
  const raw = (0, import_fs3.readFileSync)(filePath, "utf-8");
794
- return JSON.parse(raw);
874
+ return sanitizeConfig(JSON.parse(raw));
795
875
  } catch {
796
876
  return void 0;
797
877
  }
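The new `sanitizeConfig` guard clamps untrusted values loaded from `opencode-hashline.json` before they reach the plugin. A trimmed-down sketch of the same clamping rules, covering only `hashLength`, `cacheSize`, and `prefix` (the full version in the diff also handles `exclude`, `maxFileSize`, and `debug`):

```typescript
type Config = { hashLength?: number; cacheSize?: number; prefix?: string | false };

// Clamp untrusted JSON config: out-of-range numbers are clamped,
// non-printable or overlong prefixes are dropped entirely.
function sanitizeConfig(raw: unknown): Config {
  if (typeof raw !== "object" || raw === null || Array.isArray(raw)) return {};
  const r = raw as Record<string, unknown>;
  const result: Config = {};
  if (typeof r.hashLength === "number" && Number.isFinite(r.hashLength)) {
    result.hashLength = Math.max(0, Math.min(8, Math.floor(r.hashLength)));
  }
  if (typeof r.cacheSize === "number" && Number.isFinite(r.cacheSize) && r.cacheSize > 0) {
    result.cacheSize = Math.min(Math.floor(r.cacheSize), 1e4);
  }
  if (r.prefix === false) {
    result.prefix = false;
  } else if (typeof r.prefix === "string" && /^[\x20-\x7E]{0,20}$/.test(r.prefix)) {
    result.prefix = r.prefix; // printable ASCII, at most 20 chars
  }
  return result;
}
```

Unknown keys and malformed values simply vanish, so a crafted config file can no longer inject absurd cache sizes or control characters into the line-tag prefix.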
@@ -3,12 +3,13 @@ import {
3
3
  createFileReadAfterHook,
4
4
  createSystemPromptHook,
5
5
  setDebug
6
- } from "./chunk-MKNRMMMR.js";
6
+ } from "./chunk-VPCMHCTB.js";
7
7
  import {
8
8
  HashlineCache,
9
9
  applyHashEdit,
10
+ getByteLength,
10
11
  resolveConfig
11
- } from "./chunk-X4NVISKE.js";
12
+ } from "./chunk-I6RACR3D.js";
12
13
 
13
14
  // src/index.ts
14
15
  import { readFileSync as readFileSync2, realpathSync as realpathSync2, unlinkSync, writeFileSync as writeFileSync2 } from "fs";
@@ -18,7 +19,7 @@ import { fileURLToPath } from "url";
18
19
 
19
20
  // src/hashline-tool.ts
20
21
  import { readFileSync, realpathSync, writeFileSync } from "fs";
21
- import { isAbsolute, relative, resolve, sep } from "path";
22
+ import { dirname, isAbsolute, relative, resolve, sep } from "path";
22
23
  import { z } from "zod";
23
24
  function createHashlineEditTool(config, cache) {
24
25
  return {
@@ -28,23 +29,37 @@ function createHashlineEditTool(config, cache) {
28
29
  operation: z.enum(["replace", "delete", "insert_before", "insert_after"]).describe("Edit operation"),
29
30
  startRef: z.string().describe('Start hash reference, e.g. "5:a3f" or "#HL 5:a3f|const x = 1;"'),
30
31
  endRef: z.string().optional().describe("End hash reference for range operations. Defaults to startRef when omitted."),
31
- replacement: z.string().optional().describe("Replacement/inserted content. Required for replace/insert operations.")
32
+ replacement: z.string().max(1e7).optional().describe("Replacement/inserted content. Required for replace/insert operations.")
32
33
  },
33
34
  async execute(args, context) {
34
35
  const { path, operation, startRef, endRef, replacement } = args;
35
36
  const absPath = isAbsolute(path) ? path : resolve(context.directory, path);
36
- let realAbs;
37
- try {
38
- realAbs = realpathSync(absPath);
39
- } catch {
40
- realAbs = resolve(absPath);
41
- }
42
37
  const realDirectory = realpathSync(resolve(context.directory));
43
38
  const realWorktree = realpathSync(resolve(context.worktree));
44
39
  function isWithin(filePath, dir) {
45
40
  if (dir === sep) return false;
41
+ if (process.platform === "win32") {
42
+ if (/^[A-Za-z]:\\$/.test(dir)) return false;
43
+ if (/^\\\\[^\\]+\\[^\\]+$/.test(dir)) return false;
44
+ }
46
45
  return filePath === dir || filePath.startsWith(dir + sep);
47
46
  }
47
+ let realAbs;
48
+ try {
49
+ realAbs = realpathSync(absPath);
50
+ } catch {
51
+ const parentDir = dirname(absPath);
52
+ let realParent;
53
+ try {
54
+ realParent = realpathSync(parentDir);
55
+ } catch {
56
+ throw new Error(`Access denied: cannot verify parent directory for "${path}"`);
57
+ }
58
+ if (!isWithin(realParent, realDirectory) && !isWithin(realParent, realWorktree)) {
59
+ throw new Error(`Access denied: "${path}" resolves outside the project directory`);
60
+ }
61
+ realAbs = resolve(absPath);
62
+ }
48
63
  if (!isWithin(realAbs, realDirectory) && !isWithin(realAbs, realWorktree)) {
49
64
  throw new Error(`Access denied: "${path}" resolves outside the project directory`);
50
65
  }
@@ -57,6 +72,11 @@ function createHashlineEditTool(config, cache) {
57
72
  const reason = error instanceof Error ? error.message : String(error);
58
73
  throw new Error(`Failed to read "${displayPath}": ${reason}`);
59
74
  }
75
+ if (config.maxFileSize > 0 && getByteLength(current) > config.maxFileSize) {
76
+ throw new Error(
77
+ `File "${displayPath}" exceeds the configured maximum size (${config.maxFileSize} bytes)`
78
+ );
79
+ }
60
80
  let nextContent;
61
81
  let startLine;
62
82
  let endLine;
@@ -111,10 +131,40 @@ function createHashlineEditTool(config, cache) {
111
131
 
112
132
  // src/index.ts
113
133
  var CONFIG_FILENAME = "opencode-hashline.json";
134
+ function sanitizeConfig(raw) {
135
+ if (typeof raw !== "object" || raw === null || Array.isArray(raw)) return {};
136
+ const r = raw;
137
+ const result = {};
138
+ if (Array.isArray(r.exclude)) {
139
+ result.exclude = r.exclude.filter(
140
+ (p) => typeof p === "string" && p.length <= 512
141
+ );
142
+ }
143
+ if (typeof r.maxFileSize === "number" && Number.isFinite(r.maxFileSize) && r.maxFileSize >= 0) {
144
+ result.maxFileSize = r.maxFileSize;
145
+ }
146
+ if (typeof r.hashLength === "number" && Number.isFinite(r.hashLength)) {
147
+ result.hashLength = Math.max(0, Math.min(8, Math.floor(r.hashLength)));
148
+ }
149
+ if (typeof r.cacheSize === "number" && Number.isFinite(r.cacheSize) && r.cacheSize > 0) {
150
+ result.cacheSize = Math.min(Math.floor(r.cacheSize), 1e4);
151
+ }
152
+ if (r.prefix === false) {
153
+ result.prefix = false;
154
+ } else if (typeof r.prefix === "string") {
155
+ if (/^[\x20-\x7E]{0,20}$/.test(r.prefix)) {
156
+ result.prefix = r.prefix;
157
+ }
158
+ }
159
+ if (typeof r.debug === "boolean") {
160
+ result.debug = r.debug;
161
+ }
162
+ return result;
163
+ }
114
164
  function loadConfigFile(filePath) {
115
165
  try {
116
166
  const raw = readFileSync2(filePath, "utf-8");
117
- return JSON.parse(raw);
167
+ return sanitizeConfig(JSON.parse(raw));
118
168
  } catch {
119
169
  return void 0;
120
170
  }
@@ -172,7 +222,7 @@ function createHashlinePlugin(userConfig) {
172
222
  const out = output;
173
223
  const hashLen = config.hashLength || 0;
174
224
  const prefix = config.prefix;
175
- const { formatFileWithHashes, shouldExclude, getByteLength } = await import("./hashline-6YDKBNND.js");
225
+ const { formatFileWithHashes, shouldExclude, getByteLength: getByteLength2 } = await import("./hashline-5PFAXY3H.js");
176
226
  for (const p of out.parts ?? []) {
177
227
  if (p.type !== "file") continue;
178
228
  if (!p.url || !p.mime?.startsWith("text/")) continue;
@@ -199,7 +249,7 @@ function createHashlinePlugin(userConfig) {
199
249
  } catch {
200
250
  continue;
201
251
  }
202
- if (config.maxFileSize > 0 && getByteLength(content) > config.maxFileSize) continue;
252
+ if (config.maxFileSize > 0 && getByteLength2(content) > config.maxFileSize) continue;
203
253
  const cached = cache.get(filePath, content);
204
254
  if (cached) {
205
255
  const tmpPath2 = join(tmpdir(), `hashline-${p.id}.txt`);
package/dist/utils.cjs CHANGED
@@ -87,7 +87,24 @@ var DEFAULT_EXCLUDE_PATTERNS = [
87
87
  "**/*.exe",
88
88
  "**/*.dll",
89
89
  "**/*.so",
90
- "**/*.dylib"
90
+ "**/*.dylib",
91
+ // Sensitive credential and secret files
92
+ "**/.env",
93
+ "**/.env.*",
94
+ "**/*.pem",
95
+ "**/*.key",
96
+ "**/*.p12",
97
+ "**/*.pfx",
98
+ "**/id_rsa",
99
+ "**/id_rsa.pub",
100
+ "**/id_ed25519",
101
+ "**/id_ed25519.pub",
102
+ "**/id_ecdsa",
103
+ "**/id_ecdsa.pub",
104
+ "**/.npmrc",
105
+ "**/.netrc",
106
+ "**/credentials",
107
+ "**/credentials.json"
91
108
  ];
92
109
  var DEFAULT_PREFIX = "#HL ";
93
110
  var DEFAULT_CONFIG = {
@@ -153,11 +170,18 @@ function formatFileWithHashes(content, hashLen, prefix) {
153
170
  const effectivePrefix = prefix === void 0 ? DEFAULT_PREFIX : prefix === false ? "" : prefix;
154
171
  const hashes = new Array(lines.length);
155
172
  const seen = /* @__PURE__ */ new Map();
173
+ const upgraded = /* @__PURE__ */ new Set();
156
174
  for (let idx = 0; idx < lines.length; idx++) {
157
175
  const hash = computeLineHash(idx, lines[idx], effectiveLen);
158
176
  if (seen.has(hash)) {
159
177
  const longerLen = Math.min(effectiveLen + 1, 8);
178
+ const prevIdx = seen.get(hash);
179
+ if (!upgraded.has(prevIdx)) {
180
+ hashes[prevIdx] = computeLineHash(prevIdx, lines[prevIdx], longerLen);
181
+ upgraded.add(prevIdx);
182
+ }
160
183
  hashes[idx] = computeLineHash(idx, lines[idx], longerLen);
184
+ upgraded.add(idx);
161
185
  } else {
162
186
  seen.set(hash, idx);
163
187
  hashes[idx] = hash;
@@ -191,7 +215,8 @@ function stripHashes(content, prefix) {
191
215
  function parseHashRef(ref) {
192
216
  const match = ref.match(/^(\d+):([0-9a-f]{2,8})$/);
193
217
  if (!match) {
194
- throw new Error(`Invalid hash reference: "${ref}". Expected format: "<line>:<2-8 char hex>"`);
218
+ const display = ref.length > 100 ? `${ref.slice(0, 100)}\u2026` : ref;
219
+ throw new Error(`Invalid hash reference: "${display}". Expected format: "<line>:<2-8 char hex>"`);
195
220
  }
196
221
  return {
197
222
  line: parseInt(match[1], 10),
@@ -208,8 +233,9 @@ function normalizeHashRef(ref) {
208
233
  if (annotated) {
209
234
  return `${parseInt(annotated[1], 10)}:${annotated[2].toLowerCase()}`;
210
235
  }
236
+ const display = ref.length > 100 ? `${ref.slice(0, 100)}\u2026` : ref;
211
237
  throw new Error(
212
- `Invalid hash reference: "${ref}". Expected "<line>:<hash>" or an annotated line like "#HL <line>:<hash>|..."`
238
+ `Invalid hash reference: "${display}". Expected "<line>:<hash>" or an annotated line like "#HL <line>:<hash>|..."`
213
239
  );
214
240
  }
215
241
  function buildHashMap(content, hashLen) {
@@ -395,10 +421,15 @@ var HashlineCache = class {
395
421
  return this.cache.size;
396
422
  }
397
423
  };
424
+ var globMatcherCache = /* @__PURE__ */ new Map();
398
425
  function matchesGlob(filePath, pattern) {
399
426
  const normalizedPath = filePath.replace(/\\/g, "/");
400
427
  const normalizedPattern = pattern.replace(/\\/g, "/");
401
- const isMatch = (0, import_picomatch.default)(normalizedPattern, { dot: true });
428
+ let isMatch = globMatcherCache.get(normalizedPattern);
429
+ if (!isMatch) {
430
+ isMatch = (0, import_picomatch.default)(normalizedPattern, { dot: true });
431
+ globMatcherCache.set(normalizedPattern, isMatch);
432
+ }
402
433
  return isMatch(normalizedPath);
403
434
  }
404
435
  function shouldExclude(filePath, patterns) {
package/dist/utils.js CHANGED
@@ -3,7 +3,7 @@ import {
3
3
  createFileReadAfterHook,
4
4
  createSystemPromptHook,
5
5
  isFileReadTool
6
- } from "./chunk-MKNRMMMR.js";
6
+ } from "./chunk-VPCMHCTB.js";
7
7
  import {
8
8
  DEFAULT_CONFIG,
9
9
  DEFAULT_EXCLUDE_PATTERNS,
@@ -25,7 +25,7 @@ import {
25
25
  shouldExclude,
26
26
  stripHashes,
27
27
  verifyHash
28
- } from "./chunk-X4NVISKE.js";
28
+ } from "./chunk-I6RACR3D.js";
29
29
  export {
30
30
  DEFAULT_CONFIG,
31
31
  DEFAULT_EXCLUDE_PATTERNS,
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "opencode-hashline",
3
- "version": "1.1.2",
3
+ "version": "1.2.0",
4
4
  "description": "Hashline plugin for OpenCode — content-addressable line hashing for precise AI code editing",
5
5
  "main": "dist/opencode-hashline.cjs",
6
6
  "module": "dist/opencode-hashline.js",