@ranger1/dx 0.1.32 → 0.1.34

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -31,8 +31,8 @@ tools:
 - findings id must start with `CLD-`
 
 ## Cache conventions (mandatory)
-- All intermediate files in this workflow are stored in `~/.opencode/cache/`
-- Agents/commands hand off only file names (basename), never directories
+
+- The cache directory is fixed at `./.cache/`; handoffs always pass `./.cache/<file>` (a repo-relative path); basename-only references (e.g. `foo.md`) are forbidden.
 
 ## reviewFile format (mandatory)
 
@@ -32,9 +32,8 @@ tools:
 - findings id must start with `CDX-`
 
 ## Cache conventions (mandatory)
-- All intermediate files in this workflow are stored in `~/.opencode/cache/`
-- Agents/commands hand off only file names (basename), never directories
 
+- The cache directory is fixed at `./.cache/`; handoffs always pass `./.cache/<file>` (a repo-relative path); basename-only references (e.g. `foo.md`) are forbidden.
 
 ## reviewFile format (mandatory)
 
@@ -31,9 +31,8 @@ tools:
 - findings id must start with `GMN-`
 
 ## Cache conventions (mandatory)
-- All intermediate files in this workflow are stored in `~/.opencode/cache/`
-- Agents/commands hand off only file names (basename), never directories
 
+- The cache directory is fixed at `./.cache/`; handoffs always pass `./.cache/<file>` (a repo-relative path); basename-only references (e.g. `foo.md`) are forbidden.
 
 ## reviewFile format (mandatory)
 
@@ -20,11 +20,11 @@ tools:
 
 ## Output (mandatory)
 
-The script writes to `~/.opencode/cache/`; stdout emits exactly one JSON object (parseable with `JSON.parse()`).
+The script writes into the project-local `./.cache/`; stdout emits exactly one JSON object (parseable with `JSON.parse()`).
 
 ## Cache conventions (mandatory)
 
-
+- The cache directory is fixed at `./.cache/`; handoffs always pass `./.cache/<file>` (a repo-relative path); basename-only references (e.g. `foo.md`) are forbidden.
 
 ## Invoking the script (mandatory)
 
@@ -44,3 +44,27 @@ python3 ~/.opencode/agents/pr_context.py --pr <PR_NUMBER> --round <ROUND>
 - **On failure/exception**:
   - If the script's stdout already contains valid JSON (with `error` or other fields) → still **return that JSON verbatim**.
   - If the script emitted no valid JSON / exited abnormally → output a single line of JSON: `{"error":"PR_CONTEXT_AGENT_FAILED"}` (add a `detail` field if needed).
+
+## GitHub authentication check (important)
+
+The script verifies that the GitHub CLI is authenticated before calling `gh repo view`/`gh pr view`.
+
+- To keep `gh auth status` from misreporting when some other host (e.g. an enterprise instance) has broken auth, the script first infers the host from `git remote origin` and uses:
+  - `gh auth status --hostname <host>`
+  - If inference fails, it defaults to `github.com`.
+
+Possible errors:
+
+- `{"error":"GH_CLI_NOT_FOUND"}`: the `gh` command was not found (not installed / not executable on PATH)
+  - Fix: install the GitHub CLI: https://cli.github.com/
+- `{"error":"GH_NOT_AUTHENTICATED"}`: the current repo's host is not authenticated
+  - Fix: `gh auth login --hostname <host>`
+
+Local troubleshooting commands (run in the same shell environment):
+
+```bash
+git remote get-url origin
+gh auth status
+gh auth status --hostname github.com
+env | grep '^GH_'
+```
@@ -26,13 +26,13 @@ tools:
 ## Prerequisites
 
 ### Cache conventions (mandatory)
-- All intermediate files in this workflow are stored in `~/.opencode/cache/`
-- Agents/commands hand off only file names (basename), never directories
+
+- The cache directory is fixed at `./.cache/`; handoffs always pass `./.cache/<file>` (a repo-relative path); basename-only references (e.g. `foo.md`) are forbidden.
 
 ### Required inputs
 
 - **PR number**: the caller must state it explicitly in the prompt (e.g. `please fix PR #123`)
-- **fixFile**: the caller must provide the issue-list file name (basename) in the prompt (Structured Handoff)
+- **fixFile**: the caller must provide the issue-list file path in the prompt (repo-relative, e.g. `./.cache/fix-...md`) (Structured Handoff)
 
 ### Fail-fast exit
 
@@ -10,9 +10,8 @@ tools:
 # PR Precheck
 
 ## Cache conventions (mandatory)
-- All intermediate files in this workflow are stored in `~/.opencode/cache/`
-- Agents/commands hand off only file names (basename), never directories
 
+- The cache directory is fixed at `./.cache/`; handoffs always pass `./.cache/<file>` (a repo-relative path); basename-only references (e.g. `foo.md`) are forbidden.
 
 ## Input (the prompt must include)
 
@@ -38,6 +37,30 @@ python3 ~/.opencode/agents/pr_precheck.py <PR_NUMBER>
   - If the script's stdout already contains valid JSON (with `error` or other fields) → still **return that JSON verbatim**.
   - If the script emitted no valid JSON / exited abnormally → output a single line of JSON: `{"error":"PR_PRECHECK_AGENT_FAILED"}` (add a `detail` field if needed).
 
+## GitHub authentication check (important)
+
+The script verifies that the GitHub CLI is authenticated before running `gh pr view`/`checkout`.
+
+- To keep `gh auth status` from misreporting when some other host (e.g. an enterprise instance) has broken auth, the script first infers the host from `git remote origin` and uses:
+  - `gh auth status --hostname <host>`
+  - If inference fails, it defaults to `github.com`.
+
+Possible errors:
+
+- `{"error":"GH_CLI_NOT_FOUND"}`: the `gh` command was not found (not installed / not executable on PATH)
+  - Fix: install the GitHub CLI: https://cli.github.com/
+- `{"error":"GH_NOT_AUTHENTICATED"}`: the current repo's host is not authenticated
+  - Fix: `gh auth login --hostname <host>`
+
+Local troubleshooting commands (run in the same shell environment):
+
+```bash
+git remote get-url origin
+gh auth status
+gh auth status --hostname github.com
+env | grep '^GH_'
+```
+
 ## What to do only when merge conflicts occur
 
 When the script outputs `{"error":"PR_MERGE_CONFLICTS_UNRESOLVED"}`:
@@ -10,8 +10,8 @@ tools:
 # PR Review Aggregator
 
 ## Cache conventions (mandatory)
-- All intermediate files in this workflow are stored in `~/.opencode/cache/`
-- Agents/commands hand off only file names (basename), never directories
+
+- The cache directory is fixed at `./.cache/`; handoffs always pass `./.cache/<file>` (a repo-relative path); basename-only references (e.g. `foo.md`) are forbidden.
 
 ## Input (two modes)
 
@@ -2,19 +2,52 @@
 # PR context builder (deterministic).
 # - Reads PR metadata + recent comments via gh
 # - Reads changed files via git diff (no patch)
-# - Writes Markdown context file to ~/.opencode/cache/
+# - Writes Markdown context file to project cache: ./.cache/
 # - Prints exactly one JSON object to stdout
 
 import argparse
 import hashlib
 import json
 import os
+import re
 import subprocess
 import sys
+from urllib.parse import urlparse
 from pathlib import Path
 
 
-CACHE_DIR = Path.home() / ".opencode" / "cache"
+def _repo_root():
+    # Prefer git top-level so the cache follows the current repo.
+    try:
+        p = subprocess.run(
+            ["git", "rev-parse", "--show-toplevel"],
+            stdout=subprocess.PIPE,
+            stderr=subprocess.DEVNULL,
+            text=True,
+        )
+        out = (p.stdout or "").strip()
+        if p.returncode == 0 and out:
+            return Path(out)
+    except Exception:
+        pass
+    return Path.cwd()
+
+
+def _cache_dir(repo_root):
+    return (repo_root / ".cache").resolve()
+
+
+def _repo_relpath(repo_root, p):
+    try:
+        rel = p.resolve().relative_to(repo_root.resolve())
+        return "./" + rel.as_posix()
+    except Exception:
+        # Fallback to basename-only.
+        return os.path.basename(str(p))
+
+
+REPO_ROOT = _repo_root()
+CACHE_DIR = _cache_dir(REPO_ROOT)
 MARKER_SUBSTR = "<!-- pr-review-loop-marker"
 
 
@@ -24,8 +57,41 @@ def _json_out(obj):
 
 
 def _run_capture(cmd):
-    p = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
-    return p.returncode, p.stdout, p.stderr
+    try:
+        p = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
+        return p.returncode, p.stdout, p.stderr
+    except FileNotFoundError as e:
+        return 127, "", str(e)
+
+
+def _detect_git_remote_host():
+    # Best-effort parse from origin remote.
+    rc, origin_url, _ = _run_capture(["git", "remote", "get-url", "origin"])
+    if rc != 0:
+        rc, origin_url, _ = _run_capture(["git", "config", "--get", "remote.origin.url"])
+    if rc != 0:
+        return None
+
+    url = (origin_url or "").strip()
+    if not url:
+        return None
+
+    # Examples:
+    # - git@github.com:owner/repo.git
+    # - ssh://git@github.company.com/owner/repo.git
+    # - https://github.com/owner/repo.git
+    if url.startswith("git@"):  # SCP-like syntax
+        m = re.match(r"^git@([^:]+):", url)
+        return m.group(1) if m else None
+
+    if url.startswith("ssh://") or url.startswith("https://") or url.startswith("http://"):
+        try:
+            parsed = urlparse(url)
+            return parsed.hostname
+        except Exception:
+            return None
+
+    return None
 
 
 def _clip(s, n):
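The host-inference rules added above (SCP-style `git@host:...`, `ssh://`, and HTTP(S) remotes) can be checked in isolation. A standalone sketch of the same parsing logic, not the package's actual module:

```python
import re
from urllib.parse import urlparse

def detect_host(url):
    # Mirrors the diff's three cases: SCP-like, ssh://, and http(s):// remotes.
    url = url.strip()
    if url.startswith("git@"):  # SCP-like syntax: git@host:owner/repo.git
        m = re.match(r"^git@([^:]+):", url)
        return m.group(1) if m else None
    if url.startswith(("ssh://", "https://", "http://")):
        return urlparse(url).hostname
    return None

print(detect_host("git@github.com:owner/repo.git"))                # github.com
print(detect_host("ssh://git@github.company.com/owner/repo.git"))  # github.company.com
print(detect_host("https://github.com/owner/repo.git"))            # github.com
```

Note that `urlparse(...).hostname` already strips the `git@` userinfo from `ssh://` URLs, which is why no extra regex is needed for those forms.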
@@ -95,9 +161,31 @@ def main(argv):
         _json_out({"error": "NOT_A_GIT_REPO"})
         return 1
 
-    if subprocess.run(["gh", "auth", "status"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL).returncode != 0:
-        _json_out({"error": "GH_NOT_AUTHENTICATED"})
+    host = _detect_git_remote_host() or "github.com"
+    rc, gh_out, gh_err = _run_capture(["gh", "auth", "status", "--hostname", host])
+    if rc == 127:
+        _json_out({
+            "error": "GH_CLI_NOT_FOUND",
+            "detail": "gh not found in PATH",
+            "suggestion": "Install GitHub CLI: https://cli.github.com/",
+        })
         return 1
+    if rc != 0:
+        # If host detection is wrong, a global check might still succeed.
+        rc2, gh_out2, gh_err2 = _run_capture(["gh", "auth", "status"])
+        if rc2 == 0:
+            pass
+        else:
+            detail = (gh_err or gh_out or "").strip()
+            if len(detail) > 4000:
+                detail = detail[-4000:]
+            _json_out({
+                "error": "GH_NOT_AUTHENTICATED",
+                "host": host,
+                "detail": detail,
+                "suggestion": f"Run: gh auth login --hostname {host}",
+            })
+            return 1
 
     rc, owner_repo, _ = _run_capture(["gh", "repo", "view", "--json", "nameWithOwner", "--jq", ".nameWithOwner"])
     owner_repo = owner_repo.strip() if rc == 0 else ""
@@ -195,7 +283,8 @@ def main(argv):
             "repo": {"nameWithOwner": owner_repo},
             "headOid": head_oid,
             "existingMarkerCount": marker_count,
-            "contextFile": context_file,
+            # Handoff should be repo-relative path so downstream agents can read it directly.
+            "contextFile": _repo_relpath(REPO_ROOT, context_path),
         }
     )
     return 0
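Downstream agents rely on the script printing exactly one JSON object on stdout. A minimal consumer sketch showing how the success and failure shapes above could be handled; the sample payloads are illustrative, not real script output:

```python
import json

def handle(stdout_line: str) -> str:
    # One JSON object per the stdout contract; route on the "error" field.
    obj = json.loads(stdout_line)
    if "error" in obj:
        return obj.get("suggestion", obj["error"])
    return obj["contextFile"]

ok = '{"headOid":"abc123","existingMarkerCount":0,"contextFile":"./.cache/ctx.md"}'
bad = '{"error":"GH_NOT_AUTHENTICATED","host":"github.com","suggestion":"Run: gh auth login --hostname github.com"}'
print(handle(ok))   # ./.cache/ctx.md
print(handle(bad))  # Run: gh auth login --hostname github.com
```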
@@ -8,20 +8,31 @@
 # - If mergeable == CONFLICTING: return {"error":"PR_MERGE_CONFLICTS_UNRESOLVED"}
 # - Run dx cache clear
 # - Run dx lint and dx build all concurrently
-# - On failure, write fixFile to ~/.opencode/cache/ and return {"ok":false,"fixFile":"..."}
+# - On failure, write fixFile to project cache: ./.cache/
+#   and return {"ok":false,"fixFile":"./.cache/..."}
 # - On success, return {"ok":true}
 #
 # Stdout contract: print exactly one JSON object and nothing else.
 
 import json
+import os
 import re
 import secrets
 import subprocess
 import sys
+from urllib.parse import urlparse
 from pathlib import Path
 
 
 def run(cmd, *, cwd=None, stdout_path=None, stderr_path=None):
+    try:
+        return _run(cmd, cwd=cwd, stdout_path=stdout_path, stderr_path=stderr_path)
+    except FileNotFoundError as e:
+        # Match common shell semantics for "command not found".
+        return 127
+
+
+def _run(cmd, *, cwd=None, stdout_path=None, stderr_path=None):
     if stdout_path and stderr_path and stdout_path == stderr_path:
         with open(stdout_path, "wb") as f:
             p = subprocess.run(cmd, cwd=cwd, stdout=f, stderr=f)
@@ -45,8 +56,70 @@ def run(cmd, *, cwd=None, stdout_path=None, stderr_path=None):
 
 
 def run_capture(cmd, *, cwd=None):
-    p = subprocess.run(cmd, cwd=cwd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
-    return p.returncode, p.stdout, p.stderr
+    try:
+        p = subprocess.run(cmd, cwd=cwd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
+        return p.returncode, p.stdout, p.stderr
+    except FileNotFoundError as e:
+        return 127, "", str(e)
+
+
+def _detect_git_remote_host():
+    # Best-effort parse from origin remote.
+    rc, origin_url, _ = run_capture(["git", "remote", "get-url", "origin"])
+    if rc != 0:
+        rc, origin_url, _ = run_capture(["git", "config", "--get", "remote.origin.url"])
+    if rc != 0:
+        return None
+
+    url = (origin_url or "").strip()
+    if not url:
+        return None
+
+    # Examples:
+    # - git@github.com:owner/repo.git
+    # - ssh://git@github.company.com/owner/repo.git
+    # - https://github.com/owner/repo.git
+    if url.startswith("git@"):  # SCP-like syntax
+        # git@host:owner/repo(.git)
+        m = re.match(r"^git@([^:]+):", url)
+        return m.group(1) if m else None
+
+    if url.startswith("ssh://") or url.startswith("https://") or url.startswith("http://"):
+        try:
+            parsed = urlparse(url)
+            return parsed.hostname
+        except Exception:
+            return None
+
+    return None
+
+
+def repo_root():
+    try:
+        p = subprocess.run(
+            ["git", "rev-parse", "--show-toplevel"],
+            stdout=subprocess.PIPE,
+            stderr=subprocess.DEVNULL,
+            text=True,
+        )
+        out = (p.stdout or "").strip()
+        if p.returncode == 0 and out:
+            return Path(out)
+    except Exception:
+        pass
+    return Path.cwd()
+
+
+def cache_dir(repo_root_path):
+    return (repo_root_path / ".cache").resolve()
+
+
+def repo_relpath(repo_root_path, p):
+    try:
+        rel = p.resolve().relative_to(repo_root_path.resolve())
+        return "./" + rel.as_posix()
+    except Exception:
+        return str(p)
 
 
 def tail_text(path, max_lines=200, max_chars=12000):
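The `repo_relpath` helper introduced above reduces to a `Path.relative_to` call with a `"./"` prefix. A small standalone sketch with the same shape (the example paths are hypothetical):

```python
from pathlib import Path

def repo_relpath(repo_root, p):
    # "./"-prefixed repo-relative path; falls back to the resolved path
    # when it lies outside the repo root, like the diff's helper.
    root = Path(repo_root).resolve()
    path = Path(p).resolve()
    try:
        return "./" + path.relative_to(root).as_posix()
    except ValueError:
        return str(path)

print(repo_relpath("/repo", "/repo/.cache/fix-pr1.md"))  # ./.cache/fix-pr1.md
print(repo_relpath("/repo", "/repo-other/x.md"))         # /repo-other/x.md
```

This is why `fixFile` and the log fields in the precheck output come back as `./.cache/...` paths that downstream agents can open directly from the repo root.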
@@ -101,9 +174,25 @@ def main():
         print(json.dumps({"error": "NOT_A_GIT_REPO"}))
         return 1
 
-    rc = run(["gh", "auth", "status"])
+    host = _detect_git_remote_host() or "github.com"
+    rc, gh_out, gh_err = run_capture(["gh", "auth", "status", "--hostname", host])
+    if rc == 127:
+        print(json.dumps({
+            "error": "GH_CLI_NOT_FOUND",
+            "detail": "gh not found in PATH",
+            "suggestion": "Install GitHub CLI: https://cli.github.com/",
+        }))
+        return 1
     if rc != 0:
-        print(json.dumps({"error": "GH_NOT_AUTHENTICATED"}))
+        detail = (gh_err or gh_out or "").strip()
+        if len(detail) > 4000:
+            detail = detail[-4000:]
+        print(json.dumps({
+            "error": "GH_NOT_AUTHENTICATED",
+            "host": host,
+            "detail": detail,
+            "suggestion": f"Run: gh auth login --hostname {host}",
+        }))
         return 1
 
     rc, pr_json, _ = run_capture(["gh", "pr", "view", pr, "--json", "headRefName,baseRefName,mergeable"])
@@ -155,7 +244,8 @@ def main():
         return 1
 
     run_id = secrets.token_hex(4)
-    cache = Path.home() / ".opencode" / "cache"
+    root = repo_root()
+    cache = cache_dir(root)
     cache.mkdir(parents=True, exist_ok=True)
 
     cache_clear_log = cache / f"precheck-pr{pr}-{run_id}-cache-clear.log"
@@ -168,9 +258,9 @@ def main():
         "headRefName": head,
         "baseRefName": base,
         "mergeable": mergeable,
-        "cacheClearLog": str(cache_clear_log),
-        "lintLog": str(lint_log),
-        "buildLog": str(build_log),
+        "cacheClearLog": repo_relpath(root, cache_clear_log),
+        "lintLog": repo_relpath(root, lint_log),
+        "buildLog": repo_relpath(root, build_log),
     }, indent=2) + "\n")
 
     cache_rc = run(["dx", "cache", "clear"], stdout_path=str(cache_clear_log), stderr_path=str(cache_clear_log))
@@ -186,10 +276,10 @@ def main():
             "line": None,
             "title": "dx cache clear failed",
             "description": log_tail,
-            "suggestion": f"Open log: {cache_clear_log}",
+            "suggestion": f"Open log: {repo_relpath(root, cache_clear_log)}",
         }]
         write_fixfile(str(fix_path), issues)
-        print(json.dumps({"ok": False, "fixFile": fix_file}))
+        print(json.dumps({"ok": False, "fixFile": repo_relpath(root, fix_path)}))
         return 1
 
     import threading
@@ -226,7 +316,7 @@ def main():
                 "line": line,
                 "title": "dx lint failed",
                 "description": log_tail,
-                "suggestion": f"Open log: {lint_log}",
+                "suggestion": f"Open log: {repo_relpath(root, lint_log)}",
             })
             i += 1
     if results.get("build", 1) != 0:
@@ -240,11 +330,11 @@ def main():
             "line": line,
             "title": "dx build all failed",
             "description": log_tail,
-            "suggestion": f"Open log: {build_log}",
+            "suggestion": f"Open log: {repo_relpath(root, build_log)}",
         })
 
     write_fixfile(str(fix_path), issues)
-    print(json.dumps({"ok": False, "fixFile": fix_file}))
+    print(json.dumps({"ok": False, "fixFile": repo_relpath(root, fix_path)}))
     return 1
 
 
@@ -2,12 +2,12 @@
 # Deterministic PR review aggregation (script owns all rules).
 #
 # Workflow:
-# - Mode A: read contextFile + reviewFile(s) from ~/.opencode/cache/, parse findings, merge duplicates,
+# - Mode A: read contextFile + reviewFile(s) from project cache: ./.cache/, parse findings, merge duplicates,
 #   post a single PR comment, and optionally generate a fixFile for pr-fix.
 # - Mode B: read fixReportFile from cache and post it as a PR comment.
 #
 # Input rules:
-# - Callers pass only basenames (no paths). This script reads/writes under ~/.opencode/cache/.
+# - Callers should pass repo-relative paths (e.g. ./.cache/foo.md). For backward-compat, basenames are also accepted.
 # - Duplicate groups come from LLM but are passed as an argument (NOT written to disk).
 #   - Prefer: --duplicate-groups-b64 <base64(json)>
 #   - Also supported: --duplicate-groups-json '<json>'
@@ -20,7 +20,7 @@
 #
 # PR comment rules:
 # - Every comment must include marker: <!-- pr-review-loop-marker -->
-# - Comment body must NOT contain local filesystem paths (this script scrubs ~/.opencode/cache and $HOME).
+# - Comment body must NOT contain local filesystem paths (this script scrubs cache paths, $HOME, and repo absolute paths).
 #
 # fixFile rules:
 # - fixFile includes ONLY P0/P1/P2 findings.
@@ -38,7 +38,76 @@ from pathlib import Path
 
 
 MARKER = "<!-- pr-review-loop-marker -->"
-CACHE_DIR = Path.home() / ".opencode" / "cache"
+
+
+def _repo_root():
+    try:
+        p = subprocess.run(
+            ["git", "rev-parse", "--show-toplevel"],
+            stdout=subprocess.PIPE,
+            stderr=subprocess.DEVNULL,
+            text=True,
+        )
+        out = (p.stdout or "").strip()
+        if p.returncode == 0 and out:
+            return Path(out)
+    except Exception:
+        pass
+    return Path.cwd()
+
+
+def _cache_dir(repo_root):
+    return (repo_root / ".cache").resolve()
+
+
+def _is_safe_relpath(p):
+    if p.is_absolute():
+        return False
+    if any(part in ("..",) for part in p.parts):
+        return False
+    return True
+
+
+def _resolve_ref(repo_root, cache_dir, ref):
+    if not ref:
+        return None
+    s = str(ref).strip()
+    if not s:
+        return None
+
+    # If caller already passes a repo-relative path like ./.cache/foo.md
+    looks_like_path = ("/" in s) or ("\\" in s) or s.startswith(".")
+    if looks_like_path:
+        p = Path(s)
+        if p.is_absolute():
+            # Only allow absolute paths under cache_dir.
+            try:
+                p2 = p.resolve()
+                p2.relative_to(cache_dir.resolve())
+                return p2
+            except Exception:
+                return None
+        if not _is_safe_relpath(p):
+            return None
+        return (repo_root / p).resolve()
+
+    # Backward-compat: accept basename-only.
+    b = _safe_basename(s)
+    if not b:
+        return None
+    return (cache_dir / b).resolve()
+
+
+def _repo_relpath(repo_root, p):
+    try:
+        rel = p.resolve().relative_to(repo_root.resolve())
+        return "./" + rel.as_posix()
+    except Exception:
+        return os.path.basename(str(p))
+
+
+REPO_ROOT = _repo_root()
+CACHE_DIR = _cache_dir(REPO_ROOT)
 
 
 def _json_out(obj):
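The reference-resolution rules above accept repo-relative paths while rejecting `..` traversal and absolute paths outside the cache. The relative-path guard can be checked on its own; a simplified sketch of the same checks:

```python
from pathlib import Path

def is_safe_relpath(p: Path) -> bool:
    # Same checks as the diff's _is_safe_relpath: must be relative,
    # and no ".." segment anywhere in the path.
    return not p.is_absolute() and ".." not in p.parts

print(is_safe_relpath(Path("./.cache/foo.md")))  # True
print(is_safe_relpath(Path("../escape.md")))     # False
print(is_safe_relpath(Path("/etc/passwd")))      # False
```

Note that `Path` drops a leading `./`, so `./.cache/foo.md` has parts `('.cache', 'foo.md')` and passes, while any `..` component is caught regardless of where it appears.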
@@ -57,14 +126,19 @@ def _safe_basename(name):
     return base
 
 
-def _read_cache_text(basename):
-    p = CACHE_DIR / basename
+def _read_cache_text(ref):
+    p = _resolve_ref(REPO_ROOT, CACHE_DIR, ref)
+    if not p:
+        raise FileNotFoundError("INVALID_CACHE_REF")
     return p.read_text(encoding="utf-8", errors="replace")
 
 
-def _write_cache_text(basename, content):
+def _write_cache_text(ref, content):
+    p = _resolve_ref(REPO_ROOT, CACHE_DIR, ref)
+    if not p:
+        raise ValueError("INVALID_CACHE_REF")
     CACHE_DIR.mkdir(parents=True, exist_ok=True)
-    p = CACHE_DIR / basename
+    p.parent.mkdir(parents=True, exist_ok=True)
     p.write_text(content, encoding="utf-8", newline="\n")
 
 
@@ -88,13 +162,21 @@ def _sanitize_for_comment(text):
     text = str(text)
 
     home = str(Path.home())
-    cache_abs = str(CACHE_DIR)
+    cache_abs = str(CACHE_DIR.resolve())
+    repo_abs = str(REPO_ROOT.resolve())
 
+    # Backward-compat scrub.
     text = text.replace("~/.opencode/cache/", "[cache]/")
-    text = text.replace(cache_abs + "/", "[cache]/")
     if home:
         text = text.replace(home + "/.opencode/cache/", "[cache]/")
 
+    # New cache scrub.
+    text = text.replace(cache_abs + "/", "[cache]/")
+
+    # Avoid leaking absolute local repo paths.
+    if repo_abs:
+        text = text.replace(repo_abs + "/", "")
+
     return text
 
 
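The replacement order in `_sanitize_for_comment` matters: the cache directory must be scrubbed before the repo root, otherwise the repo-root pass would turn `<repo>/.cache/...` into a bare `.cache/...` instead of `[cache]/...`. A standalone sketch with hypothetical example paths:

```python
def sanitize(text, cache_abs, repo_abs, home):
    # Same order as the diff: legacy cache spellings first,
    # then the absolute cache dir, then the absolute repo root.
    text = text.replace("~/.opencode/cache/", "[cache]/")
    text = text.replace(home + "/.opencode/cache/", "[cache]/")
    text = text.replace(cache_abs + "/", "[cache]/")
    text = text.replace(repo_abs + "/", "")
    return text

out = sanitize(
    "see /home/u/repo/.cache/fix.md and /home/u/repo/src/app.py",
    cache_abs="/home/u/repo/.cache",
    repo_abs="/home/u/repo",
    home="/home/u",
)
print(out)  # see [cache]/fix.md and src/app.py
```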
@@ -236,8 +318,14 @@ def _counts(findings):
     return c
 
 
-def _post_pr_comment(pr_number, body_basename):
-    body_path = str(CACHE_DIR / body_basename)
+def _post_pr_comment(pr_number, body_ref):
+    if isinstance(body_ref, Path):
+        p = body_ref
+    else:
+        p = _resolve_ref(REPO_ROOT, CACHE_DIR, body_ref)
+    if not p:
+        return False
+    body_path = str(p)
     rc = subprocess.run(
         ["gh", "pr", "comment", str(pr_number), "--body-file", body_path],
         stdout=subprocess.DEVNULL,
@@ -334,20 +422,25 @@ def main(argv):
     round_num = args.round
     run_id = str(args.run_id)
 
-    fix_report_file = _safe_basename(args.fix_report_file) if args.fix_report_file else None
-    context_file = _safe_basename(args.context_file) if args.context_file else None
+    fix_report_file = (args.fix_report_file or "").strip() or None
+    context_file = (args.context_file or "").strip() or None
     review_files = []
     for rf in args.review_file or []:
-        b = _safe_basename(rf)
-        if b:
-            review_files.append(b)
+        s = (rf or "").strip()
+        if s:
+            review_files.append(s)
 
     if fix_report_file:
+        fix_p = _resolve_ref(REPO_ROOT, CACHE_DIR, fix_report_file)
+        if not fix_p or not fix_p.exists():
+            _json_out({"error": "FIX_REPORT_FILE_NOT_FOUND"})
+            return 1
         fix_md = _read_cache_text(fix_report_file)
         body = _render_mode_b_comment(pr_number, round_num, run_id, fix_md)
-        body_file = f"review-aggregate-fix-comment-pr{pr_number}-r{round_num}-{run_id}.md"
-        _write_cache_text(body_file, body)
-        if not _post_pr_comment(pr_number, body_file):
+        body_basename = f"review-aggregate-fix-comment-pr{pr_number}-r{round_num}-{run_id}.md"
+        body_ref = _repo_relpath(REPO_ROOT, CACHE_DIR / body_basename)
+        _write_cache_text(body_ref, body)
+        if not _post_pr_comment(pr_number, body_ref):
             _json_out({"error": "GH_PR_COMMENT_FAILED"})
             return 1
         _json_out({"ok": True})
@@ -360,6 +453,21 @@ def main(argv):
         _json_out({"error": "MISSING_REVIEW_FILES"})
         return 1
 
+    ctx_p = _resolve_ref(REPO_ROOT, CACHE_DIR, context_file)
+    if not ctx_p or not ctx_p.exists():
+        _json_out({"error": "CONTEXT_FILE_NOT_FOUND"})
+        return 1
+
+    valid_review_files = []
+    for rf in review_files:
+        p = _resolve_ref(REPO_ROOT, CACHE_DIR, rf)
+        if p and p.exists():
+            valid_review_files.append(rf)
+    review_files = valid_review_files
+    if not review_files:
+        _json_out({"error": "REVIEW_FILES_NOT_FOUND"})
+        return 1
+
     raw_reviews = []
     all_findings = []
     for rf in review_files:
@@ -377,9 +485,10 @@ def main(argv):
     stop = len(must_fix) == 0
 
     body = _render_mode_a_comment(pr_number, round_num, run_id, counts, must_fix, merged_map, raw_reviews)
-    body_file = f"review-aggregate-comment-pr{pr_number}-r{round_num}-{run_id}.md"
-    _write_cache_text(body_file, body)
-    if not _post_pr_comment(pr_number, body_file):
+    body_basename = f"review-aggregate-comment-pr{pr_number}-r{round_num}-{run_id}.md"
+    body_ref = _repo_relpath(REPO_ROOT, CACHE_DIR / body_basename)
+    _write_cache_text(body_ref, body)
+    if not _post_pr_comment(pr_number, body_ref):
         _json_out({"error": "GH_PR_COMMENT_FAILED"})
         return 1
 
@@ -415,8 +524,9 @@ def main(argv):
         lines.append(f"  description: {desc}")
         lines.append(f"  suggestion: {sugg}")
 
-    _write_cache_text(fix_file, "\n".join(lines) + "\n")
-    _json_out({"stop": False, "fixFile": fix_file})
+    fix_ref = _repo_relpath(REPO_ROOT, CACHE_DIR / fix_file)
+    _write_cache_text(fix_ref, "\n".join(lines) + "\n")
+    _json_out({"stop": False, "fixFile": fix_ref})
     return 0
 
 
@@ -12,8 +12,8 @@ agent: sisyphus
 
 ## Cache conventions (mandatory)
 
-- All intermediate files in this workflow are stored in `~/.opencode/cache/`
-- Agents/commands hand off only file names (basename), never directories
+- All intermediate files in this workflow live inside the project: `./.cache/`
+- Agents/commands hand off **repo-relative paths** (e.g. `./.cache/pr-context-...md`); never pass bare basenames
 
 ## Fixed subagent_type (invoke via Task directly; do not keep asking for confirmation)
 
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@ranger1/dx",
-  "version": "0.1.32",
+  "version": "0.1.34",
   "type": "module",
   "license": "MIT",
   "repository": {