@ranger1/dx 0.1.76 → 0.1.78
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +92 -31
- package/bin/dx.js +3 -3
- package/lib/cli/commands/deploy.js +2 -1
- package/lib/cli/commands/stack.js +198 -237
- package/lib/cli/commands/start.js +0 -6
- package/lib/cli/dx-cli.js +10 -1
- package/lib/cli/help.js +8 -7
- package/lib/{opencode-initial.js → codex-initial.js} +3 -82
- package/lib/vercel-deploy.js +14 -27
- package/package.json +1 -2
- package/@opencode/agents/__pycache__/gh_review_harvest.cpython-314.pyc +0 -0
- package/@opencode/agents/__pycache__/pr_context.cpython-314.pyc +0 -0
- package/@opencode/agents/__pycache__/pr_precheck.cpython-314.pyc +0 -0
- package/@opencode/agents/__pycache__/pr_review_aggregate.cpython-314.pyc +0 -0
- package/@opencode/agents/__pycache__/test_pr_review_aggregate.cpython-314-pytest-9.0.2.pyc +0 -0
- package/@opencode/agents/__pycache__/test_pr_review_aggregate.cpython-314.pyc +0 -0
- package/@opencode/agents/claude-reviewer.md +0 -82
- package/@opencode/agents/codex-reviewer.md +0 -83
- package/@opencode/agents/gemini-reviewer.md +0 -82
- package/@opencode/agents/gh-thread-reviewer.md +0 -122
- package/@opencode/agents/gh_review_harvest.py +0 -292
- package/@opencode/agents/pr-context.md +0 -82
- package/@opencode/agents/pr-fix.md +0 -243
- package/@opencode/agents/pr-precheck.md +0 -89
- package/@opencode/agents/pr-review-aggregate.md +0 -151
- package/@opencode/agents/pr_context.py +0 -351
- package/@opencode/agents/pr_precheck.py +0 -505
- package/@opencode/agents/pr_review_aggregate.py +0 -868
- package/@opencode/agents/test_pr_review_aggregate.py +0 -701
- package/@opencode/commands/doctor.md +0 -271
- package/@opencode/commands/git-commit-and-pr.md +0 -282
- package/@opencode/commands/git-release.md +0 -642
- package/@opencode/commands/oh_attach.json +0 -92
- package/@opencode/commands/opencode_attach.json +0 -29
- package/@opencode/commands/opencode_attach.py +0 -142
- package/@opencode/commands/pr-review-loop.md +0 -211
--- package/@opencode/agents/pr-review-aggregate.md
+++ /dev/null
@@ -1,151 +0,0 @@
----
-description: aggregate PR reviews + create fix file
-mode: subagent
-model: openai/gpt-5.3-codex
-temperature: 0.1
-tools:
-  bash: true
----
-
-# PR Review Aggregator
-
-## Cache convention (mandatory)
-
-- The cache directory is fixed at `./.cache/`; handoffs must always pass `./.cache/<file>` (repo-relative paths). Basename-only references (e.g. `foo.md`) are forbidden.
-
-## Inputs (two modes)
-
-### Mode A: aggregate reviews + generate the fixFile + post the review comment
-
-- `PR #<number>`
-- `round: <number>`
-- `runId: <string>` (must be passed through verbatim, format `<PR>-<ROUND>-<HEAD_SHORT>`; never generate your own)
-- `contextFile: <path>` (e.g. `./.cache/pr-context-...md`)
-- `reviewFile: <path>` (multiple lines, 1+ entries; e.g. `./.cache/review-...md`)
-
-### Mode B: post the fix comment (based on a fixReportFile)
-
-- `PR #<number>`
-- `round: <number>`
-- `runId: <string>` (must be passed through verbatim, format `<PR>-<ROUND>-<HEAD_SHORT>`; never generate your own)
-- `fixReportFile: <path>` (e.g. `./.cache/fix-report-...md`)
-
-Example:
-
-```text
-PR #123
-round: 1
-runId: 123-1-a1b2c3d
-contextFile: ./.cache/pr-context-pr123-r1-123-1-a1b2c3d.md
-reviewFile: ./.cache/review-CDX-pr123-r1-123-1-a1b2c3d.md
-reviewFile: ./.cache/review-CLD-pr123-r1-123-1-a1b2c3d.md
-reviewFile: ./.cache/review-GMN-pr123-r1-123-1-a1b2c3d.md
-```
-
-## Execution model (mandatory)
-
-All deterministic work (parsing / aggregating / posting comments / generating the fixFile / emitting JSON) is done by `~/.opencode/agents/pr_review_aggregate.py`.
-
-You do only two things:
-
-1) In Mode A, use the LLM to decide which findings are duplicates, and pass the duplicate groups to the script as an argument (nothing is written to disk).
-2) After invoking the script, return the JSON from the script's stdout to the caller **verbatim** (no interpretation, no analysis).
-
-## Duplicate groups (script input only)
-
-From the contents of all `reviewFile` entries, decide which findings are duplicates and produce **one line of JSON** (no code fence, no explanatory text, no line breaks).
-
-Note: this JSON line is **not your final output**; it is used only to build `--duplicate-groups-b64` for the script.
-
-```json
-{"duplicateGroups":[["CDX-001","CLD-003"],["GMN-002","CLD-005","CDX-004"]]}
-```
-
-## Smart matching (Mode A only, when a decision-log exists)
-
-If the decision-log (`./.cache/decision-log-pr<PR_NUMBER>.md`) exists, use the LLM to judge whether each new finding is essentially the same as an already-decided issue, and produce the **escalation_groups** argument from that.
-
-**Matching rules**:
-- **Essence matching**: compare the `essence` field against the substance of the new finding.
-- **Strict file binding**: match only when the decision-log entry's `file` is **identical** to the new finding's `file`.
-- If the file was renamed/deleted/split, treat it as a different issue (for stability, no complex rename mapping is attempted).
-- If a decision-log entry lacks a `file` field (legacy data), skip matching (treat it as unrelated).
-
-**Process**:
-
-1. Read the decision-log and extract the `essence` and `file` fields of the rejected issues.
-2. For each new finding, **check the file match first**:
-   - if `file` does not match → treat it as a new issue
-   - if `file` matches → go on to compare `essence`
-3. If the `essence` also matches ("same underlying issue"), the finding is an escalation candidate.
-4. Among the candidates, collect those that cross the re-challenge threshold:
-   - **Escalation threshold**: priority gap ≥ 2 levels
-   - e.g. rejected at P3 but the finding is P1 → may escalate
-   - e.g. rejected at P2 but the finding is P0 → may escalate
-   - e.g. rejected at P2 but the finding is P1 → no escalation (only 1 level apart)
-5. Produce **one line of JSON** (no code fence, no explanatory text, no line breaks), shaped like:
-
-```json
-{"escalationGroups":[["CDX-001"],["GMN-002","CLD-005"]]}
-```
-
-Each group is the set of finding IDs that may be raised as an escalation challenge against an already-rejected issue. If nothing is eligible, output an empty array:
-
-```json
-{"escalationGroups":[]}
-```
-
-Note: the escalation_groups JSON is **not your final output**; it is used only to build `--escalation-groups-b64` for the script.
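The re-challenge threshold in step 4 reduces to a small priority comparison. A minimal sketch of just that rule; `PRIORITY` and `can_escalate` are illustrative names, not part of this package:

```python
# Illustrative only: restates the "priority gap >= 2" escalation rule.
PRIORITY = {"P0": 0, "P1": 1, "P2": 2, "P3": 3}

def can_escalate(rejected: str, new: str) -> bool:
    # A rejected P3 challenged by a new P1 finding escalates (gap 2);
    # a rejected P2 challenged by a new P1 finding does not (gap 1).
    return PRIORITY[rejected] - PRIORITY[new] >= 2

print(can_escalate("P3", "P1"), can_escalate("P2", "P0"), can_escalate("P2", "P1"))
```

This reproduces the three worked examples in the flow above.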
-
-## Invoking the script (mandatory)
-
-Mode A (with reviewFile + duplicate groups + smart matching):
-
-```bash
-python3 ~/.opencode/agents/pr_review_aggregate.py \
-  --pr <PR_NUMBER> \
-  --round <ROUND> \
-  --run-id <RUN_ID> \
-  --context-file <CONTEXT_FILE> \
-  --review-file <REVIEW_FILE_1> \
-  --review-file <REVIEW_FILE_2> \
-  --review-file <REVIEW_FILE_3> \
-  --duplicate-groups-b64 <BASE64_JSON> \
-  --decision-log-file ./.cache/decision-log-pr<PR_NUMBER>.md \
-  --escalation-groups-b64 <BASE64_JSON>
-```
-
-**Arguments**:
-
-- `--duplicate-groups-b64`: base64-encoded JSON in the format above, e.g. `eyJkdXBsaWNhdGVHcm91cHMiOltbIkNEWC0wMDEiLCJDTEQtMDAzIl1dfQ==`
-- `--decision-log-file`: path to the decision-log file (optional; when it does not exist, the smart-matching logic is skipped)
-- `--escalation-groups-b64`: base64-encoded escalation-groups JSON in the format above, e.g. `eyJlc2NhbGF0aW9uR3JvdXBzIjpbWyJDRFgtMDAxIl1dfQ==`
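The two `*-b64` flag values are nothing more than the one-line JSON, base64-encoded. A minimal sketch of that handoff; `encode_groups_b64` is a hypothetical helper name, not part of the script:

```python
import base64
import json

def encode_groups_b64(payload: dict) -> str:
    # Compact one-line JSON, then base64 -- the shape expected by the
    # --duplicate-groups-b64 / --escalation-groups-b64 flags.
    line = json.dumps(payload, separators=(",", ":"))
    return base64.b64encode(line.encode("utf-8")).decode("ascii")

# Encoding the single-group payload reproduces the sample value shown
# above for --duplicate-groups-b64.
print(encode_groups_b64({"duplicateGroups": [["CDX-001", "CLD-003"]]}))
# -> eyJkdXBsaWNhdGVHcm91cHMiOltbIkNEWC0wMDEiLCJDTEQtMDAzIl1dfQ==
```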
-
-Mode B (with a fixReportFile):
-
-```bash
-python3 ~/.opencode/agents/pr_review_aggregate.py \
-  --pr <PR_NUMBER> \
-  --round <ROUND> \
-  --run-id <RUN_ID> \
-  --fix-report-file <FIX_REPORT_FILE>
-```
-
-## Handling the script output (mandatory)
-
-- The script's stdout is **exactly one line of JSON** (parseable with `JSON.parse()`).
-- **On success**: your final output must be that stdout JSON line, **verbatim**.
-  - Typical returns: `{"stop":true}`, `{"stop":false,"fixFile":"..."}`, or `{"ok":true}`
-  - Forbidden: explanation / analysis / extra text
-  - Forbidden: code fences (```)
-  - Forbidden: leading or trailing blank lines
-- **On failure/exception**:
-  - If the script's stdout is valid JSON (containing `error` or other fields) → still **return that JSON verbatim**.
-  - If the script produced no valid JSON / exited abnormally → return exactly one JSON line: `{"error":"PR_REVIEW_AGGREGATE_AGENT_FAILED"}` (a `detail` field may be added when useful).
-
-## fixFile structure (supplementary)
-
-In Mode A the script generates a fixFile with two sections:
-
-- `## IssuesToFix`: only P0/P1 (must be fixed)
-- `## OptionalIssues`: P2/P3 (pr-fix decides on its own whether to fix each one, or rejects it with a reason)
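The verbatim-relay rule in "Handling the script output" can be sketched as a small wrapper. `relay_script_json` and `FALLBACK` are hypothetical names; a real caller would pass the full argument list shown in the invocation section:

```python
import json
import subprocess
import sys

FALLBACK = '{"error":"PR_REVIEW_AGGREGATE_AGENT_FAILED"}'

def relay_script_json(cmd):
    # Run the command; if its stdout parses as JSON, return it verbatim
    # (even when it carries an "error" field), otherwise the fallback.
    try:
        proc = subprocess.run(cmd, capture_output=True, text=True)
    except OSError:
        return FALLBACK
    line = (proc.stdout or "").strip()
    try:
        json.loads(line)
    except ValueError:
        return FALLBACK
    return line

# Demo against a stand-in "script" that prints one JSON line.
print(relay_script_json([sys.executable, "-c", "print('{\"stop\":true}')"]))
```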
--- package/@opencode/agents/pr_context.py
+++ /dev/null
@@ -1,351 +0,0 @@
-#!/usr/bin/env python3
-# PR context builder (deterministic).
-# - Reads PR metadata + recent comments via gh
-# - Reads changed files via git diff (no patch)
-# - Writes Markdown context file to project cache: ./.cache/
-# - Prints exactly one JSON object to stdout
-
-import argparse
-import json
-import os
-import re
-import subprocess
-import sys
-from urllib.parse import urlparse
-from pathlib import Path
-
-
-def _repo_root():
-    # Prefer git top-level so the cache follows the current repo.
-    try:
-        p = subprocess.run(
-            ["git", "rev-parse", "--show-toplevel"],
-            stdout=subprocess.PIPE,
-            stderr=subprocess.DEVNULL,
-            text=True,
-        )
-        out = (p.stdout or "").strip()
-        if p.returncode == 0 and out:
-            return Path(out)
-    except Exception:
-        pass
-    return Path.cwd()
-
-
-def _cache_dir(repo_root):
-    return (repo_root / ".cache").resolve()
-
-
-def _repo_relpath(repo_root, p):
-    try:
-        rel = p.resolve().relative_to(repo_root.resolve())
-        return "./" + rel.as_posix()
-    except Exception:
-        # Fallback to basename-only.
-        return os.path.basename(str(p))
-
-
-REPO_ROOT = _repo_root()
-CACHE_DIR = _cache_dir(REPO_ROOT)
-MARKER_SUBSTR = "<!-- pr-review-loop-marker"
-
-
-def _json_out(obj):
-    sys.stdout.write(json.dumps(obj, ensure_ascii=True))
-    sys.stdout.write("\n")
-
-
-def _run_capture(cmd):
-    try:
-        p = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
-        return p.returncode, p.stdout, p.stderr
-    except FileNotFoundError as e:
-        return 127, "", str(e)
-
-
-def _detect_git_remote_host():
-    # Best-effort parse from origin remote.
-    rc, origin_url, _ = _run_capture(["git", "remote", "get-url", "origin"])
-    if rc != 0:
-        rc, origin_url, _ = _run_capture(["git", "config", "--get", "remote.origin.url"])
-    if rc != 0:
-        return None
-
-    url = (origin_url or "").strip()
-    if not url:
-        return None
-
-    # Examples:
-    # - git@github.com:owner/repo.git
-    # - ssh://git@github.company.com/owner/repo.git
-    # - https://github.com/owner/repo.git
-    if url.startswith("git@"):  # SCP-like syntax
-        m = re.match(r"^git@([^:]+):", url)
-        return m.group(1) if m else None
-
-    if url.startswith("ssh://") or url.startswith("https://") or url.startswith("http://"):
-        try:
-            parsed = urlparse(url)
-            return parsed.hostname
-        except Exception:
-            return None
-
-    return None
-
-
-def _clip(s, n):
-    if s is None:
-        return ""
-    s = str(s)
-    return s if len(s) <= n else (s[:n] + "...")
-
-
-def _safe_basename(name):
-    if not name:
-        return None
-    base = os.path.basename(name.strip())
-    if base != name.strip():
-        return None
-    if base in (".", ".."):
-        return None
-    return base
-
-
-def _git_fetch_origin(ref):
-    subprocess.run(["git", "fetch", "origin", ref], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
-
-
-def _gh_default_branch(owner_repo):
-    if not owner_repo:
-        return ""
-    rc, out, _ = _run_capture(
-        [
-            "gh",
-            "repo",
-            "view",
-            "--repo",
-            owner_repo,
-            "--json",
-            "defaultBranchRef",
-            "--jq",
-            ".defaultBranchRef.name",
-        ]
-    )
-    if rc != 0:
-        return ""
-    return (out or "").strip()
-
-
-def _git_numstat(base_ref, base_oid):
-    # Prefer commit-oid diff when available; it's unambiguous for stacked PRs.
-    candidates = []
-    if base_oid:
-        candidates.append(f"{base_oid}...HEAD")
-    if base_ref:
-        candidates.append(f"origin/{base_ref}...HEAD")
-        candidates.append(f"{base_ref}...HEAD")
-
-    for lhs in candidates:
-        rc, out, _ = _run_capture(["git", "diff", "--numstat", lhs])
-        if rc == 0:
-            return out
-    return ""
-
-
-def _parse_numstat(numstat_text):
-    rows = []
-    for line in (numstat_text or "").splitlines():
-        parts = line.split("\t")
-        if len(parts) < 3:
-            continue
-        add_s, del_s, path = parts[0].strip(), parts[1].strip(), parts[2].strip()
-        if not path:
-            continue
-        rows.append((add_s, del_s, path))
-    return rows
-
-
-def main(argv):
-    class _ArgParser(argparse.ArgumentParser):
-        def error(self, message):
-            raise ValueError(message)
-
-    parser = _ArgParser(add_help=False)
-    parser.add_argument("--pr", type=int, required=True)
-    parser.add_argument("--round", type=int, default=1)
-    try:
-        args = parser.parse_args(argv)
-    except ValueError:
-        _json_out({"error": "INVALID_ARGS"})
-        return 2
-
-    pr_number = int(args.pr)
-    round_num = int(args.round)
-
-    def _json_err(error_code, extra=None):
-        obj = {"error": error_code, "prNumber": pr_number, "round": round_num}
-        if isinstance(extra, dict) and extra:
-            obj.update(extra)
-        _json_out(obj)
-
-    # Preconditions: be in a git repo and gh is authenticated.
-    rc, out, _ = _run_capture(["git", "rev-parse", "--is-inside-work-tree"])
-    if rc != 0 or out.strip() != "true":
-        _json_err("NOT_A_GIT_REPO")
-        return 1
-
-    host = _detect_git_remote_host() or "github.com"
-    rc, gh_out, gh_err = _run_capture(["gh", "auth", "status", "--hostname", host])
-    if rc == 127:
-        _json_err(
-            "GH_CLI_NOT_FOUND",
-            {
-                "detail": "gh not found in PATH",
-                "suggestion": "Install GitHub CLI: https://cli.github.com/",
-            },
-        )
-        return 1
-    if rc != 0:
-        # If host detection is wrong, a global check might still succeed.
-        rc2, gh_out2, gh_err2 = _run_capture(["gh", "auth", "status"])
-        if rc2 == 0:
-            pass
-        else:
-            detail = (gh_err or gh_out or "").strip()
-            if len(detail) > 4000:
-                detail = detail[-4000:]
-            _json_err(
-                "GH_NOT_AUTHENTICATED",
-                {
-                    "host": host,
-                    "detail": detail,
-                    "suggestion": f"Run: gh auth login --hostname {host}",
-                },
-            )
-            return 1
-
-    rc, owner_repo, _ = _run_capture(["gh", "repo", "view", "--json", "nameWithOwner", "--jq", ".nameWithOwner"])
-    owner_repo = owner_repo.strip() if rc == 0 else ""
-    if not owner_repo:
-        _json_err("REPO_NOT_FOUND")
-        return 1
-
-    fields = "number,url,title,body,isDraft,labels,baseRefName,headRefName,baseRefOid,headRefOid,comments"
-    rc, pr_json, _ = _run_capture(["gh", "pr", "view", str(pr_number), "--repo", owner_repo, "--json", fields])
-    if rc != 0:
-        _json_err("PR_NOT_FOUND_OR_NO_ACCESS")
-        return 1
-    try:
-        pr = json.loads(pr_json)
-    except Exception:
-        _json_err("PR_NOT_FOUND_OR_NO_ACCESS")
-        return 1
-
-    head_oid = (pr.get("headRefOid") or "").strip()
-    base_oid = (pr.get("baseRefOid") or "").strip()
-    base_ref = (pr.get("baseRefName") or "").strip()
-
-    if not head_oid:
-        _json_err("PR_HEAD_OID_NOT_FOUND")
-        return 1
-
-    head_short = head_oid[:7]
-    if not base_ref:
-        base_ref = _gh_default_branch(owner_repo)
-    if not base_ref and not base_oid:
-        _json_err("PR_BASE_REF_NOT_FOUND")
-        return 1
-    head_ref = (pr.get("headRefName") or "").strip()
-    url = (pr.get("url") or "").strip()
-
-    run_id = f"{pr_number}-{round_num}-{head_short}"
-
-    if base_ref:
-        _git_fetch_origin(base_ref)
-    file_rows = _parse_numstat(_git_numstat(base_ref, base_oid))
-
-    labels = []
-    for l in (pr.get("labels") or []):
-        if isinstance(l, dict) and l.get("name"):
-            labels.append(str(l.get("name")))
-
-    comments = pr.get("comments") or []
-    recent = comments[-10:] if isinstance(comments, list) else []
-    marker_count = 0
-    for c in recent:
-        if not isinstance(c, dict):
-            continue
-        body = c.get("body") or ""
-        if isinstance(body, str) and MARKER_SUBSTR in body:
-            marker_count += 1
-
-    CACHE_DIR.mkdir(parents=True, exist_ok=True)
-    context_file = f"pr-context-pr{pr_number}-r{round_num}-{run_id}.md"
-    context_path = CACHE_DIR / context_file
-
-    with open(context_path, "w", encoding="utf-8", newline="\n") as fp:
-        fp.write("# PR Context\n\n")
-        fp.write(f"- Repo: {owner_repo}\n")
-        fp.write(f"- PR: #{pr_number} {url}\n")
-        fp.write(f"- Round: {round_num}\n")
-        fp.write(f"- RunId: {run_id}\n")
-        fp.write(f"- Base: {base_ref}\n")
-        fp.write(f"- Head: {head_ref}\n")
-        fp.write(f"- HeadShort: {head_short}\n")
-        fp.write(f"- HeadOid: {head_oid}\n")
-        fp.write(f"- Draft: {pr.get('isDraft')}\n")
-        fp.write(f"- Labels: {', '.join(labels) if labels else '(none)'}\n")
-        fp.write(f"- ExistingLoopMarkers: {marker_count}\n\n")
-
-        fp.write("## Title\n\n")
-        fp.write(_clip(pr.get("title") or "", 200) + "\n\n")
-
-        fp.write("## Body (excerpt)\n\n")
-        fp.write(_clip(pr.get("body") or "", 2000) or "(empty)")
-        fp.write("\n\n")
-
-        fp.write(f"## Changed Files ({len(file_rows)})\n\n")
-        if file_rows:
-            for add_s, del_s, path in file_rows:
-                fp.write(f"- +{add_s} -{del_s} {path}\n")
-        else:
-            fp.write("(none)\n")
-        fp.write("\n")
-
-        fp.write("## Recent Comments (excerpt)\n\n")
-        if recent:
-            for c in recent:
-                if not isinstance(c, dict):
-                    continue
-                author = None
-                if isinstance(c.get("author"), dict):
-                    author = (c.get("author") or {}).get("login")
-                fp.write(f"- {author or 'unknown'}: {_clip(c.get('body') or '', 300)}\n")
-        else:
-            fp.write("(none)\n")
-
-    _json_out(
-        {
-            "agent": "pr-context",
-            "prNumber": pr_number,
-            "round": round_num,
-            "runId": run_id,
-            "repo": {"nameWithOwner": owner_repo},
-            "headOid": head_oid,
-            "headShort": head_short,
-            "existingMarkerCount": marker_count,
-            # Handoff should be repo-relative path so downstream agents can read it directly.
-            "contextFile": _repo_relpath(REPO_ROOT, context_path),
-        }
-    )
-    return 0
-
-
-if __name__ == "__main__":
-    try:
-        raise SystemExit(main(sys.argv[1:]))
-    except SystemExit:
-        raise
-    except Exception:
-        _json_out({"error": "PR_CONTEXT_SCRIPT_FAILED"})
-        raise SystemExit(1)
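As a usage note on the deleted script: it builds `runId` as `f"{pr_number}-{round_num}-{head_short}"`, so downstream agents can recover the parts by splitting on that shape. A minimal sketch; `parse_run_id` is a hypothetical helper, not part of the package:

```python
import re

# "<PR>-<ROUND>-<HEAD_SHORT>", where HEAD_SHORT is the first 7 hex
# characters of the PR head OID (head_oid[:7] in pr_context.py).
RUN_ID_RE = re.compile(r"^(\d+)-(\d+)-([0-9a-f]{7})$")

def parse_run_id(run_id: str):
    m = RUN_ID_RE.match(run_id)
    if not m:
        return None
    return {"prNumber": int(m.group(1)), "round": int(m.group(2)), "headShort": m.group(3)}

print(parse_run_id("123-1-a1b2c3d"))
```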