@openthink/team 0.0.3 → 0.0.5
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/dist/assign-ticket.md +66 -15
- package/dist/index.js +142 -13
- package/dist/index.js.map +1 -1
- package/package.json +1 -1
package/dist/assign-ticket.md
CHANGED
@@ -61,6 +61,8 @@ Write the comment in this shape:
 - <bullet 2>
 ```
 
+If your appended system context flags the **AGT-107 haiku-downshift heuristic** as active (a `# Product agent: haiku-downshift heuristic active` block), use the header `### YYYY-MM-DD — Product agent (haiku-downshift)` instead of the standard form. The runner has already spawned you on Haiku 4.5; the suffix makes the heuristic visible in the ticket's audit trail.
+
 ## Phase 3 — Engineering agent (state: refined → spike phase)
 
 **Step 0 — Workspace is already prepared.** When `oteam assign` spawned you against a repo-bound ticket, it already cloned `/tmp/open-team-issues/<ticket-id-lowercased>/repo` (from the stamp server when `stamp.enforce: true` is set in `~/.open-team/config.json`; from GitHub otherwise, or when `--no-stamp` was passed) and set your cwd to it. Confirm with `pwd` and `git remote -v`; for stamp-governed repos you should see exactly one remote, `origin`, pointing at `ssh://git@<stamp-host>:<port>/srv/git/<basename>.git`.
@@ -157,10 +159,27 @@ If you skipped Phase 3 Step 0 (vault-only spike that turned out to need code cha
 
 Read `CLAUDE.md`, `AGENTS.md`, `README.md` at the repo root if present.
 
-**2. Verify the worktree shape.** Run `git remote -v` and
+**2. Verify the worktree shape and compute the merge mode.** Run `git remote -v` and check whether `.stamp/` exists in the worktree. Three shapes are valid:
+
+- **Stamp-governed (default).** Exactly one remote, `origin`, pointing at `ssh://git@<stamp-host>:<port>/srv/git/<basename>.git`. The runner clones with that shape on purpose; no rename / re-add is required, and adding a `github` remote here would defeat the AGT-050 invariant. `MODE` will resolve to `stamp` and routing goes through the stamp-protected branch (5a) at the end of Step 5.
+- **Local-stamp.** `origin` points at `git@github.com:<owner>/<repo>.git` AND the worktree contains a `.stamp/` directory. The repo carries stamp config but no stamp server is in use (typically a `--no-stamp` run against a stamp-aware repo). `MODE` will resolve to `local-stamp` and routing goes through 5c — `stamp review` + `stamp merge` produce a signed merge commit locally, which is then pushed to GitHub as the PR head for human review. `stamp push` is intentionally not invoked (no stamp server); `stamp verify <merge-sha>` still works against the PR head from any clone with the trusted public keys.
+- **Plain GitHub.** `origin` points at `git@github.com:<owner>/<repo>.git` and there is no `.stamp/` directory. `MODE` will resolve to `plain` and routing goes through 5b — `git push origin <feature>` + `gh pr create`.
+
+Compute `MODE` once, here, from the worktree's actual state — every subsequent step branches on this single variable:
+
+```sh
+ORIGIN_URL=$(git remote get-url origin)
+if [ -d .stamp ]; then
+  case "$ORIGIN_URL" in
+    *github.com*) MODE=local-stamp ;;
+    *) MODE=stamp ;;
+  esac
+else
+  MODE=plain
+fi
+```
 
-
-- **`--no-stamp` run.** Exactly one remote, `origin`, pointing at `git@github.com:<owner>/<repo>.git`. Continue to Step 3 and route through the plain-GitHub branch (5b) at the end of Step 5 — `git push origin <feature>` + `gh pr create`. The stamp commands (`stamp review`, `stamp merge`, `stamp push`) must not be invoked in this branch even if `.stamp/` exists in the worktree, because there's nowhere to push the stamp-signed merge.
+If `MODE` doesn't match what you expected from `git remote -v` and the visible `.stamp/` state, stop and surface — the worktree was cloned from an unexpected remote or `.stamp/` was added/removed mid-flight.
 
 **3. Determine base branch + cut feature branch.**
 
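The three-way classification the added script introduces is small enough to check in isolation. A hypothetical mirror of the shell logic in JavaScript (`computeMode` is illustrative, not part of the package):

```javascript
// Hypothetical mirror of the MODE computation from Step 2 above. The shell
// script classifies the worktree from two inputs: the origin URL and whether
// a .stamp/ directory exists. A github.com origin plus .stamp/ yields
// local-stamp; no .stamp/ always means plain; anything else is stamp.
function computeMode(originUrl, hasStampDir) {
  if (!hasStampDir) return "plain";
  return originUrl.includes("github.com") ? "local-stamp" : "stamp";
}

console.log(computeMode("ssh://git@stamp.example.com:2222/srv/git/team.git", true)); // stamp
console.log(computeMode("git@github.com:owner/repo.git", true)); // local-stamp
console.log(computeMode("git@github.com:owner/repo.git", false)); // plain
```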
@@ -175,6 +194,17 @@ git fetch origin "$BASE_BRANCH":"$BASE_BRANCH" 2>/dev/null || git fetch origin "
 git checkout "$BASE_BRANCH"
 FEATURE_BRANCH="agt/$(echo "$TICKET_ID" | tr '[:upper:]' '[:lower:]')"
 git checkout -b "$FEATURE_BRANCH"
+
+# Local-stamp only: cut a work branch off the feature branch so commits in
+# Step 4 land on $WORK_BRANCH and Step 5c can stamp-merge $WORK_BRANCH into
+# $FEATURE_BRANCH locally — the resulting signed merge commit becomes the
+# PR head. In other modes WORK_BRANCH == FEATURE_BRANCH (no extra checkout).
+if [ "$MODE" = "local-stamp" ]; then
+  WORK_BRANCH="${FEATURE_BRANCH}-work"
+  git checkout -b "$WORK_BRANCH"
+else
+  WORK_BRANCH="$FEATURE_BRANCH"
+fi
 ```
 
 Branch name: `agt/<ticket-id-lowercased>` (e.g. `agt/agt-003`). Vault ticket IDs are the canonical key — Linear identifiers are no longer minted in this flow (Linear-as-publish-target is a separate downstream sync, filed as its own follow-up ticket).
@@ -197,13 +227,13 @@ When tests pass:
 ```sh
 git add -A
 COMMIT_BODY="Refs <ticket-id>"
-#
-# eventual
-# this trailer — those worktrees have no github
-# pushed to the stamp server, and GH never sees a
-# trigger auto-close. QA Phase 5 closes the GH issue
-# `gh issue close`, so behaviour is preserved either way.
-if [ "<source.type>" = "github" ] && [
+# GitHub-bound paths (plain, local-stamp): append a `Fixes <gh-issue-url>`
+# trailer so the eventual PR merge auto-closes the GH issue. The
+# stamp-governed path skips this trailer — those worktrees have no github
+# remote, the merge gets pushed to the stamp server, and GH never sees a
+# commit that would trigger auto-close. QA Phase 5 closes the GH issue
+# explicitly via `gh issue close`, so behaviour is preserved either way.
+if [ "<source.type>" = "github" ] && [ "$MODE" != "stamp" ]; then
 COMMIT_BODY="$COMMIT_BODY"$'\n'"Fixes <linked-github URL>"
 fi
 git commit -m "<one-line summary>
@@ -214,11 +244,7 @@ $COMMIT_BODY
 
 Never use `--no-verify`. Fix hook failures at the root cause.
 
-**5.
-
-```sh
-test -d .stamp && REPO_KIND=stamp || REPO_KIND=plain
-```
+**5. Route by `$MODE`** (set in Step 2): `stamp` → 5a, `plain` → 5b, `local-stamp` → 5c.
 
 #### 5a — Stamp-protected repo
 
@@ -255,6 +281,31 @@ gh pr create --fill
 
 Capture the PR URL into `linked-pr:`. Human merges through GitHub PR review.
 
+#### 5c — Local-stamp repo (`.stamp/` present, GitHub origin)
+
+Run review on `$WORK_BRANCH` against `$FEATURE_BRANCH` (the eventual PR base):
+
+```sh
+stamp review --diff "$FEATURE_BRANCH..$WORK_BRANCH"
+stamp status --diff "$FEATURE_BRANCH..$WORK_BRANCH"
+```
+
+If the gate isn't open, iterate per the **5-round rule** (same shape as 5a — round 1 structure, round 2 consistency, round 3 polish; later rounds rare). Amend on `$WORK_BRANCH` between rounds. After 5 rounds still red → STOP with `🛑 BLOCKED — Local stamp review red after 5 rounds`.
+
+When the gate opens, merge locally and push the signed merge as the PR head:
+
+```sh
+git checkout "$FEATURE_BRANCH"
+stamp merge "$WORK_BRANCH" --into "$FEATURE_BRANCH"
+git push -u origin "$FEATURE_BRANCH"
+gh pr create --base "$DEFAULT_BRANCH" --head "$FEATURE_BRANCH" --fill
+git branch -D "$WORK_BRANCH"
+```
+
+`stamp push` is intentionally absent — there is no stamp server. The signed merge commit is the PR head; reviewers can `stamp verify <pr-head-sha>` from any clone whose `.stamp/trusted-keys/` contains the signing key. Capture the PR URL into `linked-pr:`. Human merges through GitHub PR review. Never merge a GitHub PR yourself.
+
+Local-stamp is single-tier only — the PR base is always `$DEFAULT_BRANCH`. Two-tier (stacked-base) flows require a stamp server to hold the intermediate base branch and aren't supported in this mode.
+
 ### Phase 4.5 — Release follow-up (single-tier stamp only)
 
 If the repo publishes artifacts (npm, crates.io, PyPI, GitHub Releases) gated on a version bump, the merge above adds the change but won't ship until the version bumps. Re-read `CLAUDE.md`/`AGENTS.md` for a "Releases" / "Publishing" section.
package/dist/index.js
CHANGED
@@ -250,6 +250,7 @@ import { query } from "@anthropic-ai/claude-agent-sdk";
 // src/lib/models.ts
 var ROLE_PIPELINE_MODEL = "claude-opus-4-7";
 var NORMALISER_MODEL = "claude-sonnet-4-6";
+var HAIKU_PRODUCT_MODEL = "claude-haiku-4-5";
 var PHASES = ["product", "spike", "implementation", "qa"];
 var DEFAULT_MODELS = {
   product: "claude-sonnet-4-6",
@@ -281,6 +282,55 @@ function resolveRoleModel(state, models) {
   if (pinned && pinned.length > 0) return pinned;
   return ROLE_PIPELINE_MODEL;
 }
+function acceptanceCriteriaIsPopulated(body) {
+  const lines = body.split("\n");
+  let inSection = false;
+  let inHtmlComment = false;
+  const numberedBullet = /^\s*\d+\.\s+\S/;
+  for (const line of lines) {
+    if (!inSection) {
+      if (/^##\s+Acceptance Criteria\s*$/.test(line)) {
+        inSection = true;
+      }
+      continue;
+    }
+    if (/^##\s+/.test(line)) return false;
+    let scan = line;
+    while (scan.length > 0) {
+      if (inHtmlComment) {
+        const close = scan.indexOf("-->");
+        if (close === -1) {
+          scan = "";
+          break;
+        }
+        scan = scan.slice(close + 3);
+        inHtmlComment = false;
+      } else {
+        const open = scan.indexOf("<!--");
+        if (open === -1) break;
+        const before = scan.slice(0, open);
+        if (numberedBullet.test(before)) return true;
+        const rest = scan.slice(open + 4);
+        const close = rest.indexOf("-->");
+        if (close === -1) {
+          inHtmlComment = true;
+          scan = "";
+          break;
+        }
+        scan = rest.slice(close + 3);
+      }
+    }
+    if (inHtmlComment) continue;
+    if (numberedBullet.test(scan)) return true;
+  }
+  return false;
+}
+function resolveModelForTicket(args) {
+  if (args.state === "triage" && args.sourceType === "manual" && args.productDownshift && acceptanceCriteriaIsPopulated(args.body)) {
+    return HAIKU_PRODUCT_MODEL;
+  }
+  return resolveRoleModel(args.state, args.models);
+}
 
 // src/lib/normalise.ts
 var SYSTEM_PROMPT = `You normalise unstructured work-item payloads (GitHub issues, Linear tickets, etc.) into well-formed product-vault tickets.
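The predicate added in this hunk is self-contained, so it can be lifted straight out of the bundle and exercised on its own. The sample ticket bodies below are illustrative:

```javascript
// Copied from the bundle above: true only when a `## Acceptance Criteria`
// section contains a numbered bullet that is not inside an HTML comment.
function acceptanceCriteriaIsPopulated(body) {
  const lines = body.split("\n");
  let inSection = false;
  let inHtmlComment = false;
  const numberedBullet = /^\s*\d+\.\s+\S/;
  for (const line of lines) {
    if (!inSection) {
      if (/^##\s+Acceptance Criteria\s*$/.test(line)) inSection = true;
      continue;
    }
    // A new top-level section ends the scan.
    if (/^##\s+/.test(line)) return false;
    let scan = line;
    while (scan.length > 0) {
      if (inHtmlComment) {
        const close = scan.indexOf("-->");
        if (close === -1) { scan = ""; break; }
        scan = scan.slice(close + 3);
        inHtmlComment = false;
      } else {
        const open = scan.indexOf("<!--");
        if (open === -1) break;
        const before = scan.slice(0, open);
        if (numberedBullet.test(before)) return true;
        const rest = scan.slice(open + 4);
        const close = rest.indexOf("-->");
        if (close === -1) { inHtmlComment = true; scan = ""; break; }
        scan = rest.slice(close + 3);
      }
    }
    if (inHtmlComment) continue;
    if (numberedBullet.test(scan)) return true;
  }
  return false;
}

console.log(acceptanceCriteriaIsPopulated("## Acceptance Criteria\n1. Returns 200")); // true
console.log(acceptanceCriteriaIsPopulated("## Acceptance Criteria\n<!-- 1. fill me in -->")); // false
console.log(acceptanceCriteriaIsPopulated("## Summary\n1. Not in the section")); // false
```

Note that a commented-out bullet (the scaffold placeholder case) does not count as populated, which is exactly what keeps template tickets off the Haiku path.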
@@ -458,6 +508,7 @@ import {
 } from "fs";
 import { homedir } from "os";
 import { basename, isAbsolute, resolve, join as join2 } from "path";
+var DEFAULT_PRODUCT_DOWNSHIFT = true;
 function configDir() {
   return join2(homedir(), ".open-team");
 }
@@ -484,8 +535,12 @@ function writeConfig(config) {
   default: config.default,
   stamp: config.stamp
 };
-
-
+const onDiskModels = { ...config.models };
+if (config.productDownshift !== DEFAULT_PRODUCT_DOWNSHIFT) {
+  onDiskModels.productDownshift = config.productDownshift;
+}
+if (Object.keys(onDiskModels).length > 0) {
+  onDisk.models = onDiskModels;
 }
 if (!config.telemetry.enabled) {
   onDisk.telemetry = config.telemetry;
@@ -499,6 +554,7 @@ function emptyConfig() {
   default: null,
   stamp: null,
   models: {},
+  productDownshift: DEFAULT_PRODUCT_DOWNSHIFT,
   telemetry: { enabled: true }
 };
 }
@@ -600,9 +656,15 @@ function normalise(parsed) {
   default: def,
   stamp: normaliseStamp(obj.stamp),
   models: normaliseModels(obj.models),
+  productDownshift: normaliseProductDownshift(obj.models),
   telemetry: normaliseTelemetry(obj.telemetry)
 };
 }
+function normaliseProductDownshift(value) {
+  if (!value || typeof value !== "object") return DEFAULT_PRODUCT_DOWNSHIFT;
+  const v = value;
+  return v.productDownshift !== false;
+}
 function normaliseTelemetry(value) {
   if (!value || typeof value !== "object") return { enabled: true };
   const v = value;
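As the normaliser above shows, the flag is read from the config's `models` object and only an explicit `false` turns the heuristic off. Lifted out of the bundle for a standalone check:

```javascript
// Copied from the bundle above: productDownshift defaults to on; it is
// disabled only by an explicit `productDownshift: false` under `models`.
const DEFAULT_PRODUCT_DOWNSHIFT = true;

function normaliseProductDownshift(value) {
  if (!value || typeof value !== "object") return DEFAULT_PRODUCT_DOWNSHIFT;
  const v = value;
  return v.productDownshift !== false;
}

console.log(normaliseProductDownshift(undefined)); // true  (no models block at all)
console.log(normaliseProductDownshift({})); // true  (models block, flag absent)
console.log(normaliseProductDownshift({ productDownshift: false })); // false
```

This pairs with the `writeConfig` change earlier in the diff, which persists the flag only when it differs from the default, so an untouched config file stays minimal.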
@@ -712,6 +774,15 @@ function seedDefaultModelsIfEmpty() {
   writeConfig(config);
   return { action: "seeded", models: config.models };
 }
+function getProductDownshift() {
+  return readConfig().productDownshift;
+}
+function setProductDownshift(enabled) {
+  const config = readConfig();
+  config.productDownshift = enabled;
+  writeConfig(config);
+  return config.productDownshift;
+}
 function getTelemetryEnabled() {
   return readConfig().telemetry.enabled;
 }
@@ -1299,6 +1370,28 @@ enforce: ${s.enforce ? "on" : "off"}
   const lines = PHASES.map((p) => `${p.padEnd(15)} ${m[p] ?? "(unset)"}`);
   process.stdout.write(lines.join("\n") + "\n");
 });
+models.command("product-downshift <on|off|show>").description(
+  "Toggle the AGT-107 Haiku-downshift heuristic for well-formed manual tickets (default: on)"
+).action((flag) => {
+  const lower = flag.toLowerCase();
+  if (lower === "show") {
+    process.stdout.write(`${getProductDownshift() ? "on" : "off"}
+`);
+    return;
+  }
+  if (lower !== "on" && lower !== "off") {
+    process.stderr.write(
+      `oteam config models product-downshift: expected on|off|show, got "${flag}"
+`
+    );
+    process.exit(2);
+  }
+  const next = setProductDownshift(lower === "on");
+  process.stdout.write(
+    `\u2705 models.productDownshift = ${next ? "on" : "off"}
+`
+  );
+});
 const telemetry = new Command("telemetry").description(
   "Manage per-phase telemetry recording (default: on)"
 );
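The new subcommand's argument handling is the usual on/off/show shape. A hypothetical distillation of just the parsing (`parseDownshiftFlag` is illustrative, not a package export):

```javascript
// Hypothetical distillation of the CLI action above: the positional argument
// is matched case-insensitively; "show" reports, "on"/"off" set the flag,
// and anything else is a usage error (the real command exits with code 2).
function parseDownshiftFlag(flag) {
  const lower = flag.toLowerCase();
  if (lower === "show") return { kind: "show" };
  if (lower === "on" || lower === "off") {
    return { kind: "set", enabled: lower === "on" };
  }
  return { kind: "usage-error", exitCode: 2 };
}

console.log(parseDownshiftFlag("SHOW").kind); // show
console.log(parseDownshiftFlag("off")); // { kind: 'set', enabled: false }
```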
@@ -2411,7 +2504,7 @@ function buildTicketCommand() {
 // src/role-pipeline/runner.ts
 import { spawnSync as spawnSync4 } from "child_process";
 import { randomUUID } from "crypto";
-import { writeFileSync as writeFileSync7 } from "fs";
+import { readFileSync as readFileSync9, writeFileSync as writeFileSync7 } from "fs";
 import { tmpdir } from "os";
 import { resolve as resolve4, basename as basename5, dirname as dirname2, join as join14 } from "path";
 
@@ -2757,7 +2850,16 @@ async function assignTicket(opts) {
   }
 }
 const projectContext = loadProjectContext(resolvedVault.path, ticket.project);
-const
+const ticketBody = readTicketBody(ticketPath);
+const model = resolveModelForTicket({
+  state: ticket.state,
+  sourceType: ticket.source.type,
+  body: ticketBody,
+  productDownshift: config.productDownshift,
+  models: config.models
+});
+const haikuDownshift = model === HAIKU_PRODUCT_MODEL && ticket.state === "triage";
+const systemPrompt = composeSystemPrompt(ticket.id, projectContext, haikuDownshift);
 const phase = phaseForState(ticket.state);
 const telemetry = phase !== null && getTelemetryEnabled() ? {
   ticketId: ticket.id,
@@ -2771,7 +2873,7 @@ async function assignTicket(opts) {
   claudePath,
   ticketPath,
   resolvedVault.path,
-
+  systemPrompt,
   workspace,
   model,
   telemetry
@@ -2790,7 +2892,7 @@ async function assignTicket(opts) {
   claudePath,
   ticketPath,
   resolvedVault.path,
-
+  systemPrompt,
   workspace,
   model,
   telemetry
@@ -2808,7 +2910,7 @@ async function assignTicket(opts) {
 const escapedTicket = shellEscape(ticketPath);
 const slashPrompt = `/assign-ticket ${escapedTicket}`;
 const escapedPrompt = shellEscape(slashPrompt);
-const projectFlag =
+const projectFlag = systemPrompt ? ` --append-system-prompt "$(cat '${shellEscape(systemPrompt.tmpFile)}')"` : "";
 const sessionFlag = telemetry ? ` --session-id '${shellEscape(telemetry.sessionId)}'` : "";
 const claudeCmd = `'${escapedClaude}' --dangerously-skip-permissions --model ${shellEscape(model)}${sessionFlag}${projectFlag} '${escapedPrompt}'`;
 const telemetryTail = telemetry ? buildTelemetryTail({
@@ -2845,7 +2947,7 @@ function buildTelemetryTail(input) {
   ].join(" ");
   return `; EC=$?; ${oteam} telemetry record ${args} >/dev/null 2>&1 || true; exit "$EC"`;
 }
-function runInline(claudePath, ticketPath, vaultPath,
+function runInline(claudePath, ticketPath, vaultPath, systemPrompt, workspace, model, telemetry) {
   const args = [
     "--dangerously-skip-permissions",
     "--model",
@@ -2854,8 +2956,8 @@ function runInline(claudePath, ticketPath, vaultPath, projectContext, workspace,
   if (telemetry) {
     args.push("--session-id", telemetry.sessionId);
   }
-  if (
-  args.push("--append-system-prompt",
+  if (systemPrompt) {
+    args.push("--append-system-prompt", systemPrompt.content);
   }
   args.push(`/assign-ticket ${ticketPath}`);
   const cwd = workspace?.path ?? process.cwd();
@@ -2901,12 +3003,39 @@ function loadProjectContext(vaultPath, projectId) {
   );
   return null;
 }
-
-
-
+  return formatProjectContextForPrompt(project);
+}
+function composeSystemPrompt(ticketId, projectContext, haikuDownshift) {
+  const parts = [];
+  if (projectContext) parts.push(projectContext);
+  if (haikuDownshift) parts.push(haikuDownshiftPromptHint());
+  if (parts.length === 0) return null;
+  const content = parts.join("\n\n");
+  const safeId = ticketId.replace(/[^a-zA-Z0-9._-]/g, "_");
+  const tmpFile = join14(tmpdir(), `oteam-prompt-${safeId}.md`);
   writeFileSync7(tmpFile, content, "utf8");
   return { tmpFile, content };
 }
+function haikuDownshiftPromptHint() {
+  return [
+    "# Product agent: haiku-downshift heuristic active",
+    "",
+    "AGT-107: this ticket is a well-formed manual ticket (source.type=manual + populated `## Acceptance Criteria`), so the runner spawned you on Haiku 4.5 instead of the configured Product model. The heuristic exists to handle structural-cleanup cases cheaply; full synthesis still belongs on the configured Product model.",
+    "",
+    "When you advance the ticket, write the comment header as:",
+    "",
+    "  ### YYYY-MM-DD \u2014 Product agent (haiku-downshift)",
+    "",
+    "instead of the standard `### YYYY-MM-DD \u2014 Product agent`. That makes the heuristic visible in the ticket's audit trail."
+  ].join("\n");
+}
+function readTicketBody(path) {
+  try {
+    return readFileSync9(path, "utf8");
+  } catch {
+    return "";
+  }
+}
 function findToolOnPath(name) {
   const r = spawnSync4("/usr/bin/env", ["which", name], { encoding: "utf8" });
   if (r.status !== 0) return null;