@tarcisiopgs/lisa 1.3.0 → 1.4.0
- package/README.md +28 -6
- package/dist/index.js +294 -72
- package/package.json +1 -1
package/README.md
CHANGED
@@ -15,7 +15,7 @@ Most AI agent loops work like Ralph — they grab an issue, throw it at a model,
 Lisa is deterministic. She follows a structured pipeline with clear stages (fetch, activate, implement, validate, PR, update) and stops when the work is done. This means:
 
 - **Token efficiency** — Each issue gets one focused prompt with full context. No wasted retries, no speculative exploration, no idle polling.
-- **Multi-repo awareness** — Lisa
+- **Multi-repo awareness** — Lisa plans across multiple repos, executes in the correct order (e.g., backend before frontend), and creates one PR per repo.
 - **Model fallback** — Configure a chain of models (`claude → gemini → opencode`). Transient errors (429, quota, timeout) trigger the next model; non-transient errors stop the chain.
 - **Workflow integration** — Issues move through your board in real time (Backlog → In Progress → In Review). Your team always knows what's being worked on.
 - **Self-healing** — Orphan issues (stuck in "In Progress" from interrupted runs) are automatically recovered on startup. Pre-push hook failures trigger the agent to fix and retry.
@@ -76,7 +76,9 @@ lisa run --provider gemini --once
 | `lisa run --limit N` | Process up to N issues |
 | `lisa run --dry-run` | Preview without executing |
 | `lisa run --provider NAME` | Override AI provider |
+| `lisa run --source NAME` | Override issue source (linear, trello) |
 | `lisa run --label NAME` | Override label filter |
+| `lisa run --github METHOD` | Override GitHub method (cli, token) |
 | `lisa run --json` | Output as JSON lines |
 | `lisa run --quiet` | Suppress non-essential output |
 | `lisa config` | Interactive config wizard |
@@ -99,12 +101,13 @@ All providers use `child_process.spawn` with `sh -c`. Prompts are written to a t
 
 ### Fallback Chain
 
-Configure
+Configure a fallback chain in the `models` array. Lisa tries each provider in order — transient errors (429, quota, timeout, network) trigger the next provider. Non-transient errors stop the chain immediately.
 
 ```yaml
 models:
 - claude
 - gemini
+- opencode
 ```
 
 If `models` is not set, Lisa uses the single `provider` field.
@@ -119,7 +122,12 @@ The AI agent creates a branch directly in your current checkout, implements the 
 
 Lisa creates an isolated [git worktree](https://git-scm.com/docs/git-worktree) for each issue under `.worktrees/`. The agent works inside the worktree without touching your main checkout. After the PR is created, the worktree is cleaned up automatically.
 
-
+**Native worktree support** — When using Claude Code, Lisa delegates worktree lifecycle directly to the provider via the `--worktree` flag. Lisa auto-detects whether the primary provider supports native worktrees and uses the appropriate mode. Other providers use Lisa-managed worktrees.
+
+**Multi-repo workspaces** — When multiple repos are configured, Lisa uses a two-phase flow:
+
+1. **Planning phase** — A planning agent analyzes the issue and produces a `.lisa-plan.json` with ordered steps (one per affected repo), determining which repos need changes and in what order (e.g., backend API before frontend consumer).
+2. **Execution phase** — Lisa executes each step sequentially, creating one worktree and one PR per repo. Cross-repo context (branch names, PR URLs from previous steps) is passed to each subsequent step so the agent can reference them.
 
 Worktree mode is ideal when you want to keep working in the repo while Lisa resolves issues in the background.
 
@@ -129,9 +137,6 @@ Config lives in `.lisa/config.yaml`. Run `lisa init` to create it interactively.
 
 ```yaml
 provider: claude
-models:
-- claude
-- gemini
 source: linear
 workflow: worktree
 
@@ -151,6 +156,7 @@ repos:
 - name: my-api
   path: ./api
   base_branch: main
+  match: "[API]" # route issues whose title starts with "[API]" to this repo
 - name: my-app
   path: ./app
   base_branch: main
@@ -227,6 +233,22 @@ Lisa starts resources before the agent runs, waits for the port to be ready, run
 - **Signal handling** — SIGINT/SIGTERM gracefully revert the active issue to its previous status before exiting.
 - **Guardrails** — Failed sessions are logged to `.lisa/guardrails.md` and injected into future prompts so the agent avoids repeating the same mistakes.
 
+### Overseer
+
+Lisa can detect stuck providers — agents that appear to be running but are making no progress. When enabled, the overseer periodically checks `git status` in the working directory. If no changes are detected within the `stuck_threshold`, the provider process is killed and the error is eligible for fallback to the next model in the chain.
+
+### Test Runner Auto-Detection
+
+Lisa auto-detects `vitest` or `jest` in the project's `package.json` dependencies. When a test runner is found, mandatory test instructions are injected into the agent prompt, requiring the agent to write unit tests for new code and run `npm run test` before committing.
+
+### PR Body Formatting
+
+Agent-produced PR descriptions are automatically sanitized before creating the pull request: HTML tags are stripped, `*` bullets are normalized to `-`, and wall-of-text (single-line) descriptions are split into bullet points at sentence boundaries. Agents are also instructed to follow a structured markdown template (What / Why / Key changes / Testing).
+
+### Terminal Integration
+
+Lisa updates the terminal title to reflect the current activity (fetching, implementing, pushing, cooling down) and plays a bell notification when a session completes. This works in any terminal that supports OSC title sequences.
+
 ## License
 
 [MIT](LICENSE)
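One migration note worth calling out from this README diff: the deprecation warning added in `resolveModels` (see the `dist/index.js` diff) states that since v1.4.0 the `models` array lists model names within the configured provider, not provider names. Under that reading, a post-1.4.0 config might look like this sketch (the model names are illustrative placeholders, not verified model IDs):

```yaml
provider: claude # single provider used for every attempt
models:          # model names tried in order within that provider
  - opus         # illustrative placeholder
  - sonnet       # illustrative placeholder
source: linear
workflow: worktree
```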
package/dist/index.js
CHANGED
@@ -1,9 +1,9 @@
 #!/usr/bin/env node
 
 // src/cli.ts
-import { execSync as
+import { execSync as execSync6 } from "child_process";
 import { existsSync as existsSync7, readdirSync, readFileSync as readFileSync6 } from "fs";
-import { join as
+import { join as join10, resolve as resolvePath } from "path";
 import * as clack from "@clack/prompts";
 import { defineCommand, runMain } from "citty";
 import pc2 from "picocolors";
@@ -331,8 +331,8 @@ function banner() {
 }
 
 // src/loop.ts
-import { appendFileSync as
-import { join as
+import { appendFileSync as appendFileSync8, existsSync as existsSync6, readFileSync as readFileSync5, unlinkSync as unlinkSync6 } from "fs";
+import { join as join9, resolve as resolve5 } from "path";
 import { execa as execa3 } from "execa";
 
 // src/lifecycle.ts
@@ -1128,6 +1128,9 @@ var ClaudeProvider = class {
     writeFileSync4(promptFile, prompt, "utf-8");
     try {
       const flags = ["-p", "--dangerously-skip-permissions"];
+      if (opts.model) {
+        flags.push("--model", opts.model);
+      }
       if (opts.useNativeWorktree) {
         flags.push("--worktree");
       }
@@ -1184,16 +1187,16 @@ var ClaudeProvider = class {
   }
 };
 
-// src/providers/
+// src/providers/copilot.ts
 import { execSync as execSync2, spawn as spawn3 } from "child_process";
 import { appendFileSync as appendFileSync3, mkdtempSync as mkdtempSync2, unlinkSync as unlinkSync2, writeFileSync as writeFileSync5 } from "fs";
 import { tmpdir as tmpdir2 } from "os";
 import { join as join4 } from "path";
-var
-name = "
+var CopilotProvider = class {
+  name = "copilot";
   async isAvailable() {
     try {
-      execSync2("
+      execSync2("copilot version", { stdio: "ignore" });
       return true;
     } catch {
       return false;
@@ -1205,7 +1208,7 @@ var GeminiProvider = class {
     const promptFile = join4(tmpDir, "prompt.md");
     writeFileSync5(promptFile, prompt, "utf-8");
     try {
-      const proc = spawn3("sh", ["-c", `
+      const proc = spawn3("sh", ["-c", `copilot --allow-all -p "$(cat '${promptFile}')"`], {
         cwd: opts.cwd,
         stdio: ["ignore", "pipe", "pipe"]
       });
@@ -1257,16 +1260,180 @@ var GeminiProvider = class {
   }
 };
 
-// src/providers/
+// src/providers/cursor.ts
 import { execSync as execSync3, spawn as spawn4 } from "child_process";
 import { appendFileSync as appendFileSync4, mkdtempSync as mkdtempSync3, unlinkSync as unlinkSync3, writeFileSync as writeFileSync6 } from "fs";
 import { tmpdir as tmpdir3 } from "os";
 import { join as join5 } from "path";
+function findCursorBinary() {
+  for (const bin of ["agent", "cursor-agent"]) {
+    try {
+      execSync3(`${bin} --version`, { stdio: "ignore" });
+      return bin;
+    } catch {
+    }
+  }
+  return null;
+}
+var CursorProvider = class {
+  name = "cursor";
+  async isAvailable() {
+    return findCursorBinary() !== null;
+  }
+  async run(prompt, opts) {
+    const start = Date.now();
+    const bin = findCursorBinary();
+    if (!bin) {
+      return {
+        success: false,
+        output: "cursor agent (agent / cursor-agent) is not installed or not in PATH",
+        duration: Date.now() - start
+      };
+    }
+    const tmpDir = mkdtempSync3(join5(tmpdir3(), "lisa-"));
+    const promptFile = join5(tmpDir, "prompt.md");
+    writeFileSync6(promptFile, prompt, "utf-8");
+    try {
+      const proc = spawn4(
+        "sh",
+        ["-c", `${bin} -p "$(cat '${promptFile}')" --output-format text --force`],
+        {
+          cwd: opts.cwd,
+          stdio: ["ignore", "pipe", "pipe"]
+        }
+      );
+      const overseer = opts.overseer?.enabled ? startOverseer(proc, opts.cwd, opts.overseer) : null;
+      const chunks = [];
+      proc.stdout.on("data", (chunk) => {
+        const text2 = chunk.toString();
+        process.stdout.write(text2);
+        chunks.push(text2);
+        try {
+          appendFileSync4(opts.logFile, text2);
+        } catch {
+        }
+      });
+      proc.stderr.on("data", (chunk) => {
+        const text2 = chunk.toString();
+        process.stderr.write(text2);
+        try {
+          appendFileSync4(opts.logFile, text2);
+        } catch {
+        }
+      });
+      const exitCode = await new Promise((resolve6) => {
+        proc.on("close", (code) => {
+          overseer?.stop();
+          resolve6(code ?? 1);
+        });
+      });
+      if (overseer?.wasKilled()) {
+        chunks.push(STUCK_MESSAGE);
+      }
+      return {
+        success: exitCode === 0 && !overseer?.wasKilled(),
+        output: chunks.join(""),
+        duration: Date.now() - start
+      };
+    } catch (err) {
+      return {
+        success: false,
+        output: err instanceof Error ? err.message : String(err),
+        duration: Date.now() - start
+      };
+    } finally {
+      try {
+        unlinkSync3(promptFile);
+      } catch {
+      }
+    }
+  }
+};
+
+// src/providers/gemini.ts
+import { execSync as execSync4, spawn as spawn5 } from "child_process";
+import { appendFileSync as appendFileSync5, mkdtempSync as mkdtempSync4, unlinkSync as unlinkSync4, writeFileSync as writeFileSync7 } from "fs";
+import { tmpdir as tmpdir4 } from "os";
+import { join as join6 } from "path";
+var GeminiProvider = class {
+  name = "gemini";
+  async isAvailable() {
+    try {
+      execSync4("gemini --version", { stdio: "ignore" });
+      return true;
+    } catch {
+      return false;
+    }
+  }
+  async run(prompt, opts) {
+    const start = Date.now();
+    const tmpDir = mkdtempSync4(join6(tmpdir4(), "lisa-"));
+    const promptFile = join6(tmpDir, "prompt.md");
+    writeFileSync7(promptFile, prompt, "utf-8");
+    try {
+      const modelFlag = opts.model ? `--model ${opts.model}` : "";
+      const proc = spawn5("sh", ["-c", `gemini --yolo ${modelFlag} -p "$(cat '${promptFile}')"`], {
+        cwd: opts.cwd,
+        stdio: ["ignore", "pipe", "pipe"]
+      });
+      const overseer = opts.overseer?.enabled ? startOverseer(proc, opts.cwd, opts.overseer) : null;
+      const chunks = [];
+      proc.stdout.on("data", (chunk) => {
+        const text2 = chunk.toString();
+        process.stdout.write(text2);
+        chunks.push(text2);
+        try {
+          appendFileSync5(opts.logFile, text2);
+        } catch {
+        }
+      });
+      proc.stderr.on("data", (chunk) => {
+        const text2 = chunk.toString();
+        process.stderr.write(text2);
+        try {
+          appendFileSync5(opts.logFile, text2);
+        } catch {
+        }
+      });
+      const exitCode = await new Promise((resolve6) => {
+        proc.on("close", (code) => {
+          overseer?.stop();
+          resolve6(code ?? 1);
+        });
+      });
+      if (overseer?.wasKilled()) {
+        chunks.push(STUCK_MESSAGE);
+      }
+      return {
+        success: exitCode === 0 && !overseer?.wasKilled(),
+        output: chunks.join(""),
+        duration: Date.now() - start
+      };
+    } catch (err) {
+      return {
+        success: false,
+        output: err instanceof Error ? err.message : String(err),
+        duration: Date.now() - start
+      };
+    } finally {
+      try {
+        unlinkSync4(promptFile);
+      } catch {
+      }
+    }
+  }
+};
+
+// src/providers/opencode.ts
+import { execSync as execSync5, spawn as spawn6 } from "child_process";
+import { appendFileSync as appendFileSync6, mkdtempSync as mkdtempSync5, unlinkSync as unlinkSync5, writeFileSync as writeFileSync8 } from "fs";
+import { tmpdir as tmpdir5 } from "os";
+import { join as join7 } from "path";
 var OpenCodeProvider = class {
   name = "opencode";
   async isAvailable() {
     try {
-
+      execSync5("opencode --version", { stdio: "ignore" });
       return true;
     } catch {
       return false;
@@ -1274,11 +1441,11 @@ var OpenCodeProvider = class {
   }
   async run(prompt, opts) {
     const start = Date.now();
-    const tmpDir =
-    const promptFile =
-
+    const tmpDir = mkdtempSync5(join7(tmpdir5(), "lisa-"));
+    const promptFile = join7(tmpDir, "prompt.md");
+    writeFileSync8(promptFile, prompt, "utf-8");
     try {
-      const proc =
+      const proc = spawn6("sh", ["-c", `opencode run "$(cat '${promptFile}')"`], {
        cwd: opts.cwd,
        stdio: ["ignore", "pipe", "pipe"]
      });
@@ -1289,7 +1456,7 @@ var OpenCodeProvider = class {
         process.stdout.write(text2);
         chunks.push(text2);
         try {
-
+          appendFileSync6(opts.logFile, text2);
         } catch {
         }
       });
@@ -1297,7 +1464,7 @@ var OpenCodeProvider = class {
         const text2 = chunk.toString();
         process.stderr.write(text2);
         try {
-
+          appendFileSync6(opts.logFile, text2);
         } catch {
         }
       });
@@ -1323,7 +1490,7 @@ var OpenCodeProvider = class {
       };
     } finally {
       try {
-
+        unlinkSync5(promptFile);
       } catch {
       }
     }
@@ -1334,7 +1501,9 @@ var OpenCodeProvider = class {
 var providers = {
   claude: () => new ClaudeProvider(),
   gemini: () => new GeminiProvider(),
-  opencode: () => new OpenCodeProvider()
+  opencode: () => new OpenCodeProvider(),
+  copilot: () => new CopilotProvider(),
+  cursor: () => new CursorProvider()
 };
 async function getAvailableProviders() {
   const all = Object.values(providers).map((f) => f());
@@ -1371,31 +1540,39 @@ var ELIGIBLE_ERROR_PATTERNS = [
   /not installed/i,
   /not in PATH/i,
   /command not found/i,
-  /lisa-overseer/i
+  /lisa-overseer/i,
+  /named models unavailable/i,
+  /free plans can only use/i
 ];
 function isEligibleForFallback(output) {
   return ELIGIBLE_ERROR_PATTERNS.some((pattern) => pattern.test(output));
 }
+function isCompleteProviderExhaustion(attempts) {
+  if (attempts.length === 0) return false;
+  return attempts.every((a) => !a.success && a.error !== "Non-eligible error");
+}
 async function runWithFallback(models, prompt, opts) {
   const attempts = [];
-  for (const
-    const provider = createProvider(
+  for (const spec of models) {
+    const provider = createProvider(spec.provider);
     const available = await provider.isAvailable();
     if (!available) {
       attempts.push({
-        provider:
+        provider: spec.provider,
+        model: spec.model,
         success: false,
-        error: `Provider "${
+        error: `Provider "${spec.provider}" is not installed or not in PATH`,
         duration: 0
       });
       continue;
     }
     const guardrailsSection = opts.guardrailsDir ? buildGuardrailsSection(opts.guardrailsDir) : "";
     const fullPrompt = guardrailsSection ? `${prompt}${guardrailsSection}` : prompt;
-    const result = await provider.run(fullPrompt, opts);
+    const result = await provider.run(fullPrompt, { ...opts, model: spec.model });
     if (result.success) {
       attempts.push({
-        provider:
+        provider: spec.provider,
+        model: spec.model,
         success: true,
         duration: result.duration
       });
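The fallback gate in the hunk above is a simple regex scan over provider output, plus an "every attempt failed for an infrastructure reason" check. A standalone sketch of the same idea (the patterns are a subset of `ELIGIBLE_ERROR_PATTERNS` above; the attempt shape mirrors what `runWithFallback` pushes):

```javascript
// Transient-error patterns: any match makes a failed run eligible for
// fallback to the next model in the chain.
const ELIGIBLE = [
  /429/, /quota/i, /timeout/i,
  /not installed/i, /not in PATH/i, /command not found/i,
  /named models unavailable/i, /free plans can only use/i
];

function isEligibleForFallback(output) {
  return ELIGIBLE.some((p) => p.test(output));
}

// Every attempt failed and none was a "Non-eligible error" (i.e. a real
// implementation failure) -> the loop should stop rather than keep polling.
function isCompleteProviderExhaustion(attempts) {
  if (attempts.length === 0) return false;
  return attempts.every((a) => !a.success && a.error !== "Non-eligible error");
}

console.log(isEligibleForFallback("HTTP 429: rate limited")); // true
console.log(isEligibleForFallback("SyntaxError in generated code")); // false
```

The split matters for the run loop: eligible errors mean "try the next model", while a non-eligible error means the model ran but the work itself failed, so retrying another model is pointless.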
@@ -1403,7 +1580,7 @@ async function runWithFallback(models, prompt, opts) {
         success: true,
         output: result.output,
         duration: result.duration,
-        providerUsed: model,
+        providerUsed: spec.model ? `${spec.provider}/${spec.model}` : spec.provider,
         provider,
         attempts
       };
@@ -1412,14 +1589,15 @@ async function runWithFallback(models, prompt, opts) {
       appendEntry(opts.guardrailsDir, {
         issueId: opts.issueId,
         date: (/* @__PURE__ */ new Date()).toISOString().slice(0, 10),
-        provider:
+        provider: spec.provider,
         errorType: extractErrorType(result.output),
         context: extractContext(result.output)
       });
     }
     const eligible = isEligibleForFallback(result.output);
     attempts.push({
-      provider:
+      provider: spec.provider,
+      model: spec.model,
       success: false,
       error: eligible ? "Eligible error (quota/unavailable/timeout)" : "Non-eligible error",
       duration: result.duration
@@ -1429,7 +1607,7 @@ async function runWithFallback(models, prompt, opts) {
       success: false,
       output: result.output,
       duration: result.duration,
-      providerUsed: model,
+      providerUsed: spec.model ? `${spec.provider}/${spec.model}` : spec.provider,
       provider,
       attempts
     };
@@ -1440,7 +1618,7 @@ async function runWithFallback(models, prompt, opts) {
     success: false,
     output: formatAttemptsReport(attempts),
     duration: totalDuration,
-    providerUsed: attempts[attempts.length - 1]?.provider ?? models[0]?.provider ?? "claude",
+    providerUsed: attempts[attempts.length - 1]?.provider ?? models[0]?.provider ?? "claude",
     attempts
   };
 }
@@ -1450,7 +1628,8 @@ function formatAttemptsReport(attempts) {
     const status2 = a.success ? "OK" : "FAILED";
     const error2 = a.error ? ` \u2014 ${a.error}` : "";
     const duration = a.duration > 0 ? ` (${Math.round(a.duration / 1e3)}s)` : "";
-
+    const label = a.model ? `${a.provider}/${a.model}` : a.provider;
+    lines.push(` ${i + 1}. ${label}: ${status2}${error2}${duration}`);
   }
   return lines.join("\n");
 }
@@ -1893,8 +2072,8 @@ function resetTitle() {
 }
 
 // src/worktree.ts
-import { appendFileSync as
-import { join as
+import { appendFileSync as appendFileSync7, existsSync as existsSync5, readFileSync as readFileSync4 } from "fs";
+import { join as join8, resolve as resolve4 } from "path";
 import { execa as execa2 } from "execa";
 var WORKTREES_DIR = ".worktrees";
 function generateBranchName(issueId, title) {
@@ -1909,7 +2088,7 @@ async function cleanupOrphanedWorktree(repoRoot, branchName) {
   if (!branchList.trim()) {
     return false;
   }
-  const worktreePath =
+  const worktreePath = join8(repoRoot, WORKTREES_DIR, branchName);
   const { stdout: worktreeList } = await execa2("git", ["worktree", "list", "--porcelain"], {
     cwd: repoRoot,
     reject: false
@@ -1922,7 +2101,7 @@ async function cleanupOrphanedWorktree(repoRoot, branchName) {
   return true;
 }
 async function createWorktree(repoRoot, branchName, baseBranch) {
-  const worktreePath =
+  const worktreePath = join8(repoRoot, WORKTREES_DIR, branchName);
   await cleanupOrphanedWorktree(repoRoot, branchName);
   await execa2("git", ["fetch", "origin", baseBranch], { cwd: repoRoot });
   await execa2("git", ["worktree", "add", "-b", branchName, worktreePath, `origin/${baseBranch}`], {
@@ -1937,16 +2116,16 @@ async function removeWorktree(repoRoot, worktreePath) {
   await execa2("git", ["worktree", "prune"], { cwd: repoRoot });
 }
 function ensureWorktreeGitignore(repoRoot) {
-  const gitignorePath =
+  const gitignorePath = join8(repoRoot, ".gitignore");
   if (!existsSync5(gitignorePath)) {
-
+    appendFileSync7(gitignorePath, `${WORKTREES_DIR}
 `);
     return;
   }
   const content = readFileSync4(gitignorePath, "utf-8");
   if (!content.split("\n").some((line) => line.trim() === WORKTREES_DIR)) {
     const separator = content.endsWith("\n") ? "" : "\n";
-
+    appendFileSync7(gitignorePath, `${separator}${WORKTREES_DIR}
 `);
   }
 }
@@ -1977,15 +2156,15 @@ function determineRepoPath(repos, issue, workspace) {
   if (repos.length === 0) return void 0;
   if (issue.repo) {
     const match = repos.find((r) => r.name === issue.repo);
-    if (match) return
+    if (match) return join8(workspace, match.path);
   }
   for (const r of repos) {
     if (r.match && issue.title.startsWith(r.match)) {
-      return
+      return join8(workspace, r.path);
     }
   }
   const first = repos[0];
-  return first ?
+  return first ? join8(workspace, first.path) : void 0;
 }
 async function detectFeatureBranches(repos, issueId, workspace, globalBaseBranch) {
   const entries = repos.length > 0 ? repos.map((r) => ({ path: resolve4(workspace, r.path), baseBranch: r.base_branch })) : [{ path: workspace, baseBranch: globalBaseBranch }];
@@ -2025,8 +2204,21 @@ async function detectFeatureBranches(repos, issueId, workspace, globalBaseBranch
 var activeCleanup = null;
 var shuttingDown = false;
 function resolveModels(config2) {
-  if (config2.models
-
+  if (!config2.models || config2.models.length === 0) {
+    return [{ provider: config2.provider }];
+  }
+  const knownProviders = /* @__PURE__ */ new Set(["claude", "gemini", "opencode", "copilot", "cursor"]);
+  for (const m of config2.models) {
+    if (knownProviders.has(m) && m !== config2.provider) {
+      warn(
+        `Model "${m}" looks like a provider name but provider is "${config2.provider}". Since v1.4.0, "models" lists model names within the configured provider, not provider names. Update your .lisa/config.yaml.`
+      );
+    }
+  }
+  return config2.models.map((m) => ({
+    provider: config2.provider,
+    model: m === config2.provider ? void 0 : m
+  }));
 }
 function buildPrBody(providerUsed, description) {
   const lines = [];
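The `resolveModels` change above is the heart of the 1.4.0 migration: entries in `models` are now treated as model names under the single configured provider, and an entry equal to the provider name means "use the provider's default model". A standalone sketch of just the mapping (the deprecation warning is omitted):

```javascript
// { provider: "claude", models: ["claude", "opus"] } becomes a list of
// { provider, model } specs; "claude" maps to model: undefined (default).
function resolveModels(config) {
  if (!config.models || config.models.length === 0) {
    return [{ provider: config.provider }];
  }
  return config.models.map((m) => ({
    provider: config.provider,
    model: m === config.provider ? undefined : m
  }));
}

// Each spec later reaches provider.run(prompt, { ...opts, model: spec.model }),
// which is where Claude's --model and Gemini's --model flags come from.
const specs = resolveModels({ provider: "claude", models: ["claude", "opus"] });
console.log(specs.length); // 2
```

This is also why the fallback report labels attempts as `provider/model` when a model is set and just `provider` otherwise.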
@@ -2046,7 +2238,7 @@ function buildPrBody(providerUsed, description) {
 var PR_TITLE_FILE = ".pr-title";
 function readPrTitle(cwd) {
   try {
-    const title = readFileSync5(
+    const title = readFileSync5(join9(cwd, PR_TITLE_FILE), "utf-8").trim().split("\n")[0]?.trim();
     return title || null;
   } catch {
     return null;
@@ -2054,13 +2246,13 @@ function readPrTitle(cwd) {
 }
 function cleanupPrTitle(cwd) {
   try {
-
+    unlinkSync6(join9(cwd, PR_TITLE_FILE));
   } catch {
   }
 }
 var PLAN_FILE = ".lisa-plan.json";
 function readLisaPlan(dir) {
-  const planPath =
+  const planPath = join9(dir, PLAN_FILE);
   if (!existsSync6(planPath)) return null;
   try {
     return JSON.parse(readFileSync5(planPath, "utf-8").trim());
@@ -2070,13 +2262,13 @@ function readLisaPlan(dir) {
 }
 function cleanupPlan(dir) {
   try {
-
+    unlinkSync6(join9(dir, PLAN_FILE));
   } catch {
   }
 }
 var MANIFEST_FILE = ".lisa-manifest.json";
 function readLisaManifest(dir) {
-  const manifestPath =
+  const manifestPath = join9(dir, MANIFEST_FILE);
   if (!existsSync6(manifestPath)) return null;
   try {
     return JSON.parse(readFileSync5(manifestPath, "utf-8").trim());
@@ -2086,7 +2278,7 @@ function readLisaManifest(dir) {
 }
 function cleanupManifest(dir) {
   try {
-
+    unlinkSync6(join9(dir, MANIFEST_FILE));
   } catch {
   }
 }
@@ -2212,7 +2404,7 @@ async function runLoop(config2, opts) {
   const models = resolveModels(config2);
   installSignalHandlers();
   log(
-    `Starting loop (models: ${models.join(" \u2192 ")}, source: ${config2.source}, label: ${config2.source_config.label}, workflow: ${config2.workflow})`
+    `Starting loop (models: ${models.map((m) => m.model ? `${m.provider}/${m.model}` : m.provider).join(" \u2192 ")}, source: ${config2.source}, label: ${config2.source_config.label}, workflow: ${config2.workflow})`
   );
   if (!opts.dryRun) {
     await recoverOrphanIssues(source, config2);
@@ -2243,7 +2435,9 @@ async function runLoop(config2, opts) {
       );
     }
     log(`[dry-run] Workflow mode: ${config2.workflow}`);
-    log(
+    log(
+      `[dry-run] Models priority: ${models.map((m) => m.model ? `${m.provider}/${m.model}` : m.provider).join(" \u2192 ")}`
+    );
     log("[dry-run] Then implement, push, create PR, and update issue status");
     break;
   }
@@ -2319,6 +2513,12 @@ async function runLoop(config2, opts) {
       log("Single iteration mode. Exiting.");
       break;
     }
+    if (isCompleteProviderExhaustion(sessionResult.fallback.attempts)) {
+      error(
+        "All providers exhausted due to infrastructure issues (quota, plan limits, or not installed). Fix your provider configuration and restart lisa."
+      );
+      break;
+    }
     log(`Cooling down ${config2.loop.cooldown}s before next issue...`);
     setTitle("Lisa \u2014 cooling down...");
     await sleep(config2.loop.cooldown * 1e3);
@@ -2444,7 +2644,7 @@ async function runWorktreeSession(config2, issue, logFile, session, models) {
   const workspace = resolve5(config2.workspace);
   const repoPath = determineRepoPath(config2.repos, issue, workspace) ?? workspace;
   const defaultBranch = resolveBaseBranch(config2, repoPath);
-  const primaryProvider = createProvider(models[0] ?? "claude");
+  const primaryProvider = createProvider(models[0]?.provider ?? "claude");
   const useNativeWorktree = primaryProvider.supportsNativeWorktree === true;
   if (useNativeWorktree) {
     return runNativeWorktreeSession(
@@ -2473,7 +2673,7 @@ async function runNativeWorktreeSession(config2, issue, logFile, session, models
     stopSpinner();
     if (!started) {
       error(`Lifecycle startup failed for ${issue.id}. Aborting session.`);
-      return failResult(models[0] ?? "claude");
+      return failResult(models[0]?.provider ?? "claude");
     }
   }
   const testRunner = detectTestRunner(repoPath);
@@ -2493,7 +2693,7 @@ async function runNativeWorktreeSession(config2, issue, logFile, session, models
   });
   stopSpinner();
   try {
-
+    appendFileSync8(
       logFile,
       `
 ${"=".repeat(80)}
@@ -2589,13 +2789,13 @@ async function runManualWorktreeSession(config2, issue, logFile, session, models
     error(`Failed to create worktree: ${err instanceof Error ? err.message : String(err)}`);
     return {
       success: false,
-      providerUsed: models[0] ?? "claude",
+      providerUsed: models[0]?.provider ?? "claude",
       prUrls: [],
       fallback: {
         success: false,
         output: "",
         duration: 0,
-        providerUsed: models[0] ?? "claude",
+        providerUsed: models[0]?.provider ?? "claude",
         attempts: []
       }
     };
@@ -2612,13 +2812,13 @@ async function runManualWorktreeSession(config2, issue, logFile, session, models
     await cleanupWorktree(repoPath, worktreePath);
     return {
       success: false,
-      providerUsed: models[0] ?? "claude",
+      providerUsed: models[0]?.provider ?? "claude",
       prUrls: [],
       fallback: {
         success: false,
         output: "",
         duration: 0,
-        providerUsed: models[0] ?? "claude",
+        providerUsed: models[0]?.provider ?? "claude",
         attempts: []
       }
     };
@@ -2641,7 +2841,7 @@ async function runManualWorktreeSession(config2, issue, logFile, session, models
   });
   stopSpinner();
   try {
-
+    appendFileSync8(
       logFile,
       `
 ${"=".repeat(80)}
@@ -2745,7 +2945,7 @@ async function runWorktreeMultiRepoSession(config2, issue, logFile, session, mod
   });
   stopSpinner();
   try {
-
+    appendFileSync8(
       logFile,
       `
 ${"=".repeat(80)}
@@ -2838,7 +3038,7 @@ async function runMultiRepoStep(config2, issue, step, previousResults, logFile,
   } catch (err) {
     stopSpinner();
     error(`Failed to create worktree: ${err instanceof Error ? err.message : String(err)}`);
-    return failResult(models[0] ?? "claude");
+    return failResult(models[0]?.provider ?? "claude");
   }
   stopSpinner();
   ok(`Worktree created at ${worktreePath}`);
@@ -2855,7 +3055,7 @@ async function runMultiRepoStep(config2, issue, step, previousResults, logFile,
   });
   stopSpinner();
   try {
-
+    appendFileSync8(
       logFile,
       `
 ${"=".repeat(80)}
@@ -2959,13 +3159,13 @@ async function runBranchSession(config2, issue, logFile, session, models) {
     error(`Lifecycle startup failed for ${issue.id}. Aborting session.`);
     return {
       success: false,
-      providerUsed: models[0] ?? "claude",
+      providerUsed: models[0]?.provider ?? "claude",
       prUrls: [],
       fallback: {
         success: false,
         output: "",
         duration: 0,
-        providerUsed: models[0] ?? "claude",
+        providerUsed: models[0]?.provider ?? "claude",
         attempts: []
       }
     };
@@ -2983,7 +3183,7 @@ async function runBranchSession(config2, issue, logFile, session, models) {
   });
   stopSpinner();
   try {
-
+    appendFileSync8(
       logFile,
       `
 ${"=".repeat(80)}
@@ -3221,7 +3421,13 @@ async function runConfigWizard() {
   const providerLabels = {
     claude: "Claude Code",
     gemini: "Gemini CLI",
-    opencode: "OpenCode"
+    opencode: "OpenCode",
+    copilot: "GitHub Copilot CLI",
+    cursor: "Cursor Agent"
+  };
+  const providerModels = {
+    claude: ["claude-opus-4-5", "claude-sonnet-4-5", "claude-haiku-4-5"],
+    gemini: ["gemini-2.5-pro", "gemini-2.0-flash", "gemini-1.5-pro"]
   };
   const available = await getAvailableProviders();
   if (available.length === 0) {
@@ -3252,6 +3458,21 @@ After installing, run ${pc2.cyan("lisa init")} again.`
     if (clack.isCancel(selected)) return process.exit(0);
     providerName = selected;
   }
+  let selectedModels = [];
+  const availableModels = providerModels[providerName];
+  if (availableModels && availableModels.length > 0) {
+    const modelSelection = await clack.multiselect({
+      message: "Which models to use? (first = primary, rest = fallbacks in order)",
+      options: availableModels.map((m, i) => ({
+        value: m,
+        label: m,
+        hint: i === 0 ? "primary" : `fallback ${i}`
+      })),
+      required: false
+    });
+    if (clack.isCancel(modelSelection)) return process.exit(0);
+    selectedModels = modelSelection ?? [];
+  }
   const source = await clack.select({
     message: "Where do your issues live?",
     options: [
@@ -3378,6 +3599,7 @@ Then run: ${pc2.cyan(`source ${shell}`)}`
   }
   const cfg = {
     provider: providerName,
+    ...selectedModels.length > 0 ? { models: selectedModels } : {},
     source,
     source_config: {
       team,
@@ -3424,12 +3646,12 @@ async function detectGitHubMethod() {
 }
 async function detectGitRepos() {
   const cwd = process.cwd();
-  if (existsSync7(
+  if (existsSync7(join10(cwd, ".git"))) {
     clack.log.info(`Detected git repository in current directory.`);
     return [];
   }
   const entries = readdirSync(cwd, { withFileTypes: true });
-  const gitDirs = entries.filter((e) => e.isDirectory() && existsSync7(
+  const gitDirs = entries.filter((e) => e.isDirectory() && existsSync7(join10(cwd, e.name, ".git"))).map((e) => e.name);
   if (gitDirs.length === 0) {
     return [];
   }
@@ -3439,7 +3661,7 @@ async function detectGitRepos() {
   });
   if (clack.isCancel(selected)) return process.exit(0);
   return selected.map((dir) => ({
-    name: getGitRepoName(
+    name: getGitRepoName(join10(cwd, dir)) ?? dir,
     path: `./${dir}`,
     match: "",
     base_branch: ""
@@ -3447,7 +3669,7 @@ async function detectGitRepos() {
 }
 function detectDefaultBranch(repoPath) {
   try {
-    const ref =
+    const ref = execSync6("git symbolic-ref refs/remotes/origin/HEAD --short", {
       cwd: repoPath,
       encoding: "utf-8"
     }).trim();
@@ -3458,7 +3680,7 @@ function detectDefaultBranch(repoPath) {
 }
 function getGitRepoName(repoPath) {
   try {
-    const url =
+    const url = execSync6("git remote get-url origin", { cwd: repoPath, encoding: "utf-8" }).trim();
     const match = url.match(/\/([^/]+?)(?:\.git)?$/) ?? url.match(/:([^/]+?)(?:\.git)?$/);
     return match?.[1] ?? null;
   } catch {