free-coding-models 0.2.11 → 0.2.12

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -2,10 +2,27 @@
2
2
 
3
3
  ---
4
4
 
5
+ ## 0.2.12
6
+
7
+ ### Added
8
+ - **Auto-select models for all external tools**: All 10 supported tools (Aider, Crush, Goose, Claude Code, Codex, Gemini, Qwen, OpenHands, Amp, Pi) now automatically configure and pre-select the chosen model on launch — no manual model selection needed after pressing Enter.
9
+ - **Changelog loader utility**: New `src/changelog-loader.js` module parses CHANGELOG.md so a future TUI integration can display changes directly in the app instead of opening a browser.
10
+
11
+ ### Fixed
12
+ - **Infinite update loop on startup**: Disabled forced auto-update that caused the app to detect the same update repeatedly after restarting. The app now checks for updates in the background without forcing installation.
13
+ - **Removed disruptive browser window**: The auto-update process no longer opens a browser window to show the changelog — it now shows update information in the terminal only.
14
+ - **Update failure tracking**: If update checks fail 3+ times, the app displays a prominent red footer warning: `⚠ OUTDATED version, please update` with manual update instructions instead of crashing.
15
+
16
+ ### Changed
17
+ - **OpenHands integration improved**: Now sets `LLM_MODEL` and `LLM_API_KEY` environment variables for proper model pre-selection on launch.
18
+ - **Amp integration improved**: Now writes `amp.model` to config file with the selected model ID.
19
+
20
+ ---
21
+
5
22
  ## 0.2.11
6
23
 
7
24
  ### Added
8
- - **Pi Coding Agent support**: Enabled Pi (pi.dev) as a launchable mode in the Z key cycle. Select a model and press Enter to configure Pi's config file and spawn the PI coding agent CLI with the chosen model and API endpoint.
25
+ - **Pi Coding Agent support**: Enabled Pi (pi.dev) as a launchable mode in the Z key cycle. Select a model and press Enter to auto-configure Pi's model config and settings, then spawn the PI coding agent CLI with the chosen model pre-selected as the default.
9
26
 
10
27
  ---
11
28
 
package/README.md CHANGED
@@ -85,8 +85,8 @@ By Vanessa Depraute
85
85
  - **🎮 Interactive selection** — Navigate with arrow keys directly in the table, press Enter to act
86
86
  - **💻 OpenCode integration** — Auto-detects NIM setup, sets model as default, launches OpenCode
87
87
  - **🦞 OpenClaw integration** — Sets selected model as default provider in `~/.openclaw/openclaw.json`
88
- - **🧰 Public tool launchers** — `Enter` can auto-configure and launch `OpenCode CLI`, `OpenCode Desktop`, `OpenClaw`, `Crush`, and `Goose`
89
- - **🔌 Install Endpoints flow** — Press `Y` to install one configured provider directly into `OpenCode CLI`, `OpenCode Desktop`, `OpenClaw`, `Crush`, or `Goose`, either with the full provider catalog or a curated subset of models
88
+ - **🧰 Public tool launchers** — `Enter` auto-configures and launches 13 tools: `OpenCode CLI`, `OpenCode Desktop`, `OpenClaw`, `Crush`, `Goose`, `Aider`, `Claude Code`, `Codex`, `Gemini`, `Qwen`, `OpenHands`, `Amp`, and `Pi`. All tools auto-select the chosen model on launch.
89
+ - **🔌 Install Endpoints flow** — Press `Y` to install one configured provider directly into `OpenCode CLI`, `OpenCode Desktop`, `OpenClaw`, `Crush`, `Goose`, `Aider`, or `Gemini`, either with the full provider catalog or a curated subset of models
90
90
  - **📝 Feature Request (J key)** — Send anonymous feedback directly to the project team
91
91
  - **🐛 Bug Report (I key)** — Send anonymous bug reports directly to the project team
92
92
  - **🎨 Clean output** — Zero scrollback pollution, interface stays open until Ctrl+C
@@ -594,6 +594,30 @@ Use **↑↓** to scroll and **Esc** or **X** to return to the main table.
594
594
 
595
595
  ---
596
596
 
597
+ ## 🧰 Supported Tool Launchers
598
+
599
+ You can use `free-coding-models` with 13 AI coding tools. When you select a model and press Enter, the launcher automatically configures the tool and pre-selects your chosen model:
600
+
601
+ | Tool | Flag | Auto-Config |
602
+ |------|------|------------|
603
+ | OpenCode CLI | `--opencode` | ~/.config/opencode/opencode.json |
604
+ | OpenCode Desktop | `--opencode-desktop` | Opens Desktop app |
605
+ | OpenClaw | `--openclaw` | ~/.openclaw/openclaw.json |
606
+ | Crush | `--crush` | ~/.config/crush/crush.json |
607
+ | Goose | `--goose` | Environment variables |
608
+ | **Aider** | `--aider` | ~/.aider.conf.yml |
609
+ | **Claude Code** | `--claude-code` | CLI flag |
610
+ | **Codex** | `--codex` | CLI flag |
611
+ | **Gemini** | `--gemini` | ~/.gemini/settings.json |
612
+ | **Qwen** | `--qwen` | ~/.qwen/settings.json |
613
+ | **OpenHands** | `--openhands` | LLM_MODEL env var |
614
+ | **Amp** | `--amp` | ~/.config/amp/settings.json |
615
+ | **Pi** | `--pi` | ~/.pi/agent/settings.json |
616
+
617
+ Press **Z** to cycle through different tool modes in the TUI, or use flags to start in your preferred mode.
618
+
619
+ ---
620
+
597
621
  ## 🔌 OpenCode Integration
598
622
 
599
623
  **The easiest way** — let `free-coding-models` do everything:
@@ -604,7 +628,7 @@ Use **↑↓** to scroll and **Esc** or **X** to return to the main table.
604
628
  4. **Press Enter** — tool automatically:
605
629
  - Detects if NVIDIA NIM is configured in OpenCode
606
630
  - Sets your selected model as default in `~/.config/opencode/opencode.json`
607
- - Launches OpenCode with the model ready to use
631
+ - Launches OpenCode with the model pre-selected and ready to use
608
632
 
609
633
  ### tmux sub-agent panes
610
634
 
@@ -261,47 +261,26 @@ async function main() {
261
261
  ts: new Date().toISOString(),
262
262
  })
263
263
 
264
- // 📖 Check for updates in the background
264
+ // 📖 Check for updates in the background (non-blocking, non-forced)
265
+ // 📖 The old auto-update on startup caused infinite loops, so we've moved to:
266
+ // 📖 1. Optional prompt when a new version is available (user chooses to update or not)
267
+ // 📖 2. Display "OUTDATED" in TUI footer if update check fails repeatedly
265
268
  let latestVersion = null
269
+ let isOutdated = false
266
270
  try {
267
271
  latestVersion = await checkForUpdate()
268
- } catch {
269
- // Silently fail - don't block the app if npm registry is unreachable
270
- }
271
-
272
- // 📖 Auto-update system: force updates and handle changelog automatically
273
- // 📖 Skip when running from source (dev mode) — .git means we're in a repo checkout,
274
- // 📖 not a global npm install. Auto-update would overwrite the global copy but restart
275
- // 📖 the local one, causing an infinite update loop since LOCAL_VERSION never changes.
276
- const isDevMode = existsSync(join(dirname(new URL(import.meta.url).pathname.replace(/^\/([A-Z]:)/, '$1')), '..', '.git'))
277
- if (latestVersion && !isDevMode) {
278
- console.log()
279
- console.log(chalk.bold.red(' ⚠ AUTO-UPDATE AVAILABLE'))
280
- console.log(chalk.red(` Version ${latestVersion} will be installed automatically`))
281
- console.log(chalk.dim(' Opening changelog in browser...'))
282
- console.log()
283
-
284
- // 📖 Open changelog automatically
285
- const { execSync } = require('child_process')
286
- const changelogUrl = 'https://github.com/vava-nessa/free-coding-models/releases'
287
- try {
288
- if (isMac) {
289
- execSync(`open "${changelogUrl}"`, { stdio: 'ignore' })
290
- } else if (isWindows) {
291
- execSync(`start "" "${changelogUrl}"`, { stdio: 'ignore' })
292
- } else {
293
- execSync(`xdg-open "${changelogUrl}"`, { stdio: 'ignore' })
294
- }
295
- console.log(chalk.green(' ✅ Changelog opened in browser'))
296
- } catch {
297
- console.log(chalk.yellow(' ⚠ Could not open browser automatically'))
298
- console.log(chalk.dim(` Visit manually: ${changelogUrl}`))
272
+ // 📖 Track update check failures - if it fails 3+ times, mark as outdated
273
+ if (!latestVersion && config.settings?.updateCheckFailures >= 3) {
274
+ isOutdated = true
299
275
  }
300
-
301
- // 📖 Force update immediately
302
- console.log(chalk.cyan(' 🚀 Starting auto-update...'))
303
- runUpdate(latestVersion)
304
- return // runUpdate will restart the process
276
+ } catch (err) {
277
+ // 📖 Silently fail - don't block the app if npm registry is unreachable
278
+ // 📖 But track the failure for outdated detection
279
+ const failures = (config.settings?.updateCheckFailures || 0) + 1
280
+ if (!config.settings) config.settings = {}
281
+ config.settings.updateCheckFailures = Math.min(failures, 3)
282
+ if (failures >= 3) isOutdated = true
283
+ saveConfig(config)
305
284
  }
306
285
 
307
286
  // 📖 Dynamic OpenRouter free model discovery — fetch live free models from API
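
The failure-tracking policy above can be reduced to a pure helper. This is an illustrative sketch, not the package's actual API: the helper name and the reset-to-zero on a successful check are assumptions; the shipped code mutates `config.settings` in place and persists it with `saveConfig`.

```javascript
// Hypothetical sketch of the update-check failure tracking described above.
// OUTDATED_THRESHOLD mirrors the "3+ failures" rule; resetting the counter on
// a successful check is an assumption, not necessarily what the package does.
const OUTDATED_THRESHOLD = 3

function trackUpdateCheck(settings, checkSucceeded) {
  const failures = checkSucceeded
    ? 0
    : Math.min((settings.updateCheckFailures || 0) + 1, OUTDATED_THRESHOLD)
  return {
    settings: { ...settings, updateCheckFailures: failures },
    isOutdated: failures >= OUTDATED_THRESHOLD,
  }
}
```

Keeping the counter capped at the threshold (as the real code does with `Math.min`) avoids unbounded growth in the persisted config.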
@@ -392,6 +371,8 @@ async function main() {
392
371
  lastPingTime: now, // 📖 Track when last ping cycle started
393
372
  lastUserActivityAt: now, // 📖 Any keypress refreshes this timer; inactivity can force slow mode.
394
373
  resumeSpeedOnActivity: false, // 📖 Set after idle slowdown so the next activity restarts a 60s speed burst.
374
+ latestVersion, // 📖 Latest npm version available (null if none or check failed)
375
+ isOutdated, // 📖 Set to true if update check failed 3+ times (show red "OUTDATED" footer)
395
376
  mode, // 📖 'opencode' or 'openclaw' — controls Enter action
396
377
  tierFilterMode: 0, // 📖 Index into TIER_CYCLE (0=All, 1=S+, 2=S, ...)
397
378
  originFilterMode: 0, // 📖 Index into ORIGIN_CYCLE (0=All, then providers)
@@ -813,7 +794,7 @@ async function main() {
813
794
  ? overlays.renderHelp()
814
795
  : state.logVisible
815
796
  ? overlays.renderLog()
816
- : renderTable(state.results, state.pendingPings, state.frame, state.cursor, state.sortColumn, state.sortDirection, state.pingInterval, state.lastPingTime, state.mode, state.tierFilterMode, state.scrollOffset, state.terminalRows, state.terminalCols, state.originFilterMode, state.activeProfile, state.profileSaveMode, state.profileSaveBuffer, state.proxyStartupStatus, state.pingMode, state.pingModeSource, state.hideUnconfiguredModels, state.widthWarningStartedAt, state.widthWarningDismissed, state.settingsUpdateState, state.settingsUpdateLatestVersion, getProxySettings(state.config).enabled === true)
797
+ : renderTable(state.results, state.pendingPings, state.frame, state.cursor, state.sortColumn, state.sortDirection, state.pingInterval, state.lastPingTime, state.mode, state.tierFilterMode, state.scrollOffset, state.terminalRows, state.terminalCols, state.originFilterMode, state.activeProfile, state.profileSaveMode, state.profileSaveBuffer, state.proxyStartupStatus, state.pingMode, state.pingModeSource, state.hideUnconfiguredModels, state.widthWarningStartedAt, state.widthWarningDismissed, state.settingsUpdateState, state.settingsUpdateLatestVersion, getProxySettings(state.config).enabled === true, state.isOutdated, state.latestVersion)
817
798
  process.stdout.write(ALT_HOME + content)
818
799
  }, Math.round(1000 / FPS))
819
800
 
@@ -821,7 +802,7 @@ async function main() {
821
802
  const initialVisible = state.results.filter(r => !r.hidden)
822
803
  state.visibleSorted = sortResultsWithPinnedFavorites(initialVisible, state.sortColumn, state.sortDirection)
823
804
 
824
- process.stdout.write(ALT_HOME + renderTable(state.results, state.pendingPings, state.frame, state.cursor, state.sortColumn, state.sortDirection, state.pingInterval, state.lastPingTime, state.mode, state.tierFilterMode, state.scrollOffset, state.terminalRows, state.terminalCols, state.originFilterMode, state.activeProfile, state.profileSaveMode, state.profileSaveBuffer, state.proxyStartupStatus, state.pingMode, state.pingModeSource, state.hideUnconfiguredModels, state.widthWarningStartedAt, state.widthWarningDismissed, state.settingsUpdateState, state.settingsUpdateLatestVersion, getProxySettings(state.config).enabled === true))
805
+ process.stdout.write(ALT_HOME + renderTable(state.results, state.pendingPings, state.frame, state.cursor, state.sortColumn, state.sortDirection, state.pingInterval, state.lastPingTime, state.mode, state.tierFilterMode, state.scrollOffset, state.terminalRows, state.terminalCols, state.originFilterMode, state.activeProfile, state.profileSaveMode, state.profileSaveBuffer, state.proxyStartupStatus, state.pingMode, state.pingModeSource, state.hideUnconfiguredModels, state.widthWarningStartedAt, state.widthWarningDismissed, state.settingsUpdateState, state.settingsUpdateLatestVersion, getProxySettings(state.config).enabled === true, state.isOutdated, state.latestVersion))
825
806
 
826
807
  // 📖 If --recommend was passed, auto-open the Smart Recommend overlay on start
827
808
  if (cliArgs.recommendMode) {
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "free-coding-models",
3
- "version": "0.2.11",
3
+ "version": "0.2.12",
4
4
  "description": "Find the fastest coding LLM models in seconds — ping free models from multiple providers, pick the best one for OpenCode, Cursor, or any AI coding assistant.",
5
5
  "keywords": [
6
6
  "nvidia",
@@ -0,0 +1,118 @@
1
+ /**
2
+ * @file changelog-loader.js
3
+ * @description Load and parse CHANGELOG.md for display in the TUI
4
+ *
5
+ * @functions
6
+ * → loadChangelog() — Read and parse CHANGELOG.md into structured format
7
+ * → getLatestChanges(version) — Return changelog for a specific version
8
+ * → formatChangelogForDisplay(version) — Format for TUI rendering
9
+ *
10
+ * @exports loadChangelog, getLatestChanges, formatChangelogForDisplay
11
+ */
12
+
13
+ import { readFileSync, existsSync } from 'fs'
14
+ import { dirname, join } from 'path'
15
+ import { fileURLToPath } from 'url'
16
+
17
+ const __filename = fileURLToPath(import.meta.url)
18
+ const __dirname = dirname(__filename)
19
+ const CHANGELOG_PATH = join(__dirname, '..', 'CHANGELOG.md')
20
+
21
+ /**
22
+ * 📖 loadChangelog: Read and parse CHANGELOG.md
23
+ * @returns {Object} { versions: { '0.2.11': { added: [], fixed: [], changed: [] }, ... } }
24
+ */
25
+ export function loadChangelog() {
26
+ if (!existsSync(CHANGELOG_PATH)) return { versions: {} }
27
+
28
+ const content = readFileSync(CHANGELOG_PATH, 'utf8')
29
+ const versions = {}
30
+ const lines = content.split('\n')
31
+ let currentVersion = null
32
+ let currentSection = null
33
+ let currentItems = []
34
+
35
+ for (const line of lines) {
36
+ // 📖 Match version headers: ## 0.2.11
37
+ const versionMatch = line.match(/^## ([\d.]+)/)
38
+ if (versionMatch) {
39
+ if (currentVersion && currentSection && currentItems.length > 0) {
40
+ if (!versions[currentVersion]) versions[currentVersion] = {}
41
+ versions[currentVersion][currentSection] = currentItems
42
+ }
43
+ currentVersion = versionMatch[1]
44
+ currentSection = null
45
+ currentItems = []
46
+ continue
47
+ }
48
+
49
+ // 📖 Match section headers: ### Added, ### Fixed, ### Changed
50
+ const sectionMatch = line.match(/^### (Added|Fixed|Changed|Updated)/)
51
+ if (sectionMatch) {
52
+ if (currentVersion && currentSection && currentItems.length > 0) {
53
+ if (!versions[currentVersion]) versions[currentVersion] = {}
54
+ versions[currentVersion][currentSection.toLowerCase()] = currentItems
55
+ }
56
+ currentSection = sectionMatch[1].toLowerCase()
57
+ currentItems = []
58
+ continue
59
+ }
60
+
61
+ // 📖 Match bullet points: - **text**: description
62
+ if (line.match(/^- /) && currentVersion && currentSection) {
63
+ currentItems.push(line.replace(/^- /, ''))
64
+ }
65
+ }
66
+
67
+ // 📖 Save the last section
68
+ if (currentVersion && currentSection && currentItems.length > 0) {
69
+ if (!versions[currentVersion]) versions[currentVersion] = {}
70
+ versions[currentVersion][currentSection] = currentItems
71
+ }
72
+
73
+ return { versions }
74
+ }
75
+
76
+ /**
77
+ * 📖 getLatestChanges: Return changelog for a specific version
78
+ * @param {string} version (e.g. '0.2.11')
79
+ * @returns {Object|null}
80
+ */
81
+ export function getLatestChanges(version) {
82
+ const { versions } = loadChangelog()
83
+ return versions[version] || null
84
+ }
85
+
86
+ /**
87
+ * 📖 formatChangelogForDisplay: Format changelog section as array of strings for TUI
88
+ * @param {string} version
89
+ * @returns {string[]} formatted lines
90
+ */
91
+ export function formatChangelogForDisplay(version) {
92
+ const changes = getLatestChanges(version)
93
+ if (!changes) return []
94
+
95
+ const lines = [
96
+ `📋 Changelog for v${version}`,
97
+ '',
98
+ ]
99
+
100
+ const sections = { added: 'Added', fixed: 'Fixed', changed: 'Changed', updated: 'Updated' }
101
+ for (const [key, label] of Object.entries(sections)) {
102
+ if (changes[key] && changes[key].length > 0) {
103
+ lines.push(`✨ ${label}:`)
104
+ for (const item of changes[key]) {
105
+ // 📖 Wrap long lines for display
106
+ const maxWidth = 70
107
+ let itemText = item.replace(/\*\*([^*]+)\*\*/g, '$1').replace(/`([^`]+)`/g, '$1')
108
+ if (itemText.length > maxWidth) {
109
+ itemText = itemText.substring(0, maxWidth - 3) + '...'
110
+ }
111
+ lines.push(`   • ${itemText}`)
112
+ }
113
+ lines.push('')
114
+ }
115
+ }
116
+
117
+ return lines
118
+ }
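
The bold/backtick stripping and truncation inside `formatChangelogForDisplay` can be exercised in isolation. The helper name `plainItem` is illustrative, extracted from the loop body above; the regexes and the 70-column limit are taken from the module as written.

```javascript
// Standalone version of the markdown stripping and truncation performed per
// changelog item in formatChangelogForDisplay above.
function plainItem(item, maxWidth = 70) {
  let text = item.replace(/\*\*([^*]+)\*\*/g, '$1').replace(/`([^`]+)`/g, '$1')
  if (text.length > maxWidth) text = text.substring(0, maxWidth - 3) + '...'
  return text
}
```

Note that nested or unbalanced markers pass through unchanged; the module only targets well-formed `**bold**` and `` `code` `` spans.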
@@ -92,7 +92,7 @@ export function setActiveProxy(proxyInstance) {
92
92
  }
93
93
 
94
94
  // ─── renderTable: mode param controls footer hint text (opencode vs openclaw) ─────────
95
- export function renderTable(results, pendingPings, frame, cursor = null, sortColumn = 'avg', sortDirection = 'asc', pingInterval = PING_INTERVAL, lastPingTime = Date.now(), mode = 'opencode', tierFilterMode = 0, scrollOffset = 0, terminalRows = 0, terminalCols = 0, originFilterMode = 0, activeProfile = null, profileSaveMode = false, profileSaveBuffer = '', proxyStartupStatus = null, pingMode = 'normal', pingModeSource = 'auto', hideUnconfiguredModels = false, widthWarningStartedAt = null, widthWarningDismissed = false, settingsUpdateState = 'idle', settingsUpdateLatestVersion = null, proxyEnabled = false) {
95
+ export function renderTable(results, pendingPings, frame, cursor = null, sortColumn = 'avg', sortDirection = 'asc', pingInterval = PING_INTERVAL, lastPingTime = Date.now(), mode = 'opencode', tierFilterMode = 0, scrollOffset = 0, terminalRows = 0, terminalCols = 0, originFilterMode = 0, activeProfile = null, profileSaveMode = false, profileSaveBuffer = '', proxyStartupStatus = null, pingMode = 'normal', pingModeSource = 'auto', hideUnconfiguredModels = false, widthWarningStartedAt = null, widthWarningDismissed = false, settingsUpdateState = 'idle', settingsUpdateLatestVersion = null, proxyEnabled = false, isOutdated = false, latestVersion = null) {
96
96
  // 📖 Filter out hidden models for display
97
97
  const visibleResults = results.filter(r => !r.hidden)
98
98
 
@@ -648,25 +648,33 @@ export function renderTable(results, pendingPings, frame, cursor = null, sortCol
648
648
  const latestLabel = chalk.redBright(` local v${LOCAL_VERSION} · latest v${versionStatus.latestVersion}`)
649
649
  lines.push(` ${outdatedBadge}${latestLabel}`)
650
650
  }
651
- lines.push(
652
- chalk.rgb(255, 150, 200)(' Made with 💖 & by \x1b]8;;https://github.com/vava-nessa\x1b\\vava-nessa\x1b]8;;\x1b\\') +
653
- chalk.dim(' ') +
654
- '⭐ ' +
655
- chalk.yellow('\x1b]8;;https://github.com/vava-nessa/free-coding-models\x1b\\Star on GitHub\x1b]8;;\x1b\\') +
656
- chalk.dim(' ') +
657
- '🤝 ' +
658
- chalk.rgb(255, 165, 0)('\x1b]8;;https://github.com/vava-nessa/free-coding-models/graphs/contributors\x1b\\Contributors\x1b]8;;\x1b\\') +
659
- chalk.dim(' ') +
660
- '' +
661
- chalk.rgb(255, 200, 100)('\x1b]8;;https://buymeacoffee.com/vavanessadev\x1b\\Buy me a coffee\x1b]8;;\x1b\\') +
662
- chalk.dim('') +
663
- '💬 ' +
664
- chalk.rgb(200, 150, 255)('\x1b]8;;https://discord.gg/ZTNFHvvCkU\x1b\\Discord\x1b]8;;\x1b\\') +
665
- chalk.dim(' ') +
666
- chalk.rgb(200, 150, 255)('https://discord.gg/ZTNFHvvCkU') +
667
- chalk.dim('') +
668
- chalk.dim('Ctrl+C Exit')
669
- )
651
+
652
+ // 📖 Build footer line, with OUTDATED warning if isOutdated is true
653
+ let footerLine = ''
654
+ if (isOutdated) {
655
+ // 📖 Show OUTDATED in red background, high contrast warning
656
+ footerLine = chalk.bgRed.bold.white(' ⚠ OUTDATED version, please update with "npm i -g free-coding-models@latest" ')
657
+ } else {
658
+ footerLine =
659
+ chalk.rgb(255, 150, 200)(' Made with 💖 & ☕ by \x1b]8;;https://github.com/vava-nessa\x1b\\vava-nessa\x1b]8;;\x1b\\') +
660
+ chalk.dim(' • ') +
661
+ '⭐ ' +
662
+ chalk.yellow('\x1b]8;;https://github.com/vava-nessa/free-coding-models\x1b\\Star on GitHub\x1b]8;;\x1b\\') +
663
+ chalk.dim(' • ') +
664
+ '🤝 ' +
665
+ chalk.rgb(255, 165, 0)('\x1b]8;;https://github.com/vava-nessa/free-coding-models/graphs/contributors\x1b\\Contributors\x1b]8;;\x1b\\') +
666
+ chalk.dim(' • ') +
667
+ '☕ ' +
668
+ chalk.rgb(255, 200, 100)('\x1b]8;;https://buymeacoffee.com/vavanessadev\x1b\\Buy me a coffee\x1b]8;;\x1b\\') +
669
+ chalk.dim(' • ') +
670
+ '💬 ' +
671
+ chalk.rgb(200, 150, 255)('\x1b]8;;https://discord.gg/ZTNFHvvCkU\x1b\\Discord\x1b]8;;\x1b\\') +
672
+ chalk.dim(' → ') +
673
+ chalk.rgb(200, 150, 255)('https://discord.gg/ZTNFHvvCkU') +
674
+ chalk.dim(' • ') +
675
+ chalk.dim('Ctrl+C Exit')
676
+ }
677
+ lines.push(footerLine)
670
678
 
671
679
  // 📖 Append \x1b[K (erase to EOL) to each line so leftover chars from previous
672
680
  // 📖 frames are cleared. Then pad with blank cleared lines to fill the terminal,
@@ -228,25 +228,36 @@ function writeQwenConfig(model, providerKey, apiKey, baseUrl) {
228
228
  }
229
229
 
230
230
  function writePiConfig(model, apiKey, baseUrl) {
231
- const filePath = join(homedir(), '.pi', 'agent', 'models.json')
232
- const backupPath = backupIfExists(filePath)
233
- const config = readJson(filePath, { providers: {} })
234
- if (!config.providers || typeof config.providers !== 'object') config.providers = {}
235
- config.providers.freeCodingModels = {
231
+ // 📖 Write models.json with the selected provider config
232
+ const modelsFilePath = join(homedir(), '.pi', 'agent', 'models.json')
233
+ const modelsBackupPath = backupIfExists(modelsFilePath)
234
+ const modelsConfig = readJson(modelsFilePath, { providers: {} })
235
+ if (!modelsConfig.providers || typeof modelsConfig.providers !== 'object') modelsConfig.providers = {}
236
+ modelsConfig.providers.freeCodingModels = {
236
237
  baseUrl,
237
238
  api: 'openai-completions',
238
239
  apiKey,
239
240
  models: [{ id: model.modelId, name: model.label }],
240
241
  }
241
- writeJson(filePath, config)
242
- return { filePath, backupPath }
242
+ writeJson(modelsFilePath, modelsConfig)
243
+
244
+ // 📖 Write settings.json to set the model as default on next launch
245
+ const settingsFilePath = join(homedir(), '.pi', 'agent', 'settings.json')
246
+ const settingsBackupPath = backupIfExists(settingsFilePath)
247
+ const settingsConfig = readJson(settingsFilePath, {})
248
+ settingsConfig.defaultProvider = 'freeCodingModels'
249
+ settingsConfig.defaultModel = model.modelId
250
+ writeJson(settingsFilePath, settingsConfig)
251
+
252
+ return { filePath: modelsFilePath, backupPath: modelsBackupPath, settingsFilePath, settingsBackupPath }
243
253
  }
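
The new settings.json step in `writePiConfig` boils down to a small merge. A pure sketch without the file I/O (`readJson`/`writeJson` and the backup handling live in the module); the helper name is illustrative:

```javascript
// Pure version of the settings.json merge writePiConfig performs above:
// existing settings are preserved, default provider/model are overwritten.
function mergePiSettings(existing, model) {
  return {
    ...existing,
    defaultProvider: 'freeCodingModels',
    defaultModel: model.modelId,
  }
}
```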
244
254
 
245
- function writeAmpConfig(baseUrl) {
255
+ function writeAmpConfig(model, baseUrl) {
246
256
  const filePath = join(homedir(), '.config', 'amp', 'settings.json')
247
257
  const backupPath = backupIfExists(filePath)
248
258
  const config = readJson(filePath, {})
249
259
  config['amp.url'] = baseUrl
260
+ config['amp.model'] = model.modelId
250
261
  writeJson(filePath, config)
251
262
  return { filePath, backupPath }
252
263
  }
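
As with Pi, the Amp change is a two-key merge into settings.json. A pure sketch (no file I/O; helper name illustrative):

```javascript
// Pure version of the amp settings merge: sets amp.url and the newly written
// amp.model key while keeping any other settings intact.
function mergeAmpSettings(existing, model, baseUrl) {
  return { ...existing, 'amp.url': baseUrl, 'amp.model': model.modelId }
}
```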
@@ -346,19 +357,24 @@ export async function startExternalTool(mode, model, config) {
346
357
  }
347
358
 
348
359
  if (mode === 'openhands') {
349
- console.log(chalk.dim(' 📖 OpenHands is launched with --override-with-envs so the selected model applies immediately.'))
360
+ // 📖 OpenHands supports LLM_MODEL env var to set the default model
361
+ env.LLM_MODEL = model.modelId
362
+ env.LLM_API_KEY = apiKey || env.LLM_API_KEY
363
+ if (baseUrl) env.LLM_BASE_URL = baseUrl
364
+ console.log(chalk.dim(` 📖 OpenHands launched with model: ${model.modelId}`))
350
365
  return spawnCommand('openhands', ['--override-with-envs'], env)
351
366
  }
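
The OpenHands environment wiring can be expressed as a pure helper mirroring the assignments above (helper name and shape are illustrative; the real code mutates a shared `env` object before `spawnCommand`):

```javascript
// Builds the OpenHands launch environment as set above: LLM_MODEL always,
// LLM_API_KEY only when a key is available (falling back to any existing
// value), LLM_BASE_URL only when a base URL is configured.
function openhandsEnv(baseEnv, model, apiKey, baseUrl) {
  const env = { ...baseEnv, LLM_MODEL: model.modelId }
  env.LLM_API_KEY = apiKey || env.LLM_API_KEY
  if (baseUrl) env.LLM_BASE_URL = baseUrl
  return env
}
```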
352
367
 
353
368
  if (mode === 'amp') {
354
- printConfigResult(meta.label, writeAmpConfig(baseUrl))
355
- console.log(chalk.yellow(' Amp does not officially expose arbitrary model switching like the other CLIs.'))
356
- console.log(chalk.dim(' The proxy URL is written, then Amp is launched so you can reuse the current endpoint.'))
369
+ printConfigResult(meta.label, writeAmpConfig(model, baseUrl))
370
+ console.log(chalk.dim(` 📖 Amp config updated with model: ${model.modelId}`))
357
371
  return spawnCommand('amp', [], env)
358
372
  }
359
373
 
360
374
  if (mode === 'pi') {
361
- printConfigResult(meta.label, writePiConfig(model, apiKey, baseUrl))
375
+ const piResult = writePiConfig(model, apiKey, baseUrl)
376
+ printConfigResult(meta.label, { filePath: piResult.filePath, backupPath: piResult.backupPath })
377
+ printConfigResult(meta.label, { filePath: piResult.settingsFilePath, backupPath: piResult.settingsBackupPath })
362
378
  return spawnCommand('pi', [], env)
363
379
  }
364
380