free-coding-models 0.1.84 → 0.1.86
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +19 -12
- package/bin/free-coding-models.js +167 -73
- package/package.json +1 -1
- package/sources.js +12 -10
- package/src/config.js +14 -3
- package/src/constants.js +3 -1
- package/src/key-handler.js +160 -17
- package/src/overlays.js +9 -7
- package/src/provider-metadata.js +20 -20
- package/src/render-table.js +105 -62
- package/src/utils.js +31 -26
package/README.md
CHANGED
@@ -2,7 +2,7 @@
   <img src="https://img.shields.io/npm/v/free-coding-models?color=76b900&label=npm&logo=npm" alt="npm version">
   <img src="https://img.shields.io/node/v/free-coding-models?color=76b900&logo=node.js" alt="node version">
   <img src="https://img.shields.io/npm/l/free-coding-models?color=76b900" alt="license">
-  <img src="https://img.shields.io/badge/models-
+  <img src="https://img.shields.io/badge/models-159-76b900?logo=nvidia" alt="models count">
   <img src="https://img.shields.io/badge/providers-20-blue" alt="providers count">
 </p>
 
@@ -72,7 +72,7 @@
 - **Parallel pings** — All models tested simultaneously via native `fetch`
 - **Real-time animation** — Watch latency appear live in alternate screen buffer
 - **Smart ranking** — Top 3 fastest models highlighted with medals 🥇🥈🥉
-- **⏱
+- **⏱ Adaptive monitoring** — Starts in a fast 2s cadence for 60s, settles to 10s, slows to 30s after 5 minutes idle, and supports a forced 4s mode
 - **Rolling averages** — Avg calculated from ALL successful pings since start
 - **Uptime tracking** — Percentage of successful pings shown in real-time
 - **Stability score** — Composite 0–100 score measuring consistency (p95, jitter, spikes, uptime)
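The stability-score bullet above can be made concrete. The sketch below is illustrative only: the four inputs (p95, jitter, spikes, uptime) come from the bullet, but the weights, the 5s p95 cap, and the 3x spike threshold are invented for this example and are not the package's actual formula.

```javascript
// Illustrative composite stability score (0-100) from a ping history.
// Each ping is { ms, code }; code '200' counts as a success.
// Weights and thresholds here are assumptions, not the package's real ones.
function stabilityScore(pings) {
  const ok = pings.filter(p => typeof p.ms === 'number' && p.code === '200')
  if (ok.length === 0) return 0
  const ms = ok.map(p => p.ms).sort((a, b) => a - b)
  const p95 = ms[Math.min(ms.length - 1, Math.floor(ms.length * 0.95))]
  const mean = ms.reduce((s, v) => s + v, 0) / ms.length
  // Jitter: mean absolute deviation relative to the mean latency.
  const jitter = ms.reduce((s, v) => s + Math.abs(v - mean), 0) / ms.length / mean
  // Spikes: fraction of pings slower than 3x the mean.
  const spikes = ms.filter(v => v > mean * 3).length / ms.length
  const uptime = ok.length / pings.length
  const p95Penalty = Math.min(1, p95 / 5000) // treat a 5s p95 as worst case
  const score = 100 * (0.35 * uptime
    + 0.25 * (1 - p95Penalty)
    + 0.25 * (1 - Math.min(1, jitter))
    + 0.15 * (1 - Math.min(1, spikes)))
  return Math.round(score)
}
```

A steady history scores near 100, while the same history with one large outlier drops sharply, which is the "consistent vs. spiky" distinction the README describes.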
@@ -90,7 +90,9 @@
 - **📶 Status indicators** — UP ✅ · No Key 🔑 · Timeout ⏳ · Overloaded 🔥 · Not Found 🚫
 - **Keyless latency** — Models are pinged even without an API key
 - **🏷 Tier filtering** — Filter models by tier letter (S, A, B, C)
-
+- **⭐ Persistent favorites** — Press `F` on a selected row to pin/unpin it
+- **Configured-only by default** — Press `E` to toggle showing only providers with configured API keys; the choice persists across sessions and profiles
+- **Width guardrail** — If your terminal is too narrow, the TUI shows a centered warning instead of rendering a broken table
 
 ---
 
@@ -200,7 +202,7 @@ Use `↑↓` arrows to select, `Enter` to confirm. Then the TUI launches with yo
 
 **How it works:**
 1. **Ping phase** — All enabled models are pinged in parallel (up to 150 across 20 providers)
-2. **Continuous monitoring** — Models
+2. **Continuous monitoring** — Models start at 2s re-pings for 60s, then fall back to 10s automatically
 3. **Real-time updates** — Watch "Latest", "Avg", and "Up%" columns update live
 4. **Select anytime** — Use ↑↓ arrows to navigate, press Enter on a model to act
 5. **Smart detection** — Automatically detects if NVIDIA NIM is configured in OpenCode or OpenClaw
@@ -235,7 +237,7 @@ Use `↑↓` arrows to select, `Enter` to confirm. Then the TUI launches with yo
 You can add or change keys anytime with the P key in the TUI.
 ```
 
-You don't need all
+You don't need all twenty providers — skip any provider by pressing Enter. At least one key is required.
 
 ### Adding or changing keys later
 
@@ -269,6 +271,7 @@ Press **`P`** to open the Settings screen at any time:
 
 Manual update is in the same Settings screen (`P`) under **Maintenance** (Enter to check, Enter again to install when an update is available).
 Favorites are also persisted in the same config file and survive restarts.
+The main table now starts in `Configured Only` mode, so if nothing is set up yet you can press `P` and add your first API key immediately.
 
 ### Environment variable overrides
 
@@ -376,7 +379,7 @@ TOGETHER_API_KEY=together_xxx free-coding-models
 
 ## 🤖 Coding Models
 
-**
+**159 coding models** across 20 providers and 8 tiers, ranked by [SWE-bench Verified](https://www.swebench.com) — the industry-standard benchmark measuring real GitHub issue resolution. Scores are self-reported by providers unless noted.
 
 ### Alibaba Cloud (DashScope) (8 models)
 
@@ -725,7 +728,7 @@ This script:
 │ 1. Enter alternate screen buffer (like vim/htop/less)            │
 │ 2. Ping ALL models in parallel                                   │
 │ 3. Display real-time table with Latest/Avg/Stability/Up%         │
-│ 4. Re-ping ALL models
+│ 4. Re-ping ALL models at 2s on startup, then 10s steady-state    │
 │ 5. Update rolling averages + stability scores per model          │
 │ 6. User can navigate with ↑↓ and select with Enter               │
 │ 7. On Enter (OpenCode): set model, launch OpenCode               │
@@ -733,7 +736,7 @@
 └──────────────────────────────────────────────────────────────────┘
 ```
 
-**Result:** Continuous monitoring interface that stays open until you select a model or press Ctrl+C. Rolling averages give you accurate long-term latency data, the stability score reveals which models are truly consistent vs. deceptively spikey, and you can configure your tool of choice with one keystroke.
+**Result:** Continuous monitoring interface that stays open until you select a model or press Ctrl+C. Rolling averages give you accurate long-term latency data, the stability score reveals which models are truly consistent vs. deceptively spikey, and you can configure your tool of choice with one keystroke. If the terminal is too narrow, the app shows a centered warning instead of a truncated table.
 
 ---
 
@@ -795,6 +798,9 @@ This script:
     "perplexity": { "enabled": true },
     "zai": { "enabled": true }
   },
+  "settings": {
+    "hideUnconfiguredModels": true
+  },
   "favorites": [
     "nvidia/deepseek-ai/deepseek-v3.2"
   ]
@@ -803,7 +809,7 @@
 
 **Configuration:**
 - **Ping timeout**: 15 seconds per attempt (slow models get more time)
-- **Ping
+- **Ping cadence**: startup burst at 2 seconds for 60s, then 10 seconds normally, 30 seconds when idle for 5 minutes, or forced 4 seconds via `W`
 - **Monitor mode**: Interface stays open forever, press Ctrl+C to exit
 
 **Flags:**
@@ -830,13 +836,14 @@
 - **F** — Toggle favorite on selected model (⭐ in Model column, pinned at top)
 - **T** — Cycle tier filter (All → S+ → S → A+ → A → A- → B+ → B → C → All)
 - **D** — Cycle provider filter (All → NIM → Groq → ...)
+- **E** — Toggle configured-only mode (on by default, persisted across sessions and profiles)
 - **Z** — Cycle mode (OpenCode CLI → OpenCode Desktop → OpenClaw)
-- **X** — **Toggle
+- **X** — **Toggle Token Logs** (view recent request/token usage logs)
 - **P** — Open Settings (manage API keys, toggles, updates, profiles)
 - **Shift+P** — Cycle through saved profiles (switches live TUI settings)
 - **Shift+S** — Save current TUI settings as a named profile (inline prompt)
 - **Q** — Open Smart Recommend overlay (find the best model for your task)
-- **W
+- **W** — Cycle ping mode (`FAST` 2s → `NORMAL` 10s → `SLOW` 30s → `FORCED` 4s)
 - **J / I** — Request feature / Report bug
 - **K / Esc** — Show help overlay / Close overlay
 - **Ctrl+C** — Exit
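The `W` shortcut above cycles the ping mode, but the key-handler changes themselves are not included in this excerpt of the diff. Here is a minimal hypothetical sketch of the cycling step, reusing the `PING_MODE_CYCLE` order that the diff passes into the key handler; `nextPingMode` is an invented name for illustration.

```javascript
// Order matches PING_MODE_CYCLE from bin/free-coding-models.js in this diff.
const PING_MODE_CYCLE = ['speed', 'normal', 'slow', 'forced']

// Hypothetical helper for the W key: advance to the next mode, wrapping
// around at the end. An unknown mode falls back to the start of the cycle.
function nextPingMode(currentMode) {
  const idx = PING_MODE_CYCLE.indexOf(currentMode)
  return PING_MODE_CYCLE[(idx + 1) % PING_MODE_CYCLE.length]
}
```

In the real TUI, the result of a step like this would be handed to `setPingMode(nextMode, 'manual')`, which the diff shows updating `state.pingInterval` and rescheduling the next ping.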
@@ -862,7 +869,7 @@ Profiles let you save and restore different TUI configurations — useful if you
 - Favorites (starred models)
 - Sort column and direction
 - Tier filter
-- Ping
+- Ping mode
 - API keys
 
 **Saving a profile:**
package/bin/free-coding-models.js
CHANGED
@@ -177,9 +177,10 @@ async function main() {
   ensureFavoritesConfig(config)
 
   // If --profile <name> was passed, load that profile into the live config
+  let startupProfileSettings = null
   if (cliArgs.profileName) {
-
-    if (!
+    startupProfileSettings = loadProfile(config, cliArgs.profileName)
+    if (!startupProfileSettings) {
       console.error(chalk.red(`  Unknown profile "${cliArgs.profileName}". Available: ${listProfiles(config).join(', ') || '(none)'}`))
       process.exit(1)
     }
@@ -288,6 +289,7 @@ async function main() {
     status: 'pending',
     pings: [], // All ping results (ms or 'TIMEOUT')
     httpCode: null,
+    isPinging: false, // Per-row live flag so Latest Ping can keep last value and show a spinner during refresh.
     hidden: false, // Simple flag to hide/show models
   }))
   syncFavoriteFlags(results, config)
@@ -305,23 +307,52 @@ async function main() {
   // Add interactive selection state - cursor index and user's choice
   // sortColumn: 'rank'|'tier'|'origin'|'model'|'ping'|'avg'|'status'|'verdict'|'uptime'
   // sortDirection: 'asc' (default) or 'desc'
-
-
+  // ping cadence is now mode-driven:
+  //   speed  = 2s for 1 minute bursts
+  //   normal = 10s steady state
+  //   slow   = 30s after 5 minutes of inactivity
+  //   forced = 4s and ignores inactivity / auto slowdowns
+  const PING_MODE_INTERVALS = {
+    speed: 2_000,
+    normal: 10_000,
+    slow: 30_000,
+    forced: 4_000,
+  }
+  const PING_MODE_CYCLE = ['speed', 'normal', 'slow', 'forced']
+  const SPEED_MODE_DURATION_MS = 60_000
+  const IDLE_SLOW_AFTER_MS = 5 * 60_000
+  const now = Date.now()
+
+  const intervalToPingMode = (intervalMs) => {
+    if (intervalMs <= 3000) return 'speed'
+    if (intervalMs <= 5000) return 'forced'
+    if (intervalMs >= 30000) return 'slow'
+    return 'normal'
+  }
+
   // tierFilter: current tier filter letter (null = all, 'S' = S+/S, 'A' = A+/A/A-, etc.)
   const state = {
     results,
     pendingPings: 0,
     frame: 0,
     cursor: 0,
     selectedModel: null,
-    sortColumn: 'avg',
-    sortDirection: 'asc',
-    pingInterval:
-
+    sortColumn: startupProfileSettings?.sortColumn || 'avg',
+    sortDirection: startupProfileSettings?.sortAsc === false ? 'desc' : 'asc',
+    pingInterval: PING_MODE_INTERVALS.speed, // Effective live interval derived from the active ping mode.
+    pingMode: 'speed', // Current ping mode: speed | normal | slow | forced.
+    pingModeSource: 'startup', // Why this mode is active: startup | manual | auto | idle | activity.
+    speedModeUntil: now + SPEED_MODE_DURATION_MS, // Speed bursts auto-fall back to normal after 60 seconds.
+    lastPingTime: now, // Track when last ping cycle started
+    lastUserActivityAt: now, // Any keypress refreshes this timer; inactivity can force slow mode.
+    resumeSpeedOnActivity: false, // Set after idle slowdown so the next activity restarts a 60s speed burst.
     mode, // 'opencode' or 'openclaw' — controls Enter action
     tierFilterMode: 0, // Index into TIER_CYCLE (0=All, 1=S+, 2=S, ...)
     originFilterMode: 0, // Index into ORIGIN_CYCLE (0=All, then providers)
+    hideUnconfiguredModels: startupProfileSettings?.hideUnconfiguredModels === true || config.settings?.hideUnconfiguredModels === true, // Hide providers with no configured API key when true.
     scrollOffset: 0, // First visible model index in viewport
     terminalRows: process.stdout.rows || 24, // Current terminal height
+    terminalCols: process.stdout.columns || 80, // Current terminal width
     // Settings screen state (P key opens it)
     settingsOpen: false, // Whether settings overlay is active
     settingsCursor: 0, // Which provider row is selected in settings
@@ -380,9 +411,57 @@ async function main() {
   // Re-clamp viewport on terminal resize
   process.stdout.on('resize', () => {
     state.terminalRows = process.stdout.rows || 24
+    state.terminalCols = process.stdout.columns || 80
     adjustScrollOffset(state)
   })
 
+  let ticker = null
+  let onKeyPress = null
+  let pingModel = null
+
+  const scheduleNextPing = () => {
+    clearTimeout(state.pingIntervalObj)
+    const elapsed = Date.now() - state.lastPingTime
+    const delay = Math.max(0, state.pingInterval - elapsed)
+    state.pingIntervalObj = setTimeout(runPingCycle, delay)
+  }
+
+  const setPingMode = (nextMode, source = 'manual') => {
+    const modeInterval = PING_MODE_INTERVALS[nextMode] ?? PING_MODE_INTERVALS.normal
+    state.pingMode = nextMode
+    state.pingModeSource = source
+    state.pingInterval = modeInterval
+    state.speedModeUntil = nextMode === 'speed' ? Date.now() + SPEED_MODE_DURATION_MS : null
+    state.resumeSpeedOnActivity = source === 'idle'
+    if (state.pingIntervalObj) scheduleNextPing()
+  }
+
+  const noteUserActivity = () => {
+    state.lastUserActivityAt = Date.now()
+    if (state.pingMode === 'forced') return
+    if (state.resumeSpeedOnActivity) {
+      setPingMode('speed', 'activity')
+    }
+  }
+
+  const refreshAutoPingMode = () => {
+    const currentTime = Date.now()
+    if (state.pingMode === 'forced') return
+
+    if (state.speedModeUntil && currentTime >= state.speedModeUntil) {
+      setPingMode('normal', 'auto')
+      return
+    }
+
+    if (currentTime - state.lastUserActivityAt >= IDLE_SLOW_AFTER_MS) {
+      if (state.pingMode !== 'slow' || state.pingModeSource !== 'idle') {
+        setPingMode('slow', 'idle')
+      } else {
+        state.resumeSpeedOnActivity = true
+      }
+    }
+  }
+
   // Auto-start proxy on launch if OpenCode config already has an fcm-proxy provider.
   // Fire-and-forget: does not block UI startup. state.proxyStartupStatus is updated async.
   if (mode === 'opencode' || mode === 'opencode-desktop') {
@@ -404,13 +483,18 @@ async function main() {
 
   // originFilterMode: index into ORIGIN_CYCLE, 0=All, then each provider key in order
   const ORIGIN_CYCLE = [null, ...Object.keys(sources)]
-  state.tierFilterMode = 0
+  state.tierFilterMode = startupProfileSettings?.tierFilter ? Math.max(0, TIER_CYCLE.indexOf(startupProfileSettings.tierFilter)) : 0
   state.originFilterMode = 0
 
   function applyTierFilter() {
     const activeTier = TIER_CYCLE[state.tierFilterMode]
     const activeOrigin = ORIGIN_CYCLE[state.originFilterMode]
     state.results.forEach(r => {
+      const unconfiguredHide = state.hideUnconfiguredModels && !getApiKey(state.config, r.providerKey)
+      if (unconfiguredHide) {
+        r.hidden = true
+        return
+      }
       // Favorites stay visible regardless of tier/origin filters.
       if (r.isFavorite) {
         r.hidden = false
@@ -425,9 +509,6 @@ async function main() {
   }
 
   // ─── Overlay renderers + key handler ─────────────────────────────────────
-  let pingModel = null
-  let ticker = null
-  let onKeyPress = null
   const stopUi = ({ resetRawMode = false } = {}) => {
     if (ticker) clearInterval(ticker)
     clearTimeout(state.pingIntervalObj)
@@ -518,6 +599,10 @@ async function main() {
     mergedModels,
     apiKey,
     chalk,
+    setPingMode,
+    noteUserActivity,
+    intervalToPingMode,
+    PING_MODE_CYCLE,
     setResults: (next) => { results = next },
     readline,
   })
@@ -544,9 +629,11 @@ async function main() {
   }
 
   process.stdin.on('keypress', onKeyPress)
+  process.on('SIGCONT', noteUserActivity)
 
   // Animation loop: render settings overlay, recommend overlay, help overlay, feature request overlay, bug report overlay, OR main table
   ticker = setInterval(() => {
+    refreshAutoPingMode()
    state.frame++
     // Cache visible+sorted models each frame so Enter handler always matches the display
     if (!state.settingsOpen && !state.recommendOpen && !state.featureRequestOpen && !state.bugReportOpen) {
@@ -561,11 +648,11 @@ async function main() {
           ? overlays.renderFeatureRequest()
           : state.bugReportOpen
             ? overlays.renderBugReport()
-
-
+            : state.helpVisible
+              ? overlays.renderHelp()
               : state.logVisible
                 ? overlays.renderLog()
-                : renderTable(state.results, state.pendingPings, state.frame, state.cursor, state.sortColumn, state.sortDirection, state.pingInterval, state.lastPingTime, state.mode, state.tierFilterMode, state.scrollOffset, state.terminalRows, state.originFilterMode, state.activeProfile, state.profileSaveMode, state.profileSaveBuffer, state.proxyStartupStatus)
+                : renderTable(state.results, state.pendingPings, state.frame, state.cursor, state.sortColumn, state.sortDirection, state.pingInterval, state.lastPingTime, state.mode, state.tierFilterMode, state.scrollOffset, state.terminalRows, state.terminalCols, state.originFilterMode, state.activeProfile, state.profileSaveMode, state.profileSaveBuffer, state.proxyStartupStatus, state.pingMode, state.pingModeSource, state.hideUnconfiguredModels)
     process.stdout.write(ALT_HOME + content)
   }, Math.round(1000 / FPS))
 
@@ -573,7 +660,7 @@ async function main() {
   const initialVisible = state.results.filter(r => !r.hidden)
   state.visibleSorted = sortResultsWithPinnedFavorites(initialVisible, state.sortColumn, state.sortDirection)
 
-  process.stdout.write(ALT_HOME + renderTable(state.results, state.pendingPings, state.frame, state.cursor, state.sortColumn, state.sortDirection, state.pingInterval, state.lastPingTime, state.mode, state.tierFilterMode, state.scrollOffset, state.terminalRows, state.originFilterMode, state.activeProfile, state.profileSaveMode, state.profileSaveBuffer, state.proxyStartupStatus))
+  process.stdout.write(ALT_HOME + renderTable(state.results, state.pendingPings, state.frame, state.cursor, state.sortColumn, state.sortDirection, state.pingInterval, state.lastPingTime, state.mode, state.tierFilterMode, state.scrollOffset, state.terminalRows, state.terminalCols, state.originFilterMode, state.activeProfile, state.profileSaveMode, state.profileSaveBuffer, state.proxyStartupStatus, state.pingMode, state.pingModeSource, state.hideUnconfiguredModels))
 
   // If --recommend was passed, auto-open the Smart Recommend overlay on start
   if (cliArgs.recommendMode) {
@@ -593,82 +680,89 @@ async function main() {
   // Uses per-provider API key and URL from sources.js
   // If no API key is configured, pings without auth — a 401 still tells us latency + server is up
   pingModel = async (r) => {
-    const
+    state.pendingPings += 1
+    r.isPinging = true
+
+    try {
+      const providerApiKey = getApiKey(state.config, r.providerKey) ?? null
+      const providerUrl = sources[r.providerKey]?.url ?? sources.nvidia.url
+      let { code, ms, quotaPercent } = await ping(providerApiKey, r.modelId, r.providerKey, providerUrl)
+
+      if ((quotaPercent === null || quotaPercent === undefined) && providerApiKey) {
+        const providerQuota = await getProviderQuotaPercentCached(r.providerKey, providerApiKey)
+        if (typeof providerQuota === 'number' && Number.isFinite(providerQuota)) {
+          quotaPercent = providerQuota
+        }
       }
-    }
+
+      // Store ping result as object with ms and code
+      // ms = actual response time (even for errors like 429)
+      // code = HTTP status code ('200', '429', '500', '000' for timeout)
+      r.pings.push({ ms, code })
+
+      // Update status based on latest ping
+      if (code === '200') {
+        r.status = 'up'
+      } else if (code === '000') {
+        r.status = 'timeout'
+      } else if (code === '401') {
+        // 401 = server is reachable but no API key set (or wrong key)
+        // Treated as 'noauth' — server is UP, latency is real, just needs a key
+        r.status = 'noauth'
+        r.httpCode = code
+      } else {
+        r.status = 'down'
+        r.httpCode = code
+      }
 
+      if (typeof quotaPercent === 'number' && Number.isFinite(quotaPercent)) {
+        r.usagePercent = quotaPercent
+        // Provider-level fallback: apply latest known quota to sibling rows on same provider.
+        for (const sibling of state.results) {
+          if (sibling.providerKey === r.providerKey && (sibling.usagePercent === undefined || sibling.usagePercent === null)) {
+            sibling.usagePercent = quotaPercent
+          }
         }
       }
+    } finally {
+      r.isPinging = false
+      state.pendingPings = Math.max(0, state.pendingPings - 1)
     }
   }
 
   // Initial ping of all models
   const initialPing = Promise.all(state.results.map(r => pingModel(r)))
 
-  // Continuous ping loop with
-  const
-  }
+  // Continuous ping loop with mode-driven cadence.
+  const runPingCycle = async () => {
+    refreshAutoPingMode()
+    state.lastPingTime = Date.now()
+
+    // Refresh persisted usage snapshots each cycle so proxy writes appear live in table.
+    // Freshness-aware: stale snapshots (>30m) are excluded and row reverts to undefined.
+    for (const r of state.results) {
+      const pct = _usageForRow(r.providerKey, r.modelId)
+      if (typeof pct === 'number' && Number.isFinite(pct)) {
+        r.usagePercent = pct
+      } else {
+        // If snapshot is now stale or gone, clear the cached value so UI shows N/A.
+        r.usagePercent = undefined
+      }
+    }
 
-  })
+    state.results.forEach(r => {
+      pingModel(r).catch(() => {
+        // Individual ping failures don't crash the loop
       })
+    })
 
-  }, state.pingInterval)
+    refreshAutoPingMode()
+    scheduleNextPing()
   }
 
   // Start the ping loop
   state.pingIntervalObj = null
-
+  scheduleNextPing()
 
   await initialPing
 
package/package.json
CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "free-coding-models",
-  "version": "0.1.84",
+  "version": "0.1.86",
   "description": "Find the fastest coding LLM models in seconds — ping free models from multiple providers, pick the best one for OpenCode, Cursor, or any AI coding assistant.",
   "keywords": [
     "nvidia",
package/sources.js
CHANGED
@@ -12,6 +12,8 @@
  * - ctx: Context window size in tokens (e.g., "128k", "32k")
  *
  * Add new sources here to support additional providers beyond NIM.
+ * Public provider catalogs drift often, so these IDs are periodically
+ * refreshed against official docs and live model endpoints when available.
  *
  * 🎯 Tier scale (based on SWE-bench Verified):
  * - S+: 70%+ (elite frontier coders)
@@ -114,9 +116,9 @@ export const cerebras = [
   ['llama-4-scout-17b-16e-instruct', 'Llama 4 Scout', 'A', '44.0%', '10M'],
   ['qwen-3-32b', 'Qwen3 32B', 'A+', '50.0%', '128k'],
   ['gpt-oss-120b', 'GPT OSS 120B', 'S', '60.0%', '128k'],
-  ['qwen-3-235b-a22b',
+  ['qwen-3-235b-a22b-instruct-2507', 'Qwen3 235B', 'S+', '70.0%', '128k'],
   ['llama3.1-8b', 'Llama 3.1 8B', 'B', '28.8%', '128k'],
-  ['glm-4.
+  ['zai-glm-4.7', 'GLM 4.7', 'S+', '73.8%', '200k'],
 ]
 
 // SambaNova source - https://cloud.sambanova.ai
@@ -124,14 +126,15 @@ export const cerebras = [
 // OpenAI-compatible API, supports all major coding models including DeepSeek V3/R1, Qwen3, Llama 4
 export const sambanova = [
   // ── S+ tier ──
-  ['
+  ['MiniMax-M2.5', 'MiniMax M2.5', 'S+', '74.0%', '160k'],
   // ── S tier ──
   ['DeepSeek-R1-0528', 'DeepSeek R1 0528', 'S', '61.0%', '128k'],
   ['DeepSeek-V3.1', 'DeepSeek V3.1', 'S', '62.0%', '128k'],
   ['DeepSeek-V3-0324', 'DeepSeek V3 0324', 'S', '62.0%', '128k'],
+  ['DeepSeek-V3.2', 'DeepSeek V3.2', 'S+', '73.1%', '8k'],
   ['Llama-4-Maverick-17B-128E-Instruct', 'Llama 4 Maverick', 'S', '62.0%', '1M'],
   ['gpt-oss-120b', 'GPT OSS 120B', 'S', '60.0%', '128k'],
-  ['
+  ['DeepSeek-V3.1-Terminus', 'DeepSeek V3.1 Term', 'S', '68.4%', '128k'],
   // ── A+ tier ──
   ['Qwen3-32B', 'Qwen3 32B', 'A+', '50.0%', '128k'],
   // ── A tier ──
@@ -140,24 +143,23 @@ export const sambanova = [
   ['Meta-Llama-3.3-70B-Instruct', 'Llama 3.3 70B', 'A-', '39.5%', '128k'],
   // ── B tier ──
   ['Meta-Llama-3.1-8B-Instruct', 'Llama 3.1 8B', 'B', '28.8%', '128k'],
-  // ── A tier — requested Llama3-Groq coding tuned family ──
-  ['Llama-3-Groq-70B-Tool-Use', 'Llama3-Groq 70B', 'A', '43.0%', '128k'],
 ]
 
 // OpenRouter source - https://openrouter.ai
 // Free :free models with shared quota — 50 free req/day
 // API keys at https://openrouter.ai/keys
 export const openrouter = [
-  ['qwen/qwen3-coder:
-  ['
-  ['
+  ['qwen/qwen3-coder:free', 'Qwen3 Coder 480B', 'S+', '70.6%', '262k'],
+  ['z-ai/glm-4.5-air:free', 'GLM 4.5 Air', 'S+', '72.0%', '128k'],
+  ['google/gemma-3-27b-it:free', 'Gemma 3 27B', 'B', '22.0%', '128k'],
   ['stepfun/step-3.5-flash:free', 'Step 3.5 Flash', 'S+', '74.4%', '256k'],
-  ['deepseek/deepseek-r1-0528:free', 'DeepSeek R1 0528', 'S', '61.0%', '128k'],
   ['qwen/qwen3-next-80b-a3b-instruct:free', 'Qwen3 80B Instruct', 'S', '65.0%', '128k'],
   ['openai/gpt-oss-120b:free', 'GPT OSS 120B', 'S', '60.0%', '128k'],
   ['openai/gpt-oss-20b:free', 'GPT OSS 20B', 'A', '42.0%', '128k'],
   ['nvidia/nemotron-3-nano-30b-a3b:free', 'Nemotron Nano 30B', 'A', '43.0%', '128k'],
   ['meta-llama/llama-3.3-70b-instruct:free', 'Llama 3.3 70B', 'A-', '39.5%', '128k'],
+  ['mistralai/mistral-small-3.1-24b-instruct:free', 'Mistral Small 3.1', 'B+', '30.0%', '128k'],
+  ['google/gemma-3-12b-it:free', 'Gemma 3 12B', 'C', '15.0%', '128k'],
 ]
 
 // Hugging Face Inference source - https://huggingface.co
package/src/config.js
CHANGED
@@ -71,7 +71,7 @@
  * - apiKeys: API keys per provider (can differ between work/personal setups)
  * - providers: enabled/disabled state per provider
  * - favorites: list of pinned favorite models
- * - settings: extra TUI preferences (tierFilter, sortColumn, sortAsc, pingInterval)
+ * - settings: extra TUI preferences (tierFilter, sortColumn, sortAsc, pingInterval, hideUnconfiguredModels)
  *
  * When a profile is loaded via --profile <name> or Shift+P, the main config's
  * apiKeys/providers/favorites are replaced with the profile's values. The profile
@@ -164,6 +164,8 @@ export function loadConfig() {
     // Ensure the shape is always complete — fill missing sections with defaults
     if (!parsed.apiKeys) parsed.apiKeys = {}
     if (!parsed.providers) parsed.providers = {}
+    if (!parsed.settings || typeof parsed.settings !== 'object') parsed.settings = {}
+    if (typeof parsed.settings.hideUnconfiguredModels !== 'boolean') parsed.settings.hideUnconfiguredModels = true
     // Favorites: list of "providerKey/modelId" pinned rows.
     if (!Array.isArray(parsed.favorites)) parsed.favorites = []
     parsed.favorites = parsed.favorites.filter((fav) => typeof fav === 'string' && fav.trim().length > 0)
@@ -173,6 +175,10 @@ export function loadConfig() {
     if (typeof parsed.telemetry.anonymousId !== 'string' || !parsed.telemetry.anonymousId.trim()) parsed.telemetry.anonymousId = null
     // Ensure profiles section exists (added in profile system)
     if (!parsed.profiles || typeof parsed.profiles !== 'object') parsed.profiles = {}
+    for (const profile of Object.values(parsed.profiles)) {
+      if (!profile || typeof profile !== 'object') continue
+      profile.settings = profile.settings ? { ..._emptyProfileSettings(), ...profile.settings } : _emptyProfileSettings()
+    }
     if (parsed.activeProfile && typeof parsed.activeProfile !== 'string') parsed.activeProfile = null
     return parsed
   } catch {
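The profile normalization in the hunk above relies on object-spread merge semantics: keys present in the saved profile win, and every missing key falls back to a default. A standalone sketch, with a local stand-in for `_emptyProfileSettings()` using the default values shown elsewhere in this diff:

```javascript
// Stand-in for _emptyProfileSettings(): the defaults this diff establishes.
function emptyProfileSettings() {
  return {
    tierFilter: null,
    sortColumn: 'avg',
    sortAsc: true,
    pingInterval: 10000,
    hideUnconfiguredModels: true,
  }
}

// Same shape as the loadConfig() loop above: saved values override defaults,
// a missing/null settings object yields the full default set.
function normalizeProfileSettings(saved) {
  return saved ? { ...emptyProfileSettings(), ...saved } : emptyProfileSettings()
}
```

Because the defaults are spread first, older profiles saved before `hideUnconfiguredModels` existed automatically pick up `true` for it on load.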
@@ -385,14 +391,15 @@ export function isProviderEnabled(config, providerKey) {
  * These settings are saved/restored when switching profiles so each profile
  * can have different sort, filter, and ping preferences.
  *
- * @returns {{ tierFilter: string|null, sortColumn: string, sortAsc: boolean, pingInterval: number }}
+ * @returns {{ tierFilter: string|null, sortColumn: string, sortAsc: boolean, pingInterval: number, hideUnconfiguredModels: boolean }}
  */
 export function _emptyProfileSettings() {
   return {
     tierFilter: null, // null = show all tiers, or 'S'|'A'|'B'|'C'|'D'
     sortColumn: 'avg', // default sort column
     sortAsc: true, // true = ascending (fastest first for latency)
-    pingInterval:
+    pingInterval: 10000, // default ms between pings in the steady "normal" mode
+    hideUnconfiguredModels: true, // true = default to providers that are actually configured
   }
 }
 
@@ -505,6 +512,10 @@ function _emptyConfig() {
   return {
     apiKeys: {},
     providers: {},
+    // Global TUI preferences that should persist even without a named profile.
+    settings: {
+      hideUnconfiguredModels: true,
+    },
     // Pinned favorites rendered at top of the table ("providerKey/modelId").
     favorites: [],
     // Telemetry consent is explicit. null = not decided yet.
package/src/constants.js
CHANGED
@@ -51,7 +51,9 @@ export const ALT_HOME = '\x1b[H'
 
 // Timing constants — control how fast the health-check loop runs.
 export const PING_TIMEOUT = 15_000 // 15s per attempt before abort
-
+// PING_INTERVAL is the baseline "normal" cadence. Startup can still temporarily
+// boost to faster modes, but steady-state uses 10s unless the user picks another mode.
+export const PING_INTERVAL = 10_000
 
 // Animation and column-width constants.
 export const FPS = 12