free-coding-models 0.2.15 → 0.3.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -2,6 +2,84 @@
2
2
 
3
3
  ---
4
4
 
5
+ ## 0.3.0
6
+
7
+ ### Added
8
+ - **Always-on background proxy service**: FCM Proxy V2 can run as a persistent background service via `launchd` (macOS) or `systemd` (Linux). All tools get free model access 24/7 — no need to keep the TUI open.
9
+ - **Anthropic wire format translation**: Native bidirectional translation between Anthropic Messages API (`POST /v1/messages`) and OpenAI Chat Completions. Claude Code works natively through FCM Proxy V2.
10
+ - **Dedicated FCM Proxy V2 overlay**: New full-page overlay (from Settings → "FCM Proxy V2 settings →" or `J` key) with proxy config, service status, restart, stop, force-kill, and log viewer.
11
+ - **`J` key — FCM Proxy V2 shortcut**: Opens the proxy settings directly from the main view. Footer shows a green `📡 FCM Proxy V2 On` badge when active, red `📡 FCM Proxy V2 Off` when disabled.
12
+ - **CLI daemon subcommand**: `free-coding-models daemon [status|install|uninstall|restart|stop|logs]` for headless service control.
13
+ - **Stable proxy identity**: Persistent token and preferred port (`18045`) survive daemon restarts — env files and tool configs remain valid across reboots.
14
+ - **Health endpoint**: `GET /v1/health` returns uptime, version, and account/model counts for liveness probes.
15
+ - **`GET /v1/stats` endpoint**: Authenticated endpoint returning per-account health, token stats, totals, and proxy uptime for monitoring and debugging.
16
+ - **Hot-reload**: FCM Proxy V2 watches `~/.free-coding-models.json` and reloads proxy topology automatically when config changes.
17
+ - **Version mismatch detection**: The overlay warns when the running service version differs from the installed FCM version.
18
+ - **Dev environment guard**: `installDaemon()` is blocked when running from a git checkout to prevent hardcoding local repo paths in OS service files.
19
+ - **Generalized proxy sync module** (`src/proxy-sync.js`): Single-endpoint proxy config sync for 12 tools (OpenCode, OpenClaw, Crush, Goose, Pi, Aider, Amp, Qwen, Claude Code, Codex, Gemini, OpenHands). Writes one `fcm-proxy` provider with all models and cleans up leftover per-provider `fcm-*` entries.
20
+ - **Retry backoff with jitter**: Progressive delays between retries (0ms, 300ms, 800ms + random jitter) to avoid re-hitting the same rate-limit window on 429s.
21
+ - **Automatic account cooldown on consecutive failures**: When an account accumulates 3+ consecutive non-429 failures, it enters graduated cooldown (30s → 60s → 120s). Proxy routes around failing accounts automatically. Resets on success.
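The backoff and cooldown schedules above can be sketched in a few lines. This is an illustrative sketch only; the function names (`backoffDelay`, `cooldownMs`) and jitter bound are assumptions, not FCM's actual internals:

```javascript
// Progressive retry delays: 0ms, 300ms, 800ms, plus random jitter (here up
// to 200ms, an assumed bound) so concurrent retries don't all re-hit the
// same rate-limit window on a 429.
function backoffDelay(attempt, jitterMs = 200) {
  const base = [0, 300, 800]
  const delay = base[Math.min(attempt, base.length - 1)]
  return delay + Math.floor(Math.random() * jitterMs)
}

// Graduated cooldown after 3+ consecutive non-429 failures: 30s → 60s → 120s.
// A success elsewhere would reset the failure counter.
function cooldownMs(consecutiveFailures) {
  if (consecutiveFailures < 3) return 0
  const steps = [30_000, 60_000, 120_000]
  return steps[Math.min(consecutiveFailures - 3, steps.length - 1)]
}
```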
22
+
23
+ ### Changed
24
+ - **Rebranded to FCM Proxy V2**: All user-facing references to "daemon", "FCM Proxy", and "Proxy & Daemon" renamed to "FCM Proxy V2" across CLI messages, TUI overlays, endpoint installer, and service descriptions.
25
+ - **Proxy overlay generalized for all tools**: "Persist proxy in OpenCode" → "Auto-sync proxy to {tool}", "Clean OpenCode proxy config" → "Clean {tool} proxy config". New "Active tool" selector row cycles through all 12 proxy-syncable tools.
26
+ - **Feedback overlay redesigned**: Renamed "Report bug" to "Feedback, bugs & requests", input is now left-aligned with a visible cursor, framed by horizontal separator lines. `I` key now covers all feedback types.
27
+ - **Proxy settings moved to dedicated overlay**: The 5 proxy rows in Settings are replaced by a single entry that opens a full-page manager.
28
+ - **Proxy topology extracted to shared module**: `src/proxy-topology.js` is now used by both TUI and daemon, eliminating code duplication.
29
+ - **TUI delegates to background service**: `ensureProxyRunning()` checks for a running service first and reuses its port/token instead of starting an in-process proxy.
30
+ - **Endpoint installer supports proxy mode**: When installing endpoints (Y key) with "FCM Proxy V2" connection mode, env files point to the service's stable token/port.
31
+ - **Claude Code / Codex / Gemini require proxy**: These tools now refuse to launch without FCM Proxy V2 enabled, showing clear instructions to enable it. When proxy is on, they route through it automatically.
32
+ - **Goose launcher rewritten**: Now writes proper custom provider JSON + updates `config.yaml` with `GOOSE_PROVIDER`/`GOOSE_MODEL` for guaranteed auto-selection (replaces obsolete `OPENAI_HOST`/`OPENAI_MODEL` env vars).
33
+ - **Crush launcher improved**: Removed `disable_default_providers` flag, sets both `models.large` and `models.small` defaults for reliable auto-selection.
34
+ - **Pi launcher improved**: Now passes `--provider` and `--model` CLI flags for guaranteed model pre-selection.
35
+
36
+ ### Fixed
37
+ - **Terminal width warning behavior**: Warning now shows max 2 times per session with emojis and double spacing.
38
+ - **Body size limit** (security): `readBody()` now enforces a 10 MB limit. Oversized payloads receive 413.
39
+ - **Stack trace leak prevention** (security): Error responses no longer include `err.message`.
40
+ - **SSE buffer overflow guard** (security): Anthropic SSE transformer limits buffer to 1 MB.
41
+ - **`new URL()` crash protection**: Malformed upstream URLs caught instead of crashing.
42
+ - **`execSync` timeout safety**: All `execSync` calls in daemon-manager use a 15-second timeout via `execSyncSafe()`.
43
+ - **Daemon startup crash protection**: `loadConfig()`, `buildMergedModelsForDaemon()`, and `buildProxyTopologyFromConfig()` wrapped in try/catch.
44
+ - **`resolveCloudflareUrl()` null guard**: Empty provider URLs skipped instead of crashing.
45
+ - **Health check buffer limit**: Responses capped at 64 KB.
46
+ - **SSE line buffering**: SSE tap now correctly handles lines split across chunk boundaries.
47
+ - **Empty choices fallback**: `translateOpenAIToAnthropic` returns a fallback content block when the OpenAI response has empty choices.
48
+ - **Tool calls streaming index tracking**: Proper `nextBlockIndex`/`currentBlockIndex` counters for correct indexing across multiple tool calls.
49
+ - **Pipe error propagation**: Error handlers on both sides of SSE pipes to prevent uncaught errors on mid-stream disconnects.
50
+ - **Input validation**: `translateAnthropicToOpenAI` guards against null/undefined/non-object input.
51
+ - **Hot-reload race condition**: Config watcher uses `reloadInProgress` flag to prevent concurrent reloads.
52
+ - **Fake response stubs**: Added `destroy()`, `removeListener()`, and `socket: null` for better compatibility.
53
+ - **API key trimming**: Whitespace-trimmed and empty keys filtered out in topology builder.
54
+ - **`writeFileSync` error messages**: Plist and systemd service file write failures now throw clear error messages.
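Several of the fixes above harden the Anthropic↔OpenAI translator. The empty-choices fallback, for instance, can be sketched as follows; the function name and object shapes are illustrative, not FCM's exact translator code:

```javascript
// Sketch of the "empty choices fallback": when the upstream OpenAI-style
// response has no usable choice, return a minimal Anthropic-style content
// block instead of crashing on undefined access.
function translateOpenAIToAnthropicContent(openaiResponse) {
  const choice = openaiResponse?.choices?.[0]
  if (!choice?.message) {
    return [{ type: 'text', text: '' }] // fallback content block
  }
  return [{ type: 'text', text: choice.message.content ?? '' }]
}
```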
55
+
56
+ ---
57
+
58
+ ## 0.2.17
59
+
60
+ ### Added
61
+ - **All coding tools re-enabled in Z-cycle**: Aider, Claude Code, Codex CLI, Gemini CLI, Qwen Code, OpenHands, and Amp are now back in the public tool mode cycle alongside OpenCode CLI, OpenCode Desktop, OpenClaw, Crush, Goose, and Pi — 13 tools total.
62
+ - **All coding tools available in Install Endpoints (Y key)**: The endpoint installer now supports all 13 tools as install targets, not just the original 5. Each tool gets its proper config format (JSON, YAML, or env file).
63
+ - **Connection mode choice in Install flow**: When installing endpoints (Y key), users now choose between **Direct Provider** (pure API connection) or **FCM Proxy** (local proxy with key rotation and usage tracking) — new Step 3 in the 5-step flow.
64
+ - **Install support for new tools**: Pi (models.json + settings.json), Aider (.aider.conf.yml), Amp (settings.json), Gemini (settings.json), Qwen (modelProviders config), Claude Code/Codex/OpenHands (sourceable env files at ~/.fcm-*-env).
65
+
66
+ ### Changed
67
+ - **Install Endpoints flow is now 5 steps**: Provider → Tool → Connection Mode → Scope → Models (was 4 steps without connection mode choice).
68
+ - **Tool labels in install overlay use metadata**: Tool names and emojis in the Y overlay now come from `tool-metadata.js` instead of hard-coded ternary chains — easier to maintain and always in sync.
69
+ - **Help overlay updated**: Z-cycle hint and CLI flag examples now list all 13 tools.
70
+
71
+ ---
72
+
73
+ ## 0.2.16
74
+
75
+ ### Fixed
76
+ - **Changelog index viewport scrolling**: Cursor in changelog version list (N key) now scrolls the viewport to follow selection — previously scrolling past the last visible row would move the cursor off-screen into empty space.
77
+
78
+ ### Added
79
+ - **Changelog version summaries**: Each version in the changelog index now shows a short summary (first change title, ~15 words) after the change count, displayed in dim for readability.
80
+
81
+ ---
82
+
5
83
  ## 0.2.15
6
84
 
7
85
  ### Changed
package/README.md CHANGED
@@ -69,8 +69,8 @@ By Vanessa Depraute
69
69
 
70
70
  - **🎯 Coding-focused** — Only LLM models optimized for code generation, not chat or vision
71
71
  - **🌐 Multi-provider** — Models from NVIDIA NIM, Groq, Cerebras, SambaNova, OpenRouter, Hugging Face Inference, Replicate, DeepInfra, Fireworks AI, Codestral, Hyperbolic, Scaleway, Google AI, SiliconFlow, Together AI, Cloudflare Workers AI, Perplexity API, Alibaba Cloud (DashScope), ZAI, and iFlow
72
- - **⚙️ Settings screen** — Press `P` to manage provider API keys, enable/disable providers, configure the proxy, clean OpenCode proxy sync, and manually check/install updates
73
- - **🔀 Multi-account Proxy (`fcm-proxy`)** — Automatically starts a local reverse proxy that groups all your accounts into a single provider in OpenCode; supports multi-account rotation and auto-detects usage limits to swap between providers.
72
+ - **⚙️ Settings screen** — Press `P` to manage provider API keys, enable/disable providers, access FCM Proxy V2 settings, and check/install updates
73
+ - **📡 FCM Proxy V2** — Built-in reverse proxy with multi-key rotation, rate-limit failover, and Anthropic wire format translation for Claude Code. Optional always-on background service (`launchd`/`systemd`) keeps the proxy running 24/7 — even without the TUI. Dedicated overlay with full status, restart, stop, force-kill, and log viewer.
74
74
  - **🚀 Parallel pings** — All models tested simultaneously via native `fetch`
75
75
  - **📊 Real-time animation** — Watch latency appear live in alternate screen buffer
76
76
  - **🏆 Smart ranking** — Top 3 fastest models highlighted with medals 🥇🥈🥉
@@ -87,7 +87,7 @@ By Vanessa Depraute
87
87
  - **💻 OpenCode integration** — Auto-detects NIM setup, sets model as default, launches OpenCode
88
88
  - **🦞 OpenClaw integration** — Sets selected model as default provider in `~/.openclaw/openclaw.json`
89
89
  - **🧰 Public tool launchers** — `Enter` auto-configures and launches 10+ tools: `OpenCode CLI`, `OpenCode Desktop`, `OpenClaw`, `Crush`, `Goose`, `Aider`, `Claude Code`, `Codex`, `Gemini`, `Qwen`, `OpenHands`, `Amp`, and `Pi`. All tools auto-select the chosen model on launch.
90
- - **🔌 Install Endpoints flow** — Press `Y` to install one configured provider directly into `OpenCode CLI`, `OpenCode Desktop`, `OpenClaw`, `Crush`, `Goose`, `Aider`, or `Gemini`, either with the full provider catalog or a curated subset of models
90
+ - **🔌 Install Endpoints flow** — Press `Y` to install one configured provider into any of the 13 supported tools, with a choice between **Direct Provider** (pure API) or **FCM Proxy V2** (key rotation + usage tracking), then pick all models or a curated subset
91
91
  - **📝 Feature Request (J key)** — Send anonymous feedback directly to the project team
92
92
  - **🐛 Bug Report (I key)** — Send anonymous bug reports directly to the project team
93
93
  - **🎨 Clean output** — Zero scrollback pollution, interface stays open until Ctrl+C
@@ -237,8 +237,8 @@ free-coding-models --tier S --json
237
237
 
238
238
  Running `free-coding-models` with no launcher flag starts in **OpenCode CLI** mode.
239
239
 
240
- - Press **`Z`** in the TUI to cycle the public launch targets: `OpenCode CLI` → `OpenCode Desktop` → `OpenClaw` → `Crush` → `Goose`
241
- - Or start directly in the target mode with a CLI flag such as `--opencode-desktop`, `--openclaw`, `--crush`, or `--goose`
240
+ - Press **`Z`** in the TUI to cycle the public launch targets: `OpenCode CLI` → `OpenCode Desktop` → `OpenClaw` → `Crush` → `Goose` → `Pi` → `Aider` → `Claude Code` → `Codex` → `Gemini` → `Qwen` → `OpenHands` → `Amp`
241
+ - Or start directly in the target mode with a CLI flag such as `--opencode-desktop`, `--openclaw`, `--crush`, `--goose`, `--pi`, `--aider`, `--claude-code`, `--codex`, `--gemini`, `--qwen`, `--openhands`, or `--amp`
242
242
  - The active target is always visible in the header badge before you press `Enter`
243
243
 
244
244
  **How it works:**
@@ -554,31 +554,92 @@ Stability = 0.30 × p95_score
554
554
 
555
555
  ---
556
556
 
557
- ## 🔀 Multi-Account Proxy (`fcm-proxy`)
557
+ ## 📡 FCM Proxy V2
558
558
 
559
- `free-coding-models` includes a built-in reverse proxy that can group all your provider accounts into a single virtual provider.
559
+ `free-coding-models` includes a local reverse proxy that merges all your provider API keys into one endpoint. Optional background service mode keeps it running 24/7 — even without the TUI.
560
560
 
561
- Important:
562
- - **Disabled by default** — proxy mode is now opt-in from the Settings screen (`P`)
563
- - **Direct OpenCode launch remains the default** when proxy mode is off
564
- - **Token/request logs are only populated by proxied requests** today
561
+ > **Disabled by default** — enable in Settings (`P`) → FCM Proxy V2 settings.
565
562
 
566
- ### Why use the proxy?
567
- - **Unified Provider**: Instead of managing 20+ providers in your coding assistant, just use `fcm-proxy`.
568
- - **Automatic Rotation**: When one account hits its rate limit (429), the proxy automatically swaps to the next available account for that model.
569
- - **Quota Awareness**: The proxy tracks usage in real-time and prioritizes accounts with the most remaining bandwidth.
570
- - **Transparent Bridging**: Automatically handles non-standard API paths (like ZAI's `/api/coding/paas/v4/`) and converts them to standard OpenAI-compatible `/v1/` calls.
563
+ ### What the proxy does
571
564
 
572
- ### How to use it
573
- 1. Open **Settings** with `P`
574
- 2. Enable **Proxy mode (opt-in)**
575
- 3. Optionally enable **Persist proxy in OpenCode** if you explicitly want `fcm-proxy` written to `~/.config/opencode/opencode.json`
576
- 4. Optionally set **Preferred proxy port** (`0` = auto)
577
- 5. Use `S` in Settings to sync the proxy catalog into OpenCode only when you actually want that persistent config
565
+ | Feature | Description |
566
+ |---------|-------------|
567
+ | **Unified endpoint** | One URL (`http://127.0.0.1:18045/v1`) replaces 20+ provider endpoints |
568
+ | **Key rotation** | Automatically swaps to the next API key when one hits rate limits (429) |
569
+ | **Usage tracking** | Tracks token consumption per provider/model pair in real-time |
570
+ | **Anthropic translation** | Claude Code sends `POST /v1/messages` the proxy translates to OpenAI format upstream |
571
+ | **Path normalization** | Converts non-standard API paths (ZAI, Cloudflare) to standard `/v1/` calls |
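The key-rotation row can be illustrated with a minimal sketch. The account shape (`{ key, rateLimitedUntil }`) and function names are assumptions for illustration, not FCM's real data model:

```javascript
// Pick the first account that is not currently inside a rate-limit window.
function pickAccount(accounts, now = Date.now()) {
  return accounts.find(a => !a.rateLimitedUntil || a.rateLimitedUntil <= now) ?? null
}

// On a 429 response, park this key so pickAccount routes around it until
// the window expires (60s is an assumed default, not FCM's actual value).
function markRateLimited(account, retryAfterMs = 60_000, now = Date.now()) {
  account.rateLimitedUntil = now + retryAfterMs
}
```

In this sketch, a 429 on one key makes the next request transparently use the following key, and the parked key becomes eligible again once its window passes.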
578
572
 
579
- Cleanup:
580
- - Use the Settings action **Clean OpenCode proxy config**
581
- - Or run `free-coding-models --clean-proxy`
573
+ ### In-process vs Background Service mode
574
+
575
+ | | In-process (default) | Background Service (always-on) |
576
+ |---|---|---|
577
+ | **Lifetime** | Starts/stops with TUI | Survives reboots |
578
+ | **Use case** | Quick sessions | 24/7 access from any tool |
579
+ | **Setup** | Toggle in Settings | One-time install via TUI or CLI |
580
+ | **Port** | Random or configured | Stable (`18045` by default) |
581
+ | **Token** | New each session | Persistent (env files stay valid) |
582
+
583
+ ### Quick setup
584
+
585
+ **Via TUI (recommended):**
586
+ 1. Press `P` to open Settings
587
+ 2. Select **FCM Proxy V2 settings →** and press Enter
588
+ 3. Enable **Proxy mode**, then select **Install background service**
589
+
590
+ **Via CLI:**
591
+ ```bash
592
+ free-coding-models daemon install # Install + start as OS service
593
+ free-coding-models daemon status # Check running status
594
+ free-coding-models daemon restart # Restart after config changes
595
+ free-coding-models daemon stop # Graceful stop (SIGTERM)
596
+ free-coding-models daemon uninstall # Remove OS service completely
597
+ free-coding-models daemon logs # Show recent service logs
598
+ ```
599
+
600
+ ### Service management
601
+
602
+ The dedicated **FCM Proxy V2** overlay (accessible via `J` from the main TUI, or Settings → Enter) provides full control:
603
+
604
+ - **Active tool selector** — Choose which AI coding tool receives proxy config (cycles through all 12 syncable tools)
605
+ - **Auto-sync toggle** — Automatically write the `fcm-proxy` provider to the active tool's config when the proxy starts
606
+ - **Cleanup** — Remove `fcm-proxy` entries from the active tool's config (works for any syncable tool)
607
+ - **Status display** — Running/Stopped/Stale/Unhealthy with PID, port, uptime, account/model counts
608
+ - **Version mismatch detection** — warns if service version differs from installed FCM version
609
+ - **Restart** — stop + start via the OS service manager
610
+ - **Stop** — graceful SIGTERM (service may auto-restart if installed)
611
+ - **Force kill** — emergency SIGKILL for stuck processes
612
+ - **View logs** — last 50 lines from `~/.free-coding-models/daemon-stdout.log`
613
+
614
+ ### Platform support
615
+
616
+ | Platform | Service type | Config path |
617
+ |----------|-------------|-------------|
618
+ | macOS | `launchd` LaunchAgent | `~/Library/LaunchAgents/com.fcm.proxy.plist` |
619
+ | Linux | `systemd` user service | `~/.config/systemd/user/fcm-proxy.service` |
620
+ | Windows | Not supported | Falls back to in-process proxy |
621
+
622
+ ### Config files
623
+
624
+ | File | Purpose |
625
+ |------|---------|
626
+ | `~/.free-coding-models.json` | API keys, proxy settings, service consent |
627
+ | `~/.free-coding-models/daemon.json` | Status file (PID, port, token) — written by the background service |
628
+ | `~/.free-coding-models/daemon-stdout.log` | Service output log |
629
+
630
+ The `proxy.activeTool` field in the config file tracks which tool the proxy auto-syncs to (e.g. `"opencode"`, `"aider"`, `"claude-code"`).
631
+
632
+ ### Cleanup
633
+
634
+ - From the FCM Proxy V2 overlay: **Clean {tool} proxy config** — removes `fcm-proxy` entries from whichever tool is currently selected
635
+ - Or: `free-coding-models --clean-proxy`
636
+
637
+ ### Safety
638
+
639
+ - **Dev guard**: `installDaemon()` is blocked when running from a git checkout — prevents hardcoding local repo paths in OS service files
640
+ - **Localhost only**: The proxy listens on `127.0.0.1`, never exposed to the network
641
+ - **Consent required**: Service installation requires explicit user action — never auto-installs
642
+ - **Hot-reload**: Config changes are picked up automatically without restarting the service
582
643
 
583
644
  ---
584
645
 
@@ -605,17 +666,21 @@ You can use `free-coding-models` with 12+ AI coding tools. When you select a mod
605
666
  | OpenCode Desktop | `--opencode-desktop` | Opens Desktop app |
606
667
  | OpenClaw | `--openclaw` | ~/.openclaw/openclaw.json |
607
668
  | Crush | `--crush` | ~/.config/crush/crush.json |
608
- | Goose | `--goose` | Environment variables |
669
+ | Goose | `--goose` | ~/.config/goose/config.yaml + custom_providers/ |
609
670
  | **Aider** | `--aider` | ~/.aider.conf.yml |
610
- | **Claude Code** | `--claude-code` | CLI flag |
611
- | **Codex** | `--codex` | CLI flag |
612
- | **Gemini** | `--gemini` | ~/.gemini/settings.json |
671
+ | **Claude Code** | `--claude-code` | Requires FCM Proxy V2 |
672
+ | **Codex** | `--codex` | Requires FCM Proxy V2 |
673
+ | **Gemini** | `--gemini` | Requires FCM Proxy V2 |
613
674
  | **Qwen** | `--qwen` | ~/.qwen/settings.json |
614
675
  | **OpenHands** | `--openhands` | LLM_MODEL env var |
615
676
  | **Amp** | `--amp` | ~/.config/amp/settings.json |
616
677
  | **Pi** | `--pi` | ~/.pi/agent/settings.json |
617
678
 
618
- Press **Z** to cycle through different tool modes in the TUI, or use flags to start in your preferred mode.
679
+ Press **Z** to cycle through all 13 tool modes in the TUI, or use flags to start in your preferred mode.
680
+
681
+ ⚡ = Requires FCM Proxy V2 background service (press `J` to enable). These tools cannot connect to free providers without the proxy.
682
+
683
+ All tools are also available as install targets in the **Install Endpoints** flow (`Y` key) — install an entire provider catalog into any tool with one flow, choosing between Direct Provider or FCM Proxy V2 connection.
619
684
 
620
685
  ---
621
686
 
@@ -914,7 +979,7 @@ This script:
914
979
  | `--tier C` | Show only C tier models |
915
980
  | `--profile <name>` | Load a saved config profile on startup |
916
981
  | `--recommend` | Auto-open Smart Recommend overlay on start |
917
- | `--clean-proxy` | Remove persisted `fcm-proxy` config from OpenCode |
982
+ | `--clean-proxy` | Remove persisted `fcm-proxy` config from the active tool |
918
983
 
919
984
  **Keyboard shortcuts (main TUI):**
920
985
  - **↑↓** — Navigate models
@@ -924,17 +989,18 @@ This script:
924
989
  - **T** — Cycle tier filter (All → S+ → S → A+ → A → A- → B+ → B → C → All)
925
990
  - **D** — Cycle provider filter (All → NIM → Groq → ...)
926
991
  - **E** — Toggle configured-only mode (on by default, persisted across sessions and profiles)
927
- - **Z** — Cycle target tool (OpenCode CLI → OpenCode Desktop → OpenClaw → Crush → Goose)
992
+ - **Z** — Cycle target tool (OpenCode CLI → Desktop → OpenClaw → Crush → Goose → Pi → Aider → Claude Code → Codex → Gemini → Qwen → OpenHands → Amp)
928
993
  - **X** — Toggle request logs (recent proxied request/token usage logs, up to 500 entries)
929
994
  - **A (in logs)** — Toggle between showing 500 entries or ALL logs
930
995
  - **P** — Open Settings (manage API keys, toggles, updates, profiles)
931
- - **Y** — Open Install Endpoints (`provider → tool → all models` or `selected models only`, no proxy)
996
+ - **Y** — Open Install Endpoints (`provider → tool → connection mode → scope → models`, Direct or FCM Proxy V2)
932
997
  - **Shift+P** — Cycle through saved profiles (switches live TUI settings)
933
998
  - **Shift+S** — Save current TUI settings as a named profile (inline prompt)
934
999
  - **Q** — Open Smart Recommend overlay (find the best model for your task)
935
1000
  - **N** — Open Changelog overlay (browse index of all versions, `Enter` to view details, `B` to go back)
936
1001
  - **W** — Cycle ping mode (`FAST` 2s → `NORMAL` 10s → `SLOW` 30s → `FORCED` 4s)
937
- - **J / I** — Request feature / Report bug
1002
+ - **J** — Open FCM Proxy V2 settings (shows green "Proxy On" / red "Proxy Off" badge in footer)
1003
+ - **I** — Feedback, bugs & requests
938
1004
  - **K / Esc** — Show help overlay / Close overlay
939
1005
  - **Ctrl+C** — Exit
940
1006
 
@@ -942,18 +1008,25 @@ Pressing **K** now shows a full in-app reference: main hotkeys, settings hotkeys
942
1008
 
943
1009
  ### 🔌 Install Endpoints (`Y`)
944
1010
 
945
- `Y` opens a dedicated install flow for configured providers. The flow is:
1011
+ `Y` opens a dedicated install flow for configured providers. The 5-step flow is:
946
1012
 
947
- 1. Pick one provider that already has an API key in Settings
948
- 2. Pick the target tool: `OpenCode CLI`, `OpenCode Desktop`, `OpenClaw`, `Crush`, or `Goose`
949
- 3. Choose either `Install all models` or `Install selected models only`
1013
+ 1. **Provider** — Pick one provider that already has an API key in Settings
1014
+ 2. **Tool** — Pick the target tool from all 13 supported tools:
1015
+ - Config-based: `OpenCode CLI`, `OpenCode Desktop`, `OpenClaw`, `Crush`, `Goose`, `Pi`, `Aider`, `Amp`, `Gemini`, `Qwen`
1016
+ - Env-file based: `Claude Code`, `Codex CLI`, `OpenHands` (writes `~/.fcm-{tool}-env` — source it before launching)
1017
+ 3. **Connection Mode** — Choose how the tool connects to the provider:
1018
+ - **⚡ Direct Provider** — pure API connection, no proxy involved
1019
+ - **🔄 FCM Proxy V2** — route through FCM Proxy V2 with key rotation and usage tracking
1020
+ 4. **Scope** — Choose `Install all models` or `Install selected models only`
1021
+ 5. **Models** (if scope = selected) — Multi-select individual models from the provider catalog
950
1022
 
951
1023
  Important behavior:
952
1024
 
953
- - Installs are written directly into the target tool config as FCM-managed entries, without going through `fcm-proxy`
1025
+ - Installs are written into the target tool config as FCM-managed entries (namespaced under `fcm-*`)
954
1026
  - `Install all models` is the recommended path because FCM can refresh that catalog automatically on later launches when the provider model list changes
955
1027
  - `Install selected models only` is useful when you want a smaller curated picker inside the target tool
956
1028
  - `OpenCode CLI` and `OpenCode Desktop` share the same `opencode.json`, so the managed provider appears in both
1029
+ - For env-based tools (Claude Code, Codex, OpenHands), FCM writes a sourceable file at `~/.fcm-{tool}-env` — run `source ~/.fcm-claude-code-env` before launching
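For the Claude Code case, such a sourceable env file could hypothetically be generated like this. The variable names (`ANTHROPIC_BASE_URL`, `ANTHROPIC_AUTH_TOKEN`) are standard Claude Code environment variables, but their use here, and the helper itself, are assumptions rather than FCM's actual output:

```javascript
// Hypothetical sketch: render a sourceable env file pointing Claude Code
// at the proxy. Variable names are assumptions, not FCM's actual file.
function renderClaudeCodeEnvFile(baseUrl, token) {
  return [
    `export ANTHROPIC_BASE_URL="${baseUrl}"`,
    `export ANTHROPIC_AUTH_TOKEN="${token}"`,
  ].join('\n') + '\n'
}
```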
957
1030
 
958
1031
  **Keyboard shortcuts (Settings screen — `P` key):**
959
1032
  - **↑↓** — Navigate providers, maintenance row, and profile rows
@@ -0,0 +1,239 @@
1
+ #!/usr/bin/env node
2
+
3
+ /**
4
+ * @file bin/fcm-proxy-daemon.js
5
+ * @description Standalone headless FCM proxy daemon — runs independently of the TUI.
6
+ *
7
+ * 📖 This is the always-on background proxy server. It reads the user's config
8
+ * (~/.free-coding-models.json), builds the proxy topology (merged models × API keys),
9
+ * and starts a ProxyServer on a stable port with a stable token.
10
+ *
11
+ * 📖 When installed as a launchd LaunchAgent (macOS) or systemd user service (Linux),
12
+ * this daemon starts at login and persists across reboots, allowing Claude Code,
13
+ * Gemini CLI, OpenCode, and all other tools to access free models 24/7.
14
+ *
15
+ * 📖 Status file: ~/.free-coding-models/daemon.json
16
+ * Contains PID, port, token, version, model/account counts. The TUI reads this
17
+ * to detect a running daemon and delegate instead of starting an in-process proxy.
18
+ *
19
+ * 📖 Hot-reload: Watches ~/.free-coding-models.json for changes and reloads the
20
+ * proxy topology (accounts, models) without restarting the process.
21
+ *
22
+ * @see src/proxy-topology.js — shared topology builder
23
+ * @see src/proxy-server.js — ProxyServer implementation
24
+ * @see src/daemon-manager.js — install/uninstall/status management
25
+ */
26
+
27
+ import { readFileSync, writeFileSync, existsSync, mkdirSync, unlinkSync, watch } from 'node:fs'
28
+ import { join } from 'node:path'
29
+ import { homedir } from 'node:os'
30
+ import { createRequire } from 'node:module'
31
+ import { fileURLToPath } from 'node:url'
32
+
33
+ // 📖 Resolve package.json for version info
34
+ const __dirname = fileURLToPath(new URL('.', import.meta.url))
35
+ let PKG_VERSION = 'unknown'
36
+ try {
37
+ const pkg = JSON.parse(readFileSync(join(__dirname, '..', 'package.json'), 'utf8'))
38
+ PKG_VERSION = pkg.version || 'unknown'
39
+ } catch { /* ignore */ }
40
+
41
+ // 📖 Config + data paths
42
+ const CONFIG_PATH = join(homedir(), '.free-coding-models.json')
43
+ const DATA_DIR = join(homedir(), '.free-coding-models')
44
+ const DAEMON_STATUS_FILE = join(DATA_DIR, 'daemon.json')
45
+ const LOG_PREFIX = '[fcm-daemon]'
46
+
47
+ // 📖 Default daemon port — high port unlikely to conflict
48
+ const DEFAULT_DAEMON_PORT = 18045
49
+
50
+ // ─── Helpers ─────────────────────────────────────────────────────────────────
51
+
52
+ function log(msg) {
53
+ console.log(`${LOG_PREFIX} ${new Date().toISOString()} ${msg}`)
54
+ }
55
+
56
+ function logError(msg) {
57
+ console.error(`${LOG_PREFIX} ${new Date().toISOString()} ERROR: ${msg}`)
58
+ }
59
+
60
+ /**
61
+ * 📖 Write daemon status file so TUI and tools can discover the running daemon.
62
+ */
63
+ function writeDaemonStatus(info) {
64
+ if (!existsSync(DATA_DIR)) {
65
+ mkdirSync(DATA_DIR, { mode: 0o700, recursive: true })
66
+ }
67
+ writeFileSync(DAEMON_STATUS_FILE, JSON.stringify(info, null, 2), { mode: 0o600 })
68
+ }
69
+
70
+ /**
71
+ * 📖 Remove daemon status file on shutdown.
72
+ */
73
+ function removeDaemonStatus() {
74
+ try {
75
+ if (existsSync(DAEMON_STATUS_FILE)) unlinkSync(DAEMON_STATUS_FILE)
76
+ } catch { /* best-effort */ }
77
+ }
78
+
79
+ // ─── Main ────────────────────────────────────────────────────────────────────
80
+
81
+ async function main() {
82
+ log(`Starting FCM Proxy V2 v${PKG_VERSION} (PID: ${process.pid})`)
83
+
84
+ // 📖 Dynamic imports — keep startup fast, avoid loading TUI-specific modules
85
+ const { loadConfig, getProxySettings } = await import('../src/config.js')
86
+ const { ProxyServer } = await import('../src/proxy-server.js')
87
+ const { buildProxyTopologyFromConfig, buildMergedModelsForDaemon } = await import('../src/proxy-topology.js')
88
+ const { sources } = await import('../sources.js')
89
+
90
+ // 📖 Load config and build initial topology — wrapped in try/catch to provide clear error on startup failures
91
+ let fcmConfig, proxySettings, mergedModels, accounts, proxyModels
92
+ try {
93
+ fcmConfig = loadConfig()
94
+ proxySettings = getProxySettings(fcmConfig)
95
+ } catch (err) {
96
+ logError(`Fatal: Failed to load config: ${err.message}`)
97
+ process.exit(1)
98
+ }
99
+
100
+ if (!proxySettings.stableToken) {
101
+ logError('No stableToken in proxy settings — run the TUI first to initialize config.')
102
+ process.exit(1)
103
+ }
104
+
105
+ const port = proxySettings.preferredPort || DEFAULT_DAEMON_PORT
106
+ const token = proxySettings.stableToken
107
+
108
+ try {
109
+ log(`Building merged model catalog...`)
110
+ mergedModels = await buildMergedModelsForDaemon()
111
+ log(`Merged ${mergedModels.length} model groups`)
112
+
113
+ const topology = buildProxyTopologyFromConfig(fcmConfig, mergedModels, sources)
114
+ accounts = topology.accounts
115
+ proxyModels = topology.proxyModels
116
+ } catch (err) {
117
+ logError(`Fatal: Failed to build initial topology: ${err.message}`)
118
+ process.exit(1)
119
+ }
120
+
121
+ if (accounts.length === 0) {
122
+ logError('No API keys configured — FCM Proxy V2 has no accounts to serve. Add keys via the TUI.')
123
+ process.exit(1)
124
+ }
125
+
126
+ log(`Built proxy topology: ${accounts.length} accounts across ${Object.keys(proxyModels).length} models`)
127
+
128
+ // 📖 Start the proxy server
129
+ const proxy = new ProxyServer({
130
+ port,
131
+ accounts,
132
+ proxyApiKey: token,
133
+ })
134
+
135
+ try {
136
+ const { port: listeningPort } = await proxy.start()
137
+ log(`Proxy listening on 127.0.0.1:${listeningPort}`)
138
+
139
+ // 📖 Write status file for TUI discovery
140
+ const statusInfo = {
141
+ pid: process.pid,
142
+ port: listeningPort,
143
+ token,
144
+ startedAt: new Date().toISOString(),
145
+ version: PKG_VERSION,
146
+ modelCount: Object.keys(proxyModels).length,
147
+ accountCount: accounts.length,
148
+ }
149
+ writeDaemonStatus(statusInfo)
150
+ log(`Status file written to ${DAEMON_STATUS_FILE}`)
151
+
152
+ // 📖 Set up config file watcher for hot-reload
153
+ let reloadTimeout = null
154
+ // 📖 Prevents concurrent reloads — if a reload is in progress, the next
155
+ // watcher event will be queued (one pending max) instead of stacking
156
+ let reloadInProgress = false
157
+ let reloadQueued = false
158
+ const configWatcher = watch(CONFIG_PATH, () => {
159
+ // 📖 Debounce 1s — config writes can trigger multiple fs events
160
+ if (reloadTimeout) clearTimeout(reloadTimeout)
161
+ reloadTimeout = setTimeout(async () => {
162
+ if (reloadInProgress) {
163
+ reloadQueued = true
164
+ return
165
+ }
166
+ reloadInProgress = true
167
+ try {
168
+ log('Config file changed — reloading topology...')
169
+ fcmConfig = loadConfig()
170
+ mergedModels = await buildMergedModelsForDaemon()
171
+ const newTopology = buildProxyTopologyFromConfig(fcmConfig, mergedModels, sources)
172
+
173
+ if (newTopology.accounts.length === 0) {
174
+ log('Warning: new topology has 0 accounts — keeping current topology')
175
+ return
176
+ }
177
+
178
+ proxy.updateAccounts(newTopology.accounts)
179
+ accounts = newTopology.accounts
180
+ proxyModels = newTopology.proxyModels
181
+
182
+ // 📖 Update status file
183
+ writeDaemonStatus({
184
+ ...statusInfo,
185
+ modelCount: Object.keys(proxyModels).length,
186
+ accountCount: accounts.length,
187
+ })
188
+
189
+ log(`Topology reloaded: ${accounts.length} accounts, ${Object.keys(proxyModels).length} models`)
190
+ } catch (err) {
191
+ logError(`Hot-reload failed: ${err.message}`)
192
+ } finally {
193
+ reloadInProgress = false
194
+ // 📖 If another reload was queued during this one, trigger it now
195
+ if (reloadQueued) {
196
+ reloadQueued = false
197
+ configWatcher.emit('change')
198
+ }
199
+ }
200
+ }, 1000)
201
+ })
202
+
203
+ // 📖 Graceful shutdown
204
+ const shutdown = async (signal) => {
205
+ log(`Received ${signal} — shutting down...`)
206
+ if (reloadTimeout) clearTimeout(reloadTimeout)
207
+ configWatcher.close()
208
+ try {
209
+ await proxy.stop()
210
+ } catch { /* best-effort */ }
211
+ removeDaemonStatus()
212
+ log('FCM Proxy V2 stopped cleanly.')
213
+ process.exit(0)
214
+ }
215
+
216
+ process.on('SIGINT', () => shutdown('SIGINT'))
217
+ process.on('SIGTERM', () => shutdown('SIGTERM'))
218
+ process.on('exit', () => removeDaemonStatus())
219
+
220
+ // 📖 The open server socket keeps the event loop (and process) alive
221
+ log('FCM Proxy V2 ready. Waiting for requests...')
222
+
223
+ } catch (err) {
224
+ if (err.code === 'EADDRINUSE') {
225
+ logError(`Port ${port} is already in use. Another FCM Proxy V2 instance may be running, or another process occupies this port.`)
226
+ logError(`Change proxy.preferredPort in ~/.free-coding-models.json or stop the conflicting process.`)
227
+ process.exit(2)
228
+ }
229
+ logError(`Failed to start proxy: ${err.message}`)
230
+ removeDaemonStatus()
231
+ process.exit(1)
232
+ }
233
+ }
234
+
235
+ main().catch(err => {
236
+ logError(`Fatal: ${err.message}`)
237
+ removeDaemonStatus()
238
+ process.exit(1)
239
+ })