@elvatis_com/openclaw-cli-bridge-elvatis 0.2.2 → 0.2.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,83 +1,190 @@
1
1
  # openclaw-cli-bridge-elvatis
2
2
 
3
- > OpenClaw plugin that bridges locally installed AI CLIs (Codex, Gemini, Claude Code) as model providers.
3
+ > OpenClaw plugin that bridges locally installed AI CLIs (Codex, Gemini, Claude Code) as model providers — with slash commands for instant model switching.
4
+
5
+ **Current version:** `0.2.2`
6
+
7
+ ---
4
8
 
5
9
  ## What it does
6
10
 
7
- **Phase 1 — Auth bridge:** Registers the `openai-codex` provider using OAuth tokens already stored by the Codex CLI (`~/.codex/auth.json`). No re-login needed.
11
+ ### Phase 1 — Auth bridge (`openai-codex`)
12
+ Registers the `openai-codex` provider by reading OAuth tokens already stored by the Codex CLI (`~/.codex/auth.json`). No re-login needed.
13
+
14
+ ### Phase 2 — Request bridge (local proxy)
15
+ Starts a local OpenAI-compatible HTTP proxy on `127.0.0.1:31337` and configures OpenClaw's `vllm` provider to route calls through `gemini` and `claude` CLI subprocesses.
8
16
 
9
- **Phase 2 Request bridge:** Starts a local OpenAI-compatible HTTP proxy server (default port `31337`) and configures OpenClaw's `vllm` provider to route model calls through `gemini` and `claude` CLI subprocesses.
17
+ Prompt delivery: always via **stdin** or a temp file (never CLI args), which avoids `E2BIG` on long sessions. Each message batch is truncated to the last 20 messages plus the system message (configurable in `src/cli-runner.ts`).
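
The truncation described above can be sketched roughly like this (a minimal illustration using the documented defaults `MAX_MESSAGES=20` and `MAX_MSG_CHARS=4000`; the real logic lives in `src/cli-runner.ts` and may differ in detail):

```typescript
// Sketch only: keep the system message plus the last MAX_MESSAGES entries,
// clipping each message body to MAX_MSG_CHARS characters.
type Msg = { role: string; content: string };

const MAX_MESSAGES = 20;
const MAX_MSG_CHARS = 4000;

function truncateBatch(messages: Msg[]): Msg[] {
  const system = messages.filter((m) => m.role === "system");
  const rest = messages.filter((m) => m.role !== "system").slice(-MAX_MESSAGES);
  return [...system, ...rest].map((m) => ({
    role: m.role,
    content: m.content.slice(0, MAX_MSG_CHARS),
  }));
}
```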
10
18
 
11
19
  | Model reference | CLI invoked |
12
20
  |---|---|
13
- | `vllm/cli-gemini/gemini-2.5-pro` | `gemini -m gemini-2.5-pro -p "<prompt>"` |
14
- | `vllm/cli-gemini/gemini-2.5-flash` | `gemini -m gemini-2.5-flash -p "<prompt>"` |
15
- | `vllm/cli-claude/claude-opus-4-6` | `claude -p -m claude-opus-4-6 --output-format text "<prompt>"` |
16
- | `vllm/cli-claude/claude-sonnet-4-6` | `claude -p -m claude-sonnet-4-6 --output-format text "<prompt>"` |
21
+ | `vllm/cli-gemini/gemini-2.5-pro` | `gemini -m gemini-2.5-pro @<tmpfile>` |
22
+ | `vllm/cli-gemini/gemini-2.5-flash` | `gemini -m gemini-2.5-flash @<tmpfile>` |
23
+ | `vllm/cli-gemini/gemini-3-pro` | `gemini -m gemini-3-pro @<tmpfile>` |
24
+ | `vllm/cli-claude/claude-sonnet-4-6` | `claude -p --output-format text --model claude-sonnet-4-6` (stdin) |
25
+ | `vllm/cli-claude/claude-opus-4-6` | `claude -p --output-format text --model claude-opus-4-6` (stdin) |
26
+ | `vllm/cli-claude/claude-haiku-4-5` | `claude -p --output-format text --model claude-haiku-4-5` (stdin) |
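
Because the proxy speaks the OpenAI chat-completions dialect, any HTTP client can call it directly. A hedged sketch, assuming the default port `31337`, the default `cli-bridge` API key, and the `/v1/chat/completions` path described in this README (`buildProxyRequest` is a name invented here for illustration):

```typescript
// Build a chat-completions request for the local bridge proxy.
// The request is only constructed here; sending it requires the proxy running.
const PROXY_URL = "http://127.0.0.1:31337/v1/chat/completions";

function buildProxyRequest(model: string, prompt: string) {
  return {
    url: PROXY_URL,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: "Bearer cli-bridge", // default proxyApiKey
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: prompt }],
        stream: false,
      }),
    },
  };
}

// Usage (with the proxy running):
//   const { url, init } = buildProxyRequest("cli-claude/claude-sonnet-4-6", "hi");
//   const res = await fetch(url, init);
```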
27
+
28
+ ### Phase 3 — Slash commands
29
+ Six plugin-registered commands for instant model switching (no agent invocation needed):
30
+
31
+ | Command | Switches to |
32
+ |---|---|
33
+ | `/cli-sonnet` | `vllm/cli-claude/claude-sonnet-4-6` |
34
+ | `/cli-opus` | `vllm/cli-claude/claude-opus-4-6` |
35
+ | `/cli-haiku` | `vllm/cli-claude/claude-haiku-4-5` |
36
+ | `/cli-gemini` | `vllm/cli-gemini/gemini-2.5-pro` |
37
+ | `/cli-gemini-flash` | `vllm/cli-gemini/gemini-2.5-flash` |
38
+ | `/cli-gemini3` | `vllm/cli-gemini/gemini-3-pro` |
39
+
40
+ All commands require `requireAuth: true` — only authorized/owner senders can execute them. Each command calls `openclaw models set <model>` via `api.runtime.system.runCommandWithTimeout` and replies with a confirmation.
41
+
42
+ ---
17
43
 
18
44
  ## Requirements
19
45
 
20
- - [OpenClaw](https://openclaw.ai) gateway running
46
+ - [OpenClaw](https://openclaw.ai) gateway (tested with `2026.3.x`)
21
47
  - One or more of:
22
48
  - [`@openai/codex`](https://github.com/openai/codex) — `npm i -g @openai/codex` + `codex login`
23
49
  - [`@google/gemini-cli`](https://github.com/google-gemini/gemini-cli) — `npm i -g @google/gemini-cli` + `gemini auth`
24
50
  - [`@anthropic-ai/claude-code`](https://github.com/anthropic-ai/claude-code) — `npm i -g @anthropic-ai/claude-code` + `claude auth`
25
51
 
52
+ ---
53
+
26
54
  ## Installation
27
55
 
28
56
  ```bash
29
- # Install from ClawHub (once published)
57
+ # From ClawHub
30
58
  clawhub install openclaw-cli-bridge-elvatis
31
59
 
32
- # Or load directly from this repo (development)
60
+ # Or from workspace (development / local path)
33
61
  # Add to ~/.openclaw/openclaw.json:
34
- # plugins.load.paths: ["<path-to-this-repo>"]
35
- # plugins.allow: ["openclaw-cli-bridge-elvatis"]
62
+ # plugins.load.paths: ["~/.openclaw/workspace/openclaw-cli-bridge-elvatis"]
36
63
  # plugins.entries.openclaw-cli-bridge-elvatis: { "enabled": true }
37
64
  ```
38
65
 
39
- ## Auth setup (Phase 1 — Codex)
66
+ ---
67
+
68
+ ## Setup
69
+
70
+ ### 1. Enable + restart
40
71
 
41
- After enabling the plugin, register the Codex auth profile:
72
+ ```bash
73
+ # In ~/.openclaw/openclaw.json → plugins.entries:
74
+ "openclaw-cli-bridge-elvatis": { "enabled": true }
75
+
76
+ openclaw gateway restart
77
+ ```
78
+
79
+ ### 2. Register Codex auth (Phase 1, optional)
42
80
 
43
81
  ```bash
44
82
  openclaw models auth login --provider openai-codex
45
83
  # Select: "Codex CLI (existing login)"
46
84
  ```
47
85
 
48
- The proxy server (Phase 2) starts automatically and patches `openclaw.json` with the `vllm` provider config. Restart the gateway to activate the new models.
86
+ ### 3. Verify proxy (Phase 2)
87
+
88
+ On startup the plugin auto-patches `openclaw.json` with the `vllm` provider config (port `31337`) and logs:
89
+
90
+ ```
91
+ [cli-bridge] proxy ready — vllm/cli-gemini/* and vllm/cli-claude/* available
92
+ [cli-bridge] registered 6 slash commands: /cli-sonnet, /cli-opus, /cli-haiku, /cli-gemini, /cli-gemini-flash, /cli-gemini3
93
+ ```
94
+
95
+ ### 4. Switch models (Phase 3)
96
+
97
+ Use any `/cli-*` command from any connected channel:
98
+
99
+ ```
100
+ /cli-sonnet
101
+ → ✅ Switched to Claude Sonnet 4.6 (CLI)
102
+ `vllm/cli-claude/claude-sonnet-4-6`
103
+ ```
104
+
105
+ ---
49
106
 
50
107
  ## Configuration
51
108
 
52
- Add to your `plugins.entries.openclaw-cli-bridge-elvatis.config` in `~/.openclaw/openclaw.json`:
109
+ In `~/.openclaw/openclaw.json` `plugins.entries.openclaw-cli-bridge-elvatis.config`:
53
110
 
54
111
  ```json5
55
112
  {
56
- "enableCodex": true, // register openai-codex from Codex CLI auth
57
- "enableProxy": true, // start the local CLI proxy server
58
- "proxyPort": 31337, // port for the proxy (default: 31337)
59
- "proxyApiKey": "cli-bridge", // key used between OpenClaw and the proxy
60
- "proxyTimeoutMs": 120000 // CLI timeout in ms (default: 2 min)
113
+ "enableCodex": true, // register openai-codex from Codex CLI auth (default: true)
114
+ "enableProxy": true, // start local CLI proxy server (default: true)
115
+ "proxyPort": 31337, // proxy port (default: 31337)
116
+ "proxyApiKey": "cli-bridge", // key between OpenClaw vllm provider and proxy (default: "cli-bridge")
117
+ "proxyTimeoutMs": 120000 // CLI subprocess timeout in ms (default: 120s)
61
118
  }
62
119
  ```
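
For illustration, resolving these keys against their documented defaults might look like this (`withDefaults` is a hypothetical helper, not part of the plugin; the defaults mirror the comments in the config block above):

```typescript
// Hypothetical default-resolution for the plugin config shown above.
interface CliPluginConfig {
  enableCodex?: boolean;
  enableProxy?: boolean;
  proxyPort?: number;
  proxyApiKey?: string;
  proxyTimeoutMs?: number;
}

function withDefaults(cfg: CliPluginConfig) {
  return {
    enableCodex: cfg.enableCodex ?? true,
    enableProxy: cfg.enableProxy ?? true,
    proxyPort: cfg.proxyPort ?? 31337,
    proxyApiKey: cfg.proxyApiKey ?? "cli-bridge",
    proxyTimeoutMs: cfg.proxyTimeoutMs ?? 120_000,
  };
}
```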
63
120
 
121
+ ---
122
+
64
123
  ## Architecture
65
124
 
66
125
  ```
67
126
  OpenClaw agent
68
127
 
69
- ├─ openai-codex/* ──► OpenAI API (auth via Codex CLI OAuth tokens)
128
+ ├─ openai-codex/* ──► OpenAI API (auth via ~/.codex/auth.json OAuth tokens)
70
129
 
71
130
  └─ vllm/cli-gemini/* ─┐
72
- vllm/cli-claude/* ─┤─► openclaw-cli-bridge-elvatis proxy (127.0.0.1:31337)
73
- │ ├─ cli-gemini/* → gemini -m <model> -p "<prompt>"
74
- │ └─ cli-claude/* → claude -p -m <model> "<prompt>"
75
- └──────────────────────────────────────────────────────
131
+ vllm/cli-claude/* ─┤─► localhost:31337 (openclaw-cli-bridge proxy)
132
+ │ ├─ cli-gemini/* → gemini -m <model> @<tmpfile>
133
+ │ └─ cli-claude/* → claude -p --model <model> prompt via stdin
134
+ └───────────────────────────────────────────────────
135
+
136
+ Slash commands (bypass agent):
137
+ /cli-sonnet|opus|haiku|gemini|gemini-flash|gemini3
138
+ └─► openclaw models set <model> (atomic, ~1s)
76
139
  ```
77
140
 
78
- ## AAHP handoff
141
+ ---
142
+
143
+ ## Known Issues & Fixes
144
+
145
+ ### `spawn E2BIG` (fixed in v0.2.1)
146
+
147
+ **Symptom:** `CLI error for cli-claude/…: spawn E2BIG` after ~500+ messages in a session.
148
+
149
+ **Root cause:** The OpenClaw gateway modifies `process.env` at runtime (OPENCLAW_* vars, session context, etc.). Spreading the full `process.env` into `spawn()` pushes `argv + envp` over Linux's `ARG_MAX` (~2MB).
150
+
151
+ **Fix:** `buildMinimalEnv()` in `src/cli-runner.ts` passes only `HOME`, `PATH`, `USER`, and auth keys to the subprocess, making the spawn immune to the gateway's runtime env growth.
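
The allowlist approach can be sketched as follows (illustrative only; `buildMinimalEnvSketch` is a hypothetical stand-in, and the real `buildMinimalEnv()` also forwards auth keys):

```typescript
// Forward only a small allowlist of variables to the child process
// instead of spreading the whole (potentially huge) process.env.
const ENV_ALLOWLIST = ["HOME", "PATH", "USER"]; // real code adds auth keys too

function buildMinimalEnvSketch(
  env: Record<string, string | undefined>
): Record<string, string> {
  const out: Record<string, string> = {};
  for (const key of ENV_ALLOWLIST) {
    const value = env[key];
    if (value !== undefined) out[key] = value;
  }
  return out;
}
```

The resulting env stays tiny no matter how many `OPENCLAW_*` variables the gateway injects at runtime, so `argv + envp` can no longer exceed `ARG_MAX`.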
152
+
153
+ ---
154
+
155
+ ## Development
156
+
157
+ ```bash
158
+ npm run typecheck # tsc --noEmit
159
+ npm test # vitest run
160
+ ```
161
+
162
+ Test coverage: `test/cli-runner.test.ts` — unit tests for `formatPrompt` (truncation, system message handling, MAX_MSG_CHARS).
163
+
164
+ ---
165
+
166
+ ## Changelog
167
+
168
+ ### v0.2.2
169
+ - **feat:** Phase 3 — `/cli-*` slash commands for instant model switching
170
+ - All 6 commands registered via `api.registerCommand` with `requireAuth: true`
171
+ - Calls `openclaw models set <model>` via `api.runtime.system.runCommandWithTimeout`
172
+
173
+ ### v0.2.1
174
+ - **fix:** `spawn E2BIG` — use `buildMinimalEnv()` instead of spreading full `process.env`
175
+ - **feat:** Added `test/cli-runner.test.ts` (5 unit tests)
176
+ - Added Gemini 3 Pro model (`vllm/cli-gemini/gemini-3-pro`)
177
+
178
+ ### v0.2.0
179
+ - **feat:** Phase 2 — local OpenAI-compatible proxy server
180
+ - Prompt via stdin/tmpfile (never as CLI arg) to prevent arg-size issues
181
+ - `MAX_MESSAGES=20` + `MAX_MSG_CHARS=4000` truncation in `formatPrompt`
182
+ - Auto-patch of `openclaw.json` vllm provider config on first start
183
+
184
+ ### v0.1.x
185
+ - Phase 1: Codex CLI OAuth auth bridge
79
186
 
80
- Project tracking lives in `.ai/handoff/` (AAHP v3 protocol).
187
+ ---
81
188
 
82
189
  ## License
83
190
 
package/SKILL.md CHANGED
@@ -1,24 +1,52 @@
1
1
  ---
2
2
  name: openclaw-cli-bridge-elvatis
3
- description: Bridge local Codex, Gemini, and Claude Code CLIs into OpenClaw (Codex OAuth auth bridge + Gemini/Claude OpenAI-compatible local proxy via vllm).
3
+ description: Bridge local Codex, Gemini, and Claude Code CLIs into OpenClaw as vllm model providers. Includes /cli-* slash commands for instant model switching (/cli-sonnet, /cli-opus, /cli-haiku, /cli-gemini, /cli-gemini-flash, /cli-gemini3). E2BIG-safe spawn via minimal env.
4
4
  homepage: https://github.com/elvatis/openclaw-cli-bridge-elvatis
5
5
  metadata:
6
6
  {
7
7
  "openclaw":
8
8
  {
9
9
  "emoji": "🌉",
10
- "requires": { "bins": ["openclaw", "codex", "gemini", "claude"] }
10
+ "requires": { "bins": ["openclaw", "claude", "gemini"] },
11
+ "commands": ["/cli-sonnet", "/cli-opus", "/cli-haiku", "/cli-gemini", "/cli-gemini-flash", "/cli-gemini3"]
11
12
  }
12
13
  }
13
14
  ---
14
15
 
15
- # OpenClaw CLI Bridge Elvatis
16
+ # OpenClaw CLI Bridge
16
17
 
17
- This project provides two layers:
18
+ Bridges locally installed AI CLIs (Codex, Gemini, Claude Code) as OpenClaw model providers. Three phases:
18
19
 
19
- 1. **Codex auth bridge** for `openai-codex/*` by reading existing Codex CLI OAuth tokens from `~/.codex/auth.json`
20
- 2. **Local OpenAI-compatible proxy** (default `127.0.0.1:31337`) for Gemini/Claude CLI execution via OpenClaw `vllm` provider models:
21
- - `vllm/cli-gemini/*`
22
- - `vllm/cli-claude/*`
20
+ ## Phase 1 — Codex Auth Bridge
21
+ Registers `openai-codex` provider from existing `~/.codex/auth.json` tokens. No re-login.
23
22
 
24
- See `README.md` for setup and architecture.
23
+ ## Phase 2 — Request Proxy
24
+ Local OpenAI-compatible HTTP proxy (`127.0.0.1:31337`) routes vllm model calls to CLI subprocesses:
25
+ - `vllm/cli-gemini/gemini-2.5-pro` / `gemini-2.5-flash` / `gemini-3-pro`
26
+ - `vllm/cli-claude/claude-sonnet-4-6` / `claude-opus-4-6` / `claude-haiku-4-5`
27
+
28
+ Prompts go via stdin/tmpfile — never as CLI args (prevents `E2BIG` for long sessions).
29
+
30
+ ## Phase 3 — Slash Commands
31
+ Six instant model-switch commands (authorized senders only):
32
+
33
+ | Command | Model |
34
+ |---|---|
35
+ | `/cli-sonnet` | `vllm/cli-claude/claude-sonnet-4-6` |
36
+ | `/cli-opus` | `vllm/cli-claude/claude-opus-4-6` |
37
+ | `/cli-haiku` | `vllm/cli-claude/claude-haiku-4-5` |
38
+ | `/cli-gemini` | `vllm/cli-gemini/gemini-2.5-pro` |
39
+ | `/cli-gemini-flash` | `vllm/cli-gemini/gemini-2.5-flash` |
40
+ | `/cli-gemini3` | `vllm/cli-gemini/gemini-3-pro` |
41
+
42
+ Each command runs `openclaw models set <model>` atomically and replies with a confirmation.
43
+
44
+ ## Setup
45
+
46
+ 1. Enable plugin + restart gateway
47
+ 2. (Optional) Register Codex auth: `openclaw models auth login --provider openai-codex`
48
+ 3. Use `/cli-*` commands to switch models from any channel
49
+
50
+ See `README.md` for full configuration reference and architecture diagram.
51
+
52
+ **Version:** 0.2.2
package/index.ts CHANGED
@@ -15,23 +15,27 @@
15
15
  * /cli-gemini → vllm/cli-gemini/gemini-2.5-pro
16
16
  * /cli-gemini-flash → vllm/cli-gemini/gemini-2.5-flash
17
17
  * /cli-gemini3 → vllm/cli-gemini/gemini-3-pro
18
+ * /cli-back → restore model that was active before last /cli-* switch
19
+ * /cli-test [model] → one-shot proxy health check (does NOT switch global model)
18
20
  *
19
21
  * Provider / model naming:
20
- * vllm/cli-gemini/gemini-2.5-pro → `gemini -m gemini-2.5-pro -p "<prompt>"`
21
- * vllm/cli-claude/claude-opus-4-6 → `claude -p -m claude-opus-4-6 --output-format text "<prompt>"`
22
+ * vllm/cli-gemini/gemini-2.5-pro → `gemini -m gemini-2.5-pro @<tmpfile>`
23
+ * vllm/cli-claude/claude-opus-4-6 → `claude -p -m claude-opus-4-6 --output-format text` (stdin)
22
24
  */
23
25
 
26
+ import { readFileSync, writeFileSync, mkdirSync } from "node:fs";
27
+ import { homedir } from "node:os";
28
+ import { join } from "node:path";
29
+ import http from "node:http";
24
30
  import type {
25
31
  OpenClawPluginApi,
26
32
  ProviderAuthContext,
27
33
  ProviderAuthResult,
28
34
  } from "openclaw/plugin-sdk";
29
35
 
30
- // Types derived from the plugin SDK (PluginCommandContext / PluginCommandResult are
31
- // not re-exported from the package, so we infer them from the registerCommand signature).
32
- type RegisterCommandParam = Parameters<OpenClawPluginApi["registerCommand"]>[0];
33
- type PluginCommandContext = Parameters<RegisterCommandParam["handler"]>[0];
34
- type PluginCommandResult = Awaited<ReturnType<RegisterCommandParam["handler"]>>;
36
+ // OpenClawPluginCommandDefinition is defined in the SDK types but not re-exported
37
+ // by the package; derive it from the registerCommand signature.
38
+ type OpenClawPluginCommandDefinition = Parameters<OpenClawPluginApi["registerCommand"]>[0];
35
39
  import { buildOauthProviderAuthResult } from "openclaw/plugin-sdk";
36
40
  import {
37
41
  DEFAULT_CODEX_AUTH_PATH,
@@ -41,14 +45,19 @@ import {
41
45
  import { startProxyServer } from "./src/proxy-server.js";
42
46
  import { patchOpencllawConfig } from "./src/config-patcher.js";
43
47
 
48
+ // ──────────────────────────────────────────────────────────────────────────────
49
+ // Types derived from SDK (not re-exported by the package)
50
+ // ──────────────────────────────────────────────────────────────────────────────
51
+ type RegisterCommandParam = Parameters<OpenClawPluginApi["registerCommand"]>[0];
52
+ type PluginCommandContext = Parameters<RegisterCommandParam["handler"]>[0];
53
+ type PluginCommandResult = Awaited<ReturnType<RegisterCommandParam["handler"]>>;
54
+
44
55
  // ──────────────────────────────────────────────────────────────────────────────
45
56
  // Plugin config type
46
57
  // ──────────────────────────────────────────────────────────────────────────────
47
58
  interface CliPluginConfig {
48
- // Phase 1: auth bridge
49
59
  codexAuthPath?: string;
50
60
  enableCodex?: boolean;
51
- // Phase 2: request proxy
52
61
  enableProxy?: boolean;
53
62
  proxyPort?: number;
54
63
  proxyApiKey?: string;
@@ -59,10 +68,53 @@ const DEFAULT_PROXY_PORT = 31337;
59
68
  const DEFAULT_PROXY_API_KEY = "cli-bridge";
60
69
 
61
70
  // ──────────────────────────────────────────────────────────────────────────────
62
- // Phase 3: slash-command model table
71
+ // State file persists the model that was active before the last /cli-* switch
72
+ // Located at ~/.openclaw/cli-bridge-state.json (survives gateway restarts)
73
+ // ──────────────────────────────────────────────────────────────────────────────
74
+ const STATE_FILE = join(homedir(), ".openclaw", "cli-bridge-state.json");
75
+
76
+ interface CliBridgeState {
77
+ previousModel: string;
78
+ }
79
+
80
+ function readState(): CliBridgeState | null {
81
+ try {
82
+ return JSON.parse(readFileSync(STATE_FILE, "utf8")) as CliBridgeState;
83
+ } catch {
84
+ return null;
85
+ }
86
+ }
87
+
88
+ function writeState(state: CliBridgeState): void {
89
+ try {
90
+ mkdirSync(join(homedir(), ".openclaw"), { recursive: true });
91
+ writeFileSync(STATE_FILE, JSON.stringify(state, null, 2) + "\n", "utf8");
92
+ } catch {
93
+ // non-fatal — /cli-back will just report no previous model
94
+ }
95
+ }
96
+
97
+ // ──────────────────────────────────────────────────────────────────────────────
98
+ // Read the current primary model from openclaw.json
63
99
  // ──────────────────────────────────────────────────────────────────────────────
100
+ function readCurrentModel(): string | null {
101
+ try {
102
+ const cfg = JSON.parse(
103
+ readFileSync(join(homedir(), ".openclaw", "openclaw.json"), "utf8")
104
+ );
105
+ const m = cfg?.agents?.defaults?.model;
106
+ if (typeof m === "string") return m;
107
+ if (typeof m === "object" && m !== null && typeof m.primary === "string")
108
+ return m.primary;
109
+ return null;
110
+ } catch {
111
+ return null;
112
+ }
113
+ }
64
114
 
65
- /** CLI bridge models available via /cli-* slash commands. */
115
+ // ──────────────────────────────────────────────────────────────────────────────
116
+ // Phase 3: model command table
117
+ // ──────────────────────────────────────────────────────────────────────────────
66
118
  const CLI_MODEL_COMMANDS = [
67
119
  {
68
120
  name: "cli-sonnet",
@@ -102,16 +154,24 @@ const CLI_MODEL_COMMANDS = [
102
154
  },
103
155
  ] as const;
104
156
 
157
+ /** Default model used by /cli-test when no arg is given */
158
+ const CLI_TEST_DEFAULT_MODEL = "cli-claude/claude-sonnet-4-6";
159
+
105
160
  // ──────────────────────────────────────────────────────────────────────────────
106
- // Helper: run `openclaw models set <model>` and return result text
161
+ // Helper: switch global model, saving previous for /cli-back
107
162
  // ──────────────────────────────────────────────────────────────────────────────
108
-
109
163
  async function switchModel(
110
164
  api: OpenClawPluginApi,
111
165
  model: string,
112
166
  label: string,
113
- _ctx: PluginCommandContext
114
167
  ): Promise<PluginCommandResult> {
168
+ // Save current model BEFORE switching so /cli-back can restore it
169
+ const current = readCurrentModel();
170
+ if (current && current !== model) {
171
+ writeState({ previousModel: current });
172
+ api.logger.info(`[cli-bridge] saved previous model: ${current}`);
173
+ }
174
+
115
175
  try {
116
176
  const result = await api.runtime.system.runCommandWithTimeout(
117
177
  ["openclaw", "models", "set", model],
@@ -126,7 +186,7 @@ async function switchModel(
126
186
 
127
187
  api.logger.info(`[cli-bridge] switched model → ${model}`);
128
188
  return {
129
- text: `✅ Switched to ${label}\n\`${model}\``,
189
+ text: `✅ Switched to **${label}**\n\`${model}\`\n\nUse \`/cli-back\` to restore previous model.`,
130
190
  };
131
191
  } catch (err) {
132
192
  const msg = (err as Error).message;
@@ -135,17 +195,76 @@ async function switchModel(
135
195
  }
136
196
  }
137
197
 
198
+ // ──────────────────────────────────────────────────────────────────────────────
199
+ // Helper: fire a one-shot test request directly at the proxy (no global switch)
200
+ // ──────────────────────────────────────────────────────────────────────────────
201
+ function proxyTestRequest(
202
+ port: number,
203
+ apiKey: string,
204
+ model: string,
205
+ timeoutMs: number
206
+ ): Promise<string> {
207
+ return new Promise((resolve, reject) => {
208
+ const body = JSON.stringify({
209
+ model,
210
+ messages: [{ role: "user", content: "Reply with exactly: CLI bridge OK" }],
211
+ stream: false,
212
+ });
213
+
214
+ const req = http.request(
215
+ {
216
+ hostname: "127.0.0.1",
217
+ port,
218
+ path: "/v1/chat/completions",
219
+ method: "POST",
220
+ headers: {
221
+ "Content-Type": "application/json",
222
+ "Authorization": `Bearer ${apiKey}`,
223
+ "Content-Length": Buffer.byteLength(body),
224
+ },
225
+ },
226
+ (res) => {
227
+ let data = "";
228
+ res.on("data", (chunk: Buffer) => { data += chunk.toString(); });
229
+ res.on("end", () => {
230
+ try {
231
+ const parsed = JSON.parse(data) as {
232
+ choices?: Array<{ message?: { content?: string } }>;
233
+ error?: { message?: string };
234
+ };
235
+ if (parsed.error) {
236
+ resolve(`Proxy error: ${parsed.error.message}`);
237
+ } else {
238
+ resolve(parsed.choices?.[0]?.message?.content?.trim() ?? "(empty response)");
239
+ }
240
+ } catch {
241
+ resolve(`(non-JSON response: ${data.slice(0, 200)})`);
242
+ }
243
+ });
244
+ }
245
+ );
246
+
247
+ req.setTimeout(timeoutMs, () => {
248
+ req.destroy();
249
+ reject(new Error(`Proxy test timed out after ${timeoutMs}ms`));
250
+ });
251
+ req.on("error", reject);
252
+ req.write(body);
253
+ req.end();
254
+ });
255
+ }
256
+
138
257
  // ──────────────────────────────────────────────────────────────────────────────
139
258
  // Plugin definition
140
259
  // ──────────────────────────────────────────────────────────────────────────────
141
260
  const plugin = {
142
261
  id: "openclaw-cli-bridge-elvatis",
143
262
  name: "OpenClaw CLI Bridge",
144
- version: "0.2.1",
263
+ version: "0.2.3",
145
264
  description:
146
- "Phase 1: openai-codex auth bridge (reads ~/.codex/auth.json). " +
147
- "Phase 2: HTTP proxy server routing model calls through gemini/claude CLIs. " +
148
- "Phase 3: /cli-* slash commands for instant model switching.",
265
+ "Phase 1: openai-codex auth bridge. " +
266
+ "Phase 2: HTTP proxy for gemini/claude CLIs. " +
267
+ "Phase 3: /cli-* model switching, /cli-back restore, /cli-test health check.",
149
268
 
150
269
  register(api: OpenClawPluginApi) {
151
270
  const cfg = (api.pluginConfig ?? {}) as CliPluginConfig;
@@ -176,7 +295,6 @@ const plugin = {
176
295
  try {
177
296
  const creds = await readCodexCredentials(codexAuthPath);
178
297
  spin.stop("Codex CLI credentials loaded");
179
-
180
298
  return buildOauthProviderAuthResult({
181
299
  providerId: "openai-codex",
182
300
  defaultModel: CODEX_DEFAULT_MODEL,
@@ -212,7 +330,7 @@ const plugin = {
212
330
  },
213
331
  });
214
332
 
215
- api.logger.info("[cli-bridge] openai-codex provider registered (Codex CLI auth bridge)");
333
+ api.logger.info("[cli-bridge] openai-codex provider registered");
216
334
  }
217
335
 
218
336
  // ── Phase 2: CLI request proxy ─────────────────────────────────────────────
@@ -226,47 +344,124 @@ const plugin = {
226
344
  })
227
345
  .then(() => {
228
346
  api.logger.info(
229
- `[cli-bridge] proxy ready — vllm/cli-gemini/* and vllm/cli-claude/* available`
347
+ `[cli-bridge] proxy ready on :${port} — vllm/cli-gemini/* and vllm/cli-claude/* available`
230
348
  );
231
-
232
- // Auto-patch openclaw.json with vllm provider config (once)
233
349
  const result = patchOpencllawConfig(port);
234
350
  if (result.patched) {
235
351
  api.logger.info(
236
- `[cli-bridge] openclaw.json patched with vllm provider. ` +
237
- `Restart gateway to activate cli-gemini/* and cli-claude/* models.`
352
+ `[cli-bridge] openclaw.json patched with vllm provider. Restart gateway to activate.`
238
353
  );
239
- } else {
240
- api.logger.info(`[cli-bridge] config check: ${result.reason}`);
241
354
  }
242
355
  })
243
356
  .catch((err: Error) => {
244
- api.logger.warn(
245
- `[cli-bridge] proxy server failed to start on port ${port}: ${err.message}`
246
- );
357
+ api.logger.warn(`[cli-bridge] proxy failed to start on port ${port}: ${err.message}`);
247
358
  });
248
359
  }
249
360
 
250
- // ── Phase 3: /cli-* slash commands ────────────────────────────────────────
361
+ // ── Phase 3a: /cli-* model switch commands ─────────────────────────────────
251
362
  for (const entry of CLI_MODEL_COMMANDS) {
252
- // Capture entry in closure (const iteration variable is stable in TS/ESM)
253
363
  const { name, model, description, label } = entry;
254
-
255
364
  api.registerCommand({
256
365
  name,
257
366
  description,
258
- requireAuth: true, // only authorized senders
367
+ requireAuth: true,
259
368
  handler: async (ctx: PluginCommandContext): Promise<PluginCommandResult> => {
260
- api.logger.info(`[cli-bridge] /${name} triggered by ${ctx.senderId ?? "unknown"} (authorized=${ctx.isAuthorizedSender})`);
261
- return switchModel(api, model, label, ctx);
369
+ api.logger.info(`[cli-bridge] /${name} by ${ctx.senderId ?? "?"}`);
370
+ return switchModel(api, model, label);
262
371
  },
263
- });
372
+ } satisfies OpenClawPluginCommandDefinition);
264
373
  }
265
374
 
266
- api.logger.info(
267
- `[cli-bridge] registered ${CLI_MODEL_COMMANDS.length} slash commands: ` +
268
- CLI_MODEL_COMMANDS.map((c) => `/${c.name}`).join(", ")
269
- );
375
+ // ── Phase 3b: /cli-back — restore previous model ──────────────────────────
376
+ api.registerCommand({
377
+ name: "cli-back",
378
+ description: "Restore the model that was active before the last /cli-* switch",
379
+ requireAuth: true,
380
+ handler: async (ctx: PluginCommandContext): Promise<PluginCommandResult> => {
381
+ api.logger.info(`[cli-bridge] /cli-back by ${ctx.senderId ?? "?"}`);
382
+
383
+ const state = readState();
384
+ if (!state?.previousModel) {
385
+ return { text: "ℹ️ No previous model saved. Use `/cli-sonnet` etc. to switch first." };
386
+ }
387
+
388
+ const prev = state.previousModel;
389
+
390
+ // Clear the saved state so a second /cli-back doesn't bounce back
391
+ writeState({ previousModel: "" });
392
+
393
+ try {
394
+ const result = await api.runtime.system.runCommandWithTimeout(
395
+ ["openclaw", "models", "set", prev],
396
+ { timeoutMs: 8_000 }
397
+ );
398
+
399
+ if (result.code !== 0) {
400
+ const err = (result.stderr || result.stdout || "unknown error").trim();
401
+ return { text: `❌ Failed to restore \`${prev}\`: ${err}` };
402
+ }
403
+
404
+ api.logger.info(`[cli-bridge] /cli-back restored → ${prev}`);
405
+ return { text: `✅ Restored previous model\n\`${prev}\`` };
406
+ } catch (err) {
407
+ return { text: `❌ Error: ${(err as Error).message}` };
408
+ }
409
+ },
410
+ } satisfies OpenClawPluginCommandDefinition);
411
+
412
+ // ── Phase 3c: /cli-test — one-shot proxy ping, no global model switch ──────
413
+ api.registerCommand({
414
+ name: "cli-test",
415
+ description: "Test the CLI bridge proxy without switching your active model. Usage: /cli-test [model]",
416
+ acceptsArgs: true,
417
+ requireAuth: true,
418
+ handler: async (ctx: PluginCommandContext): Promise<PluginCommandResult> => {
419
+ const targetModel = ctx.args?.trim() || CLI_TEST_DEFAULT_MODEL;
420
+ // Accept short names like "cli-sonnet" or full "vllm/cli-claude/claude-sonnet-4-6"
421
+ const model = targetModel.startsWith("vllm/")
422
+ ? targetModel
423
+ : `vllm/${targetModel}`;
424
+
425
+ api.logger.info(`[cli-bridge] /cli-test → ${model} by ${ctx.senderId ?? "?"}`);
426
+
427
+ if (!enableProxy) {
428
+ return { text: "❌ Proxy is disabled (enableProxy: false in config)." };
429
+ }
430
+
431
+ const current = readCurrentModel();
432
+ const testTimeoutMs = Math.min(timeoutMs, 30_000);
433
+
434
+ try {
435
+ const start = Date.now();
436
+ const response = await proxyTestRequest(port, apiKey, model, testTimeoutMs);
437
+ const elapsed = Date.now() - start;
438
+
439
+ return {
440
+ text:
441
+ `🧪 **CLI Bridge Test**\n` +
442
+ `Model: \`${model}\`\n` +
443
+ `Response: _${response}_\n` +
444
+ `Latency: ${elapsed}ms\n\n` +
445
+ `Active model unchanged: \`${current ?? "unknown"}\``,
446
+ };
447
+ } catch (err) {
448
+ return {
449
+ text:
450
+ `❌ **CLI Bridge Test Failed**\n` +
451
+ `Model: \`${model}\`\n` +
452
+ `Error: ${(err as Error).message}\n\n` +
453
+ `Active model unchanged: \`${current ?? "unknown"}\``,
454
+ };
455
+ }
456
+ },
457
+ } satisfies OpenClawPluginCommandDefinition);
458
+
459
+ const allCommands = [
460
+ ...CLI_MODEL_COMMANDS.map((c) => `/${c.name}`),
461
+ "/cli-back",
462
+ "/cli-test",
463
+ ];
464
+ api.logger.info(`[cli-bridge] registered ${allCommands.length} commands: ${allCommands.join(", ")}`);
270
465
  },
271
466
  };
272
467
 
@@ -1,7 +1,7 @@
1
1
  {
2
2
  "id": "openclaw-cli-bridge-elvatis",
3
3
  "name": "OpenClaw CLI Bridge",
4
- "version": "0.2.2",
4
+ "version": "0.2.3",
5
5
  "description": "Phase 1: openai-codex auth bridge. Phase 2: local HTTP proxy routing model calls through gemini/claude CLIs (vllm provider).",
6
6
  "providers": [
7
7
  "openai-codex"
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "@elvatis_com/openclaw-cli-bridge-elvatis",
3
- "version": "0.2.2",
3
+ "version": "0.2.3",
4
4
  "description": "Bridges gemini, claude, and codex CLI tools as OpenClaw model providers. Reads existing CLI auth without re-login.",
5
5
  "type": "module",
6
6
  "scripts": {