@elvatis_com/openclaw-cli-bridge-elvatis 0.2.13 → 0.2.15
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/.ai/handoff/STATUS.md +29 -22
- package/README.md +14 -2
- package/SKILL.md +1 -1
- package/index.ts +1 -1
- package/openclaw.plugin.json +1 -1
- package/package.json +1 -1
- package/src/cli-runner.ts +8 -3
- package/test/cli-runner.test.ts +30 -2
package/.ai/handoff/STATUS.md
CHANGED

@@ -1,42 +1,49 @@
 # STATUS.md — openclaw-cli-bridge-elvatis
 
-_Last updated: 2026-03-
+_Last updated: 2026-03-08 by Akido (claude-sonnet-4-6)_
 
-## Current Version: 0.2.
+## Current Version: 0.2.15 — STABLE
 
 ## What is done
 
 - ✅ Repo: `https://github.com/elvatis/openclaw-cli-bridge-elvatis`
-- ✅ npm: `@elvatis_com/openclaw-cli-bridge-elvatis@0.2.
-- ✅ ClawHub: `openclaw-cli-bridge-elvatis@0.2.
+- ✅ npm: `@elvatis_com/openclaw-cli-bridge-elvatis@0.2.15`
+- ✅ ClawHub: `openclaw-cli-bridge-elvatis@0.2.15`
 - ✅ Phase 1: `openai-codex` provider via `~/.codex/auth.json` (no re-login)
 - ✅ Phase 2: Local OpenAI-compatible proxy on `127.0.0.1:31337` (Gemini + Claude CLI)
 - ✅ Phase 3: 10 slash commands (`/cli-sonnet`, `/cli-opus`, `/cli-haiku`, `/cli-gemini`, `/cli-gemini-flash`, `/cli-gemini3`, `/cli-codex`, `/cli-codex-mini`, `/cli-back`, `/cli-test`)
 - ✅ Config patcher: auto-adds vllm provider to `openclaw.json` on first startup
 - ✅ Prompt delivery via stdin (no E2BIG, no Gemini agentic mode)
-- ✅ `registerService` stop() hook: closes proxy server on plugin teardown
-- ✅ `
+- ✅ `registerService` stop() hook: closes proxy server on plugin teardown
+- ✅ `requireAuth: false` on all commands — webchat + WhatsApp authorized via gateway `commands.allowFrom`
+- ✅ `vllm/` prefix stripping in `routeToCliRunner` — accepts both `vllm/cli-claude/...` and bare `cli-claude/...`
+- ✅ End-to-end tested (2026-03-08): claude-sonnet-4-6 ✅ claude-haiku-4-5 ✅ gemini-2.5-flash ✅ gemini-2.5-pro ✅ codex ✅
+
+## Known Operational Notes
+
+- **Claude CLI auth expires** — token lifetime ~90 days. When `/cli-test` returns 401, run `claude auth login` on the server to refresh.
+- Config patcher writes `openclaw.json` directly → triggers one gateway restart on first install (expected, one-time only)
+- ClawHub publish ignores `.clawhubignore` — use rsync workaround (see CONVENTIONS.md)
 
 ## Bugs Fixed
 
+### v0.2.14 — vllm/ prefix not stripped in model router
+`routeToCliRunner` received full provider path `vllm/cli-claude/...` from OpenClaw
+but only checked for `cli-claude/...` — caused "Unknown CLI bridge model" on all requests.
+Fixed by stripping the `vllm/` prefix before routing.
+
+### v0.2.13 — requireAuth blocking webchat commands
+All `/cli-*` commands had `requireAuth: true`. Plugin-level auth checks `isAuthorizedSender`
+via a different resolution path than `commands.allowFrom` config — webchat senders were
+never authorized. Fixed by setting `requireAuth: false`; gateway-level `commands.allowFrom`
+is the correct security layer.
+
 ### v0.2.9 — Critical: Gateway SIGKILL via fuser
 `fuser -k 31337/tcp` was sending SIGKILL to the gateway process itself during
-in-process hot-reloads.
-so `fuser` found it and killed it — explaining `status=9/KILL` in systemd journal.
-Fixed by replacing `fuser -k` with a safe health probe (`GET /v1/models`): if the
-existing proxy responds, reuse it silently. If EADDRINUSE but no response, wait 1s
-and retry once. No process killing involved.
+in-process hot-reloads. Fixed by replacing `fuser -k` with a safe health probe.
 
-### v0.2.7–v0.2.8 — EADDRINUSE on hot-reload
-Added `closeAllConnections()` + `registerService` stop() hook.
-during systemd restarts due to race condition. v0.2.9 health-probe approach is the
-definitive fix.
+### v0.2.7–v0.2.8 — EADDRINUSE on hot-reload
+Added `closeAllConnections()` + `registerService` stop() hook.
 
 ### v0.2.6 — Port leak on gateway hot-reload
-HTTP proxy server had no cleanup handler.
-
-## Open Risks
-
-- `openai-codex/gpt-5.4` returns 401 missing scope `model.request` — external (OpenAI account scope), not plugin code
-- Config patcher writes `openclaw.json` directly → triggers one gateway restart on first install (expected, one-time only)
-- ClawHub publish ignores `.clawhubignore` — use rsync workaround (see CONVENTIONS.md)
+HTTP proxy server had no cleanup handler.
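The v0.2.9 "safe health probe" described in the STATUS diff above can be sketched as follows. This is a minimal illustration based on the STATUS description, not the plugin's actual code; the function name `probeProxy` and the exact option shapes are assumptions.

```typescript
// Sketch of the v0.2.9 startup strategy: instead of `fuser -k 31337/tcp`
// (which could SIGKILL the gateway itself), probe the port with GET /v1/models
// and reuse a live proxy rather than killing anything. Illustrative only.
import http from "node:http";

// Resolves true if something on 127.0.0.1:<port> answers GET /v1/models with 200.
function probeProxy(port: number, timeoutMs = 1000): Promise<boolean> {
  return new Promise((resolve) => {
    const req = http.get(
      { host: "127.0.0.1", port, path: "/v1/models", timeout: timeoutMs },
      (res) => {
        res.resume(); // drain the body so the socket is released
        resolve(res.statusCode === 200);
      }
    );
    req.on("timeout", () => { req.destroy(); resolve(false); });
    req.on("error", () => resolve(false)); // nothing listening → safe to bind
  });
}
```

Per the STATUS notes, on EADDRINUSE with a failed probe the plugin waits 1 s and retries once — no process killing involved.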
package/README.md
CHANGED

@@ -2,7 +2,7 @@
 
 > OpenClaw plugin that bridges locally installed AI CLIs (Codex, Gemini, Claude Code) as model providers — with slash commands for instant model switching, restore, and health testing.
 
-**Current version:** `0.2.
+**Current version:** `0.2.15`
 
 ---
 
@@ -234,8 +234,20 @@ npm test # vitest run (5 unit tests for formatPrompt)
 
 ## Changelog
 
+### v0.2.15
+- **docs:** Rewrite changelog (entries for v0.2.12–v0.2.14 were corrupted by repeated sed version bumps); all providers verified working (Claude, Gemini, Codex)
+- **docs:** Update STATUS.md with end-to-end test results
+
+### v0.2.14
+- **fix:** Strip `vllm/` prefix in `routeToCliRunner` — OpenClaw sends full provider path (`vllm/cli-claude/...`) but proxy router expected bare `cli-claude/...`; caused "Unknown CLI bridge model" on all requests
+- **test:** Add 4 routing tests covering both prefixed and non-prefixed model paths (9 tests total)
+
 ### v0.2.13
-- **
+- **fix:** Set `requireAuth: false` on all `/cli-*` commands — webchat senders were always blocked because plugin-level auth uses a different resolution path than `commands.allowFrom` config; gateway-level allowlist is the correct security layer
+- **fix:** Hardcoded `version: "0.2.5"` in plugin object (`index.ts`) — now tracks `package.json`
+
+### v0.2.12
+- **docs:** Fix changelog continuity — v0.2.10 entry was lost, v0.2.11 description was wrong; all entries now accurate
 
 ### v0.2.11
 - **docs:** Fix README `Current version` header (was stuck at 0.2.9 after 0.2.10 bump)
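The v0.2.13 fix delegates authorization to the gateway's `commands.allowFrom` allowlist. A purely hypothetical sketch of what that config fragment might look like — the `commands.allowFrom` key path is taken from the changelog above, but the value entries and their format are invented placeholders; consult the gateway's own config schema:

```json
{
  "commands": {
    "allowFrom": ["<webchat-sender-id>", "<whatsapp-sender-id>"]
  }
}
```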
package/SKILL.md
CHANGED
package/index.ts
CHANGED

@@ -277,7 +277,7 @@ function proxyTestRequest(
 const plugin = {
   id: "openclaw-cli-bridge-elvatis",
   name: "OpenClaw CLI Bridge",
-  version: "0.2.
+  version: "0.2.15",
   description:
     "Phase 1: openai-codex auth bridge. " +
     "Phase 2: HTTP proxy for gemini/claude CLIs. " +
package/openclaw.plugin.json
CHANGED

@@ -1,7 +1,7 @@
 {
   "id": "openclaw-cli-bridge-elvatis",
   "name": "OpenClaw CLI Bridge",
-  "version": "0.2.
+  "version": "0.2.15",
   "description": "Phase 1: openai-codex auth bridge. Phase 2: local HTTP proxy routing model calls through gemini/claude CLIs (vllm provider).",
   "providers": [
     "openai-codex"
package/package.json
CHANGED

@@ -1,6 +1,6 @@
 {
   "name": "@elvatis_com/openclaw-cli-bridge-elvatis",
-  "version": "0.2.
+  "version": "0.2.15",
   "description": "Bridges gemini, claude, and codex CLI tools as OpenClaw model providers. Reads existing CLI auth without re-login.",
   "type": "module",
   "openclaw": {
package/src/cli-runner.ts
CHANGED

@@ -254,11 +254,16 @@ export async function routeToCliRunner(
 ): Promise<string> {
   const prompt = formatPrompt(messages);
 
-  if (model.startsWith("cli-gemini/")) return runGemini(prompt, model, timeoutMs);
-  if (model.startsWith("cli-claude/")) return runClaude(prompt, model, timeoutMs);
+  // Strip "vllm/" prefix if present — OpenClaw sends the full provider path
+  // (e.g. "vllm/cli-claude/claude-sonnet-4-6") but the router only needs the
+  // "cli-<type>/<model>" portion.
+  const normalized = model.startsWith("vllm/") ? model.slice(5) : model;
+
+  if (normalized.startsWith("cli-gemini/")) return runGemini(prompt, normalized, timeoutMs);
+  if (normalized.startsWith("cli-claude/")) return runClaude(prompt, normalized, timeoutMs);
 
   throw new Error(
-    `Unknown CLI bridge model: "${model}". Use "cli-gemini/<model>" or "cli-claude/<model>".`
+    `Unknown CLI bridge model: "${model}". Use "vllm/cli-gemini/<model>" or "vllm/cli-claude/<model>".`
   );
 }
 
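The prefix stripping in the diff above reduces to a one-line normalization. A standalone sketch for clarity — the function name `normalizeModel` is illustrative only; the package inlines this logic directly in `routeToCliRunner`:

```typescript
// Standalone sketch of the vllm/ prefix normalization from v0.2.14.
// Hypothetical helper name; the real code inlines this in routeToCliRunner.
function normalizeModel(model: string): string {
  // "vllm/cli-claude/claude-sonnet-4-6" → "cli-claude/claude-sonnet-4-6";
  // bare "cli-<type>/<model>" paths pass through unchanged.
  return model.startsWith("vllm/") ? model.slice("vllm/".length) : model;
}

console.log(normalizeModel("vllm/cli-claude/claude-sonnet-4-6")); // cli-claude/claude-sonnet-4-6
console.log(normalizeModel("cli-gemini/gemini-2.5-pro"));         // cli-gemini/gemini-2.5-pro
```

This is why the routing tests below cover both prefixed and bare model paths: either form must reach the same runner.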
package/test/cli-runner.test.ts
CHANGED

@@ -1,5 +1,5 @@
-import { describe, it, expect } from "vitest";
-import { formatPrompt } from "../src/cli-runner.js";
+import { describe, it, expect, vi } from "vitest";
+import { formatPrompt, routeToCliRunner } from "../src/cli-runner.js";
 
 describe("formatPrompt", () => {
   it("returns empty string for empty messages", () => {
@@ -44,3 +44,31 @@ describe("formatPrompt", () => {
     expect(result).toContain("truncated");
   });
 });
+
+describe("routeToCliRunner — model normalization", () => {
+  it("rejects unknown model without vllm prefix", async () => {
+    await expect(
+      routeToCliRunner("unknown/model", [], 1000)
+    ).rejects.toThrow("Unknown CLI bridge model");
+  });
+
+  it("rejects unknown model with vllm prefix", async () => {
+    await expect(
+      routeToCliRunner("vllm/unknown/model", [], 1000)
+    ).rejects.toThrow("Unknown CLI bridge model");
+  });
+
+  it("accepts cli-claude/ without vllm prefix (calls runClaude path)", async () => {
+    // Should throw CLI spawn error (no real claude), not "Unknown CLI bridge model"
+    await expect(
+      routeToCliRunner("cli-claude/claude-sonnet-4-6", [{ role: "user", content: "hi" }], 500)
+    ).rejects.not.toThrow("Unknown CLI bridge model");
+  });
+
+  it("accepts vllm/cli-claude/ — strips vllm prefix before routing", async () => {
+    // Should throw CLI spawn error (no real claude), not "Unknown CLI bridge model"
+    await expect(
+      routeToCliRunner("vllm/cli-claude/claude-sonnet-4-6", [{ role: "user", content: "hi" }], 500)
+    ).rejects.not.toThrow("Unknown CLI bridge model");
+  });
+});