@elvatis_com/openclaw-cli-bridge-elvatis 1.0.0 → 1.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,54 +1,35 @@
  # STATUS — openclaw-cli-bridge-elvatis
 
- ## Current Version: 0.2.28 (npm + ClawHub + GitHub)
-
- ## What's Done
- - v0.2.25: Sleep-resilient token refresh (setInterval), staged /cli-* switch
- - v0.2.26: Grok DOM-polling bridge (grok-client.ts, grok-session.ts)
- - v0.2.27: Persistent Chromium profile (~/.openclaw/grok-profile/)
- - v0.2.28: Cookie-expiry tracking (/grok-status shows ✅/⚠️/🚨)
- - claude-browser.ts: DOM-automation for claude.ai (not yet in proxy NEXT)
- - 77/77 tests green
-
- ## Next: v0.3.x → v1.0.0 Full Headless Provider Bridge
-
- ### Provider Status
- | Provider | DOM confirmed | browser.ts | Proxy routed | Login cmd | Tests |
- |---|---|---|---|---|---|
- | Grok | ✅ | ✅ grok-client.ts | ✅ web-grok/* | ✅ /grok-login | ✅ |
- | Claude | ✅ | ✅ claude-browser.ts | ❌ | ❌ | partial |
- | Gemini | | | ❌ | ❌ | ❌ |
- | ChatGPT | | | ❌ | ❌ | ❌ |
-
- ### Claude DOM (confirmed 2026-03-11)
- - URL: https://claude.ai/new
- - Editor: .ProseMirror (tiptap)
- - Messages: [data-test-render-count] divs
- - Assistant msgs: child div class "group" (no "mb-1 mt-6")
- - User msgs: child div class "mb-1 mt-6 group"
- - CLOUDFLARE: persistent headless blocked — must use OpenClaw browser (CDP 18800)
- - Tested working: CLAUDE_WORKS response confirmed via OpenClaw browser
-
- ### Next Steps (in order)
- 1. Add connectToOpenClawBrowser() to claude-browser.ts (same as grok-session.ts)
- 2. Add web-claude/* routing to proxy-server.ts (same as web-grok/*)
- 3. Add /claude-login, /claude-status, /claude-logout to index.ts
- 4. Add claude-browser integration tests (DI-override, same as grok-proxy.test.ts)
- 5. Repeat for Gemini (gemini.google.com) and ChatGPT (chatgpt.com)
- 6. Bump to v1.0.0 when all 4 providers green + all tests pass
-
- ## Key Files
- - src/claude-browser.ts — Claude DOM automation (ready, not wired)
- - src/grok-client.ts — reference implementation
- - src/grok-session.ts — reference for login/session management
- - src/proxy-server.ts — add web-claude/* routing here
- - index.ts — add /claude-login here
- - test/claude-browser.test.ts — unit tests (partial, needs proxy integration test)
-
- ## Constraints
- - OpenClaw browser (CDP 18800) required for Cloudflare bypass
- - persistent profile approach fails (fingerprint mismatch)
- - Each provider: own profile dir ~/.openclaw/<provider>-profile/
- - All providers share same proxy port 31337
- - Publish only after full test pass (77+ tests green)
- - All 3 platforms on every release: GitHub + npm + ClawHub
+ ## Current Version: 1.0.0 (npm + ClawHub + GitHub) ✅ RELEASED
+
+ ## All 4 Providers LIVE — Tested 2026-03-11 22:24
+ | Provider | Status | Models | Command |
+ |---|---|---|---|
+ | Grok | ✅ | web-grok/grok-3, grok-3-fast, grok-3-mini, grok-3-mini-fast | /grok-login |
+ | Claude | ✅ | web-claude/claude-sonnet, claude-opus, claude-haiku | /claude-login |
+ | Gemini | ✅ | web-gemini/gemini-2-5-pro, gemini-2-5-flash, gemini-3-pro, gemini-3-flash | /gemini-login |
+ | ChatGPT | ✅ | web-chatgpt/gpt-4o, gpt-4o-mini, gpt-o3, gpt-o4-mini, gpt-5 | /chatgpt-login |
+
+ Live test: "What is the capital of France?"
+ - Grok: "Paris" ✅
+ - Claude: "Paris" ✅
+ - Gemini: "Paris" ✅
+ - ChatGPT: "Paris" ✅
+
+ ## Stats
+ - 22 total models, 16 web-session models
+ - 96/96 tests green (8 test files)
+ - 0 zombie Chromium processes (singleton CDP, cleanupBrowsers on stop)
+ - Cookie-expiry tracking for all 4 providers
+
+ ## Known Issue: Browser persistence after Gateway restart
+ After SIGUSR1/full restart, the OpenClaw browser is gone (CDP ECONNREFUSED).
+ Workaround: manually open the browser + the 4 provider pages; lazy connect takes over.
+ Fix needed: auto-start browser on plugin init, or keep-alive ping.
+
+ ## Next Steps (v1.1.x)
+ - Auto-reconnect OpenClaw browser on plugin start
+ - /status command showing all 4 providers at once
+ - Context-window management for long conversations (new page per conversation)
+ - Handle model switching within chatgpt.com (dropdown selector)
+ - Handle Gemini model switching (2.5 Pro vs Flash vs 3)
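
The "Fix needed: … keep-alive ping" item above could be sketched as below. This is a hypothetical illustration only, not code from the package: `cdpAlive` and `startKeepAlive` are invented names, and only the CDP port 18800 comes from the STATUS notes. The real plugin exposes `connectToOpenClawBrowser(log)` instead.

```typescript
// Hypothetical keep-alive sketch for the "browser gone after Gateway restart"
// issue: poll the CDP HTTP endpoint and invoke a callback once it stops
// answering. Port 18800 is the OpenClaw browser CDP port from the notes above.
const CDP_PORT = 18800;

// True if a Chromium CDP endpoint answers on the given port.
export async function cdpAlive(port: number): Promise<boolean> {
  try {
    const res = await fetch(`http://127.0.0.1:${port}/json/version`);
    return res.ok;
  } catch {
    return false; // connection refused → browser is not running
  }
}

// Poll every `intervalMs`; call `onDead` when the endpoint disappears.
export function startKeepAlive(
  onDead: () => void,
  intervalMs = 30_000,
): ReturnType<typeof setInterval> {
  return setInterval(async () => {
    if (!(await cdpAlive(CDP_PORT))) onDead();
  }, intervalMs);
}
```

On `onDead`, the plugin could relaunch the browser or simply clear its cached contexts and let the lazy-connect workaround take over on the next request.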
package/README.md CHANGED
@@ -2,7 +2,7 @@
 
  > OpenClaw plugin that bridges locally installed AI CLIs (Codex, Gemini, Claude Code) as model providers — with slash commands for instant model switching, restore, health testing, and model listing.
 
- **Current version:** `1.0.0`
+ **Current version:** `1.1.0`
 
  ---
 
@@ -287,12 +287,26 @@ npm test # vitest run (45 tests)
 
  ## Changelog
 
+ ### v1.1.0
+ - **feat:** Auto-connect all providers on startup (no manual login after restart if browser is open)
+ - **feat:** `/bridge-status` — all 4 providers at a glance with expiry info
+ - **fix:** Removed obsolete CLI models: gpt-5.2-codex, gpt-5.3-codex-spark, gpt-5.1-codex-mini, gemini-3-flash-preview
+ - **fix:** Removed duplicate cli-gemini3-flash (was same as gemini-3-flash-preview)
+ - **chore:** Cleaned up CLI_MODEL_COMMANDS (8 models, down from 13)
+
  ## v1.0.0 — Full Headless Browser Bridge 🚀
 
  All four major LLM providers are now available via browser automation.
  No CLI binaries required — just authenticated browser sessions.
 
- ### v1.0.0
+ #### v1.1.0
+ - **feat:** Auto-connect all providers on startup (no manual login after restart if browser is open)
+ - **feat:** `/bridge-status` — all 4 providers at a glance with expiry info
+ - **fix:** Removed obsolete CLI models: gpt-5.2-codex, gpt-5.3-codex-spark, gpt-5.1-codex-mini, gemini-3-flash-preview
+ - **fix:** Removed duplicate cli-gemini3-flash (was same as gemini-3-flash-preview)
+ - **chore:** Cleaned up CLI_MODEL_COMMANDS (8 models, down from 13)
+
+ ## v1.0.0
  - **feat:** `chatgpt-browser.ts` — chatgpt.com DOM-automation (`#prompt-textarea` + `[data-message-author-role]`)
  - **feat:** `web-chatgpt/*` models: gpt-4o, gpt-4o-mini, gpt-o3, gpt-o4-mini, gpt-5
  - **feat:** `/chatgpt-login`, `/chatgpt-status`, `/chatgpt-logout` + cookie-expiry tracking
package/SKILL.md CHANGED
@@ -53,4 +53,4 @@ Each command runs `openclaw models set <model>` atomically and replies with a co
 
  See `README.md` for full configuration reference and architecture diagram.
 
- **Version:** 1.0.0
+ **Version:** 1.1.0
package/index.ts CHANGED
@@ -6,7 +6,7 @@
  *
  * Phase 2 (request bridge): starts a local OpenAI-compatible HTTP proxy server
  * and configures OpenClaw's vllm provider to route through it. Model calls
- * are handled by the Gemini CLI and Claude Code CLI subprocesses.
+ * are routed to CLI tools or browser-session providers.
  *
  * Phase 3 (slash commands): registers /cli-* commands for instant model switching.
  * /cli-sonnet → vllm/cli-claude/claude-sonnet-4-6 (Claude Code CLI proxy)
@@ -15,9 +15,8 @@
  * /cli-gemini → vllm/cli-gemini/gemini-2.5-pro (Gemini CLI proxy)
  * /cli-gemini-flash → vllm/cli-gemini/gemini-2.5-flash (Gemini CLI proxy)
  * /cli-gemini3 → vllm/cli-gemini/gemini-3-pro-preview (Gemini CLI proxy)
- * /cli-gemini3-flash → vllm/cli-gemini/gemini-3-flash-preview (Gemini CLI proxy)
  * /cli-codex → openai-codex/gpt-5.3-codex (Codex CLI OAuth, direct API)
- * /cli-codex-mini → openai-codex/gpt-5.1-codex-mini (Codex CLI OAuth, direct API)
+ * /cli-codex54 → openai-codex/gpt-5.4 (Codex CLI OAuth, direct API)
  * /cli-back → restore model that was active before last /cli-* switch
  * /cli-test [model] → one-shot proxy health check (does NOT switch global model)
  * /cli-list → list all registered CLI bridge models with commands
@@ -405,81 +404,17 @@ function readCurrentModel(): string | null {
  // Phase 3: model command table
  // ──────────────────────────────────────────────────────────────────────────────
  const CLI_MODEL_COMMANDS = [
- // ── Claude (via local proxy → Claude Code CLI) ──────────────────────────────
- {
-   name: "cli-sonnet",
-   model: "vllm/cli-claude/claude-sonnet-4-6",
-   description: "Switch to Claude Sonnet 4.6 (Claude Code CLI via local proxy)",
-   label: "Claude Sonnet 4.6 (CLI)",
- },
- {
-   name: "cli-opus",
-   model: "vllm/cli-claude/claude-opus-4-6",
-   description: "Switch to Claude Opus 4.6 (Claude Code CLI via local proxy)",
-   label: "Claude Opus 4.6 (CLI)",
- },
- {
-   name: "cli-haiku",
-   model: "vllm/cli-claude/claude-haiku-4-5",
-   description: "Switch to Claude Haiku 4.5 (Claude Code CLI via local proxy)",
-   label: "Claude Haiku 4.5 (CLI)",
- },
- // ── Gemini (via local proxy → Gemini CLI) ───────────────────────────────────
- {
-   name: "cli-gemini",
-   model: "vllm/cli-gemini/gemini-2.5-pro",
-   description: "Switch to Gemini 2.5 Pro (Gemini CLI via local proxy)",
-   label: "Gemini 2.5 Pro (CLI)",
- },
- {
-   name: "cli-gemini-flash",
-   model: "vllm/cli-gemini/gemini-2.5-flash",
-   description: "Switch to Gemini 2.5 Flash (Gemini CLI via local proxy)",
-   label: "Gemini 2.5 Flash (CLI)",
- },
- {
-   name: "cli-gemini3",
-   model: "vllm/cli-gemini/gemini-3-pro-preview",
-   description: "Switch to Gemini 3 Pro Preview (Gemini CLI via local proxy)",
-   label: "Gemini 3 Pro Preview (CLI)",
- },
- {
-   name: "cli-gemini3-flash",
-   model: "vllm/cli-gemini/gemini-3-flash-preview",
-   description: "Switch to Gemini 3 Flash Preview (Gemini CLI via local proxy)",
-   label: "Gemini 3 Flash Preview (CLI)",
- },
- // ── Codex (via openai-codex provider — Codex CLI OAuth auth, direct API) ────
- {
-   name: "cli-codex",
-   model: "openai-codex/gpt-5.3-codex",
-   description: "Switch to GPT-5.3 Codex (openai-codex provider, Codex CLI auth)",
-   label: "GPT-5.3 Codex",
- },
- {
-   name: "cli-codex-spark",
-   model: "openai-codex/gpt-5.3-codex-spark",
-   description: "Switch to GPT-5.3 Codex Spark (openai-codex provider, Codex CLI auth)",
-   label: "GPT-5.3 Codex Spark",
- },
- {
-   name: "cli-codex52",
-   model: "openai-codex/gpt-5.2-codex",
-   description: "Switch to GPT-5.2 Codex (openai-codex provider, Codex CLI auth)",
-   label: "GPT-5.2 Codex",
- },
- {
-   name: "cli-codex54",
-   model: "openai-codex/gpt-5.4",
-   description: "Switch to GPT-5.4 (openai-codex provider, Codex CLI auth — may require upgraded scope)",
-   label: "GPT-5.4 (Codex)",
- },
- {
-   name: "cli-codex-mini",
-   model: "openai-codex/gpt-5.1-codex-mini",
-   description: "Switch to GPT-5.1 Codex Mini (openai-codex provider, Codex CLI auth)",
-   label: "GPT-5.1 Codex Mini",
- },
+ // ── Claude Code CLI (via local proxy) ────────────────────────────────────────
+ { name: "cli-sonnet", model: "vllm/cli-claude/claude-sonnet-4-6", description: "Claude Sonnet 4.6 (Claude Code CLI)", label: "Claude Sonnet 4.6 (CLI)" },
+ { name: "cli-opus", model: "vllm/cli-claude/claude-opus-4-6", description: "Claude Opus 4.6 (Claude Code CLI)", label: "Claude Opus 4.6 (CLI)" },
+ { name: "cli-haiku", model: "vllm/cli-claude/claude-haiku-4-5", description: "Claude Haiku 4.5 (Claude Code CLI)", label: "Claude Haiku 4.5 (CLI)" },
+ // ── Gemini CLI (via local proxy) ─────────────────────────────────────────────
+ { name: "cli-gemini", model: "vllm/cli-gemini/gemini-2.5-pro", description: "Gemini 2.5 Pro (Gemini CLI)", label: "Gemini 2.5 Pro (CLI)" },
+ { name: "cli-gemini-flash", model: "vllm/cli-gemini/gemini-2.5-flash", description: "Gemini 2.5 Flash (Gemini CLI)", label: "Gemini 2.5 Flash (CLI)" },
+ { name: "cli-gemini3", model: "vllm/cli-gemini/gemini-3-pro-preview", description: "Gemini 3 Pro (Gemini CLI)", label: "Gemini 3 Pro (CLI)" },
+ // ── Codex CLI (openai-codex provider, OAuth auth) ────────────────────────────
+ { name: "cli-codex", model: "openai-codex/gpt-5.3-codex", description: "GPT-5.3 Codex (Codex CLI auth)", label: "GPT-5.3 Codex" },
+ { name: "cli-codex54", model: "openai-codex/gpt-5.4", description: "GPT-5.4 (Codex CLI auth)", label: "GPT-5.4" },
  ] as const;
 
  /** Default model used by /cli-test when no arg is given */
@@ -678,7 +613,7 @@ function proxyTestRequest(
  const plugin = {
    id: "openclaw-cli-bridge-elvatis",
    name: "OpenClaw CLI Bridge",
-   version: "1.0.0",
+   version: "1.1.0",
    description:
      "Phase 1: openai-codex auth bridge. " +
      "Phase 2: HTTP proxy for gemini/claude CLIs. " +
@@ -697,6 +632,53 @@ const plugin = {
    // ── Grok session restore (non-blocking) ───────────────────────────────────
    void tryRestoreGrokSession(grokSessionPath, (msg) => api.logger.info(msg));
 
+   // ── Auto-connect all browser providers on startup (non-blocking) ──────────
+   void (async () => {
+     // Wait for proxy to start first
+     await new Promise(r => setTimeout(r, 3000));
+     const log = (msg: string) => api.logger.info(msg);
+     const ctx = await connectToOpenClawBrowser(log);
+     if (!ctx) { log("[cli-bridge] startup auto-connect: OpenClaw browser not available"); return; }
+     const pages = ctx.pages().map(p => p.url());
+     log(`[cli-bridge] startup auto-connect: ${pages.length} pages open`);
+
+     // Claude
+     if (pages.some(u => u.includes("claude.ai")) && !claudeContext) {
+       try {
+         const { getOrCreateClaudePage } = await import("./src/claude-browser.js");
+         const { page } = await getOrCreateClaudePage(ctx);
+         if (await page.locator(".ProseMirror").isVisible().catch(() => false)) {
+           claudeContext = ctx;
+           log("[cli-bridge:claude] auto-connected ✅");
+         }
+       } catch { /* not available */ }
+     }
+
+     // Gemini
+     if (pages.some(u => u.includes("gemini.google.com")) && !geminiContext) {
+       try {
+         const { getOrCreateGeminiPage } = await import("./src/gemini-browser.js");
+         const { page } = await getOrCreateGeminiPage(ctx);
+         if (await page.locator(".ql-editor").isVisible().catch(() => false)) {
+           geminiContext = ctx;
+           log("[cli-bridge:gemini] auto-connected ✅");
+         }
+       } catch { /* not available */ }
+     }
+
+     // ChatGPT
+     if (pages.some(u => u.includes("chatgpt.com")) && !chatgptContext) {
+       try {
+         const { getOrCreateChatGPTPage } = await import("./src/chatgpt-browser.js");
+         const { page } = await getOrCreateChatGPTPage(ctx);
+         if (await page.locator("#prompt-textarea").isVisible().catch(() => false)) {
+           chatgptContext = ctx;
+           log("[cli-bridge:chatgpt] auto-connected ✅");
+         }
+       } catch { /* not available */ }
+     }
+   })();
+
    // ── Phase 1: openai-codex auth bridge ─────────────────────────────────────
    if (enableCodex) {
      api.registerProvider({
@@ -1457,6 +1439,93 @@ const plugin = {
        return { text: "✅ Disconnected from chatgpt.com. Run `/chatgpt-login` to reconnect." };
      },
    } satisfies OpenClawPluginCommandDefinition);
+
+   // ── /bridge-status — all providers at a glance ───────────────────────────
+   api.registerCommand({
+     name: "bridge-status",
+     description: "Show status of all headless browser providers (Grok, Claude, Gemini, ChatGPT)",
+     handler: async (): Promise<PluginCommandResult> => {
+       const lines: string[] = [`🌉 *CLI Bridge v${plugin.version} — Provider Status*\n`];
+
+       const checks = [
+         {
+           name: "Grok",
+           ctx: grokContext,
+           check: async () => {
+             if (!grokContext) return false;
+             const r = await verifySession(grokContext, () => {});
+             if (!r.valid) { grokContext = null; }
+             return r.valid;
+           },
+           models: "web-grok/grok-3, grok-3-fast, grok-3-mini",
+           loginCmd: "/grok-login",
+           expiry: () => { const e = loadGrokExpiry(); return e ? formatExpiryInfo(e) : null; },
+         },
+         {
+           name: "Claude",
+           ctx: claudeContext,
+           check: async () => {
+             if (!claudeContext) return false;
+             try {
+               const { getOrCreateClaudePage } = await import("./src/claude-browser.js");
+               const { page } = await getOrCreateClaudePage(claudeContext);
+               return page.locator(".ProseMirror").isVisible().catch(() => false);
+             } catch { claudeContext = null; return false; }
+           },
+           models: "web-claude/claude-sonnet, claude-opus, claude-haiku",
+           loginCmd: "/claude-login",
+           expiry: () => { const e = loadClaudeExpiry(); return e ? formatClaudeExpiry(e) : null; },
+         },
+         {
+           name: "Gemini",
+           ctx: geminiContext,
+           check: async () => {
+             if (!geminiContext) return false;
+             try {
+               const { getOrCreateGeminiPage } = await import("./src/gemini-browser.js");
+               const { page } = await getOrCreateGeminiPage(geminiContext);
+               return page.locator(".ql-editor").isVisible().catch(() => false);
+             } catch { geminiContext = null; return false; }
+           },
+           models: "web-gemini/gemini-2-5-pro, gemini-2-5-flash, gemini-3-pro, gemini-3-flash",
+           loginCmd: "/gemini-login",
+           expiry: () => { const e = loadGeminiExpiry(); return e ? formatGeminiExpiry(e) : null; },
+         },
+         {
+           name: "ChatGPT",
+           ctx: chatgptContext,
+           check: async () => {
+             if (!chatgptContext) return false;
+             try {
+               const { getOrCreateChatGPTPage } = await import("./src/chatgpt-browser.js");
+               const { page } = await getOrCreateChatGPTPage(chatgptContext);
+               return page.locator("#prompt-textarea").isVisible().catch(() => false);
+             } catch { chatgptContext = null; return false; }
+           },
+           models: "web-chatgpt/gpt-4o, gpt-o3, gpt-o4-mini, gpt-5",
+           loginCmd: "/chatgpt-login",
+           expiry: () => { const e = loadChatGPTExpiry(); return e ? formatChatGPTExpiry(e) : null; },
+         },
+       ];
+
+       for (const c of checks) {
+         const active = c.ctx !== null;
+         const ok = active ? await c.check() : false;
+         const expiry = c.expiry();
+         if (ok) {
+           lines.push(`✅ *${c.name}* — active`);
+           if (expiry) lines.push(`   🕐 ${expiry}`);
+           lines.push(`   Models: ${c.models}`);
+         } else {
+           lines.push(`❌ *${c.name}* — not connected (run \`${c.loginCmd}\`)`);
+         }
+         lines.push("");
+       }
+
+       lines.push(`🔌 Proxy: \`127.0.0.1:${port}\` | 22 models total`);
+       return { text: lines.join("\n") };
+     },
+   } satisfies OpenClawPluginCommandDefinition);
    // ─────────────────────────────────────────────────────────────────────────
 
    const allCommands = [
@@ -1476,6 +1545,7 @@ const plugin = {
      "/chatgpt-login",
      "/chatgpt-status",
      "/chatgpt-logout",
+     "/bridge-status",
    ];
    api.logger.info(`[cli-bridge] registered ${allCommands.length} commands: ${allCommands.join(", ")}`);
  },
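
The `CLI_MODEL_COMMANDS` table above maps one slash command to one model string; each handler switches the global model and confirms. The loop below is a minimal, self-contained sketch of that table-driven pattern, not the plugin's actual code: `buildHandlers` and the injected `setModel` callback are hypothetical stand-ins for `api.registerCommand` and the `openclaw models set <model>` call.

```typescript
// Sketch: turning a model-command table into slash-command handlers.
// Two sample rows copied from the CLI_MODEL_COMMANDS diff above.
type ModelCommand = { name: string; model: string; description: string; label: string };

const COMMANDS: readonly ModelCommand[] = [
  { name: "cli-sonnet", model: "vllm/cli-claude/claude-sonnet-4-6", description: "Claude Sonnet 4.6 (Claude Code CLI)", label: "Claude Sonnet 4.6 (CLI)" },
  { name: "cli-codex", model: "openai-codex/gpt-5.3-codex", description: "GPT-5.3 Codex (Codex CLI auth)", label: "GPT-5.3 Codex" },
];

// `setModel` is injected so the switch side effect stays testable;
// in the real plugin this would shell out to `openclaw models set <model>`.
export function buildHandlers(setModel: (model: string) => void) {
  return COMMANDS.map((c) => ({
    name: c.name,
    run: (): string => {
      setModel(c.model);
      return `✅ Switched to ${c.label}`;
    },
  }));
}
```

Keeping the table `as const` (as the diff does) additionally gives each `name` and `model` a literal type, so typos in command names surface at compile time.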
@@ -1,7 +1,7 @@
  {
    "id": "openclaw-cli-bridge-elvatis",
    "name": "OpenClaw CLI Bridge",
-   "version": "1.0.0",
+   "version": "1.1.0",
    "description": "Phase 1: openai-codex auth bridge. Phase 2: local HTTP proxy routing model calls through gemini/claude CLIs (vllm provider).",
    "providers": [
      "openai-codex"
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@elvatis_com/openclaw-cli-bridge-elvatis",
-   "version": "1.0.0",
+   "version": "1.1.0",
    "description": "Bridges gemini, claude, and codex CLI tools as OpenClaw model providers. Reads existing CLI auth without re-login.",
    "type": "module",
    "openclaw": {
@@ -64,42 +64,14 @@ export interface ProxyServerOptions {
 
  /** Available CLI bridge models for GET /v1/models */
  export const CLI_MODELS = [
- {
-   id: "cli-gemini/gemini-2.5-pro",
-   name: "Gemini 2.5 Pro (CLI)",
-   contextWindow: 1_000_000,
-   maxTokens: 8192,
- },
- {
-   id: "cli-gemini/gemini-2.5-flash",
-   name: "Gemini 2.5 Flash (CLI)",
-   contextWindow: 1_000_000,
-   maxTokens: 8192,
- },
- {
-   id: "cli-gemini/gemini-3-pro",
-   name: "Gemini 3 Pro (CLI)",
-   contextWindow: 1_000_000,
-   maxTokens: 8192,
- },
- {
-   id: "cli-claude/claude-opus-4-6",
-   name: "Claude Opus 4.6 (CLI)",
-   contextWindow: 200_000,
-   maxTokens: 8192,
- },
- {
-   id: "cli-claude/claude-sonnet-4-6",
-   name: "Claude Sonnet 4.6 (CLI)",
-   contextWindow: 200_000,
-   maxTokens: 8192,
- },
- {
-   id: "cli-claude/claude-haiku-4-5",
-   name: "Claude Haiku 4.5 (CLI)",
-   contextWindow: 200_000,
-   maxTokens: 8192,
- },
+ // ── Claude Code CLI ───────────────────────────────────────────────────────
+ { id: "cli-claude/claude-sonnet-4-6", name: "Claude Sonnet 4.6 (CLI)", contextWindow: 200_000, maxTokens: 8_192 },
+ { id: "cli-claude/claude-opus-4-6", name: "Claude Opus 4.6 (CLI)", contextWindow: 200_000, maxTokens: 8_192 },
+ { id: "cli-claude/claude-haiku-4-5", name: "Claude Haiku 4.5 (CLI)", contextWindow: 200_000, maxTokens: 8_192 },
+ // ── Gemini CLI ────────────────────────────────────────────────────────────
+ { id: "cli-gemini/gemini-2.5-pro", name: "Gemini 2.5 Pro (CLI)", contextWindow: 1_000_000, maxTokens: 8_192 },
+ { id: "cli-gemini/gemini-2.5-flash", name: "Gemini 2.5 Flash (CLI)", contextWindow: 1_000_000, maxTokens: 8_192 },
+ { id: "cli-gemini/gemini-3-pro-preview", name: "Gemini 3 Pro (CLI)", contextWindow: 1_000_000, maxTokens: 8_192 },
  // Grok web-session models (requires /grok-login)
  { id: "web-grok/grok-3", name: "Grok 3 (web session)", contextWindow: 131_072, maxTokens: 131_072 },
  { id: "web-grok/grok-3-fast", name: "Grok 3 Fast (web session)", contextWindow: 131_072, maxTokens: 131_072 },
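
The comment above says `CLI_MODELS` backs `GET /v1/models` on the proxy. A sketch of how such a list could be serialized into the OpenAI-style models-list shape (`object: "list"` wrapping `{ id, object: "model" }` entries); `modelsListBody` and the `owned_by` value are assumptions for illustration, not the package's actual serializer.

```typescript
// Sketch: build an OpenAI-compatible GET /v1/models response body from a
// model table. Two sample entries copied from the CLI_MODELS diff above.
type CliModel = { id: string; name: string; contextWindow: number; maxTokens: number };

const MODELS: readonly CliModel[] = [
  { id: "cli-claude/claude-sonnet-4-6", name: "Claude Sonnet 4.6 (CLI)", contextWindow: 200_000, maxTokens: 8_192 },
  { id: "web-grok/grok-3", name: "Grok 3 (web session)", contextWindow: 131_072, maxTokens: 131_072 },
];

export function modelsListBody(models: readonly CliModel[]) {
  return {
    object: "list",
    // Each entry follows the OpenAI models-list item shape;
    // "cli-bridge" as owned_by is an arbitrary placeholder.
    data: models.map((m) => ({ id: m.id, object: "model", owned_by: "cli-bridge" })),
  };
}
```

Clients that already speak the OpenAI API (including OpenClaw's vllm provider, per the index.ts header comment) can then discover the bridged models without any bridge-specific code.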