agentgui 1.0.855 → 1.0.856

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CLAUDE.md CHANGED
@@ -156,24 +156,11 @@ static/vendor/ Third-party assets (highlight.js, Prism, Ripple
 
  ## XState State Machines
 
- XState v5 machines are authoritative for their respective state domains. Ad-hoc Maps/Sets/booleans they replaced have been deleted.
+ XState v5 machines own their domains exclusively. No ad-hoc Maps/Sets parallel to machines.
 
- **Server machines** (ESM, `lib/`):
- - `execution-machine.js`: One actor per conversation. States: idle → streaming → draining (queue drain) → streaming → idle. Also rate_limited state. `execMachine.send(convId, event)` API. `conv.get` and `conv.full` WS responses include `executionState` field.
- - `acp-server-machine.js`: One actor per ACP tool (opencode/kilo/codex). States: stopped → starting → running ↔ crashed → restarting. Used by `acp-sdk-manager.js` to track health and drive restart backoff.
- - `tool-install-machine.js`: One actor per tool ID. States: unchecked → checking → idle/installed/needs_update/installing/updating/failed. Replaces `installLocks` Map in `tool-spawner.js`. Events: CHECK_START, IDLE, INSTALLED, NEEDS_UPDATE, INSTALL_START, INSTALL_COMPLETE, UPDATE_START, UPDATE_COMPLETE, FAILED. API: `getOrCreate(toolId)`, `send(toolId, event)`, `isLocked(toolId)`, `getMachineActors()`. Context: version, error, installedAt, lastCheckedAt. `GET /api/debug/machines` returns all snapshots when `DEBUG=1`.
+ **Server** (lib/): `execution-machine` (per conversation: idle/streaming/draining/rate_limited), `acp-server-machine` (per tool: stopped/starting/running/crashed/restarting), `tool-install-machine` (per tool: unchecked→checking→idle/installed/needs_update/installing/updating/failed). API: `send(id, event)`, `isLocked()`, snapshots at `GET /api/debug/machines` when DEBUG=1.
 
- **Client machines** (browser UMD, `static/js/`):
- - `ws-machine.js`: Wraps WebSocketManager. States: disconnected/connecting/connected/reconnecting. Actor accessible as `wsManager._wsActor`. State readable via `wsManager.connectionState`.
- - `conv-machine.js`: One actor per conversation. States: idle/streaming/queued. API exposed as `window.convMachineAPI`. All actors in `window.__convMachines` Map for debug.
- - `tool-install-machine.js`: One actor per tool ID. States: idle/installing/installed/updating/needs_update/failed. Replaces `operationInProgress` Set in `tools-manager.js`. Context: version, error, progress, installedVersion, publishedVersion. API: `window.toolInstallMachineAPI`. Actors in `window.__toolInstallMachines`.
- - `voice-machine.js`: Single actor for TTS playback. States: idle/queued/speaking/disabled. Replaces isSpeaking, isPlayingChunk, ttsDisabledUntilReset booleans and ttsConsecutiveFailures counter in `voice.js`. Circuit-breaker trips at 3 consecutive failures (disabled state, RESET to recover). API: `window.voiceMachineAPI`. Actor at `window.__voiceMachine`.
- - `conv-list-machine.js`: Single actor for conversation list. States: unloaded/loading/loaded/error. Context: conversations[], activeId, streamingIds[], version, lastPollAt. Replaces `_conversationVersion`, `_lastMutationSource`, `streamingConversations` Set in `ConversationManager`. All list mutations go through machine events. API: `window.convListMachineAPI`. Actor at `window.__convListMachine`.
- - `prompt-machine.js`: Single actor for prompt area. States: ready/loading/streaming/queued/disabled. Replaces dead `_promptState` string and `_promptStateTransitions` object in `client.js`. Driven by `enableControls()`, `disableControls()`, `handleStreamingStart()`, `handleStreamingComplete()`, `handleStreamingError()`. API: `window.promptMachineAPI`. Actor at `window.__promptMachine`.
-
- **XState browser loading**: UMD bundle at `static/lib/xstate.umd.min.js` (copied from `node_modules/xstate/dist/xstate.umd.min.js` during npm install). Loaded as `defer` script. Load order: xstate.umd.min.js → ws-machine.js → conv-machine.js → tool-install-machine.js → voice-machine.js → conv-list-machine.js → prompt-machine.js → all other app scripts. Exposes `window.XState` global.
-
- **Authoritative pattern**: Each machine owns its domain exclusively. No parallel ad-hoc state alongside machines. `window.__*` globals expose all client actors for debug inspection.
+ **Client** (static/js/, UMD): `ws-machine` (disconnected/connecting/connected/reconnecting), `conv-machine` (per conv: idle/streaming/queued), `tool-install-machine` (per tool), `voice-machine` (single: idle/queued/speaking/disabled circuit-breaker), `conv-list-machine` (single: unloaded/loading/loaded/error), `prompt-machine` (single: ready/loading/streaming/queued/disabled). Load order: xstate.umd.min.js → ws-machine → conv-machine → tool-install-machine → voice-machine → conv-list-machine → prompt-machine. Exposed at `window.__*` globals for debug.
 
  ## Key Details
 
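The "one actor per id, `send(id, event)`" ownership pattern in the machine docs above can be sketched with a plain transition table. This is an illustration only, not the real `execution-machine.js`: the package uses XState v5 actors, and the transitions and event names here are simplified assumptions drawn from the state lists in the doc.

```javascript
// Illustrative sketch of the per-conversation machine registry pattern.
// State and event names are assumptions based on the doc's state lists.
const TRANSITIONS = {
  idle: { START: 'streaming' },
  streaming: { QUEUE: 'draining', COMPLETE: 'idle', RATE_LIMIT: 'rate_limited' },
  draining: { DRAINED: 'streaming' },
  rate_limited: { RESET: 'idle' },
};

const actors = new Map(); // convId -> { state }

function send(convId, event) {
  if (!actors.has(convId)) actors.set(convId, { state: 'idle' });
  const actor = actors.get(convId);
  const next = TRANSITIONS[actor.state]?.[event.type];
  if (next) actor.state = next; // unhandled events are ignored, as in XState
  return actor.state;
}
```

The point of the pattern is that callers never mutate per-conversation state directly; they only send events keyed by id, so the machine stays the single authority.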
@@ -207,45 +194,21 @@ Managed by `lib/acp-sdk-manager.js`. Features: crash restart with exponential ba
 
  ## REST API
 
- All routes are prefixed with `BASE_URL` (default `/gm`).
-
- - `GET /api/conversations` - List conversations
- - `POST /api/conversations` - Create conversation (body: agentId, title, workingDirectory)
- - `GET /api/conversations/:id` - Get conversation with streaming status
- - `POST /api/conversations/:id` - Update conversation
- - `DELETE /api/conversations/:id` - Delete conversation
- - `POST /api/conversations/:id/archive` - Archive conversation (soft-delete)
- - `POST /api/conversations/:id/restore` - Restore archived conversation
- - `GET /api/conversations/archived` - List archived conversations
- - `GET /api/conversations/:id/messages` - Get messages (query: limit, offset)
- - `POST /api/conversations/:id/messages` - Send message (body: content, agentId)
- - `POST /api/conversations/:id/stream` - Start streaming execution
- - `GET /api/conversations/:id/full` - Full conversation load with chunks
- - `GET /api/conversations/:id/chunks` - Get stream chunks (query: since)
- - `GET /api/conversations/:id/sessions/latest` - Get latest session
- - `GET /api/sessions/:id` - Get session
- - `GET /api/sessions/:id/chunks` - Get session chunks (query: since)
- - `GET /api/sessions/:id/execution` - Get execution events (query: limit, offset, filterType)
- - `GET /api/agents` - List discovered agents
- - `GET /api/acp/status` - ACP tool lifecycle status (ports, health, PIDs, restart counts)
- - `GET /api/health` - Server health check (version, uptime, agents, wsClients, memory, acp status)
- - `GET /api/home` - Get home directory
- - `POST /api/stt` - Speech-to-text (raw audio body)
- - `POST /api/tts` - Text-to-speech (body: text)
- - `GET /api/speech-status` - Speech model loading status
- - `POST /api/folders` - Create folder
- - `GET /api/tools` - List detected tools with installation status (via WebSocket tools.list handler)
- - `GET /api/tools/:id/status` - Get tool installation status (version, installed_at, error_message)
- - `POST /api/tools/:id/install` - Start tool installation (returns `{ success: true }` with background async install)
- - `POST /api/tools/:id/update` - Start tool update (body: targetVersion)
- - `GET /api/tools/:id/history` - Get tool install/update history (query: limit, offset)
- - `POST /api/tools/update` - Batch update all tools with available updates
- - `POST /api/tools/refresh-all` - Refresh all tool statuses from package manager
- - `POST /api/codex-oauth/start` - Start Codex CLI OAuth flow (returns `{ authUrl, mode }`)
- - `GET /api/codex-oauth/status` - Get current Codex OAuth state `{ status, email, error }`
- - `POST /api/codex-oauth/relay` - Relay OAuth code+state from remote browser (body: `{ code, state }`)
- - `POST /api/codex-oauth/complete` - Complete OAuth by pasting redirect URL (body: `{ url }`)
- - `GET /codex-oauth2callback` - OAuth callback endpoint (redirect_uri for local flows)
+ All routes prefixed with `BASE_URL` (default `/gm`). Key endpoints:
+
+ **Conversations**: `GET /api/conversations`, `POST /api/conversations`, `GET/POST/DELETE /api/conversations/:id`, `POST /api/conversations/:id/archive`, `POST /api/conversations/:id/restore`, `GET /api/conversations/:id/messages`, `POST /api/conversations/:id/messages`, `POST /api/conversations/:id/stream`, `GET /api/conversations/:id/full`, `GET /api/conversations/:id/chunks`, `GET /api/conversations/:id/sessions/latest`
+
+ **Sessions**: `GET /api/sessions/:id`, `GET /api/sessions/:id/chunks`, `GET /api/sessions/:id/execution`
+
+ **Agents & ACP**: `GET /api/agents`, `GET /api/acp/status`, `GET /api/health`
+
+ **Speech**: `POST /api/stt`, `POST /api/tts`, `GET /api/speech-status`
+
+ **Tools**: `GET /api/tools`, `GET /api/tools/:id/status`, `POST /api/tools/:id/install`, `POST /api/tools/:id/update`, `GET /api/tools/:id/history`, `POST /api/tools/update`, `POST /api/tools/refresh-all`
+
+ **OAuth**: `POST /api/codex-oauth/start`, `GET /api/codex-oauth/status`, `POST /api/codex-oauth/relay`, `POST /api/codex-oauth/complete`, `GET /codex-oauth2callback`
+
+ **Utility**: `POST /api/folders`, `GET /api/home`
 
  ## Tool Update System
@@ -273,24 +236,13 @@ Tool updates are managed through a complete pipeline:
 
  ## Tool Detection System
 
- TOOLS array in `lib/tool-manager.js` has two categories:
- - **`cli`**: `{ id, name, pkg, category: 'cli' }` — detected via `which <bin>` + `<bin> --version`
- - **`plugin`**: `{ id, name, pkg, installPkg, pluginId, category: 'plugin', frameWork }` — detected via plugin.json files
-
- Current tools:
- - `cli-claude`: bin=`claude`, pkg=`@anthropic-ai/claude-code`
- - `cli-opencode`: bin=`opencode`, pkg=`opencode-ai`
- - `cli-gemini`: bin=`gemini`, pkg=`@google/gemini-cli`
- - `cli-kilo`: bin=`kilo`, pkg=`@kilocode/cli`
- - `cli-codex`: bin=`codex`, pkg=`@openai/codex`
- - `cli-agent-browser`: bin=`agent-browser`, pkg=`agent-browser` — uses `-V` flag (not `--version`) for version detection
- - `gm-cc`, `gm-oc`, `gm-gc`, `gm-kilo`, `gm-codex`: plugin tools
+ **TOOLS** array in `lib/tool-manager.js`: cli (via which + --version) or plugin (via plugin.json). Current: claude, opencode, gemini, kilo, codex, agent-browser (uses `-V`, not `--version`), + plugin tools (gm-cc, gm-oc, gm-gc, gm-kilo, gm-codex).
 
- **BIN_MAP gotcha:** `lib/tool-version-check.js` has a single `BIN_MAP` constant shared by `checkCliInstalled()` and `getCliVersion()`. Any new CLI tool must be added there. `agent-browser` uses `-V` (not `--version`) — a `versionFlag` override handles this.
+ **BIN_MAP**: Single constant in `lib/tool-version-check.js` shared by detect + version functions; new CLI tools must be added.
 
- **Framework paths:** `lib/tool-version-check.js` uses a `FRAMEWORK_PATHS` data table instead of per-framework if/else chains. Each framework entry defines pluginDir, versionFile, parseVersion, and optional markerFile/fallbackInstalled. Adding a new framework means adding one entry to this table.
+ **FRAMEWORK_PATHS**: Data table (pluginDir/versionFile/parseVersion/optional markerFile). New framework = one table entry.
 
- **Background provisioning:** `autoProvision()` runs at startup, checks/installs missing tools (~10s). `startPeriodicUpdateCheck()` runs every 6 hours in background to check for updates. Both broadcast tool status via WebSocket so UI stays in sync.
+ **Provisioning**: `autoProvision()` at startup (~10s), `startPeriodicUpdateCheck()` every 6h. Both broadcast tool status via WS.
 
  ### Tool Installation and Update UI Flow
 
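The `FRAMEWORK_PATHS` data-table pattern described above (one entry per framework instead of per-framework if/else chains) might look roughly like this. The entry fields come from the doc; the concrete paths and parser are hypothetical:

```javascript
// Hypothetical shape of a FRAMEWORK_PATHS-style data table; the real table in
// lib/tool-version-check.js may differ in entries, paths, and field names.
const FRAMEWORK_PATHS = {
  'claude-code': {
    pluginDir: '.claude/plugins',        // assumed path
    versionFile: 'plugin.json',
    parseVersion: (raw) => JSON.parse(raw).version ?? null,
  },
};

// Adding a new framework means adding one table entry — no new branches here.
function detectPluginVersion(framework, rawVersionFileContents) {
  const entry = FRAMEWORK_PATHS[framework];
  if (!entry) return null;
  return entry.parseVersion(rawVersionFileContents);
}
```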
@@ -302,49 +254,11 @@ When user clicks Install/Update button on a tool:
 
  ## WebSocket Protocol
 
- Endpoint: `BASE_URL + /sync`
-
- **Wire format (msgpack binary):**
- - Client RPC request: `{ r: requestId, m: method, p: params }`
- - Server RPC reply: `{ r: requestId, d: data }` or `{ r: requestId, e: { c: code, m: message } }`
- - Server push/broadcast: `{ type, seq, ...data }` or array of these when batched
-
- **Legacy control messages** (bypass RPC router, handled in `onLegacy`): `subscribe`, `unsubscribe`, `ping`, `latency_report`, `terminal_*`, `pm2_*`, `set_voice`, `get_subscriptions`
-
- Client sends:
- - `{ type: "subscribe", sessionId }` or `{ type: "subscribe", conversationId }`
- - `{ type: "unsubscribe", sessionId }`
- - `{ type: "ping" }`
-
- Server broadcasts:
- - `streaming_start` - Agent execution started (high priority, flushes immediately)
- - `streaming_progress` - New event/chunk from agent (normal priority, batched)
- - `streaming_complete` - Execution finished (high priority)
- - `streaming_error` - Execution failed (high priority)
- - `message_created` - New message (high priority, flushes immediately)
- - `conversation_created`, `conversation_updated`, `conversation_deleted`
- - `all_conversations_deleted` - Must be in BROADCAST_TYPES set
- - `model_download_progress` - Voice model download progress
- - `voice_list` - Available TTS voices
-
- **WSOptimizer** (`lib/ws-optimizer.js`): Per-client priority queue. High-priority events flush immediately; normal/low batch by latency tier (16ms excellent → 200ms bad). Rate limit: 100 msg/sec — overflow is re-queued (not dropped). No `lastKey` deduplication (was removed — caused valid event drops).
-
- ### WS RPC Methods (86 total)
-
- **agent:** `agent.auth`, `agent.authstat`, `agent.desc`, `agent.get`, `agent.ls`, `agent.models`, `agent.search`, `agent.subagents`, `agent.update`
- **auth:** `auth.configs`, `auth.save`
- **codex:** `codex.complete`, `codex.relay`, `codex.start`, `codex.status`
- **conv:** `conv.cancel`, `conv.chunks`, `conv.chunks.earlier`, `conv.del`, `conv.del.all`, `conv.export`, `conv.full`, `conv.get`, `conv.import`, `conv.inject`, `conv.ls`, `conv.new`, `conv.prune`, `conv.run-script`, `conv.scripts`, `conv.search`, `conv.steer`, `conv.stop-script`, `conv.sync`, `conv.tags`, `conv.upd`
- **gemini:** `gemini.complete`, `gemini.relay`, `gemini.start`, `gemini.status`
- **git:** `git.check`, `git.push`
- **msg:** `msg.get`, `msg.ls`, `msg.ls.earlier`, `msg.send`, `msg.stream`
- **q:** `q.del`, `q.ls`, `q.upd`
- **run:** `run.cancel`, `run.del`, `run.get`, `run.new`, `run.resume`, `run.search`, `run.stream`, `run.stream.get`, `run.wait`
- **sess:** `sess.chunks`, `sess.exec`, `sess.get`, `sess.latest`
- **speech:** `speech.download`, `speech.status`
- **thread:** `thread.copy`, `thread.del`, `thread.get`, `thread.history`, `thread.new`, `thread.run.cancel`, `thread.run.steer`, `thread.run.stream`, `thread.run.stream.get`, `thread.search`, `thread.upd`
- **tools:** `tools.list`
- **util:** `clone`, `discover.claude`, `folders`, `home`, `import.claude`, `voice.cache`, `voice.generate`, `voices`, `ws.stats`
+ Endpoint: `BASE_URL + /sync`. Msgpack binary. Wire: RPC request `{r, m, p}`, reply `{r, d}` or `{r, e}`, broadcast `{type, seq, ...}` batched by `WSOptimizer`. Per-client priority queue: high-priority (streaming_start, message_created, streaming_complete) flush immediately; normal/low batch by latency tier. Rate limit: 100 msg/sec (re-queued if overflow).
+
+ **Legacy messages** (onLegacy): subscribe/unsubscribe/ping/latency_report/terminal_*/pm2_*/set_voice/get_subscriptions
+
+ **RPC methods** (86 total by category): agent (auth/authstat/desc/get/ls/models/search/subagents/update), auth (configs/save), codex (start/status/relay/complete), conv (ls/new/get/upd/del/cancel/chunks/full/steer/inject/search/prune/scripts/run-script), gemini (start/status/relay/complete), git (check/push), msg (send/stream/get/ls), q (ls/upd/del), run (new/stream/get/wait/cancel/search/resume), sess (get/latest/chunks/exec), speech (download/status), thread (new/get/upd/del/search/copy/history/run.stream/run.cancel/run.steer), tools (list), util (home/folders/clone/voices/voice.cache/voice.generate/ws.stats/discover.claude/import.claude)
 
  ## Steering
 
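The RPC envelope shapes in the WebSocket protocol above can be sketched as plain objects. Note this omits the msgpack layer: the real client encodes these envelopes as msgpack binary before sending over `BASE_URL + /sync`, and the request-id scheme here is an assumption.

```javascript
// Sketch of the {r, m, p} / {r, d} / {r, e} RPC envelope shapes from the doc.
// Monotonic request ids are an assumed convention for matching replies.
let nextId = 0;

function rpcRequest(method, params) {
  return { r: ++nextId, m: method, p: params };
}

// A reply carries the same `r` plus either `d` (data) or `e` ({c: code, m: message}).
function isReply(msg, req) {
  return msg.r === req.r && ('d' in msg || 'e' in msg);
}
```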
@@ -367,18 +281,7 @@ Three parallel state stores (must stay in sync):
 
  ## Message Flow
 
- 1. User sends → `startExecution()` checks `streamingConversations.has(convId)`
- 2. If NOT streaming: show optimistic "User" message in UI
- 3. If streaming: skip optimistic (will queue server-side)
- 4. Send via RPC `msg.stream` → backend creates message + broadcasts `message_created`
- 5. Backend checks `activeExecutions.has(convId)`:
-    - YES: queues, returns `{ queued: true }`, broadcasts `queue_status`
-    - NO: executes, returns `{ session }`
- 6. Queue items render as yellow control blocks in `queue-indicator` div
- 7. `message_created` only broadcast for non-queued messages (ws-handlers-conv.js)
- 8. When queued message executes: becomes regular user message, queue-indicator updates
-
- **Streaming session blocks:** `handleStreamingComplete()` removes `.event-streaming-start` and `.event-streaming-complete` DOM blocks to prevent accumulation in long conversations.
+ User send → check if streaming → (streaming: queue server-side, skip optimistic; else: show optimistic message) → RPC msg.stream → backend checks activeExecutions.has(convId) → (yes: queue, broadcast queue_status; no: execute, return session) → broadcast message_created (non-queued only). Queue renders as yellow blocks. On complete, remove .event-streaming-* DOM blocks.
 
  ## Conversations Sidebar
 
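The server-side branch in the message flow above (queue when an execution is already active, otherwise execute) can be sketched as a small decision function. The function and callback names are illustrative, not the package's actual handler:

```javascript
// Sketch of the msg.stream queue-or-execute decision from the doc.
// `activeExecutions` is a Map keyed by conversation id, as described above.
function handleMsgStream(conversationId, activeExecutions, queue, execute) {
  if (activeExecutions.has(conversationId)) {
    queue(conversationId);
    return { queued: true };   // client skips the optimistic message
  }
  execute(conversationId);
  return { queued: false };    // message_created is broadcast for this path
}
```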
@@ -405,49 +308,15 @@ MIME type priority: `event.media_type` → magic-byte detection (PNG/JPEG/WebP/G
 
  ## Voice Model Download
 
- Speech models (~470MB total) are downloaded automatically on server startup. No credentials required.
-
- ### Download Sources (fallback chain)
- 1. **GitHub LFS** (primary): `https://github.com/AnEntrypoint/models`
- 2. **HuggingFace** (fallback): `onnx-community/whisper-base` for STT, `AnEntrypoint/sttttsmodels` for TTS
-
- ### Models
- - **Whisper Base** (~280MB): encoder + decoder ONNX models, tokenizer, config files
- - **TTS Models** (~190MB): mimi encoder/decoder, flow_lm, text_conditioner, tokenizer
-
- ### UI Behavior
- - Voice tab hidden until models ready; circular progress indicator in header during download
- - Model status broadcast via WebSocket `model_download_progress` events
- - Cache location: `~/.gmgui/models/`
-
- ## Performance Notes
-
- - **Static asset serving:** gzip-only (no brotli — too slow for payloads this size). Pre-compressed once on first request, cached in `_assetCache` Map (etag → `{ raw, gz }`). HTML cached as `_htmlCache` after first request, invalidated on hot-reload.
- - **`/api/conversations` N+1 fix:** Uses `getActiveSessionConversationIds()` (single `DISTINCT` query) instead of per-conversation `getSessionsByStatus()` calls.
- - **`conv.chunks` since-filter:** Pushed to DB via `getConversationChunksSince(convId, since)` — no JS array filter on full chunk set.
- - **Client init:** `loadAgents()`, `loadConversations()`, `checkSpeechStatus()` run in parallel via `Promise.all()`.
- - **`perMessageDeflate: false`** on WebSocket server — msgpack binary doesn't compress well, and zlib was blocking the event loop on every streaming_progress send.
-
- ## Codex CLI OAuth
-
- OpenAI Codex CLI uses PKCE authorization code flow against `https://auth.openai.com`.
+ Models (~470MB: Whisper Base ~280MB + TTS ~190MB) downloaded at startup from GitHub LFS or HuggingFace (fallback). UI: voice tab hidden until ready; progress indicator in header; `model_download_progress` WS broadcast. Cache: `~/.gmgui/models/`.
 
- **Flow:**
- 1. `POST /api/codex-oauth/start` generates PKCE (SHA-256 S256 challenge), CSRF state, returns `authUrl`
- 2. User opens `authUrl` in browser and authenticates via OpenAI/ChatGPT
- 3. **Local**: Browser redirects to `http://localhost:1455/auth/callback` — but since agentgui's server is on a different port, the redirect goes to `GET /codex-oauth2callback` (agentgui intercepts via matching route). Token exchange happens server-side.
- 4. **Remote**: Redirect goes to `/codex-oauth2callback` which serves a relay page. Relay POSTs `{ code, state }` to `/api/codex-oauth/relay`. Token exchange happens on the server.
- 5. Tokens saved to `$CODEX_HOME/auth.json` (default: `~/.codex/auth.json`) as `{ auth_mode: "chatgpt", tokens: { id_token, access_token, refresh_token }, last_refresh }`
+ ## Performance & Observability
 
- **Constants (in server.js):**
- - Issuer: `https://auth.openai.com`
- - Client ID: `app_EMoamEEZ73f0CkXaXp7hrann`
- - Scopes: `openid profile email offline_access api.connectors.read api.connectors.invoke`
- - Redirect URI (local): `http://localhost:1455/auth/callback` (actual callback goes to agentgui's `/codex-oauth2callback`)
+ **Asset serving**: gzip only (no brotli), pre-compressed once, cached in `_assetCache` (etag-keyed). HTML cached, invalidated on hot-reload. **/api/conversations**: single `DISTINCT` query (not N+1). **Chunks**: `getConversationChunksSince()` pushes filter to DB. **Client init**: loadAgents/loadConversations/checkSpeechStatus parallel. **WS**: perMessageDeflate: false (msgpack + zlib blocked event loop).
 
- **WebSocket handlers** (in `lib/ws-handlers-util.js`): `codex.start`, `codex.status`, `codex.relay`, `codex.complete`
+ **Debug API** (`DEBUG=1`): `/api/debug/machines` snapshots, `/api/debug/state` inspection, `/api/debug/ws-stats` latency. Browser: `window.__debug.getSyncState()` exposes all XState machines.
 
- **Agent auth**: `POST /api/agents/codex/auth` starts OAuth flow same as Gemini; broadcasts `script_started`/`script_output`/`script_stopped` events as OAuth progresses.
+ PKCE S256 flow vs auth.openai.com. `POST /api/codex-oauth/start` → authUrl. User authenticates → redirect to `/codex-oauth2callback` (local: intercepts localhost:1455/auth/callback; remote: relay page POSTs to `/api/codex-oauth/relay`). Tokens saved to `$CODEX_HOME/auth.json`. WS handlers: codex.start/status/relay/complete.
 
  ## ACP SDK Integration
 
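The S256 challenge step of the PKCE flow described above is well defined (RFC 7636): the challenge is the base64url-encoded SHA-256 of the verifier. A minimal sketch (the helper name and verifier length are assumptions, not the package's actual code):

```javascript
// Sketch of PKCE S256 challenge generation (RFC 7636):
// code_challenge = base64url(sha256(code_verifier)).
import { createHash, randomBytes } from 'node:crypto';

function pkcePair() {
  // 32 random bytes -> 43-char base64url verifier (length is an assumption)
  const verifier = randomBytes(32).toString('base64url');
  const challenge = createHash('sha256').update(verifier).digest('base64url');
  return { verifier, challenge };
}
```

The server keeps the verifier, sends the challenge in the authorize URL, and presents the verifier during the token exchange.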
@@ -1,10 +1,19 @@
  import path from 'path';
+ <<<<<<< HEAD
+
+ export class JsonlParser {
+   constructor({ broadcastSync, queries, ownedSessionIds }) {
+     this._bc = broadcastSync;
+     this._q = queries;
+     this._owned = ownedSessionIds;
+ =======
  import fs from 'fs';
 
  export class JsonlParser {
    constructor({ broadcastSync, queries }) {
      this._bc = broadcastSync;
      this._q = queries;
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
      this._convMap = new Map();
      this._emitted = new Map();
      this._seqs = new Map();
@@ -12,6 +21,8 @@ export class JsonlParser {
      this._sessions = new Map();
    }
 
+ <<<<<<< HEAD
+ =======
    /**
     * Pre-register a GUI-spawned session so _conv finds the right conversation
     * and _dbSession reuses the existing session ID instead of creating a new one.
@@ -23,6 +34,7 @@ export class JsonlParser {
      if (dbSessionId) this._sessions.set(claudeSessionId, dbSessionId);
    }
 
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
    clear() {
      this._convMap.clear();
      this._emitted.clear();
@@ -43,12 +55,28 @@ export class JsonlParser {
      for (const sid of [...this._streaming]) this._endStreaming(this._convMap.get(sid), sid);
    }
 
+ <<<<<<< HEAD
+   _line(fp, line) {
+     line = line.trim(); if (!line) return;
+     let e; try { e = JSON.parse(line); } catch (_) { return; }
+     if (!e || !e.sessionId) return;
+     if (this._owned?.has(e.sessionId)) return;
+     const cid = this._conv(e.sessionId, e, fp);
+     if (cid) this._route(cid, e.sessionId, e);
+   }
+
+   _conv(sid, e) {
+ =======
    _conv(sid, e, fp) {
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
      if (this._convMap.has(sid)) return this._convMap.get(sid);
      const found = this._q.getConversations().find(c => c.claudeSessionId === sid);
      if (found) { this._convMap.set(sid, found.id); return found.id; }
      if (e.type === 'queue-operation' || e.type === 'last-prompt') return null;
      if (e.type === 'user' && e.isMeta) return null;
+ <<<<<<< HEAD
+     const cwd = e.cwd || process.cwd();
+ =======
 
      // Resolve workingDirectory: event cwd → sessions-index.json → decoded path
      let cwd = e.cwd || null;
@@ -67,6 +95,7 @@ export class JsonlParser {
      }
      cwd = cwd || process.cwd();
 
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
      const branch = e.gitBranch || '';
      const base = path.basename(cwd);
      const title = branch ? `${branch} @ ${base}` : base;
@@ -3,6 +3,11 @@ import { JsonlWatcher as CCFWatcher } from 'ccfollow';
  import { JsonlParser } from './jsonl-parser.js';
 
  export class JsonlWatcher extends CCFWatcher {
+ <<<<<<< HEAD
+   constructor({ broadcastSync, queries, ownedSessionIds }) {
+     super();
+     this._parser = new JsonlParser({ broadcastSync, queries, ownedSessionIds });
+ =======
    constructor({ broadcastSync, queries }) {
      super();
      this._parser = new JsonlParser({ broadcastSync, queries });
@@ -14,6 +19,7 @@ export class JsonlWatcher extends CCFWatcher {
      this._currentFp = fp;
      super._read(fp);
      this._currentFp = null;
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
    }
 
    _line(line) {
@@ -22,6 +28,12 @@ export class JsonlWatcher extends CCFWatcher {
      let e;
      try { e = JSON.parse(line); } catch (_) { return; }
      if (!e || !e.sessionId) return;
+ <<<<<<< HEAD
+     const cid = this._parser._conv(e.sessionId, e);
+     if (cid) this._parser._route(cid, e.sessionId, e);
+   }
+
+ =======
      const cid = this._parser._conv(e.sessionId, e, this._currentFp);
      if (cid) this._parser._route(cid, e.sessionId, e);
    }
@@ -35,6 +47,7 @@ export class JsonlWatcher extends CCFWatcher {
      this._parser.registerSession(claudeSessionId, convId, dbSessionId);
    }
 
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
    stop() {
      super.stop();
      this._parser.endAllStreaming();
@@ -1,4 +1,8 @@
+ <<<<<<< HEAD
+ export function createProcessMessage({ queries, activeExecutions, rateLimitState, execMachine, broadcastSync, runClaudeWithStreaming, cleanupExecution, checkpointManager, discoveredAgents, ownedSessionIds, STARTUP_CWD, buildSystemPrompt, parseRateLimitResetTime, eagerTTS, touchACP, createChunkBatcher, debugLog, logError, scheduleRetry, drainMessageQueue, createEventHandler }) {
+ =======
  export function createProcessMessage({ queries, activeExecutions, rateLimitState, execMachine, broadcastSync, runClaudeWithStreaming, cleanupExecution, checkpointManager, discoveredAgents, STARTUP_CWD, buildSystemPrompt, parseRateLimitResetTime, eagerTTS, touchACP, getJsonlWatcher, debugLog, logError, scheduleRetry, drainMessageQueue, createEventHandler }) {
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
    async function processMessageWithStreaming(conversationId, messageId, sessionId, content, agentId, model, subAgent) {
      const startTime = Date.now();
      touchACP(agentId);
@@ -27,10 +31,19 @@ export function createProcessMessage({ queries, activeExecutions, rateLimitState
      execMachine.send(conversationId, { type: 'START', sessionId });
      queries.setIsStreaming(conversationId, true);
      queries.updateSession(sessionId, { status: 'active' });
+ <<<<<<< HEAD
+     const batcher = createChunkBatcher(queries, debugLog);
+     const cwd = conv?.workingDirectory || STARTUP_CWD;
+     const allBlocksRef = { val: [] };
+     const currentSequenceRef = { val: queries.getMaxSequence(sessionId) ?? -1 };
+     const batcherRef = { batcher, eventCount: 0, resumeSessionId: conv?.claudeSessionId || null };
+     const onEvent = createEventHandler({ queries, activeExecutions, broadcastSync, rateLimitState, batcherRef, sessionId, conversationId, messageId, content, agentId, model, subAgent, ownedSessionIds, allBlocksRef, currentSequenceRef, scheduleRetry, eagerTTS, debugLog, parseRateLimitResetTime });
+ =======
      const cwd = conv?.workingDirectory || STARTUP_CWD;
      // stateRef tracks eventCount (for session response metadata) and resumeSessionId
      const stateRef = { eventCount: 0, resumeSessionId: conv?.claudeSessionId || null };
      const onEvent = createEventHandler({ queries, activeExecutions, broadcastSync, rateLimitState, batcherRef: stateRef, sessionId, conversationId, messageId, content, agentId, model, subAgent, getJsonlWatcher, scheduleRetry, eagerTTS, debugLog, parseRateLimitResetTime });
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
      try {
        debugLog(`[stream] Starting: conversationId=${conversationId}, sessionId=${sessionId}`);
        let resolvedAgentId = agentId || 'claude-code';
@@ -40,7 +53,11 @@ export function createProcessMessage({ queries, activeExecutions, rateLimitState
      const resolvedSubAgent = subAgent || conv?.subAgent || null;
      const config = {
        verbose: true, outputFormat: 'stream-json', timeout: 1800000, print: true,
+ <<<<<<< HEAD
+       resumeSessionId: batcherRef.resumeSessionId,
+ =======
        resumeSessionId: stateRef.resumeSessionId,
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
        systemPrompt: buildSystemPrompt(agentId, resolvedModel, resolvedSubAgent),
        model: resolvedModel || undefined, subAgent: resolvedSubAgent || undefined, onEvent,
        onPid: (pid) => { const e = activeExecutions.get(conversationId); if (e) e.pid = pid; execMachine.send(conversationId, { type: 'SET_PID', pid }); },
@@ -53,6 +70,19 @@ export function createProcessMessage({ queries, activeExecutions, rateLimitState
      }
      activeExecutions.delete(conversationId);
      execMachine.send(conversationId, { type: 'COMPLETE' });
+ <<<<<<< HEAD
+     batcher.drain();
+     if (claudeSessionId) ownedSessionIds.delete(claudeSessionId);
+     debugLog(`[stream] Claude returned ${outputs.length} outputs, sessionId=${claudeSessionId}`);
+     queries.updateSession(sessionId, { status: 'complete', response: JSON.stringify({ outputs, eventCount: batcherRef.eventCount }), completed_at: Date.now() });
+     broadcastSync({ type: 'streaming_complete', sessionId, conversationId, agentId, eventCount: batcherRef.eventCount, seq: currentSequenceRef.val, timestamp: Date.now() });
+     debugLog(`[stream] Completed: ${outputs.length} outputs, ${batcherRef.eventCount} events`);
+   } catch (error) {
+     const elapsed = Date.now() - startTime;
+     debugLog(`[stream] Error after ${elapsed}ms: ${error.message}`);
+     const conv2 = queries.getConversation(conversationId);
+     if (conv2?.claudeSessionId) ownedSessionIds.delete(conv2.claudeSessionId);
+ =======
      debugLog(`[stream] Claude returned ${outputs.length} outputs, sessionId=${claudeSessionId}`);
      queries.updateSession(sessionId, { status: 'complete', response: JSON.stringify({ outputs, eventCount: stateRef.eventCount }), completed_at: Date.now() });
      // streaming_complete is broadcast by JsonlParser when it sees the turn_duration event.
@@ -63,6 +93,7 @@ export function createProcessMessage({ queries, activeExecutions, rateLimitState
    } catch (error) {
      const elapsed = Date.now() - startTime;
      debugLog(`[stream] Error after ${elapsed}ms: ${error.message}`);
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
      if (rateLimitState.get(conversationId)?.isStreamDetected) {
        debugLog(`[rate-limit] Rate limit already handled in stream for conv ${conversationId}, skipping catch handler`);
        return;
@@ -76,6 +107,10 @@ export function createProcessMessage({ queries, activeExecutions, rateLimitState
      const errMsg = queries.createMessage(conversationId, 'assistant', `Error: Authentication failed. ${error.message}. Please update your credentials and try again.`);
      broadcastSync({ type: 'message_created', conversationId, message: errMsg, timestamp: Date.now() });
      queries.setIsStreaming(conversationId, false);
+ <<<<<<< HEAD
+     batcher.drain();
+ =======
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
      activeExecutions.delete(conversationId);
      return;
    }
@@ -94,6 +129,10 @@ export function createProcessMessage({ queries, activeExecutions, rateLimitState
      const retryAt = Date.now() + cooldownMs;
      rateLimitState.set(conversationId, { retryAt, cooldownMs, retryCount });
      broadcastSync({ type: 'rate_limit_hit', sessionId, conversationId, retryAfterMs: cooldownMs, retryAt, retryCount, timestamp: Date.now() });
+ <<<<<<< HEAD
+     batcher.drain();
+ =======
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
      debugLog(`[rate-limit] Scheduling retry for conv ${conversationId} in ${cooldownMs}ms (attempt ${retryCount + 1})`);
98
137
  setTimeout(() => {
99
138
  debugLog(`[rate-limit] Timeout fired for conv ${conversationId}, calling scheduleRetry`);
@@ -103,13 +142,21 @@ export function createProcessMessage({ queries, activeExecutions, rateLimitState
103
142
  }, cooldownMs);
104
143
  return;
105
144
  }
145
+ <<<<<<< HEAD
146
+ const isSessionConflict = error.exitCode === null && batcherRef.eventCount === 0;
147
+ =======
106
148
  const isSessionConflict = error.exitCode === null && stateRef.eventCount === 0;
149
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
107
150
  broadcastSync({ type: 'streaming_error', sessionId, conversationId, error: error.message, isPrematureEnd: error.isPrematureEnd || false, exitCode: error.exitCode, stderrText: error.stderrText, recoverable: elapsed < 60000, isSessionConflict, timestamp: Date.now() });
108
151
  if (!isSessionConflict) {
109
152
  const errMsg = queries.createMessage(conversationId, 'assistant', `Error: ${error.message}`);
110
153
  broadcastSync({ type: 'message_created', conversationId, message: errMsg, timestamp: Date.now() });
111
154
  }
112
155
  } finally {
156
+ <<<<<<< HEAD
157
+ batcher.drain();
158
+ =======
159
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
113
160
  if (!rateLimitState.has(conversationId)) {
114
161
  cleanupExecution(conversationId);
115
162
  drainMessageQueue(conversationId);
@@ -1,6 +1,10 @@
  import { JsonlWatcher } from './jsonl-watcher.js';

+ <<<<<<< HEAD
+ export function createOnServerReady({ queries, broadcastSync, warmAssetCache, staticDir, toolManager, discoveredAgents, PORT, BASE_URL, watch, ownedSessionIds, resumeInterruptedStreams, activeExecutions, debugLog, installGMAgentConfigs, startACPTools, getACPStatus, execMachine, toolInstallMachine, getSpeech, ensureModelsDownloaded, performAutoImport, performAgentHealthCheck, pm2Manager, pm2Subscribers, recoverStaleSessions }) {
+ =======
  export function createOnServerReady({ queries, broadcastSync, warmAssetCache, staticDir, toolManager, discoveredAgents, PORT, BASE_URL, watch, setWatcher, resumeInterruptedStreams, activeExecutions, debugLog, installGMAgentConfigs, startACPTools, getACPStatus, execMachine, toolInstallMachine, getSpeech, ensureModelsDownloaded, performAutoImport, performAgentHealthCheck, pm2Manager, pm2Subscribers, recoverStaleSessions }) {
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  let jsonlWatcher = null;

  function getJsonlWatcher() { return jsonlWatcher; }
@@ -23,9 +27,14 @@ export function createOnServerReady({ queries, broadcastSync, warmAssetCache, st
  }, 6 * 60 * 60 * 1000);

  try {
+ <<<<<<< HEAD
+ jsonlWatcher = new JsonlWatcher({ broadcastSync, queries, ownedSessionIds });
+ jsonlWatcher.start();
+ =======
  jsonlWatcher = new JsonlWatcher({ broadcastSync, queries });
  jsonlWatcher.start();
  if (setWatcher) setWatcher(jsonlWatcher);
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  console.log('[JSONL] Watcher started');
  } catch (err) { console.error('[JSONL] Watcher failed to start:', err.message); }

@@ -1,3 +1,6 @@
+ <<<<<<< HEAD
+ export function createEventHandler({ queries, activeExecutions, broadcastSync, rateLimitState, batcherRef, sessionId, conversationId, messageId, content, agentId, model, subAgent, ownedSessionIds, allBlocksRef, currentSequenceRef, scheduleRetry, eagerTTS, debugLog, parseRateLimitResetTime }) {
+ =======
  /**
  * Minimal Claude stdout event handler.
  *
@@ -11,10 +14,35 @@
  * No batcher, no broadcastSync for individual blocks.
  */
  export function createEventHandler({ queries, activeExecutions, broadcastSync, rateLimitState, batcherRef, sessionId, conversationId, messageId, content, agentId, model, subAgent, getJsonlWatcher, scheduleRetry, eagerTTS, debugLog, parseRateLimitResetTime }) {
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  return function onEvent(parsed) {
  batcherRef.eventCount++;
  const entry = activeExecutions.get(conversationId);
  if (entry) entry.lastActivity = Date.now();
+ <<<<<<< HEAD
+ if (parsed.session_id) {
+ ownedSessionIds.add(parsed.session_id);
+ if (!batcherRef.resumeSessionId || batcherRef.resumeSessionId !== parsed.session_id) {
+ batcherRef.resumeSessionId = parsed.session_id;
+ queries.setClaudeSessionId(conversationId, parsed.session_id, sessionId);
+ }
+ }
+ debugLog(`[stream] Event ${batcherRef.eventCount}: type=${parsed.type}`);
+
+ if (parsed.type === 'system') {
+ if (parsed.subtype === 'task_notification') return;
+ if (!parsed.model && !parsed.cwd && !parsed.tools) return;
+ const block = { type: 'system', subtype: parsed.subtype, model: parsed.model, cwd: parsed.cwd, tools: parsed.tools, session_id: parsed.session_id };
+ currentSequenceRef.val++;
+ batcherRef.batcher.add(sessionId, conversationId, currentSequenceRef.val, 'system', block);
+ broadcastSync({ type: 'streaming_progress', sessionId, conversationId, block, blockRole: 'system', blockIndex: allBlocksRef.val.length, seq: currentSequenceRef.val, timestamp: Date.now() });
+ } else if (parsed.type === 'assistant' && parsed.message?.content) {
+ for (const block of parsed.message.content) {
+ allBlocksRef.val.push(block);
+ currentSequenceRef.val++;
+ batcherRef.batcher.add(sessionId, conversationId, currentSequenceRef.val, block.type || 'assistant', block);
+ broadcastSync({ type: 'streaming_progress', sessionId, conversationId, block, blockRole: 'assistant', blockIndex: allBlocksRef.val.length - 1, seq: currentSequenceRef.val, timestamp: Date.now() });
+ =======

  // Register session with file watcher as soon as we see the session_id.
  // This pre-maps claudeSessionId → (convId, dbSessionId) in JsonlParser before
@@ -32,15 +60,23 @@ export function createEventHandler({ queries, activeExecutions, broadcastSync, r
  // Rate-limit detection in assistant text blocks
  if (parsed.type === 'assistant' && parsed.message?.content) {
  for (const block of parsed.message.content) {
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  if (block.type === 'text' && block.text) {
  const rateLimitMatch = block.text.match(/you'?ve hit your limit|rate limit exceeded/i);
  if (rateLimitMatch) {
  debugLog(`[rate-limit] Detected rate limit message in stream for conv ${conversationId}`);
  const retryAfterSec = parseRateLimitResetTime(block.text);
  const entry2 = activeExecutions.get(conversationId);
+ <<<<<<< HEAD
+ if (entry2 && entry2.pid) { try { process.kill(entry2.pid); } catch (e) {} }
+ const existingCount = rateLimitState.get(conversationId)?.retryCount || 0;
+ if (existingCount >= 3) {
+ batcherRef.batcher.drain();
+ =======
  if (entry2 && entry2.pid) { try { process.kill(entry2.pid); } catch (_) {} }
  const existingCount = rateLimitState.get(conversationId)?.retryCount || 0;
  if (existingCount >= 3) {
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  activeExecutions.delete(conversationId);
  queries.setIsStreaming(conversationId, false);
  const errMsg = queries.createMessage(conversationId, 'assistant', `Error: Rate limit exceeded after ${existingCount + 1} attempts. Please try again later.`);
@@ -50,6 +86,10 @@ export function createEventHandler({ queries, activeExecutions, broadcastSync, r
  }
  rateLimitState.set(conversationId, { retryAt: Date.now() + (retryAfterSec * 1000), cooldownMs: retryAfterSec * 1000, retryCount: existingCount + 1, isStreamDetected: true });
  broadcastSync({ type: 'rate_limit_hit', sessionId, conversationId, retryAfterMs: retryAfterSec * 1000, retryAt: Date.now() + (retryAfterSec * 1000), retryCount: 1, timestamp: Date.now() });
+ <<<<<<< HEAD
+ batcherRef.batcher.drain();
+ =======
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  activeExecutions.delete(conversationId);
  queries.setIsStreaming(conversationId, false);
  setTimeout(() => {
@@ -62,6 +102,61 @@ export function createEventHandler({ queries, activeExecutions, broadcastSync, r
  eagerTTS(block.text, conversationId, sessionId);
  }
  }
+ <<<<<<< HEAD
+ } else if (parsed.type === 'user' && parsed.message?.content) {
+ for (const block of parsed.message.content) {
+ if (block.type === 'tool_result') {
+ const toolResultBlock = { type: 'tool_result', tool_use_id: block.tool_use_id, content: typeof block.content === 'string' ? block.content : JSON.stringify(block.content), is_error: block.is_error || false };
+ currentSequenceRef.val++;
+ batcherRef.batcher.add(sessionId, conversationId, currentSequenceRef.val, 'tool_result', toolResultBlock);
+ broadcastSync({ type: 'streaming_progress', sessionId, conversationId, block: toolResultBlock, blockRole: 'tool_result', blockIndex: allBlocksRef.val.length, seq: currentSequenceRef.val, timestamp: Date.now() });
+ }
+ }
+ } else if (parsed.type === 'result') {
+ const resultBlock = { type: 'result', subtype: parsed.subtype, duration_ms: parsed.duration_ms, total_cost_usd: parsed.total_cost_usd, num_turns: parsed.num_turns, is_error: parsed.is_error || false, result: parsed.result };
+ currentSequenceRef.val++;
+ batcherRef.batcher.add(sessionId, conversationId, currentSequenceRef.val, 'result', resultBlock);
+ broadcastSync({ type: 'streaming_progress', sessionId, conversationId, block: resultBlock, blockRole: 'result', blockIndex: allBlocksRef.val.length, isResult: true, seq: currentSequenceRef.val, timestamp: Date.now() });
+ if (parsed.result) {
+ const resultText = typeof parsed.result === 'string' ? parsed.result : JSON.stringify(parsed.result);
+ const rlMatch = resultText.match(/you'?ve hit your limit|rate limit exceeded/i);
+ if (rlMatch) {
+ debugLog(`[rate-limit] Detected rate limit in result for conv ${conversationId}`);
+ const retryAfterSec = parseRateLimitResetTime(resultText);
+ const entry3 = activeExecutions.get(conversationId);
+ if (entry3 && entry3.pid) { try { process.kill(entry3.pid); } catch (e) {} }
+ const existingCount2 = rateLimitState.get(conversationId)?.retryCount || 0;
+ if (existingCount2 >= 3) {
+ batcherRef.batcher.drain();
+ activeExecutions.delete(conversationId);
+ queries.setIsStreaming(conversationId, false);
+ const errMsg2 = queries.createMessage(conversationId, 'assistant', `Error: Rate limit exceeded after ${existingCount2 + 1} attempts. Please try again later.`);
+ broadcastSync({ type: 'message_created', conversationId, message: errMsg2, timestamp: Date.now() });
+ broadcastSync({ type: 'streaming_complete', sessionId, conversationId, interrupted: true, timestamp: Date.now() });
+ return;
+ }
+ rateLimitState.set(conversationId, { retryAt: Date.now() + (retryAfterSec * 1000), cooldownMs: retryAfterSec * 1000, retryCount: existingCount2 + 1, isStreamDetected: true });
+ broadcastSync({ type: 'rate_limit_hit', sessionId, conversationId, retryAfterMs: retryAfterSec * 1000, retryAt: Date.now() + (retryAfterSec * 1000), retryCount: existingCount2 + 1, timestamp: Date.now() });
+ batcherRef.batcher.drain();
+ activeExecutions.delete(conversationId);
+ queries.setIsStreaming(conversationId, false);
+ setTimeout(() => {
+ rateLimitState.delete(conversationId);
+ broadcastSync({ type: 'rate_limit_clear', conversationId, timestamp: Date.now() });
+ scheduleRetry(conversationId, messageId, content, agentId, model, subAgent);
+ }, retryAfterSec * 1000);
+ return;
+ }
+ if (resultText) eagerTTS(resultText, conversationId, sessionId);
+ }
+ if (parsed.result && allBlocksRef.val.length === 0) allBlocksRef.val.push({ type: 'text', text: String(parsed.result) });
+ } else if (parsed.type === 'tool_status') {
+ broadcastSync({ type: 'streaming_progress', sessionId, conversationId, block: { type: 'tool_status', tool_use_id: parsed.tool_use_id, status: parsed.status }, seq: currentSequenceRef.val, timestamp: Date.now() });
+ } else if (parsed.type === 'usage') {
+ broadcastSync({ type: 'streaming_progress', sessionId, conversationId, block: { type: 'usage', usage: parsed.usage }, seq: currentSequenceRef.val, timestamp: Date.now() });
+ } else if (parsed.type === 'plan') {
+ broadcastSync({ type: 'streaming_progress', sessionId, conversationId, block: { type: 'plan', entries: parsed.entries }, seq: currentSequenceRef.val, timestamp: Date.now() });
+ =======
  }

  // Rate-limit detection in result blocks
@@ -94,6 +189,7 @@ export function createEventHandler({ queries, activeExecutions, broadcastSync, r
  return;
  }
  if (resultText) eagerTTS(resultText, conversationId, sessionId);
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  }
  };
  }
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "agentgui",
- "version": "1.0.855",
+ "version": "1.0.856",
  "description": "Multi-agent ACP client with real-time communication",
  "type": "module",
  "main": "electron/main.js",
@@ -1,3 +1,19 @@
+ <<<<<<< HEAD
+ *, *::before, *::after { box-sizing: border-box; }
+
+ :root {
+ --color-primary: #3b82f6;
+ --color-primary-dark: #1e40af;
+ --color-bg-primary: #ffffff;
+ --color-bg-secondary: #f9fafb;
+ --color-bg-code: #f1f5f9;
+ --color-code-text: #1e293b;
+ --color-code-border: #cbd5e1;
+ --color-thinking-bg: #f5f3ff;
+ --color-text-primary: #111827;
+ --color-text-secondary: #6b7280;
+ --color-border: #e5e7eb;
+ =======
  @import url('https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&display=swap');

  *, *::before, *::after { box-sizing: border-box; }
@@ -16,10 +32,30 @@
  --color-text-muted: #9ca3af;
  --color-border: #e5e7eb;
  --color-border-strong: #d1d5db;
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  --color-success: #10b981;
  --color-error: #ef4444;
  --color-warning: #f59e0b;
  --color-info: #0891b2;
+ <<<<<<< HEAD
+ --sidebar-width: 300px;
+ --header-height: 52px;
+ --msg-max-width: 800px;
+ }
+
+ html.dark {
+ --color-bg-primary: #1a1a1a;
+ --color-bg-secondary: #242424;
+ --color-text-primary: #e5e5e5;
+ --color-text-secondary: #a3a3a3;
+ --color-border: #333333;
+ --color-primary: #737373;
+ --color-primary-dark: #525252;
+ --color-bg-code: #1e293b;
+ --color-code-text: #e2e8f0;
+ --color-code-border: #334155;
+ --color-thinking-bg: #1e1a2e;
+ =======
  --sidebar-width: 260px;
  --header-height: 48px;
  --msg-max-width: 800px;
@@ -51,17 +87,24 @@
  --shadow-sm: 0 1px 2px rgba(0,0,0,0.3);
  --shadow-md: 0 4px 12px rgba(0,0,0,0.4);
  --shadow-lg: 0 8px 24px rgba(0,0,0,0.5);
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  }

  html, body {
  margin: 0;
  padding: 0;
  height: 100%;
+ <<<<<<< HEAD
+ font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;
+ background-color: var(--color-bg-primary);
+ color: var(--color-text-primary);
+ =======
  font-family: 'Inter', -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;
  background-color: var(--color-bg-primary);
  color: var(--color-text-primary);
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  }

  /* ===== ROOT LAYOUT: sidebar + main, full viewport ===== */
@@ -82,7 +125,10 @@
  display: flex;
  flex-direction: column;
  background-color: var(--color-bg-secondary);
+ <<<<<<< HEAD
+ =======
  border-right: 1px solid var(--color-border);
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  overflow: hidden;
  transition: none !important;
  animation: none !important;
@@ -97,26 +143,51 @@
  }

  .sidebar-header {
+ <<<<<<< HEAD
+ padding: 0.75rem 1rem;
+ =======
  padding: 0.875rem 1rem 0.75rem;
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  display: flex;
  justify-content: space-between;
  align-items: center;
  flex-shrink: 0;
  min-height: var(--header-height);
+ <<<<<<< HEAD
+ =======
  border-bottom: 1px solid var(--color-border);
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  }

  .sidebar-header h2 {
  margin: 0;
+ <<<<<<< HEAD
+ font-size: 0.875rem;
+ font-weight: 600;
+ text-transform: uppercase;
+ letter-spacing: 0.05em;
+ color: var(--color-text-secondary);
+ =======
  font-size: 1rem;
  font-weight: 700;
  letter-spacing: -0.01em;
  color: var(--color-text-primary);
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  white-space: nowrap;
  overflow: hidden;
  }

  .sidebar-new-btn {
+ <<<<<<< HEAD
+ padding: 0.375rem 0.625rem;
+ font-size: 0.75rem;
+ background-color: var(--color-primary);
+ color: white;
+ border: none;
+ border-radius: 0.375rem;
+ cursor: pointer;
+ transition: background-color 0.2s;
+ =======
  padding: 0.375rem 0.75rem;
  font-size: 0.8rem;
  font-weight: 600;
@@ -126,6 +197,7 @@
  border-radius: var(--radius-md);
  cursor: pointer;
  transition: background-color var(--transition-fast);
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  white-space: nowrap;
  flex-shrink: 0;
  }
@@ -154,6 +226,8 @@

  .sidebar-clone-btn:hover { border-color: var(--color-primary); color: var(--color-primary); }

+ <<<<<<< HEAD
+ =======
  /* Sidebar overflow menu */
  .sidebar-overflow-btn {
  width: 28px;
@@ -205,6 +279,7 @@
  .sidebar-overflow-menu-item.danger { color: var(--color-error); }
  .sidebar-overflow-menu-wrapper { position: relative; }

+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  .clone-input-bar {
  display: flex;
  align-items: center;
@@ -285,6 +360,24 @@
  }

  .conversation-item {
+ <<<<<<< HEAD
+ padding: 0.75rem 0.75rem;
+ margin: 0.125rem 0.5rem;
+ border-radius: 0.375rem;
+ cursor: pointer;
+ transition: background-color 0.15s;
+ border-left: 3px solid transparent;
+ user-select: none;
+ }
+
+ .conversation-item:hover { background-color: var(--color-bg-primary); }
+
+ .conversation-item.active {
+ background-color: var(--color-primary);
+ color: white;
+ border-left-color: var(--color-primary-dark);
+ }
+ =======
  padding: 0.625rem 0.75rem;
  margin: 0.125rem 0.5rem;
  border-radius: var(--radius-md);
@@ -301,6 +394,7 @@
  color: var(--color-primary);
  }
  .conversation-item.active .conversation-item-meta { color: var(--color-primary); opacity: 0.7; }
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1

  .conversation-item-title {
  font-weight: 500;
@@ -320,7 +414,11 @@
  text-overflow: ellipsis;
  }

+ <<<<<<< HEAD
+ .conversation-item.active .conversation-item-meta { color: rgba(255,255,255,0.7); }
+ =======
  /* conversation-item.active meta color handled in active block */
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1

  .conversation-streaming-badge {
  display: inline-flex;
@@ -331,14 +429,22 @@

  .streaming-dot {
  display: inline-block;
+ <<<<<<< HEAD
+ width: 0.5rem;
+ height: 0.5rem;
+ =======
  width: 0.4rem;
  height: 0.4rem;
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  border-radius: 50%;
  background-color: var(--color-success);
  animation: pulse 1.5s ease-in-out infinite;
  }

  .conversation-item.active .streaming-dot {
+ <<<<<<< HEAD
+ background-color: #fff;
+ =======
  background-color: var(--color-success);
  }

@@ -362,6 +468,7 @@
  @keyframes typing-bounce {
  0%, 80%, 100% { transform: scale(0.7); opacity: 0.5; }
  40% { transform: scale(1); opacity: 1; }
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  }

  .conversation-item {
@@ -377,6 +484,8 @@
  overflow: hidden;
  }

+ <<<<<<< HEAD
+ =======
  /* Date group headers in sidebar */
  .conv-date-group-header {
  padding: 0.5rem 1.25rem 0.25rem;
@@ -402,6 +511,7 @@
  margin-right: 0.375rem;
  }

+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  .conversation-item.pinned { cursor: grab; }
  .conversation-item.pinned:active { cursor: grabbing; }
  .conversation-item-checkbox {
@@ -459,16 +569,25 @@
  .conversation-item.active .conversation-item-delete,
  .conversation-item.active .conversation-item-archive,
  .conversation-item.active .conversation-item-export {
+ <<<<<<< HEAD
+ color: rgba(255,255,255,0.8);
+ =======
  color: var(--color-primary);
  opacity: 0.7;
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  }

  .conversation-item.active .conversation-item-delete:hover,
  .conversation-item.active .conversation-item-archive:hover,
  .conversation-item.active .conversation-item-export:hover {
+ <<<<<<< HEAD
+ background-color: rgba(255,255,255,0.2);
+ color: white;
+ =======
  background-color: rgba(var(--color-primary-rgb),0.15);
  color: var(--color-primary);
  opacity: 1;
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  }

  .sidebar-empty {
@@ -494,11 +613,18 @@
  align-items: center;
  gap: 0.75rem;
  padding: 0 1rem;
+ <<<<<<< HEAD
+ height: var(--header-height);
+ min-height: var(--header-height);
+ flex-shrink: 0;
+ background-color: var(--color-bg-secondary);
+ =======
  height: 48px;
  min-height: 48px;
  flex-shrink: 0;
  background: var(--color-bg-primary);
  border-bottom: 1px solid var(--color-border);
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  }

  .sidebar-toggle-btn {
@@ -528,9 +654,14 @@
  }

  .header-title {
+ <<<<<<< HEAD
+ font-size: 1.125rem;
+ font-weight: 600;
+ =======
  font-size: 0.9375rem;
  font-weight: 600;
  letter-spacing: -0.01em;
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  margin: 0;
  flex: 1;
  min-width: 0;
@@ -693,6 +824,8 @@
  flex-direction: column;
  }

+ <<<<<<< HEAD
+ =======
  /* ===== MESSAGE BUBBLES ===== */
  .message-user-bubble {
  display: flex;
@@ -771,6 +904,7 @@
  .message-action-btn svg { width: 13px; height: 13px; }
  .message-wrapper { position: relative; }

+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  #output {
  display: flex;
  flex-direction: column;
@@ -1017,6 +1151,15 @@
  white-space: nowrap;
  }

+ <<<<<<< HEAD
+ /* --- Input area: fixed at bottom --- */
+ .input-section {
+ flex-shrink: 0;
+ background-color: var(--color-bg-primary);
+ padding: 0.75rem 1rem;
+ }
+
+ =======
  /* ===== INPUT AREA REDESIGN ===== */
  .input-section {
  flex-shrink: 0;
@@ -1139,6 +1282,7 @@
  .input-send-btn:disabled { opacity: 0.4; cursor: not-allowed; transform: none; }
  .input-send-btn svg { width: 16px; height: 16px; }

+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  .input-wrapper {
  max-width: var(--msg-max-width);
  margin: 0 auto;
@@ -1401,9 +1545,15 @@
  .permanently-expanded > summary::marker { display: none; content: ''; }

  /* ===== Folder Browser Modal ===== */
+ <<<<<<< HEAD
+ .folder-modal-overlay { display:none; position:fixed; top:0; left:0; width:100%; height:100%; background:rgba(0,0,0,0.5); z-index:2000; align-items:center; justify-content:center; }
+ .folder-modal-overlay.visible { display:flex; }
+ .folder-modal { background:var(--color-bg-primary); border-radius:0.5rem; width:500px; max-width:90vw; max-height:80vh; display:flex; flex-direction:column; box-shadow:0 20px 60px rgba(0,0,0,0.3); }
+ =======
  .folder-modal-overlay { display:none; position:fixed; top:0; left:0; width:100%; height:100%; background:rgba(0,0,0,0.5); backdrop-filter:blur(8px); -webkit-backdrop-filter:blur(8px); z-index:2000; align-items:center; justify-content:center; transition:opacity 0.15s ease; animation:fadeIn 0.15s ease; }
  .folder-modal-overlay.visible { display:flex; }
  .folder-modal { background:var(--color-bg-primary); border-radius:var(--radius-xl); width:500px; max-width:90vw; max-height:80vh; display:flex; flex-direction:column; box-shadow:var(--shadow-lg); animation:scaleIn 0.15s ease; }
+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  .folder-modal-header { padding:1rem; display:flex; justify-content:space-between; align-items:center; flex-shrink:0; }
  .folder-modal-header h3 { margin:0; font-size:1rem; font-weight:600; }
  .folder-modal-close { background:none; border:none; font-size:1.25rem; cursor:pointer; color:var(--color-text-secondary); padding:0.25rem; line-height:1; }
@@ -1423,6 +1573,8 @@
  .folder-modal-footer { padding:0.75rem 1rem; display:flex; justify-content:flex-end; gap:0.5rem; flex-shrink:0; }
  .folder-modal-footer .btn { padding:0.5rem 1rem; font-size:0.8rem; }

+ <<<<<<< HEAD
+ =======
  @keyframes fadeIn {
  from { opacity: 0; }
  to { opacity: 1; }
@@ -1432,6 +1584,7 @@
  to { opacity: 1; transform: scale(1); }
  }

+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  .btn { padding:0.5rem 1rem; border:none; border-radius:0.375rem; font-weight:500; cursor:pointer; transition:all 0.15s; font-size:0.875rem; }
  .btn-primary { background-color:var(--color-primary); color:white; }
  .btn-primary:hover:not(:disabled) { background-color:var(--color-primary-dark); }
@@ -1554,6 +1707,8 @@
  scrollbar-color: #475569 transparent;
  }

+ <<<<<<< HEAD
+ =======
  /* Modern thin scrollbars */
  .sidebar-list, #output-scroll, .message-scroll-area {
  scrollbar-width: thin;
@@ -1563,6 +1718,7 @@
  scrollbar-color: var(--color-border-strong) transparent;
  }

+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  .voice-mic-btn {
  position: absolute;
  top: 4px;
@@ -1818,6 +1974,8 @@
  .icon-sm svg { width: 1rem !important; height: 1rem !important; }
  .icon-lg svg { width: 1.5rem !important; height: 1.5rem !important; }

+ <<<<<<< HEAD
+ =======
  /* ===== STREAMING STATUS BAR ===== */
  .streaming-status-bar {
  max-width: var(--msg-max-width);
@@ -1942,6 +2100,7 @@
  line-height: 1.4;
  }

+ >>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
  /* ===== STREAMING BLOCK STYLES ===== */
  .block-text {
  margin-bottom: 0;
@@ -1,8 +1,11 @@
+<<<<<<< HEAD
+=======
 /**
  * AgentGUI Client
  * Main application orchestrator that integrates WebSocket, event processing,
  * and streaming renderer for real-time Claude Code execution visualization
  */
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
 
 class AgentGUIClient {
   constructor(config = {}) {
@@ -14,13 +17,19 @@ class AgentGUIClient {
       ...config
     };
 
+<<<<<<< HEAD
+=======
     // Initialize components - reuse global wsManager/wsClient if available
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
     this.renderer = new StreamingRenderer(config.renderer || {});
     this.wsManager = window.wsManager || new WebSocketManager(config.websocket || {});
     if (!window.wsManager) window.wsManager = this.wsManager;
     this.eventProcessor = new EventProcessor(config.eventProcessor || {});
 
+<<<<<<< HEAD
+=======
     // Application state
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
     this.state = {
       isInitialized: false,
       currentSession: null,
@@ -31,6 +40,21 @@
       agents: []
     };
 
+<<<<<<< HEAD
+    this.conversationCache = new Map();
+    this.MAX_CACHE_SIZE = 10;
+
+    this.conversationListCache = {
+      data: [],
+      timestamp: 0,
+      ttl: 30000
+    };
+
+    this.draftPrompts = new Map();
+
+    this.eventHandlers = {};
+
+=======
     // Conversation DOM cache: store rendered DOM + scroll position per conversationId
     this.conversationCache = new Map();
     this.MAX_CACHE_SIZE = 10;
@@ -49,6 +73,7 @@
     this.eventHandlers = {};
 
     // UI state
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
     this.ui = {
       statusIndicator: null,
       messageInput: null,
@@ -62,6 +87,17 @@
     this._isLoadingConversation = false;
     this._modelCache = new Map();
 
+<<<<<<< HEAD
+    this._renderedSeqs = {};
+    this._inflightRequests = new Map();
+    this._previousConvAbort = null;
+
+    this._bgCache = new Map();
+    this.BG_CACHE_MAX = 50;
+
+    this._loadInProgress = {};
+    this._currentRequestId = 0;
+=======
     this._renderedSeqs = {}; // plain object: sessionId → Set<number>
     this._inflightRequests = new Map();
     this._previousConvAbort = null;
@@ -74,6 +110,7 @@
     // PHASE 2: Request Lifetime Tracking
     this._loadInProgress = {}; // { [conversationId]: { requestId, abortController, timestamp, prevConversationId } }
     this._currentRequestId = 0; // Auto-incrementing request counter
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
 
 
     this._scrollTarget = 0;
@@ -85,7 +122,10 @@
     this._lastSendTime = 0;
     this._countdownTimer = null;
 
+<<<<<<< HEAD
+=======
     // Router state
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
     this.routerState = {
       currentConversationId: null,
       currentSessionId: null
@@ -96,17 +136,25 @@
 
   _dbg(...args) { if (this._debug) console.log('[AgentGUI]', ...args); }
 
+<<<<<<< HEAD
+=======
   /**
    * Initialize the client
    */
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
   async init() {
     try {
       this._dbg('Initializing AgentGUI client');
 
+<<<<<<< HEAD
+      const wsReady = this.config.autoConnect ? this.connectWebSocket() : Promise.resolve();
+
+=======
       // Start WebSocket connection immediately (don't wait for UI setup)
       const wsReady = this.config.autoConnect ? this.connectWebSocket() : Promise.resolve();
 
       // Initialize renderer and UI in parallel with WS connection
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
       this.renderer.init(this.config.outputContainerId, this.config.scrollContainerId);
 
       if (typeof ImageLoader !== 'undefined') {
@@ -117,7 +165,10 @@
       this.setupRendererListeners();
       this.setupUI();
 
+<<<<<<< HEAD
+=======
       // Wait for WS, then load data in parallel
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
       await wsReady;
       await Promise.all([
         this.loadAgents(),
@@ -125,10 +176,15 @@
         this.checkSpeechStatus()
       ]);
 
+<<<<<<< HEAD
+      this.enableControls();
+
+=======
       // Enable controls for initial interaction
       this.enableControls();
 
       // Restore state from URL on page load
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
       this.restoreStateFromUrl();
 
       this.state.isInitialized = true;
@@ -144,6 +200,8 @@
     }
   }
 
+<<<<<<< HEAD
+=======
   /**
    * Setup WebSocket event listeners
    */
@@ -3552,6 +3610,7 @@
     this.wsManager.destroy();
     this.eventHandlers = {};
   }
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
 }
 
 window.__convPerfMetrics = () => {
@@ -3559,6 +3618,11 @@ window.__convPerfMetrics = () => {
   return entries.map(e => ({ name: e.name, ms: Math.round(e.duration) }));
 };
 
+<<<<<<< HEAD
+let agentGUIClient = null;
+
+document.addEventListener('DOMContentLoaded', async () => {
+=======
 function updateWelcomeScreen() {
   const welcomeScreen = document.getElementById('welcomeScreen');
   const outputScroll = document.getElementById('output-scroll');
@@ -3615,6 +3679,7 @@ let agentGUIClient = null;
 // Initialize on DOM ready
 document.addEventListener('DOMContentLoaded', async () => {
   updateWelcomeScreen();
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
   try {
     agentGUIClient = new AgentGUIClient();
     window.agentGuiClient = agentGUIClient;
@@ -3625,7 +3690,10 @@ document.addEventListener('DOMContentLoaded', async () => {
   }
 });
 
+<<<<<<< HEAD
+=======
 // Export for testing
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
 if (typeof module !== 'undefined' && module.exports) {
   module.exports = AgentGUIClient;
 }
@@ -1,3 +1,5 @@
+<<<<<<< HEAD
+=======
 /**
  * Conversations Module
  * Manages conversation list sidebar with real-time updates
@@ -23,6 +25,7 @@ function getAgentColor(agentId) {
   return key ? AGENT_COLORS[key] : AGENT_COLORS.default;
 }
 
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
 function pathSplit(p) {
   return p.split(/[\/\\]/).filter(Boolean);
 }
@@ -134,6 +137,8 @@ class ConversationManager {
     return ' (' + model.replace(/^claude-/i, '').replace(/-\d{8,}.*$/, '').replace(/-/g, ' ') + ')';
   }
 
+<<<<<<< HEAD
+=======
   setupDelegatedListeners() {
     let draggedId = null;
     this.listEl.addEventListener('dragstart', (e) => {
@@ -801,11 +806,14 @@ class ConversationManager {
     });
   }
 
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1
   escapeHtml(text) {
     return window._escHtml(text);
   }
 }
 
+<<<<<<< HEAD
+=======
 if (document.readyState === 'loading') {
   document.addEventListener('DOMContentLoaded', () => {
     window.conversationManager = new ConversationManager();
@@ -813,3 +821,4 @@ if (document.readyState === 'loading') {
 } else {
   window.conversationManager = new ConversationManager();
 }
+>>>>>>> 6bfde951cbeb65ec72b73da9c23b9c8c0ba0bbc1