@mariozechner/pi-mom 0.18.1 → 0.18.3
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +31 -11
- package/README.md +30 -18
- package/dist/agent.d.ts +14 -2
- package/dist/agent.d.ts.map +1 -1
- package/dist/agent.js +294 -331
- package/dist/agent.js.map +1 -1
- package/dist/context.d.ts +132 -0
- package/dist/context.d.ts.map +1 -0
- package/dist/context.js +538 -0
- package/dist/context.js.map +1 -0
- package/dist/log.d.ts +1 -1
- package/dist/log.d.ts.map +1 -1
- package/dist/log.js +14 -1
- package/dist/log.js.map +1 -1
- package/dist/main.d.ts.map +1 -1
- package/dist/main.js +166 -90
- package/dist/main.js.map +1 -1
- package/dist/slack.d.ts +86 -55
- package/dist/slack.d.ts.map +1 -1
- package/dist/slack.js +322 -418
- package/dist/slack.js.map +1 -1
- package/package.json +4 -3
package/CHANGELOG.md
CHANGED
@@ -1,6 +1,34 @@
 # Changelog
 
-## [
+## [0.18.3] - 2025-12-11
+
+### Changed
+
+- Complete rewrite of message handling architecture (#115)
+  - Now uses `AgentSession` from coding-agent for session management
+  - Brings auto-compaction, overflow handling, and proper prompt caching
+  - `log.jsonl` is the source of truth for all channel messages
+  - `context.jsonl` stores LLM context (messages sent to Claude, same format as coding-agent)
+  - Sync mechanism ensures context.jsonl stays in sync with log.jsonl at run start
+  - Session header written immediately on new session creation (not lazily)
+  - Tool results preserved in context.jsonl for multi-turn continuity
+
+- Backfill improvements
+  - Only backfills channels that already have a `log.jsonl` file
+  - Strips @mentions from backfilled messages (consistent with live messages)
+  - Uses largest timestamp in log for efficient incremental backfill
+  - Fetches DM channels in addition to public/private channels
+
+- Message handling improvements
+  - Channel chatter (messages without @mention) logged but doesn't trigger processing
+  - Messages sent while mom is busy are logged and synced on next run
+  - Pre-startup messages (replayed by Slack on reconnect) logged but not auto-processed
+  - Stop command executes immediately (not queued), can interrupt running tasks
+  - Channel @mentions no longer double-logged (was firing both app_mention and message events)
+
+- Usage summary now includes context window usage
+  - Shows current context tokens vs model's context window
+  - Example: `Context: 4.2k / 200k (2.1%)`
 
 ### Fixed
 
@@ -13,20 +41,12 @@
 - Private channel messages not being logged
 - Added `message.groups` to required bot events in README
 - Added `groups:history` and `groups:read` to required scopes in README
-
-
+
+- Stop command now updates "Stopping..." to "Stopped" instead of posting two messages
 
 ### Added
 
 - Port truncation logic from coding-agent: bash and read tools now use consistent 2000 lines OR 50KB limits with actionable notices
-- Remove redundant context history truncation (tools already provide truncation with actionable hints)
-
-- Message backfill on startup (#103)
-  - Fetches missed messages from Slack using `conversations.history` API when mom restarts
-  - Backfills up to 3 pages (3000 messages) per channel since last logged timestamp
-  - Includes mom's own responses and user messages (excludes other bots)
-  - Downloads attachments from backfilled messages
-  - Logs progress: channel count, per-channel message counts, total with duration
 
 ## [0.10.2] - 2025-11-27
 
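To make the new usage line concrete: a minimal sketch of how a `Context: 4.2k / 200k (2.1%)` string can be produced from token counts. The helper names and rounding rules are assumptions for illustration, not pi-mom's actual implementation.

```typescript
// Illustrative only: formats token counts the way the changelog example shows.
// formatContextUsage(4200, 200000) -> "Context: 4.2k / 200k (2.1%)"
function formatTokens(n: number): string {
	if (n < 1000) return `${n}`;
	const k = n / 1000;
	return Number.isInteger(k) ? `${k}k` : `${k.toFixed(1)}k`;
}

function formatContextUsage(contextTokens: number, contextWindow: number): string {
	const pct = ((contextTokens / contextWindow) * 100).toFixed(1);
	return `Context: ${formatTokens(contextTokens)} / ${formatTokens(contextWindow)} (${pct}%)`;
}
```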
package/README.md
CHANGED
@@ -145,24 +145,27 @@ You provide mom with a **data directory** (e.g., `./data`) as her workspace. Whi
 ```
 ./data/                    # Your host directory
 ├── MEMORY.md              # Global memory (shared across channels)
+├── settings.json          # Global settings (compaction, retry, etc.)
 ├── skills/                # Global custom CLI tools mom creates
 ├── C123ABC/               # Each Slack channel gets a directory
 │   ├── MEMORY.md          # Channel-specific memory
-│   ├── log.jsonl          # Full
+│   ├── log.jsonl          # Full message history (source of truth)
+│   ├── context.jsonl      # LLM context (synced from log.jsonl)
 │   ├── attachments/       # Files users shared
 │   ├── scratch/           # Mom's working directory
 │   └── skills/            # Channel-specific CLI tools
-└──
+└── D456DEF/               # DM channels also get directories
     └── ...
 ```
 
 **What's stored here:**
--
--
+- `log.jsonl`: All channel messages (user messages, bot responses). Source of truth.
+- `context.jsonl`: Messages sent to Claude. Synced from log.jsonl at each run start.
+- Memory files: Context mom remembers across sessions
 - Custom tools/scripts mom creates (aka "skills")
 - Working files, cloned repos, generated output
 
-
+Mom efficiently greps `log.jsonl` for conversation history, giving her essentially infinite context beyond what's in `context.jsonl`.
 
 ### Memory
 
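Given the layout above, and the changelog's note that only channels which already have a `log.jsonl` are backfilled, here is a minimal sketch of enumerating such channel directories. The function name and filtering logic are assumptions for illustration, not code from the package.

```typescript
import { existsSync, readdirSync } from "fs";
import { join } from "path";

// Sketch: list channel directories (e.g. C123ABC, D456DEF) under the data dir
// that already contain a log.jsonl - the precondition the changelog gives for backfill.
function channelsWithLogs(dataDir: string): string[] {
	return readdirSync(dataDir, { withFileTypes: true })
		.filter((entry) => entry.isDirectory())
		.map((entry) => entry.name)
		.filter((name) => existsSync(join(dataDir, name, "log.jsonl")));
}
```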
@@ -230,33 +233,41 @@ Mom will read the `SKILL.md` file before using a skill, and reuse stored credent
 
 Update mom anytime with `npm install -g @mariozechner/pi-mom`. This only updates the Node.js app on your host. Anything mom installed inside the Docker container remains unchanged.
 
-## Message History
+## Message History
 
-
+Mom uses two files per channel to manage messages:
+
+### log.jsonl (Source of Truth)
+
+All channel messages are stored here. This includes user messages, channel chatter (messages without @mention), and bot responses. Format: one JSON object per line with ISO 8601 timestamps:
 
 ```typescript
 interface LoggedMessage {
   date: string; // ISO 8601 (e.g., "2025-11-26T10:44:00.000Z")
-  ts: string; // Slack timestamp
+  ts: string; // Slack timestamp (seconds.microseconds)
   user: string; // User ID or "bot"
   userName?: string; // Handle (e.g., "mario")
   displayName?: string; // Display name (e.g., "Mario Zechner")
-  text: string; // Message text
-  attachments:
-    original: string; // Original filename
-    local: string; // Path relative to data dir
-  }>;
+  text: string; // Message text (@mentions stripped)
+  attachments: string[]; // Filenames of attachments
   isBot: boolean;
 }
 ```
 
 **Example:**
 ```json
-{"date":"2025-11-26T10:44:00.123Z","ts":"1732619040.123456","user":"U123ABC","userName":"mario","text":"
+{"date":"2025-11-26T10:44:00.123Z","ts":"1732619040.123456","user":"U123ABC","userName":"mario","text":"hello","attachments":[],"isBot":false}
 {"date":"2025-11-26T10:44:05.456Z","ts":"1732619045456","user":"bot","text":"Hi! How can I help?","attachments":[],"isBot":true}
 ```
 
-
+### context.jsonl (LLM Context)
+
+Messages sent to Claude are stored here. This is synced from `log.jsonl` at the start of each run to ensure:
+- Backfilled messages (from Slack API on startup) are included
+- Channel chatter between @mentions is included
+- Messages sent while mom was busy are included
+
+Mom knows how to query `log.jsonl` efficiently (see [her system prompt](src/agent.ts)) for older history beyond what's in context.
 
 ## Security Considerations
 
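Since `log.jsonl` holds one JSON object per line, reading it back is a matter of parsing line by line. A minimal sketch, assuming malformed lines are skipped; the interface mirrors `LoggedMessage` from the README above, but the reader function itself is illustrative, not pi-mom's code.

```typescript
import { readFileSync } from "fs";

// Mirrors the LoggedMessage interface documented in the README diff above.
interface LoggedMessage {
	date: string; // ISO 8601
	ts: string; // Slack timestamp (seconds.microseconds)
	user: string; // User ID or "bot"
	userName?: string;
	displayName?: string;
	text: string;
	attachments: string[];
	isBot: boolean;
}

// Sketch: parse log.jsonl, one JSON object per line, skipping malformed lines.
function readLog(path: string): LoggedMessage[] {
	const messages: LoggedMessage[] = [];
	for (const line of readFileSync(path, "utf-8").split("\n")) {
		if (!line.trim()) continue;
		try {
			messages.push(JSON.parse(line) as LoggedMessage);
		} catch {
			// Ignore lines that are not valid JSON.
		}
	}
	return messages;
}
```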
@@ -339,9 +350,10 @@ mom --sandbox=docker:mom-exec ./data-exec
 
 ### Code Structure
 
-- `src/main.ts`: Entry point, CLI arg parsing,
-- `src/agent.ts`: Agent runner, event handling, tool execution
-- `src/slack.ts`: Slack integration
+- `src/main.ts`: Entry point, CLI arg parsing, handler setup, SlackContext adapter
+- `src/agent.ts`: Agent runner, event handling, tool execution, session management
+- `src/slack.ts`: Slack integration (Socket Mode), backfill, message logging
+- `src/context.ts`: Session manager (context.jsonl), log-to-context sync
 - `src/store.ts`: Channel data persistence, attachment downloads
 - `src/log.ts`: Centralized logging (console output)
 - `src/sandbox.ts`: Docker/host sandbox execution
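The `ts` field documented under Message History above uses Slack's `seconds.microseconds` string format. For bot-written log entries that shape can be derived from a JavaScript `Date`; the sketch below is only an illustration of that conversion (millisecond precision padded to six digits), not necessarily how pi-mom does it.

```typescript
// Sketch: produce a Slack-style "seconds.microseconds" timestamp string
// from a Date. JavaScript only has millisecond precision, so the last
// three digits are zero-padded here.
function toSlackStyleTs(date: Date = new Date()): string {
	const ms = date.getTime();
	const seconds = Math.floor(ms / 1000);
	const micros = (ms % 1000) * 1000;
	return `${seconds}.${micros.toString().padStart(6, "0")}`;
}
```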
package/dist/agent.d.ts
CHANGED
@@ -1,11 +1,23 @@
 import { type SandboxConfig } from "./sandbox.js";
 import type { SlackContext } from "./slack.js";
 import type { ChannelStore } from "./store.js";
+export interface PendingMessage {
+    userName: string;
+    text: string;
+    attachments: {
+        local: string;
+    }[];
+    timestamp: number;
+}
 export interface AgentRunner {
-    run(ctx: SlackContext,
+    run(ctx: SlackContext, store: ChannelStore, pendingMessages?: PendingMessage[]): Promise<{
         stopReason: string;
     }>;
     abort(): void;
 }
-
+/**
+ * Get or create an AgentRunner for a channel.
+ * Runners are cached - one per channel, persistent across messages.
+ */
+export declare function getOrCreateRunner(sandboxConfig: SandboxConfig, channelId: string, channelDir: string): AgentRunner;
 //# sourceMappingURL=agent.d.ts.map
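For orientation, a hedged sketch of how a caller might use the new declarations: `getOrCreateRunner` returns the cached per-channel runner, and `run` now takes the store plus optional pending messages. All concrete values would come from the Slack layer; this is not code from the package, just an illustration of the types above.

```typescript
import { getOrCreateRunner, type PendingMessage } from "./agent.js";
import type { SandboxConfig } from "./sandbox.js";
import type { SlackContext } from "./slack.js";
import type { ChannelStore } from "./store.js";

// Sketch of a caller: fetch (or create) the cached runner for a channel and
// run it with any messages that arrived while mom was busy.
async function handleMention(
	sandboxConfig: SandboxConfig,
	ctx: SlackContext,
	store: ChannelStore,
	channelId: string,
	channelDir: string,
	pending: PendingMessage[],
): Promise<string> {
	const runner = getOrCreateRunner(sandboxConfig, channelId, channelDir);
	const { stopReason } = await runner.run(ctx, store, pending);
	return stopReason;
}
```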
package/dist/agent.d.ts.map
CHANGED
@@ -1 +1 @@
-
{"version":3,"file":"agent.d.ts","sourceRoot":"","sources":["../src/agent.ts"],"names":[],"mappings":"AAMA,OAAO,EAAkB,KAAK,aAAa,EAAE,MAAM,cAAc,CAAC;AAClE,OAAO,KAAK,EAAe,YAAY,EAAY,MAAM,YAAY,CAAC;AACtE,OAAO,KAAK,EAAE,YAAY,EAAE,MAAM,YAAY,CAAC;AA4B/C,MAAM,WAAW,WAAW;IAC3B,GAAG,CAAC,GAAG,EAAE,YAAY,EAAE,UAAU,EAAE,MAAM,EAAE,KAAK,EAAE,YAAY,GAAG,OAAO,CAAC;QAAE,UAAU,EAAE,MAAM,CAAA;KAAE,CAAC,CAAC;IACjG,KAAK,IAAI,IAAI,CAAC;CACd;AAqUD,wBAAgB,iBAAiB,CAAC,aAAa,EAAE,aAAa,GAAG,WAAW,CAgV3E","sourcesContent":["import { Agent, type AgentEvent, ProviderTransport } from \"@mariozechner/pi-agent-core\";\nimport { getModel } from \"@mariozechner/pi-ai\";\nimport { existsSync, readFileSync } from \"fs\";\nimport { mkdir, writeFile } from \"fs/promises\";\nimport { join } from \"path\";\nimport * as log from \"./log.js\";\nimport { createExecutor, type SandboxConfig } from \"./sandbox.js\";\nimport type { ChannelInfo, SlackContext, UserInfo } from \"./slack.js\";\nimport type { ChannelStore } from \"./store.js\";\nimport { createMomTools, setUploadFunction } from \"./tools/index.js\";\n\n// Hardcoded model for now\nconst model = getModel(\"anthropic\", \"claude-sonnet-4-5\");\n\n/**\n * Convert Date.now() to Slack timestamp format (seconds.microseconds)\n * Uses a monotonic counter to ensure ordering even within the same millisecond\n */\nlet lastTsMs = 0;\nlet tsCounter = 0;\n\nfunction toSlackTs(): string {\n\tconst now = Date.now();\n\tif (now === lastTsMs) {\n\t\t// Same millisecond - increment counter for sub-ms ordering\n\t\ttsCounter++;\n\t} else {\n\t\t// New millisecond - reset counter\n\t\tlastTsMs = now;\n\t\ttsCounter = 0;\n\t}\n\tconst seconds = Math.floor(now / 1000);\n\tconst micros = (now % 1000) * 1000 + tsCounter; // ms to micros + counter\n\treturn `${seconds}.${micros.toString().padStart(6, \"0\")}`;\n}\n\nexport interface AgentRunner {\n\trun(ctx: SlackContext, channelDir: string, store: ChannelStore): Promise<{ stopReason: string }>;\n\tabort(): void;\n}\n\nfunction getAnthropicApiKey(): string {\n\tconst key = process.env.ANTHROPIC_OAUTH_TOKEN || process.env.ANTHROPIC_API_KEY;\n\tif (!key) {\n\t\tthrow new Error(\"ANTHROPIC_OAUTH_TOKEN or ANTHROPIC_API_KEY must be set\");\n\t}\n\treturn key;\n}\n\ninterface LogMessage {\n\tdate?: string;\n\tts?: string;\n\tuser?: string;\n\tuserName?: string;\n\ttext?: string;\n\tattachments?: Array<{ local: string }>;\n\tisBot?: boolean;\n}\n\nfunction getRecentMessages(channelDir: string, turnCount: number): string {\n\tconst logPath = join(channelDir, \"log.jsonl\");\n\tif (!existsSync(logPath)) {\n\t\treturn \"(no message history yet)\";\n\t}\n\n\tconst content = readFileSync(logPath, \"utf-8\");\n\tconst lines = content.trim().split(\"\\n\").filter(Boolean);\n\n\tif (lines.length === 0) {\n\t\treturn \"(no message history yet)\";\n\t}\n\n\t// Parse all messages and sort by Slack timestamp\n\t// (attachment downloads can cause out-of-order logging)\n\tconst messages: LogMessage[] = [];\n\tfor (const line of lines) {\n\t\ttry {\n\t\t\tmessages.push(JSON.parse(line));\n\t\t} catch {}\n\t}\n\tmessages.sort((a, b) => {\n\t\tconst tsA = parseFloat(a.ts || \"0\");\n\t\tconst tsB = parseFloat(b.ts || \"0\");\n\t\treturn tsA - tsB;\n\t});\n\n\t// Group into \"turns\" - a turn is either:\n\t// - A single user message (isBot: false)\n\t// - A sequence of consecutive bot messages (isBot: true) coalesced into one turn\n\t// We walk backwards to get the last N turns\n\tconst turns: LogMessage[][] = [];\n\tlet currentTurn: LogMessage[] = [];\n\tlet lastWasBot: boolean | 
null = null;\n\n\tfor (let i = messages.length - 1; i >= 0; i--) {\n\t\tconst msg = messages[i];\n\t\tconst isBot = msg.isBot === true;\n\n\t\tif (lastWasBot === null) {\n\t\t\t// First message\n\t\t\tcurrentTurn.unshift(msg);\n\t\t\tlastWasBot = isBot;\n\t\t} else if (isBot && lastWasBot) {\n\t\t\t// Consecutive bot messages - same turn\n\t\t\tcurrentTurn.unshift(msg);\n\t\t} else {\n\t\t\t// Transition - save current turn and start new one\n\t\t\tturns.unshift(currentTurn);\n\t\t\tcurrentTurn = [msg];\n\t\t\tlastWasBot = isBot;\n\n\t\t\t// Stop if we have enough turns\n\t\t\tif (turns.length >= turnCount) {\n\t\t\t\tbreak;\n\t\t\t}\n\t\t}\n\t}\n\n\t// Don't forget the last turn we were building\n\tif (currentTurn.length > 0 && turns.length < turnCount) {\n\t\tturns.unshift(currentTurn);\n\t}\n\n\t// Flatten turns back to messages and format as TSV\n\tconst formatted: string[] = [];\n\tfor (const turn of turns) {\n\t\tfor (const msg of turn) {\n\t\t\tconst date = (msg.date || \"\").substring(0, 19);\n\t\t\tconst user = msg.userName || msg.user || \"\";\n\t\t\tconst text = msg.text || \"\";\n\t\t\tconst attachments = (msg.attachments || []).map((a) => a.local).join(\",\");\n\t\t\tformatted.push(`${date}\\t${user}\\t${text}\\t${attachments}`);\n\t\t}\n\t}\n\n\treturn formatted.join(\"\\n\");\n}\n\nfunction getMemory(channelDir: string): string {\n\tconst parts: string[] = [];\n\n\t// Read workspace-level memory (shared across all channels)\n\tconst workspaceMemoryPath = join(channelDir, \"..\", \"MEMORY.md\");\n\tif (existsSync(workspaceMemoryPath)) {\n\t\ttry {\n\t\t\tconst content = readFileSync(workspaceMemoryPath, \"utf-8\").trim();\n\t\t\tif (content) {\n\t\t\t\tparts.push(\"### Global Workspace Memory\\n\" + content);\n\t\t\t}\n\t\t} catch (error) {\n\t\t\tlog.logWarning(\"Failed to read workspace memory\", `${workspaceMemoryPath}: ${error}`);\n\t\t}\n\t}\n\n\t// Read channel-specific memory\n\tconst channelMemoryPath = join(channelDir, \"MEMORY.md\");\n\tif (existsSync(channelMemoryPath)) {\n\t\ttry {\n\t\t\tconst content = readFileSync(channelMemoryPath, \"utf-8\").trim();\n\t\t\tif (content) {\n\t\t\t\tparts.push(\"### Channel-Specific Memory\\n\" + content);\n\t\t\t}\n\t\t} catch (error) {\n\t\t\tlog.logWarning(\"Failed to read channel memory\", `${channelMemoryPath}: ${error}`);\n\t\t}\n\t}\n\n\tif (parts.length === 0) {\n\t\treturn \"(no working memory yet)\";\n\t}\n\n\treturn parts.join(\"\\n\\n\");\n}\n\nfunction buildSystemPrompt(\n\tworkspacePath: string,\n\tchannelId: string,\n\tmemory: string,\n\tsandboxConfig: SandboxConfig,\n\tchannels: ChannelInfo[],\n\tusers: UserInfo[],\n): string {\n\tconst channelPath = `${workspacePath}/${channelId}`;\n\tconst isDocker = sandboxConfig.type === \"docker\";\n\n\t// Format channel mappings\n\tconst channelMappings =\n\t\tchannels.length > 0 ? channels.map((c) => `${c.id}\\t#${c.name}`).join(\"\\n\") : \"(no channels loaded)\";\n\n\t// Format user mappings\n\tconst userMappings =\n\t\tusers.length > 0 ? users.map((u) => `${u.id}\\t@${u.userName}\\t${u.displayName}`).join(\"\\n\") : \"(no users loaded)\";\n\n\tconst envDescription = isDocker\n\t\t? 
`You are running inside a Docker container (Alpine Linux).\n- Bash working directory: / (use cd or absolute paths)\n- Install tools with: apk add <package>\n- Your changes persist across sessions`\n\t\t: `You are running directly on the host machine.\n- Bash working directory: ${process.cwd()}\n- Be careful with system modifications`;\n\n\tconst currentDate = new Date().toISOString().split(\"T\")[0]; // YYYY-MM-DD\n\tconst currentDateTime = new Date().toISOString(); // Full ISO 8601\n\n\treturn `You are mom, a Slack bot assistant. Be concise. No emojis.\n\n## Context\n- Date: ${currentDate} (${currentDateTime})\n- You receive the last 50 conversation turns. If you need older context, search log.jsonl.\n\n## Slack Formatting (mrkdwn, NOT Markdown)\nBold: *text*, Italic: _text_, Code: \\`code\\`, Block: \\`\\`\\`code\\`\\`\\`, Links: <url|text>\nDo NOT use **double asterisks** or [markdown](links).\n\n## Slack IDs\nChannels: ${channelMappings}\n\nUsers: ${userMappings}\n\nWhen mentioning users, use <@username> format (e.g., <@mario>).\n\n## Environment\n${envDescription}\n\n## Workspace Layout\n${workspacePath}/\n├── MEMORY.md # Global memory (all channels)\n├── skills/ # Global CLI tools you create\n└── ${channelId}/ # This channel\n ├── MEMORY.md # Channel-specific memory\n ├── log.jsonl # Full message history\n ├── attachments/ # User-shared files\n ├── scratch/ # Your working directory\n └── skills/ # Channel-specific tools\n\n## Skills (Custom CLI Tools)\nYou can create reusable CLI tools for recurring tasks (email, APIs, data processing, etc.).\nStore in \\`${workspacePath}/skills/<name>/\\` or \\`${channelPath}/skills/<name>/\\`.\nEach skill needs a \\`SKILL.md\\` documenting usage. Read it before using a skill.\nList skills in global memory so you remember them.\n\n## Memory\nWrite to MEMORY.md files to persist context across conversations.\n- Global (${workspacePath}/MEMORY.md): skills, preferences, project info\n- Channel (${channelPath}/MEMORY.md): channel-specific decisions, ongoing work\nUpdate when you learn something important or when asked to remember something.\n\n### Current Memory\n${memory}\n\n## System Configuration Log\nMaintain ${workspacePath}/SYSTEM.md to log all environment modifications:\n- Installed packages (apk add, npm install, pip install)\n- Environment variables set\n- Config files modified (~/.gitconfig, cron jobs, etc.)\n- Skill dependencies installed\n\nUpdate this file whenever you modify the environment. On fresh container, read it first to restore your setup.\n\n## Log Queries (CRITICAL: limit output to avoid context overflow)\nFormat: \\`{\"date\":\"...\",\"ts\":\"...\",\"user\":\"...\",\"userName\":\"...\",\"text\":\"...\",\"isBot\":false}\\`\nThe log contains user messages AND your tool calls/results. Filter appropriately.\n${isDocker ? 
\"Install jq: apk add jq\" : \"\"}\n\n**Conversation only (excludes tool calls/results) - use for summaries:**\n\\`\\`\\`bash\n# Recent conversation (no [Tool] or [Tool Result] lines)\ngrep -v '\"text\":\"\\\\[Tool' log.jsonl | tail -30 | jq -c '{date: .date[0:19], user: (.userName // .user), text}'\n\n# Yesterday's conversation\ngrep '\"date\":\"2025-11-26' log.jsonl | grep -v '\"text\":\"\\\\[Tool' | jq -c '{date: .date[0:19], user: (.userName // .user), text}'\n\n# Specific user's messages\ngrep '\"userName\":\"mario\"' log.jsonl | grep -v '\"text\":\"\\\\[Tool' | tail -20 | jq -c '{date: .date[0:19], text}'\n\\`\\`\\`\n\n**Full details (includes tool calls) - use when you need technical context:**\n\\`\\`\\`bash\n# Raw recent entries\ntail -20 log.jsonl | jq -c '{date: .date[0:19], user: (.userName // .user), text}'\n\n# Count all messages\nwc -l log.jsonl\n\\`\\`\\`\n\n## Tools\n- bash: Run shell commands (primary tool). Install packages as needed.\n- read: Read files\n- write: Create/overwrite files \n- edit: Surgical file edits\n- attach: Share files to Slack\n\nEach tool requires a \"label\" parameter (shown to user).\n`;\n}\n\nfunction truncate(text: string, maxLen: number): string {\n\tif (text.length <= maxLen) return text;\n\treturn text.substring(0, maxLen - 3) + \"...\";\n}\n\nfunction extractToolResultText(result: unknown): string {\n\t// If it's already a string, return it\n\tif (typeof result === \"string\") {\n\t\treturn result;\n\t}\n\n\t// If it's an object with content array (tool result format)\n\tif (\n\t\tresult &&\n\t\ttypeof result === \"object\" &&\n\t\t\"content\" in result &&\n\t\tArray.isArray((result as { content: unknown }).content)\n\t) {\n\t\tconst content = (result as { content: Array<{ type: string; text?: string }> }).content;\n\t\tconst textParts: string[] = [];\n\t\tfor (const part of content) {\n\t\t\tif (part.type === \"text\" && part.text) {\n\t\t\t\ttextParts.push(part.text);\n\t\t\t}\n\t\t}\n\t\tif (textParts.length > 0) {\n\t\t\treturn textParts.join(\"\\n\");\n\t\t}\n\t}\n\n\t// Fallback to JSON\n\treturn JSON.stringify(result);\n}\n\nfunction formatToolArgsForSlack(_toolName: string, args: Record<string, unknown>): string {\n\tconst lines: string[] = [];\n\n\tfor (const [key, value] of Object.entries(args)) {\n\t\t// Skip the label - it's already shown\n\t\tif (key === \"label\") continue;\n\n\t\t// For read tool, format path with offset/limit\n\t\tif (key === \"path\" && typeof value === \"string\") {\n\t\t\tconst offset = args.offset as number | undefined;\n\t\t\tconst limit = args.limit as number | undefined;\n\t\t\tif (offset !== undefined && limit !== undefined) {\n\t\t\t\tlines.push(`${value}:${offset}-${offset + limit}`);\n\t\t\t} else {\n\t\t\t\tlines.push(value);\n\t\t\t}\n\t\t\tcontinue;\n\t\t}\n\n\t\t// Skip offset/limit since we already handled them\n\t\tif (key === \"offset\" || key === \"limit\") continue;\n\n\t\t// For other values, format them\n\t\tif (typeof value === \"string\") {\n\t\t\tlines.push(value);\n\t\t} else {\n\t\t\tlines.push(JSON.stringify(value));\n\t\t}\n\t}\n\n\treturn lines.join(\"\\n\");\n}\n\nexport function createAgentRunner(sandboxConfig: SandboxConfig): AgentRunner {\n\tlet agent: Agent | null = null;\n\tconst executor = createExecutor(sandboxConfig);\n\n\treturn {\n\t\tasync run(ctx: SlackContext, channelDir: string, store: ChannelStore): Promise<{ stopReason: string }> {\n\t\t\t// Ensure channel directory exists\n\t\t\tawait mkdir(channelDir, { recursive: true });\n\n\t\t\tconst channelId = 
ctx.message.channel;\n\t\t\tconst workspacePath = executor.getWorkspacePath(channelDir.replace(`/${channelId}`, \"\"));\n\t\t\tconst recentMessages = getRecentMessages(channelDir, 50);\n\n\t\t\tconst memory = getMemory(channelDir);\n\t\t\tconst systemPrompt = buildSystemPrompt(\n\t\t\t\tworkspacePath,\n\t\t\t\tchannelId,\n\t\t\t\tmemory,\n\t\t\t\tsandboxConfig,\n\t\t\t\tctx.channels,\n\t\t\t\tctx.users,\n\t\t\t);\n\n\t\t\t// Debug: log context sizes\n\t\t\tlog.logInfo(\n\t\t\t\t`Context sizes - system: ${systemPrompt.length} chars, messages: ${recentMessages.length} chars, memory: ${memory.length} chars`,\n\t\t\t);\n\t\t\tlog.logInfo(`Channels: ${ctx.channels.length}, Users: ${ctx.users.length}`);\n\n\t\t\t// Set up file upload function for the attach tool\n\t\t\t// For Docker, we need to translate paths back to host\n\t\t\tsetUploadFunction(async (filePath: string, title?: string) => {\n\t\t\t\tconst hostPath = translateToHostPath(filePath, channelDir, workspacePath, channelId);\n\t\t\t\tawait ctx.uploadFile(hostPath, title);\n\t\t\t});\n\n\t\t\t// Create tools with executor\n\t\t\tconst tools = createMomTools(executor);\n\n\t\t\t// Create ephemeral agent\n\t\t\tagent = new Agent({\n\t\t\t\tinitialState: {\n\t\t\t\t\tsystemPrompt,\n\t\t\t\t\tmodel,\n\t\t\t\t\tthinkingLevel: \"off\",\n\t\t\t\t\ttools,\n\t\t\t\t},\n\t\t\t\ttransport: new ProviderTransport({\n\t\t\t\t\tgetApiKey: async () => getAnthropicApiKey(),\n\t\t\t\t}),\n\t\t\t});\n\n\t\t\t// Create logging context\n\t\t\tconst logCtx = {\n\t\t\t\tchannelId: ctx.message.channel,\n\t\t\t\tuserName: ctx.message.userName,\n\t\t\t\tchannelName: ctx.channelName,\n\t\t\t};\n\n\t\t\t// Track pending tool calls to pair args with results and timing\n\t\t\tconst pendingTools = new Map<string, { toolName: string; args: unknown; startTime: number }>();\n\n\t\t\t// Track usage across all assistant messages in this run\n\t\t\tconst totalUsage = {\n\t\t\t\tinput: 0,\n\t\t\t\toutput: 0,\n\t\t\t\tcacheRead: 0,\n\t\t\t\tcacheWrite: 0,\n\t\t\t\tcost: {\n\t\t\t\t\tinput: 0,\n\t\t\t\t\toutput: 0,\n\t\t\t\t\tcacheRead: 0,\n\t\t\t\t\tcacheWrite: 0,\n\t\t\t\t\ttotal: 0,\n\t\t\t\t},\n\t\t\t};\n\n\t\t\t// Track stop reason\n\t\t\tlet stopReason = \"stop\";\n\n\t\t\t// Slack message limit is 40,000 characters - split into multiple messages if needed\n\t\t\tconst SLACK_MAX_LENGTH = 40000;\n\t\t\tconst splitForSlack = (text: string): string[] => {\n\t\t\t\tif (text.length <= SLACK_MAX_LENGTH) return [text];\n\t\t\t\tconst parts: string[] = [];\n\t\t\t\tlet remaining = text;\n\t\t\t\tlet partNum = 1;\n\t\t\t\twhile (remaining.length > 0) {\n\t\t\t\t\tconst chunk = remaining.substring(0, SLACK_MAX_LENGTH - 50);\n\t\t\t\t\tremaining = remaining.substring(SLACK_MAX_LENGTH - 50);\n\t\t\t\t\tconst suffix = remaining.length > 0 ? `\\n_(continued ${partNum}...)_` : \"\";\n\t\t\t\t\tparts.push(chunk + suffix);\n\t\t\t\t\tpartNum++;\n\t\t\t\t}\n\t\t\t\treturn parts;\n\t\t\t};\n\n\t\t\t// Promise queue to ensure ctx.respond/respondInThread calls execute in order\n\t\t\t// Handles errors gracefully by posting to thread instead of crashing\n\t\t\tconst queue = {\n\t\t\t\tchain: Promise.resolve(),\n\t\t\t\tenqueue(fn: () => Promise<void>, errorContext: string): void {\n\t\t\t\t\tthis.chain = this.chain.then(async () => {\n\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\tawait fn();\n\t\t\t\t\t\t} catch (err) {\n\t\t\t\t\t\t\tconst errMsg = err instanceof Error ? 
err.message : String(err);\n\t\t\t\t\t\t\tlog.logWarning(`Slack API error (${errorContext})`, errMsg);\n\t\t\t\t\t\t\t// Try to post error to thread, but don't crash if that fails too\n\t\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\t\tawait ctx.respondInThread(`_Error: ${errMsg}_`);\n\t\t\t\t\t\t\t} catch {\n\t\t\t\t\t\t\t\t// Ignore - we tried our best\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t});\n\t\t\t\t},\n\t\t\t\t// Enqueue a message that may need splitting\n\t\t\t\tenqueueMessage(text: string, target: \"main\" | \"thread\", errorContext: string, log = true): void {\n\t\t\t\t\tconst parts = splitForSlack(text);\n\t\t\t\t\tfor (const part of parts) {\n\t\t\t\t\t\tthis.enqueue(\n\t\t\t\t\t\t\t() => (target === \"main\" ? ctx.respond(part, log) : ctx.respondInThread(part)),\n\t\t\t\t\t\t\terrorContext,\n\t\t\t\t\t\t);\n\t\t\t\t\t}\n\t\t\t\t},\n\t\t\t\tflush(): Promise<void> {\n\t\t\t\t\treturn this.chain;\n\t\t\t\t},\n\t\t\t};\n\n\t\t\t// Subscribe to events\n\t\t\tagent.subscribe(async (event: AgentEvent) => {\n\t\t\t\tswitch (event.type) {\n\t\t\t\t\tcase \"tool_execution_start\": {\n\t\t\t\t\t\tconst args = event.args as { label?: string };\n\t\t\t\t\t\tconst label = args.label || event.toolName;\n\n\t\t\t\t\t\t// Store args to pair with result later\n\t\t\t\t\t\tpendingTools.set(event.toolCallId, {\n\t\t\t\t\t\t\ttoolName: event.toolName,\n\t\t\t\t\t\t\targs: event.args,\n\t\t\t\t\t\t\tstartTime: Date.now(),\n\t\t\t\t\t\t});\n\n\t\t\t\t\t\t// Log to console\n\t\t\t\t\t\tlog.logToolStart(logCtx, event.toolName, label, event.args as Record<string, unknown>);\n\n\t\t\t\t\t\t// Log to jsonl\n\t\t\t\t\t\tawait store.logMessage(ctx.message.channel, {\n\t\t\t\t\t\t\tdate: new Date().toISOString(),\n\t\t\t\t\t\t\tts: toSlackTs(),\n\t\t\t\t\t\t\tuser: \"bot\",\n\t\t\t\t\t\t\ttext: `[Tool] ${event.toolName}: ${JSON.stringify(event.args)}`,\n\t\t\t\t\t\t\tattachments: [],\n\t\t\t\t\t\t\tisBot: true,\n\t\t\t\t\t\t});\n\n\t\t\t\t\t\t// Show label in main message only\n\t\t\t\t\t\tqueue.enqueue(() => ctx.respond(`_→ ${label}_`, false), \"tool label\");\n\t\t\t\t\t\tbreak;\n\t\t\t\t\t}\n\n\t\t\t\t\tcase \"tool_execution_end\": {\n\t\t\t\t\t\tconst resultStr = extractToolResultText(event.result);\n\t\t\t\t\t\tconst pending = pendingTools.get(event.toolCallId);\n\t\t\t\t\t\tpendingTools.delete(event.toolCallId);\n\n\t\t\t\t\t\tconst durationMs = pending ? Date.now() - pending.startTime : 0;\n\n\t\t\t\t\t\t// Log to console\n\t\t\t\t\t\tif (event.isError) {\n\t\t\t\t\t\t\tlog.logToolError(logCtx, event.toolName, durationMs, resultStr);\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tlog.logToolSuccess(logCtx, event.toolName, durationMs, resultStr);\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\t// Log to jsonl\n\t\t\t\t\t\tawait store.logMessage(ctx.message.channel, {\n\t\t\t\t\t\t\tdate: new Date().toISOString(),\n\t\t\t\t\t\t\tts: toSlackTs(),\n\t\t\t\t\t\t\tuser: \"bot\",\n\t\t\t\t\t\t\ttext: `[Tool Result] ${event.toolName}: ${event.isError ? \"ERROR: \" : \"\"}${resultStr}`,\n\t\t\t\t\t\t\tattachments: [],\n\t\t\t\t\t\t\tisBot: true,\n\t\t\t\t\t\t});\n\n\t\t\t\t\t\t// Post args + result together in thread\n\t\t\t\t\t\tconst label = pending?.args ? (pending.args as { label?: string }).label : undefined;\n\t\t\t\t\t\tconst argsFormatted = pending\n\t\t\t\t\t\t\t? formatToolArgsForSlack(event.toolName, pending.args as Record<string, unknown>)\n\t\t\t\t\t\t\t: \"(args not found)\";\n\t\t\t\t\t\tconst duration = (durationMs / 1000).toFixed(1);\n\t\t\t\t\t\tlet threadMessage = `*${event.isError ? 
\"✗\" : \"✓\"} ${event.toolName}*`;\n\t\t\t\t\t\tif (label) {\n\t\t\t\t\t\t\tthreadMessage += `: ${label}`;\n\t\t\t\t\t\t}\n\t\t\t\t\t\tthreadMessage += ` (${duration}s)\\n`;\n\n\t\t\t\t\t\tif (argsFormatted) {\n\t\t\t\t\t\t\tthreadMessage += \"```\\n\" + argsFormatted + \"\\n```\\n\";\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tthreadMessage += \"*Result:*\\n```\\n\" + resultStr + \"\\n```\";\n\n\t\t\t\t\t\tqueue.enqueueMessage(threadMessage, \"thread\", \"tool result thread\", false);\n\n\t\t\t\t\t\t// Show brief error in main message if failed\n\t\t\t\t\t\tif (event.isError) {\n\t\t\t\t\t\t\tqueue.enqueue(() => ctx.respond(`_Error: ${truncate(resultStr, 200)}_`, false), \"tool error\");\n\t\t\t\t\t\t}\n\t\t\t\t\t\tbreak;\n\t\t\t\t\t}\n\n\t\t\t\t\tcase \"message_update\": {\n\t\t\t\t\t\t// No longer stream to console - just track that we're streaming\n\t\t\t\t\t\tbreak;\n\t\t\t\t\t}\n\n\t\t\t\t\tcase \"message_start\":\n\t\t\t\t\t\tif (event.message.role === \"assistant\") {\n\t\t\t\t\t\t\tlog.logResponseStart(logCtx);\n\t\t\t\t\t\t}\n\t\t\t\t\t\tbreak;\n\n\t\t\t\t\tcase \"message_end\":\n\t\t\t\t\t\tif (event.message.role === \"assistant\") {\n\t\t\t\t\t\t\tconst assistantMsg = event.message as any; // AssistantMessage type\n\n\t\t\t\t\t\t\t// Track stop reason\n\t\t\t\t\t\t\tif (assistantMsg.stopReason) {\n\t\t\t\t\t\t\t\tstopReason = assistantMsg.stopReason;\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// Accumulate usage\n\t\t\t\t\t\t\tif (assistantMsg.usage) {\n\t\t\t\t\t\t\t\ttotalUsage.input += assistantMsg.usage.input;\n\t\t\t\t\t\t\t\ttotalUsage.output += assistantMsg.usage.output;\n\t\t\t\t\t\t\t\ttotalUsage.cacheRead += assistantMsg.usage.cacheRead;\n\t\t\t\t\t\t\t\ttotalUsage.cacheWrite += assistantMsg.usage.cacheWrite;\n\t\t\t\t\t\t\t\ttotalUsage.cost.input += assistantMsg.usage.cost.input;\n\t\t\t\t\t\t\t\ttotalUsage.cost.output += assistantMsg.usage.cost.output;\n\t\t\t\t\t\t\t\ttotalUsage.cost.cacheRead += assistantMsg.usage.cost.cacheRead;\n\t\t\t\t\t\t\t\ttotalUsage.cost.cacheWrite += assistantMsg.usage.cost.cacheWrite;\n\t\t\t\t\t\t\t\ttotalUsage.cost.total += assistantMsg.usage.cost.total;\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// Extract thinking and text from assistant message\n\t\t\t\t\t\t\tconst content = event.message.content;\n\t\t\t\t\t\t\tconst thinkingParts: string[] = [];\n\t\t\t\t\t\t\tconst textParts: string[] = [];\n\t\t\t\t\t\t\tfor (const part of content) {\n\t\t\t\t\t\t\t\tif (part.type === \"thinking\") {\n\t\t\t\t\t\t\t\t\tthinkingParts.push(part.thinking);\n\t\t\t\t\t\t\t\t} else if (part.type === \"text\") {\n\t\t\t\t\t\t\t\t\ttextParts.push(part.text);\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\tconst text = textParts.join(\"\\n\");\n\n\t\t\t\t\t\t\t// Post thinking to main message and thread\n\t\t\t\t\t\t\tfor (const thinking of thinkingParts) {\n\t\t\t\t\t\t\t\tlog.logThinking(logCtx, thinking);\n\t\t\t\t\t\t\t\tqueue.enqueueMessage(`_${thinking}_`, \"main\", \"thinking main\");\n\t\t\t\t\t\t\t\tqueue.enqueueMessage(`_${thinking}_`, \"thread\", \"thinking thread\", false);\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// Post text to main message and thread\n\t\t\t\t\t\t\tif (text.trim()) {\n\t\t\t\t\t\t\t\tlog.logResponse(logCtx, text);\n\t\t\t\t\t\t\t\tqueue.enqueueMessage(text, \"main\", \"response main\");\n\t\t\t\t\t\t\t\tqueue.enqueueMessage(text, \"thread\", \"response thread\", false);\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t\tbreak;\n\t\t\t\t}\n\t\t\t});\n\n\t\t\t// Run the agent with user's message\n\t\t\t// Prepend recent messages to the user prompt (not system 
prompt) for better caching\n\t\t\t// The current message is already the last entry in recentMessages\n\t\t\tconst userPrompt =\n\t\t\t\t`Conversation history (last 50 turns). Respond to the last message.\\n` +\n\t\t\t\t`Format: date TAB user TAB text TAB attachments\\n\\n` +\n\t\t\t\trecentMessages;\n\t\t\t// Debug: write full context to file\n\t\t\tconst toolDefs = tools.map((t) => ({ name: t.name, description: t.description, parameters: t.parameters }));\n\t\t\tconst debugPrompt =\n\t\t\t\t`=== SYSTEM PROMPT (${systemPrompt.length} chars) ===\\n\\n${systemPrompt}\\n\\n` +\n\t\t\t\t`=== TOOL DEFINITIONS (${JSON.stringify(toolDefs).length} chars) ===\\n\\n${JSON.stringify(toolDefs, null, 2)}\\n\\n` +\n\t\t\t\t`=== USER PROMPT (${userPrompt.length} chars) ===\\n\\n${userPrompt}`;\n\t\t\tawait writeFile(join(channelDir, \"last_prompt.txt\"), debugPrompt, \"utf-8\");\n\n\t\t\tawait agent.prompt(userPrompt);\n\n\t\t\t// Wait for all queued respond calls to complete\n\t\t\tawait queue.flush();\n\n\t\t\t// Get final assistant message text from agent state and replace main message\n\t\t\tconst messages = agent.state.messages;\n\t\t\tconst lastAssistant = messages.filter((m) => m.role === \"assistant\").pop();\n\t\t\tconst finalText =\n\t\t\t\tlastAssistant?.content\n\t\t\t\t\t.filter((c): c is { type: \"text\"; text: string } => c.type === \"text\")\n\t\t\t\t\t.map((c) => c.text)\n\t\t\t\t\t.join(\"\\n\") || \"\";\n\t\t\tif (finalText.trim()) {\n\t\t\t\ttry {\n\t\t\t\t\t// For the main message, truncate if too long (full text is in thread)\n\t\t\t\t\tconst mainText =\n\t\t\t\t\t\tfinalText.length > SLACK_MAX_LENGTH\n\t\t\t\t\t\t\t? finalText.substring(0, SLACK_MAX_LENGTH - 50) + \"\\n\\n_(see thread for full response)_\"\n\t\t\t\t\t\t\t: finalText;\n\t\t\t\t\tawait ctx.replaceMessage(mainText);\n\t\t\t\t} catch (err) {\n\t\t\t\t\tconst errMsg = err instanceof Error ? err.message : String(err);\n\t\t\t\t\tlog.logWarning(\"Failed to replace message with final text\", errMsg);\n\t\t\t\t}\n\t\t\t}\n\n\t\t\t// Log usage summary if there was any usage\n\t\t\tif (totalUsage.cost.total > 0) {\n\t\t\t\tconst summary = log.logUsageSummary(logCtx, totalUsage);\n\t\t\t\tqueue.enqueue(() => ctx.respondInThread(summary), \"usage summary\");\n\t\t\t\tawait queue.flush();\n\t\t\t}\n\n\t\t\treturn { stopReason };\n\t\t},\n\n\t\tabort(): void {\n\t\t\tagent?.abort();\n\t\t},\n\t};\n}\n\n/**\n * Translate container path back to host path for file operations\n */\nfunction translateToHostPath(\n\tcontainerPath: string,\n\tchannelDir: string,\n\tworkspacePath: string,\n\tchannelId: string,\n): string {\n\tif (workspacePath === \"/workspace\") {\n\t\t// Docker mode - translate /workspace/channelId/... to host path\n\t\tconst prefix = `/workspace/${channelId}/`;\n\t\tif (containerPath.startsWith(prefix)) {\n\t\t\treturn join(channelDir, containerPath.slice(prefix.length));\n\t\t}\n\t\t// Maybe it's just /workspace/...\n\t\tif (containerPath.startsWith(\"/workspace/\")) {\n\t\t\treturn join(channelDir, \"..\", containerPath.slice(\"/workspace/\".length));\n\t\t}\n\t}\n\t// Host mode or already a host path\n\treturn containerPath;\n}\n"]}
+
{"version":3,"file":"agent.d.ts","sourceRoot":"","sources":["../src/agent.ts"],"names":[],"mappings":"AAQA,OAAO,EAAkB,KAAK,aAAa,EAAE,MAAM,cAAc,CAAC;AAClE,OAAO,KAAK,EAAe,YAAY,EAAY,MAAM,YAAY,CAAC;AACtE,OAAO,KAAK,EAAE,YAAY,EAAE,MAAM,YAAY,CAAC;AA0B/C,MAAM,WAAW,cAAc;IAC9B,QAAQ,EAAE,MAAM,CAAC;IACjB,IAAI,EAAE,MAAM,CAAC;IACb,WAAW,EAAE;QAAE,KAAK,EAAE,MAAM,CAAA;KAAE,EAAE,CAAC;IACjC,SAAS,EAAE,MAAM,CAAC;CAClB;AAED,MAAM,WAAW,WAAW;IAC3B,GAAG,CAAC,GAAG,EAAE,YAAY,EAAE,KAAK,EAAE,YAAY,EAAE,eAAe,CAAC,EAAE,cAAc,EAAE,GAAG,OAAO,CAAC;QAAE,UAAU,EAAE,MAAM,CAAA;KAAE,CAAC,CAAC;IACjH,KAAK,IAAI,IAAI,CAAC;CACd;AA4ND;;;GAGG;AACH,wBAAgB,iBAAiB,CAAC,aAAa,EAAE,aAAa,EAAE,SAAS,EAAE,MAAM,EAAE,UAAU,EAAE,MAAM,GAAG,WAAW,CAOlH","sourcesContent":["import { Agent, type AgentEvent, ProviderTransport } from \"@mariozechner/pi-agent-core\";\nimport { getModel } from \"@mariozechner/pi-ai\";\nimport { AgentSession, messageTransformer } from \"@mariozechner/pi-coding-agent\";\nimport { existsSync, readFileSync } from \"fs\";\nimport { mkdir, writeFile } from \"fs/promises\";\nimport { join } from \"path\";\nimport { MomSessionManager, MomSettingsManager } from \"./context.js\";\nimport * as log from \"./log.js\";\nimport { createExecutor, type SandboxConfig } from \"./sandbox.js\";\nimport type { ChannelInfo, SlackContext, UserInfo } from \"./slack.js\";\nimport type { ChannelStore } from \"./store.js\";\nimport { createMomTools, setUploadFunction } from \"./tools/index.js\";\n\n// Hardcoded model for now - TODO: make configurable (issue #63)\nconst model = getModel(\"anthropic\", \"claude-sonnet-4-5\");\n\n/**\n * Convert Date.now() to Slack timestamp format (seconds.microseconds)\n * Uses a monotonic counter to ensure ordering even within the same millisecond\n */\nlet lastTsMs = 0;\nlet tsCounter = 0;\n\nfunction toSlackTs(): string {\n\tconst now = Date.now();\n\tif (now === lastTsMs) {\n\t\ttsCounter++;\n\t} else {\n\t\tlastTsMs = now;\n\t\ttsCounter = 0;\n\t}\n\tconst seconds = Math.floor(now / 1000);\n\tconst micros = (now % 1000) * 1000 + tsCounter;\n\treturn `${seconds}.${micros.toString().padStart(6, \"0\")}`;\n}\n\nexport interface PendingMessage {\n\tuserName: string;\n\ttext: string;\n\tattachments: { local: string }[];\n\ttimestamp: number;\n}\n\nexport interface AgentRunner {\n\trun(ctx: SlackContext, store: ChannelStore, pendingMessages?: PendingMessage[]): Promise<{ stopReason: string }>;\n\tabort(): void;\n}\n\nfunction getAnthropicApiKey(): string {\n\tconst key = process.env.ANTHROPIC_OAUTH_TOKEN || process.env.ANTHROPIC_API_KEY;\n\tif (!key) {\n\t\tthrow new Error(\"ANTHROPIC_OAUTH_TOKEN or ANTHROPIC_API_KEY must be set\");\n\t}\n\treturn key;\n}\n\nfunction getMemory(channelDir: string): string {\n\tconst parts: string[] = [];\n\n\t// Read workspace-level memory (shared across all channels)\n\tconst workspaceMemoryPath = join(channelDir, \"..\", \"MEMORY.md\");\n\tif (existsSync(workspaceMemoryPath)) {\n\t\ttry {\n\t\t\tconst content = readFileSync(workspaceMemoryPath, \"utf-8\").trim();\n\t\t\tif (content) {\n\t\t\t\tparts.push(\"### Global Workspace Memory\\n\" + content);\n\t\t\t}\n\t\t} catch (error) {\n\t\t\tlog.logWarning(\"Failed to read workspace memory\", `${workspaceMemoryPath}: ${error}`);\n\t\t}\n\t}\n\n\t// Read channel-specific memory\n\tconst channelMemoryPath = join(channelDir, \"MEMORY.md\");\n\tif (existsSync(channelMemoryPath)) {\n\t\ttry {\n\t\t\tconst content = readFileSync(channelMemoryPath, \"utf-8\").trim();\n\t\t\tif (content) {\n\t\t\t\tparts.push(\"### Channel-Specific Memory\\n\" + 
content);\n\t\t\t}\n\t\t} catch (error) {\n\t\t\tlog.logWarning(\"Failed to read channel memory\", `${channelMemoryPath}: ${error}`);\n\t\t}\n\t}\n\n\tif (parts.length === 0) {\n\t\treturn \"(no working memory yet)\";\n\t}\n\n\treturn parts.join(\"\\n\\n\");\n}\n\nfunction buildSystemPrompt(\n\tworkspacePath: string,\n\tchannelId: string,\n\tmemory: string,\n\tsandboxConfig: SandboxConfig,\n\tchannels: ChannelInfo[],\n\tusers: UserInfo[],\n): string {\n\tconst channelPath = `${workspacePath}/${channelId}`;\n\tconst isDocker = sandboxConfig.type === \"docker\";\n\n\t// Format channel mappings\n\tconst channelMappings =\n\t\tchannels.length > 0 ? channels.map((c) => `${c.id}\\t#${c.name}`).join(\"\\n\") : \"(no channels loaded)\";\n\n\t// Format user mappings\n\tconst userMappings =\n\t\tusers.length > 0 ? users.map((u) => `${u.id}\\t@${u.userName}\\t${u.displayName}`).join(\"\\n\") : \"(no users loaded)\";\n\n\tconst envDescription = isDocker\n\t\t? `You are running inside a Docker container (Alpine Linux).\n- Bash working directory: / (use cd or absolute paths)\n- Install tools with: apk add <package>\n- Your changes persist across sessions`\n\t\t: `You are running directly on the host machine.\n- Bash working directory: ${process.cwd()}\n- Be careful with system modifications`;\n\n\treturn `You are mom, a Slack bot assistant. Be concise. No emojis.\n\n## Context\n- For current date/time, use: date\n- You have access to previous conversation context including tool results from prior turns.\n- For older history beyond your context, search log.jsonl (contains user messages and your final responses, but not tool results).\n\n## Slack Formatting (mrkdwn, NOT Markdown)\nBold: *text*, Italic: _text_, Code: \\`code\\`, Block: \\`\\`\\`code\\`\\`\\`, Links: <url|text>\nDo NOT use **double asterisks** or [markdown](links).\n\n## Slack IDs\nChannels: ${channelMappings}\n\nUsers: ${userMappings}\n\nWhen mentioning users, use <@username> format (e.g., <@mario>).\n\n## Environment\n${envDescription}\n\n## Workspace Layout\n${workspacePath}/\n├── MEMORY.md # Global memory (all channels)\n├── skills/ # Global CLI tools you create\n└── ${channelId}/ # This channel\n ├── MEMORY.md # Channel-specific memory\n ├── log.jsonl # Message history (no tool results)\n ├── attachments/ # User-shared files\n ├── scratch/ # Your working directory\n └── skills/ # Channel-specific tools\n\n## Skills (Custom CLI Tools)\nYou can create reusable CLI tools for recurring tasks (email, APIs, data processing, etc.).\nStore in \\`${workspacePath}/skills/<name>/\\` or \\`${channelPath}/skills/<name>/\\`.\nEach skill needs a \\`SKILL.md\\` documenting usage. Read it before using a skill.\nList skills in global memory so you remember them.\n\n## Memory\nWrite to MEMORY.md files to persist context across conversations.\n- Global (${workspacePath}/MEMORY.md): skills, preferences, project info\n- Channel (${channelPath}/MEMORY.md): channel-specific decisions, ongoing work\nUpdate when you learn something important or when asked to remember something.\n\n### Current Memory\n${memory}\n\n## System Configuration Log\nMaintain ${workspacePath}/SYSTEM.md to log all environment modifications:\n- Installed packages (apk add, npm install, pip install)\n- Environment variables set\n- Config files modified (~/.gitconfig, cron jobs, etc.)\n- Skill dependencies installed\n\nUpdate this file whenever you modify the environment. 
On fresh container, read it first to restore your setup.\n\n## Log Queries (for older history)\nFormat: \\`{\"date\":\"...\",\"ts\":\"...\",\"user\":\"...\",\"userName\":\"...\",\"text\":\"...\",\"isBot\":false}\\`\nThe log contains user messages and your final responses (not tool calls/results).\n${isDocker ? \"Install jq: apk add jq\" : \"\"}\n\n\\`\\`\\`bash\n# Recent messages\ntail -30 log.jsonl | jq -c '{date: .date[0:19], user: (.userName // .user), text}'\n\n# Search for specific topic\ngrep -i \"topic\" log.jsonl | jq -c '{date: .date[0:19], user: (.userName // .user), text}'\n\n# Messages from specific user\ngrep '\"userName\":\"mario\"' log.jsonl | tail -20 | jq -c '{date: .date[0:19], text}'\n\\`\\`\\`\n\n## Tools\n- bash: Run shell commands (primary tool). Install packages as needed.\n- read: Read files\n- write: Create/overwrite files\n- edit: Surgical file edits\n- attach: Share files to Slack\n\nEach tool requires a \"label\" parameter (shown to user).\n`;\n}\n\nfunction truncate(text: string, maxLen: number): string {\n\tif (text.length <= maxLen) return text;\n\treturn text.substring(0, maxLen - 3) + \"...\";\n}\n\nfunction extractToolResultText(result: unknown): string {\n\tif (typeof result === \"string\") {\n\t\treturn result;\n\t}\n\n\tif (\n\t\tresult &&\n\t\ttypeof result === \"object\" &&\n\t\t\"content\" in result &&\n\t\tArray.isArray((result as { content: unknown }).content)\n\t) {\n\t\tconst content = (result as { content: Array<{ type: string; text?: string }> }).content;\n\t\tconst textParts: string[] = [];\n\t\tfor (const part of content) {\n\t\t\tif (part.type === \"text\" && part.text) {\n\t\t\t\ttextParts.push(part.text);\n\t\t\t}\n\t\t}\n\t\tif (textParts.length > 0) {\n\t\t\treturn textParts.join(\"\\n\");\n\t\t}\n\t}\n\n\treturn JSON.stringify(result);\n}\n\nfunction formatToolArgsForSlack(_toolName: string, args: Record<string, unknown>): string {\n\tconst lines: string[] = [];\n\n\tfor (const [key, value] of Object.entries(args)) {\n\t\tif (key === \"label\") continue;\n\n\t\tif (key === \"path\" && typeof value === \"string\") {\n\t\t\tconst offset = args.offset as number | undefined;\n\t\t\tconst limit = args.limit as number | undefined;\n\t\t\tif (offset !== undefined && limit !== undefined) {\n\t\t\t\tlines.push(`${value}:${offset}-${offset + limit}`);\n\t\t\t} else {\n\t\t\t\tlines.push(value);\n\t\t\t}\n\t\t\tcontinue;\n\t\t}\n\n\t\tif (key === \"offset\" || key === \"limit\") continue;\n\n\t\tif (typeof value === \"string\") {\n\t\t\tlines.push(value);\n\t\t} else {\n\t\t\tlines.push(JSON.stringify(value));\n\t\t}\n\t}\n\n\treturn lines.join(\"\\n\");\n}\n\n// Cache runners per channel\nconst channelRunners = new Map<string, AgentRunner>();\n\n/**\n * Get or create an AgentRunner for a channel.\n * Runners are cached - one per channel, persistent across messages.\n */\nexport function getOrCreateRunner(sandboxConfig: SandboxConfig, channelId: string, channelDir: string): AgentRunner {\n\tconst existing = channelRunners.get(channelId);\n\tif (existing) return existing;\n\n\tconst runner = createRunner(sandboxConfig, channelId, channelDir);\n\tchannelRunners.set(channelId, runner);\n\treturn runner;\n}\n\n/**\n * Create a new AgentRunner for a channel.\n * Sets up the session and subscribes to events once.\n */\nfunction createRunner(sandboxConfig: SandboxConfig, channelId: string, channelDir: string): AgentRunner {\n\tconst executor = createExecutor(sandboxConfig);\n\tconst workspacePath = 
executor.getWorkspacePath(channelDir.replace(`/${channelId}`, \"\"));\n\n\t// Create tools\n\tconst tools = createMomTools(executor);\n\n\t// Initial system prompt (will be updated each run with fresh memory/channels/users)\n\tconst memory = getMemory(channelDir);\n\tconst systemPrompt = buildSystemPrompt(workspacePath, channelId, memory, sandboxConfig, [], []);\n\n\t// Create session manager and settings manager\n\t// Pass model info so new sessions get a header written immediately\n\tconst sessionManager = new MomSessionManager(channelDir, {\n\t\tprovider: model.provider,\n\t\tid: model.id,\n\t\tthinkingLevel: \"off\",\n\t});\n\tconst settingsManager = new MomSettingsManager(join(channelDir, \"..\"));\n\n\t// Create agent\n\tconst agent = new Agent({\n\t\tinitialState: {\n\t\t\tsystemPrompt,\n\t\t\tmodel,\n\t\t\tthinkingLevel: \"off\",\n\t\t\ttools,\n\t\t},\n\t\tmessageTransformer,\n\t\ttransport: new ProviderTransport({\n\t\t\tgetApiKey: async () => getAnthropicApiKey(),\n\t\t}),\n\t});\n\n\t// Load existing messages\n\tconst loadedSession = sessionManager.loadSession();\n\tif (loadedSession.messages.length > 0) {\n\t\tagent.replaceMessages(loadedSession.messages);\n\t\tlog.logInfo(`[${channelId}] Loaded ${loadedSession.messages.length} messages from context.jsonl`);\n\t}\n\n\t// Create AgentSession wrapper\n\tconst session = new AgentSession({\n\t\tagent,\n\t\tsessionManager: sessionManager as any,\n\t\tsettingsManager: settingsManager as any,\n\t});\n\n\t// Mutable per-run state - event handler references this\n\tconst runState = {\n\t\tctx: null as SlackContext | null,\n\t\tlogCtx: null as { channelId: string; userName?: string; channelName?: string } | null,\n\t\tqueue: null as {\n\t\t\tenqueue(fn: () => Promise<void>, errorContext: string): void;\n\t\t\tenqueueMessage(text: string, target: \"main\" | \"thread\", errorContext: string, doLog?: boolean): void;\n\t\t} | null,\n\t\tpendingTools: new Map<string, { toolName: string; args: unknown; startTime: number }>(),\n\t\ttotalUsage: {\n\t\t\tinput: 0,\n\t\t\toutput: 0,\n\t\t\tcacheRead: 0,\n\t\t\tcacheWrite: 0,\n\t\t\tcost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0, total: 0 },\n\t\t},\n\t\tstopReason: \"stop\",\n\t};\n\n\t// Subscribe to events ONCE\n\tsession.subscribe(async (event) => {\n\t\t// Skip if no active run\n\t\tif (!runState.ctx || !runState.logCtx || !runState.queue) return;\n\n\t\tconst { ctx, logCtx, queue, pendingTools } = runState;\n\n\t\tif (event.type === \"tool_execution_start\") {\n\t\t\tconst agentEvent = event as AgentEvent & { type: \"tool_execution_start\" };\n\t\t\tconst args = agentEvent.args as { label?: string };\n\t\t\tconst label = args.label || agentEvent.toolName;\n\n\t\t\tpendingTools.set(agentEvent.toolCallId, {\n\t\t\t\ttoolName: agentEvent.toolName,\n\t\t\t\targs: agentEvent.args,\n\t\t\t\tstartTime: Date.now(),\n\t\t\t});\n\n\t\t\tlog.logToolStart(logCtx, agentEvent.toolName, label, agentEvent.args as Record<string, unknown>);\n\t\t\tqueue.enqueue(() => ctx.respond(`_→ ${label}_`, false), \"tool label\");\n\t\t} else if (event.type === \"tool_execution_end\") {\n\t\t\tconst agentEvent = event as AgentEvent & { type: \"tool_execution_end\" };\n\t\t\tconst resultStr = extractToolResultText(agentEvent.result);\n\t\t\tconst pending = pendingTools.get(agentEvent.toolCallId);\n\t\t\tpendingTools.delete(agentEvent.toolCallId);\n\n\t\t\tconst durationMs = pending ? 
Date.now() - pending.startTime : 0;\n\n\t\t\tif (agentEvent.isError) {\n\t\t\t\tlog.logToolError(logCtx, agentEvent.toolName, durationMs, resultStr);\n\t\t\t} else {\n\t\t\t\tlog.logToolSuccess(logCtx, agentEvent.toolName, durationMs, resultStr);\n\t\t\t}\n\n\t\t\t// Post args + result to thread\n\t\t\tconst label = pending?.args ? (pending.args as { label?: string }).label : undefined;\n\t\t\tconst argsFormatted = pending\n\t\t\t\t? formatToolArgsForSlack(agentEvent.toolName, pending.args as Record<string, unknown>)\n\t\t\t\t: \"(args not found)\";\n\t\t\tconst duration = (durationMs / 1000).toFixed(1);\n\t\t\tlet threadMessage = `*${agentEvent.isError ? \"✗\" : \"✓\"} ${agentEvent.toolName}*`;\n\t\t\tif (label) threadMessage += `: ${label}`;\n\t\t\tthreadMessage += ` (${duration}s)\\n`;\n\t\t\tif (argsFormatted) threadMessage += \"```\\n\" + argsFormatted + \"\\n```\\n\";\n\t\t\tthreadMessage += \"*Result:*\\n```\\n\" + resultStr + \"\\n```\";\n\n\t\t\tqueue.enqueueMessage(threadMessage, \"thread\", \"tool result thread\", false);\n\n\t\t\tif (agentEvent.isError) {\n\t\t\t\tqueue.enqueue(() => ctx.respond(`_Error: ${truncate(resultStr, 200)}_`, false), \"tool error\");\n\t\t\t}\n\t\t} else if (event.type === \"message_start\") {\n\t\t\tconst agentEvent = event as AgentEvent & { type: \"message_start\" };\n\t\t\tif (agentEvent.message.role === \"assistant\") {\n\t\t\t\tlog.logResponseStart(logCtx);\n\t\t\t}\n\t\t} else if (event.type === \"message_end\") {\n\t\t\tconst agentEvent = event as AgentEvent & { type: \"message_end\" };\n\t\t\tif (agentEvent.message.role === \"assistant\") {\n\t\t\t\tconst assistantMsg = agentEvent.message as any;\n\n\t\t\t\tif (assistantMsg.stopReason) {\n\t\t\t\t\trunState.stopReason = assistantMsg.stopReason;\n\t\t\t\t}\n\n\t\t\t\tif (assistantMsg.usage) {\n\t\t\t\t\trunState.totalUsage.input += assistantMsg.usage.input;\n\t\t\t\t\trunState.totalUsage.output += assistantMsg.usage.output;\n\t\t\t\t\trunState.totalUsage.cacheRead += assistantMsg.usage.cacheRead;\n\t\t\t\t\trunState.totalUsage.cacheWrite += assistantMsg.usage.cacheWrite;\n\t\t\t\t\trunState.totalUsage.cost.input += assistantMsg.usage.cost.input;\n\t\t\t\t\trunState.totalUsage.cost.output += assistantMsg.usage.cost.output;\n\t\t\t\t\trunState.totalUsage.cost.cacheRead += assistantMsg.usage.cost.cacheRead;\n\t\t\t\t\trunState.totalUsage.cost.cacheWrite += assistantMsg.usage.cost.cacheWrite;\n\t\t\t\t\trunState.totalUsage.cost.total += assistantMsg.usage.cost.total;\n\t\t\t\t}\n\n\t\t\t\tconst content = agentEvent.message.content;\n\t\t\t\tconst thinkingParts: string[] = [];\n\t\t\t\tconst textParts: string[] = [];\n\t\t\t\tfor (const part of content) {\n\t\t\t\t\tif (part.type === \"thinking\") {\n\t\t\t\t\t\tthinkingParts.push((part as any).thinking);\n\t\t\t\t\t} else if (part.type === \"text\") {\n\t\t\t\t\t\ttextParts.push((part as any).text);\n\t\t\t\t\t}\n\t\t\t\t}\n\n\t\t\t\tconst text = textParts.join(\"\\n\");\n\n\t\t\t\tfor (const thinking of thinkingParts) {\n\t\t\t\t\tlog.logThinking(logCtx, thinking);\n\t\t\t\t\tqueue.enqueueMessage(`_${thinking}_`, \"main\", \"thinking main\");\n\t\t\t\t\tqueue.enqueueMessage(`_${thinking}_`, \"thread\", \"thinking thread\", false);\n\t\t\t\t}\n\n\t\t\t\tif (text.trim()) {\n\t\t\t\t\tlog.logResponse(logCtx, text);\n\t\t\t\t\tqueue.enqueueMessage(text, \"main\", \"response main\");\n\t\t\t\t\tqueue.enqueueMessage(text, \"thread\", \"response thread\", false);\n\t\t\t\t}\n\t\t\t}\n\t\t} else if (event.type === \"auto_compaction_start\") 
{\n\t\t\tlog.logInfo(`Auto-compaction started (reason: ${(event as any).reason})`);\n\t\t\tqueue.enqueue(() => ctx.respond(\"_Compacting context..._\", false), \"compaction start\");\n\t\t} else if (event.type === \"auto_compaction_end\") {\n\t\t\tconst compEvent = event as any;\n\t\t\tif (compEvent.result) {\n\t\t\t\tlog.logInfo(`Auto-compaction complete: ${compEvent.result.tokensBefore} tokens compacted`);\n\t\t\t} else if (compEvent.aborted) {\n\t\t\t\tlog.logInfo(\"Auto-compaction aborted\");\n\t\t\t}\n\t\t} else if (event.type === \"auto_retry_start\") {\n\t\t\tconst retryEvent = event as any;\n\t\t\tlog.logWarning(`Retrying (${retryEvent.attempt}/${retryEvent.maxAttempts})`, retryEvent.errorMessage);\n\t\t\tqueue.enqueue(\n\t\t\t\t() => ctx.respond(`_Retrying (${retryEvent.attempt}/${retryEvent.maxAttempts})..._`, false),\n\t\t\t\t\"retry\",\n\t\t\t);\n\t\t}\n\t});\n\n\t// Slack message limit\n\tconst SLACK_MAX_LENGTH = 40000;\n\tconst splitForSlack = (text: string): string[] => {\n\t\tif (text.length <= SLACK_MAX_LENGTH) return [text];\n\t\tconst parts: string[] = [];\n\t\tlet remaining = text;\n\t\tlet partNum = 1;\n\t\twhile (remaining.length > 0) {\n\t\t\tconst chunk = remaining.substring(0, SLACK_MAX_LENGTH - 50);\n\t\t\tremaining = remaining.substring(SLACK_MAX_LENGTH - 50);\n\t\t\tconst suffix = remaining.length > 0 ? `\\n_(continued ${partNum}...)_` : \"\";\n\t\t\tparts.push(chunk + suffix);\n\t\t\tpartNum++;\n\t\t}\n\t\treturn parts;\n\t};\n\n\treturn {\n\t\tasync run(\n\t\t\tctx: SlackContext,\n\t\t\t_store: ChannelStore,\n\t\t\t_pendingMessages?: PendingMessage[],\n\t\t): Promise<{ stopReason: string }> {\n\t\t\t// Ensure channel directory exists\n\t\t\tawait mkdir(channelDir, { recursive: true });\n\n\t\t\t// Reload messages from context.jsonl\n\t\t\t// This picks up any messages synced from log.jsonl before this run\n\t\t\tconst reloadedSession = sessionManager.loadSession();\n\t\t\tif (reloadedSession.messages.length > 0) {\n\t\t\t\tagent.replaceMessages(reloadedSession.messages);\n\t\t\t\tlog.logInfo(`[${channelId}] Reloaded ${reloadedSession.messages.length} messages from context`);\n\t\t\t}\n\n\t\t\t// Update system prompt with fresh memory and channel/user info\n\t\t\tconst memory = getMemory(channelDir);\n\t\t\tconst systemPrompt = buildSystemPrompt(\n\t\t\t\tworkspacePath,\n\t\t\t\tchannelId,\n\t\t\t\tmemory,\n\t\t\t\tsandboxConfig,\n\t\t\t\tctx.channels,\n\t\t\t\tctx.users,\n\t\t\t);\n\t\t\tsession.agent.setSystemPrompt(systemPrompt);\n\n\t\t\t// Set up file upload function\n\t\t\tsetUploadFunction(async (filePath: string, title?: string) => {\n\t\t\t\tconst hostPath = translateToHostPath(filePath, channelDir, workspacePath, channelId);\n\t\t\t\tawait ctx.uploadFile(hostPath, title);\n\t\t\t});\n\n\t\t\t// Reset per-run state\n\t\t\trunState.ctx = ctx;\n\t\t\trunState.logCtx = {\n\t\t\t\tchannelId: ctx.message.channel,\n\t\t\t\tuserName: ctx.message.userName,\n\t\t\t\tchannelName: ctx.channelName,\n\t\t\t};\n\t\t\trunState.pendingTools.clear();\n\t\t\trunState.totalUsage = {\n\t\t\t\tinput: 0,\n\t\t\t\toutput: 0,\n\t\t\t\tcacheRead: 0,\n\t\t\t\tcacheWrite: 0,\n\t\t\t\tcost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0, total: 0 },\n\t\t\t};\n\t\t\trunState.stopReason = \"stop\";\n\n\t\t\t// Create queue for this run\n\t\t\tlet queueChain = Promise.resolve();\n\t\t\trunState.queue = {\n\t\t\t\tenqueue(fn: () => Promise<void>, errorContext: string): void {\n\t\t\t\t\tqueueChain = queueChain.then(async () => {\n\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\tawait 
fn();\n\t\t\t\t\t\t} catch (err) {\n\t\t\t\t\t\t\tconst errMsg = err instanceof Error ? err.message : String(err);\n\t\t\t\t\t\t\tlog.logWarning(`Slack API error (${errorContext})`, errMsg);\n\t\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\t\tawait ctx.respondInThread(`_Error: ${errMsg}_`);\n\t\t\t\t\t\t\t} catch {\n\t\t\t\t\t\t\t\t// Ignore\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t});\n\t\t\t\t},\n\t\t\t\tenqueueMessage(text: string, target: \"main\" | \"thread\", errorContext: string, doLog = true): void {\n\t\t\t\t\tconst parts = splitForSlack(text);\n\t\t\t\t\tfor (const part of parts) {\n\t\t\t\t\t\tthis.enqueue(\n\t\t\t\t\t\t\t() => (target === \"main\" ? ctx.respond(part, doLog) : ctx.respondInThread(part)),\n\t\t\t\t\t\t\terrorContext,\n\t\t\t\t\t\t);\n\t\t\t\t\t}\n\t\t\t\t},\n\t\t\t};\n\n\t\t\t// Log context info\n\t\t\tlog.logInfo(`Context sizes - system: ${systemPrompt.length} chars, memory: ${memory.length} chars`);\n\t\t\tlog.logInfo(`Channels: ${ctx.channels.length}, Users: ${ctx.users.length}`);\n\n\t\t\t// Build user message with username prefix\n\t\t\t// Format: \"[username]: message\" so LLM knows who's talking\n\t\t\tlet userMessage = `[${ctx.message.userName || \"unknown\"}]: ${ctx.message.text}`;\n\n\t\t\t// Add attachment paths if any\n\t\t\tif (ctx.message.attachments && ctx.message.attachments.length > 0) {\n\t\t\t\tconst attachmentPaths = ctx.message.attachments.map((a) => a.local).join(\"\\n\");\n\t\t\t\tuserMessage += `\\n\\nAttachments:\\n${attachmentPaths}`;\n\t\t\t}\n\n\t\t\t// Debug: write context to last_prompt.jsonl\n\t\t\tconst debugContext = {\n\t\t\t\tsystemPrompt,\n\t\t\t\tmessages: session.messages,\n\t\t\t\tnewUserMessage: userMessage,\n\t\t\t};\n\t\t\tawait writeFile(join(channelDir, \"last_prompt.jsonl\"), JSON.stringify(debugContext, null, 2));\n\n\t\t\tawait session.prompt(userMessage);\n\n\t\t\t// Wait for queued messages\n\t\t\tawait queueChain;\n\n\t\t\t// Final message update\n\t\t\tconst messages = session.messages;\n\t\t\tconst lastAssistant = messages.filter((m) => m.role === \"assistant\").pop();\n\t\t\tconst finalText =\n\t\t\t\tlastAssistant?.content\n\t\t\t\t\t.filter((c): c is { type: \"text\"; text: string } => c.type === \"text\")\n\t\t\t\t\t.map((c) => c.text)\n\t\t\t\t\t.join(\"\\n\") || \"\";\n\n\t\t\tif (finalText.trim()) {\n\t\t\t\ttry {\n\t\t\t\t\tconst mainText =\n\t\t\t\t\t\tfinalText.length > SLACK_MAX_LENGTH\n\t\t\t\t\t\t\t? finalText.substring(0, SLACK_MAX_LENGTH - 50) + \"\\n\\n_(see thread for full response)_\"\n\t\t\t\t\t\t\t: finalText;\n\t\t\t\t\tawait ctx.replaceMessage(mainText);\n\t\t\t\t} catch (err) {\n\t\t\t\t\tconst errMsg = err instanceof Error ? err.message : String(err);\n\t\t\t\t\tlog.logWarning(\"Failed to replace message with final text\", errMsg);\n\t\t\t\t}\n\t\t\t}\n\n\t\t\t// Log usage summary with context info\n\t\t\tif (runState.totalUsage.cost.total > 0) {\n\t\t\t\t// Get last non-aborted assistant message for context calculation\n\t\t\t\tconst messages = session.messages;\n\t\t\t\tconst lastAssistantMessage = messages\n\t\t\t\t\t.slice()\n\t\t\t\t\t.reverse()\n\t\t\t\t\t.find((m) => m.role === \"assistant\" && (m as any).stopReason !== \"aborted\") as any;\n\n\t\t\t\tconst contextTokens = lastAssistantMessage\n\t\t\t\t\t? 
lastAssistantMessage.usage.input +\n\t\t\t\t\t\tlastAssistantMessage.usage.output +\n\t\t\t\t\t\tlastAssistantMessage.usage.cacheRead +\n\t\t\t\t\t\tlastAssistantMessage.usage.cacheWrite\n\t\t\t\t\t: 0;\n\t\t\t\tconst contextWindow = model.contextWindow || 200000;\n\n\t\t\t\tconst summary = log.logUsageSummary(runState.logCtx!, runState.totalUsage, contextTokens, contextWindow);\n\t\t\t\trunState.queue.enqueue(() => ctx.respondInThread(summary), \"usage summary\");\n\t\t\t\tawait queueChain;\n\t\t\t}\n\n\t\t\t// Clear run state\n\t\t\trunState.ctx = null;\n\t\t\trunState.logCtx = null;\n\t\t\trunState.queue = null;\n\n\t\t\treturn { stopReason: runState.stopReason };\n\t\t},\n\n\t\tabort(): void {\n\t\t\tsession.abort();\n\t\t},\n\t};\n}\n\n/**\n * Translate container path back to host path for file operations\n */\nfunction translateToHostPath(\n\tcontainerPath: string,\n\tchannelDir: string,\n\tworkspacePath: string,\n\tchannelId: string,\n): string {\n\tif (workspacePath === \"/workspace\") {\n\t\tconst prefix = `/workspace/${channelId}/`;\n\t\tif (containerPath.startsWith(prefix)) {\n\t\t\treturn join(channelDir, containerPath.slice(prefix.length));\n\t\t}\n\t\tif (containerPath.startsWith(\"/workspace/\")) {\n\t\t\treturn join(channelDir, \"..\", containerPath.slice(\"/workspace/\".length));\n\t\t}\n\t}\n\treturn containerPath;\n}\n"]}