@oh-my-pi/pi-coding-agent 14.2.0 → 14.3.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (54)
  1. package/CHANGELOG.md +59 -0
  2. package/package.json +19 -19
  3. package/src/cli/args.ts +10 -1
  4. package/src/cli/shell-cli.ts +15 -3
  5. package/src/config/settings-schema.ts +60 -1
  6. package/src/dap/session.ts +8 -2
  7. package/src/debug/system-info.ts +6 -2
  8. package/src/discovery/claude.ts +58 -36
  9. package/src/discovery/opencode.ts +20 -2
  10. package/src/edit/index.ts +3 -1
  11. package/src/edit/modes/chunk.ts +133 -53
  12. package/src/edit/modes/hashline.ts +36 -11
  13. package/src/edit/renderer.ts +98 -133
  14. package/src/edit/streaming.ts +351 -0
  15. package/src/exec/bash-executor.ts +60 -5
  16. package/src/internal-urls/docs-index.generated.ts +5 -5
  17. package/src/internal-urls/pi-protocol.ts +0 -2
  18. package/src/lsp/client.ts +22 -6
  19. package/src/lsp/defaults.json +2 -1
  20. package/src/lsp/index.ts +53 -10
  21. package/src/lsp/types.ts +2 -0
  22. package/src/modes/acp/acp-agent.ts +76 -2
  23. package/src/modes/components/assistant-message.ts +1 -34
  24. package/src/modes/components/hook-editor.ts +1 -1
  25. package/src/modes/components/tool-execution.ts +111 -101
  26. package/src/modes/controllers/input-controller.ts +1 -1
  27. package/src/modes/interactive-mode.ts +0 -2
  28. package/src/modes/theme/mermaid-cache.ts +13 -52
  29. package/src/modes/theme/theme.ts +2 -2
  30. package/src/prompts/system/system-prompt.md +1 -1
  31. package/src/prompts/tools/ast-grep.md +1 -0
  32. package/src/prompts/tools/browser.md +1 -0
  33. package/src/prompts/tools/chunk-edit.md +25 -22
  34. package/src/prompts/tools/gh-pr-push.md +2 -1
  35. package/src/prompts/tools/grep.md +4 -3
  36. package/src/prompts/tools/lsp.md +6 -0
  37. package/src/prompts/tools/read-chunk.md +46 -7
  38. package/src/prompts/tools/read.md +7 -4
  39. package/src/sdk.ts +8 -5
  40. package/src/session/agent-session.ts +36 -20
  41. package/src/session/session-manager.ts +228 -57
  42. package/src/session/streaming-output.ts +11 -0
  43. package/src/system-prompt.ts +7 -2
  44. package/src/task/executor.ts +1 -0
  45. package/src/tools/ast-edit.ts +37 -2
  46. package/src/tools/bash.ts +75 -12
  47. package/src/tools/find.ts +19 -26
  48. package/src/tools/gh.ts +6 -16
  49. package/src/tools/grep.ts +94 -37
  50. package/src/tools/path-utils.ts +31 -3
  51. package/src/tools/resolve.ts +12 -3
  52. package/src/tools/sqlite-reader.ts +116 -3
  53. package/src/tools/vim.ts +1 -1
  54. package/src/web/search/providers/codex.ts +129 -6
@@ -9,7 +9,7 @@ export const EMBEDDED_DOCS: Readonly<Record<string, string>> = {
  "compaction.md": "# Compaction and Branch Summaries\n\nCompaction and branch summaries are the two mechanisms that keep long sessions usable without losing prior work context.\n\n- **Compaction** rewrites old history into a summary on the current branch.\n- **Branch summary** captures abandoned branch context during `/tree` navigation.\n\nBoth are persisted as session entries and converted back into user-context messages when rebuilding LLM input.\n\n## Key implementation files\n\n- `src/session/compaction/compaction.ts`\n- `src/session/compaction/branch-summarization.ts`\n- `src/session/compaction/pruning.ts`\n- `src/session/compaction/utils.ts`\n- `src/session/session-manager.ts`\n- `src/session/agent-session.ts`\n- `src/session/messages.ts`\n- `src/extensibility/hooks/types.ts`\n- `src/config/settings-schema.ts`\n\n## Session entry model\n\nCompaction and branch summaries are first-class session entries, not plain assistant/user messages.\n\n- `CompactionEntry`\n - `type: \"compaction\"`\n - `summary`, optional `shortSummary`\n - `firstKeptEntryId` (compaction boundary)\n - `tokensBefore`\n - optional `details`, `preserveData`, `fromExtension`\n- `BranchSummaryEntry`\n - `type: \"branch_summary\"`\n - `fromId`, `summary`\n - optional `details`, `fromExtension`\n\nWhen context is rebuilt (`buildSessionContext`):\n\n1. Latest compaction on the active path is converted to one `compactionSummary` message.\n2. Kept entries from `firstKeptEntryId` to the compaction point are re-included.\n3. Later entries on the path are appended.\n4. `branch_summary` entries are converted to `branchSummary` messages.\n5. 
`custom_message` entries are converted to `custom` messages.\n\nThose custom roles are then transformed into LLM-facing user messages in `convertToLlm()` using the static templates:\n\n- `prompts/compaction/compaction-summary-context.md`\n- `prompts/compaction/branch-summary-context.md`\n\n## Compaction pipeline\n\n### Triggers\n\nCompaction can run in three ways:\n\n1. **Manual**: `/compact [instructions]` calls `AgentSession.compact(...)`.\n2. **Automatic overflow recovery**: after an assistant error that matches context overflow.\n3. **Automatic threshold compaction**: after a successful turn when context exceeds threshold.\n\n### Compaction shape (visual)\n\n```text\nBefore compaction:\n\n entry: 0 1 2 3 4 5 6 7 8 9\n ┌─────┬─────┬─────┬──────┬─────┬─────┬──────┬──────┬─────┬──────┐\n │ hdr │ usr │ ass │ tool │ usr │ ass │ tool │ tool │ ass │ tool │\n └─────┴─────┴─────┴──────┴─────┴─────┴──────┴──────┴─────┴──────┘\n └────────┬───────┘ └──────────────┬──────────────┘\n messagesToSummarize kept messages\n ↑\n firstKeptEntryId (entry 4)\n\nAfter compaction (new entry appended):\n\n entry: 0 1 2 3 4 5 6 7 8 9 10\n ┌─────┬─────┬─────┬──────┬─────┬─────┬──────┬──────┬─────┬──────┬─────┐\n │ hdr │ usr │ ass │ tool │ usr │ ass │ tool │ tool │ ass │ tool │ cmp │\n └─────┴─────┴─────┴──────┴─────┴─────┴──────┴──────┴─────┴──────┴─────┘\n └──────────┬──────┘ └──────────────────────┬───────────────────┘\n not sent to LLM sent to LLM\n ↑\n starts from firstKeptEntryId\n\nWhat the LLM sees:\n\n ┌────────┬─────────┬─────┬─────┬──────┬──────┬─────┬──────┐\n │ system │ summary │ usr │ ass │ tool │ tool │ ass │ tool │\n └────────┴─────────┴─────┴─────┴──────┴──────┴─────┴──────┘\n ↑ ↑ └─────────────────┬────────────────┘\n prompt from cmp messages from firstKeptEntryId\n```\n\n\n### Overflow-retry vs threshold compaction\n\nThe two automatic paths are intentionally different:\n\n- **Overflow-retry compaction**\n - Trigger: current-model assistant error is detected as context 
overflow.\n - The failing assistant error message is removed from active agent state before retry.\n - Auto compaction runs with `reason: \"overflow\"` and `willRetry: true`.\n - On success, agent auto-continues (`agent.continue()`) after compaction.\n\n- **Threshold compaction**\n - Trigger: `contextTokens > contextWindow - compaction.reserveTokens`.\n - Runs with `reason: \"threshold\"` and `willRetry: false`.\n - On success, if `compaction.autoContinue !== false`, injects a synthetic prompt:\n - `\"Continue if you have next steps.\"`\n\n### Pre-compaction pruning\n\nBefore compaction checks, tool-result pruning may run (`pruneToolOutputs`).\n\nDefault prune policy:\n\n- Protect newest `40_000` tool-output tokens.\n- Require at least `20_000` total estimated savings.\n- Never prune tool results from `skill` or `read`.\n\nPruned tool results are replaced with:\n\n- `[Output truncated - N tokens]`\n\nIf pruning changes entries, session storage is rewritten and agent message state is refreshed before compaction decisions.\n\n### Boundary and cut-point logic\n\n`prepareCompaction()` only considers entries since the last compaction entry (if any).\n\n1. Find previous compaction index.\n2. Compute `boundaryStart = prevCompactionIndex + 1`.\n3. Adapt `keepRecentTokens` using measured usage ratio when available.\n4. 
Run `findCutPoint()` over the boundary window.\n\nValid cut points include:\n\n- message entries with roles: `user`, `assistant`, `bashExecution`, `hookMessage`, `branchSummary`, `compactionSummary`\n- `custom_message` entries\n- `branch_summary` entries\n\nHard rule: never cut at `toolResult`.\n\nIf there are non-message metadata entries immediately before the cut point (`model_change`, `thinking_level_change`, labels, etc.), they are pulled into the kept region by moving cut index backward until a message or compaction boundary is hit.\n\n### Split-turn handling\n\nIf cut point is not at a user-turn start, compaction treats it as a split turn.\n\nTurn start detection treats these as user-turn boundaries:\n\n- `message.role === \"user\"`\n- `message.role === \"bashExecution\"`\n- `custom_message` entry\n- `branch_summary` entry\n\nSplit-turn compaction generates two summaries:\n\n1. History summary (`messagesToSummarize`)\n2. Turn-prefix summary (`turnPrefixMessages`)\n\nFinal stored summary is merged as:\n\n```markdown\n<history summary>\n\n---\n\n**Turn Context (split turn):**\n\n<turn prefix summary>\n```\n\n### Summary generation\n\n`compact(...)` builds summaries from serialized conversation text:\n\n1. Convert messages via `convertToLlm()`.\n2. Serialize with `serializeConversation()`.\n3. Wrap in `<conversation>...</conversation>`.\n4. Optionally include `<previous-summary>...</previous-summary>`.\n5. Optionally inject hook context as `<additional-context>` list.\n6. 
Execute summarization prompt with `SUMMARIZATION_SYSTEM_PROMPT`.\n\nPrompt selection:\n\n- first compaction: `compaction-summary.md`\n- iterative compaction with prior summary: `compaction-update-summary.md`\n- split-turn second pass: `compaction-turn-prefix.md`\n- short UI summary: `compaction-short-summary.md`\n\nRemote summarization mode:\n\n- If `compaction.remoteEndpoint` is set, compaction POSTs:\n - `{ systemPrompt, prompt }`\n- Expects JSON containing at least `{ summary }`.\n\n### File-operation context in summaries\n\nCompaction tracks cumulative file activity using assistant tool calls:\n\n- `read(path)` → read set\n- `write(path)` → modified set\n- `edit(path)` → modified set\n\nCumulative behavior:\n\n- Includes prior compaction details only when prior entry is pi-generated (`fromExtension !== true`).\n- In split turns, includes turn-prefix file ops too.\n- `readFiles` excludes files also modified.\n\nSummary text gets file tags appended via prompt template:\n\n```xml\n<read-files>\n...\n</read-files>\n<modified-files>\n...\n</modified-files>\n```\n\n### Persist and reload\n\nAfter summary generation (or hook-provided summary), agent session:\n\n1. Appends `CompactionEntry` with `appendCompaction(...)`.\n2. Rebuilds context via `buildSessionContext()`.\n3. Replaces live agent messages with rebuilt context.\n4. Emits `session_compact` hook event.\n\n## Branch summarization pipeline\n\nBranch summarization is tied to tree navigation, not token overflow.\n\n### Trigger\n\nDuring `navigateTree(...)`:\n\n1. Compute abandoned entries from old leaf to common ancestor using `collectEntriesForBranchSummary(...)`.\n2. If caller requested summary (`options.summarize`), generate summary before switching leaf.\n3. 
If summary exists, attach it at the navigation target using `branchWithSummary(...)`.\n\nOperationally this is commonly driven by `/tree` flow when `branchSummary.enabled` is enabled.\n\n### Branch switch shape (visual)\n\n```text\nTree before navigation:\n\n ┌─ B ─ C ─ D (old leaf, being abandoned)\n A ───┤\n └─ E ─ F (target)\n\nCommon ancestor: A\nEntries to summarize: B, C, D\n\nAfter navigation with summary:\n\n ┌─ B ─ C ─ D ─ [summary of B,C,D]\n A ───┤\n └─ E ─ F (new leaf)\n```\n\n\n### Preparation and token budget\n\n`generateBranchSummary(...)` computes budget as:\n\n- `tokenBudget = model.contextWindow - branchSummary.reserveTokens`\n\n`prepareBranchEntries(...)` then:\n\n1. First pass: collect cumulative file ops from all summarized entries, including prior pi-generated `branch_summary` details.\n2. Second pass: walk newest → oldest, adding messages until token budget is reached.\n3. Prefer preserving recent context.\n4. May still include large summary entries near budget edge for continuity.\n\nCompaction entries are included as messages (`compactionSummary`) during branch summarization input.\n\n### Summary generation and persistence\n\nBranch summarization:\n\n1. Converts and serializes selected messages.\n2. Wraps in `<conversation>`.\n3. Uses custom instructions if supplied, otherwise `branch-summary.md`.\n4. Calls summarization model with `SUMMARIZATION_SYSTEM_PROMPT`.\n5. Prepends `branch-summary-preamble.md`.\n6. 
Appends file-operation tags.\n\nResult is stored as `BranchSummaryEntry` with optional details (`readFiles`, `modifiedFiles`).\n\n## Extension and hook touchpoints\n\n### `session_before_compact`\n\nPre-compaction hook.\n\nCan:\n\n- cancel compaction (`{ cancel: true }`)\n- provide full custom compaction payload (`{ compaction: CompactionResult }`)\n\n### `session.compacting`\n\nPrompt/context customization hook for default compaction.\n\nCan return:\n\n- `prompt` (override base summary prompt)\n- `context` (extra context lines injected into `<additional-context>`)\n- `preserveData` (stored on compaction entry)\n\n### `session_compact`\n\nPost-compaction notification with saved `compactionEntry` and `fromExtension` flag.\n\n### `session_before_tree`\n\nRuns on tree navigation before default branch summary generation.\n\nCan:\n\n- cancel navigation\n- provide custom `{ summary: { summary, details } }` used when user requested summarization\n\n### `session_tree`\n\nPost-navigation event exposing new/old leaf and optional summary entry.\n\n## Runtime behavior and failure semantics\n\n- Manual compaction aborts current agent operation first.\n- `abortCompaction()` cancels both manual and auto-compaction controllers.\n- Auto compaction emits start/end session events for UI/state updates.\n- Auto compaction can try multiple model candidates and retry transient failures.\n- Overflow errors are excluded from generic retry path because they are handled by compaction.\n- If auto-compaction fails:\n - overflow path emits `Context overflow recovery failed: ...`\n - threshold path emits `Auto-compaction failed: ...`\n- Branch summarization can be cancelled via abort signal (e.g., Escape), returning canceled/aborted navigation result.\n\n## Settings and defaults\n\nFrom `settings-schema.ts`:\n\n- `compaction.enabled` = `true`\n- `compaction.reserveTokens` = `16384`\n- `compaction.keepRecentTokens` = `20000`\n- `compaction.autoContinue` = `true`\n- `compaction.remoteEndpoint` = 
`undefined`\n- `branchSummary.enabled` = `false`\n- `branchSummary.reserveTokens` = `16384`\n\nThese values are consumed at runtime by `AgentSession` and compaction/branch summarization modules.",
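The threshold trigger and the defaults listed above can be sketched in TypeScript. This is a minimal sketch under stated assumptions: `CompactionSettings`, `COMPACTION_DEFAULTS`, and `shouldCompact` are illustrative names, not the actual API of `src/session/agent-session.ts`; the default values mirror the documented `settings-schema.ts` entries.

```typescript
// Illustrative sketch only; the real logic lives in AgentSession.
interface CompactionSettings {
	enabled: boolean;
	reserveTokens: number;
	keepRecentTokens: number;
	autoContinue: boolean;
}

// Defaults mirror the documented settings-schema values.
const COMPACTION_DEFAULTS: CompactionSettings = {
	enabled: true,
	reserveTokens: 16_384,
	keepRecentTokens: 20_000,
	autoContinue: true,
};

// Threshold compaction fires when the live context no longer leaves
// reserveTokens of headroom inside the model's context window.
function shouldCompact(
	contextTokens: number,
	contextWindow: number,
	settings: CompactionSettings = COMPACTION_DEFAULTS,
): boolean {
	if (!settings.enabled) return false;
	return contextTokens > contextWindow - settings.reserveTokens;
}
```

The overflow-retry path, by contrast, is driven by error classification rather than a token check, which is why it carries `reason: "overflow"` and `willRetry: true`.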
  "config-usage.md": "# Configuration Discovery and Resolution\n\nThis document describes how the coding-agent resolves configuration today: which roots are scanned, how precedence works, and how resolved config is consumed by settings, skills, hooks, tools, and extensions.\n\n## Scope\n\nPrimary implementation:\n\n- `src/config.ts`\n- `src/config/settings.ts`\n- `src/config/settings-schema.ts`\n- `src/discovery/builtin.ts`\n- `src/discovery/helpers.ts`\n\nKey integration points:\n\n- `src/capability/index.ts`\n- `src/discovery/index.ts`\n- `src/extensibility/skills.ts`\n- `src/extensibility/hooks/loader.ts`\n- `src/extensibility/custom-tools/loader.ts`\n- `src/extensibility/extensions/loader.ts`\n\n---\n\n## Resolution flow (visual)\n\n```text\n Config roots (ordered)\n┌───────────────────────────────────────┐\n│ 1) ~/.omp/agent + <cwd>/.omp │\n│ 2) ~/.claude + <cwd>/.claude │\n│ 3) ~/.codex + <cwd>/.codex │\n│ 4) ~/.gemini + <cwd>/.gemini │\n└───────────────────────────────────────┘\n │\n ▼\n config.ts helper resolution\n (getConfigDirs/findConfigFile/findNearest...)\n │\n ▼\n capability providers enumerate items\n (native, claude, codex, gemini, agents, etc.)\n │\n ▼\n priority sort + per-capability dedup\n │\n ▼\n subsystem-specific consumption\n (settings, skills, hooks, tools, extensions)\n```\n\n\n## 1) Config roots and source order\n\n## Canonical roots\n\n`src/config.ts` defines a fixed source priority list:\n\n1. `.omp` (native)\n2. `.claude`\n3. `.codex`\n4. 
`.gemini`\n\nUser-level bases:\n\n- `~/.omp/agent`\n- `~/.claude`\n- `~/.codex`\n- `~/.gemini`\n\nProject-level bases:\n\n- `<cwd>/.omp`\n- `<cwd>/.claude`\n- `<cwd>/.codex`\n- `<cwd>/.gemini`\n\n`CONFIG_DIR_NAME` is `.omp` (`packages/utils/src/dirs.ts`).\n\n## Important constraint\n\nThe generic helpers in `src/config.ts` do **not** include `.pi` in source discovery order.\n\n---\n\n## 2) Core discovery helpers (`src/config.ts`)\n\n## `getConfigDirs(subpath, options)`\n\nReturns ordered entries:\n\n- User-level entries first (by source priority)\n- Then project-level entries (by same source priority)\n\nOptions:\n\n- `user` (default `true`)\n- `project` (default `true`)\n- `cwd` (default `getProjectDir()`)\n- `existingOnly` (default `false`)\n\nThis API is used for directory-based config lookups (commands, hooks, tools, agents, etc.).\n\n## `findConfigFile(subpath, options)` / `findConfigFileWithMeta(...)`\n\nSearches for the first existing file across ordered bases, returns first match (path-only or path+metadata).\n\n## `findAllNearestProjectConfigDirs(subpath, cwd)`\n\nWalks parent directories upward and returns the **nearest existing directory per source base** (`.omp`, `.claude`, `.codex`, `.gemini`), then sorts results by source priority.\n\nUse this when project config should be inherited from ancestor directories (monorepo/nested workspace behavior).\n\n---\n\n## 3) File config wrapper (`ConfigFile<T>` in `src/config.ts`)\n\n`ConfigFile<T>` is the schema-validated loader for single config files.\n\nSupported formats:\n\n- `.yml` / `.yaml`\n- `.json` / `.jsonc`\n\nBehavior:\n\n- Validates parsed data with AJV against a provided TypeBox schema.\n- Caches load result until `invalidate()`.\n- Returns tri-state result via `tryLoad()`:\n - `ok`\n - `not-found`\n - `error` (`ConfigError` with schema/parse context)\n\nLegacy migration still supported:\n\n- If target path is `.yml`/`.yaml`, a sibling `.json` is auto-migrated once (`migrateJsonToYml`).\n\n---\n\n## 
4) Settings resolution model (`src/config/settings.ts`)\n\nThe runtime settings model is layered:\n\n1. Global settings: `~/.omp/agent/config.yml`\n2. Project settings: discovered via settings capability (`settings.json` from providers)\n3. Runtime overrides: in-memory, non-persistent\n4. Schema defaults: from `SETTINGS_SCHEMA`\n\nEffective read path:\n\n`defaults <- global <- project <- overrides`\n\nWrite behavior:\n\n- `settings.set(...)` writes to the **global** layer (`config.yml`) and queues background save.\n- Project settings are read-only from capability discovery.\n\n## Migration behavior still active\n\nOn startup, if `config.yml` is missing:\n\n1. Migrate from `~/.omp/agent/settings.json` (renamed to `.bak` on success)\n2. Merge with legacy DB settings from `agent.db`\n3. Write merged result to `config.yml`\n\nField-level migrations in `#migrateRawSettings`:\n\n- `queueMode` -> `steeringMode`\n- `ask.timeout` milliseconds -> seconds when old value looks like ms (`> 1000`)\n- Legacy flat `theme: \"...\"` -> `theme.dark/theme.light` structure\n\n---\n\n## 5) Capability/discovery integration\n\nMost non-core config loading flows through the capability registry (`src/capability/index.ts` + `src/discovery/index.ts`).\n\n## Provider ordering\n\nProviders are sorted by numeric priority (higher first). Example priorities:\n\n- Native OMP (`builtin.ts`): `100`\n- Claude: `80`\n- Codex / agents / Claude marketplace: `70`\n- Gemini: `60`\n\n```text\nProvider precedence (higher wins)\n\nnative (.omp) priority 100\nclaude priority 80\ncodex / agents / ... 
priority 70\ngemini priority 60\n```\n\n## Dedup semantics\n\nCapabilities define a `key(item)`:\n\n- same key => first item wins (higher-priority/earlier-loaded item)\n- no key (`undefined`) => no dedup, all items retained\n\nRelevant keys:\n\n- skills: `name`\n- tools: `name`\n- hooks: `${type}:${tool}:${name}`\n- extension modules: `name`\n- extensions: `name`\n- settings: no dedup (all items preserved)\n\n---\n\n## 6) Native `.omp` provider behavior (`src/discovery/builtin.ts`)\n\nNative provider (`id: native`) reads from:\n\n- project: `<cwd>/.omp/...`\n- user: `~/.omp/agent/...`\n\n### Directory admission rule\n\n`builtin.ts` only includes a config root if the directory exists **and is non-empty** (`ifNonEmptyDir`).\n\n### Scope-specific loading\n\n- Skills: `skills/*/SKILL.md`\n- Slash commands: `commands/*.md`\n- Rules: `rules/*.{md,mdc}`\n- Prompts: `prompts/*.md`\n- Instructions: `instructions/*.md`\n- Hooks: `hooks/pre/*`, `hooks/post/*`\n- Tools: `tools/*.json|*.md` and `tools/<name>/index.ts`\n- Extension modules: discovered under `extensions/` (+ legacy `settings.json.extensions` string array)\n- Extensions: `extensions/<name>/gemini-extension.json`\n- Settings capability: `settings.json`\n\n### Nearest-project lookup nuance\n\nFor `SYSTEM.md` and `AGENTS.md`, native provider uses nearest-ancestor project `.omp` directory search (walk-up) but still requires the `.omp` dir to be non-empty.\n\n---\n\n## 7) How major subsystems consume config\n\n## Settings subsystem\n\n- `Settings.init()` loads global `config.yml` + discovered project `settings.json` capability items.\n- Only capability items with `level === \"project\"` are merged into project layer.\n\n## Skills subsystem\n\n- `extensibility/skills.ts` loads via `loadCapability(skillCapability.id, { cwd })`.\n- Applies source toggles and filters (`ignoredSkills`, `includeSkills`, custom dirs).\n- Legacy-named toggles still exist (`skills.enablePiUser`, `skills.enablePiProject`) but they gate the 
native provider (`provider === \"native\"`).\n\n## Hooks subsystem\n\n- `discoverAndLoadHooks()` resolves hook paths from hook capability + explicit configured paths.\n- Then loads modules via Bun import.\n\n## Tools subsystem\n\n- `discoverAndLoadCustomTools()` resolves tool paths from tool capability + plugin tool paths + explicit configured paths.\n- Declarative `.md/.json` tool files are metadata only; executable loading expects code modules.\n\n## Extensions subsystem\n\n- `discoverAndLoadExtensions()` resolves extension modules from extension-module capability plus explicit paths.\n- Current implementation intentionally keeps only capability items with `_source.provider === \"native\"` before loading.\n\n---\n\n## 8) Precedence rules to rely on\n\nUse this mental model:\n\n1. Source directory ordering from `config.ts` determines candidate path order.\n2. Capability provider priority determines cross-provider precedence.\n3. Capability key dedup determines collision behavior (first wins for keyed capabilities).\n4. Subsystem-specific merge logic can further change effective precedence (especially settings).\n\n### Settings-specific caveat\n\nSettings capability items are not deduplicated; `Settings.#loadProjectSettings()` deep-merges project items in returned order. 
Because merge applies later item values over earlier values, effective override behavior depends on provider emission order, not just capability key semantics.\n\n---\n\n## 9) Legacy/compatibility behaviors still present\n\n- `ConfigFile` JSON -> YAML migration for YAML-targeted files.\n- Settings migration from `settings.json` and `agent.db` to `config.yml`.\n- Settings key migrations (`queueMode`, `ask.timeout`, flat `theme`).\n- Extension manifest compatibility: loader accepts both `package.json.omp` and `package.json.pi` manifest sections.\n- Legacy setting names `skills.enablePiUser` / `skills.enablePiProject` are still active gates for native skill source.\n\nIf these compatibility paths are removed in code, update this document immediately; several runtime behaviors still depend on them today.\n",
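The "priority sort + per-capability dedup" step above can be sketched as follows. `Provided` and `resolveCapability` are hypothetical names; the only behaviors assumed are the documented ones (higher numeric priority wins, keyed capabilities keep the first item per key, keyless capabilities keep everything).

```typescript
// Illustrative sketch; capability/provider types are simplified.
interface Provided<T> {
	priority: number; // native .omp = 100, claude = 80, codex = 70, gemini = 60
	item: T;
}

function resolveCapability<T>(
	entries: Provided<T>[],
	key?: (item: T) => string | undefined,
): T[] {
	// Higher numeric priority sorts first.
	const sorted = [...entries].sort((a, b) => b.priority - a.priority);
	if (!key) return sorted.map((e) => e.item); // keyless: keep everything
	const seen = new Set<string>();
	const kept: T[] = [];
	for (const { item } of sorted) {
		const k = key(item);
		if (k !== undefined) {
			if (seen.has(k)) continue; // same key => first (highest-priority) wins
			seen.add(k);
		}
		kept.push(item);
	}
	return kept;
}
```

With skills (keyed by `name`), a native `.omp` skill at priority 100 shadows a same-named `.claude` skill at priority 80; settings items, having no key, all survive into the deep-merge step described above.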
  "custom-tools.md": "# Custom Tools\n\nCustom tools are model-callable functions that plug into the same tool execution pipeline as built-in tools.\n\nA custom tool is a TypeScript/JavaScript module that exports a factory. The factory receives a host API (`CustomToolAPI`) and returns one tool or an array of tools.\n\n## What this is (and is not)\n\n- **Custom tool**: callable by the model during a turn (`execute` + TypeBox schema).\n- **Extension**: lifecycle/event framework that can register tools and intercept/modify events.\n- **Hook**: external pre/post command scripts.\n- **Skill**: static guidance/context package, not executable tool code.\n\nIf you need the model to call code directly, use a custom tool.\n\n## Integration paths in current code\n\nThere are two active integration styles:\n\n1. **SDK-provided custom tools** (`options.customTools`)\n - Wrapped into agent tools via `CustomToolAdapter` or extension wrappers.\n - Always included in the initial active tool set in SDK bootstrap.\n\n2. **Filesystem-discovered modules via loader API** (`discoverAndLoadCustomTools` / `loadCustomTools`)\n - Exposed as library APIs in `src/extensibility/custom-tools/loader.ts`.\n - Host code can call these to discover and load tool modules from config/provider/plugin paths.\n\n```text\nModel tool call flow\n\nLLM tool call\n │\n ▼\nTool registry (built-ins + custom tool adapters)\n │\n ▼\nCustomTool.execute(toolCallId, params, onUpdate, ctx, signal)\n │\n ├─ onUpdate(...) -> streamed partial result\n └─ return result -> final tool content/details\n```\n\n## Discovery locations (loader API)\n\n`discoverAndLoadCustomTools(configuredPaths, cwd, builtInToolNames)` merges:\n\n1. Capability providers (`toolCapability`), including:\n - Native OMP config (`~/.omp/agent/tools`, `.omp/tools`)\n - Claude config (`~/.claude/tools`, `.claude/tools`)\n - Codex config (`~/.codex/tools`, `.codex/tools`)\n - Claude marketplace plugin cache provider\n2. 
Installed plugin manifests (`~/.omp/plugins/node_modules/*` via plugin loader)\n3. Explicit configured paths passed to the loader\n\n### Important behavior\n\n- Duplicate resolved paths are deduplicated.\n- Tool name conflicts are rejected against built-ins and already-loaded custom tools.\n- `.md` and `.json` files are discovered as tool metadata by some providers, but the executable module loader rejects them as runnable tools.\n- Relative configured paths are resolved from `cwd`; `~` is expanded.\n\n## Module contract\n\nA custom tool module must export a function (default export preferred):\n\n```ts\nimport type { CustomToolFactory } from \"@oh-my-pi/pi-coding-agent\";\n\nconst factory: CustomToolFactory = (pi) => ({\n\tname: \"repo_stats\",\n\tlabel: \"Repo Stats\",\n\tdescription: \"Counts tracked TypeScript files\",\n\tparameters: pi.typebox.Type.Object({\n\t\tglob: pi.typebox.Type.Optional(pi.typebox.Type.String({ default: \"**/*.ts\" })),\n\t}),\n\n\tasync execute(toolCallId, params, onUpdate, ctx, signal) {\n\t\tonUpdate?.({\n\t\t\tcontent: [{ type: \"text\", text: \"Scanning files...\" }],\n\t\t\tdetails: { phase: \"scan\" },\n\t\t});\n\n\t\tconst result = await pi.exec(\"git\", [\"ls-files\", params.glob ?? 
\"**/*.ts\"], { signal, cwd: pi.cwd });\n\t\tif (result.killed) {\n\t\t\tthrow new Error(\"Scan was cancelled\");\n\t\t}\n\t\tif (result.code !== 0) {\n\t\t\tthrow new Error(result.stderr || \"git ls-files failed\");\n\t\t}\n\n\t\tconst files = result.stdout.split(\"\\n\").filter(Boolean);\n\t\treturn {\n\t\t\tcontent: [{ type: \"text\", text: `Found ${files.length} files` }],\n\t\t\tdetails: { count: files.length, sample: files.slice(0, 10) },\n\t\t};\n\t},\n\n\tonSession(event) {\n\t\tif (event.reason === \"shutdown\") {\n\t\t\t// cleanup resources if needed\n\t\t}\n\t},\n});\n\nexport default factory;\n```\n\nFactory return type:\n\n- `CustomTool`\n- `CustomTool[]`\n- `Promise<CustomTool | CustomTool[]>`\n\n## API surface passed to factories (`CustomToolAPI`)\n\nFrom `types.ts` and `loader.ts`:\n\n- `cwd`: host working directory\n- `exec(command, args, options?)`: process execution helper\n- `ui`: UI context (can be no-op in headless modes)\n- `hasUI`: `false` in non-interactive flows\n- `logger`: shared file logger\n- `typebox`: injected `@sinclair/typebox`\n- `pi`: injected `@oh-my-pi/pi-coding-agent` exports\n- `pushPendingAction(action)`: register a preview action for hidden `resolve` tool (`docs/resolve-tool-runtime.md`)\n\nLoader starts with a no-op UI context and requires host code to call `setUIContext(...)` when real UI is ready.\n\n## Execution contract and typing\n\n`CustomTool.execute` signature:\n\n```ts\nexecute(toolCallId, params, onUpdate, ctx, signal)\n```\n\n- `params` is statically typed from your TypeBox schema via `Static<TParams>`.\n- Runtime argument validation happens before execution in the agent loop.\n- `onUpdate` emits partial results for UI streaming.\n- `ctx` includes session/model state and an `abort()` helper.\n- `signal` carries cancellation.\n\n`CustomToolAdapter` bridges this to the agent tool interface and forwards calls in the correct argument order.\n\n## How tools are exposed to the model\n\n- Tools are wrapped into 
`AgentTool` instances (`CustomToolAdapter` or extension wrappers).\n- They are inserted into the session tool registry by name.\n- In SDK bootstrap, custom and extension-registered tools are force-included in the initial active set.\n- CLI `--tools` currently validates only built-in tool names; custom tool inclusion is handled through discovery/registration paths and SDK options.\n\n## Rendering hooks\n\nOptional rendering hooks:\n\n- `renderCall(args, theme)`\n- `renderResult(result, options, theme, args?)`\n\nRuntime behavior in TUI:\n\n- If hooks exist, tool output is rendered inside a `Box` container.\n- `renderResult` receives `{ expanded, isPartial, spinnerFrame? }`.\n- Renderer errors are caught and logged; UI falls back to default text rendering.\n\n## Session/state handling\n\nOptional `onSession(event, ctx)` receives session lifecycle events, including:\n\n- `start`, `switch`, `branch`, `tree`, `shutdown`\n- `auto_compaction_start`, `auto_compaction_end`\n- `auto_retry_start`, `auto_retry_end`\n- `ttsr_triggered`, `todo_reminder`\n\nUse `ctx.sessionManager` to reconstruct state from history when branch/session context changes.\n\n## Failures and cancellation semantics\n\n### Synchronous/async failures\n\n- Throwing (or rejected promises) in `execute` is treated as tool failure.\n- Agent runtime converts failures into tool result messages with `isError: true` and error text content.\n- With extension wrappers, `tool_result` handlers can further rewrite content/details and even override error status.\n\n### Cancellation\n\n- Agent abort propagates through `AbortSignal` to `execute`.\n- Forward `signal` to subprocess work (`pi.exec(..., { signal })`) for cooperative cancellation.\n- `ctx.abort()` lets a tool request abort of the current agent operation.\n\n### onSession errors\n\n- `onSession` errors are caught and logged as warnings; they do not crash the session.\n\n## Real constraints to design for\n\n- Tool names must be globally unique in the active 
registry.\n- Prefer deterministic, schema-shaped outputs in `details` for renderer/state reconstruction.\n- Guard UI usage with `pi.hasUI`.\n- Treat `.md`/`.json` in tool directories as metadata, not executable modules.\n",
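The name-uniqueness constraint above ("tool name conflicts are rejected against built-ins and already-loaded custom tools") can be sketched as a small guard. `registerCustomTool` is a hypothetical helper for illustration, not an export of `loader.ts`.

```typescript
// Hypothetical guard, not loader.ts API; shows the documented rule that
// custom tool names must not collide with built-ins or loaded tools.
function registerCustomTool<T>(
	name: string,
	builtInToolNames: ReadonlySet<string>,
	loaded: Map<string, T>,
	tool: T,
): void {
	if (builtInToolNames.has(name)) {
		throw new Error(`Tool name "${name}" conflicts with a built-in tool`);
	}
	if (loaded.has(name)) {
		throw new Error(`Tool name "${name}" is already registered`);
	}
	loaded.set(name, tool);
}
```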
- "environment-variables.md": "# Environment Variables (Current Runtime Reference)\n\nThis reference is derived from current code paths in:\n\n- `packages/coding-agent/src/**`\n- `packages/ai/src/**` (provider/auth resolution used by coding-agent)\n- `packages/utils/src/**` and `packages/tui/src/**` where those vars directly affect coding-agent runtime\n\nIt documents only active behavior.\n\n## Resolution model and precedence\n\nMost runtime lookups use `$env` from `@oh-my-pi/pi-utils` (`packages/utils/src/env.ts`).\n\n`$env` loading order:\n\n1. Existing process environment (`Bun.env`)\n2. Project `.env` (`$PWD/.env`) for keys not already set\n3. Home `.env` (`~/.env`) for keys not already set\n\nAdditional rule in `.env` files: `OMP_*` keys are mirrored to `PI_*` keys during parse.\n\n---\n\n## 1) Model/provider authentication\n\nThese are consumed via `getEnvApiKey()` (`packages/ai/src/stream.ts`) unless noted otherwise.\n\n### Core provider credentials\n\n| Variable | Used for | Required when | Notes / precedence |\n|---------------------------------|---|---------------------------------------------------------------|-----------------------------------------------------------------------------------------------------|\n| `ANTHROPIC_OAUTH_TOKEN` | Anthropic API auth | Using Anthropic with OAuth token auth | Takes precedence over `ANTHROPIC_API_KEY` for provider auth resolution |\n| `ANTHROPIC_API_KEY` | Anthropic API auth | Using Anthropic without OAuth token | Fallback after `ANTHROPIC_OAUTH_TOKEN` |\n| `ANTHROPIC_FOUNDRY_API_KEY` | Anthropic via Azure Foundry / enterprise gateway | `CLAUDE_CODE_USE_FOUNDRY` enabled | Takes precedence over `ANTHROPIC_OAUTH_TOKEN` and `ANTHROPIC_API_KEY` when Foundry mode is enabled |\n| `OPENAI_API_KEY` | OpenAI auth | Using OpenAI-family providers without explicit apiKey argument | Used by OpenAI Completions/Responses providers |\n| `GEMINI_API_KEY` | Google Gemini auth | Using `google` provider models | Primary key for 
Gemini provider mapping |\n| `GOOGLE_API_KEY` | Gemini image tool auth fallback | Using `gemini_image` tool without `GEMINI_API_KEY` | Used by coding-agent image tool fallback path |\n| `GROQ_API_KEY` | Groq auth | Using Groq models | |\n| `CEREBRAS_API_KEY` | Cerebras auth | Using Cerebras models | |\n| `TOGETHER_API_KEY` | Together auth | Using `together` provider | |\n| `HUGGINGFACE_HUB_TOKEN` | Hugging Face auth | Using `huggingface` provider | Primary Hugging Face token env var |\n| `HF_TOKEN` | Hugging Face auth | Using `huggingface` provider | Fallback when `HUGGINGFACE_HUB_TOKEN` is unset |\n| `SYNTHETIC_API_KEY` | Synthetic auth | Using Synthetic models | |\n| `NVIDIA_API_KEY` | NVIDIA auth | Using `nvidia` provider | |\n| `NANO_GPT_API_KEY` | NanoGPT auth | Using `nanogpt` provider | |\n| `VENICE_API_KEY` | Venice auth | Using `venice` provider | |\n| `LITELLM_API_KEY` | LiteLLM auth | Using `litellm` provider | OpenAI-compatible LiteLLM proxy key |\n| `LM_STUDIO_API_KEY` | LM Studio auth (optional) | Using `lm-studio` provider with authenticated hosts | Local LM Studio usually runs without auth; any non-empty token works when a key is required |\n| `OLLAMA_API_KEY` | Ollama auth (optional) | Using `ollama` provider with authenticated hosts | Local Ollama usually runs without auth; any non-empty token works when a key is required |\n| `LLAMA_CPP_API_KEY` | llama.cpp auth (optional) | Using `llama-server` with `--api-key` parameter | Local llama.cpp usually runs without auth; any non-empty token works when a key is configured |\n| `XIAOMI_API_KEY` | Xiaomi MiMo auth | Using `xiaomi` provider | |\n| `MOONSHOT_API_KEY` | Moonshot auth | Using `moonshot` provider | |\n| `XAI_API_KEY` | xAI auth | Using xAI models | |\n| `OPENROUTER_API_KEY` | OpenRouter auth | Using OpenRouter models | Also used by image tool when preferred/auto provider is OpenRouter |\n| `MISTRAL_API_KEY` | Mistral auth | Using Mistral models | |\n| `ZAI_API_KEY` | z.ai auth | Using z.ai 
models | Also used by z.ai web search provider |\n| `MINIMAX_API_KEY` | MiniMax auth | Using `minimax` provider | |\n| `MINIMAX_CODE_API_KEY` | MiniMax Code auth | Using `minimax-code` provider | |\n| `MINIMAX_CODE_CN_API_KEY` | MiniMax Code CN auth | Using `minimax-code-cn` provider | |\n| `OPENCODE_API_KEY` | OpenCode auth | Using OpenCode models | |\n| `QIANFAN_API_KEY` | Qianfan auth | Using `qianfan` provider | |\n| `QWEN_OAUTH_TOKEN` | Qwen Portal auth | Using `qwen-portal` with OAuth token | Takes precedence over `QWEN_PORTAL_API_KEY` |\n| `QWEN_PORTAL_API_KEY` | Qwen Portal auth | Using `qwen-portal` with API key | Fallback after `QWEN_OAUTH_TOKEN` |\n| `ZENMUX_API_KEY` | ZenMux auth | Using `zenmux` provider | Used for ZenMux OpenAI and Anthropic-compatible routes |\n| `VLLM_API_KEY` | vLLM auth/discovery opt-in | Using `vllm` provider (local OpenAI-compatible servers) | Any non-empty value works for no-auth local servers |\n| `CURSOR_ACCESS_TOKEN` | Cursor provider auth | Using Cursor provider | |\n| `AI_GATEWAY_API_KEY` | Vercel AI Gateway auth | Using `vercel-ai-gateway` provider | |\n| `CLOUDFLARE_AI_GATEWAY_API_KEY` | Cloudflare AI Gateway auth | Using `cloudflare-ai-gateway` provider | Base URL must be configured as `https://gateway.ai.cloudflare.com/v1/<account>/<gateway>/anthropic` |\n\n### GitHub/Copilot token chains\n\n| Variable | Used for | Chain |\n|---|---|---|\n| `COPILOT_GITHUB_TOKEN` | GitHub Copilot provider auth | `COPILOT_GITHUB_TOKEN` → `GH_TOKEN` → `GITHUB_TOKEN` |\n| `GH_TOKEN` | Copilot fallback; GitHub API auth in web scraper | In web scraper: `GITHUB_TOKEN` → `GH_TOKEN` |\n| `GITHUB_TOKEN` | Copilot fallback; GitHub API auth in web scraper | In web scraper: checked before `GH_TOKEN` |\n\n---\n\n## 2) Provider-specific runtime configuration\n\n### Anthropic Foundry Gateway (Azure / enterprise proxy)\n\nWhen `CLAUDE_CODE_USE_FOUNDRY` is enabled, Anthropic requests switch to Foundry mode:\n\n- Base URL resolves from 
`FOUNDRY_BASE_URL` (fallback remains model/default base URL if unset).\n- API key resolution for provider `anthropic` becomes:\n `ANTHROPIC_FOUNDRY_API_KEY` → `ANTHROPIC_OAUTH_TOKEN` → `ANTHROPIC_API_KEY`.\n- `ANTHROPIC_CUSTOM_HEADERS` is parsed as comma/newline-separated `key: value` pairs and merged into request headers.\n- TLS client/server material can be injected from env values:\n `NODE_EXTRA_CA_CERTS`, `CLAUDE_CODE_CLIENT_CERT`, `CLAUDE_CODE_CLIENT_KEY`.\n Each accepts either:\n - a filesystem path to PEM content, or\n - inline PEM (including escaped `\\n` sequences).\n\n| Variable | Value type | Behavior |\n|---|---|---|\n| `CLAUDE_CODE_USE_FOUNDRY` | Boolean-like string (`1`, `true`, `yes`, `on`) | Enables Foundry mode for Anthropic provider |\n| `FOUNDRY_BASE_URL` | URL string | Anthropic endpoint base URL in Foundry mode |\n| `ANTHROPIC_FOUNDRY_API_KEY` | Token string | Used for `Authorization: Bearer <token>` |\n| `ANTHROPIC_CUSTOM_HEADERS` | Header list string | Extra headers; format `header-a: value, header-b: value` or newline-separated |\n| `NODE_EXTRA_CA_CERTS` | PEM path or inline PEM | Extra CA chain for server certificate validation |\n| `CLAUDE_CODE_CLIENT_CERT` | PEM path or inline PEM | mTLS client certificate |\n| `CLAUDE_CODE_CLIENT_KEY` | PEM path or inline PEM | mTLS client private key (must be paired with cert) |\n\n### Amazon Bedrock\n\n| Variable | Default / behavior |\n|---|---|\n| `AWS_REGION` | Primary region source |\n| `AWS_DEFAULT_REGION` | Fallback if `AWS_REGION` unset |\n| `AWS_PROFILE` | Enables named profile auth path |\n| `AWS_ACCESS_KEY_ID` + `AWS_SECRET_ACCESS_KEY` | Enables IAM key auth path |\n| `AWS_BEARER_TOKEN_BEDROCK` | Enables bearer token auth path |\n| `AWS_CONTAINER_CREDENTIALS_RELATIVE_URI` / `AWS_CONTAINER_CREDENTIALS_FULL_URI` | Enables ECS task credential path |\n| `AWS_WEB_IDENTITY_TOKEN_FILE` + `AWS_ROLE_ARN` | Enables web identity auth path |\n| `AWS_BEDROCK_SKIP_AUTH` | If `1`, injects dummy credentials 
(proxy/non-auth scenarios) |\n| `AWS_BEDROCK_FORCE_HTTP1` | If `1`, forces Node HTTP/1 request handler |\n| `HTTPS_PROXY` / `HTTP_PROXY` / `ALL_PROXY` | Routes Bedrock runtime and AWS SSO credential calls through the configured proxy using HTTP/1 |\n| `NO_PROXY` | Excludes matching hosts from proxy routing when a proxy variable is configured |\n\nRegion fallback in provider code: `options.region` → `AWS_REGION` → `AWS_DEFAULT_REGION` → `us-east-1`.\n\n### Azure OpenAI Responses\n\n| Variable | Default / behavior |\n|---|---|\n| `AZURE_OPENAI_API_KEY` | Required unless API key passed as option |\n| `AZURE_OPENAI_API_VERSION` | Default `v1` |\n| `AZURE_OPENAI_BASE_URL` | Direct base URL override |\n| `AZURE_OPENAI_RESOURCE_NAME` | Used to construct base URL: `https://<resource>.openai.azure.com/openai/v1` |\n| `AZURE_OPENAI_DEPLOYMENT_NAME_MAP` | Optional mapping string: `modelId=deploymentName,model2=deployment2` |\n\nBase URL resolution: option `azureBaseUrl` → env `AZURE_OPENAI_BASE_URL` → option/env resource name → `model.baseUrl`.\n\n### Google Vertex AI\n\n| Variable | Required? 
| Notes |\n|---|---|---|\n| `GOOGLE_CLOUD_PROJECT` | Yes (unless passed in options) | Fallback: `GCLOUD_PROJECT` |\n| `GCLOUD_PROJECT` | Fallback | Used as alternate project ID source |\n| `GOOGLE_CLOUD_LOCATION` | Yes (unless passed in options) | No default in provider |\n| `GOOGLE_APPLICATION_CREDENTIALS` | Conditional | If set, file must exist; otherwise ADC fallback path is checked (`~/.config/gcloud/application_default_credentials.json`) |\n\n### Kimi\n\n| Variable | Default / behavior |\n|---|---|\n| `KIMI_CODE_OAUTH_HOST` | Primary OAuth host override |\n| `KIMI_OAUTH_HOST` | Fallback OAuth host override |\n| `KIMI_CODE_BASE_URL` | Overrides Kimi usage endpoint base URL (`usage/kimi.ts`) |\n\nOAuth host chain: `KIMI_CODE_OAUTH_HOST` → `KIMI_OAUTH_HOST` → `https://auth.kimi.com`.\n\n### Antigravity/Gemini image compatibility\n\n| Variable | Default / behavior |\n|---|---|\n| `PI_AI_ANTIGRAVITY_VERSION` | Overrides Antigravity user-agent version tag in Gemini CLI provider |\n\n### OpenAI Codex responses (feature/debug controls)\n\n| Variable | Behavior |\n|---|---|\n| `PI_CODEX_DEBUG` | `1`/`true` enables Codex provider debug logging |\n| `PI_CODEX_WEBSOCKET` | `1`/`true` enables websocket transport preference |\n| `PI_CODEX_WEBSOCKET_V2` | `1`/`true` enables websocket v2 path |\n| `PI_CODEX_WEBSOCKET_IDLE_TIMEOUT_MS` | Positive integer override (default 300000) |\n| `PI_CODEX_WEBSOCKET_RETRY_BUDGET` | Non-negative integer override (default 5) |\n| `PI_CODEX_WEBSOCKET_RETRY_DELAY_MS` | Positive integer base backoff override (default 500) |\n\n### Cursor provider debug\n\n| Variable | Behavior |\n|---|---|\n| `DEBUG_CURSOR` | Enables provider debug logs; `2`/`verbose` for detailed payload snippets |\n| `DEBUG_CURSOR_LOG` | Optional file path for JSONL debug log output |\n\n### Prompt cache compatibility switch\n\n| Variable | Behavior |\n|---|---|\n| `PI_CACHE_RETENTION` | If `long`, enables long retention where supported (`anthropic`, `openai-responses`, 
Bedrock retention resolution) |\n\n---\n\n## 3) Web search subsystem\n\n### Search provider credentials\n\n| Variable | Used by |\n|---|---|\n| `EXA_API_KEY` | Exa search provider and Exa MCP tools |\n| `BRAVE_API_KEY` | Brave search provider |\n| `PERPLEXITY_API_KEY` | Perplexity search provider API-key mode |\n| `TAVILY_API_KEY` | Tavily search provider |\n| `ZAI_API_KEY` | z.ai search provider (also checks stored OAuth in `agent.db`) |\n| `OPENAI_API_KEY` / Codex OAuth in DB | Codex search provider availability/auth |\n\n### Anthropic web search auth chain\n\n`packages/coding-agent/src/web/search/auth.ts` resolves Anthropic web-search credentials in this order:\n\n1. `ANTHROPIC_SEARCH_API_KEY` (+ optional `ANTHROPIC_SEARCH_BASE_URL`)\n2. `models.json` provider entry with `api: \"anthropic-messages\"`\n3. Anthropic OAuth credentials from `agent.db` (must not expire within 5-minute buffer)\n4. Generic Anthropic env fallback: provider key (`ANTHROPIC_FOUNDRY_API_KEY`/`ANTHROPIC_OAUTH_TOKEN`/`ANTHROPIC_API_KEY`) + optional `ANTHROPIC_BASE_URL` (`FOUNDRY_BASE_URL` when Foundry mode is enabled)\n\nRelated vars:\n\n| Variable | Default / behavior |\n|---|---|\n| `ANTHROPIC_SEARCH_API_KEY` | Highest-priority explicit search key |\n| `ANTHROPIC_SEARCH_BASE_URL` | Defaults to `https://api.anthropic.com` when omitted |\n| `ANTHROPIC_SEARCH_MODEL` | Defaults to `claude-haiku-4-5` |\n| `ANTHROPIC_BASE_URL` | Generic fallback base URL for tier-4 auth path |\n\n### Perplexity OAuth flow behavior flag\n\n| Variable | Behavior |\n|---|---|\n| `PI_AUTH_NO_BORROW` | If set, disables macOS native-app token borrowing path in Perplexity login flow |\n\n---\n\n## 4) Python tooling and kernel runtime\n\n| Variable | Default / behavior |\n|---|---|\n| `PI_PY` | Python tool mode override: `0`/`bash`=`bash-only`, `1`/`py`=`ipy-only`, `mix`/`both`=`both`; invalid values ignored |\n| `PI_PYTHON_SKIP_CHECK` | If `1`, skips Python kernel availability checks/warm checks |\n| 
`PI_PYTHON_GATEWAY_URL` | If set, uses external kernel gateway instead of local shared gateway |\n| `PI_PYTHON_GATEWAY_TOKEN` | Optional auth token for external gateway (`Authorization: token <value>`) |\n| `PI_PYTHON_IPC_TRACE` | If `1`, enables low-level IPC trace path in kernel module |\n| `VIRTUAL_ENV` | Highest-priority venv path for Python runtime resolution |\n\nExtra conditional behavior:\n\n- If `BUN_ENV=test` or `NODE_ENV=test`, Python availability checks are treated as OK and warming is skipped.\n- Python env filtering denies common API keys and allows safe base vars + `LC_`, `XDG_`, `PI_` prefixes.\n\n---\n\n## 5) Agent/runtime behavior toggles\n\n| Variable | Default / behavior |\n|----------------------------|----------------------------------------------------------------------------------------------|\n| `PI_SMOL_MODEL` | Ephemeral model-role override for `smol` (CLI `--smol` takes precedence) |\n| `PI_SLOW_MODEL` | Ephemeral model-role override for `slow` (CLI `--slow` takes precedence) |\n| `PI_PLAN_MODEL` | Ephemeral model-role override for `plan` (CLI `--plan` takes precedence) |\n| `PI_NO_TITLE` | If set (any non-empty value), disables auto session title generation on first user message |\n| `NULL_PROMPT` | If `true`, system prompt builder returns empty string |\n| `PI_BLOCKED_AGENT` | Blocks a specific subagent type in task tool |\n| `PI_SUBPROCESS_CMD` | Overrides subagent spawn command (`omp` / `omp.cmd` resolution bypass) |\n| `PI_TASK_MAX_OUTPUT_BYTES` | Max captured output bytes per subagent (default `500000`) |\n| `PI_TASK_MAX_OUTPUT_LINES` | Max captured output lines per subagent (default `5000`) |\n| `PI_TIMING` | If `1`, enables startup/tool timing instrumentation logs |\n| `PI_DEBUG_STARTUP` | Enables startup stage debug prints to stderr in multiple startup paths |\n| `PI_PACKAGE_DIR` | Overrides package asset base dir resolution (docs/examples/changelog path lookup) |\n| `PI_DISABLE_LSPMUX` | If `1`, disables lspmux 
detection/integration and forces direct LSP server spawning |\n| `LM_STUDIO_BASE_URL` | Default implicit LM Studio discovery base URL override (`http://127.0.0.1:1234/v1` if unset) |\n| `OLLAMA_BASE_URL` | Default implicit Ollama discovery base URL override (`http://127.0.0.1:11434` if unset) |\n| `LLAMA_CPP_BASE_URL` | Default implicit Llama.cpp discovery base URL override (`http://127.0.0.1:8080` if unset) |\n| `PI_EDIT_VARIANT` | If `hashline`, forces hashline read/grep display mode when edit tool available |\n| `PI_NO_PTY` | If `1`, disables interactive PTY path for bash tool |\n\n`PI_NO_PTY` is also set internally when CLI `--no-pty` is used.\n\n---\n\n## 6) Storage and config root paths\n\nThese are consumed via `@oh-my-pi/pi-utils/dirs` and affect where coding-agent stores data.\n\n| Variable | Default / behavior |\n|---|---|\n| `PI_CONFIG_DIR` | Config root dirname under home (default `.omp`) |\n| `PI_CODING_AGENT_DIR` | Full override for agent directory (default `~/<PI_CONFIG_DIR or .omp>/agent`) |\n| `PWD` | Used when matching canonical current working directory in path helpers |\n\n---\n\n## 7) Shell/tool execution environment\n\n(From `packages/utils/src/procmgr.ts` and coding-agent bash tool integration.)\n\n| Variable | Behavior |\n|---|---|\n| `PI_BASH_NO_CI` | Suppresses automatic `CI=true` injection into spawned shell env |\n| `CLAUDE_BASH_NO_CI` | Legacy alias fallback for `PI_BASH_NO_CI` |\n| `PI_BASH_NO_LOGIN` | Intended to disable login shell mode |\n| `CLAUDE_BASH_NO_LOGIN` | Legacy alias fallback for `PI_BASH_NO_LOGIN` |\n| `PI_SHELL_PREFIX` | Optional command prefix wrapper |\n| `CLAUDE_CODE_SHELL_PREFIX` | Legacy alias fallback for `PI_SHELL_PREFIX` |\n| `VISUAL` | Preferred external editor command |\n| `EDITOR` | Fallback external editor command |\n\nCurrent implementation note: `PI_BASH_NO_LOGIN`/`CLAUDE_BASH_NO_LOGIN` are read, but current `getShellArgs()` returns `['-l','-c']` in both branches (effectively no-op today).\n\n---\n\n## 8) 
UI/theme/session detection (auto-detected env)\n\nThese are read as runtime signals; they are usually set by the terminal/OS rather than manually configured.\n\n| Variable | Used for |\n|---|---|\n| `COLORTERM`, `TERM`, `WT_SESSION` | Color capability detection (theme color mode) |\n| `COLORFGBG` | Terminal background light/dark auto-detection |\n| `TERM_PROGRAM`, `TERM_PROGRAM_VERSION`, `TERMINAL_EMULATOR` | Terminal identity in system prompt/context |\n| `KDE_FULL_SESSION`, `XDG_CURRENT_DESKTOP`, `DESKTOP_SESSION`, `XDG_SESSION_DESKTOP`, `GDMSESSION`, `WINDOWMANAGER` | Desktop/window-manager detection in system prompt/context |\n| `KITTY_WINDOW_ID`, `TMUX_PANE`, `TERM_SESSION_ID`, `WT_SESSION` | Stable per-terminal session breadcrumb IDs |\n| `SHELL`, `ComSpec`, `TERM_PROGRAM`, `TERM` | System info diagnostics |\n| `APPDATA`, `XDG_CONFIG_HOME` | lspmux config path resolution |\n| `HOME` | Path shortening in MCP command UI |\n\n---\n\n## 9) Native loader/debug flags\n\n| Variable | Behavior |\n|---|---|\n| `PI_DEV` | Enables verbose native addon load diagnostics in `packages/natives` |\n\n## 10) TUI runtime flags (shared package, affects coding-agent UX)\n\n| Variable | Behavior |\n|---|---|\n| `PI_NOTIFICATIONS` | `off` / `0` / `false` suppress desktop notifications |\n| `PI_TUI_WRITE_LOG` | If set, logs TUI writes to file |\n| `PI_HARDWARE_CURSOR` | If `1`, enables hardware cursor mode |\n| `PI_CLEAR_ON_SHRINK` | If `1`, clears empty rows when content shrinks |\n| `PI_DEBUG_REDRAW` | If `1`, enables redraw debug logging |\n| `PI_TUI_DEBUG` | If `1`, enables deep TUI debug dump path |\n\n---\n\n## 11) Commit generation controls\n\n| Variable | Behavior |\n|---|---|\n| `PI_COMMIT_TEST_FALLBACK` | If `true` (case-insensitive), force commit fallback generation path |\n| `PI_COMMIT_NO_FALLBACK` | If `true`, disables fallback when agent returns no proposal |\n| `PI_COMMIT_MAP_REDUCE` | If `false`, disables map-reduce commit analysis path |\n| `DEBUG` | If set, commit 
agent error stack traces are printed |\n\n---\n\n## Security-sensitive variables\n\nTreat these as secrets; do not log or commit them:\n\n- Provider/API keys and OAuth/bearer credentials (all `*_API_KEY`, `*_TOKEN`, OAuth access/refresh tokens)\n- Cloud credentials (`AWS_*`, `GOOGLE_APPLICATION_CREDENTIALS` path may expose service-account material)\n- Search/provider auth vars (`EXA_API_KEY`, `BRAVE_API_KEY`, `PERPLEXITY_API_KEY`, Anthropic search keys)\n- Foundry mTLS material (`CLAUDE_CODE_CLIENT_CERT`, `CLAUDE_CODE_CLIENT_KEY`, `NODE_EXTRA_CA_CERTS` when it points to private CA bundles)\n\nPython runtime also explicitly strips many common key vars before spawning kernel subprocesses (`packages/coding-agent/src/ipy/runtime.ts`).\n",
+ "environment-variables.md": "# Environment Variables (Current Runtime Reference)\n\nThis reference is derived from current code paths in:\n\n- `packages/coding-agent/src/**`\n- `packages/ai/src/**` (provider/auth resolution used by coding-agent)\n- `packages/utils/src/**` and `packages/tui/src/**` where those vars directly affect coding-agent runtime\n\nIt documents only active behavior.\n\n## Resolution model and precedence\n\nMost runtime lookups use `$env` from `@oh-my-pi/pi-utils` (`packages/utils/src/env.ts`).\n\n`$env` loading order:\n\n1. Existing process environment (`Bun.env`)\n2. Project `.env` (`$PWD/.env`) for keys not already set\n3. Home `.env` (`~/.env`) for keys not already set\n\nAdditional rule in `.env` files: `OMP_*` keys are mirrored to `PI_*` keys during parse.\n\n---\n\n## 1) Model/provider authentication\n\nThese are consumed via `getEnvApiKey()` (`packages/ai/src/stream.ts`) unless noted otherwise.\n\n### Core provider credentials\n\n| Variable | Used for | Required when | Notes / precedence |\n|---------------------------------|---|---------------------------------------------------------------|-----------------------------------------------------------------------------------------------------|\n| `ANTHROPIC_OAUTH_TOKEN` | Anthropic API auth | Using Anthropic with OAuth token auth | Takes precedence over `ANTHROPIC_API_KEY` for provider auth resolution |\n| `ANTHROPIC_API_KEY` | Anthropic API auth | Using Anthropic without OAuth token | Fallback after `ANTHROPIC_OAUTH_TOKEN` |\n| `ANTHROPIC_FOUNDRY_API_KEY` | Anthropic via Azure Foundry / enterprise gateway | `CLAUDE_CODE_USE_FOUNDRY` enabled | Takes precedence over `ANTHROPIC_OAUTH_TOKEN` and `ANTHROPIC_API_KEY` when Foundry mode is enabled |\n| `OPENAI_API_KEY` | OpenAI auth | Using OpenAI-family providers without explicit apiKey argument | Used by OpenAI Completions/Responses providers |\n| `GEMINI_API_KEY` | Google Gemini auth | Using `google` provider models | Primary key for 
Gemini provider mapping |\n| `GOOGLE_API_KEY` | Gemini image tool auth fallback | Using `gemini_image` tool without `GEMINI_API_KEY` | Used by coding-agent image tool fallback path |\n| `GROQ_API_KEY` | Groq auth | Using Groq models | |\n| `CEREBRAS_API_KEY` | Cerebras auth | Using Cerebras models | |\n| `TOGETHER_API_KEY` | Together auth | Using `together` provider | |\n| `HUGGINGFACE_HUB_TOKEN` | Hugging Face auth | Using `huggingface` provider | Primary Hugging Face token env var |\n| `HF_TOKEN` | Hugging Face auth | Using `huggingface` provider | Fallback when `HUGGINGFACE_HUB_TOKEN` is unset |\n| `SYNTHETIC_API_KEY` | Synthetic auth | Using Synthetic models | |\n| `NVIDIA_API_KEY` | NVIDIA auth | Using `nvidia` provider | |\n| `NANO_GPT_API_KEY` | NanoGPT auth | Using `nanogpt` provider | |\n| `VENICE_API_KEY` | Venice auth | Using `venice` provider | |\n| `LITELLM_API_KEY` | LiteLLM auth | Using `litellm` provider | OpenAI-compatible LiteLLM proxy key |\n| `LM_STUDIO_API_KEY` | LM Studio auth (optional) | Using `lm-studio` provider with authenticated hosts | Local LM Studio usually runs without auth; any non-empty token works when a key is required |\n| `OLLAMA_API_KEY` | Ollama auth (optional) | Using `ollama` provider with authenticated hosts | Local Ollama usually runs without auth; any non-empty token works when a key is required |\n| `LLAMA_CPP_API_KEY` | llama.cpp auth (optional) | Using `llama-server` with `--api-key` parameter | Local llama.cpp usually runs without auth; any non-empty token works when a key is configured |\n| `XIAOMI_API_KEY` | Xiaomi MiMo auth | Using `xiaomi` provider | |\n| `MOONSHOT_API_KEY` | Moonshot auth | Using `moonshot` provider | |\n| `XAI_API_KEY` | xAI auth | Using xAI models | |\n| `OPENROUTER_API_KEY` | OpenRouter auth | Using OpenRouter models | Also used by image tool when preferred/auto provider is OpenRouter |\n| `MISTRAL_API_KEY` | Mistral auth | Using Mistral models | |\n| `ZAI_API_KEY` | z.ai auth | Using z.ai 
models | Also used by z.ai web search provider |\n| `MINIMAX_API_KEY` | MiniMax auth | Using `minimax` provider | |\n| `MINIMAX_CODE_API_KEY` | MiniMax Code auth | Using `minimax-code` provider | |\n| `MINIMAX_CODE_CN_API_KEY` | MiniMax Code CN auth | Using `minimax-code-cn` provider | |\n| `OPENCODE_API_KEY` | OpenCode auth | Using OpenCode models | |\n| `QIANFAN_API_KEY` | Qianfan auth | Using `qianfan` provider | |\n| `QWEN_OAUTH_TOKEN` | Qwen Portal auth | Using `qwen-portal` with OAuth token | Takes precedence over `QWEN_PORTAL_API_KEY` |\n| `QWEN_PORTAL_API_KEY` | Qwen Portal auth | Using `qwen-portal` with API key | Fallback after `QWEN_OAUTH_TOKEN` |\n| `ZENMUX_API_KEY` | ZenMux auth | Using `zenmux` provider | Used for ZenMux OpenAI and Anthropic-compatible routes |\n| `VLLM_API_KEY` | vLLM auth/discovery opt-in | Using `vllm` provider (local OpenAI-compatible servers) | Any non-empty value works for no-auth local servers |\n| `CURSOR_ACCESS_TOKEN` | Cursor provider auth | Using Cursor provider | |\n| `AI_GATEWAY_API_KEY` | Vercel AI Gateway auth | Using `vercel-ai-gateway` provider | |\n| `CLOUDFLARE_AI_GATEWAY_API_KEY` | Cloudflare AI Gateway auth | Using `cloudflare-ai-gateway` provider | Base URL must be configured as `https://gateway.ai.cloudflare.com/v1/<account>/<gateway>/anthropic` |\n\n### GitHub/Copilot token chains\n\n| Variable | Used for | Chain |\n|---|---|---|\n| `COPILOT_GITHUB_TOKEN` | GitHub Copilot provider auth | `COPILOT_GITHUB_TOKEN` → `GH_TOKEN` → `GITHUB_TOKEN` |\n| `GH_TOKEN` | Copilot fallback; GitHub API auth in web scraper | In web scraper: `GITHUB_TOKEN` → `GH_TOKEN` |\n| `GITHUB_TOKEN` | Copilot fallback; GitHub API auth in web scraper | In web scraper: checked before `GH_TOKEN` |\n\n---\n\n## 2) Provider-specific runtime configuration\n\n### Anthropic Foundry Gateway (Azure / enterprise proxy)\n\nWhen `CLAUDE_CODE_USE_FOUNDRY` is enabled, Anthropic requests switch to Foundry mode:\n\n- Base URL resolves from 
`FOUNDRY_BASE_URL` (fallback remains model/default base URL if unset).\n- API key resolution for provider `anthropic` becomes:\n `ANTHROPIC_FOUNDRY_API_KEY` → `ANTHROPIC_OAUTH_TOKEN` → `ANTHROPIC_API_KEY`.\n- `ANTHROPIC_CUSTOM_HEADERS` is parsed as comma/newline-separated `key: value` pairs and merged into request headers.\n- TLS client/server material can be injected from env values:\n `NODE_EXTRA_CA_CERTS`, `CLAUDE_CODE_CLIENT_CERT`, `CLAUDE_CODE_CLIENT_KEY`.\n Each accepts either:\n - a filesystem path to PEM content, or\n - inline PEM (including escaped `\\n` sequences).\n\n| Variable | Value type | Behavior |\n|---|---|---|\n| `CLAUDE_CODE_USE_FOUNDRY` | Boolean-like string (`1`, `true`, `yes`, `on`) | Enables Foundry mode for Anthropic provider |\n| `FOUNDRY_BASE_URL` | URL string | Anthropic endpoint base URL in Foundry mode |\n| `ANTHROPIC_FOUNDRY_API_KEY` | Token string | Used for `Authorization: Bearer <token>` |\n| `ANTHROPIC_CUSTOM_HEADERS` | Header list string | Extra headers; format `header-a: value, header-b: value` or newline-separated |\n| `NODE_EXTRA_CA_CERTS` | PEM path or inline PEM | Extra CA chain for server certificate validation |\n| `CLAUDE_CODE_CLIENT_CERT` | PEM path or inline PEM | mTLS client certificate |\n| `CLAUDE_CODE_CLIENT_KEY` | PEM path or inline PEM | mTLS client private key (must be paired with cert) |\n\n### Amazon Bedrock\n\n| Variable | Default / behavior |\n|---|---|\n| `AWS_REGION` | Primary region source |\n| `AWS_DEFAULT_REGION` | Fallback if `AWS_REGION` unset |\n| `AWS_PROFILE` | Enables named profile auth path |\n| `AWS_ACCESS_KEY_ID` + `AWS_SECRET_ACCESS_KEY` | Enables IAM key auth path |\n| `AWS_BEARER_TOKEN_BEDROCK` | Enables bearer token auth path |\n| `AWS_CONTAINER_CREDENTIALS_RELATIVE_URI` / `AWS_CONTAINER_CREDENTIALS_FULL_URI` | Enables ECS task credential path |\n| `AWS_WEB_IDENTITY_TOKEN_FILE` + `AWS_ROLE_ARN` | Enables web identity auth path |\n| `AWS_BEDROCK_SKIP_AUTH` | If `1`, injects dummy credentials 
(proxy/non-auth scenarios) |\n| `AWS_BEDROCK_FORCE_HTTP1` | If `1`, forces Node HTTP/1 request handler |\n| `HTTPS_PROXY` / `HTTP_PROXY` / `ALL_PROXY` | Routes Bedrock runtime and AWS SSO credential calls through the configured proxy using HTTP/1 |\n| `NO_PROXY` | Excludes matching hosts from proxy routing when a proxy variable is configured |\n\nRegion fallback in provider code: `options.region` → `AWS_REGION` → `AWS_DEFAULT_REGION` → `us-east-1`.\n\n### Azure OpenAI Responses\n\n| Variable | Default / behavior |\n|---|---|\n| `AZURE_OPENAI_API_KEY` | Required unless API key passed as option |\n| `AZURE_OPENAI_API_VERSION` | Default `v1` |\n| `AZURE_OPENAI_BASE_URL` | Direct base URL override |\n| `AZURE_OPENAI_RESOURCE_NAME` | Used to construct base URL: `https://<resource>.openai.azure.com/openai/v1` |\n| `AZURE_OPENAI_DEPLOYMENT_NAME_MAP` | Optional mapping string: `modelId=deploymentName,model2=deployment2` |\n\nBase URL resolution: option `azureBaseUrl` → env `AZURE_OPENAI_BASE_URL` → option/env resource name → `model.baseUrl`.\n\n### Google Vertex AI\n\n| Variable | Required? 
| Notes |\n|---|---|---|\n| `GOOGLE_CLOUD_PROJECT` | Yes (unless passed in options) | Fallback: `GCLOUD_PROJECT` |\n| `GCLOUD_PROJECT` | Fallback | Used as alternate project ID source |\n| `GOOGLE_CLOUD_LOCATION` | Yes (unless passed in options) | No default in provider |\n| `GOOGLE_APPLICATION_CREDENTIALS` | Conditional | If set, file must exist; otherwise ADC fallback path is checked (`~/.config/gcloud/application_default_credentials.json`) |\n\n### Kimi\n\n| Variable | Default / behavior |\n|---|---|\n| `KIMI_CODE_OAUTH_HOST` | Primary OAuth host override |\n| `KIMI_OAUTH_HOST` | Fallback OAuth host override |\n| `KIMI_CODE_BASE_URL` | Overrides Kimi usage endpoint base URL (`usage/kimi.ts`) |\n\nOAuth host chain: `KIMI_CODE_OAUTH_HOST` → `KIMI_OAUTH_HOST` → `https://auth.kimi.com`.\n\n### Antigravity/Gemini image compatibility\n\n| Variable | Default / behavior |\n|---|---|\n| `PI_AI_ANTIGRAVITY_VERSION` | Overrides Antigravity user-agent version tag in Gemini CLI provider |\n\n### OpenAI Codex responses (feature/debug controls)\n\n| Variable | Behavior |\n|---|---|\n| `PI_CODEX_DEBUG` | `1`/`true` enables Codex provider debug logging |\n| `PI_CODEX_WEBSOCKET` | `1`/`true` enables websocket transport preference |\n| `PI_CODEX_WEBSOCKET_V2` | `1`/`true` enables websocket v2 path |\n| `PI_CODEX_WEBSOCKET_IDLE_TIMEOUT_MS` | Positive integer override (default 300000) |\n| `PI_CODEX_WEBSOCKET_RETRY_BUDGET` | Non-negative integer override (default 5) |\n| `PI_CODEX_WEBSOCKET_RETRY_DELAY_MS` | Positive integer base backoff override (default 500) |\n\n### Cursor provider debug\n\n| Variable | Behavior |\n|---|---|\n| `DEBUG_CURSOR` | Enables provider debug logs; `2`/`verbose` for detailed payload snippets |\n| `DEBUG_CURSOR_LOG` | Optional file path for JSONL debug log output |\n\n### Prompt cache compatibility switch\n\n| Variable | Behavior |\n|---|---|\n| `PI_CACHE_RETENTION` | If `long`, enables long retention where supported (`anthropic`, `openai-responses`, 
Bedrock retention resolution) |\n\n---\n\n## 3) Web search subsystem\n\n### Search provider credentials\n\n| Variable | Used by |\n|---|---|\n| `EXA_API_KEY` | Exa search provider and Exa MCP tools |\n| `BRAVE_API_KEY` | Brave search provider |\n| `PERPLEXITY_API_KEY` | Perplexity search provider API-key mode |\n| `TAVILY_API_KEY` | Tavily search provider |\n| `ZAI_API_KEY` | z.ai search provider (also checks stored OAuth in `agent.db`) |\n| `OPENAI_API_KEY` / Codex OAuth in DB | Codex search provider availability/auth |\n\n### Anthropic web search auth chain\n\n`packages/coding-agent/src/web/search/auth.ts` resolves Anthropic web-search credentials in this order:\n\n1. `ANTHROPIC_SEARCH_API_KEY` (+ optional `ANTHROPIC_SEARCH_BASE_URL`)\n2. `models.json` provider entry with `api: \"anthropic-messages\"`\n3. Anthropic OAuth credentials from `agent.db` (must not expire within 5-minute buffer)\n4. Generic Anthropic env fallback: provider key (`ANTHROPIC_FOUNDRY_API_KEY`/`ANTHROPIC_OAUTH_TOKEN`/`ANTHROPIC_API_KEY`) + optional `ANTHROPIC_BASE_URL` (`FOUNDRY_BASE_URL` when Foundry mode is enabled)\n\nRelated vars:\n\n| Variable | Default / behavior |\n|---|---|\n| `ANTHROPIC_SEARCH_API_KEY` | Highest-priority explicit search key |\n| `ANTHROPIC_SEARCH_BASE_URL` | Defaults to `https://api.anthropic.com` when omitted |\n| `ANTHROPIC_SEARCH_MODEL` | Defaults to `claude-haiku-4-5` |\n| `ANTHROPIC_BASE_URL` | Generic fallback base URL for tier-4 auth path |\n\n### Perplexity OAuth flow behavior flag\n\n| Variable | Behavior |\n|---|---|\n| `PI_AUTH_NO_BORROW` | If set, disables macOS native-app token borrowing path in Perplexity login flow |\n\n---\n\n## 4) Python tooling and kernel runtime\n\n| Variable | Default / behavior |\n|---|---|\n| `PI_PY` | Python tool mode override: `0`/`bash`=`bash-only`, `1`/`py`=`ipy-only`, `mix`/`both`=`both`; invalid values ignored |\n| `PI_PYTHON_SKIP_CHECK` | If `1`, skips Python kernel availability checks/warm checks |\n| 
`PI_PYTHON_GATEWAY_URL` | If set, uses external kernel gateway instead of local shared gateway |\n| `PI_PYTHON_GATEWAY_TOKEN` | Optional auth token for external gateway (`Authorization: token <value>`) |\n| `PI_PYTHON_IPC_TRACE` | If `1`, enables low-level IPC trace path in kernel module |\n| `VIRTUAL_ENV` | Highest-priority venv path for Python runtime resolution |\n\nExtra conditional behavior:\n\n- If `BUN_ENV=test` or `NODE_ENV=test`, Python availability checks are treated as OK and warming is skipped.\n- Python env filtering denies common API keys and allows safe base vars + `LC_`, `XDG_`, `PI_` prefixes.\n\n---\n\n## 5) Agent/runtime behavior toggles\n\n| Variable | Default / behavior |\n|----------------------------|----------------------------------------------------------------------------------------------|\n| `PI_SMOL_MODEL` | Ephemeral model-role override for `smol` (CLI `--smol` takes precedence) |\n| `PI_SLOW_MODEL` | Ephemeral model-role override for `slow` (CLI `--slow` takes precedence) |\n| `PI_PLAN_MODEL` | Ephemeral model-role override for `plan` (CLI `--plan` takes precedence) |\n| `PI_NO_TITLE` | If set (any non-empty value), disables auto session title generation on first user message |\n| `NULL_PROMPT` | If `true`, system prompt builder returns empty string |\n| `PI_BLOCKED_AGENT` | Blocks a specific subagent type in task tool |\n| `PI_SUBPROCESS_CMD` | Overrides subagent spawn command (`omp` / `omp.cmd` resolution bypass) |\n| `PI_TASK_MAX_OUTPUT_BYTES` | Max captured output bytes per subagent (default `500000`) |\n| `PI_TASK_MAX_OUTPUT_LINES` | Max captured output lines per subagent (default `5000`) |\n| `PI_TIMING` | If `1`, enables startup/tool timing instrumentation logs |\n| `PI_DEBUG_STARTUP` | Enables startup stage debug prints to stderr in multiple startup paths |\n| `PI_PACKAGE_DIR` | Overrides package asset base dir resolution (docs/examples/changelog path lookup) |\n| `PI_DISABLE_LSPMUX` | If `1`, disables lspmux 
detection/integration and forces direct LSP server spawning |\n| `LM_STUDIO_BASE_URL` | Default implicit LM Studio discovery base URL override (`http://127.0.0.1:1234/v1` if unset) |\n| `OLLAMA_BASE_URL` | Default implicit Ollama discovery base URL override (`http://127.0.0.1:11434` if unset) |\n| `LLAMA_CPP_BASE_URL` | Default implicit Llama.cpp discovery base URL override (`http://127.0.0.1:8080` if unset) |\n| `PI_EDIT_VARIANT` | If `hashline`, forces hashline read/grep display mode when edit tool available |\n| `PI_NO_PTY` | If `1`, disables interactive PTY path for bash tool |\n\n`PI_NO_PTY` is also set internally when CLI `--no-pty` is used.\n\n---\n\n## 6) Storage and config root paths\n\nThese are consumed via `@oh-my-pi/pi-utils/dirs` and affect where coding-agent stores data.\n\n| Variable | Default / behavior |\n|---|---|\n| `PI_CONFIG_DIR` | Config root dirname under home (default `.omp`) |\n| `PI_CODING_AGENT_DIR` | Full override for agent directory (default `~/<PI_CONFIG_DIR or .omp>/agent`) |\n| `PWD` | Used when matching canonical current working directory in path helpers |\n\n---\n\n## 7) Shell/tool execution environment\n\n(From `packages/utils/src/procmgr.ts` and coding-agent bash tool integration.)\n\n| Variable | Behavior |\n|---|---|\n| `PI_BASH_NO_CI` | Suppresses automatic `CI=true` injection into spawned shell env |\n| `CLAUDE_BASH_NO_CI` | Legacy alias fallback for `PI_BASH_NO_CI` |\n| `PI_BASH_NO_LOGIN` | Intended to disable login shell mode |\n| `CLAUDE_BASH_NO_LOGIN` | Legacy alias fallback for `PI_BASH_NO_LOGIN` |\n| `PI_SHELL_PREFIX` | Optional command prefix wrapper |\n| `CLAUDE_CODE_SHELL_PREFIX` | Legacy alias fallback for `PI_SHELL_PREFIX` |\n| `VISUAL` | Preferred external editor command |\n| `EDITOR` | Fallback external editor command |\n\nCurrent implementation note: `PI_BASH_NO_LOGIN`/`CLAUDE_BASH_NO_LOGIN` are read, but current `getShellArgs()` returns `['-l','-c']` in both branches (effectively no-op today).\n\n---\n\n## 8) 
UI/theme/session detection (auto-detected env)\n\nThese are read as runtime signals; they are usually set by the terminal/OS rather than manually configured.\n\n| Variable | Used for |\n|---|---|\n| `COLORTERM`, `TERM`, `WT_SESSION` | Color capability detection (theme color mode) |\n| `COLORFGBG` | Terminal background light/dark auto-detection |\n| `TERM_PROGRAM`, `TERM_PROGRAM_VERSION`, `TERMINAL_EMULATOR` | Terminal identity in system prompt/context |\n| `KDE_FULL_SESSION`, `XDG_CURRENT_DESKTOP`, `DESKTOP_SESSION`, `XDG_SESSION_DESKTOP`, `GDMSESSION`, `WINDOWMANAGER` | Desktop/window-manager detection in system prompt/context |\n| `KITTY_WINDOW_ID`, `TMUX_PANE`, `TERM_SESSION_ID`, `WT_SESSION` | Stable per-terminal session breadcrumb IDs |\n| `SHELL`, `ComSpec`, `TERM_PROGRAM`, `TERM` | System info diagnostics |\n| `APPDATA`, `XDG_CONFIG_HOME` | lspmux config path resolution |\n| `HOME` | Path shortening in MCP command UI |\n\n---\n\n## 9) TUI runtime flags (shared package, affects coding-agent UX)\n\n| Variable | Behavior |\n|---|---|\n| `PI_NOTIFICATIONS` | `off` / `0` / `false` suppress desktop notifications |\n| `PI_TUI_WRITE_LOG` | If set, logs TUI writes to file |\n| `PI_HARDWARE_CURSOR` | If `1`, enables hardware cursor mode |\n| `PI_CLEAR_ON_SHRINK` | If `1`, clears empty rows when content shrinks |\n| `PI_DEBUG_REDRAW` | If `1`, enables redraw debug logging |\n| `PI_TUI_DEBUG` | If `1`, enables deep TUI debug dump path |\n\n---\n\n## 11) Commit generation controls\n\n| Variable | Behavior |\n|---|---|\n| `PI_COMMIT_TEST_FALLBACK` | If `true` (case-insensitive), force commit fallback generation path |\n| `PI_COMMIT_NO_FALLBACK` | If `true`, disables fallback when agent returns no proposal |\n| `PI_COMMIT_MAP_REDUCE` | If `false`, disables map-reduce commit analysis path |\n| `DEBUG` | If set, commit agent error stack traces are printed |\n\n---\n\n## Security-sensitive variables\n\nTreat these as secrets; do not log or commit them:\n\n- Provider/API keys 
and OAuth/bearer credentials (all `*_API_KEY`, `*_TOKEN`, OAuth access/refresh tokens)\n- Cloud credentials (`AWS_*`, `GOOGLE_APPLICATION_CREDENTIALS` path may expose service-account material)\n- Search/provider auth vars (`EXA_API_KEY`, `BRAVE_API_KEY`, `PERPLEXITY_API_KEY`, Anthropic search keys)\n- Foundry mTLS material (`CLAUDE_CODE_CLIENT_CERT`, `CLAUDE_CODE_CLIENT_KEY`, `NODE_EXTRA_CA_CERTS` when it points to private CA bundles)\n\nPython runtime also explicitly strips many common key vars before spawning kernel subprocesses (`packages/coding-agent/src/ipy/runtime.ts`).\n",
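The four-tier Anthropic web-search auth chain described in section 3 can be sketched as caller logic. This is an illustrative reduction under assumed names (`resolveSearchAuth`, `SearchAuth` are hypothetical; tiers 2 and 3 are elided as comments), not the actual `auth.ts` implementation:

```typescript
// Hypothetical sketch of the Anthropic web-search credential chain;
// names are illustrative, not the real auth.ts API.
type SearchAuth = { apiKey: string; baseUrl: string; source: string };

function resolveSearchAuth(env: Record<string, string | undefined>): SearchAuth | undefined {
	// Tier 1: explicit search key wins outright.
	if (env.ANTHROPIC_SEARCH_API_KEY) {
		return {
			apiKey: env.ANTHROPIC_SEARCH_API_KEY,
			baseUrl: env.ANTHROPIC_SEARCH_BASE_URL ?? "https://api.anthropic.com",
			source: "search-env",
		};
	}
	// (Tier 2: models.json provider entry and tier 3: OAuth from agent.db
	// would be checked here.)
	// Tier 4: generic Anthropic env fallback, in provider-key precedence order.
	const key =
		env.ANTHROPIC_FOUNDRY_API_KEY ?? env.ANTHROPIC_OAUTH_TOKEN ?? env.ANTHROPIC_API_KEY;
	if (key) {
		return {
			apiKey: key,
			baseUrl: env.ANTHROPIC_BASE_URL ?? "https://api.anthropic.com",
			source: "generic-env",
		};
	}
	return undefined;
}
```

The default base URL applies in both tiers shown; only the explicit `ANTHROPIC_SEARCH_BASE_URL` / `ANTHROPIC_BASE_URL` overrides differ.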
  "extension-loading.md": "# Extension Loading (TypeScript/JavaScript Modules)\n\nThis document covers how the coding agent discovers and loads **extension modules** (`.ts`/`.js`) at startup.\n\nIt does **not** cover `gemini-extension.json` manifest extensions (documented separately).\n\n## What this subsystem does\n\nExtension loading builds a list of module entry files, imports each module with Bun, executes its factory, and returns:\n\n- loaded extension definitions\n- per-path load errors (without aborting the whole load)\n- a shared extension runtime object used later by `ExtensionRunner`\n\n## Primary implementation files\n\n- `src/extensibility/extensions/loader.ts` — path discovery + import/execution\n- `src/extensibility/extensions/index.ts` — public exports\n- `src/extensibility/extensions/runner.ts` — runtime/event execution after load\n- `src/discovery/builtin.ts` — native auto-discovery provider for extension modules\n- `src/config/settings.ts` — loads merged `extensions` / `disabledExtensions` settings\n\n---\n\n## Inputs to extension loading\n\n### 1) Auto-discovered native extension modules\n\n`discoverAndLoadExtensions()` first asks discovery providers for `extension-module` capability items, then keeps only provider `native` items.\n\nEffective native locations:\n\n- Project: `<cwd>/.omp/extensions`\n- User: `~/.omp/agent/extensions`\n\nPath roots come from the native provider (`SOURCE_PATHS.native`).\n\nNotes:\n\n- Native auto-discovery is currently `.omp` based.\n- Legacy `.pi` is still accepted in `package.json` manifest keys (`pi.extensions`), but not as a native root here.\n\n### 2) Explicitly configured paths\n\nAfter auto-discovery, configured paths are appended and resolved.\n\nConfigured path sources in the main session startup path (`sdk.ts`):\n\n1. CLI-provided paths (`--extension/-e`, and `--hook` is also treated as an extension path)\n2. 
Settings `extensions` array (merged global + project settings)\n\nGlobal settings file:\n\n- `~/.omp/agent/config.yml` (or custom agent dir via `PI_CODING_AGENT_DIR`)\n\nProject settings file:\n\n- `<cwd>/.omp/settings.json`\n\nExamples:\n\n```yaml\n# ~/.omp/agent/config.yml\nextensions:\n - ~/my-exts/safety.ts\n - ./local/ext-pack\n```\n\n```json\n{\n \"extensions\": [\"./.omp/extensions/my-extra\"]\n}\n```\n\n---\n\n## Enable/disable controls\n\n### Disable discovery\n\n- CLI: `--no-extensions`\n- SDK option: `disableExtensionDiscovery`\n\nBehavior split:\n\n- SDK: when `disableExtensionDiscovery=true`, it still loads `additionalExtensionPaths` via `loadExtensions()`.\n- CLI path building (`main.ts`) currently clears CLI extension paths when `--no-extensions` is set, so explicit `-e/--hook` are not forwarded in that mode.\n\n### Disable specific extension modules\n\n`disabledExtensions` setting filters by extension id format:\n\n- `extension-module:<derivedName>`\n\n`derivedName` is based on entry path (`getExtensionNameFromPath`), for example:\n\n- `/x/foo.ts` -> `foo`\n- `/x/bar/index.ts` -> `bar`\n\nExample:\n\n```yaml\ndisabledExtensions:\n - extension-module:foo\n```\n\n---\n\n## Path and entry resolution\n\n### Path normalization\n\nFor configured paths:\n\n1. Normalize unicode spaces\n2. Expand `~`\n3. If relative, resolve against current `cwd`\n\n### If configured path is a file\n\nIt is used directly as a module entry candidate.\n\n### If configured path is a directory\n\nResolution order:\n\n1. `package.json` in that directory with `omp.extensions` (or legacy `pi.extensions`) -> use declared entries\n2. `index.ts`\n3. `index.js`\n4. 
Otherwise scan one level for extension entries:\n - direct `*.ts` / `*.js`\n - subdir `index.ts` / `index.js`\n - subdir `package.json` with `omp.extensions` / `pi.extensions`\n\nRules and constraints:\n\n- no recursive discovery beyond one subdirectory level\n- declared `extensions` manifest entries are resolved relative to that package directory\n- declared entries are included only if file exists/access is allowed\n- in `*/index.{ts,js}` pairs, TypeScript is preferred over JavaScript\n- symlinks are treated as eligible files/directories\n\n### Ignore behavior differs by source\n\n- Native auto-discovery (`discoverExtensionModulePaths` in discovery helpers) uses native glob with `gitignore: true` and `hidden: false`.\n- Explicit configured directory scanning in `loader.ts` uses `readdir` rules and does **not** apply gitignore filtering.\n\n---\n\n## Load order and precedence\n\n`discoverAndLoadExtensions()` builds one ordered list and then calls `loadExtensions()`.\n\nOrder:\n\n1. Native auto-discovered modules\n2. Explicit configured paths (in provided order)\n\nIn `sdk.ts`, configured order is:\n\n1. CLI additional paths\n2. Settings `extensions`\n\nDe-duplication:\n\n- absolute path based\n- first seen path wins\n- later duplicates are ignored\n\nImplication: if the same module path is both auto-discovered and explicitly configured, it is loaded once at the first position (auto-discovered stage).\n\n---\n\n## Module import and factory contract\n\nEach candidate path is loaded with dynamic import:\n\n- `await import(resolvedPath)`\n- factory is `module.default ?? 
module`\n- factory must be a function (`ExtensionFactory`)\n\nIf export is not a function, that path fails with a structured error and loading continues.\n\n---\n\n## Failure handling and isolation\n\n### During loading\n\nPer extension path, failures are captured as `{ path, error }` and do not stop other paths from loading.\n\nCommon cases:\n\n- import failure / missing file\n- invalid factory export (non-function)\n- exception thrown while executing factory\n\n### Runtime isolation model\n\n- Extensions are **not sandboxed** (same process/runtime).\n- They share one `EventBus` and one `ExtensionRuntime` instance.\n- During load, runtime action methods intentionally throw `ExtensionRuntimeNotInitializedError`; action wiring happens later in `ExtensionRunner.initialize()`.\n\n### After loading\n\nWhen events run through `ExtensionRunner`, handler exceptions are caught and emitted as extension errors instead of crashing the runner loop.\n\n---\n\n## Minimal user/project layout examples\n\n### User-level\n\n```text\n~/.omp/agent/\n config.yml\n extensions/\n guardrails.ts\n audit/\n index.ts\n```\n\n### Project-level\n\n```text\n<repo>/\n .omp/\n settings.json\n extensions/\n checks/\n package.json\n lint-gates.ts\n```\n\n`checks/package.json`:\n\n```json\n{\n \"omp\": {\n \"extensions\": [\"./src/check-a.ts\", \"./src/check-b.js\"]\n }\n}\n```\n\nLegacy manifest key still accepted:\n\n```json\n{\n \"pi\": {\n \"extensions\": [\"./index.ts\"]\n }\n}\n```\n",
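Two of the rules above (derived extension names for `disabledExtensions` matching, and first-seen-wins path de-duplication) can be sketched as standalone helpers. This is an illustrative approximation, not the loader's actual code:

```typescript
// Illustrative sketch of the documented rules, not the real loader.
// `/x/foo.ts` -> `foo`, `/x/bar/index.ts` -> `bar`
function getDerivedExtensionName(entryPath: string): string {
	const parts = entryPath.split("/").filter(Boolean);
	const file = parts[parts.length - 1] ?? "";
	const base = file.replace(/\.(ts|js)$/, "");
	// index entries take their containing directory's name
	return base === "index" ? parts[parts.length - 2] ?? base : base;
}

// First seen absolute path wins; later duplicates are ignored.
function dedupePaths(paths: string[]): string[] {
	const seen = new Set<string>();
	return paths.filter((p) => (seen.has(p) ? false : (seen.add(p), true)));
}
```

Under this model, disabling `/x/bar/index.ts` requires `extension-module:bar`, and a module that is both auto-discovered and explicitly configured keeps its auto-discovered (earlier) position.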
  "extensions.md": "# Extensions\n\nPrimary guide for authoring runtime extensions in `packages/coding-agent`.\n\nThis document covers the current extension runtime in:\n\n- `src/extensibility/extensions/types.ts`\n- `src/extensibility/extensions/runner.ts`\n- `src/extensibility/extensions/wrapper.ts`\n- `src/extensibility/extensions/index.ts`\n- `src/modes/controllers/extension-ui-controller.ts`\n\nFor discovery paths and filesystem loading rules, see `docs/extension-loading.md`.\n\n## What an extension is\n\nAn extension is a TS/JS module exporting a default factory:\n\n```ts\nimport type { ExtensionAPI } from \"@oh-my-pi/pi-coding-agent\";\n\nexport default function myExtension(pi: ExtensionAPI) {\n\t// register handlers/tools/commands/renderers\n}\n```\n\nExtensions can combine all of the following in one module:\n\n- event handlers (`pi.on(...)`)\n- LLM-callable tools (`pi.registerTool(...)`)\n- slash commands (`pi.registerCommand(...)`)\n- keyboard shortcuts and flags\n- custom message rendering\n- session/message injection APIs (`sendMessage`, `sendUserMessage`, `appendEntry`)\n\n## Runtime model\n\n1. Extensions are imported and their factory functions run.\n2. During that load phase, registration methods are valid; runtime action methods are not yet initialized.\n3. `ExtensionRunner.initialize(...)` wires live actions/contexts for the active mode.\n4. Session/agent/tool lifecycle events are emitted to handlers.\n5. 
Every tool execution is wrapped with extension interception (`tool_call` / `tool_result`).\n\n```text\nExtension lifecycle (simplified)\n\nload paths\n │\n ▼\nimport module + run factory (registration only)\n │\n ▼\nExtensionRunner.initialize(mode/session/tool registry)\n │\n ├─ emit session/agent events to handlers\n ├─ wrap tool execution (tool_call/tool_result)\n └─ expose runtime actions (sendMessage, setActiveTools, ...)\n```\n\nImportant constraint from `loader.ts`:\n\n- calling action methods like `pi.sendMessage()` during extension load throws `ExtensionRuntimeNotInitializedError`\n- register first; perform runtime behavior from events/commands/tools\n\n## Quick start\n\n```ts\nimport type { ExtensionAPI } from \"@oh-my-pi/pi-coding-agent\";\nimport { Type } from \"@sinclair/typebox\";\n\nexport default function (pi: ExtensionAPI) {\n\tpi.setLabel(\"Safety + Utilities\");\n\n\tpi.on(\"session_start\", async (_event, ctx) => {\n\t\tctx.ui.notify(`Extension loaded in ${ctx.cwd}`, \"info\");\n\t});\n\n\tpi.on(\"tool_call\", async (event) => {\n\t\tif (event.toolName === \"bash\" && event.input.command?.includes(\"rm -rf\")) {\n\t\t\treturn { block: true, reason: \"Blocked by extension policy\" };\n\t\t}\n\t});\n\n\tpi.registerTool({\n\t\tname: \"hello_extension\",\n\t\tlabel: \"Hello Extension\",\n\t\tdescription: \"Return a greeting\",\n\t\tparameters: Type.Object({ name: Type.String() }),\n\t\tasync execute(_toolCallId, params, _signal, _onUpdate, _ctx) {\n\t\t\treturn {\n\t\t\t\tcontent: [{ type: \"text\", text: `Hello, ${params.name}` }],\n\t\t\t\tdetails: { greeted: params.name },\n\t\t\t};\n\t\t},\n\t});\n\n\tpi.registerCommand(\"hello-ext\", {\n\t\tdescription: \"Show queue state\",\n\t\thandler: async (_args, ctx) => {\n\t\t\tctx.ui.notify(`pending=${ctx.hasPendingMessages()}`, \"info\");\n\t\t},\n\t});\n}\n```\n\n## Extension API surfaces\n\n## 1) Registration and actions (`ExtensionAPI`)\n\nCore methods:\n\n- `on(event, handler)`\n- `registerTool`, 
`registerCommand`, `registerShortcut`, `registerFlag`\n- `registerMessageRenderer`\n- `sendMessage`, `sendUserMessage`, `appendEntry`\n- `getActiveTools`, `getAllTools`, `setActiveTools`\n- `getSessionName`, `setSessionName`\n- `setModel`, `getThinkingLevel`, `setThinkingLevel`\n- `registerProvider`\n- `events` (shared event bus)\n\nIn interactive mode, `input` handlers run before the built-in first-message auto-title check. Extensions that call `await pi.setSessionName(...)` from `input` can set the persisted session name and prevent the default auto-generated title from running for that session.\n\nAlso exposed:\n\n- `pi.logger`\n- `pi.typebox`\n- `pi.pi` (package exports)\n\n### Message delivery semantics\n\n`pi.sendMessage(message, options)` supports:\n\n- `deliverAs: \"steer\"` (default) — interrupts current run\n- `deliverAs: \"followUp\"` — queued to run after current run\n- `deliverAs: \"nextTurn\"` — stored and injected on the next user prompt\n- `triggerTurn: true` — starts a turn when idle (`nextTurn` ignores this)\n\n`pi.sendUserMessage(content, { deliverAs })` always goes through prompt flow; while streaming it queues as steer/follow-up.\n\n## 2) Handler context (`ExtensionContext`)\n\nHandlers and tool `execute` receive `ctx` with:\n\n- `ui`\n- `hasUI`\n- `cwd`\n- `sessionManager` (read-only)\n- `modelRegistry`, `model`\n- `getContextUsage()`\n- `compact(...)`\n- `isIdle()`, `hasPendingMessages()`, `abort()`\n- `shutdown()`\n- `getSystemPrompt()`\n\n## 3) Command context (`ExtensionCommandContext`)\n\nCommand handlers additionally get:\n\n- `waitForIdle()`\n- `newSession(...)`\n- `switchSession(...)`\n- `branch(entryId)`\n- `navigateTree(targetId, { summarize })`\n- `reload()`\n\nUse command context for session-control flows; these methods are intentionally separated from general event handlers.\n\n## Event surface (current names and behavior)\n\nCanonical event unions and payload types are in `types.ts`.\n\n### Session lifecycle\n\n- 
`session_start`\n- `session_before_switch` / `session_switch`\n- `session_before_branch` / `session_branch`\n- `session_before_compact` / `session.compacting` / `session_compact`\n- `session_before_tree` / `session_tree`\n- `session_shutdown`\n\nCancelable pre-events:\n\n- `session_before_switch` → `{ cancel?: boolean }`\n- `session_before_branch` → `{ cancel?: boolean; skipConversationRestore?: boolean }`\n- `session_before_compact` → `{ cancel?: boolean; compaction?: CompactionResult }`\n- `session_before_tree` → `{ cancel?: boolean; summary?: { summary: string; details?: unknown } }`\n\n### Prompt and turn lifecycle\n\n- `input`\n- `before_agent_start`\n- `context`\n- `agent_start` / `agent_end`\n- `turn_start` / `turn_end`\n- `message_start` / `message_update` / `message_end`\n\n### Tool lifecycle\n\n- `tool_call` (pre-exec, may block)\n- `tool_result` (post-exec, may patch content/details/isError)\n- `tool_execution_start` / `tool_execution_update` / `tool_execution_end` (observability)\n\n`tool_result` is middleware-style: handlers run in extension order and each sees prior modifications.\n\n### Reliability/runtime signals\n\n- `auto_compaction_start` / `auto_compaction_end`\n- `auto_retry_start` / `auto_retry_end`\n- `ttsr_triggered`\n- `todo_reminder`\n\n### User command interception\n\n- `user_bash` (override with `{ result }`)\n- `user_python` (override with `{ result }`)\n\n### `resources_discover`\n\n`resources_discover` exists in extension types and `ExtensionRunner`.\nCurrent runtime note: `ExtensionRunner.emitResourcesDiscover(...)` is implemented, but there are no `AgentSession` callsites invoking it in the current codebase.\n\n## Tool authoring details\n\n`registerTool` uses `ToolDefinition` from `types.ts`.\n\nCurrent `execute` signature:\n\n```ts\nexecute(\n\ttoolCallId,\n\tparams,\n\tsignal,\n\tonUpdate,\n\tctx,\n): Promise<AgentToolResult>\n```\n\nTemplate:\n\n```ts\npi.registerTool({\n\tname: \"my_tool\",\n\tlabel: \"My Tool\",\n\tdescription: 
\"...\",\n\tparameters: Type.Object({}),\n\tasync execute(_id, _params, signal, onUpdate, ctx) {\n\t\tif (signal?.aborted) {\n\t\t\treturn { content: [{ type: \"text\", text: \"Cancelled\" }] };\n\t\t}\n\t\tonUpdate?.({ content: [{ type: \"text\", text: \"Working...\" }] });\n\t\treturn { content: [{ type: \"text\", text: \"Done\" }], details: {} };\n\t},\n\tonSession(event, ctx) {\n\t\t// reason: start|switch|branch|tree|shutdown\n\t},\n\trenderCall(args, theme) {\n\t\t// optional TUI render\n\t},\n\trenderResult(result, options, theme, args) {\n\t\t// optional TUI render\n\t},\n});\n```\n\n`tool_call`/`tool_result` intercept all tools once the registry is wrapped in `sdk.ts`, including built-ins and extension/custom tools.\n\n## UI integration points\n\n`ctx.ui` implements the `ExtensionUIContext` interface. Support differs by mode.\n\n### Interactive mode (`extension-ui-controller.ts`)\n\nSupported:\n\n- dialogs: `select`, `confirm`, `input`, `editor`\n- notifications/status/editor text/terminal input/custom overlays\n- theme listing/loading by name (`setTheme` supports string names)\n- tools expanded toggle\n\nCurrent no-op methods in this controller:\n\n- `setFooter`\n- `setHeader`\n- `setEditorComponent`\n\nAlso note: `setWidget` currently routes to status-line text via `setHookWidget(...)`.\n\n### RPC mode (`rpc-mode.ts`)\n\n`ctx.ui` is backed by RPC `extension_ui_request` events:\n\n- dialog methods (`select`, `confirm`, `input`, `editor`) round-trip to client responses\n- fire-and-forget methods emit requests (`notify`, `setStatus`, `setWidget` for string arrays, `setTitle`, `setEditorText`)\n\nUnsupported/no-op in RPC implementation:\n\n- `onTerminalInput`\n- `custom`\n- `setFooter`, `setHeader`, `setEditorComponent`\n- `setWorkingMessage`\n- theme switching/loading (`setTheme` returns failure)\n- tool expansion controls are inert\n\n### Print/headless/subagent paths\n\nWhen no UI context is supplied to runner init, `ctx.hasUI` is `false` and methods are 
no-op/default-returning.\n\n### Background interactive mode\n\nBackground mode installs a non-interactive UI context object. In current implementation, `ctx.hasUI` may still be `true` while interactive dialogs return defaults/no-op behavior.\n\n## Session and state patterns\n\nFor durable extension state:\n\n1. Persist with `pi.appendEntry(customType, data)`.\n2. Rebuild state from `ctx.sessionManager.getBranch()` on `session_start`, `session_branch`, `session_tree`.\n3. Keep tool result `details` structured when state should be visible/reconstructible from tool result history.\n\nExample reconstruction pattern:\n\n```ts\npi.on(\"session_start\", async (_event, ctx) => {\n\tlet latest;\n\tfor (const entry of ctx.sessionManager.getBranch()) {\n\t\tif (entry.type === \"custom\" && entry.customType === \"my-state\") {\n\t\t\tlatest = entry.data;\n\t\t}\n\t}\n\t// restore from latest\n});\n```\n\n## Rendering extension points\n\n## Custom message renderer\n\n```ts\npi.registerMessageRenderer(\"my-type\", (message, { expanded }, theme) => {\n\t// return pi-tui Component\n});\n```\n\nUsed by interactive rendering when custom messages are displayed.\n\n## Tool call/result renderer\n\nProvide `renderCall` / `renderResult` on `registerTool` definitions for custom tool visualization in TUI.\n\n## Constraints and pitfalls\n\n- Runtime actions are unavailable during extension load.\n- `tool_call` errors block execution (fail-closed).\n- Command name conflicts with built-ins are skipped with diagnostics.\n- Reserved shortcuts are ignored (`ctrl+c`, `ctrl+d`, `ctrl+z`, `ctrl+k`, `ctrl+p`, `ctrl+l`, `ctrl+o`, `ctrl+t`, `ctrl+g`, `shift+tab`, `shift+ctrl+p`, `alt+enter`, `escape`, `enter`).\n- Treat `ctx.reload()` as terminal for the current command handler frame.\n\n## Extensions vs hooks vs custom-tools\n\nUse the right surface:\n\n- **Extensions** (`src/extensibility/extensions/*`): unified system (events + tools + commands + renderers + provider registration).\n- **Hooks** 
(`src/extensibility/hooks/*`): separate legacy event API.\n- **Custom-tools** (`src/extensibility/custom-tools/*`): tool-focused modules; when loaded alongside extensions they are adapted and still pass through extension interception wrappers.\n\nIf you need one package that owns policy, tools, command UX, and rendering together, use extensions.\n",
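The middleware-style `tool_result` pass described above (handlers run in extension order, each seeing prior modifications) can be sketched with simplified stand-in types; the real payload and handler shapes live in `types.ts`:

```typescript
// Minimal sketch of middleware-style tool_result chaining; types are
// simplified stand-ins, not the real extension payloads.
type ToolResult = { text: string; isError: boolean };
type ToolResultHandler = (r: ToolResult) => Partial<ToolResult> | undefined;

// Handlers run in extension order; each sees prior modifications.
function applyToolResultHandlers(
	initial: ToolResult,
	handlers: ToolResultHandler[],
): ToolResult {
	let result = initial;
	for (const handler of handlers) {
		const patch = handler(result);
		if (patch) result = { ...result, ...patch };
	}
	return result;
}
```

The ordering matters: a later extension observing `r.text` sees the value an earlier extension already patched in, which is why policy extensions should register before cosmetic ones.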
  "fs-scan-cache-architecture.md": "# Filesystem Scan Cache Architecture Contract\n\nThis document defines the current contract for the shared filesystem scan cache implemented in Rust (`crates/pi-natives/src/fs_cache.rs`) and consumed by native discovery/search APIs exposed to `packages/coding-agent`.\n\n## What this cache is\n\nThe cache stores full directory-scan entry lists (`GlobMatch[]`) keyed by scan scope and traversal policy, then lets higher-level operations (glob filtering, fuzzy scoring, grep file selection) run against those cached entries.\n\nPrimary goals:\n- avoid repeated filesystem walks for repeated discovery/search calls\n- keep consistency across `glob`, `fuzzyFind`, and `grep` when they share the same scan policy\n- allow explicit staleness recovery for empty results and explicit invalidation after file mutations\n\n## Ownership and public surface\n\n- Cache implementation and policy: `crates/pi-natives/src/fs_cache.rs`\n- Native consumers:\n - `crates/pi-natives/src/glob.rs`\n - `crates/pi-natives/src/fd.rs` (`fuzzyFind`)\n - `crates/pi-natives/src/grep.rs`\n- JS binding/export:\n - `packages/natives/src/glob/index.ts` (`invalidateFsScanCache`)\n - `packages/natives/src/glob/types.ts`\n - `packages/natives/src/grep/types.ts`\n- Coding-agent mutation invalidation helpers:\n - `packages/coding-agent/src/tools/fs-cache-invalidation.ts`\n\n## Cache key partitioning (hard contract)\n\nEach entry is keyed by:\n- canonicalized `root` directory path\n- `include_hidden` boolean\n- `use_gitignore` boolean\n\nImplications:\n- Hidden and non-hidden scans do **not** share entries.\n- Gitignore-respecting and ignore-disabled scans do **not** share entries.\n- Consumers must pass stable semantics for hidden/gitignore behavior; changing either flag creates a different cache partition.\n\n`node_modules` inclusion is **not** in the cache key. 
The cache stores entries with `node_modules` included; per-consumer filtering is applied after retrieval.\n\n## Scan collection behavior\n\nCache population uses a deterministic walker (`ignore::WalkBuilder`) configured by `include_hidden` and `use_gitignore`:\n- `follow_links(false)`\n- sorted by file path\n- `.git` is always skipped\n- `node_modules` is always collected at cache-scan time (and optionally filtered later)\n- entry file type + `mtime` are captured via `symlink_metadata`\n\nSearch roots are resolved by `resolve_search_path`:\n- relative paths are resolved against current cwd\n- target must be an existing directory\n- root is canonicalized when possible\n\n## Freshness and eviction policy\n\nGlobal policy (environment-overridable):\n- `FS_SCAN_CACHE_TTL_MS` (default `1000`)\n- `FS_SCAN_EMPTY_RECHECK_MS` (default `200`)\n- `FS_SCAN_CACHE_MAX_ENTRIES` (default `16`)\n\nBehavior:\n- `get_or_scan(...)`\n - if TTL is `0`: bypass cache entirely, always fresh scan (`cache_age_ms = 0`)\n - on cache hit within TTL: return cached entries + non-zero `cache_age_ms`\n - on expired hit: evict key, rescan, store fresh entry\n- max entry enforcement is oldest-first eviction by `created_at`\n\n## Empty-result fast recheck (separate from normal hits)\n\nNormal cache hit:\n- a cache hit inside TTL returns cached entries and does nothing else.\n\nEmpty-result fast recheck:\n- this is a **caller-side** policy using `ScanResult.cache_age_ms`\n- if filtered/query result is empty and cached scan age is at least `empty_recheck_ms()`, caller performs one `force_rescan(...)` and retries\n- intended to reduce stale-negative results when files were recently added but cache is still within TTL\n\nCurrent consumers:\n- `glob`: rechecks when filtered matches are empty and scan age exceeds threshold\n- `fuzzyFind` (`fd.rs`): rechecks only when query is non-empty and scored matches are empty\n- `grep`: rechecks when selected candidate file list is empty\n\n## Consumer defaults and 
cache usage\n\nCache is opt-in on all exposed APIs (`cache?: boolean`, default `false`).\n\nCurrent defaults in native APIs:\n- `glob`: `hidden=false`, `gitignore=true`, `cache=false`\n- `fuzzyFind`: `hidden=false`, `gitignore=true`, `cache=false`\n- `grep`: `hidden=true`, `cache=false`, and cache scan always uses `use_gitignore=true`\n\nCoding-agent callers today:\n- High-volume mention candidate discovery enables cache:\n - `packages/coding-agent/src/utils/file-mentions.ts`\n - profile: `hidden=true`, `gitignore=true`, `includeNodeModules=true`, `cache=true`\n- Tool-level `grep` integration currently disables scan cache (`cache: false`):\n - `packages/coding-agent/src/tools/grep.ts`\n\n## Invalidation contract\n\nNative invalidation entrypoint:\n- `invalidateFsScanCache(path?: string)`\n - with `path`: remove cache entries whose root is a prefix of target path\n - without path: clear all scan cache entries\n\nPath handling details:\n- relative invalidation paths are resolved against cwd\n- invalidation attempts canonicalization\n- if target does not exist (e.g., delete), fallback canonicalizes parent and reattaches filename when possible\n- this preserves invalidation behavior for create/delete/rename where one side may not exist\n\n## Coding-agent mutation flow responsibilities\n\nCoding-agent code must invalidate after successful filesystem mutations.\n\nCentral helpers:\n- `invalidateFsScanAfterWrite(path)`\n- `invalidateFsScanAfterDelete(path)`\n- `invalidateFsScanAfterRename(oldPath, newPath)` (invalidates both sides when paths differ)\n\nCurrent mutation tool callsites:\n- `packages/coding-agent/src/tools/write.ts`\n- `packages/coding-agent/src/patch/index.ts` (hashline/patch/replace flows)\n\nRule: if a flow mutates filesystem content or location and bypasses these helpers, cache staleness bugs are expected.\n\n## Adding a new cache consumer safely\n\nWhen introducing cache use in a new scanner/search path:\n\n1. 
**Use stable scan policy inputs**\n - decide hidden/gitignore semantics first\n - pass them consistently to `get_or_scan`/`force_rescan` so cache partitions are intentional\n\n2. **Treat cache data as pre-filtered only by traversal policy**\n - apply tool-specific filtering (glob patterns, type filters, node_modules rules) after retrieval\n - never assume cached entries already reflect your higher-level filters\n\n3. **Implement empty-result fast recheck only for stale-negative risk**\n - use `scan.cache_age_ms >= empty_recheck_ms()`\n - retry once with `force_rescan(..., store=true, ...)`\n - keep this path separate from normal cache-hit logic\n\n4. **Respect no-cache mode explicitly**\n - when caller disables cache, call `force_rescan(..., store=false, ...)`\n - do not populate shared cache in a no-cache request path\n\n5. **Wire mutation invalidation for any new write path**\n - after successful write/edit/delete/rename, call the coding-agent invalidation helper\n - for rename/move, invalidate both old and new paths\n\n6. **Do not add per-call TTL knobs**\n - current contract is global policy only (env-configured), no per-request TTL override\n\n## Known boundaries\n\n- Cache scope is process-local in-memory (`DashMap`), not persisted across process restarts.\n- Cache stores scan entries, not final tool results.\n- `glob`/`fuzzyFind`/`grep` share scan entries only when key dimensions (`root`, `hidden`, `gitignore`) match.\n- `.git` is always excluded at scan collection time regardless of caller options.\n",
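The caller-side empty-result fast recheck (step 3 above) can be sketched as follows; `scan`/`rescan` are hypothetical stand-ins for the `get_or_scan` / `force_rescan` bindings:

```typescript
// Caller-side sketch of the empty-result fast recheck contract;
// scan/rescan stand in for get_or_scan / force_rescan bindings.
type Scan = { entries: string[]; cacheAgeMs: number };

function queryWithRecheck(
	scan: () => Scan,
	rescan: () => Scan, // force_rescan(..., store=true, ...)
	filter: (entries: string[]) => string[],
	emptyRecheckMs: number,
): string[] {
	const first = scan();
	let matches = filter(first.entries);
	// Retry exactly once when the cached scan may be a stale negative.
	if (matches.length === 0 && first.cacheAgeMs >= emptyRecheckMs) {
		matches = filter(rescan().entries);
	}
	return matches;
}
```

Note the filter runs after retrieval in both passes, consistent with the rule that cached entries are pre-filtered only by traversal policy, never by tool-specific filters.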
@@ -23,10 +23,10 @@ export const EMBEDDED_DOCS: Readonly<Record<string, string>> = {
  "mcp-server-tool-authoring.md": "# MCP server and tool authoring\n\nThis document explains how MCP server definitions become callable `mcp_*` tools in coding-agent, and what operators should expect when configs are invalid, duplicated, disabled, or auth-gated.\n\n## Architecture at a glance\n\n```text\nConfig sources (.omp/.claude/.cursor/.vscode/mcp.json, mcp.json, etc.)\n -> discovery providers normalize to canonical MCPServer\n -> capability loader dedupes by server name (higher provider priority wins)\n -> loadAllMCPConfigs converts to MCPServerConfig + skips enabled:false\n -> MCPManager connects/listTools (with auth/header/env resolution)\n -> MCPTool/DeferredMCPTool bridge exposes tools as mcp_<server>_<tool>\n -> AgentSession.refreshMCPTools replaces live MCP tools immediately\n```\n\n## 1) Server config model and validation\n\n`src/mcp/types.ts` defines the authoring shape used by MCP config writers and runtime:\n\n- `stdio` (default when `type` missing): requires `command`, optional `args`, `env`, `cwd`\n- `http`: requires `url`, optional `headers`\n- `sse`: requires `url`, optional `headers` (kept for compatibility)\n- shared fields: `enabled`, `timeout`, `auth`\n\n`validateServerConfig()` (`src/mcp/config.ts`) enforces transport basics:\n\n- rejects configs that set both `command` and `url`\n- requires `command` for stdio\n- requires `url` for http/sse\n- rejects unknown `type`\n\n`config-writer.ts` applies this validation for add/update operations and also validates server names:\n\n- non-empty\n- max 100 chars\n- only `[a-zA-Z0-9_.-]`\n\n### Transport pitfalls\n\n- `type` omitted means stdio. 
If you intended HTTP/SSE but omitted `type`, `command` becomes mandatory.\n- `sse` is still accepted but treated as HTTP transport internally (`createHttpTransport`).\n- Validation is structural, not reachability: a syntactically valid URL can still fail at connect time.\n\n## 2) Discovery, normalization, and precedence\n\n### Capability-based discovery\n\n`loadAllMCPConfigs()` (`src/mcp/config.ts`) loads canonical `MCPServer` items via `loadCapability(mcpCapability.id)`.\n\nThe capability layer (`src/capability/index.ts`) then:\n\n1. loads providers in priority order\n2. dedupes by `server.name` (first win = highest priority)\n3. validates deduped items\n\nResult: duplicate server names across sources are not merged. One definition wins; lower-priority duplicates are shadowed.\n\n### `.mcp.json` and related files\n\nThe dedicated fallback provider in `src/discovery/mcp-json.ts` reads project-root `mcp.json` and `.mcp.json` (low priority).\n\nIn practice MCP servers also come from higher-priority providers (for example native `.omp/...` and tool-specific config dirs). Authoring guidance:\n\n- Prefer `.omp/mcp.json` (project) or `~/.omp/agent/mcp.json` (user) for explicit control.\n- Use root `mcp.json` / `.mcp.json` when you need fallback compatibility.\n- Reusing the same server name in multiple sources causes precedence shadowing, not merge.\n\n### Normalization behavior\n\n`convertToLegacyConfig()` (`src/mcp/config.ts`) maps canonical `MCPServer` to runtime `MCPServerConfig`.\n\nKey behavior:\n\n- transport inferred as `server.transport ?? (command ? \"stdio\" : url ? 
\"http\" : \"stdio\")`\n- disabled servers (`enabled === false`) are dropped before connection\n- optional fields are preserved when present\n\n### Environment expansion during discovery\n\n`mcp-json.ts` expands env placeholders in string fields with `expandEnvVarsDeep()`:\n\n- supports `${VAR}` and `${VAR:-default}`\n- unresolved values remain literal `${VAR}` strings\n\n`mcp-json.ts` also performs runtime type checks for user JSON and logs warnings for invalid `enabled`/`timeout` values instead of hard-failing the whole file.\n\n## 3) Auth and runtime value resolution\n\n`MCPManager.prepareConfig()`/`#resolveAuthConfig()` (`src/mcp/manager.ts`) is the final pre-connect pass.\n\n### OAuth credential injection\n\nIf config has:\n\n```ts\nauth: { type: \"oauth\", credentialId: \"...\" }\n```\n\nand credential exists in auth storage:\n\n- `http`/`sse`: injects `Authorization: Bearer <access_token>` header\n- `stdio`: injects `OAUTH_ACCESS_TOKEN` env var\n\nIf credential lookup fails, manager logs a warning and continues with unresolved auth.\n\n### Header/env value resolution\n\nBefore connect, manager resolves each header/env value via `resolveConfigValue()` (`src/config/resolve-config-value.ts`):\n\n- value starting with `!` => execute shell command, use trimmed stdout (cached)\n- otherwise, treat value as environment variable name first (`process.env[name]`), fallback to literal value\n- unresolved command/env values are omitted from final headers/env map\n\nOperational caveat: this means a mistyped secret command/env key can silently remove that header/env entry, producing downstream 401/403 or server startup failures.\n\n## 4) Tool bridge: MCP -> agent-callable tools\n\n`src/mcp/tool-bridge.ts` converts MCP tool definitions into `CustomTool`s.\n\n### Naming and collision domain\n\nTool names are generated as:\n\n```text\nmcp_<sanitized_server_name>_<sanitized_tool_name>\n```\n\nRules:\n\n- lowercases\n- non-`[a-z_]` chars become `_`\n- repeated underscores 
collapse\n- redundant `<server>_` prefix in tool name is stripped once\n\nThis avoids many collisions, but not all. Different raw names can still sanitize to the same identifier (for example `my-server` and `my.server` both sanitize similarly), and registry insertion is last-write-wins.\n\n### Schema mapping\n\n`convertSchema()` keeps MCP JSON Schema mostly as-is but patches object schemas missing `properties` with `{}` for provider compatibility.\n\n### Execution mapping\n\n`MCPTool.execute()` / `DeferredMCPTool.execute()`:\n\n- calls MCP `tools/call`\n- flattens MCP content into displayable text\n- returns structured details (`serverName`, `mcpToolName`, provider metadata)\n- maps server-reported `isError` to `Error: ...` text result\n- maps thrown transport/runtime failures to `MCP error: ...`\n- preserves abort semantics by translating AbortError into `ToolAbortError`\n\n## 5) Operator lifecycle: add/edit/remove and live updates\n\nInteractive mode exposes `/mcp` in `src/modes/controllers/mcp-command-controller.ts`.\n\nSupported operations:\n\n- `add` (wizard or quick-add)\n- `remove` / `rm`\n- `enable` / `disable`\n- `test`\n- `reauth` / `unauth`\n- `reload`\n\nConfig writes are atomic (`writeMCPConfigFile`: temp file + rename).\n\nAfter changes, controller calls `#reloadMCP()`:\n\n1. `mcpManager.disconnectAll()`\n2. `mcpManager.discoverAndConnect()`\n3. 
`session.refreshMCPTools(mcpManager.getTools())`\n\n`refreshMCPTools()` replaces all `mcp_` registry entries and immediately re-activates the latest MCP tool set, so changes take effect without restarting the session.\n\n### Mode differences\n\n- **Interactive/TUI mode**: `/mcp` gives in-app UX (wizard, OAuth flow, connection status text, immediate runtime rebinding).\n- **SDK/headless integration**: `discoverAndLoadMCPTools()` (`src/mcp/loader.ts`) returns loaded tools + per-server errors; no `/mcp` command UX.\n\n## 6) User-visible error surfaces\n\nCommon error strings users/operators see:\n\n- add/update validation failures:\n - `Invalid server config: ...`\n - `Server \"<name>\" already exists in <path>`\n- quick-add argument issues:\n - `Use either --url or -- <command...>, not both.`\n - `--token requires --url (HTTP/SSE transport).`\n- connect/test failures:\n - `Failed to connect to \"<name>\": <message>`\n - timeout help text suggests increasing timeout\n - auth help text for `401/403`\n- auth/OAuth flows:\n - `Authentication required ... OAuth endpoints could not be discovered`\n - `OAuth flow timed out. Please try again.`\n - `OAuth authentication failed: ...`\n- disabled server usage:\n - `Server \"<name>\" is disabled. Run /mcp enable <name> first.`\n\nBad source JSON in discovery is generally handled as warnings/logs; config-writer paths throw explicit errors.\n\n## 7) Practical authoring guidance\n\nFor robust MCP authoring in this codebase:\n\n1. Keep server names globally unique across all MCP-capable config sources.\n2. Prefer alphanumeric/underscore names to avoid sanitized-name collisions in generated `mcp_*` tool names.\n3. Use explicit `type` to avoid accidental stdio defaults.\n4. Treat `enabled: false` as hard-off: server is omitted from runtime connect set.\n5. For OAuth configs, store a valid `credentialId`; otherwise auth injection is skipped.\n6. 
If using command-based secret resolution (`!cmd`), verify command output is stable and non-empty.\n\n## Implementation files\n\n- [`src/mcp/types.ts`](../packages/coding-agent/src/mcp/types.ts)\n- [`src/mcp/config.ts`](../packages/coding-agent/src/mcp/config.ts)\n- [`src/mcp/config-writer.ts`](../packages/coding-agent/src/mcp/config-writer.ts)\n- [`src/mcp/tool-bridge.ts`](../packages/coding-agent/src/mcp/tool-bridge.ts)\n- [`src/discovery/mcp-json.ts`](../packages/coding-agent/src/discovery/mcp-json.ts)\n- [`src/modes/controllers/mcp-command-controller.ts`](../packages/coding-agent/src/modes/controllers/mcp-command-controller.ts)\n- [`src/mcp/manager.ts`](../packages/coding-agent/src/mcp/manager.ts)\n- [`src/capability/index.ts`](../packages/coding-agent/src/capability/index.ts)\n- [`src/config/resolve-config-value.ts`](../packages/coding-agent/src/config/resolve-config-value.ts)\n- [`src/mcp/loader.ts`](../packages/coding-agent/src/mcp/loader.ts)\n",
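The `mcp_*` naming rules described above can be sketched as follows. This is an illustrative reconstruction of the documented rules only; the real implementation lives in `src/mcp/tool-bridge.ts` and may differ in detail.

```typescript
// Illustrative sketch of the documented mcp_* naming rules (lowercase,
// non-[a-z_] chars become "_", collapse repeats, strip one "<server>_" prefix).
function sanitize(raw: string): string {
  return raw
    .toLowerCase()
    .replace(/[^a-z_]/g, "_") // non-[a-z_] chars become "_"
    .replace(/_+/g, "_");     // collapse repeated underscores
}

function mcpToolName(server: string, tool: string): string {
  const s = sanitize(server);
  let t = sanitize(tool);
  // strip a redundant "<server>_" prefix from the tool name once
  if (t.startsWith(`${s}_`)) t = t.slice(s.length + 1);
  return `mcp_${s}_${t}`;
}

// Different raw names can collide after sanitization (last-write-wins):
console.log(mcpToolName("my-server", "fetch")); // mcp_my_server_fetch
console.log(mcpToolName("my.server", "fetch")); // mcp_my_server_fetch
```

This makes the collision domain concrete: `my-server` and `my.server` both sanitize to `my_server`, so the last-registered tool shadows the other.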
  "memory.md": "# Autonomous Memory\n\nWhen enabled, the agent automatically extracts durable knowledge from past sessions and injects a compact summary into each new session. Over time it builds a project-scoped memory store — technical decisions, recurring workflows, pitfalls — that carries forward without manual effort.\n\nDisabled by default. Enable via `/settings` or `config.yml`:\n\n```yaml\nmemories:\n enabled: true\n```\n\n## Usage\n\n### What gets injected\n\nAt session start, if a memory summary exists for the current project, it is injected into the system prompt as a **Memory Guidance** block. The agent is instructed to:\n\n- Treat memory as heuristic context — useful for process and prior decisions, not authoritative on current repo state.\n- Cite the memory artifact path when memory changes the plan, and pair it with current-repo evidence before acting.\n- Prefer repo state and user instruction when they conflict with memory; treat conflicting memory as stale.\n\n### Reading memory artifacts\n\nThe agent can read memory files directly using `memory://` URLs with the `read` tool:\n\n| URL | Content |\n|---|---|\n| `memory://root` | Compact summary injected at startup |\n| `memory://root/MEMORY.md` | Full long-term memory document |\n| `memory://root/skills/<name>/SKILL.md` | A generated skill playbook |\n\n### `/memory` slash command\n\n| Subcommand | Effect |\n|---|---|\n| `view` | Show the current memory injection payload |\n| `clear` / `reset` | Delete all memory data and generated artifacts |\n| `enqueue` / `rebuild` | Force consolidation to run at next startup |\n\n## How it works\n\nMemories are built by a background pipeline that at startup or manually triggered via slash command.\n\n**Phase 1 — per-session extraction:** For each past session that has changed since it was last processed, a model reads the session history and extracts durable signal: technical decisions, constraints, resolved failures, recurring workflows. 
Sessions that are too recent, too old, or currently active are skipped. Each extraction produces a raw memory block and a short synopsis for that session.\n\n**Phase 2 — consolidation:** After extraction, a second model pass reads all per-session extractions and produces three outputs written to disk:\n\n- `MEMORY.md` — a curated long-term memory document\n- `memory_summary.md` — the compact text injected at session start\n- `skills/` — reusable procedural playbooks, each in its own subdirectory\n\nPhase 2 uses a lease to prevent double-running when multiple processes start simultaneously. Stale skill directories from prior runs are pruned automatically.\n\nAll output is scanned for secrets before being written to disk.\n\n### Extraction behavior\n\nMemory extraction and consolidation behavior is driven entirely by static prompt files in `src/prompts/memories/`.\n\n| File | Purpose | Variables |\n|---|---|---|\n| `stage_one_system.md` | System prompt for per-session extraction | — |\n| `stage_one_input.md` | User-turn template wrapping session content | `{{thread_id}}`, `{{response_items_json}}` |\n| `consolidation.md` | Prompt for cross-session consolidation | `{{raw_memories}}`, `{{rollout_summaries}}` |\n| `read_path.md` | Memory guidance injected into live sessions | `{{memory_summary}}` |\n\n### Model selection\n\nMemory piggybacks on the model role system.\n\n| Phase | Role | Purpose |\n|---|---|---|\n| Phase 1 (extraction) | `default` | Per-session knowledge extraction |\n| Phase 2 (consolidation) | `smol` | Cross-session synthesis |\n\nIf `smol` is not configured, Phase 2 falls back to the `default` role.\n\n## Configuration\n\n| Setting | Default | Description |\n|---|---|---|\n| `memories.enabled` | `false` | Master switch |\n| `memories.maxRolloutAgeDays` | `30` | Sessions older than this are not processed |\n| `memories.minRolloutIdleHours` | `12` | Sessions active more recently than this are skipped |\n| `memories.maxRolloutsPerStartup` | `64` | Cap on 
sessions processed in a single startup |\n| `memories.summaryInjectionTokenLimit` | `5000` | Max tokens of the summary injected into the system prompt |\n\nAdditional tuning knobs (concurrency, lease durations, token budgets) are available in config for advanced use.\n\n## Key files\n\n- `src/memories/index.ts` — pipeline orchestration, injection, slash command handling\n- `src/memories/storage.ts` — SQLite-backed job queue and thread registry\n- `src/prompts/memories/` — memory prompt templates\n- `src/internal-urls/memory-protocol.ts` — `memory://` URL handler\n",
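The eligibility gates from Phase 1 (skip sessions that are too old or too recently active) can be sketched as a simple predicate over the settings above. The `Rollout` shape and function name are hypothetical, not the actual `src/memories` API.

```typescript
// Illustrative sketch of the documented session-eligibility gates
// (memories.maxRolloutAgeDays, memories.minRolloutIdleHours).
interface Rollout {
  lastActiveMs: number; // epoch ms of last session activity
}

const HOUR_MS = 60 * 60 * 1000;
const DAY_MS = 24 * HOUR_MS;

function isExtractable(
  r: Rollout,
  nowMs: number,
  maxAgeDays = 30,
  minIdleHours = 12,
): boolean {
  const idleMs = nowMs - r.lastActiveMs;
  if (idleMs > maxAgeDays * DAY_MS) return false;    // too old: skipped
  if (idleMs < minIdleHours * HOUR_MS) return false; // too recent / likely active: skipped
  return true;
}
```

A session is processed only when it falls inside the window, and `memories.maxRolloutsPerStartup` then caps how many such sessions one startup will take on.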
  "models.md": "# Model and Provider Configuration (`models.yml`)\n\nThis document describes how the coding-agent currently loads models, applies overrides, resolves credentials, and chooses models at runtime.\n\n## What controls model behavior\n\nPrimary implementation files:\n\n- `src/config/model-registry.ts` — loads built-in + custom models, provider overrides, runtime discovery, auth integration\n- `src/config/model-resolver.ts` — parses model patterns and selects initial/smol/slow models\n- `src/config/settings-schema.ts` — model-related settings (`modelRoles`, provider transport preferences)\n- `src/session/auth-storage.ts` — API key + OAuth resolution order\n- `packages/ai/src/models.ts` and `packages/ai/src/types.ts` — built-in providers/models and `Model`/`compat` types\n\n## Config file location and legacy behavior\n\nDefault config path:\n\n- `~/.omp/agent/models.yml`\n\nLegacy behavior still present:\n\n- If `models.yml` is missing and `models.json` exists at the same location, it is migrated to `models.yml`.\n- Explicit `.json` / `.jsonc` config paths are still supported when passed programmatically to `ModelRegistry`.\n\n## `models.yml` shape\n\n```yaml\nproviders:\n <provider-id>:\n # provider-level config\nequivalence:\n overrides:\n <provider-id>/<model-id>: <canonical-model-id>\n exclude:\n - <provider-id>/<model-id>\n```\n\n`provider-id` is the canonical provider key used across selection and auth lookup.\n\n`equivalence` is optional and configures canonical model grouping on top of concrete provider models:\n\n- `overrides` maps an exact concrete selector (`provider/modelId`) to an official upstream canonical id\n- `exclude` opts a concrete selector out of canonical grouping\n\n## Provider-level fields\n\n```yaml\nproviders:\n my-provider:\n baseUrl: https://api.example.com/v1\n apiKey: MY_PROVIDER_API_KEY\n api: openai-completions\n headers:\n X-Team: platform\n authHeader: true\n auth: apiKey\n discovery:\n type: ollama\n modelOverrides:\n 
some-model-id:\n name: Renamed model\n models:\n - id: some-model-id\n name: Some Model\n api: openai-completions\n reasoning: false\n input: [text]\n cost:\n input: 0\n output: 0\n cacheRead: 0\n cacheWrite: 0\n contextWindow: 128000\n maxTokens: 16384\n headers:\n X-Model: value\n compat:\n supportsStore: true\n supportsDeveloperRole: true\n supportsReasoningEffort: true\n maxTokensField: max_completion_tokens\n openRouterRouting:\n only: [anthropic]\n vercelGatewayRouting:\n order: [anthropic, openai]\n extraBody:\n gateway: m1-01\n controller: mlx\n```\n\n### Allowed provider/model `api` values\n\n- `openai-completions`\n- `openai-responses`\n- `openai-codex-responses`\n- `azure-openai-responses`\n- `anthropic-messages`\n- `google-generative-ai`\n- `google-vertex`\n\n### Allowed auth/discovery values\n\n- `auth`: `apiKey` (default) or `none`\n- `discovery.type`: `ollama`\n\n## Validation rules (current)\n\n### Full custom provider (`models` is non-empty)\n\nRequired:\n\n- `baseUrl`\n- `apiKey` unless `auth: none`\n- `api` at provider level or each model\n\n### Override-only provider (`models` missing or empty)\n\nMust define at least one of:\n\n- `baseUrl`\n- `modelOverrides`\n- `discovery`\n\n### Discovery\n\n- `discovery` requires provider-level `api`.\n\n### Model value checks\n\n- `id` required\n- `contextWindow` and `maxTokens` must be positive if provided\n\n## Merge and override order\n\nModelRegistry pipeline (on refresh):\n\n1. Load built-in providers/models from `@oh-my-pi/pi-ai`.\n2. Load `models.yml` custom config.\n3. Apply provider overrides (`baseUrl`, `headers`) to built-in models.\n4. Apply `modelOverrides` (per provider + model id).\n5. Merge custom `models`:\n - same `provider + id` replaces existing\n - otherwise append\n6. 
Apply runtime-discovered models (currently Ollama and LM Studio), then re-apply model overrides.\n\n## Canonical model equivalence and coalescing\n\nThe registry keeps every concrete provider model and then builds a canonical layer above them.\n\nCanonical ids are official upstream ids only, for example:\n\n- `claude-opus-4-6`\n- `claude-haiku-4-5`\n- `gpt-5.3-codex`\n### `models.yml` equivalence config\n\nExample:\n\n```yaml\nproviders:\n zenmux:\n baseUrl: https://api.zenmux.example/v1\n apiKey: ZENMUX_API_KEY\n api: openai-codex-responses\n models:\n - id: codex\n name: Zenmux Codex\n reasoning: true\n input: [text]\n cost:\n input: 0\n output: 0\n cacheRead: 0\n cacheWrite: 0\n contextWindow: 200000\n maxTokens: 32768\n\nequivalence:\n overrides:\n zenmux/codex: gpt-5.3-codex\n p-codex/codex: gpt-5.3-codex\n exclude:\n - demo/codex-preview\n```\n\nBuild order for canonical grouping:\n\n1. exact user override from `equivalence.overrides`\n2. bundled official-id matches from built-in model metadata\n3. conservative heuristic normalization for gateway/provider variants\n4. fallback to the concrete model's own id\n\nCurrent heuristics are intentionally narrow:\n\n- embedded upstream prefixes can be stripped when present, for example `anthropic/...` or `openai/...`\n- dotted and dashed version variants can normalize only when they map to an existing official id, for example `4.6 -> 4-6`\n- ambiguous families or versions are not merged without a bundled match or explicit override\n\n### Canonical resolution behavior\n\nWhen multiple concrete variants share a canonical id, resolution uses:\n\n1. availability and auth\n2. `config.yml` `modelProviderOrder`\n3. 
existing registry/provider order if `modelProviderOrder` is unset\n\nDisabled or unauthenticated providers are skipped.\n\nSession state and transcripts continue to record the concrete provider/model that actually executed the turn.\n\nProvider defaults vs per-model overrides:\n\n- Provider `headers` are baseline.\n- Model `headers` override provider header keys.\n- `modelOverrides` can override model metadata (`name`, `reasoning`, `input`, `cost`, `contextWindow`, `maxTokens`, `headers`, `compat`, `contextPromotionTarget`).\n- `compat` is deep-merged for nested routing blocks (`openRouterRouting`, `vercelGatewayRouting`, `extraBody`).\n\n## Runtime discovery integration\n\n### Implicit Ollama discovery\n\nIf `ollama` is not explicitly configured, registry adds an implicit discoverable provider:\n\n- provider: `ollama`\n- api: `openai-completions`\n- base URL: `OLLAMA_BASE_URL` or `http://127.0.0.1:11434`\n- auth mode: keyless (`auth: none` behavior)\n\nRuntime discovery calls `GET /api/tags` on Ollama and synthesizes model entries with local defaults.\n\n### Implicit llama.cpp discovery\n\nIf `llama.cpp` is not explicitly configured, registry adds an implicit discoverable provider:\n\n- provider: `llama.cpp`\n- api: `openai-responses`\n- base URL: `LLAMA_CPP_BASE_URL` or `http://127.0.0.1:8080`\n- auth mode: keyless (`auth: none` behavior)\n\nNote: unlike Ollama and LM Studio, llama.cpp uses the newer `openai-responses` API instead of `openai-completions`.\n\nRuntime discovery calls `GET /models` on llama.cpp and synthesizes model entries with local defaults.\n\n### Implicit LM Studio discovery\n\nIf `lm-studio` is not explicitly configured, registry adds an implicit discoverable provider:\n\n- provider: `lm-studio`\n- api: `openai-completions`\n- base URL: `LM_STUDIO_BASE_URL` or `http://127.0.0.1:1234/v1`\n- auth mode: keyless (`auth: none` behavior)\n\nRuntime discovery fetches models (`GET /models`) and synthesizes model entries with local defaults.\n\n### Explicit provider discovery\n\nYou can 
configure discovery yourself:\n\n```yaml\nproviders:\n ollama:\n baseUrl: http://127.0.0.1:11434\n api: openai-completions\n auth: none\n discovery:\n type: ollama\n \n llama.cpp:\n baseUrl: http://127.0.0.1:8080\n api: openai-responses\n auth: none\n discovery:\n type: llama.cpp\n```\n\n### Extension provider registration\n\nExtensions can register providers at runtime (`pi.registerProvider(...)`), including:\n\n- model replacement/append for a provider\n- custom stream handler registration for new API IDs\n- custom OAuth provider registration\n\n## Auth and API key resolution order\n\nWhen requesting a key for a provider, effective order is:\n\n1. Runtime override (CLI `--api-key`)\n2. Stored API key credential in `agent.db`\n3. Stored OAuth credential in `agent.db` (with refresh)\n4. Environment variable mapping (`OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc.)\n5. ModelRegistry fallback resolver (provider `apiKey` from `models.yml`, env-name-or-literal semantics)\n\n`models.yml` `apiKey` behavior:\n\n- Value is first treated as an environment variable name.\n- If no env var exists, the literal string is used as the token.\n\nIf `authHeader: true` and provider `apiKey` is set, models get:\n\n- `Authorization: Bearer <resolved-key>` header injected.\n\nKeyless providers:\n\n- Providers marked `auth: none` are treated as available without credentials.\n- `getApiKey*` returns `kNoAuth` for them.\n\n## Model availability vs all models\n\n- `getAll()` returns the loaded model registry (built-in + merged custom + discovered).\n- `getAvailable()` filters to models that are keyless or have resolvable auth.\n\nSo a model can exist in registry but not be selectable until auth is available.\n\n## Runtime model resolution\n\n### CLI and pattern parsing\n\n`model-resolver.ts` supports:\n\n- exact `provider/modelId`\n- exact canonical model id\n- exact model id (provider inferred)\n- fuzzy/substring matching\n- glob scope patterns in `--models` (e.g. 
`openai/*`, `*sonnet*`)\n- optional `:thinkingLevel` suffix (`off|minimal|low|medium|high|xhigh`)\n\n`--provider` is legacy; `--model` is preferred.\n\nResolution precedence for exact selectors:\n\n1. exact `provider/modelId` bypasses coalescing\n2. exact canonical id resolves through the canonical index\n3. exact bare concrete id still works\n4. fuzzy and glob matching run after the exact paths\n\n### Initial model selection priority\n\n`findInitialModel(...)` uses this order:\n\n1. explicit CLI provider+model\n2. first scoped model (if not resuming)\n3. saved default provider/model\n4. known provider defaults (e.g. OpenAI/Anthropic/etc.) among available models\n5. first available model\n\n### Role aliases and settings\n\nSupported model roles:\n\n- `default`, `smol`, `slow`, `plan`, `commit`\n\nRole aliases like `pi/smol` expand through `settings.modelRoles`. Each role value can also append a thinking selector such as `:minimal`, `:low`, `:medium`, or `:high`.\n\nIf a role points at another role, the target model still inherits normally and any explicit suffix on the referring role wins for that role-specific use.\n\nRelated settings:\n\n- `modelRoles` (record)\n- `enabledModels` (scoped pattern list)\n- `modelProviderOrder` (global canonical-provider precedence)\n- `providers.kimiApiFormat` (`openai` or `anthropic` request format)\n- `providers.openaiWebsockets` (`auto|off|on` websocket preference for OpenAI Codex transport)\n\n`modelRoles` may store either:\n\n- `provider/modelId` to pin a concrete provider variant\n- a canonical id such as `gpt-5.3-codex` to allow provider coalescing\n\nFor `enabledModels` and CLI `--models`:\n\n- exact canonical ids expand to all concrete variants in that canonical group\n- explicit `provider/modelId` entries stay exact\n- globs and fuzzy matches still operate on concrete models\n\n## `/model` and `--list-models`\n\nBoth surfaces keep provider-prefixed models visible and selectable.\n\nThey now also expose canonical/coalesced 
models:\n\n- `/model` includes a canonical view alongside provider tabs\n- `--list-models` prints a canonical section plus the concrete provider rows\n\nSelecting a canonical entry stores the canonical selector. Selecting a provider row stores the explicit `provider/modelId`.\n\n## Context promotion (model-level fallback chains)\n\nContext promotion is an overflow recovery mechanism for small-context variants (for example `*-spark`) that automatically promotes to a larger-context sibling when the API rejects a request with a context length error.\n\n### Trigger and order\n\nWhen a turn fails with a context overflow error (e.g. `context_length_exceeded`), `AgentSession` attempts promotion **before** falling back to compaction:\n\n1. If `contextPromotion.enabled` is true, resolve a promotion target (see below).\n2. If a target is found, switch to it and retry the request — no compaction needed.\n3. If no target is available, fall through to auto-compaction on the current model.\n\n### Target selection\n\nSelection is model-driven, not role-driven:\n\n1. `currentModel.contextPromotionTarget` (if configured)\n2. smallest larger-context model on the same provider + API\n\nCandidates are ignored unless credentials resolve (`ModelRegistry.getApiKey(...)`).\n\n### OpenAI Codex websocket handoff\n\nIf switching from/to `openai-codex-responses`, session provider state key `openai-codex-responses` is closed before model switch. 
This drops websocket transport state so the next turn starts clean on the promoted model.\n\n### Persistence behavior\n\nPromotion uses temporary switching (`setModelTemporary`):\n\n- recorded as a temporary `model_change` in session history\n- does not rewrite saved role mapping\n\n### Configuring explicit fallback chains\n\nConfigure fallback directly in model metadata via `contextPromotionTarget`.\n\n`contextPromotionTarget` accepts either:\n\n- `provider/model-id` (explicit)\n- `model-id` (resolved within current provider)\n\nExample (`models.yml`) for Spark -> non-Spark on the same provider:\n\n```yaml\nproviders:\n openai-codex:\n modelOverrides:\n gpt-5.3-codex-spark:\n contextPromotionTarget: openai-codex/gpt-5.3-codex\n```\n\nThe built-in model generator also assigns this automatically for `*-spark` models when a same-provider base model exists.\n\n## Compatibility and routing fields\n\n`models.yml` supports this `compat` subset:\n\n- `supportsStore`\n- `supportsDeveloperRole`\n- `supportsReasoningEffort`\n- `maxTokensField` (`max_completion_tokens` or `max_tokens`)\n- `openRouterRouting.only` / `openRouterRouting.order`\n- `vercelGatewayRouting.only` / `vercelGatewayRouting.order`\n\nThese are consumed by the OpenAI-completions transport logic and combined with URL-based auto-detection.\n\n## Practical examples\n\n### Local OpenAI-compatible endpoint (no auth)\n\n```yaml\nproviders:\n local-openai:\n baseUrl: http://127.0.0.1:8000/v1\n auth: none\n api: openai-completions\n models:\n - id: Qwen/Qwen2.5-Coder-32B-Instruct\n name: Qwen 2.5 Coder 32B (local)\n```\n\n### Hosted proxy with env-based key\n\n```yaml\nproviders:\n anthropic-proxy:\n baseUrl: https://proxy.example.com/anthropic\n apiKey: ANTHROPIC_PROXY_API_KEY\n api: anthropic-messages\n authHeader: true\n models:\n - id: claude-sonnet-4-20250514\n name: Claude Sonnet 4 (Proxy)\n reasoning: true\n input: [text, image]\n```\n\n### Override built-in provider route + model 
metadata\n\n```yaml\nproviders:\n openrouter:\n baseUrl: https://my-proxy.example.com/v1\n headers:\n X-Team: platform\n modelOverrides:\n anthropic/claude-sonnet-4:\n name: Sonnet 4 (Corp)\n compat:\n openRouterRouting:\n only: [anthropic]\n```\n\n## Legacy consumer caveat\n\nMost model configuration now flows through `models.yml` via `ModelRegistry`.\n\nOne notable legacy path remains: web-search Anthropic auth resolution still reads `~/.omp/agent/models.json` directly in `src/web/search/auth.ts`.\n\nIf you rely on that specific path, keep JSON compatibility in mind until that module is migrated.\n\n## Failure mode\n\nIf `models.yml` fails schema or validation checks:\n\n- registry keeps operating with built-in models\n- error is exposed via `ModelRegistry.getError()` and surfaced in UI/notifications\n",
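The env-name-or-literal `apiKey` semantics described above can be sketched in a few lines. This is a hedged illustration; the function name is made up and the real resolver lives in `ModelRegistry` / `src/session/auth-storage.ts`.

```typescript
// Sketch of the documented models.yml apiKey semantics: the configured value
// is first treated as an environment variable name; if no such variable is
// set, the literal string itself is used as the token.
function resolveProviderApiKey(
  configured: string | undefined,
  env: Record<string, string | undefined>,
): string | undefined {
  if (!configured) return undefined;
  return env[configured] ?? configured;
}

console.log(resolveProviderApiKey("MY_PROVIDER_API_KEY", { MY_PROVIDER_API_KEY: "sk-abc" })); // "sk-abc"
console.log(resolveProviderApiKey("sk-literal-token", {})); // "sk-literal-token"
```

Note this sits at the bottom of the resolution order: runtime CLI overrides, stored credentials in `agent.db`, and standard environment variable mappings are all consulted first.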
- "natives-addon-loader-runtime.md": "# Natives Addon Loader Runtime\n\nThis document deep-dives the addon loading/validation layer in `@oh-my-pi/pi-natives`: how `native.ts` decides which `.node` file to load, when embedded payload extraction runs, and how startup failures are reported.\n\n## Implementation files\n\n- `packages/natives/src/native.ts`\n- `packages/natives/src/embedded-addon.ts`\n- `packages/natives/src/bindings.ts`\n- `packages/natives/package.json`\n\n## Scope and responsibility\n\nLoader/runtime responsibilities are intentionally narrow:\n\n- Build a platform/CPU-aware candidate list for addon filenames and directories.\n- Optionally materialize an embedded addon into a versioned per-user cache directory.\n- Attempt candidates in deterministic order.\n- Reject stale or incompatible addons via `validateNative` before exposing bindings.\n\nOut of scope here: module-specific grep/text/highlight behavior.\n\n## Runtime inputs and derived state\n\nAt module initialization (`export const native = loadNative();`), `native.ts` computes static context:\n\n- **Platform tag**: ``${process.platform}-${process.arch}`` (for example `darwin-arm64`).\n- **Package version**: from `packages/natives/package.json` (`version` field).\n- **Core directories**:\n - `nativeDir`: package-local `packages/natives/native`.\n - `execDir`: directory containing `process.execPath`.\n - `versionedDir`: `<getNativesDir()>/<packageVersion>`.\n - `userDataDir` fallback:\n - Windows: `%LOCALAPPDATA%/omp` (or `%USERPROFILE%/AppData/Local/omp`).\n - Non-Windows: `~/.local/bin`.\n- **Compiled-binary mode** (`isCompiledBinary`): true if any of:\n - `PI_COMPILED` env var is set, or\n - `import.meta.url` contains Bun-embedded markers (`$bunfs`, `~BUN`, `%7EBUN`).\n- **Variant override**: `PI_NATIVE_VARIANT` (`modern`/`baseline` only; invalid values ignored).\n- **Selected variant**: explicit override, otherwise runtime AVX2 detection on x64 (`modern` if AVX2, else `baseline`).\n\n## 
Platform support and tag resolution\n\n`SUPPORTED_PLATFORMS` is fixed to:\n\n- `linux-x64`\n- `linux-arm64`\n- `darwin-x64`\n- `darwin-arm64`\n- `win32-x64`\n\nBehavior detail:\n\n- Unsupported platforms are not rejected up-front.\n- Loader still tries all computed candidates first.\n- If nothing loads, it throws an explicit unsupported-platform error listing supported tags.\n\nThis preserves useful diagnostics for near-miss cases while still failing hard for truly unsupported targets.\n\n## Variant selection (`modern` / `baseline` / default)\n\n### x64 behavior\n\n1. If `PI_NATIVE_VARIANT` is `modern` or `baseline`, that value wins.\n2. Else detect AVX2 support:\n - Linux: scan `/proc/cpuinfo` for `avx2`.\n - macOS: query `sysctl` (`machdep.cpu.leaf7_features`, fallback `machdep.cpu.features`).\n - Windows: run PowerShell `[System.Runtime.Intrinsics.X86.Avx2]::IsSupported`.\n3. Result:\n - AVX2 available -> `modern`\n - AVX2 unavailable/undetectable -> `baseline`\n\n### Non-x64 behavior\n\n- No variant is used; loader stays on the default filename (`pi_natives.<platform>-<arch>.node`).\n\n### Filename construction\n\nGiven `tag = <platform>-<arch>`:\n\n- Non-x64 or no variant: `pi_natives.<tag>.node`\n- x64 + `modern`: try in order\n 1. `pi_natives.<tag>-modern.node`\n 2. `pi_natives.<tag>-baseline.node` (intentional fallback)\n- x64 + `baseline`: only `pi_natives.<tag>-baseline.node`\n\nThe `addonLabel` used in final error messages is either `<tag>` or `<tag> (<variant>)`.\n\n## Candidate path construction and fallback ordering\n\n`native.ts` builds candidate pools before any `require(...)` call.\n\n### Release candidates\n\nBuilt from variant-resolved filename list and searched in this order:\n\n- **Non-compiled runtime**:\n 1. `<nativeDir>/<filename>`\n 2. `<execDir>/<filename>`\n\n- **Compiled runtime** (`PI_COMPILED` or Bun embedded markers):\n 1. `<versionedDir>/<filename>`\n 2. `<userDataDir>/<filename>`\n 3. `<nativeDir>/<filename>`\n 4. 
`<execDir>/<filename>`\n\n`dedupedCandidates` removes duplicates while preserving first occurrence order.\n\n### Final runtime sequence\n\nAt load time:\n\n1. Optional embedded extraction candidate (if produced) is inserted at the front.\n2. Remaining deduplicated candidates are tried in order.\n3. First candidate that both `require(...)`s and passes `validateNative(...)` wins.\n\n## Embedded addon extraction lifecycle\n\n`embedded-addon.ts` defines a generated manifest shape:\n\n- `platformTag`\n- `version`\n- `files[]` where each entry has `variant`, `filename`, `filePath`\n\nCurrent checked-in default is `embeddedAddon: null`; compiled artifacts may replace this with real metadata.\n\n### Extraction state machine\n\nExtraction (`maybeExtractEmbeddedAddon`) runs only when all gates pass:\n\n1. `isCompiledBinary === true`\n2. `embeddedAddon !== null`\n3. `embeddedAddon.platformTag === platformTag`\n4. `embeddedAddon.version === packageVersion`\n5. A variant-appropriate embedded file is found\n\nVariant file selection mirrors runtime variant intent:\n\n- Non-x64: prefer `default`, then first available file.\n- x64 + `modern`: prefer `modern`, fallback to `baseline`.\n- x64 + `baseline`: require `baseline`.\n\nMaterialization behavior:\n\n1. Ensure `<versionedDir>` exists (`mkdirSync(..., { recursive: true })`).\n2. If `<versionedDir>/<selected filename>` already exists, reuse it (no rewrite).\n3. Else read embedded source `filePath` and write target file.\n4. 
Return target path for highest-priority load attempt.\n\nOn failure, extraction does not crash immediately; it appends an error entry (directory creation or write failure) and loader proceeds to normal candidate probing.\n\n## Lifecycle and state transitions\n\n```text\nInit\n -> Compute platform/version/variant/candidate lists\n -> (Compiled + embedded manifest matches?)\n yes -> Try extract embedded to versionedDir (record errors, continue)\n no -> Skip extraction\n -> For each runtime candidate in order:\n require(candidate)\n -> success: validateNative\n -> pass: return bindings (READY)\n -> fail: record error, continue\n -> failure: record error, continue\n -> none loaded:\n if unsupported platform tag -> throw Unsupported platform\n else -> throw Failed to load (full tried-path diagnostics + hints)\n```\n\n## `validateNative` contract checks\n\n`validateNative(bindings, source)` enforces a function-only contract over `NativeBindings` at startup.\n\nMechanics:\n\n- For each required export name, it checks `typeof bindings[name] === \"function\"`.\n- Missing names are aggregated.\n- If any are missing, loader throws:\n - source addon path,\n - missing export list,\n - rebuild command hint.\n\nThis is a hard compatibility gate against stale binaries, partial builds, and symbol/name drift.\n\n### JS API ↔ native export mapping (validation gate)\n\n| JS binding name checked in `validateNative` | Expected native export name |\n| --- | --- |\n| `grep` | `grep` |\n| `glob` | `glob` |\n| `highlightCode` | `highlightCode` |\n| `executeShell` | `executeShell` |\n| `PtySession` | `PtySession` |\n| `Shell` | `Shell` |\n| `visibleWidth` | `visibleWidth` |\n| `getSystemInfo` | `getSystemInfo` |\n| `getWorkProfile` | `getWorkProfile` |\n| `invalidateFsScanCache` | `invalidateFsScanCache` |\n\nNote: `bindings.ts` declares only the base `cancelWork(id)` member; module `types.ts` files declaration-merge additional symbols that `validateNative` enforces.\n\n## Failure behavior 
and diagnostics\n\n## Unsupported platform\n\nIf all candidates fail and `platformTag` is not in `SUPPORTED_PLATFORMS`, loader throws:\n\n- `Unsupported platform: <tag>`\n- Full supported-platform list\n- Explicit issue-reporting guidance\n\n## Stale binary / mismatch symptoms\n\nTypical stale mismatch signal:\n\n- `Native addon missing exports (<candidate>). Missing: ...`\n\nCommon causes:\n\n- Old `.node` binary from previous package version/API shape.\n- Wrong variant artifact selected (for x64).\n- New Rust export not present in loaded artifact.\n\nLoader behavior:\n\n- Records per-candidate missing-export failures.\n- Continues probing remaining candidates.\n- If no candidate validates, final error includes every attempted path with each failure message.\n\n## Compiled-binary startup failures\n\nIn compiled mode final diagnostics include:\n\n- expected versioned cache target paths (`<versionedDir>/<filename>`),\n- remediation to delete stale `<versionedDir>` and rerun,\n- direct release download `curl` commands for each expected filename.\n\n## Non-compiled startup failures\n\nIn normal package/runtime mode final diagnostics include:\n\n- reinstall hint (`bun install @oh-my-pi/pi-natives`),\n- local rebuild command (`bun --cwd=packages/natives run build`),\n- optional x64 variant build hint (`TARGET_VARIANT=baseline|modern ...`).\n\n## Runtime behavior\n\n- The loader always uses the release candidate chain.\n- Setting `PI_DEV` only enables per-candidate console diagnostics (`Loaded native addon...` and load errors).",
- "natives-architecture.md": "# Natives Architecture\n\n`@oh-my-pi/pi-natives` is a three-layer stack:\n\n1. **TypeScript wrapper/API layer** exposes stable JS/TS entrypoints.\n2. **Addon loading/validation layer** resolves and validates the `.node` binary for the current runtime.\n3. **Rust N-API module layer** implements performance-critical primitives exported to JS.\n\nThis document is the foundation for deeper module-level docs.\n\n## Implementation files\n\n- `packages/natives/src/index.ts`\n- `packages/natives/src/native.ts`\n- `packages/natives/src/bindings.ts`\n- `packages/natives/src/embedded-addon.ts`\n- `packages/natives/scripts/build-native.ts`\n- `packages/natives/scripts/embed-native.ts`\n- `packages/natives/package.json`\n- `crates/pi-natives/src/lib.rs`\n\n## Layer 1: TypeScript wrapper/API layer\n\n`packages/natives/src/index.ts` is the public barrel. It groups exports by capability domain and re-exports typed wrappers rather than exposing raw N-API bindings directly.\n\nCurrent top-level groups:\n\n- **Search/text primitives**: `grep`, `glob`, `text`, `highlight`\n- **Execution/process/terminal primitives**: `shell`, `pty`, `ps`, `keys`\n- **System/media/conversion primitives**: `image`, `html`, `clipboard`, `system-info`, `work`\n\n`packages/natives/src/bindings.ts` defines the base interface contract:\n\n- `NativeBindings` starts with shared members (`cancelWork(id: number)`)\n- module-specific bindings are added by declaration merging from each module’s `types.ts`\n- `Cancellable` standardizes timeout and abort-signal options for wrappers that expose cancellation\n\n**Guaranteed contract (API-facing):** consumers import from `@oh-my-pi/pi-natives` and use typed wrappers.\n\n**Implementation detail (may change):** declaration merging and internal wrapper layout (`src/<module>/index.ts`, `src/<module>/types.ts`).\n\n## Layer 2: Addon loading and validation\n\n`packages/natives/src/native.ts` owns runtime addon selection, optional extraction, and 
export validation.\n\n### Candidate resolution model\n\n- Platform tag is `\"${process.platform}-${process.arch}\"`.\n- Supported tags are currently:\n - `linux-x64`\n - `linux-arm64`\n - `darwin-x64`\n - `darwin-arm64`\n - `win32-x64`\n- x64 can use CPU variants:\n - `modern` (AVX2-capable)\n - `baseline` (fallback)\n- Non-x64 uses the default filename (no variant suffix).\n\nFilename strategy:\n\n- Release: `pi_natives.<platform>-<arch>.node`\n- x64 variant release: `pi_natives.<platform>-<arch>-modern.node` and/or `...-baseline.node`\n- `PI_DEV` enables loader diagnostics but does not change addon filenames\n\n### Platform-specific variant detection\n\nFor x64, variant selection uses:\n\n- **Linux**: `/proc/cpuinfo`\n- **macOS**: `sysctl machdep.cpu.leaf7_features` / `machdep.cpu.features`\n- **Windows**: PowerShell check for `System.Runtime.Intrinsics.X86.Avx2`\n\n`PI_NATIVE_VARIANT` can explicitly force `modern` or `baseline`.\n\n### Binary distribution and extraction model\n\n`packages/natives/package.json` includes both `src` and `native` in published files. The `native/` directory stores prebuilt platform artifacts.\n\nFor compiled binaries (`PI_COMPILED` or Bun embedded runtime markers), loader behavior is:\n\n1. Check versioned user cache path: `<getNativesDir()>/<packageVersion>/...`\n2. Check legacy compiled-binary location:\n - Windows: `%LOCALAPPDATA%/omp` (fallback `%USERPROFILE%/AppData/Local/omp`)\n - non-Windows: `~/.local/bin`\n3. 
Fall back to packaged `native/` and executable directory candidates\n\nIf an embedded addon manifest is present (`embedded-addon.ts` generated by `scripts/embed-native.ts`), `native.ts` can materialize the matching embedded binary into the versioned cache directory before loading.\n\n### Validation and failure modes\n\nAfter `require(candidate)`, `validateNative(...)` verifies required exports (for example `grep`, `glob`, `highlightCode`, `PtySession`, `Shell`, `getSystemInfo`, `getWorkProfile`, `invalidateFsScanCache`).\n\nFailure paths are explicit:\n\n- **Unsupported platform tag**: throws with supported platform list\n- **No loadable candidate**: throws with all attempted paths and remediation hints\n- **Missing exports**: throws with exact missing names and rebuild command\n- **Embedded extraction errors**: records directory/write failures and includes them in final load diagnostics\n\n**Guaranteed contract (API-facing):** addon load either succeeds with a validated binding set or fails fast with actionable error text.\n\n**Implementation detail (may change):** exact candidate search order and compiled-binary fallback path ordering.\n\n## Layer 3: Rust N-API module layer\n\n`crates/pi-natives/src/lib.rs` is the Rust entry module that declares exported module ownership:\n\n- `clipboard`\n- `fd`\n- `fs_cache`\n- `glob`\n- `glob_util`\n- `grep`\n- `highlight`\n- `html`\n- `image`\n- `keys`\n- `prof`\n- `ps`\n- `pty`\n- `shell`\n- `system_info`\n- `task`\n- `text`\n\nThese modules implement the N-API symbols consumed and validated by `native.ts`. 
JS-level names are surfaced through the TS wrappers in `packages/natives/src`.\n\n**Guaranteed contract (API-facing):** Rust module exports must match the binding names expected by `validateNative` and wrapper modules.\n\n**Implementation detail (may change):** internal Rust module decomposition and helper module boundaries (`glob_util`, `task`, etc.).\n\n## Ownership boundaries\n\nAt architecture level, ownership is split as follows:\n\n- **TS wrapper/API ownership (`packages/natives/src`)**\n - public API grouping, option typing, and stable JS ergonomics\n - cancellation surface (`timeoutMs`, `AbortSignal`) exposed to callers\n- **Loader ownership (`packages/natives/src/native.ts`)**\n - runtime binary selection\n - CPU variant selection and override handling\n - compiled-binary extraction and candidate probing\n - hard validation of required native exports\n- **Rust ownership (`crates/pi-natives/src`)**\n - algorithmic and system-level implementation\n - platform-native behavior and performance-sensitive logic\n - N-API symbol implementation that TS wrappers consume\n\n## Runtime flow (high level)\n\n1. Consumer imports from `@oh-my-pi/pi-natives`.\n2. Wrapper module calls into singleton `native` binding.\n3. `native.ts` selects candidate binary for platform/arch/variant.\n4. Optional embedded binary extraction occurs for compiled distributions.\n5. Addon is loaded and export set is validated.\n6. 
Wrapper returns typed results to caller.\n\n## Glossary\n\n- **Native addon**: A `.node` binary loaded via Node-API (N-API).\n- **Platform tag**: Runtime tuple `platform-arch` (for example `darwin-arm64`).\n- **Variant**: x64 CPU-specific build flavor (`modern` AVX2, `baseline` fallback).\n- **Wrapper**: TS function/class that provides typed API over raw native exports.\n- **Declaration merging**: TS technique used by module `types.ts` files to extend `NativeBindings`.\n- **Compiled binary mode**: Runtime mode where the CLI is bundled and native addons are resolved from extracted/cache paths instead of only package-local paths.\n- **Embedded addon**: Build artifact metadata and file references generated into `embedded-addon.ts` so compiled binaries can extract matching `.node` payloads.\n- **Validation gate**: `validateNative(...)` check that rejects stale/mismatched binaries missing required exports.\n",
+ "natives-addon-loader-runtime.md": "# Natives Addon Loader Runtime\n\nThis document deep-dives the addon loading/validation layer in `@oh-my-pi/pi-natives`: how `native.ts` decides which `.node` file to load, when embedded payload extraction runs, and how startup failures are reported.\n\n## Implementation files\n\n- `packages/natives/src/native.ts`\n- `packages/natives/src/embedded-addon.ts`\n- `packages/natives/src/bindings.ts`\n- `packages/natives/package.json`\n\n## Scope and responsibility\n\nLoader/runtime responsibilities are intentionally narrow:\n\n- Build a platform/CPU-aware candidate list for addon filenames and directories.\n- Optionally materialize an embedded addon into a versioned per-user cache directory.\n- Attempt candidates in deterministic order.\n- Reject stale or incompatible addons via `validateNative` before exposing bindings.\n\nOut of scope here: module-specific grep/text/highlight behavior.\n\n## Runtime inputs and derived state\n\nAt module initialization (`export const native = loadNative();`), `native.ts` computes static context:\n\n- **Platform tag**: ``${process.platform}-${process.arch}`` (for example `darwin-arm64`).\n- **Package version**: from `packages/natives/package.json` (`version` field).\n- **Core directories**:\n - `nativeDir`: package-local `packages/natives/native`.\n - `execDir`: directory containing `process.execPath`.\n - `versionedDir`: `<getNativesDir()>/<packageVersion>`.\n - `userDataDir` fallback:\n - Windows: `%LOCALAPPDATA%/omp` (or `%USERPROFILE%/AppData/Local/omp`).\n - Non-Windows: `~/.local/bin`.\n- **Compiled-binary mode** (`isCompiledBinary`): true if any of:\n - `PI_COMPILED` env var is set, or\n - `import.meta.url` contains Bun-embedded markers (`$bunfs`, `~BUN`, `%7EBUN`).\n- **Variant override**: `PI_NATIVE_VARIANT` (`modern`/`baseline` only; invalid values ignored).\n- **Selected variant**: explicit override, otherwise runtime AVX2 detection on x64 (`modern` if AVX2, else `baseline`).\n\n## 
Platform support and tag resolution\n\n`SUPPORTED_PLATFORMS` is fixed to:\n\n- `linux-x64`\n- `linux-arm64`\n- `darwin-x64`\n- `darwin-arm64`\n- `win32-x64`\n\nBehavior detail:\n\n- Unsupported platforms are not rejected up-front.\n- Loader still tries all computed candidates first.\n- If nothing loads, it throws an explicit unsupported-platform error listing supported tags.\n\nThis preserves useful diagnostics for near-miss cases while still failing hard for truly unsupported targets.\n\n## Variant selection (`modern` / `baseline` / default)\n\n### x64 behavior\n\n1. If `PI_NATIVE_VARIANT` is `modern` or `baseline`, that value wins.\n2. Else detect AVX2 support:\n - Linux: scan `/proc/cpuinfo` for `avx2`.\n - macOS: query `sysctl` (`machdep.cpu.leaf7_features`, fallback `machdep.cpu.features`).\n - Windows: run PowerShell `[System.Runtime.Intrinsics.X86.Avx2]::IsSupported`.\n3. Result:\n - AVX2 available -> `modern`\n - AVX2 unavailable/undetectable -> `baseline`\n\n### Non-x64 behavior\n\n- No variant is used; loader stays on the default filename (`pi_natives.<platform>-<arch>.node`).\n\n### Filename construction\n\nGiven `tag = <platform>-<arch>`:\n\n- Non-x64 or no variant: `pi_natives.<tag>.node`\n- x64 + `modern`: try in order\n 1. `pi_natives.<tag>-modern.node`\n 2. `pi_natives.<tag>-baseline.node` (intentional fallback)\n- x64 + `baseline`: only `pi_natives.<tag>-baseline.node`\n\nThe `addonLabel` used in final error messages is either `<tag>` or `<tag> (<variant>)`.\n\n## Candidate path construction and fallback ordering\n\n`native.ts` builds candidate pools before any `require(...)` call.\n\n### Release candidates\n\nBuilt from variant-resolved filename list and searched in this order:\n\n- **Non-compiled runtime**:\n 1. `<nativeDir>/<filename>`\n 2. `<execDir>/<filename>`\n\n- **Compiled runtime** (`PI_COMPILED` or Bun embedded markers):\n 1. `<versionedDir>/<filename>`\n 2. `<userDataDir>/<filename>`\n 3. `<nativeDir>/<filename>`\n 4. 
`<execDir>/<filename>`\n\n`dedupedCandidates` removes duplicates while preserving first occurrence order.\n\n### Final runtime sequence\n\nAt load time:\n\n1. Optional embedded extraction candidate (if produced) is inserted at the front.\n2. Remaining deduplicated candidates are tried in order.\n3. First candidate that both `require(...)`s and passes `validateNative(...)` wins.\n\n## Embedded addon extraction lifecycle\n\n`embedded-addon.ts` defines a generated manifest shape:\n\n- `platformTag`\n- `version`\n- `files[]` where each entry has `variant`, `filename`, `filePath`\n\nCurrent checked-in default is `embeddedAddon: null`; compiled artifacts may replace this with real metadata.\n\n### Extraction state machine\n\nExtraction (`maybeExtractEmbeddedAddon`) runs only when all gates pass:\n\n1. `isCompiledBinary === true`\n2. `embeddedAddon !== null`\n3. `embeddedAddon.platformTag === platformTag`\n4. `embeddedAddon.version === packageVersion`\n5. A variant-appropriate embedded file is found\n\nVariant file selection mirrors runtime variant intent:\n\n- Non-x64: prefer `default`, then first available file.\n- x64 + `modern`: prefer `modern`, fallback to `baseline`.\n- x64 + `baseline`: require `baseline`.\n\nMaterialization behavior:\n\n1. Ensure `<versionedDir>` exists (`mkdirSync(..., { recursive: true })`).\n2. If `<versionedDir>/<selected filename>` already exists, reuse it (no rewrite).\n3. Else read embedded source `filePath` and write target file.\n4. 
Return target path for highest-priority load attempt.\n\nOn failure, extraction does not crash immediately; it appends an error entry (directory creation or write failure) and loader proceeds to normal candidate probing.\n\n## Lifecycle and state transitions\n\n```text\nInit\n -> Compute platform/version/variant/candidate lists\n -> (Compiled + embedded manifest matches?)\n yes -> Try extract embedded to versionedDir (record errors, continue)\n no -> Skip extraction\n -> For each runtime candidate in order:\n require(candidate)\n -> success: validateNative\n -> pass: return bindings (READY)\n -> fail: record error, continue\n -> failure: record error, continue\n -> none loaded:\n if unsupported platform tag -> throw Unsupported platform\n else -> throw Failed to load (full tried-path diagnostics + hints)\n```\n\n## `validateNative` contract checks\n\n`validateNative(bindings, source)` enforces a function-only contract over `NativeBindings` at startup.\n\nMechanics:\n\n- For each required export name, it checks `typeof bindings[name] === \"function\"`.\n- Missing names are aggregated.\n- If any are missing, loader throws:\n - source addon path,\n - missing export list,\n - rebuild command hint.\n\nThis is a hard compatibility gate against stale binaries, partial builds, and symbol/name drift.\n\n### JS API ↔ native export mapping (validation gate)\n\n| JS binding name checked in `validateNative` | Expected native export name |\n| --- | --- |\n| `grep` | `grep` |\n| `glob` | `glob` |\n| `highlightCode` | `highlightCode` |\n| `executeShell` | `executeShell` |\n| `PtySession` | `PtySession` |\n| `Shell` | `Shell` |\n| `visibleWidth` | `visibleWidth` |\n| `getSystemInfo` | `getSystemInfo` |\n| `getWorkProfile` | `getWorkProfile` |\n| `invalidateFsScanCache` | `invalidateFsScanCache` |\n\nNote: `bindings.ts` declares only the base `cancelWork(id)` member; module `types.ts` files declaration-merge additional symbols that `validateNative` enforces.\n\n## Failure behavior 
and diagnostics\n\n### Unsupported platform\n\nIf all candidates fail and `platformTag` is not in `SUPPORTED_PLATFORMS`, loader throws:\n\n- `Unsupported platform: <tag>`\n- Full supported-platform list\n- Explicit issue-reporting guidance\n\n### Stale binary / mismatch symptoms\n\nTypical stale mismatch signal:\n\n- `Native addon missing exports (<candidate>). Missing: ...`\n\nCommon causes:\n\n- Old `.node` binary from previous package version/API shape.\n- Wrong variant artifact selected (for x64).\n- New Rust export not present in loaded artifact.\n\nLoader behavior:\n\n- Records per-candidate missing-export failures.\n- Continues probing remaining candidates.\n- If no candidate validates, final error includes every attempted path with each failure message.\n\n### Compiled-binary startup failures\n\nIn compiled mode final diagnostics include:\n\n- expected versioned cache target paths (`<versionedDir>/<filename>`),\n- remediation to delete stale `<versionedDir>` and rerun,\n- direct release download `curl` commands for each expected filename.\n\n### Non-compiled startup failures\n\nIn normal package/runtime mode final diagnostics include:\n\n- reinstall hint (`bun install @oh-my-pi/pi-natives`),\n- local rebuild command (`bun --cwd=packages/natives run build`),\n- optional x64 variant build hint (`TARGET_VARIANT=baseline|modern ...`).\n\n## Runtime behavior\n\n- The loader always uses the release candidate chain.",
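The loader-runtime doc above describes order-preserving candidate dedup (`dedupedCandidates`) and the function-only export check behind `validateNative`. A minimal TypeScript sketch of both behaviors, assuming standalone helpers rather than the actual `native.ts` internals (`missingExports` is a hypothetical name for illustration):

```typescript
// Remove duplicate candidate paths while preserving first-occurrence order,
// mirroring the documented `dedupedCandidates` behavior.
function dedupedCandidates(paths: string[]): string[] {
  const seen = new Set<string>();
  return paths.filter((p) => !seen.has(p) && (seen.add(p), true));
}

// Aggregate every required export name that is not exposed as a function,
// the core check `validateNative` applies before exposing bindings.
function missingExports(
  bindings: Record<string, unknown>,
  required: string[],
): string[] {
  return required.filter((name) => typeof bindings[name] !== "function");
}
```

In the real loader, a non-empty missing list becomes a thrown error carrying the source addon path, the missing names, and a rebuild hint; the sketch only isolates the aggregation step.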
+ "natives-architecture.md": "# Natives Architecture\n\n`@oh-my-pi/pi-natives` is a three-layer stack:\n\n1. **TypeScript wrapper/API layer** exposes stable JS/TS entrypoints.\n2. **Addon loading/validation layer** resolves and validates the `.node` binary for the current runtime.\n3. **Rust N-API module layer** implements performance-critical primitives exported to JS.\n\nThis document is the foundation for deeper module-level docs.\n\n## Implementation files\n\n- `packages/natives/src/index.ts`\n- `packages/natives/src/native.ts`\n- `packages/natives/src/bindings.ts`\n- `packages/natives/src/embedded-addon.ts`\n- `packages/natives/scripts/build-native.ts`\n- `packages/natives/scripts/embed-native.ts`\n- `packages/natives/package.json`\n- `crates/pi-natives/src/lib.rs`\n\n## Layer 1: TypeScript wrapper/API layer\n\n`packages/natives/src/index.ts` is the public barrel. It groups exports by capability domain and re-exports typed wrappers rather than exposing raw N-API bindings directly.\n\nCurrent top-level groups:\n\n- **Search/text primitives**: `grep`, `glob`, `text`, `highlight`\n- **Execution/process/terminal primitives**: `shell`, `pty`, `ps`, `keys`\n- **System/media/conversion primitives**: `image`, `html`, `clipboard`, `system-info`, `work`\n\n`packages/natives/src/bindings.ts` defines the base interface contract:\n\n- `NativeBindings` starts with shared members (`cancelWork(id: number)`)\n- module-specific bindings are added by declaration merging from each module’s `types.ts`\n- `Cancellable` standardizes timeout and abort-signal options for wrappers that expose cancellation\n\n**Guaranteed contract (API-facing):** consumers import from `@oh-my-pi/pi-natives` and use typed wrappers.\n\n**Implementation detail (may change):** declaration merging and internal wrapper layout (`src/<module>/index.ts`, `src/<module>/types.ts`).\n\n## Layer 2: Addon loading and validation\n\n`packages/natives/src/native.ts` owns runtime addon selection, optional extraction, and 
export validation.\n\n### Candidate resolution model\n\n- Platform tag is `\"${process.platform}-${process.arch}\"`.\n- Supported tags are currently:\n - `linux-x64`\n - `linux-arm64`\n - `darwin-x64`\n - `darwin-arm64`\n - `win32-x64`\n- x64 can use CPU variants:\n - `modern` (AVX2-capable)\n - `baseline` (fallback)\n- Non-x64 uses the default filename (no variant suffix).\n\nFilename strategy:\n\n- Release: `pi_natives.<platform>-<arch>.node`\n- x64 variant release: `pi_natives.<platform>-<arch>-modern.node` and/or `...-baseline.node`\n\n### Platform-specific variant detection\n\nFor x64, variant selection uses:\n\n- **Linux**: `/proc/cpuinfo`\n- **macOS**: `sysctl machdep.cpu.leaf7_features` / `machdep.cpu.features`\n- **Windows**: PowerShell check for `System.Runtime.Intrinsics.X86.Avx2`\n\n`PI_NATIVE_VARIANT` can explicitly force `modern` or `baseline`.\n\n### Binary distribution and extraction model\n\n`packages/natives/package.json` includes both `src` and `native` in published files. The `native/` directory stores prebuilt platform artifacts.\n\nFor compiled binaries (`PI_COMPILED` or Bun embedded runtime markers), loader behavior is:\n\n1. Check versioned user cache path: `<getNativesDir()>/<packageVersion>/...`\n2. Check legacy compiled-binary location:\n - Windows: `%LOCALAPPDATA%/omp` (fallback `%USERPROFILE%/AppData/Local/omp`)\n - non-Windows: `~/.local/bin`\n3. 
Fall back to packaged `native/` and executable directory candidates\n\nIf an embedded addon manifest is present (`embedded-addon.ts` generated by `scripts/embed-native.ts`), `native.ts` can materialize the matching embedded binary into the versioned cache directory before loading.\n\n### Validation and failure modes\n\nAfter `require(candidate)`, `validateNative(...)` verifies required exports (for example `grep`, `glob`, `highlightCode`, `PtySession`, `Shell`, `getSystemInfo`, `getWorkProfile`, `invalidateFsScanCache`).\n\nFailure paths are explicit:\n\n- **Unsupported platform tag**: throws with supported platform list\n- **No loadable candidate**: throws with all attempted paths and remediation hints\n- **Missing exports**: throws with exact missing names and rebuild command\n- **Embedded extraction errors**: records directory/write failures and includes them in final load diagnostics\n\n**Guaranteed contract (API-facing):** addon load either succeeds with a validated binding set or fails fast with actionable error text.\n\n**Implementation detail (may change):** exact candidate search order and compiled-binary fallback path ordering.\n\n## Layer 3: Rust N-API module layer\n\n`crates/pi-natives/src/lib.rs` is the Rust entry module that declares exported module ownership:\n\n- `clipboard`\n- `fd`\n- `fs_cache`\n- `glob`\n- `glob_util`\n- `grep`\n- `highlight`\n- `html`\n- `image`\n- `keys`\n- `prof`\n- `ps`\n- `pty`\n- `shell`\n- `system_info`\n- `task`\n- `text`\n\nThese modules implement the N-API symbols consumed and validated by `native.ts`. 
JS-level names are surfaced through the TS wrappers in `packages/natives/src`.\n\n**Guaranteed contract (API-facing):** Rust module exports must match the binding names expected by `validateNative` and wrapper modules.\n\n**Implementation detail (may change):** internal Rust module decomposition and helper module boundaries (`glob_util`, `task`, etc.).\n\n## Ownership boundaries\n\nAt architecture level, ownership is split as follows:\n\n- **TS wrapper/API ownership (`packages/natives/src`)**\n - public API grouping, option typing, and stable JS ergonomics\n - cancellation surface (`timeoutMs`, `AbortSignal`) exposed to callers\n- **Loader ownership (`packages/natives/src/native.ts`)**\n - runtime binary selection\n - CPU variant selection and override handling\n - compiled-binary extraction and candidate probing\n - hard validation of required native exports\n- **Rust ownership (`crates/pi-natives/src`)**\n - algorithmic and system-level implementation\n - platform-native behavior and performance-sensitive logic\n - N-API symbol implementation that TS wrappers consume\n\n## Runtime flow (high level)\n\n1. Consumer imports from `@oh-my-pi/pi-natives`.\n2. Wrapper module calls into singleton `native` binding.\n3. `native.ts` selects candidate binary for platform/arch/variant.\n4. Optional embedded binary extraction occurs for compiled distributions.\n5. Addon is loaded and export set is validated.\n6. 
Wrapper returns typed results to caller.\n\n## Glossary\n\n- **Native addon**: A `.node` binary loaded via Node-API (N-API).\n- **Platform tag**: Runtime tuple `platform-arch` (for example `darwin-arm64`).\n- **Variant**: x64 CPU-specific build flavor (`modern` AVX2, `baseline` fallback).\n- **Wrapper**: TS function/class that provides typed API over raw native exports.\n- **Declaration merging**: TS technique used by module `types.ts` files to extend `NativeBindings`.\n- **Compiled binary mode**: Runtime mode where the CLI is bundled and native addons are resolved from extracted/cache paths instead of only package-local paths.\n- **Embedded addon**: Build artifact metadata and file references generated into `embedded-addon.ts` so compiled binaries can extract matching `.node` payloads.\n- **Validation gate**: `validateNative(...)` check that rejects stale/mismatched binaries missing required exports.\n",
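The filename strategy in the architecture doc above (default tag, x64 `modern` with `baseline` fallback, `baseline`-only) can be sketched as a small TypeScript helper; `addonFilenames` is a hypothetical function mirroring the documented fallback order, not the real loader code:

```typescript
type Variant = "modern" | "baseline" | null;

// Build the ordered list of addon filenames to probe for a platform tag.
function addonFilenames(tag: string, variant: Variant): string[] {
  if (variant === "modern") {
    // Modern first, with baseline as the documented intentional fallback.
    return [
      `pi_natives.${tag}-modern.node`,
      `pi_natives.${tag}-baseline.node`,
    ];
  }
  if (variant === "baseline") {
    return [`pi_natives.${tag}-baseline.node`];
  }
  // Non-x64 (or no variant): single default filename, no suffix.
  return [`pi_natives.${tag}.node`];
}
```

Each filename is then joined against the candidate directories (`nativeDir`, `execDir`, and the compiled-mode cache paths) before any `require(...)` attempt.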
  "natives-binding-contract.md": "# Natives Binding Contract (TypeScript Side)\n\nThis document defines the TypeScript-side contract that sits between `@oh-my-pi/pi-natives` callers and the loaded N-API addon.\n\nIt focuses on three pieces:\n\n1. contract shape (`NativeBindings` + module augmentation),\n2. wrapper behavior (`src/<module>/index.ts`),\n3. public export surface (`src/index.ts`).\n\n## Implementation files\n\n- `packages/natives/src/bindings.ts`\n- `packages/natives/src/native.ts`\n- `packages/natives/src/index.ts`\n- `packages/natives/src/clipboard/types.ts`\n- `packages/natives/src/clipboard/index.ts`\n- `packages/natives/src/glob/types.ts`\n- `packages/natives/src/glob/index.ts`\n- `packages/natives/src/grep/types.ts`\n- `packages/natives/src/grep/index.ts`\n- `packages/natives/src/highlight/types.ts`\n- `packages/natives/src/highlight/index.ts`\n- `packages/natives/src/html/types.ts`\n- `packages/natives/src/html/index.ts`\n- `packages/natives/src/image/types.ts`\n- `packages/natives/src/image/index.ts`\n- `packages/natives/src/keys/types.ts`\n- `packages/natives/src/keys/index.ts`\n- `packages/natives/src/ps/types.ts`\n- `packages/natives/src/ps/index.ts`\n- `packages/natives/src/pty/types.ts`\n- `packages/natives/src/pty/index.ts`\n- `packages/natives/src/shell/types.ts`\n- `packages/natives/src/shell/index.ts`\n- `packages/natives/src/system-info/types.ts`\n- `packages/natives/src/system-info/index.ts`\n- `packages/natives/src/text/types.ts`\n- `packages/natives/src/text/index.ts`\n- `packages/natives/src/work/types.ts`\n- `packages/natives/src/work/index.ts`\n\n## Contract model\n\n`packages/natives/src/bindings.ts` defines the base contract:\n\n- `NativeBindings` (base interface, currently includes `cancelWork(id: number): void`)\n- `Cancellable` (`timeoutMs?: number`, `signal?: AbortSignal`)\n- `TsFunc<T>` callback shape used by N-API threadsafe callbacks\n\nEach module adds its own fields by declaration merging:\n\n```ts\n// 
packages/natives/src/<module>/types.ts\ndeclare module \"../bindings\" {\n\tinterface NativeBindings {\n\t\tgrep(options: GrepOptions, onMatch?: TsFunc<GrepMatch>): Promise<GrepResult>;\n\t}\n}\n```\n\nThis keeps one aggregate binding interface without a monolithic central type file.\n\n## Declaration-merging lifecycle and state transitions\n\n### 1) Compile-time type assembly\n\n- `bindings.ts` provides the base `NativeBindings` symbol.\n- Every `src/<module>/types.ts` augments `NativeBindings`.\n- `src/native.ts` imports all `./<module>/types` files for side effects so the merged contract is in scope where `NativeBindings` is used.\n\nState transition: **Base contract** → **Merged contract**.\n\n### 2) Runtime addon load and validation gate\n\n- `src/native.ts` loads candidate `.node` binaries.\n- Loaded object is treated as `NativeBindings` and immediately passed through `validateNative(...)`.\n- `validateNative` verifies required export keys by `typeof bindings[name] === \"function\"`.\n\nState transition: **Untrusted addon object** → **Validated native binding object** (or hard failure).\n\n### 3) Wrapper invocation\n\n- Module wrappers in `src/<module>/index.ts` call `native.<export>`.\n- Wrappers adapt defaults and callback shape (`(err, value)` to value-only callback patterns in JS APIs).\n- `src/index.ts` re-exports module wrappers/types as the public package API.\n\nState transition: **Validated raw bindings** → **Ergonomic public API**.\n\n## Wrapper responsibilities\n\nWrappers are intentionally thin; they do not re-implement native logic.\n\nPrimary responsibilities:\n\n- **Argument normalization/defaulting**\n - `glob()` resolves `options.path` to absolute path and defaults `hidden`, `gitignore`, `recursive`.\n - `hasMatch()` fills default flags (`ignoreCase`, `multiline`) before native call.\n- **Callback adaptation**\n - `grep()`, `glob()`, `executeShell()` convert `TsFunc<T>` (`error, value`) into user callback receiving only successful values.\n- 
**Environment or policy behavior around native calls**\n - Clipboard wrapper adds OSC52/Termux/headless handling and treats copy as best effort.\n- **Public naming and re-export curation**\n - `searchContent()` maps to native export `search`.\n\n## Public export surface organization\n\n`packages/natives/src/index.ts` is the canonical public barrel. It groups exports by capability domain:\n\n- Search/text: `grep`, `glob`, `text`, `highlight`\n- Execution/process/terminal: `shell`, `pty`, `ps`, `keys`\n- System/media/conversion: `image`, `html`, `clipboard`, `system-info`, `work`\n\nMaintainer rule: if a wrapper is not re-exported from `src/index.ts`, it is not part of the intended public package surface.\n\n## JS API ↔ native export mapping (representative)\n\nThe Rust side uses N-API export names (typically from `#[napi]` snake_case -> camelCase conversion, with occasional explicit aliases) that must match these binding keys.\n\n| Category | Public JS API (wrapper) | Native binding key | Return type | Async? 
|\n|---|---|---|---|---|\n| Grep | `grep(options, onMatch?)` | `grep` | `Promise<GrepResult>` | Yes |\n| Grep | `searchContent(content, options)` | `search` | `SearchResult` | No |\n| Grep | `hasMatch(content, pattern, opts?)` | `hasMatch` | `boolean` | No |\n| Grep | `fuzzyFind(options)` | `fuzzyFind` | `Promise<FuzzyFindResult>` | Yes |\n| Glob | `glob(options, onMatch?)` | `glob` | `Promise<GlobResult>` | Yes |\n| Glob | `invalidateFsScanCache(path?)` | `invalidateFsScanCache` | `void` | No |\n| Shell | `executeShell(options, onChunk?)` | `executeShell` | `Promise<ShellExecuteResult>` | Yes |\n| Shell | `Shell` | `Shell` | class constructor | N/A |\n| PTY | `PtySession` | `PtySession` | class constructor | N/A |\n| Text | `truncateToWidth(...)` | `truncateToWidth` | `string` | No |\n| Text | `sliceWithWidth(...)` | `sliceWithWidth` | `SliceWithWidthResult` | No |\n| Text | `visibleWidth(text)` | `visibleWidth` | `number` | No |\n| Highlight | `highlightCode(code, lang, colors)` | `highlightCode` | `string` | No |\n| HTML | `htmlToMarkdown(html, options?)` | `htmlToMarkdown` | `Promise<string>` | Yes |\n| System | `getSystemInfo()` | `getSystemInfo` | `SystemInfo` | No |\n| Work | `getWorkProfile(lastSeconds)` | `getWorkProfile` | `WorkProfile` | No |\n| Process | `killTree(pid, signal)` | `killTree` | `number` | No |\n| Process | `listDescendants(pid)` | `listDescendants` | `number[]` | No |\n| Clipboard | `copyToClipboard(text)` | `copyToClipboard` | `Promise<void>` (best effort wrapper behavior) | Yes |\n| Clipboard | `readImageFromClipboard()` | `readImageFromClipboard` | `Promise<ClipboardImage \\| null>` | Yes |\n| Keys | `parseKey(data, kittyProtocolActive)` | `parseKey` | `string \\| null` | No |\n\n## Sync vs async contract differences\n\nThe contract mixes sync and async APIs; wrappers preserve native call style rather than forcing one model:\n\n- **Promise-based async exports** for I/O or long-running work (`grep`, `glob`, `htmlToMarkdown`, 
`executeShell`, clipboard, image operations).\n- **Synchronous exports** for deterministic in-memory transforms/parsers (`search`, `hasMatch`, highlighting, text width/slicing, key parsing, process queries).\n- **Constructor exports** for stateful runtime objects (`Shell`, `PtySession`, `PhotonImage`).\n\nImplication for maintainers: changing sync ↔ async for an existing export is a breaking API and contract change across wrappers and callers.\n\n## Object and enum typing patterns\n\n### Object patterns (`#[napi(object)]`-style JS objects)\n\nTS models object-shaped native values as interfaces, for example:\n\n- `GrepResult`, `SearchResult`, `GlobResult`\n- `SystemInfo`, `WorkProfile`\n- `ClipboardImage`, `ParsedKittyResult`\n\nThese are structural contracts at compile time; runtime shape correctness is owned by native implementation.\n\n### Enum patterns\n\nNumeric native enums are represented as `const enum` values in TS:\n\n- `FileType` (`1=file`, `2=dir`, `3=symlink`)\n- `ImageFormat` (`0=PNG`, `1=JPEG`, `2=WEBP`, `3=GIF`)\n- `SamplingFilter`, `Ellipsis`, `KeyEventType`\n\nCallers see named enum members; the binding boundary passes numbers.\n\n## How mismatches are caught\n\nMismatch detection happens at two layers:\n\n1. **Compile-time TypeScript contract checks**\n - Wrappers call `native.<name>` against merged `NativeBindings`.\n - Missing/renamed binding keys break TS type-checking in wrappers.\n\n2. 
**Runtime validation in `validateNative`**\n - After load, `native.ts` checks required exports and throws if any are missing.\n - Error message includes missing keys and rebuild instruction.\n\nThis catches the common stale-binary drift: wrapper/type exists but loaded `.node` lacks the export.\n\n## Failure behavior and caveats\n\n### Load/validation failures (hard failures)\n\n- Addon load failure or unsupported platform throws during module init in `native.ts`.\n- Missing required exports throw before wrappers are usable.\n\nEffect: package fails fast rather than deferring failure to first call.\n\n### Wrapper-level behavior differences\n\n- Some wrappers intentionally soften failures (`copyToClipboard` is best effort and swallows native failure).\n- Streaming callbacks ignore callback error payloads and only forward successful value events.\n\n### Type-level caveats (runtime stricter than TS)\n\n- TS optional fields do not guarantee semantic validity; native layer can still reject malformed values.\n- `const enum` typing does not prevent out-of-range numeric values from untyped callers at runtime.\n- `validateNative` checks only presence/function-ness of required exports, not deep argument/return-shape compatibility.\n- `bindings.ts` includes `cancelWork(id)` in the base interface, but current runtime validation list does not enforce that key.\n\n## Maintainer checklist for binding changes\n\nWhen adding/changing an export, update all of:\n\n1. `src/<module>/types.ts` (augmentation + contract types)\n2. `src/<module>/index.ts` (wrapper behavior)\n3. `src/native.ts` imports for the module types (if new module)\n4. `validateNative` required export checks\n5. `src/index.ts` public re-exports\n\nSkipping any step creates either compile-time drift or runtime load-time failure.",
- "natives-build-release-debugging.md": "# Natives Build, Release, and Debugging Runbook\n\nThis runbook describes how the `@oh-my-pi/pi-natives` build pipeline produces `.node` addons, how compiled distributions load them, and how to debug loader/build failures.\n\nIt follows the architecture terms from `docs/natives-architecture.md`:\n- **build-time artifact production** (`scripts/build-native.ts`)\n- **embedded addon manifest generation** (`scripts/embed-native.ts`)\n- **runtime addon loading + validation gate** (`src/native.ts`)\n\n## Implementation files\n\n- `packages/natives/scripts/build-native.ts`\n- `packages/natives/scripts/embed-native.ts`\n- `packages/natives/package.json`\n- `packages/natives/src/native.ts`\n- `crates/pi-natives/Cargo.toml`\n\n## Build pipeline overview\n\n### 1) Build entrypoints\n\n`packages/natives/package.json` scripts:\n\n- `bun scripts/build-native.ts` (`build`) → release build\n- `bun scripts/build-native.ts --dev` (`dev:native`) → debug/dev profile build (same output naming)\n- `bun scripts/embed-native.ts` (`embed:native`) → generate `src/embedded-addon.ts` from built files\n\n### 2) Rust artifact build\n\n`build-native.ts` runs Cargo in `crates/pi-natives`:\n\n- base command: `cargo build`\n- release mode adds `--release` unless `--dev` is passed\n- cross target adds `--target <CROSS_TARGET>`\n\n`crates/pi-natives/Cargo.toml` declares `crate-type = [\"cdylib\"]`, so Cargo emits a shared library (`.so`/`.dylib`/`.dll`) that is then copied/renamed to a `.node` addon filename.\n\n### 3) Artifact discovery and install\n\nAfter Cargo completes, `build-native.ts` scans candidate output directories in order:\n\n1. `${CARGO_TARGET_DIR}` (if set)\n2. `<repo>/target`\n3. 
`crates/pi-natives/target`\n\nFor each root it checks profile directories:\n- cross build: `<root>/<crossTarget>/<profile>` then `<root>/<profile>`\n- native build: `<root>/<profile>`\n\nThen it looks for one of:\n- `libpi_natives.so`\n- `libpi_natives.dylib`\n- `pi_natives.dll`\n- `libpi_natives.dll`\n\nWhen found, it atomically installs into `packages/natives/native/` with temp-file + rename semantics (Windows fallback handles locked DLL replacement failures explicitly).\n\n## Target/variant model and naming conventions\n\n## Platform tag\n\nBoth build and runtime use platform tag:\n\n`<platform>-<arch>` (example: `darwin-arm64`, `linux-x64`)\n\n## Variant model (x64 only)\n\nx64 supports CPU variants:\n- `modern` (AVX2-capable path)\n- `baseline` (fallback)\n\nNon-x64 uses a single default artifact (no variant suffix).\n\n### Output filenames\n\nRelease builds:\n- x64: `pi_natives.<platform>-<arch>-modern.node` or `...-baseline.node`\n- non-x64: `pi_natives.<platform>-<arch>.node`\n\nDev build (`--dev`):\n- Uses debug profile flags but keeps standard platform-tagged output naming\n\nRuntime loader candidate order in `native.ts`:\n- release candidates\n- compiled mode prepends extracted/cache candidates before package-local files\n\n## Environment flags and build options\n\n## Runtime flags\n\n- `PI_DEV` (loader behavior): enable loader diagnostics\n- `PI_NATIVE_VARIANT` (loader behavior, x64 only): force `modern` or `baseline` selection at runtime\n- `PI_COMPILED` (loader behavior): enable compiled-binary candidate/extraction behavior\n\n## Build-time flags/options\n\n- `--dev` (script arg): build debug profile\n- `CROSS_TARGET`: passed to Cargo `--target`\n- `TARGET_PLATFORM`: override output platform tag naming\n- `TARGET_ARCH`: override output arch naming\n- `TARGET_VARIANT` (x64 only): force `modern` or `baseline` for output filename and RUSTFLAGS policy\n- `CARGO_TARGET_DIR`: additional root when searching Cargo outputs\n- `RUSTFLAGS`:\n - if unset and not 
cross-compiling, script sets:\n - modern: `-C target-cpu=x86-64-v3`\n - baseline: `-C target-cpu=x86-64-v2`\n - non-x64 / no variant: `-C target-cpu=native`\n - if already set, script does not override\n\n## Build state/lifecycle transitions\n\n### Build lifecycle (`build-native.ts`)\n\n1. **Init**: parse args/env (`--dev`, target overrides, cross flags)\n2. **Variant resolve**:\n - non-x64 → no variant\n - x64 + `TARGET_VARIANT` → explicit variant\n - x64 cross-build without `TARGET_VARIANT` → hard error\n - x64 local build without override → detect host AVX2\n3. **Compile**: run Cargo with resolved profile/target\n4. **Locate artifact**: scan target roots/profile dirs/library names\n5. **Install**: copy + atomic rename into `packages/natives/native`\n6. **Complete**: output addon ready for loader candidates\n\nFailure exits happen at any stage with explicit error text (invalid variant, failed cargo build, missing output library, install/rename failure).\n\n### Embed lifecycle (`embed-native.ts`)\n\n1. **Init**: compute platform tag from `TARGET_PLATFORM`/`TARGET_ARCH` or host values\n2. **Candidate set**:\n - x64 expects both `modern` and `baseline`\n - non-x64 expects one default file\n3. **Validate availability** in `packages/natives/native`\n4. **Generate manifest** (`src/embedded-addon.ts`) with Bun `file` imports and package version\n5. **Runtime extraction ready** for compiled mode\n\n`--reset` bypasses validation and writes a null manifest stub (`embeddedAddon = null`).\n\n## Dev workflow vs shipped/compiled behavior\n\n## Local development workflow\n\nTypical local loop:\n\n1. Build addon:\n - release: `bun --cwd=packages/natives run build`\n - debug profile: `bun --cwd=packages/natives run dev:native`\n2. Set `PI_DEV=1` when testing loader diagnostics\n3. Loader in `native.ts` resolves package-local `native/` (and executable-dir fallback) candidates\n4. 
`validateNative` enforces export compatibility before wrappers use the binding\n\n## Shipped/compiled binary workflow\n\nIn compiled mode (`PI_COMPILED` or Bun embedded markers):\n\n1. Loader computes versioned cache dir: `<getNativesDir()>/<packageVersion>` (operationally `~/.omp/natives/<version>`)\n2. If embedded manifest matches current platform+version, loader may extract selected embedded file into that versioned dir\n3. Runtime candidate order includes:\n - versioned cache dir\n - legacy compiled-binary dir (`%LOCALAPPDATA%/omp` on Windows, `~/.local/bin` elsewhere)\n - package/executable directories\n4. First successfully loaded addon still must pass `validateNative`\n\nThis is why packaging + runtime loader expectations must align: filenames, platform tags, and exported symbols must match what `native.ts` probes and validates.\n\n## JS API ↔ Rust export mapping (validation gate subset)\n\n`native.ts` requires these JS-visible exports to exist on the loaded addon. They map to Rust N-API exports in `crates/pi-natives/src`:\n\n| JS name required by `validateNative` | Rust export declaration | Rust source file |\n| --- | --- | --- |\n| `glob` | `#[napi] pub fn glob(...)` | `crates/pi-natives/src/glob.rs` |\n| `grep` | `#[napi] pub fn grep(...)` | `crates/pi-natives/src/grep.rs` |\n| `search` | `#[napi] pub fn search(...)` | `crates/pi-natives/src/grep.rs` |\n| `highlightCode` | `#[napi] pub fn highlight_code(...)` | `crates/pi-natives/src/highlight.rs` |\n| `getSystemInfo` | `#[napi] pub fn get_system_info(...)` | `crates/pi-natives/src/system_info.rs` |\n| `getWorkProfile` | `#[napi] pub fn get_work_profile(...)` (camel-cased export) | `crates/pi-natives/src/prof.rs` |\n| `invalidateFsScanCache` | `#[napi] pub fn invalidate_fs_scan_cache(...)` | `crates/pi-natives/src/fs_cache.rs` |\n\nIf any required symbol is missing, loader fails fast with a rebuild hint.\n\n## Failure behavior and diagnostics\n\n## Build-time failures\n\n- Invalid variant configuration:\n 
- `TARGET_VARIANT` set on non-x64 → immediate error\n - x64 cross-build without explicit `TARGET_VARIANT` → immediate error\n- Cargo build failure:\n - script surfaces non-zero exit and stderr\n- Artifact not found:\n - script prints every checked profile directory\n- Install failure:\n - explicit message; Windows includes locked-file hint\n\n## Runtime loader failures (`native.ts`)\n\n- Unsupported platform tag:\n - throws with supported platform list\n- No candidate could load:\n - throws with full candidate error list and mode-specific remediation hints\n- Missing exports:\n - throws with exact missing symbol names and rebuild command\n- Embedded extraction problems:\n - extraction mkdir/write errors recorded and included in final diagnostics\n\n## Troubleshooting matrix\n\n| Symptom | Likely cause | Verify | Fix |\n| --- | --- | --- | --- |\n| `Native addon missing exports ... Missing: <name>` | Stale `.node` binary, Rust export name mismatch, or wrong binary loaded | Run with `PI_DEV=1` to see loaded path; inspect export list for that file | Rebuild `build`; ensure Rust `#[napi]` export name (or explicit alias when needed) matches JS key; remove stale cached/versioned files |\n| x64 machine loads baseline when modern expected | `PI_NATIVE_VARIANT=baseline`, no AVX2 detected, or only baseline file present | Check `PI_NATIVE_VARIANT`; inspect `native/` for `-modern` file | Build modern variant (`TARGET_VARIANT=modern ... 
build`) and ensure file is shipped |\n| Cross-build produces unusable/wrong-labeled binary | Mismatch between `CROSS_TARGET` and `TARGET_PLATFORM`/`TARGET_ARCH`, or missing `TARGET_VARIANT` for x64 | Confirm env tuple and output filename | Re-run with consistent env values and explicit x64 `TARGET_VARIANT` |\n| Compiled binary fails after upgrade | Stale extracted cache (`~/.omp/natives/<old-or-mismatched-version>`) or embedded manifest mismatch | Inspect versioned natives dir and loader error list | Delete versioned natives cache for the package version and rerun; regenerate embedded manifest during packaging |\n| Loader probes many paths and none work | Platform mismatch or missing release artifact in package `native/` | Check `platformTag` vs actual filename(s) | Ensure built filename exactly matches `pi_natives.<platform>-<arch>(-variant).node` convention and package includes `native/` |\n| `embed:native` fails with \"Incomplete native addons\" | Required variant files not built before embedding | Check expected vs found list in error text | Build required files first (x64: both modern+baseline; non-x64: default), then rerun `embed:native` |\n\n## Operational commands\n\n```bash\n# Release artifact for current host\nbun --cwd=packages/natives run build\n\n# Debug profile artifact build\nbun --cwd=packages/natives run dev:native\n\n# Build explicit x64 variants\nTARGET_VARIANT=modern bun --cwd=packages/natives run build\nTARGET_VARIANT=baseline bun --cwd=packages/natives run build\n\n# Generate embedded addon manifest from built native files\nbun --cwd=packages/natives run embed:native\n\n# Reset embedded manifest to null stub\nbun --cwd=packages/natives run embed:native -- --reset\n```",
+ "natives-build-release-debugging.md": "# Natives Build, Release, and Debugging Runbook\n\nThis runbook describes how the `@oh-my-pi/pi-natives` build pipeline produces `.node` addons, how compiled distributions load them, and how to debug loader/build failures.\n\nIt follows the architecture terms from `docs/natives-architecture.md`:\n- **build-time artifact production** (`scripts/build-native.ts`)\n- **embedded addon manifest generation** (`scripts/embed-native.ts`)\n- **runtime addon loading + validation gate** (`src/native.ts`)\n\n## Implementation files\n\n- `packages/natives/scripts/build-native.ts`\n- `packages/natives/scripts/embed-native.ts`\n- `packages/natives/package.json`\n- `packages/natives/src/native.ts`\n- `crates/pi-natives/Cargo.toml`\n\n## Build pipeline overview\n\n### 1) Build entrypoints\n\n`packages/natives/package.json` scripts:\n\n- `bun scripts/build-native.ts` (`build`) → release build\n- `bun scripts/embed-native.ts` (`embed:native`) → generate `src/embedded-addon.ts` from built files\n\n### 2) Rust artifact build\n\n`build-native.ts` runs Cargo in `crates/pi-natives`:\n\n- base command: `cargo build --release`\n- cross target adds `--target <CROSS_TARGET>`\n\n`crates/pi-natives/Cargo.toml` declares `crate-type = [\"cdylib\"]`, so Cargo emits a shared library (`.so`/`.dylib`/`.dll`) that is then copied/renamed to a `.node` addon filename.\n\n### 3) Artifact discovery and install\n\nAfter Cargo completes, `build-native.ts` scans candidate output directories in order:\n\n1. `${CARGO_TARGET_DIR}` (if set)\n2. `<repo>/target`\n3. 
`crates/pi-natives/target`\n\nFor each root it checks profile directories:\n- cross build: `<root>/<crossTarget>/<profile>` then `<root>/<profile>`\n- native build: `<root>/<profile>`\n\nThen it looks for one of:\n- `libpi_natives.so`\n- `libpi_natives.dylib`\n- `pi_natives.dll`\n- `libpi_natives.dll`\n\nWhen found, it atomically installs into `packages/natives/native/` with temp-file + rename semantics (Windows fallback handles locked DLL replacement failures explicitly).\n\n## Target/variant model and naming conventions\n\n### Platform tag\n\nBoth build and runtime use the platform tag:\n\n`<platform>-<arch>` (example: `darwin-arm64`, `linux-x64`)\n\n### Variant model (x64 only)\n\nx64 supports CPU variants:\n- `modern` (AVX2-capable path)\n- `baseline` (fallback)\n\nNon-x64 uses a single default artifact (no variant suffix).\n\n### Output filenames\n\nRelease builds:\n- x64: `pi_natives.<platform>-<arch>-modern.node` or `...-baseline.node`\n- non-x64: `pi_natives.<platform>-<arch>.node`\n\nRuntime loader candidate order in `native.ts`:\n- release candidates\n- compiled mode prepends extracted/cache candidates before package-local files\n\n## Environment flags and build options\n\n### Runtime flags\n\n- `PI_NATIVE_VARIANT` (loader behavior, x64 only): force `modern` or `baseline` selection at runtime\n- `PI_COMPILED` (loader behavior): enable compiled-binary candidate/extraction behavior\n\n### Build-time flags/options\n\n- `CROSS_TARGET`: passed to Cargo `--target`\n- `TARGET_PLATFORM`: override output platform tag naming\n- `TARGET_ARCH`: override output arch naming\n- `TARGET_VARIANT` (x64 only): force `modern` or `baseline` for output filename and RUSTFLAGS policy\n- `CARGO_TARGET_DIR`: additional root when searching Cargo outputs\n- `RUSTFLAGS`:\n - if unset and not cross-compiling, script sets:\n - modern: `-C target-cpu=x86-64-v3`\n - baseline: `-C target-cpu=x86-64-v2`\n - non-x64 / no variant: `-C target-cpu=native`\n - if already set, script does not 
override\n\n## Build state/lifecycle transitions\n\n### Build lifecycle (`build-native.ts`)\n\n1. **Init**: parse args/env (target overrides, cross flags)\n2. **Variant resolve**:\n - non-x64 → no variant\n - x64 + `TARGET_VARIANT` → explicit variant\n - x64 cross-build without `TARGET_VARIANT` → hard error\n - x64 local build without override → detect host AVX2\n3. **Compile**: run Cargo with resolved profile/target\n4. **Locate artifact**: scan target roots/profile dirs/library names\n5. **Install**: copy + atomic rename into `packages/natives/native`\n6. **Complete**: output addon ready for loader candidates\n\nFailure exits happen at any stage with explicit error text (invalid variant, failed cargo build, missing output library, install/rename failure).\n\n### Embed lifecycle (`embed-native.ts`)\n\n1. **Init**: compute platform tag from `TARGET_PLATFORM`/`TARGET_ARCH` or host values\n2. **Candidate set**:\n - x64 expects both `modern` and `baseline`\n - non-x64 expects one default file\n3. **Validate availability** in `packages/natives/native`\n4. **Generate manifest** (`src/embedded-addon.ts`) with Bun `file` imports and package version\n5. **Runtime extraction ready** for compiled mode\n\n`--reset` bypasses validation and writes a null manifest stub (`embeddedAddon = null`).\n\n## Dev workflow vs shipped/compiled behavior\n\n### Local development workflow\n\nTypical local loop:\n\n1. Build addon: `bun --cwd=packages/natives run build`\n2. Loader in `native.ts` resolves package-local `native/` (and executable-dir fallback) candidates\n3. `validateNative` enforces export compatibility before wrappers use the binding\n\n### Shipped/compiled binary workflow\n\nIn compiled mode (`PI_COMPILED` or Bun embedded markers):\n\n1. Loader computes versioned cache dir: `<getNativesDir()>/<packageVersion>` (operationally `~/.omp/natives/<version>`)\n2. If embedded manifest matches current platform+version, loader may extract selected embedded file into that versioned dir\n3. 
Runtime candidate order includes:\n - versioned cache dir\n - legacy compiled-binary dir (`%LOCALAPPDATA%/omp` on Windows, `~/.local/bin` elsewhere)\n - package/executable directories\n4. First successfully loaded addon still must pass `validateNative`\n\nThis is why packaging + runtime loader expectations must align: filenames, platform tags, and exported symbols must match what `native.ts` probes and validates.\n\n## JS API ↔ Rust export mapping (validation gate subset)\n\n`native.ts` requires these JS-visible exports to exist on the loaded addon. They map to Rust N-API exports in `crates/pi-natives/src`:\n\n| JS name required by `validateNative` | Rust export declaration | Rust source file |\n| --- | --- | --- |\n| `glob` | `#[napi] pub fn glob(...)` | `crates/pi-natives/src/glob.rs` |\n| `grep` | `#[napi] pub fn grep(...)` | `crates/pi-natives/src/grep.rs` |\n| `search` | `#[napi] pub fn search(...)` | `crates/pi-natives/src/grep.rs` |\n| `highlightCode` | `#[napi] pub fn highlight_code(...)` | `crates/pi-natives/src/highlight.rs` |\n| `getSystemInfo` | `#[napi] pub fn get_system_info(...)` | `crates/pi-natives/src/system_info.rs` |\n| `getWorkProfile` | `#[napi] pub fn get_work_profile(...)` (camel-cased export) | `crates/pi-natives/src/prof.rs` |\n| `invalidateFsScanCache` | `#[napi] pub fn invalidate_fs_scan_cache(...)` | `crates/pi-natives/src/fs_cache.rs` |\n\nIf any required symbol is missing, loader fails fast with a rebuild hint.\n\n## Failure behavior and diagnostics\n\n### Build-time failures\n\n- Invalid variant configuration:\n - `TARGET_VARIANT` set on non-x64 → immediate error\n - x64 cross-build without explicit `TARGET_VARIANT` → immediate error\n- Cargo build failure:\n - script surfaces non-zero exit and stderr\n- Artifact not found:\n - script prints every checked profile directory\n- Install failure:\n - explicit message; Windows includes locked-file hint\n\n### Runtime loader failures (`native.ts`)\n\n- Unsupported platform tag:\n - throws 
with supported platform list\n- No candidate could load:\n - throws with full candidate error list and mode-specific remediation hints\n- Missing exports:\n - throws with exact missing symbol names and rebuild command\n- Embedded extraction problems:\n - extraction mkdir/write errors recorded and included in final diagnostics\n\n## Troubleshooting matrix\n\n| Symptom | Likely cause | Verify | Fix |\n| --- | --- | --- | --- |\n| `Native addon missing exports ... Missing: <name>` | Stale `.node` binary, Rust export name mismatch, or wrong binary loaded | Inspect export list for the binary | Rebuild `build`; ensure Rust `#[napi]` export name (or explicit alias when needed) matches JS key; remove stale cached/versioned files |\n| x64 machine loads baseline when modern expected | `PI_NATIVE_VARIANT=baseline`, no AVX2 detected, or only baseline file present | Check `PI_NATIVE_VARIANT`; inspect `native/` for `-modern` file | Build modern variant (`TARGET_VARIANT=modern ... build`) and ensure file is shipped |\n| Cross-build produces unusable/wrong-labeled binary | Mismatch between `CROSS_TARGET` and `TARGET_PLATFORM`/`TARGET_ARCH`, or missing `TARGET_VARIANT` for x64 | Confirm env tuple and output filename | Re-run with consistent env values and explicit x64 `TARGET_VARIANT` |\n| Compiled binary fails after upgrade | Stale extracted cache (`~/.omp/natives/<old-or-mismatched-version>`) or embedded manifest mismatch | Inspect versioned natives dir and loader error list | Delete versioned natives cache for the package version and rerun; regenerate embedded manifest during packaging |\n| Loader probes many paths and none work | Platform mismatch or missing release artifact in package `native/` | Check `platformTag` vs actual filename(s) | Ensure built filename exactly matches `pi_natives.<platform>-<arch>(-variant).node` convention and package includes `native/` |\n| `embed:native` fails with \"Incomplete native addons\" | Required variant files not built before embedding | 
Check expected vs found list in error text | Build required files first (x64: both modern+baseline; non-x64: default), then rerun `embed:native` |\n\n## Operational commands\n\n```bash\n# Release artifact for current host\nbun --cwd=packages/natives run build\n\n# Build explicit x64 variants\nTARGET_VARIANT=modern bun --cwd=packages/natives run build\nTARGET_VARIANT=baseline bun --cwd=packages/natives run build\n\n# Generate embedded addon manifest from built native files\nbun --cwd=packages/natives run embed:native\n\n# Reset embedded manifest to null stub\nbun --cwd=packages/natives run embed:native -- --reset\n```",
  "natives-media-system-utils.md": "# Natives media + system utilities\n\nThis document is a subsystem deep-dive for the **system/media/conversion primitives** layer described in [`docs/natives-architecture.md`](./natives-architecture.md): `image`, `html`, `clipboard`, and `work` profiling.\n\n## Implementation files\n\n- `crates/pi-natives/src/image.rs`\n- `crates/pi-natives/src/html.rs`\n- `crates/pi-natives/src/clipboard.rs`\n- `crates/pi-natives/src/prof.rs`\n- `crates/pi-natives/src/task.rs`\n- `packages/natives/src/image/index.ts`\n- `packages/natives/src/image/types.ts`\n- `packages/natives/src/html/index.ts`\n- `packages/natives/src/html/types.ts`\n- `packages/natives/src/clipboard/index.ts`\n- `packages/natives/src/clipboard/types.ts`\n- `packages/natives/src/work/index.ts`\n- `packages/natives/src/work/types.ts`\n\n> Note: there is no `crates/pi-natives/src/work.rs`; work profiling is implemented in `prof.rs` and fed by instrumentation in `task.rs`.\n\n## TS API ↔ Rust export/module mapping\n\n| TS export (packages/natives) | Rust N-API export | Rust module |\n| ------------------------------------------- | ----------------------------------------------------------------------- | ------------------------------------- |\n| `PhotonImage.parse(bytes)` | `PhotonImage::parse` | `image.rs` |\n| `PhotonImage#resize(width, height, filter)` | `PhotonImage::resize` | `image.rs` |\n| `PhotonImage#encode(format, quality)` | `PhotonImage::encode` | `image.rs` |\n| `htmlToMarkdown(html, options)` | `html_to_markdown` | `html.rs` |\n| `copyToClipboard(text)` | `copy_to_clipboard` + TS fallback logic | `clipboard.rs` + `clipboard/index.ts` |\n| `readImageFromClipboard()` | `read_image_from_clipboard` | `clipboard.rs` |\n| `getWorkProfile(lastSeconds)` | `get_work_profile` | `prof.rs` |\n\n## Data format boundaries and conversions\n\n### Image (`image`)\n\n- **JS input boundary**: `Uint8Array` encoded image bytes.\n- **Rust decode boundary**: bytes are copied to 
`Vec<u8>`, format is guessed with `ImageReader::with_guessed_format()`, then decoded to `DynamicImage`.\n- **In-memory state**: `PhotonImage` stores `Arc<DynamicImage>`.\n- **Output boundary**: `encode(format, quality)` returns `Promise<Uint8Array>` (Rust `Vec<u8>`).\n\nFormat IDs are numeric:\n\n- `0`: PNG\n- `1`: JPEG\n- `2`: WebP (lossless encoder)\n- `3`: GIF\n\nConstraints:\n\n- `quality` is only used for JPEG.\n- PNG/WebP/GIF ignore `quality`.\n- Unsupported format IDs fail (`Invalid image format: <id>`).\n\n### HTML conversion (`html`)\n\n- **JS input boundary**: HTML `string` + optional object `{ cleanContent?: boolean; skipImages?: boolean }`.\n- **Rust conversion boundary**: `String` input is converted by `html_to_markdown_rs::convert`.\n- **Output boundary**: Markdown `string`.\n\nConversion behavior:\n\n- `cleanContent` defaults to `false`.\n- When `cleanContent=true`, preprocessing is enabled with `PreprocessingPreset::Aggressive` and hard-removal flags for navigation/forms.\n- `skipImages` defaults to `false`.\n\n### Clipboard (`clipboard`)\n\n- **Text path**:\n - TS first emits OSC 52 (`\\x1b]52;c;<base64>\\x07`) when stdout is a TTY.\n - Same text is then attempted via native clipboard API (`native.copyToClipboard`) as best-effort.\n - On Termux, TS attempts `termux-clipboard-set` first.\n- **Image read path**:\n - Rust reads raw image from `arboard`.\n - Rust re-encodes it to PNG bytes (`image` crate), returns `{ data: Uint8Array, mimeType: \"image/png\" }`.\n - TS returns `null` early on Termux or Linux sessions without display server (`DISPLAY`/`WAYLAND_DISPLAY` missing).\n\n### Work profiling (`work`)\n\n- **Collection boundary**: profiling samples are produced by `profile_region(tag)` guards in `task::blocking` and `task::future`.\n- **Storage format**: fixed-size circular buffer (`MAX_SAMPLES = 10_000`) storing stack path + duration (`μs`) + timestamp (`μs since process start`).\n- **Output boundary**: `getWorkProfile(lastSeconds)` returns 
object:\n - `folded`: folded-stack text (flamegraph input)\n - `summary`: markdown table summary\n - `svg`: optional flamegraph SVG\n - `totalMs`, `sampleCount`\n\n## Lifecycle and state transitions\n\n### Image lifecycle\n\n1. `PhotonImage.parse(bytes)` schedules a blocking decode task (`image.decode`).\n2. On success, a native `PhotonImage` handle exists in JS.\n3. `resize(...)` creates a new native handle (`image.resize`); old and new handles can coexist.\n4. `encode(...)` materializes bytes (`image.encode`) without mutating image dimensions.\n\nFailure transitions:\n\n- Format detection/decode failure rejects parse promise.\n- Encode failure rejects encode promise.\n- Invalid format ID rejects encode promise.\n\n### HTML lifecycle\n\n1. `htmlToMarkdown(html, options)` schedules a blocking conversion task.\n2. Conversion runs with defaulted options (`cleanContent=false`, `skipImages=false`) unless specified.\n3. Returns markdown string or rejects.\n\nFailure transitions:\n\n- Converter failure returns rejected promise (`Conversion error: ...`).\n\n### Clipboard lifecycle\n\n`copyToClipboard(text)` is intentionally best-effort and multi-path:\n\n1. If TTY: attempt OSC 52 write (base64 payload).\n2. Try Termux command when `TERMUX_VERSION` is set.\n3. Try native `arboard` text copy.\n4. Swallow errors at TS layer.\n\n`readImageFromClipboard()` strictness differs by stage:\n\n1. TS hard-gates unsupported runtime contexts (Termux/headless Linux) to `null`.\n2. Rust `arboard` read runs only when TS allows it.\n3. `ContentNotAvailable` maps to `null`.\n4. Other Rust errors reject.\n\n### Work profiling lifecycle\n\n1. No explicit start: profiling is always on when task helpers execute.\n2. Every instrumented task scope records one sample on guard drop.\n3. Samples overwrite oldest entries after buffer capacity is reached.\n4. 
`getWorkProfile(lastSeconds)` reads a time window and derives folded/summary/svg artifacts.\n\nFailure transitions:\n\n- SVG generation failure is soft-fail (`svg: null`), while folded and summary still return.\n- Empty sample window returns empty folded data and `svg: null`, not an error.\n\n## Unsupported operations and error propagation\n\n### Image\n\n- Unsupported decode input or corrupted bytes: strict failure (promise rejection).\n- Unsupported encode format ID: strict failure.\n- No best-effort fallback path in TS wrapper.\n\n### HTML\n\n- Conversion errors are strict failures (rejection).\n- Option omission is best-effort defaulting, not failure.\n\n### Clipboard\n\n- Text copy is best-effort at TS layer: operational failures are suppressed.\n- Image read distinguishes \"no image\" (`null`) from operational failure (rejection).\n- Termux/headless Linux are treated as unsupported contexts for image read (`null`).\n\n### Work profiling\n\n- Retrieval is strict for function call itself, but artifact generation is partially best-effort (`svg` nullable).\n- Buffer truncation is expected behavior (ring buffer), not data loss bug.\n\n## Platform caveats\n\n- **Clipboard text**: OSC 52 depends on terminal support; native clipboard access depends on desktop environment/session.\n- **Clipboard image read**: blocked in TS for Termux and Linux without display server.\n",
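The OSC 52 sequence quoted above (`\x1b]52;c;<base64>\x07`) can be sketched as a tiny self-contained helper; `osc52Copy` is a hypothetical name, and the real wrapper additionally gates on `process.stdout.isTTY` and falls back to Termux/native paths:

```typescript
// Sketch of the OSC 52 "set system clipboard" escape sequence described above.
// "c" selects the system clipboard; the payload is base64-encoded UTF-8 text.
function osc52Copy(text: string): string {
  const payload = Buffer.from(text, "utf8").toString("base64");
  return `\x1b]52;c;${payload}\x07`;
}
```

A caller would write this string to stdout only when it is a TTY, since only terminals interpret the sequence.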
  "natives-rust-task-cancellation.md": "# Native Rust task execution and cancellation (`pi-natives`)\n\nThis document describes how `crates/pi-natives` schedules native work and how cancellation flows from JS options (`timeoutMs`, `AbortSignal`) to Rust execution.\n\n## Implementation files\n\n- `crates/pi-natives/src/task.rs`\n- `crates/pi-natives/src/grep.rs`\n- `crates/pi-natives/src/glob.rs`\n- `crates/pi-natives/src/fd.rs`\n- `crates/pi-natives/src/shell.rs`\n- `crates/pi-natives/src/pty.rs`\n- `crates/pi-natives/src/html.rs`\n- `crates/pi-natives/src/image.rs`\n- `crates/pi-natives/src/clipboard.rs`\n- `crates/pi-natives/src/text.rs`\n- `crates/pi-natives/src/ps.rs`\n\n## Core primitives (`task.rs`)\n\n`task.rs` defines three core pieces:\n\n1. `task::blocking(tag, cancel_token, work)`\n - Wraps `napi::AsyncTask` / `Task`.\n - `compute()` runs on libuv worker threads (for CPU-bound or blocking/sync system calls).\n - Returns a JS `Promise<T>`.\n\n2. `task::future(env, tag, work)`\n - Wraps `env.spawn_future(...)`.\n - Runs async work on Tokio runtime.\n - Returns `PromiseRaw<'env, T>`.\n\n3. 
`CancelToken` / `AbortToken` / `AbortReason`\n - `CancelToken::new(timeout_ms, signal)` combines deadline + optional `AbortSignal`.\n - `CancelToken::heartbeat()` is cooperative cancellation for blocking loops.\n - `CancelToken::wait()` is async cancellation wait (`Signal` / `Timeout` / `User` Ctrl-C).\n - `AbortToken` lets external code request abort (`abort(reason)`).\n\n## `blocking` vs `future`: execution model and selection\n\n### Use `task::blocking`\n\nUse when work is CPU-heavy or fundamentally synchronous/blocking:\n\n- regex/file scanning (`grep`, `glob`, `fuzzy_find`)\n- synchronous PTY loop internals (`run_pty_sync` via `spawn_blocking`)\n- clipboard/image/html conversions\n\nBehavior:\n\n- Work closure receives a cloned `CancelToken`.\n- Cancellation is only observed where code checks `ct.heartbeat()?`.\n- Closure `Err(...)` rejects JS promise.\n\n### Use `task::future`\n\nUse when work must `await` async operations:\n\n- shell session orchestration (`shell.run`, `executeShell`)\n- task racing (`tokio::select!`) between completion and cancellation\n\nBehavior:\n\n- Future can race normal completion against `ct.wait()`.\n- On cancel path, async implementations typically propagate cancellation to inner subsystems (e.g., `tokio_util::CancellationToken`) and optionally force abort on grace timeout.\n\n## JS API ↔ Rust export mapping (task/cancel relevant)\n\n| JS-facing API | Rust export (`#[napi]`) | Scheduler | Cancellation hookup |\n|---|---|---|---|\n| `grep(options, onMatch?)` | `grep` | `task::blocking(\"grep\", ct, ...)` | `CancelToken::new(options.timeoutMs, options.signal)` + `ct.heartbeat()` |\n| `glob(options, onMatch?)` | `glob` | `task::blocking(\"glob\", ct, ...)` | `CancelToken::new(...)` + `ct.heartbeat()` in filter loop |\n| `fuzzyFind(options)` | `fuzzy_find` | `task::blocking(\"fuzzy_find\", ct, ...)` | `CancelToken::new(...)` + `ct.heartbeat()` in scoring loop |\n| `shell.run(options, onChunk?)` | `Shell::run` | `task::future(env, 
\"shell.run\", ...)` | `ct.wait()` raced against run task; bridges to Tokio `CancellationToken` |\n| `executeShell(options, onChunk?)` | `execute_shell` | `task::future(env, \"shell.execute\", ...)` | same as above |\n| `pty.start(options, onChunk?)` | `PtySession::start` | `task::future(env, \"pty.start\", ...)` + inner `spawn_blocking` | `CancelToken` checked in sync PTY loop via `heartbeat()` |\n| `htmlToMarkdown(html, options?)` | `html_to_markdown` | `task::blocking(\"html_to_markdown\", (), ...)` | none (`()` token) |\n| `PhotonImage.parse/encode/resize` | `PhotonImage::{parse,encode,resize}` | `task::blocking(...)` | none (`()` token) |\n| `copyToClipboard/readImageFromClipboard` | `copy_to_clipboard` / `read_image_from_clipboard` | `task::blocking(...)` | none (`()` token) |\n\n`text.rs` and `ps.rs` currently do not use `task::blocking`/`task::future` and therefore do not participate in this cancellation path.\n\n## Cancellation lifecycle and state transitions\n\n### `CancelToken` lifecycle\n\n`CancelToken` is cooperative and stateful:\n\n```text\nCreated\n ├─ no signal + no timeout -> passive token (never aborts unless externally emplaced)\n ├─ signal registered -> waits for AbortSignal callback\n └─ deadline set -> timeout check becomes active\n\nRunning\n ├─ heartbeat()/wait() sees signal -> AbortReason::Signal\n ├─ heartbeat()/wait() sees deadline -> AbortReason::Timeout\n ├─ wait() sees Ctrl-C -> AbortReason::User\n └─ no abort -> continue\n\nAborted (terminal)\n └─ first abort reason wins (atomic flag + notifier)\n```\n\n### Before-start vs mid-execution cancellation\n\n- **Before start / before first cancellation check**:\n - `task::future` users that race on `ct.wait()` can resolve cancel immediately once they enter `select!`.\n - `task::blocking` users only observe cancellation when closure code reaches `heartbeat()`. 
If closure does not heartbeat early, cancellation is delayed.\n\n- **Mid-execution**:\n - `blocking`: next `heartbeat()` returns `Err(\"Aborted: ...\")`.\n - `future`: `ct.wait()` branch wins `select!`, then code cancels subordinate async machinery (for shell: cancels Tokio token, waits up to 2s, then aborts task).\n\n## Heartbeat expectations for long-running loops\n\n`heartbeat()` must run at predictable cadence in loops with unbounded or large work sets.\n\nObserved patterns:\n\n- `glob::filter_entries`: check each entry before filtering/matching.\n- `fd::score_entries`: check each scanned candidate.\n- `grep_sync`: explicit cancellation check before heavy search phase, plus fs-cache calls that also receive token.\n- `run_pty_sync`: check every loop tick (~16ms sleep cadence) and kill child on cancellation.\n\nPractical rule: no loop over external-size input should exceed a short bounded interval without a heartbeat.\n\n## Failure behavior and error propagation to JS\n\n### Blocking tasks\n\nError path:\n\n1. Closure returns `Err(napi::Error)` (including `heartbeat()` abort).\n2. `Task::compute()` returns `Err`.\n3. `AsyncTask` rejects JS promise.\n\nTypical error strings:\n\n- `Aborted: Timeout`\n- `Aborted: Signal`\n- domain errors (`Failed to decode image: ...`, `Conversion error: ...`, etc.)\n\n### Future tasks\n\nError path:\n\n1. Async body returns `Err(napi::Error)` or join failure is mapped (`... task failed: {err}`).\n2. `task::future`-spawned promise rejects.\n3. Some APIs intentionally return structured cancellation results instead of rejection (`ShellRunResult`/`ShellExecuteResult` with `cancelled`/`timed_out` flags and `exit_code: None`).\n\n### Cancellation reporting split\n\n- **Abort as error**: most blocking exports using `heartbeat()?`.\n- **Abort as typed result**: shell/pty style command APIs that model cancellation in result structs.\n\nChoose one model per API and document it explicitly.\n\n## Common pitfalls\n\n1. 
**Missing heartbeat in blocking loops**\n - Symptom: timeout/signal appears ignored until loop ends.\n - Fix: add `ct.heartbeat()?` at loop top and before expensive per-item steps.\n\n2. **Long uncancelable sections**\n - Symptom: cancellation latency spikes during single large call (decode, sort, compression, etc.).\n - Fix: split work into chunks with heartbeat boundaries; if impossible, document latency.\n\n3. **Blocking async executor**\n - Symptom: async API stalls when sync-heavy code runs directly in future.\n - Fix: move CPU/sync blocks to `task::blocking` or `tokio::task::spawn_blocking`.\n\n4. **Inconsistent cancel semantics**\n - Symptom: one API rejects on cancel, another resolves with flags, confusing callers.\n - Fix: standardize per domain and keep wrapper docs aligned.\n\n5. **Forgetting cancellation bridge in nested async tasks**\n - Symptom: outer token is cancelled but inner readers/subprocess tasks keep running.\n - Fix: bridge cancellation to inner token/signal and enforce grace timeout + forced abort fallback.\n\n## Checklist for new cancellable exports\n\n1. Classify work correctly:\n - CPU-bound or sync blocking -> `task::blocking`\n - async I/O / `await` orchestration -> `task::future`\n\n2. Expose cancel inputs when needed:\n - include `timeoutMs` and `signal` in `#[napi(object)]` options\n - create `let ct = task::CancelToken::new(timeout_ms, signal);`\n\n3. Wire cancellation through all layers:\n - blocking loops: `ct.heartbeat()?` at stable intervals\n - async orchestration: race with `ct.wait()` and cancel sub-tasks/tokens\n\n4. Decide cancellation contract:\n - reject promise with abort error, or\n - resolve typed `{ cancelled, timedOut, ... }`\n - keep this contract consistent for the API family\n\n5. Propagate failures with context:\n - map errors via `Error::from_reason(format!(\"...: {err}\"))`\n - include stage-specific prefixes (`spawn`, `decode`, `wait`, etc.)\n\n6. 
Handle before-start and mid-flight cancellation:\n - cancellation check/await must happen before expensive body and during long execution\n\n7. Validate no executor misuse:\n - no long sync work directly inside async futures without `spawn_blocking`/blocking task wrapper\n",
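The `CancelToken` contract above (deadline + optional `AbortSignal`, first abort reason wins, cooperative `heartbeat()`), can be sketched in TypeScript; this is an illustrative model of the Rust semantics, not the actual implementation:

```typescript
// Hypothetical sketch of the cooperative cancellation contract described above.
type AbortReason = "Timeout" | "Signal" | "User";

class CancelToken {
  private reason: AbortReason | null = null;
  private readonly deadline: number | null;

  constructor(timeoutMs?: number, signal?: AbortSignal) {
    this.deadline = timeoutMs != null ? Date.now() + timeoutMs : null;
    // First abort reason wins: a later cause never overwrites an earlier one.
    signal?.addEventListener("abort", () => { this.reason ??= "Signal"; });
  }

  abort(reason: AbortReason): void {
    this.reason ??= reason;
  }

  // Cooperative check: blocking loops call this at a stable cadence.
  // Cancellation is only observed here, which is why a loop that never
  // heartbeats appears to ignore timeouts and signals.
  heartbeat(): void {
    if (this.reason === null && this.deadline !== null && Date.now() >= this.deadline) {
      this.reason = "Timeout";
    }
    if (this.reason !== null) throw new Error(`Aborted: ${this.reason}`);
  }
}
```

A scanning loop would then call `ct.heartbeat()` once per entry, matching the `glob`/`fuzzy_find` pattern in the table above.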
  "natives-shell-pty-process.md": "# Natives Shell, PTY, Process, and Key Internals\n\nThis document covers the **execution/process/terminal primitives** in `@oh-my-pi/pi-natives`: `shell`, `pty`, `ps`, and `keys`, using the architecture terms from `docs/natives-architecture.md`.\n\n## Implementation files\n\n- `crates/pi-natives/src/shell.rs`\n- `crates/pi-natives/src/shell/windows.rs` (Windows only)\n- `crates/pi-natives/src/pty.rs`\n- `crates/pi-natives/src/ps.rs`\n- `crates/pi-natives/src/keys.rs`\n- `crates/pi-natives/src/task.rs` (shared cancellation behavior used by shell/pty)\n- `packages/natives/src/shell/index.ts`\n- `packages/natives/src/shell/types.ts`\n- `packages/natives/src/pty/index.ts`\n- `packages/natives/src/pty/types.ts`\n- `packages/natives/src/ps/index.ts`\n- `packages/natives/src/ps/types.ts`\n- `packages/natives/src/keys/index.ts`\n- `packages/natives/src/keys/types.ts`\n- `packages/natives/src/bindings.ts`\n\n## Layer ownership\n\n- **TS wrapper/API layer** (`packages/natives/src/*`): typed entrypoints, cancellation surface (`timeoutMs`, `AbortSignal`), and JS ergonomics.\n- **Rust N-API module layer** (`crates/pi-natives/src/*`): shell/PTY process execution, process-tree traversal/termination, and key-sequence parsing.\n- **Validation gate** (`native.ts`, architecture-level): ensures required exports (`Shell`, `executeShell`, `PtySession`, `killTree`, `listDescendants`, key helpers) exist before wrappers are used.\n\n## Shell subsystem (`shell`)\n\n### API model\n\nTwo execution modes are exposed:\n\n1. **One-shot** via `executeShell(options, onChunk?)`.\n2. 
**Persistent session** via `new Shell(options?)` then `shell.run(...)` repeatedly.\n\nBoth stream output through a threadsafe callback and return `{ exitCode?, cancelled, timedOut }`.\n\n### Session creation and environment model\n\nRust creates `brush_core::Shell` with:\n\n- non-interactive mode,\n- `do_not_inherit_env: true`,\n- explicit environment reconstruction from host env,\n- skip-list for shell-sensitive vars (`PS1`, `PWD`, `SHLVL`, bash function exports, etc.).\n\nSession env behavior:\n\n- `ShellOptions.sessionEnv` is applied once at session creation.\n- `ShellRunOptions.env` is command-scoped (`EnvironmentScope::Command`) and popped after each run.\n- `PATH` is merged specially on Windows with case-insensitive dedupe.\n\nWindows-only path enrichment (`shell/windows.rs`): discovered Git-for-Windows paths (`cmd`, `bin`, `usr/bin`) are appended if present and not already included.\n\n### Runtime lifecycle and state transitions\n\nPersistent shell (`Shell.run`) uses this state machine:\n\n- **Idle/Uninitialized**: `session: None`.\n- **Running**: first `run()` lazily creates session, stores `current_abort` token, executes command.\n- **Completed + keepalive**: if execution control flow is `Normal`, `current_abort` is cleared and session is reused.\n- **Completed + teardown**: if control flow is loop/script/shell-exit related (`BreakLoop`, `ContinueLoop`, `ReturnFromFunctionOrScript`, `ExitShell`), session is dropped (`session: None`).\n- **Cancelled/Timed out**: run task is cancelled, grace wait (2s), then force-abort; session is dropped.\n- **Error**: session is dropped.\n\nOne-shot shell (`executeShell`) always creates and drops a fresh session per call.\n\n### Streaming/output behavior\n\n- Stdout/stderr are routed into a shared pipe and read concurrently.\n- Reader decodes UTF-8 incrementally; invalid byte sequences emit `U+FFFD` replacement chunks.\n- After process completion, output drain has idle/max guards (`250ms` idle, `2s` max) to avoid hanging 
on background jobs keeping descriptors open.\n\n### Cancellation, timeout, and background jobs\n\n- `CancelToken` is constructed from `timeoutMs` and optional `AbortSignal`.\n- On cancellation/timeout, shell cancellation token is triggered, then task gets a 2s graceful window before forced abort.\n- If cancellation occurs, background jobs are terminated (`TERM`, then delayed `KILL`) using brush job metadata.\n\n`Shell.abort()` behavior:\n\n- aborts only current running command for that `Shell` instance,\n- no-op success when nothing is running.\n\n### Failure behavior\n\nCommon surfaced errors include:\n\n- session init failures (`Failed to initialize shell`),\n- cwd errors (`Failed to set cwd`),\n- env set/pop failures,\n- snapshot source failures,\n- pipe creation/clone failures,\n- execution failure (`Shell execution failed: ...`),\n- task wrapper failures (`Shell execution task failed: ...`).\n\nResult-level cancellation flags:\n\n- timeout -> `exitCode: undefined`, `timedOut: true`.\n- abort signal -> `exitCode: undefined`, `cancelled: true`.\n\n## PTY subsystem (`pty`)\n\n### API model\n\n`new PtySession()` exposes:\n\n- `start(options, onChunk?) 
-> Promise<{ exitCode?, cancelled, timedOut }>`\n- `write(data)`\n- `resize(cols, rows)`\n- `kill()`\n\n### Runtime lifecycle and state transitions\n\n`PtySession` state machine:\n\n- **Idle**: `core: None`.\n- **Reserved**: `start()` installs control channel synchronously (`core: Some`) before async work begins, so `write/resize/kill` become immediately valid.\n- **Running**: blocking PTY loop handles child state, reader events, cancellation heartbeat, and control messages.\n- **Terminal closed**: child exit + reader completion.\n- **Finalized**: `core` is always reset to `None` after start task completion (success or error).\n\nConcurrency guard:\n\n- starting while already running returns `PTY session already running`.\n\n### Spawn/attach/write/read/terminate patterns\n\n- PTY opened via `portable_pty::native_pty_system().openpty(...)`.\n- Command currently runs as `sh -lc <command>` with optional `cwd` and env overrides.\n- `write()` sends raw bytes to PTY stdin.\n- `resize()` clamps dimensions (`cols 20..400`, `rows 5..200`) and calls master resize.\n- `kill()` marks run as cancelled and kills child process.\n\nOutput path:\n\n- dedicated reader thread reads master stream,\n- incremental UTF-8 decode with `U+FFFD` replacement on invalid bytes,\n- chunks forwarded through N-API threadsafe callback.\n\n### Cancellation and timeout semantics\n\n- `timeoutMs` and `AbortSignal` feed a `CancelToken`.\n- loop calls `ct.heartbeat()` periodically; abort triggers child kill.\n- timeout classification is string-based (`\"Timeout\"` substring in heartbeat error).\n\n### Failure behavior\n\nError surfaces include:\n\n- PTY allocation/open failure,\n- PTY spawn failure,\n- writer/reader acquisition failure,\n- child status/wait failures,\n- lock poisoning,\n- control-channel disconnection (`PTY session is no longer available`).\n\nControl call failures when not running:\n\n- `write/resize/kill` return `PTY session is not running`.\n\n## Process-tree subsystem (`ps`)\n\n### 
API model\n\n- `killTree(pid, signal) -> number`\n- `listDescendants(pid) -> number[]`\n\nTS wrapper also registers native kill-tree integration into shared utils via `setNativeKillTree(native.killTree)`.\n\n### Platform-specific implementation\n\n- **Linux**: recursively reads `/proc/<pid>/task/<pid>/children`.\n- **macOS**: uses `libproc` `proc_listchildpids`.\n- **Windows**: snapshots process table with `CreateToolhelp32Snapshot`, builds parent->children map, terminates with `OpenProcess(PROCESS_TERMINATE)` + `TerminateProcess`.\n\n### Kill-tree behavior\n\n- Descendants are collected recursively.\n- Kill order is bottom-up (deepest descendants first) to reduce orphan re-parenting.\n- Root pid is killed last.\n- Return value is count of successful terminations.\n\nSignal behavior:\n\n- POSIX: provided `signal` is passed to `kill`.\n- Windows: `signal` is ignored; termination is unconditional process terminate.\n\n### Failure behavior\n\nThis module is intentionally non-throwing at API surface:\n\n- missing/inaccessible process tree branches are skipped,\n- per-pid kill failures are counted as unsuccessful (not errors),\n- lookup miss typically yields `[]` from `listDescendants` and `0` from `killTree`.\n\n## Key parsing subsystem (`keys`)\n\n### API model\n\nExposed helpers:\n\n- `parseKey(data, kittyProtocolActive)`\n- `matchesKey(data, keyId, kittyProtocolActive)`\n- `parseKittySequence(data)`\n- `matchesKittySequence(data, expectedCodepoint, expectedModifier)`\n- `matchesLegacySequence(data, keyName)`\n\n### Parsing model\n\nThe parser combines:\n\n- direct single-byte mappings (`enter`, `tab`, `ctrl+<letter>`, printable ASCII),\n- O(1) legacy escape-sequence lookup (PHF map),\n- xterm `modifyOtherKeys` parsing,\n- Kitty protocol parsing (`CSI u`, `CSI ~`, `CSI 1;...<letter>`),\n- normalization to key IDs (`ctrl+c`, `shift+tab`, `pageUp`, `f5`, etc.).\n\nModifier handling:\n\n- only shift/alt/ctrl bits are compared for key matching,\n- lock bits are masked 
out before comparisons.\n\nLayout behavior:\n\n- base-layout fallback is intentionally constrained so remapped layouts do not create false matches for ASCII letters/symbols.\n\n### Failure behavior\n\n- Unrecognized or invalid sequences produce `null` from parse functions.\n- Match functions return `false` on parse failure or mismatch.\n- No thrown error surface for malformed key input.\n\n## JS wrapper API ↔ Rust export mapping\n\n### Shell + PTY + Process\n\n| TS wrapper API | Rust N-API export | Notes |\n|---|---|---|\n| `executeShell(options, onChunk?)` | `executeShell` (`execute_shell`) | One-shot shell execution |\n| `new Shell(options?)` | `Shell` class | Persistent shell session |\n| `shell.run(options, onChunk?)` | `Shell::run` | Reuses session on keepalive control flow |\n| `shell.abort()` | `Shell::abort` | Aborts active run for that shell instance |\n| `new PtySession()` | `PtySession` class | Stateful PTY session |\n| `pty.start(options, onChunk?)` | `PtySession::start` | Interactive PTY run |\n| `pty.write(data)` | `PtySession::write` | Raw stdin passthrough |\n| `pty.resize(cols, rows)` | `PtySession::resize` | Clamped terminal dimensions |\n| `pty.kill()` | `PtySession::kill` | Force-kills active PTY child |\n| `killTree(pid, signal)` | `killTree` (`kill_tree`) | Children-first process tree termination |\n| `listDescendants(pid)` | `listDescendants` (`list_descendants`) | Recursive descendants listing |\n\n### Keys\n\n| TS wrapper API | Rust N-API export | Notes |\n|---|---|---|\n| `matchesKittySequence(data, cp, mod)` | `matchesKittySequence` (`matches_kitty_sequence`) | Kitty codepoint+modifier match |\n| `parseKey(data, kittyProtocolActive)` | `parseKey` (`parse_key`) | Normalized key-id parser |\n| `matchesLegacySequence(data, keyName)` | `matchesLegacySequence` (`matches_legacy_sequence`) | Exact legacy sequence map check |\n| `parseKittySequence(data)` | `parseKittySequence` (`parse_kitty_sequence`) | Structured Kitty parse result |\n| 
`matchesKey(data, keyId, kittyProtocolActive)` | `matchesKey` (`matches_key`) | High-level key matcher |\n\n## Abandoned session cleanup and finalization notes\n\n- **Shell persistent session**: if a run is cancelled/timed out/errors/non-keepalive control flow, Rust explicitly drops the internal session state. Successful normal runs keep the session for reuse.\n- **PTY session**: `core` is always cleared after `start()` finishes, including failure paths.\n- **No explicit JS finalizer-driven kill contract** is exposed by wrappers; cleanup is primarily tied to run completion/cancellation paths. Callers should use `timeoutMs`, `AbortSignal`, `shell.abort()`, or `pty.kill()` for deterministic teardown.\n",
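The incremental UTF-8 decode with `U+FFFD` replacement that both the shell and PTY readers perform can be illustrated with the WHATWG `TextDecoder` streaming mode (Node 18+); the real reader is Rust-side, but the observable behavior is equivalent in spirit:

```typescript
// Sketch of the incremental decode behavior described for the shell/PTY readers.
// fatal: false substitutes U+FFFD for invalid byte sequences instead of throwing.
const decoder = new TextDecoder("utf-8", { fatal: false });

function decodeChunk(bytes: Uint8Array): string {
  // stream: true keeps an incomplete trailing multibyte sequence buffered,
  // so a code point split across two chunks is emitted once it completes.
  return decoder.decode(bytes, { stream: true });
}
```

This is why a multibyte character split across read boundaries surfaces intact in the chunk callback, while genuinely invalid bytes surface as replacement characters.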
@@ -35,7 +35,7 @@ export const EMBEDDED_DOCS: Readonly<Record<string, string>> = {
  "notebook-tool-runtime.md": "# Notebook tool runtime internals\n\nThis document describes the current `notebook` tool implementation and its relationship to the kernel-backed Python runtime.\n\nThe critical distinction: **`notebook` is a JSON notebook editor, not a notebook executor**. It edits `.ipynb` cell sources directly; it does not start or talk to a Python kernel.\n\n## Implementation files\n\n- [`src/tools/notebook.ts`](../packages/coding-agent/src/tools/notebook.ts)\n- [`src/ipy/executor.ts`](../packages/coding-agent/src/ipy/executor.ts)\n- [`src/ipy/kernel.ts`](../packages/coding-agent/src/ipy/kernel.ts)\n- [`src/session/streaming-output.ts`](../packages/coding-agent/src/session/streaming-output.ts)\n- [`src/tools/python.ts`](../packages/coding-agent/src/tools/python.ts)\n\n## 1) Runtime boundary: editing vs executing\n\n## `notebook` tool (`src/tools/notebook.ts`)\n\n- Supports `action: edit | insert | delete` on a `.ipynb` file.\n- Resolves path relative to session CWD (`resolveToCwd`).\n- Loads notebook JSON, validates `cells` array, validates `cell_index` bounds.\n- Applies source edits in-memory and writes full notebook JSON back with `JSON.stringify(notebook, null, 1)`.\n- Returns textual summary + structured `details` (`action`, `cellIndex`, `cellType`, `totalCells`, `cellSource`).\n\nNo kernel lifecycle exists in this tool:\n\n- no gateway acquisition\n- no kernel session ID\n- no `execute_request`\n- no stream chunks from kernel channels\n- no rich display capture (`image/png`, JSON display, status MIME)\n\n## Notebook-like execution path (`src/tools/python.ts` + `src/ipy/*`)\n\nWhen the agent needs to run cell-style Python code (sequential cells, persistent state, rich displays), that goes through the **`python` tool**, not `notebook`.\n\nThat path is where kernel modes, restart/cancel behavior, chunk streaming, and output artifact truncation live.\n\n## 2) Notebook cell handling semantics (`notebook` tool)\n\n## Source 
normalization\n\n`content` is split into `source: string[]` with newline preservation:\n\n- each non-final line keeps trailing `\\n`\n- final line has no forced trailing newline\n\nThis mirrors notebook JSON conventions and avoids accidental line concatenation on later edits.\n\n## Action behavior\n\n- `edit`\n - replaces `cells[cell_index].source`\n - preserves existing `cell_type`\n- `insert`\n - inserts at `[0..cellCount]`\n - `cell_type` defaults to `code`\n - code cells initialize `execution_count: null` and `outputs: []`\n - markdown cells initialize only `metadata` + `source`\n- `delete`\n - removes `cells[cell_index]`\n - returns removed `source` in details for renderer preview\n\n## Error surfaces\n\nHard failures are thrown for:\n\n- missing notebook file\n- invalid JSON\n- missing/non-array `cells`\n- out-of-range index (insert and non-insert have different valid ranges)\n- missing `content` for `edit`/`insert`\n\nThese become `Error:` tool responses upstream; renderer uses notebook path + formatted error text.\n\n## 3) Kernel session semantics (where they actually exist)\n\nKernel semantics are implemented in `executePython` / `PythonKernel` and apply to the `python` tool.\n\n## Modes\n\n`PythonKernelMode`:\n\n- `session` (default)\n - kernels cached in `kernelSessions` map\n - max 4 sessions; oldest evicted on overflow\n - idle/dead cleanup every 30s, timeout after 5 minutes\n - per-session queue serializes execution (`session.queue`)\n- `per-call`\n - creates kernel for request\n - executes\n - always shuts down kernel in `finally`\n\n## Reset behavior\n\n`python` tool passes `reset` only for the first cell in a multi-cell call; later cells always run with `reset: false`.\n\n## Kernel death / restart / retry\n\nIn session mode (`withKernelSession`):\n\n- dead kernel detected by heartbeat (`kernel.isAlive()` check every 5s) or execute failure.\n- pre-run dead state triggers `restartKernelSession`.\n- execute-time crash path retries once: restart 
kernel, rerun handler.\n- `restartCount > 1` in same session throws `Python kernel restarted too many times in this session`.\n\nStartup retry behavior:\n\n- shared gateway kernel creation retries once on `SharedGatewayCreateError` with HTTP 5xx.\n\nResource exhaustion recovery:\n\n- detects `EMFILE`/`ENFILE`/\"Too many open files\" style failures\n- clears tracked sessions\n- calls `shutdownSharedGateway()`\n- retries kernel session creation once\n\n## 4) Environment/session variable injection\n\nKernel startup receives optional env map from executor:\n\n- `PI_SESSION_FILE` (session state file path)\n- `ARTIFACTS` (artifact directory)\n\n`PythonKernel.#initializeKernelEnvironment(...)` then runs init script inside kernel to:\n\n- `os.chdir(cwd)`\n- inject env entries into `os.environ`\n- prepend cwd to `sys.path` if missing\n\nImplication:\n\n- prelude helpers that read session or artifact context rely on these env vars in Python process state.\n\n## 5) Streaming/chunk and display handling (kernel-backed path)\n\nThe kernel client processes Jupyter protocol messages per execution:\n\n- `stream` -> text chunk to `onChunk`\n- `execute_result` / `display_data` ->\n - display text chosen by MIME precedence: `text/markdown` > `text/plain` > converted `text/html`\n - structured outputs captured separately:\n - `application/json` -> `{ type: \"json\" }`\n - `image/png` -> `{ type: \"image\" }`\n - `application/x-omp-status` -> `{ type: \"status\" }` (no text emission)\n- `error` -> traceback text pushed to chunk stream + structured error metadata\n- `input_request` -> emits stdin warning text, sends empty `input_reply`, marks stdin requested\n- completion waits for both `execute_reply` and kernel `status=idle`\n\nCancellation/timeout:\n\n- abort signal triggers `interrupt()` (REST `/interrupt` + control-channel `interrupt_request`)\n- result marks `cancelled=true`\n- timeout path annotates output with `Command timed out after <n> seconds`\n\n## 6) Truncation and artifact 
behavior\n\n`OutputSink` in `src/session/streaming-output.ts` is used by kernel execution paths (`executeWithKernel`):\n\n- sanitizes every chunk (`sanitizeText`)\n- tracks total/output lines and bytes\n- optional artifact spill file (`artifactPath`, `artifactId`)\n- when in-memory buffer exceeds threshold (`DEFAULT_MAX_BYTES` unless overridden):\n - marks truncated\n - keeps tail bytes in memory (UTF-8 safe boundary)\n - can spill full stream to artifact sink\n\n`dump()` returns:\n\n- visible output text (possibly tail-truncated)\n- truncation flag + counts\n- artifact ID (for `artifact://<id>` references)\n\n`python` tool converts this metadata into result truncation notices and TUI warnings.\n\n`notebook` tool does **not** use `OutputSink`; it has no stream/artifact truncation pipeline because it does not execute code.\n\n## 7) Renderer assumptions and formatting\n\n## Notebook renderer (`notebookToolRenderer`)\n\n- call view: status line with action + notebook path + cell/type metadata\n- result view:\n - success summary derived from `details`\n - `cellSource` rendered via `renderCodeCell`\n - markdown cells set language hint `markdown`; other cells have no explicit language override\n - collapsed code preview limit is `PREVIEW_LIMITS.COLLAPSED_LINES * 2`\n - supports expanded mode via shared render options\n - uses render cache keyed by width + expanded state\n\nError rendering assumption:\n\n- if first text content starts with `Error:`, renderer formats as notebook error block.\n\n## Python renderer (for actual execution output)\n\nKernel-backed execution rendering expects:\n\n- per-cell status transitions (`pending/running/complete/error`)\n- optional structured status event section\n- optional JSON output trees\n- truncation warnings + optional `artifact://<id>` pointer\n\nThis renderer behavior is unrelated to `notebook` JSON editing results except that both reuse shared TUI primitives.\n\n## 8) Divergence from plain Python tool behavior\n\nIf \"plain 
Python tool\" means `python` execution path:\n\n- `python` executes code in a kernel, persists state by mode, streams chunks, captures rich displays, handles interrupts/timeouts, and supports output truncation/artifacts.\n- `notebook` performs deterministic notebook JSON mutations only; no execution, no kernel state, no chunk stream, no display outputs, no artifact pipeline.\n\nIf a workflow needs both:\n\n1. edit notebook source with `notebook`\n2. execute code cells via `python` (manually passing code), not through `notebook`\n\nCurrent implementation does not provide a single tool that both mutates `.ipynb` and executes notebook cells through kernel context.\n",
  "plugin-manager-installer-plumbing.md": "# Plugin manager and installer plumbing\n\nThis document describes how `omp plugin` operations mutate plugin state on disk and how installed plugins become runtime capabilities (tools today, hooks/commands path resolution available).\n\n## Scope and architecture\n\nThere are two plugin-management implementations in the codebase:\n\n1. **Active path used by CLI commands**: `PluginManager` (`src/extensibility/plugins/manager.ts`)\n2. **Legacy helper module**: installer functions (`src/extensibility/plugins/installer.ts`)\n\n`omp plugin ...` command execution goes through `PluginManager`.\n\n`installer.ts` still documents important safety checks and filesystem behavior, but it is not the path used by `src/commands/plugin.ts` + `src/cli/plugin-cli.ts`.\n\n## Lifecycle: from CLI invocation to runtime availability\n\n```text\nomp plugin <action> ...\n -> src/commands/plugin.ts\n -> runPluginCommand(...) in src/cli/plugin-cli.ts\n -> PluginManager method (install/list/uninstall/link/...) 
\n -> mutate ~/.omp/plugins/{package.json,node_modules,omp-plugins.lock.json}\n -> runtime discovery: discoverAndLoadCustomTools(...)\n -> getAllPluginToolPaths(cwd)\n -> custom tool loader imports tool modules\n```\n\n### Command entrypoints\n\n- `src/commands/plugin.ts` defines command/flags and forwards to `runPluginCommand`.\n- `src/cli/plugin-cli.ts` maps subcommands to `PluginManager` methods:\n - `install`, `uninstall`, `list`, `link`, `doctor`, `features`, `config`, `enable`, `disable`\n- No explicit `update` action exists; update is done by re-running `install` with a new package/version spec.\n\n## On-disk model\n\nGlobal plugin state lives under `~/.omp/plugins`:\n\n- `package.json` — dependency manifest used by `bun install`/`bun uninstall`\n- `node_modules/` — installed plugin packages or symlinks\n- `omp-plugins.lock.json` — runtime state:\n - enabled/disabled per plugin\n - selected feature set per plugin\n - persisted plugin settings\n\nProject-local overrides live at:\n\n- `<cwd>/.omp/plugin-overrides.json`\n\nOverrides are read-only from manager/loader perspective (no write path here) and can disable plugins or override features/settings for this project.\n\n## Plugin spec parsing and metadata interpretation\n\n## Install spec grammar\n\n`parsePluginSpec` (`parser.ts`) supports:\n\n- `pkg` -> `features: null` (defaults behavior)\n- `pkg[*]` -> enable all manifest features\n- `pkg[]` -> enable no optional features\n- `pkg[a,b]` -> enable named features\n- `@scope/pkg@1.2.3[feat]` -> scoped + versioned package with explicit feature selection\n\n`extractPackageName` strips version suffix for on-disk path lookup after install.\n\n## Manifest source and required fields\n\nManifest is resolved as:\n\n1. `package.json.omp`\n2. fallback `package.json.pi`\n3. 
fallback `{ version: package.version }`\n\nImplications:\n\n- There is no strict schema validation in manager/loader.\n- A package missing `omp`/`pi` is still installable and listable.\n- Runtime plugin loading (`getEnabledPlugins`) skips packages without `omp`/`pi` manifest.\n- `manifest.version` is always overwritten from package `version`.\n\nMalformed `package.json` JSON is a hard failure at read time; malformed manifest shape may fail later only when specific fields are consumed.\n\n## Install/update flow (`PluginManager.install`)\n\n1. Parse feature bracket syntax from install spec.\n2. Validate package name against regex + shell-metacharacter denylist.\n3. Ensure plugin `package.json` exists (`omp-plugins`, private dependencies map).\n4. Run `bun install <packageSpec>` in `~/.omp/plugins`.\n5. Read installed package `node_modules/<name>/package.json`.\n6. Resolve manifest and compute `enabledFeatures`:\n - `[*]`: all declared features (or `null` if no feature map)\n - `[a,b]`: validates each feature exists in manifest features map\n - `[]`: empty feature list\n - bare spec: `null` (use defaults policy later in loader)\n7. Upsert lockfile runtime state: `{ version, enabledFeatures, enabled: true }`.\n\n### Update semantics\n\nBecause update is install-driven:\n\n- `omp plugin install pkg@newVersion` updates dependency and lockfile version.\n- Existing settings are preserved; state entry is overwritten for version/features/enabled.\n- No separate “check updates” or transactional migration logic exists.\n\n## Remove flow (`PluginManager.uninstall`)\n\n1. Validate package name.\n2. Run `bun uninstall <name>` in plugin dir.\n3. Remove plugin runtime state from lockfile:\n - `config.plugins[name]`\n - `config.settings[name]`\n\nIf uninstall command fails, runtime state is not changed.\n\n## List flow (`PluginManager.list`)\n\n1. Read plugin dependency map from `~/.omp/plugins/package.json`.\n2. Load lockfile runtime config (missing file -> empty defaults).\n3. 
Load project overrides (`<cwd>/.omp/plugin-overrides.json`, parse/read errors -> empty object with warning).\n4. For each dependency with a resolvable package.json:\n - build `InstalledPlugin` record\n - merge feature/enable state:\n - base from lockfile (or defaults)\n - project overrides can replace feature selection\n - project `disabled` list masks plugin as disabled\n\nThis is the effective state used by CLI status output and settings/features operations.\n\n## Link flow (`PluginManager.link`)\n\n`link` supports local plugin development by symlinking a local package into `~/.omp/plugins/node_modules/<pkg.name>`.\n\nBehavior:\n\n1. Resolve `localPath` against manager cwd.\n2. Require local `package.json` and `name` field.\n3. Ensure plugin dirs exist.\n4. For scoped names, create scope directory.\n5. Remove existing path at target link location.\n6. Create symlink.\n7. Add runtime lockfile entry enabled with default features (`null`).\n\nCaveat: current `PluginManager.link` does not enforce the `cwd` path-boundary check present in legacy `installer.ts` (`normalizedPath.startsWith(normalizedCwd)`), so trust is the caller’s responsibility.\n\n## Runtime loading: from installed plugin to callable capabilities\n\n### Discovery gate\n\n`getEnabledPlugins(cwd)` (`plugins/loader.ts`) reads:\n\n- plugin dependency manifest (`package.json`)\n- lockfile runtime state\n- project overrides via `getConfigDirPaths(\"plugin-overrides.json\", { user: false, cwd })`\n\nFiltering:\n\n- skip if no plugin package.json\n- skip if manifest (`omp`/`pi`) absent\n- skip if globally disabled in lockfile\n- skip if project-disabled\n\n### Capability path resolution\n\nFor each enabled plugin:\n\n- `resolvePluginToolPaths(plugin)`\n- `resolvePluginHookPaths(plugin)`\n- `resolvePluginCommandPaths(plugin)`\n\nEach resolver includes base entries plus feature entries:\n\n- explicit feature list -> only selected features\n- `enabledFeatures === null` -> enable features marked `default: 
true`\n\nMissing files are silently skipped (`existsSync` guard).\n\n## Current runtime wiring differences\n\n- **Tools are wired into runtime today** via `discoverAndLoadCustomTools` (`custom-tools/loader.ts`), which calls `getAllPluginToolPaths(cwd)`.\n- Paths are de-duplicated by resolved absolute path in custom tool discovery (`seen` set, first path wins).\n- **Hooks/commands resolvers exist** and are exported, but this code path does not currently wire them into a runtime registry in the same way tools are wired.\n\n## Lock/state management details\n\n`PluginManager` caches runtime config in memory per instance (`#runtimeConfig`) and lazily loads once.\n\nLoad behavior:\n\n- lockfile missing -> `{ plugins: {}, settings: {} }`\n- lockfile read/parse failure -> warning + same empty defaults\n\nSave behavior:\n\n- writes full lockfile JSON pretty-printed each mutation\n\nNo cross-process locking or merge strategy exists; concurrent writers can overwrite each other.\n\n## Safety checks and trust boundaries\n\n### Input/package validation\n\nActive manager path enforces package-name validation:\n\n- regex for scoped/unscoped package specs (optionally with version)\n- explicit shell metacharacter denylist (``[;&|`$(){}[]<>\\\\]``)\n\nThis limits command-injection risk when invoking `bun install/uninstall`.\n\n### Filesystem trust boundary\n\n- Plugin code executes in-process when custom tool modules are imported; no sandboxing.\n- Manifest relative paths are joined against plugin package directory and only existence-checked.\n- The plugin package itself is trusted code once installed.\n\n### Legacy installer-only checks\n\n`installer.ts` includes additional link-time checks not mirrored in `PluginManager.link`:\n\n- local path must resolve inside project cwd\n- extra package name/path traversal guards for symlink target naming\n\nBecause CLI uses `PluginManager`, these stricter link guards are not currently on the main path.\n\n## Failure, partial success, and rollback 
behavior\n\nThe plugin manager is not transactional.\n\n| Operation stage | Failure behavior | Rollback |\n| --- | --- | --- |\n| `bun install` fails | install aborts with stderr | N/A (no state writes yet) |\n| Install succeeds, then manifest/feature validation fails | command fails | No uninstall rollback; dependency may remain in `node_modules`/`package.json` |\n| Install succeeds, then lockfile write fails | command fails | No rollback of installed package |\n| `bun uninstall` succeeds, lockfile write fails | command fails | Package removed, stale runtime state may remain |\n| `link` removes old target then symlink creation fails | command fails | No restoration of previous link/dir |\n\nOperationally, `doctor --fix` can repair some drift (`bun install`, orphaned config cleanup, invalid-feature cleanup), but it is best-effort.\n\n## Malformed/missing manifest behavior summary\n\n- Missing `omp`/`pi` field:\n - install/list: tolerated (minimal manifest)\n - runtime enabled-plugin discovery: skipped as non-plugin\n- Missing feature referenced by install spec or `features --set/--enable`: hard error with available feature list\n- Invalid `plugin-overrides.json`: ignored with fallback to `{}` in both manager and loader paths\n- Missing tool/hook/command file paths referenced by manifest: silently ignored during resolver expansion; flagged as errors only by `doctor`\n\n## Mode differences and precedence\n\n- `--dry-run` (install): returns synthetic install result, no filesystem/network/state writes.\n- `--json`: output formatting only, no behavior change.\n- Project overrides always take precedence over global lockfile for feature/settings view.\n- Effective enablement is `runtimeEnabled && !projectDisabled`.\n\n## Implementation files\n\n- [`src/commands/plugin.ts`](../packages/coding-agent/src/commands/plugin.ts) — CLI command declaration and flag mapping\n- [`src/cli/plugin-cli.ts`](../packages/coding-agent/src/cli/plugin-cli.ts) — action dispatch, user-facing 
command handlers\n- [`src/extensibility/plugins/manager.ts`](../packages/coding-agent/src/extensibility/plugins/manager.ts) — active install/remove/list/link/state/doctor implementation\n- [`src/extensibility/plugins/installer.ts`](../packages/coding-agent/src/extensibility/plugins/installer.ts) — legacy installer helpers and additional link safety checks\n- [`src/extensibility/plugins/loader.ts`](../packages/coding-agent/src/extensibility/plugins/loader.ts) — enabled-plugin discovery and tool/hook/command path resolution\n- [`src/extensibility/plugins/parser.ts`](../packages/coding-agent/src/extensibility/plugins/parser.ts) — install spec and package-name parsing helpers\n- [`src/extensibility/plugins/types.ts`](../packages/coding-agent/src/extensibility/plugins/types.ts) — manifest/runtime/override type contracts\n- [`src/extensibility/custom-tools/loader.ts`](../packages/coding-agent/src/extensibility/custom-tools/loader.ts) — runtime wiring for plugin-provided tool modules\n",
  "porting-from-pi-mono.md": "# Porting From pi-mono: A Practical Merge Guide\n\nThis guide is a repeatable checklist for porting changes from pi-mono into this repo.\nUse it for any merge: single file, feature branch, or full release sync.\n\n## Last Sync Point\n\n**Commit:** `b21b42d032919de2f2e6920a76fa9a37c3920c0a`\n**Date:** 2026-03-22\n\nUpdate this section after each sync; do not reuse the previous range.\n\nWhen starting a new sync, generate patches from this commit forward:\n\n```bash\ngit format-patch b21b42d032919de2f2e6920a76fa9a37c3920c0a..HEAD --stdout > changes.patch\n```\n\n## 0) Define the scope\n\n- Identify the upstream reference (commit, tag, or PR).\n- List the packages or folders you plan to touch.\n- Decide which features are in-scope and which are intentionally skipped.\n\n## 1) Bring code over safely\n\n- Prefer a clean, focused diff rather than a wholesale copy.\n- Avoid copying built artifacts or generated files.\n- If upstream added new files, add them explicitly and review contents.\n\n## 2) Match import extension conventions\n\nMost runtime TypeScript sources omit `.js` in internal imports, but some test/bench entrypoints keep `.js` for ESM\nruntime compatibility. Follow the local package’s existing style; do not blanket-strip extensions.\n\n- In `packages/coding-agent` runtime sources, keep internal imports extensionless unless importing non-TS assets.\n- In `packages/tui/test` and `packages/natives/bench`, keep `.js` where surrounding files already use it.\n- Keep real file extensions when required by tooling (e.g., `.json`, `.css`, `.md` text embeds).\n- Example: `import { x } from \"./foo.js\";` → `import { x } from \"./foo\";` (only when the package convention is extensionless).\n\n## 3) Replace import scopes\n\nUpstream uses different package scopes. 
Replace them consistently.\n\n- Replace old scopes with the local scope used here.\n- Examples (adjust to match the actual packages you are porting):\n - `@mariozechner/pi-coding-agent` → `@oh-my-pi/pi-coding-agent`\n - `@mariozechner/pi-agent-core` → `@oh-my-pi/pi-agent-core`\n - `@mariozechner/pi-tui` → `@oh-my-pi/pi-tui`\n - `@mariozechner/pi-ai` → `@oh-my-pi/pi-ai`\n\n## 4) Use Bun APIs where they improve on Node\n\nWe run on Bun. Replace Node APIs only when Bun provides a better alternative.\n\n**DO replace:**\n\n- Process spawning: `child_process.spawn` → Bun Shell `$` for simple commands, `Bun.spawn`/`Bun.spawnSync` for streaming or long-running work\n- File I/O: `fs.readFileSync` → `Bun.file().text()` / `Bun.write()`\n- HTTP clients: `node-fetch`, `axios` → native `fetch`\n- Crypto hashing: `node:crypto` → Web Crypto or `Bun.hash`\n- SQLite: `better-sqlite3` → `bun:sqlite`\n- Env loading: `dotenv` → Bun loads `.env` automatically\n\n**DO NOT replace (these work fine in Bun):**\n\n- `os.homedir()` — do NOT replace with `Bun.env.HOME` or literal `\"~\"`\n- `os.tmpdir()` — do NOT replace with `Bun.env.TMPDIR || \"/tmp\"` or hardcoded paths\n- `fs.mkdtempSync()` — do NOT replace with manual path construction\n- `path.join()`, `path.resolve()`, etc. 
— these are fine\n\n**Import style:** Use the `node:` prefix with namespace imports only (no named imports from `node:fs` or `node:path`).\n\n**Additional Bun conventions:**\n\n- Prefer Bun Shell `$` for short, non-streaming commands; use `Bun.spawn` only when you need streaming I/O or process control.\n- Use `Bun.file()`/`Bun.write()` for files and `node:fs/promises` for directories.\n- Avoid `Bun.file().exists()` checks; use `isEnoent` handling in try/catch.\n- Prefer `Bun.sleep(ms)` over `setTimeout` wrappers.\n\n**Wrong:**\n\n```typescript\n// BROKEN: env vars may be undefined, \"~\" is not expanded\nconst home = Bun.env.HOME || \"~\";\nconst tmp = Bun.env.TMPDIR || \"/tmp\";\n```\n\n**Correct:**\n\n```typescript\nimport * as os from \"node:os\";\nimport * as fs from \"node:fs\";\nimport * as path from \"node:path\";\n\nconst configDir = path.join(os.homedir(), \".config\", \"myapp\");\nconst tempDir = fs.mkdtempSync(path.join(os.tmpdir(), \"myapp-\"));\n```\n\n## 5) Prefer Bun embeds (no copying)\n\nDo not copy runtime assets or vendor files at build time.\n\n- If upstream copies assets into a dist folder, replace with Bun-friendly embeds.\n- Prompts are static `.md` files; use Bun text imports (`with { type: \"text\" }`) and Handlebars instead of inline prompt strings.\n- Use `import.meta.dir` + `Bun.file` to load adjacent non-text resources.\n- Keep assets in-repo and let the bundler include them.\n- Eliminate copy scripts unless the user explicitly requests them.\n- If upstream reads a bundled fallback file at runtime, replace filesystem reads with a Bun text embed import.\n - Example (Codex instructions fallback):\n - `const FALLBACK_PROMPT_PATH = join(import.meta.dir, \"codex-instructions.md\");` -> removed\n - `import FALLBACK_INSTRUCTIONS from \"./codex-instructions.md\" with { type: \"text\" };`\n - Use `return FALLBACK_INSTRUCTIONS;` instead of `readFileSync(FALLBACK_PROMPT_PATH, \"utf8\")`\n\n## 6) Port `package.json` carefully\n\nTreat 
`package.json` as a contract. Merge intentionally.\n\n- Keep existing `name`, `version`, `type`, `exports`, and `bin` unless the port requires changes.\n- Replace npm/node scripts with Bun equivalents (e.g., `bun check`, `bun test`).\n- Ensure dependencies use the correct scope.\n- Do not downgrade dependencies to fix type errors; upgrade instead.\n- Validate workspace package links and `peerDependencies`.\n\n## 7) Align code style and tooling\n\n- Keep existing formatting conventions.\n- Do not introduce `any` unless required.\n- Avoid dynamic imports and inline type imports; use top-level imports only.\n- Never build prompts in code; prompts are static `.md` files rendered with Handlebars.\n- In coding-agent, never use `console.log`/`console.warn`/`console.error`; use `logger` from `@oh-my-pi/pi-utils`.\n- Use `Promise.withResolvers()` instead of `new Promise((resolve, reject) => ...)`.\n- **No `private`/`protected`/`public` keywords on class fields or methods.** Use ES `#` private fields for encapsulation; leave accessible members bare (no keyword). The only exception is constructor parameter properties (`constructor(private readonly x: T)`), where the keyword is required by TypeScript. 
When porting upstream code that uses `private foo` or `protected bar`, convert to `#foo` (private) or bare `bar` (accessible).\n- Prefer existing helpers and utilities over new ad-hoc code.\n- Preserve Bun-first infrastructure changes already made in this repo:\n - Runtime is Bun (no Node entry points).\n - Package manager is Bun (no npm lockfiles).\n - Heavy Node APIs (`child_process`, `readline`) are replaced with Bun equivalents.\n - Lightweight Node APIs (`os.homedir`, `os.tmpdir`, `fs.mkdtempSync`, `path.*`) are kept.\n - CLI shebangs use `bun` (not `node`, not `tsx`).\n - Packages use source files directly (no TypeScript build step).\n - CI workflows run Bun for install/check/test.\n\n## 8) Remove old compatibility layers\n\nUnless requested, remove upstream compatibility shims.\n\n- Delete old APIs that were replaced.\n- Update all call sites to the new API directly.\n- Do not keep `*_v2` or parallel versions.\n\n## 9) Update docs and references\n\n- Replace pi-mono repo links where appropriate.\n- Update examples to use Bun and correct package scopes.\n- Ensure README instructions still match the current repo behavior.\n\n## 10) Validate the port\n\nRun the standard checks after changes:\n\n- `bun check`\n\nIf the repo already has failing checks unrelated to your changes, call that out.\nTests use Bun's runner (not Vitest), but only run `bun test` when explicitly requested.\n\n## 11) Protect improved features (regression trap list)\n\nIf you already improved behavior locally, treat those as **non‑negotiable**. Before porting, write down\nthe improvements and add explicit checks so they don’t get lost in the merge.\n\n- **Freeze the expected behavior**: add a short “before/after” note for each improvement (inputs, outputs,\n defaults, edge cases). This prevents silent rollback.\n- **Map old → new APIs**: if upstream renamed concepts (hooks → extensions, custom tools → tools, etc.),\n ensure every old entry point still wires through. 
One missed flag or export equals lost functionality.\n- **Verify exports**: check `package.json` `exports`, public types, and barrel files. Upstream ports often\n forget to re-export local additions.\n- **Cover non‑happy paths**: if you fixed error handling, timeouts, or fallback logic, add a test or at\n least a manual checklist that exercises those paths.\n- **Check defaults and config merge order**: improvements often live in defaults. Confirm new defaults\n didn’t revert (e.g., new config precedence, disabled features, tool lists).\n- **Audit env/shell behavior**: if you fixed execution or sandboxing, verify the new path still uses your\n sanitized env and does not reintroduce alias/function overrides.\n- **Re-run targeted samples**: keep a minimal set of \"known good\" examples and run them after the port\n (CLI flags, extension registration, tool execution).\n\n## 12) Detect and handle reworked code\n\nNote: interactive mode was recently split into controllers/utils/types. When backporting related changes, port updates into the individual files we created and ensure `interactive-mode.ts` wiring stays in sync.\n\nBefore porting a file, check if upstream significantly refactored it:\n\n```bash\n# Compare the file you're about to port against what you have locally\ngit diff HEAD upstream/main -- path/to/file.ts\n```\n\nIf the diff shows the file was **reworked** (not just patched):\n\n- New abstractions, renamed concepts, merged modules, changed data flow\n\nThen you must **read the new implementation thoroughly** before porting. Blind merging of reworked code loses functionality because:\n\n1. **Defaults change silently** - A new variable `defaultFoo = [a, b]` may replace an old `getAllFoo()` that returned `[a, b, c, d, e]`.\n\n2. **API options get dropped** - When systems merge (e.g., `hooks` + `customTools` → `extensions`), old options may not wire through to the new implementation.\n\n3. 
**Code paths go stale** - A renamed concept (e.g., `hookMessage` → `custom`) needs updates in every switch statement, type guard, and handler—not just the definition.\n\n4. **Context/capabilities shrink** - Old APIs may have exposed `{ logger, typebox, pi }` that new APIs forgot to include.\n\n### Semantic porting process\n\nWhen upstream reworked a module:\n\n1. **Read the old implementation** - Understand what it did, what options it accepted, what it exposed.\n\n2. **Read the new implementation** - Understand the new abstractions and how they map to old behavior.\n\n3. **Verify feature parity** - For each capability in the old code, confirm the new code preserves it or explicitly removes it.\n\n4. **Grep for stragglers** - Search for old names/concepts that may have been missed in switch statements, handlers, UI components.\n\n5. **Test the boundaries** - CLI flags, SDK options, event handlers, default values—these are where regressions hide.\n\n### Quick checks\n\n```bash\n# Find all uses of an old concept that may need updating\nrg \"oldConceptName\" --type ts\n\n# Compare default values between versions\ngit show upstream/main:path/to/file.ts | rg \"default|DEFAULT\"\n\n# Check if all enum/union values have handlers\nrg \"case \\\"\" path/to/file.ts\n```\n\n## 13) Quick audit checklist\n\nUse this as a final pass before you finish:\n\n- [ ] Import extensions follow the local package convention (no blanket `.js` stripping)\n- [ ] No Node-only APIs in new/ported code\n- [ ] All package scopes updated\n- [ ] `package.json` scripts use Bun\n- [ ] Prompts are `.md` text imports (no inline prompt strings)\n- [ ] No `console.*` in coding-agent (use `logger`)\n- [ ] Assets load via Bun embed patterns (no copy scripts)\n- [ ] Tests or checks run (or explicitly noted as blocked)\n- [ ] No functionality regressions (see sections 11-12)\n\n## 14) Commit message format\n\nWhen committing a backport, follow the repo format `<type>(scope): <past-tense description>` and keep 
the commit\nrange in the title.\n\n```\nfix(coding-agent): backported pi-mono changes (<from>..<to>)\n\npackages/<package>:\n- <type>: <description>\n- <type>: <description> (#<issue> by @<contributor>)\n\npackages/<other-package>:\n- <type>: <description>\n```\n\n**Example:**\n\n```\nfix(coding-agent): backported pi-mono changes (9f3eef65f..52532c7c0)\n\npackages/ai:\n- fix: handle \"sensitive\" stop reason from Anthropic API\n- fix: normalize tool call IDs with special characters for Responses API\n- fix: add overflow detection for Bedrock, MiniMax, Kimi providers\n- fix: 429 status is rate limiting, not context overflow\n\npackages/tui:\n- fix: refactored autocomplete state tracking\n- fix: file autocomplete should not trigger on empty text\n- fix: configurable autocomplete max visible items\n- fix: improved table column width calculation with word-aware wrapping\n\npackages/coding-agent:\n- fix: preserve external config.yml edits on save (#1046 by @nicobailonMD)\n- fix: resolve macOS NFD and curly quote variants in file paths\n```\n\n**Rules:**\n\n- Group changes by package\n- Use conventional commit types (`fix`, `feat`, `refactor`, `perf`, `docs`)\n- Include upstream issue/PR numbers and contributor attribution for external contributions\n- The commit range in the title helps track sync points\n\n## 15) Intentional Divergences\n\nOur fork has architectural decisions that differ from upstream. 
**Do not port these upstream patterns:**\n\n### UI Architecture\n\n| Upstream | Our Fork | Reason |\n| ------------------------------------------- | --------------------------------------------------------- | --------------------------------------------------------------------- |\n| `FooterDataProvider` class | `StatusLineComponent` | Simpler, integrated status line |\n| `ctx.ui.setHeader()` / `ctx.ui.setFooter()` | Stub in non-TUI modes | Implemented in TUI, no-op elsewhere |\n| `ctx.ui.setEditorComponent()` | Stub in non-TUI modes | Implemented in TUI, no-op elsewhere |\n| `InteractiveModeOptions` options object | Positional constructor args (options type still exported) | Keep constructor signature; update the type when upstream adds fields |\n\n### Component Naming\n\n| Upstream | Our Fork |\n| ---------------------------- | ----------------------- |\n| `extension-input.ts` | `hook-input.ts` |\n| `extension-selector.ts` | `hook-selector.ts` |\n| `ExtensionInputComponent` | `HookInputComponent` |\n| `ExtensionSelectorComponent` | `HookSelectorComponent` |\n\n### API Naming\n\n| Upstream | Our Fork | Notes |\n| ---------------------------------------- | ---------------------------------------- | ----------------------------------------- |\n| `sessionManager.appendSessionInfo(name)` | `sessionManager.setSessionName(name)` | We use `sessionName` throughout |\n| `sessionManager.getSessionName()` | `sessionManager.getSessionName()` | Same (we unified to match upstream's RPC) |\n| `agent.sessionName` / `setSessionName()` | `agent.sessionName` / `setSessionName()` | Same |\n\n### File Consolidation\n\n| Upstream | Our Fork | Reason |\n| -------------------------------------------------- | --------------------------------------- | --------------------------------------- |\n| `clipboard.ts` + `clipboard-image.ts` (tool files) | `@oh-my-pi/pi-natives` clipboard module | Merged into N-API native implementation |\n\n### Test Framework\n\n| Upstream | Our Fork |\n| 
------------------------- | ----------------------------- |\n| `vitest` with `vi.mock()` | `bun:test` with `vi` from bun |\n| `node:test` assertions | `expect()` matchers |\n\n### Tool Architecture\n\n| Upstream | Our Fork | Notes |\n| ----------------------------------- | ----------------------------------------------------------------- | --------------------------------------------------------- |\n| `createTool(cwd: string, options?)` | `createTools(session: ToolSession)` via `BUILTIN_TOOLS` registry | Tool factories accept `ToolSession` and can return `null` |\n| Per-tool `*Operations` interfaces | Per-tool interfaces remain (`FindOperations`, `GrepOperations`) | Used for SSH/remote overrides |\n| Node.js `fs/promises` everywhere | `Bun.file()`/`Bun.write()` for files; `node:fs/promises` for dirs | Prefer Bun APIs when they simplify |\n\n### Auth Storage\n\n| Upstream | Our Fork | Notes |\n| ------------------------------- | ------------------------------------------- | -------------------------------------------- |\n| `proper-lockfile` + `auth.json` | `agent.db` (bun:sqlite) | Credentials stored exclusively in `agent.db` |\n| Single credential per provider | Multi-credential with round-robin selection | Session affinity and backoff logic preserved |\n\n### Extensions\n\n| Upstream | Our Fork |\n| ----------------------------- | ------------------------------------------ |\n| `jiti` for TypeScript loading | Native Bun `import()` |\n| `pkg.pi` manifest field | `pkg.omp ?? pkg.pi` (prefer our namespace) |\n\n### Skip These Upstream Features\n\nWhen porting, **skip** these files/features entirely:\n\n- `footer-data-provider.ts` — we use StatusLineComponent\n- `clipboard-image.ts` — clipboard is in `@oh-my-pi/pi-natives` N-API module\n- GitHub workflow files — we have our own CI\n- `models.generated.ts` — auto-generated, regenerate locally (as models.json instead)\n\n### Features We Added (Preserve These)\n\nThese exist in our fork but not upstream. 
**Never overwrite:**\n\n- `StatusLineComponent` in interactive mode\n- Multi-credential auth with session affinity\n- Capability-based discovery system (`defineCapability`, `registerProvider`, `loadCapability`, `skillCapability`, etc.)\n- MCP/Exa/SSH integrations\n- LSP writethrough for format-on-save\n- Bash interception (`checkBashInterception`)\n- Fuzzy path suggestions in read tool\n",
- "porting-to-natives.md": "# Porting to pi-natives (N-API) — Field Notes\n\nThis is a practical guide for moving hot paths into `crates/pi-natives` and wiring them through the JS bindings. It exists to avoid the same failures happening twice.\n\n## When to port\n\nPort when any of these are true:\n\n- The hot path runs in render loops, tight UI updates, or large batches.\n- JS allocations dominate (string churn, regex backtracking, large arrays).\n- You already have a JS baseline and can benchmark both versions side by side.\n- The work is CPU-bound or blocking I/O that can run on the libuv thread pool.\n- The work is async I/O that can run on Tokio's runtime (e.g., shell execution).\n\nAvoid ports that depend on JS-only state or dynamic imports. N-API exports should be pure, data-in/data-out. Long-running work should go through `task::blocking` (CPU-bound/blocking I/O) or `task::future` (async I/O) with cancellation.\n\n## Anatomy of a native export\n\n**Rust side:**\n\n- Implementation lives in `crates/pi-natives/src/<module>.rs`. If you add a new module, register it in `crates/pi-natives/src/lib.rs`.\n- Export with `#[napi]`; snake_case exports are converted to camelCase automatically. Use explicit `js_name` only for true aliases/non-default names. Use `#[napi(object)]` for structs.\n- Use `task::blocking(tag, cancel_token, work)` (see `crates/pi-natives/src/task.rs`) for CPU-bound or blocking work. Use `task::future(env, tag, work)` for async work that needs Tokio (e.g., shell sessions). 
Pass a `CancelToken` when you expose `timeoutMs` or `AbortSignal`.\n\n**JS side:**\n\n- `packages/natives/src/bindings.ts` holds the base `NativeBindings` interface.\n- `packages/natives/src/<module>/types.ts` defines TS types and augments `NativeBindings` via declaration merging.\n- `packages/natives/src/native.ts` imports each `<module>/types.ts` file to activate the declarations.\n- `packages/natives/src/<module>/index.ts` wraps the `native` binding from `packages/natives/src/native.ts`.\n- `packages/natives/src/native.ts` loads the addon and `validateNative` enforces required exports.\n- `packages/natives/src/index.ts` re-exports the wrapper for callers in `packages/*`.\n\n## Porting checklist\n\n1. **Add the Rust implementation**\n\n- Put the core logic in a plain Rust function.\n- If it’s a new module, add it to `crates/pi-natives/src/lib.rs`.\n- Expose it with `#[napi]` so the default snake_case -> camelCase mapping stays consistent.\n- Keep signatures owned and simple: `String`, `Vec<String>`, `Uint8Array`, or `Either<JsString, Uint8Array>` for large string/byte inputs.\n- For CPU-bound or blocking work, use `task::blocking`; for async work, use `task::future`. Pass a `CancelToken` and call `heartbeat()` inside long loops.\n\n2. **Wire JS bindings**\n\n- Add the types and `NativeBindings` augmentation in `packages/natives/src/<module>/types.ts`.\n- Import `./<module>/types` in `packages/natives/src/native.ts` to trigger declaration merging.\n- Add a wrapper in `packages/natives/src/<module>/index.ts` that calls `native`.\n- Re-export from `packages/natives/src/index.ts`.\n\n3. **Update native validation**\n\n- Add `checkFn(\"newExport\")` in `validateNative` (`packages/natives/src/native.ts`).\n\n4. 
**Add benchmarks**\n\n- Put benchmarks next to the owning package (`packages/tui/bench`, `packages/natives/bench`, or `packages/coding-agent/bench`).\n- Include a JS baseline and native version in the same run.\n- Use `Bun.nanoseconds()` and a fixed iteration count.\n- Keep the benchmark inputs small and realistic (actual data seen in the hot path).\n\n5. **Build the native binary**\n\n- `bun --cwd=packages/natives run build`\n- Use `bun --cwd=packages/natives run build` and set `PI_DEV=1` if you want loader diagnostics while testing.\n\n6. **Run the benchmark**\n\n- `bun run packages/<pkg>/bench/<bench>.ts` (or `bun --cwd=packages/natives run bench`)\n\n7. **Decide on usage**\n\n- If native is slower, **keep JS** and leave the native export unused.\n- If native is faster, switch call sites to the native wrapper.\n\n## Pain points and how to avoid them\n\n### 1) Stale `pi_natives.node` prevents new exports\n\nThe loader prefers the platform-tagged binary in `packages/natives/native` (`pi_natives.<platform>-<arch>.node`). `PI_DEV=1` now only enables loader diagnostics; it no longer switches to a separate dev addon filename. There is also a fallback `pi_natives.node`. Compiled binaries extract to `~/.omp/natives/<version>/pi_natives.<platform>-<arch>.node`. 
If any of these are stale, exports won’t update.\n\n**Fix:** remove the stale file before rebuilding.\n\n```bash\nrm packages/natives/native/pi_natives.linux-x64.node\nrm packages/natives/native/pi_natives.node\nbun --cwd=packages/natives run build\n```\n\nIf you’re running a compiled binary, delete the cached addon directory:\n\n```bash\nrm -rf ~/.omp/natives/<version>\n```\n\nThen verify the export exists in the binary:\n\n```bash\nbun -e 'const tag = `${process.platform}-${process.arch}`; const mod = require(`./packages/natives/native/pi_natives.${tag}.node`); console.log(Object.keys(mod).includes(\"newExport\"));'\n```\n\n### 2) “Missing exports” errors from `validateNative`\n\nThis is **good** — it prevents silent mismatches. When you see this:\n\n```\nNative addon missing exports ... Missing: visibleWidth\n```\n\nit means your binary is stale, the Rust export name (or explicit alias when used) doesn’t match the JS name, or the export never compiled in. Fix the build and the naming mismatch, don’t weaken validation.\n\n### 3) Rust signature mismatch\n\nKeep it simple and owned. `String`, `Vec<String>`, and `Uint8Array` work. Avoid references like `&str` in public exports. 
If you need structured data, wrap it in `#[napi(object)]` structs.\n\n### 4) Benchmarking mistakes\n\n- Don’t compare different inputs or allocations.\n- Keep JS and native using identical input arrays.\n- Run both in the same benchmark file to avoid skew.\n\n## Benchmark template\n\n```ts\nconst ITERATIONS = 2000;\n\nfunction bench(name: string, fn: () => void): number {\n\tconst start = Bun.nanoseconds();\n\tfor (let i = 0; i < ITERATIONS; i++) fn();\n\tconst elapsed = (Bun.nanoseconds() - start) / 1e6;\n\tconsole.log(`${name}: ${elapsed.toFixed(2)}ms total (${(elapsed / ITERATIONS).toFixed(6)}ms/op)`);\n\treturn elapsed;\n}\n\nbench(\"feature/js\", () => {\n\tjsImpl(sample);\n});\n\nbench(\"feature/native\", () => {\n\tnativeImpl(sample);\n});\n```\n\n## Verification checklist\n\n- `validateNative` passes (no missing exports).\n- `NativeBindings` is augmented in `packages/natives/src/<module>/types.ts` and the wrapper is re-exported in `packages/natives/src/index.ts`.\n- `Object.keys(require(...))` includes your new export.\n- Bench numbers recorded in the PR/notes.\n- Call site updated **only if** native is faster or equal.\n\n## Rule of thumb\n\n- If native is slower, **do not switch**. Keep the export for future work, but the TUI should stay on the faster path.\n- If native is faster, switch the call site and keep the benchmark in place to catch regressions.\n",
+ "porting-to-natives.md": "# Porting to pi-natives (N-API) — Field Notes\n\nThis is a practical guide for moving hot paths into `crates/pi-natives` and wiring them through the JS bindings. It exists to avoid the same failures happening twice.\n\n## When to port\n\nPort when any of these are true:\n\n- The hot path runs in render loops, tight UI updates, or large batches.\n- JS allocations dominate (string churn, regex backtracking, large arrays).\n- You already have a JS baseline and can benchmark both versions side by side.\n- The work is CPU-bound or blocking I/O that can run on the libuv thread pool.\n- The work is async I/O that can run on Tokio's runtime (e.g., shell execution).\n\nAvoid ports that depend on JS-only state or dynamic imports. N-API exports should be pure, data-in/data-out. Long-running work should go through `task::blocking` (CPU-bound/blocking I/O) or `task::future` (async I/O) with cancellation.\n\n## Anatomy of a native export\n\n**Rust side:**\n\n- Implementation lives in `crates/pi-natives/src/<module>.rs`. If you add a new module, register it in `crates/pi-natives/src/lib.rs`.\n- Export with `#[napi]`; snake_case exports are converted to camelCase automatically. Use explicit `js_name` only for true aliases/non-default names. Use `#[napi(object)]` for structs.\n- Use `task::blocking(tag, cancel_token, work)` (see `crates/pi-natives/src/task.rs`) for CPU-bound or blocking work. Use `task::future(env, tag, work)` for async work that needs Tokio (e.g., shell sessions). 
Pass a `CancelToken` when you expose `timeoutMs` or `AbortSignal`.\n\n**JS side:**\n\n- `packages/natives/src/bindings.ts` holds the base `NativeBindings` interface.\n- `packages/natives/src/<module>/types.ts` defines TS types and augments `NativeBindings` via declaration merging.\n- `packages/natives/src/native.ts` imports each `<module>/types.ts` file to activate the declarations.\n- `packages/natives/src/<module>/index.ts` wraps the `native` binding from `packages/natives/src/native.ts`.\n- `packages/natives/src/native.ts` loads the addon and `validateNative` enforces required exports.\n- `packages/natives/src/index.ts` re-exports the wrapper for callers in `packages/*`.\n\n## Porting checklist\n\n1. **Add the Rust implementation**\n\n- Put the core logic in a plain Rust function.\n- If it’s a new module, add it to `crates/pi-natives/src/lib.rs`.\n- Expose it with `#[napi]` so the default snake_case -> camelCase mapping stays consistent.\n- Keep signatures owned and simple: `String`, `Vec<String>`, `Uint8Array`, or `Either<JsString, Uint8Array>` for large string/byte inputs.\n- For CPU-bound or blocking work, use `task::blocking`; for async work, use `task::future`. Pass a `CancelToken` and call `heartbeat()` inside long loops.\n\n2. **Wire JS bindings**\n\n- Add the types and `NativeBindings` augmentation in `packages/natives/src/<module>/types.ts`.\n- Import `./<module>/types` in `packages/natives/src/native.ts` to trigger declaration merging.\n- Add a wrapper in `packages/natives/src/<module>/index.ts` that calls `native`.\n- Re-export from `packages/natives/src/index.ts`.\n\n3. **Update native validation**\n\n- Add `checkFn(\"newExport\")` in `validateNative` (`packages/natives/src/native.ts`).\n\n4. 
**Add benchmarks**\n\n- Put benchmarks next to the owning package (`packages/tui/bench`, `packages/natives/bench`, or `packages/coding-agent/bench`).\n- Include a JS baseline and native version in the same run.\n- Use `Bun.nanoseconds()` and a fixed iteration count.\n- Keep the benchmark inputs small and realistic (actual data seen in the hot path).\n\n5. **Build the native binary**\n\n- `bun --cwd=packages/natives run build`\n\n6. **Run the benchmark**\n\n- `bun run packages/<pkg>/bench/<bench>.ts` (or `bun --cwd=packages/natives run bench`)\n\n7. **Decide on usage**\n\n- If native is slower, **keep JS** and leave the native export unused.\n- If native is faster, switch call sites to the native wrapper.\n\n## Pain points and how to avoid them\n\n### 1) Stale `pi_natives.node` prevents new exports\n\nThe loader prefers the platform-tagged binary in `packages/natives/native` (`pi_natives.<platform>-<arch>.node`). There is also a fallback `pi_natives.node`. Compiled binaries extract to `~/.omp/natives/<version>/pi_natives.<platform>-<arch>.node`. If any of these are stale, exports won’t update.\n\n**Fix:** remove the stale file before rebuilding.\n\n```bash\nrm packages/natives/native/pi_natives.linux-x64.node\nrm packages/natives/native/pi_natives.node\nbun --cwd=packages/natives run build\n```\n\nIf you’re running a compiled binary, delete the cached addon directory:\n\n```bash\nrm -rf ~/.omp/natives/<version>\n```\n\nThen verify the export exists in the binary:\n\n```bash\nbun -e 'const tag = `${process.platform}-${process.arch}`; const mod = require(`./packages/natives/native/pi_natives.${tag}.node`); console.log(Object.keys(mod).includes(\"newExport\"));'\n```\n\n### 2) “Missing exports” errors from `validateNative`\n\nThis is **good** — it prevents silent mismatches. When you see this:\n\n```\nNative addon missing exports ... 
Missing: visibleWidth\n```\n\nit means your binary is stale, the Rust export name (or explicit alias when used) doesn’t match the JS name, or the export never compiled in. Fix the build and the naming mismatch, don’t weaken validation.\n\n### 3) Rust signature mismatch\n\nKeep it simple and owned. `String`, `Vec<String>`, and `Uint8Array` work. Avoid references like `&str` in public exports. If you need structured data, wrap it in `#[napi(object)]` structs.\n\n### 4) Benchmarking mistakes\n\n- Don’t compare different inputs or allocations.\n- Keep JS and native using identical input arrays.\n- Run both in the same benchmark file to avoid skew.\n\n## Benchmark template\n\n```ts\nconst ITERATIONS = 2000;\n\nfunction bench(name: string, fn: () => void): number {\n\tconst start = Bun.nanoseconds();\n\tfor (let i = 0; i < ITERATIONS; i++) fn();\n\tconst elapsed = (Bun.nanoseconds() - start) / 1e6;\n\tconsole.log(`${name}: ${elapsed.toFixed(2)}ms total (${(elapsed / ITERATIONS).toFixed(6)}ms/op)`);\n\treturn elapsed;\n}\n\nbench(\"feature/js\", () => {\n\tjsImpl(sample);\n});\n\nbench(\"feature/native\", () => {\n\tnativeImpl(sample);\n});\n```\n\n## Verification checklist\n\n- `validateNative` passes (no missing exports).\n- `NativeBindings` is augmented in `packages/natives/src/<module>/types.ts` and the wrapper is re-exported in `packages/natives/src/index.ts`.\n- `Object.keys(require(...))` includes your new export.\n- Bench numbers recorded in the PR/notes.\n- Call site updated **only if** native is faster or equal.\n\n## Rule of thumb\n\n- If native is slower, **do not switch**. Keep the export for future work, but the TUI should stay on the faster path.\n- If native is faster, switch the call site and keep the benchmark in place to catch regressions.\n",
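The JS-side layering above (base interface, declaration merging, validation, wrapper) can be sketched in one self-contained file. This is an illustration of the pattern, not the real package code: the regions marked by comments live in separate files in the actual layout, and a plain JS stub (using `s.length` as a fake `visibleWidth`) stands in for the compiled `.node` addon.

```typescript
// bindings.ts: base interface (starts empty; modules augment it)
interface NativeBindings {}

// <module>/types.ts: declaration merging adds the new export's signature
interface NativeBindings {
	visibleWidth(s: string): number;
}

// native.ts: validate that the loaded addon carries every required export
function validateNative(candidate: Partial<NativeBindings>, required: Array<keyof NativeBindings>): NativeBindings {
	const missing = required.filter((name) => typeof candidate[name] !== "function");
	if (missing.length > 0) {
		throw new Error(`Native addon missing exports. Missing: ${missing.join(", ")}`);
	}
	return candidate as NativeBindings;
}

// A JS stub stands in for the compiled addon in this sketch.
const native = validateNative({ visibleWidth: (s) => s.length }, ["visibleWidth"]);

// <module>/index.ts: thin wrapper that call sites import
function visibleWidth(s: string): number {
	return native.visibleWidth(s);
}
```

A stale binary in this model is simply a `candidate` object that lacks the new key, which is exactly what `validateNative` rejects.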
  "provider-streaming-internals.md": "# Provider streaming internals\n\nThis document explains how token/tool streaming is normalized in `@oh-my-pi/pi-ai`, then propagated through `@oh-my-pi/pi-agent-core` and `coding-agent` session events.\n\n## End-to-end flow\n\n1. `streamSimple()` (`packages/ai/src/stream.ts`) maps generic options and dispatches to a provider stream function.\n2. Provider stream functions (`anthropic.ts`, `openai-responses.ts`, `google.ts`) translate provider-native stream events into the unified `AssistantMessageEvent` sequence.\n3. Each provider pushes events into `AssistantMessageEventStream` (`packages/ai/src/utils/event-stream.ts`), which throttles delta events and exposes:\n - async iteration for incremental updates\n - `result()` for final `AssistantMessage`\n4. `agentLoop` (`packages/agent/src/agent-loop.ts`) consumes those events, mutates in-flight assistant state, and emits `message_update` events carrying the raw `assistantMessageEvent`.\n5. `AgentSession` (`packages/coding-agent/src/session/agent-session.ts`) subscribes to agent events, persists messages, drives extension hooks, and applies session behaviors (retry, compaction, TTSR, streaming-edit abort checks).\n\n## Unified stream contract in `@oh-my-pi/pi-ai`\n\nAll providers emit the same shape (`AssistantMessageEvent` in `packages/ai/src/types.ts`):\n\n- `start`\n- content block lifecycle triplets:\n - text: `text_start` → `text_delta`* → `text_end`\n - thinking: `thinking_start` → `thinking_delta`* → `thinking_end`\n - tool call: `toolcall_start` → `toolcall_delta`* → `toolcall_end`\n- terminal event:\n - `done` with `reason: \"stop\" | \"length\" | \"toolUse\"`\n - or `error` with `reason: \"aborted\" | \"error\"`\n\n`AssistantMessageEventStream` guarantees:\n\n- final result is resolved by terminal event (`done` or `error`)\n- deltas are batched/throttled (~50ms)\n- buffered deltas are flushed before non-delta events and before completion\n\n## Delta throttling and 
harmonization behavior\n\n`AssistantMessageEventStream` treats `text_delta`, `thinking_delta`, and `toolcall_delta` as mergeable events:\n\n- buffered deltas are merged only when **type + contentIndex** match\n- merge keeps the latest `partial` snapshot\n- non-delta events force immediate flush\n\nThis smooths high-frequency provider streams for TUI/event consumers, but is not provider backpressure: providers still produce at full speed, while the local stream buffers.\n\n## Provider normalization details\n\n### Anthropic (`anthropic-messages`)\n\nSource: `packages/ai/src/providers/anthropic.ts`\n\nNormalization points:\n\n- `message_start` initializes usage (input/output/cache tokens)\n- `content_block_start` maps to text/thinking/toolcall starts\n- `content_block_delta` maps:\n - `text_delta` → `text_delta`\n - `thinking_delta` → `thinking_delta`\n - `input_json_delta` → `toolcall_delta`\n - `signature_delta` updates `thinkingSignature` only (no event)\n- `content_block_stop` emits corresponding `*_end`\n- `message_delta.stop_reason` maps via `mapStopReason()`\n\nTool-call argument streaming:\n\n- each tool block carries internal `partialJson`\n- every JSON delta appends to `partialJson`\n- `arguments` are reparsed on each delta via `parseStreamingJson()`\n- `toolcall_end` reparses once more, then strips `partialJson`\n\n### OpenAI Responses (`openai-responses`)\n\nSource: `packages/ai/src/providers/openai-responses.ts`\n\nNormalization points:\n\n- `response.output_item.added` starts reasoning/text/function-call blocks\n- reasoning summary events (`response.reasoning_summary_text.delta`) become `thinking_delta`\n- output/refusal deltas become `text_delta`\n- `response.function_call_arguments.delta` becomes `toolcall_delta`\n- `response.output_item.done` emits `thinking_end` / `text_end` / `toolcall_end`\n- `response.completed` maps status to stop reason and usage\n\nTool-call argument streaming:\n\n- same `partialJson` accumulation pattern as Anthropic\n- 
providers that send only `response.function_call_arguments.done` still populate final args\n- tool call IDs are normalized as `\"<call_id>|<item_id>\"`\n\n### Google Generative AI (`google-generative-ai`)\n\nSource: `packages/ai/src/providers/google.ts`\n\nNormalization points:\n\n- iterates `candidate.content.parts`\n- text parts are split into thinking vs text by `isThinkingPart(part)`\n- block transitions close previous block before starting a new one\n- `part.functionCall` is treated as a complete tool call (start/delta/end emitted immediately)\n- finish reason mapped by `mapStopReason()` from `google-shared.ts`\n\nTool-call argument streaming:\n\n- function call args arrive as structured object, not incremental JSON text\n- implementation emits one synthetic `toolcall_delta` containing `JSON.stringify(arguments)`\n- no partial JSON parser needed for Google in this path\n\n## Partial tool-call JSON accumulation and recovery\n\nShared behavior for Anthropic/OpenAI Responses uses `parseStreamingJson()` (`packages/ai/src/utils/json-parse.ts`):\n\n1. try `JSON.parse`\n2. fallback to `partial-json` parser for incomplete fragments\n3. if both fail, return `{}`\n\nImplications:\n\n- malformed or truncated argument deltas do not crash stream processing immediately\n- in-progress `arguments` may temporarily be `{}`\n- later valid deltas can recover structured arguments because parsing is retried on every append\n- final `toolcall_end` performs one more parse attempt before emission\n\n## Stop reasons vs transport/runtime errors\n\nProvider stop reasons are mapped to normalized `stopReason`:\n\n- Anthropic: `end_turn`→`stop`, `max_tokens`→`length`, `tool_use`→`toolUse`, safety/refusal cases→`error`\n- OpenAI Responses: `completed`→`stop`, `incomplete`→`length`, `failed/cancelled`→`error`\n- Google: `STOP`→`stop`, `MAX_TOKENS`→`length`, safety/prohibited/malformed-function-call classes→`error`\n\nError semantics are split in two stages:\n\n1. 
**Model completion semantics** (provider reported finish reason/status)\n2. **Transport/runtime failure** (network/client/parser/abort exceptions)\n\nIf provider stream throws or signals failure, each provider wrapper catches and emits terminal `error` event with:\n\n- `stopReason = \"aborted\"` when abort signal is set\n- otherwise `stopReason = \"error\"`\n- `errorMessage = formatErrorMessageWithRetryAfter(error)`\n\n## Malformed chunk / SSE parse failure behavior\n\nFor these provider paths, chunk/SSE framing is handled by vendor SDK streams (Anthropic SDK, OpenAI SDK, Google SDK). This code does not implement a custom SSE decoder here.\n\nObserved behavior in current implementation:\n\n- malformed chunk/SSE parsing at SDK level surfaces as an exception or stream `error` event\n- provider wrapper converts that into unified terminal `error` event\n- no provider-specific resume/retry inside the stream function itself\n- higher-level retries are handled in `AgentSession` auto-retry logic (message-level retry, not stream-chunk replay)\n\n## Cancellation boundaries\n\nCancellation is layered:\n\n- AI provider request: `options.signal` is passed into provider client stream call.\n- Provider wrapper: after stream loop, aborted signal forces error path (`\"Request was aborted\"`).\n- Agent loop: checks `signal.aborted` before handling each provider event and can synthesize an aborted assistant message from the latest partial.\n- Session/agent controls: `AgentSession.abort()` -> `agent.abort()` -> shared abort controller cancellation.\n\nTool execution cancellation is separate from model stream cancellation:\n\n- tool runners use `AbortSignal.any([agentSignal, steeringAbortSignal])`\n- steering interrupts can abort remaining tool execution while preserving already-produced tool results\n\n## Backpressure boundaries\n\nThere is no hard backpressure mechanism between provider SDK stream and downstream consumers:\n\n- `EventStream` uses in-memory queues with no max size\n- 
throttling reduces UI update rate but does not slow provider intake\n- if consumers lag significantly, queued events can grow until completion\n\nCurrent design favors responsiveness and simple ordering over bounded-buffer flow control.\n\n## How stream events surface as agent/session events\n\n`agentLoop.streamAssistantResponse()` bridges `AssistantMessageEvent` to `AgentEvent`:\n\n- on `start`: pushes placeholder assistant message and emits `message_start`\n- on block events (`text_*`, `thinking_*`, `toolcall_*`): updates last assistant message, emits `message_update` with raw `assistantMessageEvent`\n- on terminal (`done`/`error`): resolves final message from `response.result()`, emits `message_end`\n\n`AgentSession` then consumes those events for session-level behaviors:\n\n- TTSR watches `message_update.assistantMessageEvent` for `text_delta` and `toolcall_delta`\n- streaming edit guard inspects `toolcall_delta`/`toolcall_end` on `edit` calls and can abort early\n- persistence writes finalized messages at `message_end`\n- auto-retry examines assistant `stopReason === \"error\"` plus `errorMessage` heuristics\n\n## Unified vs provider-specific responsibilities\n\nUnified (common contract):\n\n- event shape (`AssistantMessageEvent`)\n- final result extraction (`done`/`error`)\n- delta throttling + merge rules\n- agent/session event propagation model\n\nProvider-specific (not fully abstracted):\n\n- upstream event taxonomies and mapping logic\n- stop-reason translation tables\n- tool-call ID conventions\n- reasoning/thinking block semantics and signatures\n- usage token semantics and availability timing\n- message conversion constraints per API\n\n## Implementation files\n\n- [`../../ai/src/stream.ts`](../packages/ai/src/stream.ts) — provider dispatch, option mapping, API key/session plumbing.\n- [`../../ai/src/utils/event-stream.ts`](../packages/ai/src/utils/event-stream.ts) — generic stream queue + assistant delta throttling.\n- 
[`../../ai/src/utils/json-parse.ts`](../packages/ai/src/utils/json-parse.ts) — partial JSON parsing for streamed tool arguments.\n- [`../../ai/src/providers/anthropic.ts`](../packages/ai/src/providers/anthropic.ts) — Anthropic event translation and tool JSON delta accumulation.\n- [`../../ai/src/providers/openai-responses.ts`](../packages/ai/src/providers/openai-responses.ts) — OpenAI Responses event translation and status mapping.\n- [`../../ai/src/providers/google.ts`](../packages/ai/src/providers/google.ts) — Gemini stream chunk-to-block translation.\n- [`../../ai/src/providers/google-shared.ts`](../packages/ai/src/providers/google-shared.ts) — Gemini finish-reason mapping and shared conversion rules.\n- [`../../agent/src/agent-loop.ts`](../packages/agent/src/agent-loop.ts) — provider stream consumption and `message_update` bridging.\n- [`../src/session/agent-session.ts`](../packages/coding-agent/src/session/agent-session.ts) — session-level handling of streaming updates, abort, retry, and persistence.\n",
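The delta-merge rule described above (merge only on matching **type + contentIndex**, keep the latest `partial` snapshot, flush before any non-delta event) can be sketched as a standalone function. The event shapes here are simplified from `AssistantMessageEvent`, and a single-slot buffer stands in for the real throttled queue; this is an illustration of the rule, not the actual `AssistantMessageEventStream`.

```typescript
type DeltaEvent = { type: "text_delta" | "thinking_delta" | "toolcall_delta"; contentIndex: number; partial: string };
type StreamEvent = DeltaEvent | { type: "text_start" | "text_end" | "done" };

function isDelta(ev: StreamEvent): ev is DeltaEvent {
	return ev.type === "text_delta" || ev.type === "thinking_delta" || ev.type === "toolcall_delta";
}

function coalesce(events: StreamEvent[]): StreamEvent[] {
	const out: StreamEvent[] = [];
	let buffered: DeltaEvent | undefined;
	for (const ev of events) {
		if (isDelta(ev)) {
			if (buffered && buffered.type === ev.type && buffered.contentIndex === ev.contentIndex) {
				buffered = ev; // same block: keep only the latest partial snapshot
			} else {
				if (buffered) out.push(buffered); // different block: flush the old buffer
				buffered = ev;
			}
		} else {
			if (buffered) out.push(buffered); // non-delta events force an immediate flush
			buffered = undefined;
			out.push(ev);
		}
	}
	if (buffered) out.push(buffered); // flush remaining deltas before completion
	return out;
}
```

Because `partial` is a snapshot rather than an increment, dropping intermediate deltas loses no data, which is what makes the ~50ms throttling safe.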
  "python-repl.md": "# Python Tool and IPython Runtime\n\nThis document describes the current Python execution stack in `packages/coding-agent`.\nIt covers tool behavior, kernel/gateway lifecycle, environment handling, execution semantics, output rendering, and operational failure modes.\n\n## Scope and Key Files\n\n- Tool surface: `src/tools/python.ts`\n- Session/per-call kernel orchestration: `src/ipy/executor.ts`\n- Kernel protocol + gateway integration: `src/ipy/kernel.ts`\n- Shared local gateway coordinator: `src/ipy/gateway-coordinator.ts`\n- Interactive-mode renderer for user-triggered Python runs: `src/modes/components/python-execution.ts`\n- Runtime/env filtering and Python resolution: `src/ipy/runtime.ts`\n\n## What the Python tool is\n\nThe `python` tool executes one or more Python cells through a Jupyter Kernel Gateway-backed kernel (not by spawning `python -c` directly per cell).\n\nTool params:\n\n```ts\n{\n cells: Array<{ code: string; title?: string }>;\n timeout?: number; // seconds, clamped to 1..600, default 30\n cwd?: string;\n reset?: boolean; // reset kernel before first cell only\n}\n```\n\nThe tool is `concurrency = \"exclusive\"` for a session, so calls do not overlap.\n\n## Gateway lifecycle\n\n### Modes\n\nThere are two gateway paths:\n\n1. **External gateway** (`PI_PYTHON_GATEWAY_URL` set)\n - Uses the configured URL directly.\n - Optional auth with `PI_PYTHON_GATEWAY_TOKEN`.\n - No local gateway process is spawned or managed.\n\n2. 
**Local shared gateway** (default path)\n - Uses a single shared process coordinated under `~/.omp/agent/python-gateway`.\n - Metadata file: `gateway.json`\n - Lock file: `gateway.lock`\n - Spawn command:\n - `python -m kernel_gateway`\n - bound to `127.0.0.1:<allocated-port>`\n - startup health check: `GET /api/kernelspecs`\n\n### Local shared gateway coordination\n\n`acquireSharedGateway()`:\n\n- Takes a file lock (`gateway.lock`) with heartbeat.\n- Reuses `gateway.json` if PID is alive and health check passes.\n- Cleans stale info/PIDs when needed.\n- Starts a new gateway when no healthy one exists.\n\n`releaseSharedGateway()` is currently a no-op (kernel shutdown does not tear down shared gateway).\n\n`shutdownSharedGateway()` explicitly terminates the shared process and clears gateway metadata.\n\n### Important constraint\n\n`python.sharedGateway=false` is rejected at kernel start:\n\n- Error: `Shared Python gateway required; local gateways are disabled`\n- There is no per-process non-shared local gateway mode.\n\n## Kernel lifecycle\n\nEach execution uses a kernel created via `POST /api/kernels` on the selected gateway.\n\nKernel startup sequence:\n\n1. Availability check (`checkPythonKernelAvailability`)\n2. Create kernel (`/api/kernels`)\n3. Open websocket (`/api/kernels/:id/channels`)\n4. Initialize kernel env (`cwd`, env vars, `sys.path`)\n5. Execute `PYTHON_PRELUDE`\n6. 
Load extension modules from:\n - user: `~/.omp/agent/modules/*.py`\n - project: `<cwd>/.omp/modules/*.py` (overrides same-name user module)\n\nKernel shutdown:\n\n- Deletes remote kernel via `DELETE /api/kernels/:id`\n- Closes websocket\n- Calls shared gateway release hook (no-op today)\n\n## Session persistence semantics\n\n`python.kernelMode` controls kernel reuse:\n\n- `session` (default)\n - Reuses kernel sessions keyed by session identity + cwd.\n - Execution is serialized per session via a queue.\n - Idle sessions are evicted after 5 minutes.\n - At most 4 sessions; oldest is evicted on overflow.\n - Heartbeat checks detect dead kernels.\n - Auto-restart allowed once; repeated crash => hard failure.\n\n- `per-call`\n - Creates a fresh kernel for each execute request.\n - Shuts kernel down after the request.\n - No cross-call state persistence.\n\n### Multi-cell behavior in a single tool call\n\nCells run sequentially in the same kernel instance for that tool call.\n\nIf an intermediate cell fails:\n\n- Earlier cell state remains in memory.\n- Tool returns a targeted error indicating which cell failed.\n- Later cells are not executed.\n\n`reset=true` only applies to the first cell execution in that call.\n\n## Environment filtering and runtime resolution\n\nEnvironment is filtered before launching gateway/kernel runtime:\n\n- Allowlist includes core vars like `PATH`, `HOME`, locale vars, `VIRTUAL_ENV`, `PYTHONPATH`, etc.\n- Allow-prefixes: `LC_`, `XDG_`, `PI_`\n- Denylist strips common API keys (OpenAI/Anthropic/Gemini/etc.)\n\nRuntime selection order:\n\n1. Active/located venv (`VIRTUAL_ENV`, then `<cwd>/.venv`, `<cwd>/venv`)\n2. Managed venv at `~/.omp/python-env`\n3. 
`python` or `python3` on PATH\n\nWhen a venv is selected, its bin/Scripts path is prepended to `PATH`.\n\nKernel env initialization inside Python also:\n\n- `os.chdir(cwd)`\n- injects provided env map into `os.environ`\n- ensures cwd is in `sys.path`\n\n## Tool availability and mode selection\n\n`python.toolMode` (default `both`) + optional `PI_PY` override controls exposure:\n\n- `ipy-only`\n- `bash-only`\n- `both`\n\n`PI_PY` accepted values:\n\n- `0` / `bash` -> `bash-only`\n- `1` / `py` -> `ipy-only`\n- `mix` / `both` -> `both`\n\nIf Python preflight fails, tool creation degrades to bash-only for that session.\n\n## Execution flow and cancellation/timeout\n\n### Tool-level timeout\n\n`python` tool timeout is in seconds, default 30, clamped to `1..600`.\n\nThe tool combines:\n\n- caller abort signal\n- timeout abort signal\n\nwith `AbortSignal.any(...)`.\n\n### Kernel execution cancellation\n\nOn abort/timeout:\n\n- Execution is marked cancelled.\n- Kernel interrupt is attempted via REST (`POST /interrupt`) and control-channel `interrupt_request`.\n- Result includes `cancelled=true`.\n- Timeout path annotates output as `Command timed out after <n> seconds`.\n\n### stdin behavior\n\nInteractive stdin is not supported.\n\nIf kernel emits `input_request`:\n\n- Tool records `stdinRequested=true`\n- Emits explanatory text\n- Sends empty `input_reply`\n- Execution is treated as failure at executor layer\n\n## Output capture and rendering\n\n### Captured output classes\n\nFrom kernel messages:\n\n- `stream` -> plain text chunks\n- `display_data`/`execute_result` -> rich display handling\n- `error` -> traceback text\n- custom MIME `application/x-omp-status` -> structured status events\n\nDisplay MIME precedence:\n\n1. `text/markdown`\n2. `text/plain`\n3. 
`text/html` (converted to basic markdown)\n\nAdditionally captured as structured outputs:\n\n- `application/json` -> JSON tree data\n- `image/png` -> image payloads\n- `application/x-omp-status` -> status events\n\n### Storage and truncation\n\nOutput is streamed through `OutputSink` and may be persisted to artifact storage.\n\nTool results can include truncation metadata and `artifact://<id>` for full output recovery.\n\n### Renderer behavior\n\n- Tool renderer (`python.ts`):\n - shows code-cell blocks with per-cell status\n - collapsed preview defaults to 10 lines\n - supports expanded mode for full output and richer status detail\n- Interactive renderer (`python-execution.ts`):\n - used for user-triggered Python execution in TUI\n - collapsed preview defaults to 20 lines\n - clamps very long individual lines to 4000 chars for display safety\n - shows cancellation/error/truncation notices\n\n## External gateway support\n\nSet:\n\n```bash\nexport PI_PYTHON_GATEWAY_URL=\"http://127.0.0.1:8888\"\n# Optional:\nexport PI_PYTHON_GATEWAY_TOKEN=\"...\"\n```\n\nBehavior differences from local shared gateway:\n\n- No local gateway lock/info files\n- No local process spawn/termination\n- Health checks and kernel CRUD run against external endpoint\n- Auth failures are surfaced with explicit token guidance\n\n## Operational troubleshooting (current failure modes)\n\n- **Python tool not available**\n - Check `python.toolMode` / `PI_PY`.\n - If preflight fails, runtime falls back to bash-only.\n\n- **Kernel availability errors**\n - Local mode requires both `kernel_gateway` and `ipykernel` importable in resolved Python runtime.\n - Install with:\n ```bash\n python -m pip install jupyter_kernel_gateway ipykernel\n ```\n\n- **`python.sharedGateway=false` causes startup failure**\n - This is expected with current implementation.\n\n- **External gateway auth/reachability failures**\n - 401/403 -> set `PI_PYTHON_GATEWAY_TOKEN`.\n - timeout/unreachable -> verify URL/network and 
gateway health.\n\n- **Execution hangs then times out**\n - Increase tool `timeout` (max 600s) if workload is legitimate.\n - For stuck code, cancellation triggers kernel interrupt but user code may still need refactor.\n\n- **stdin/input prompts in Python code**\n - `input()` is not supported interactively in this runtime path; pass data programmatically.\n\n- **Resource exhaustion (`EMFILE` / too many open files)**\n - Session manager triggers shared-gateway recovery (session teardown + shared gateway restart).\n\n- **Working directory errors**\n - Tool validates `cwd` exists and is a directory before execution.\n\n## Relevant environment variables\n\n- `PI_PY` — tool exposure override (`bash-only`/`ipy-only`/`both` mapping above)\n- `PI_PYTHON_GATEWAY_URL` — use external gateway\n- `PI_PYTHON_GATEWAY_TOKEN` — optional external gateway auth token\n- `PI_PYTHON_SKIP_CHECK=1` — bypass Python preflight/warm checks\n- `PI_PYTHON_IPC_TRACE=1` — log kernel IPC send/receive traces\n- `PI_DEBUG_STARTUP=1` — emit startup-stage debug markers\n",
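The `PI_PY` mapping and the tool timeout clamp described above amount to roughly the following. This is a hedged sketch: `resolveToolMode` and `clampTimeout` are invented names for illustration; only the accepted values, modes, default, and `1..600` bounds come from this doc.

```typescript
type PythonToolMode = "bash-only" | "ipy-only" | "both";

// PI_PY overrides python.toolMode when set to a recognized value.
function resolveToolMode(piPy: string | undefined, configured: PythonToolMode = "both"): PythonToolMode {
	switch (piPy) {
		case "0":
		case "bash":
			return "bash-only";
		case "1":
		case "py":
			return "ipy-only";
		case "mix":
		case "both":
			return "both";
		default:
			return configured; // unset or unrecognized: fall back to python.toolMode
	}
}

// Tool timeout is in seconds: default 30, clamped to 1..600.
function clampTimeout(seconds?: number): number {
	return Math.min(600, Math.max(1, seconds ?? 30));
}
```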
  "resolve-tool-runtime.md": "# Resolve tool runtime internals\n\nThis document explains how preview/apply workflows are modeled in coding-agent and how custom tools can participate via `pushPendingAction`.\n\n## Scope and key files\n\n- [`src/tools/resolve.ts`](../packages/coding-agent/src/tools/resolve.ts)\n- [`src/tools/pending-action.ts`](../packages/coding-agent/src/tools/pending-action.ts)\n- [`src/tools/ast-edit.ts`](../packages/coding-agent/src/tools/ast-edit.ts)\n- [`src/extensibility/custom-tools/types.ts`](../packages/coding-agent/src/extensibility/custom-tools/types.ts)\n- [`src/extensibility/custom-tools/loader.ts`](../packages/coding-agent/src/extensibility/custom-tools/loader.ts)\n- [`src/sdk.ts`](../packages/coding-agent/src/sdk.ts)\n\n## What `resolve` does\n\n`resolve` is a hidden tool that finalizes a pending preview action.\n\n- `action: \"apply\"` executes `apply(reason)` on the pending action and persists changes.\n- `action: \"discard\"` invokes `reject(reason)` if provided; otherwise drops the action with a default \"Discarded\" message.\n\nIf no pending action exists, `resolve` fails with:\n\n- `No pending action to resolve. Nothing to apply or discard.`\n\n## Pending actions are a stack (LIFO)\n\nPending actions are stored in `PendingActionStore` as a push/pop stack:\n\n- `push(action)` adds a new pending action on top.\n- `peek()` inspects the current top action.\n- `pop()` removes and returns the top action.\n- `hasPending` indicates whether the stack is non-empty.\n\n`resolve` always consumes the **topmost** pending action first (`pop()`), so multiple preview-producing tools resolve in reverse order of registration.\n\n## Built-in producer example (`ast_edit`)\n\n`ast_edit` previews structural replacements first. 
When the preview has replacements and is not applied yet, it pushes a pending action that contains:\n\n- label (human-readable summary)\n- `sourceToolName` (`ast_edit`)\n- `apply(reason: string)` callback that reruns AST edit with `dryRun: false`\n\n`resolve(action=\"apply\", reason=\"...\")` passes `reason` into this callback.\n\n## Custom tools: `pushPendingAction`\n\nCustom tools can register resolve-compatible pending actions through `CustomToolAPI.pushPendingAction(...)`.\n\n`CustomToolPendingAction`:\n\n- `label: string` (required)\n- `apply(reason: string): Promise<AgentToolResult<unknown>>` (required) — invoked on apply; `reason` is the string passed to `resolve`\n- `reject?(reason: string): Promise<AgentToolResult<unknown> | undefined>` (optional) — invoked on discard; return value replaces the default \"Discarded\" message if provided\n- `details?: unknown` (optional)\n- `sourceToolName?: string` (optional, defaults to `\"custom_tool\"`)\n\n### Minimal usage example\n\n```ts\nimport type { CustomToolFactory } from \"@oh-my-pi/pi-coding-agent\";\n\nconst factory: CustomToolFactory = pi => ({\n\tname: \"batch_rename_preview\",\n\tlabel: \"Batch Rename Preview\",\n\tdescription: \"Previews renames and defers commit to resolve\",\n\tparameters: pi.typebox.Type.Object({\n\t\tfiles: pi.typebox.Type.Array(pi.typebox.Type.String()),\n\t}),\n\n\tasync execute(_toolCallId, params) {\n\t\tconst previewSummary = `Prepared rename plan for ${params.files.length} files`;\n\n\t\tpi.pushPendingAction({\n\t\t\tlabel: `Batch rename: ${params.files.length} files`,\n\t\t\tsourceToolName: \"batch_rename_preview\",\n\t\t\tapply: async (reason) => {\n\t\t\t\t// apply writes here\n\t\t\t\treturn {\n\t\t\t\t\tcontent: [{ type: \"text\", text: `Applied batch rename. Reason: ${reason}` }],\n\t\t\t\t};\n\t\t\t},\n\t\t\treject: async (reason) => {\n\t\t\t\t// optional: cleanup or notify on discard\n\t\t\t\treturn {\n\t\t\t\t\tcontent: [{ type: \"text\", text: `Discarded batch rename. 
Reason: ${reason}` }],\n\t\t\t\t};\n\t\t\t},\n\t\t});\n\n\t\treturn {\n\t\t\tcontent: [{ type: \"text\", text: `${previewSummary}. Call resolve to apply or discard.` }],\n\t\t};\n\t},\n});\n\nexport default factory;\n```\n\n## Runtime availability and failures\n\n`pushPendingAction` is wired by the custom tool loader using the active session `PendingActionStore`.\n\nIf the runtime has no pending-action store, `pushPendingAction` throws:\n\n- `Pending action store unavailable for custom tools in this runtime.`\n\n## Tool-choice behavior\n\nWhen `PendingActionStore.hasPending` is true, the agent runtime biases tool choice to `resolve` so pending previews are explicitly finalized before normal tool flow continues.\n\n## Developer guidance\n\n- Use pending actions only for destructive or high-impact operations that should support explicit apply/discard.\n- Keep `label` concise and specific; it is shown in resolve renderer output.\n- Ensure `apply(reason)` is deterministic and idempotent enough for one-shot execution; `reason` is informational and should not change behavior.\n- Implement `reject(reason)` when the discard needs cleanup (temp state, locks, notifications); omit it for stateless previews where the default message suffices.\n- If your tool can stage multiple previews, remember LIFO semantics: latest pushed action resolves first.\n",
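The LIFO semantics can be illustrated with a minimal stand-in for `PendingActionStore`. This sketch mirrors only the `push`/`peek`/`pop`/`hasPending` surface described above (with a simplified `PendingAction` shape); the real store lives in `src/tools/pending-action.ts`.

```typescript
interface PendingAction {
	label: string;
	apply(reason: string): Promise<string>;
	reject?(reason: string): Promise<string | undefined>;
}

class PendingActionStore {
	private stack: PendingAction[] = [];

	get hasPending(): boolean {
		return this.stack.length > 0;
	}
	push(action: PendingAction): void {
		this.stack.push(action); // new previews land on top
	}
	peek(): PendingAction | undefined {
		return this.stack[this.stack.length - 1];
	}
	pop(): PendingAction | undefined {
		return this.stack.pop(); // resolve consumes the topmost action first
	}
}
```

If tool A pushes a preview and then tool B pushes another, the next `resolve` call finalizes B's action; A's only becomes resolvable after B's is applied or discarded.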