omni-context-cli 0.0.77 → 0.0.79

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (59)
  1. package/README.md +7 -29
  2. package/README.zh-CN.md +7 -29
  3. package/dist/cli.js +7 -7
  4. package/dist/clients/extension.vsix +0 -0
  5. package/dist/clients/web/assets/{_baseUniq-BbAI5D6T.js → _baseUniq-CUo0PbHL.js} +1 -1
  6. package/dist/clients/web/assets/{arc-C01cHi_J.js → arc-G5_lwiyZ.js} +1 -1
  7. package/dist/clients/web/assets/{architectureDiagram-VXUJARFQ-C4SW9geU.js → architectureDiagram-VXUJARFQ-BGuOh2Iq.js} +1 -1
  8. package/dist/clients/web/assets/{blockDiagram-VD42YOAC-BJBjUKKm.js → blockDiagram-VD42YOAC-Dq7C29Ov.js} +1 -1
  9. package/dist/clients/web/assets/{c4Diagram-YG6GDRKO-DRMS2exe.js → c4Diagram-YG6GDRKO-CMaw07Kd.js} +1 -1
  10. package/dist/clients/web/assets/channel-DUWet6ok.js +1 -0
  11. package/dist/clients/web/assets/{chunk-4BX2VUAB-DdM_Xgvz.js → chunk-4BX2VUAB-BENYmkTj.js} +1 -1
  12. package/dist/clients/web/assets/{chunk-55IACEB6-sCGtW4jw.js → chunk-55IACEB6-Buu1ye2G.js} +1 -1
  13. package/dist/clients/web/assets/{chunk-B4BG7PRW-CrTfpFlk.js → chunk-B4BG7PRW-BnxLYYM4.js} +1 -1
  14. package/dist/clients/web/assets/{chunk-DI55MBZ5-BJxcEBin.js → chunk-DI55MBZ5-CoJqKAAe.js} +1 -1
  15. package/dist/clients/web/assets/{chunk-FMBD7UC4-ZCzWQMnr.js → chunk-FMBD7UC4-sjroua7u.js} +1 -1
  16. package/dist/clients/web/assets/{chunk-QN33PNHL-7lsjOUTT.js → chunk-QN33PNHL-DIivVr4Y.js} +1 -1
  17. package/dist/clients/web/assets/{chunk-QZHKN3VN-DM6taDix.js → chunk-QZHKN3VN-2mFYr56r.js} +1 -1
  18. package/dist/clients/web/assets/{chunk-TZMSLE5B-Dc2DN3ii.js → chunk-TZMSLE5B-DgLF73y5.js} +1 -1
  19. package/dist/clients/web/assets/classDiagram-2ON5EDUG-BrKhteLz.js +1 -0
  20. package/dist/clients/web/assets/classDiagram-v2-WZHVMYZB-BrKhteLz.js +1 -0
  21. package/dist/clients/web/assets/clone-DA2MDRfq.js +1 -0
  22. package/dist/clients/web/assets/{cose-bilkent-S5V4N54A-r8T3f5yK.js → cose-bilkent-S5V4N54A-NYwX-qQE.js} +1 -1
  23. package/dist/clients/web/assets/{dagre-6UL2VRFP-UnxnSeQH.js → dagre-6UL2VRFP-kct9XwXY.js} +1 -1
  24. package/dist/clients/web/assets/{diagram-PSM6KHXK-D2ENZ6qS.js → diagram-PSM6KHXK-BOFXtN1x.js} +1 -1
  25. package/dist/clients/web/assets/{diagram-QEK2KX5R-Eye3JgZm.js → diagram-QEK2KX5R-BIIo48yD.js} +1 -1
  26. package/dist/clients/web/assets/{diagram-S2PKOQOG-Blf5P9RJ.js → diagram-S2PKOQOG-pv3OcQYA.js} +1 -1
  27. package/dist/clients/web/assets/{erDiagram-Q2GNP2WA-wyIeLoVZ.js → erDiagram-Q2GNP2WA-CiGpONRi.js} +1 -1
  28. package/dist/clients/web/assets/{flowDiagram-NV44I4VS-D99ef0t0.js → flowDiagram-NV44I4VS-B6GAXsT2.js} +1 -1
  29. package/dist/clients/web/assets/{ganttDiagram-JELNMOA3-CvsjeBuE.js → ganttDiagram-JELNMOA3-D6fnnGzF.js} +1 -1
  30. package/dist/clients/web/assets/{gitGraphDiagram-NY62KEGX-B2ylogok.js → gitGraphDiagram-NY62KEGX-B_GhKPTu.js} +1 -1
  31. package/dist/clients/web/assets/{graph-BEwCLq4h.js → graph-COudLt0o.js} +1 -1
  32. package/dist/clients/web/assets/{index-pGSHhaP9.css → index-Br5-Le-w.css} +1 -1
  33. package/dist/clients/web/assets/index-CmZ6Acxr.js +2 -0
  34. package/dist/clients/web/assets/{infoDiagram-WHAUD3N6-C9HLlPj3.js → infoDiagram-WHAUD3N6-BXDoib1f.js} +1 -1
  35. package/dist/clients/web/assets/{journeyDiagram-XKPGCS4Q-BGMoCoGu.js → journeyDiagram-XKPGCS4Q-Cx2trfR6.js} +1 -1
  36. package/dist/clients/web/assets/{kanban-definition-3W4ZIXB7-Z1B1b4dY.js → kanban-definition-3W4ZIXB7-CldLo4FB.js} +1 -1
  37. package/dist/clients/web/assets/{layout-KBr90mM8.js → layout-DlWRhlpt.js} +1 -1
  38. package/dist/clients/web/assets/{linear-VOTetRqU.js → linear-DqfGWoHa.js} +1 -1
  39. package/dist/clients/web/assets/{min-BcFr1Xgw.js → min-1D44uaaD.js} +1 -1
  40. package/dist/clients/web/assets/{mindmap-definition-VGOIOE7T-Bf9fJD2v.js → mindmap-definition-VGOIOE7T-Lz4tDnq8.js} +1 -1
  41. package/dist/clients/web/assets/{pieDiagram-ADFJNKIX-DITjn_1i.js → pieDiagram-ADFJNKIX-TOYmDJp3.js} +1 -1
  42. package/dist/clients/web/assets/{quadrantDiagram-AYHSOK5B-CG8qEQsc.js → quadrantDiagram-AYHSOK5B-W60A5nvZ.js} +1 -1
  43. package/dist/clients/web/assets/{requirementDiagram-UZGBJVZJ-_fiifEP6.js → requirementDiagram-UZGBJVZJ-BRPuONu3.js} +1 -1
  44. package/dist/clients/web/assets/{sankeyDiagram-TZEHDZUN-dGOtogtu.js → sankeyDiagram-TZEHDZUN-Cyzz8kP_.js} +1 -1
  45. package/dist/clients/web/assets/{sequenceDiagram-WL72ISMW-BPWg-aZ3.js → sequenceDiagram-WL72ISMW-B4NG_rPo.js} +1 -1
  46. package/dist/clients/web/assets/{stateDiagram-FKZM4ZOC-XlU9RkKz.js → stateDiagram-FKZM4ZOC-C00slvSW.js} +1 -1
  47. package/dist/clients/web/assets/stateDiagram-v2-4FDKWEC3-Chj_Rc8Z.js +1 -0
  48. package/dist/clients/web/assets/{timeline-definition-IT6M3QCI-CQGxtqhi.js → timeline-definition-IT6M3QCI-CtUjVVAm.js} +1 -1
  49. package/dist/clients/web/assets/{treemap-KMMF4GRG-Do7kW41s.js → treemap-KMMF4GRG-Dve5Lhqq.js} +1 -1
  50. package/dist/clients/web/assets/{xychartDiagram-PRI3JC2R-C_QymW0P.js → xychartDiagram-PRI3JC2R-DAiUFhKd.js} +1 -1
  51. package/dist/clients/web/index.html +2 -2
  52. package/package.json +1 -1
  53. package/scripts/zenmux-status.sh +12 -0
  54. package/dist/clients/web/assets/channel-BFxt5Aug.js +0 -1
  55. package/dist/clients/web/assets/classDiagram-2ON5EDUG-CMfvUA3j.js +0 -1
  56. package/dist/clients/web/assets/classDiagram-v2-WZHVMYZB-CMfvUA3j.js +0 -1
  57. package/dist/clients/web/assets/clone-BWI7Baa7.js +0 -1
  58. package/dist/clients/web/assets/index-30onxp6I.js +0 -2
  59. package/dist/clients/web/assets/stateDiagram-v2-4FDKWEC3-Rsaa7fF9.js +0 -1
package/README.md CHANGED
@@ -1,20 +1,16 @@
  # OmniContext CLI
 
- A zero-telemetry coding assistant that runs in your terminal and extends into VS Code, Office, the browser, and mobile. Most AI coding tools bolt a chat interface onto an LLM and call it a day. OmniContext CLI takes a different approach: it treats the context window as a scarce resource and engineers every layer of the system to use it efficiently. Lean system prompts, agent sub-delegation, automatic context editing, and native prompt caching work together so your tokens go toward solving the problem, not repeating boilerplate. Each LLM protocol (Anthropic, OpenAI, Gemini, Responses API) has its own dedicated request builder and stream handler with zero translation overhead. Custom workflows, agents, skills, and MCP servers make it fully extensible without touching the core.
+ A context-first, zero-telemetry coding assistant that extends into VS Code, Office, the browser, and mobile. Supports Anthropic, OpenAI, Gemini, and Responses API with custom workflows and cross-session memory.
 
  ## Context-First Architecture
 
  Every token matters. OmniContext CLI is engineered from the ground up to squeeze maximum value out of every context window.
 
- **Lean system prompts.** The built-in system prompts are short, focused, and free of boilerplate. Tool descriptions are minimal. Your context budget goes toward actual work, not framework overhead.
-
- **Agent sub-delegation.** Exploratory tasks (searching code, surveying project structure, previewing files) run as autonomous sub-agents on a cheaper model. Their intermediate tool calls, file contents, and reasoning never enter your main context. You get a concise answer back; the scratch work stays off the books.
-
- **Context editing.** As a conversation grows, older rounds accumulate tool call payloads and thinking blocks that are no longer relevant. Context editing automatically compresses these, replacing bulky tool inputs and outputs with compact placeholders and stripping reasoning traces, so the model sees a clean, focused history instead of a bloated one.
-
- **Auto-compaction.** When token usage hits 80% of the model's context limit, the conversation is automatically summarized, key memories are extracted, and a fresh session picks up seamlessly. You never have to manually manage context overflow.
-
- **Native prompt caching.** Automatic cache control for Anthropic and Gemini with configurable TTL (5-minute or 1-hour). Repeated context blocks are served from cache instead of being re-processed.
+ - **Lean system prompts** - the built-in system prompts are short, focused, and free of boilerplate. Tool descriptions are minimal. Your context budget goes toward actual work, not framework overhead.
+ - **Agentic tools** - exploratory tasks (searching code, surveying project structure, previewing files) run as agentic tools on a cheaper model. Intermediate tool calls, file contents, and reasoning never enter your main context. You get a concise answer back; the scratch work stays off the books.
+ - **Context editing** - as a conversation grows, older rounds accumulate tool call payloads and thinking blocks that are no longer relevant. Context editing automatically compresses these, replacing bulky tool inputs and outputs with compact placeholders and stripping reasoning traces, so the model sees a clean, focused history.
+ - **Cross-session memory** - every time a conversation is compacted, the model distills key points and evaluates existing memories. Good insights accumulate weight, stale knowledge decays naturally. When the next session starts, accumulated memories are injected automatically so the assistant knows your preferences and project context from the first message.
+ - **Prompt caching** - you choose the cache TTL: 5 minutes for quick iteration, 1 hour for long work sessions. Cache control for Anthropic is applied automatically, so repeated context blocks are served from cache, cutting token usage and cost significantly.
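The prompt-caching bullet above rests on Anthropic's `cache_control` field. As a rough sketch of the underlying API mechanism the CLI automates (a direct API payload, not the CLI's own internals; the model name and placeholder text are illustrative assumptions):

```shell
# Build a request body whose system block is marked cacheable.
# This sketches the Anthropic API mechanism the CLI applies automatically;
# the model name and placeholder text are illustrative assumptions.
cat > /tmp/cached-request.json <<'EOF'
{
  "model": "claude-sonnet-4-20250514",
  "max_tokens": 256,
  "system": [
    {
      "type": "text",
      "text": "...long, stable project instructions...",
      "cache_control": { "type": "ephemeral" }
    }
  ],
  "messages": [
    { "role": "user", "content": "Summarize the conventions." }
  ]
}
EOF
# With a real key, repeated calls within the TTL read the system block
# from cache instead of reprocessing it:
# curl https://api.anthropic.com/v1/messages \
#   -H "x-api-key: $ANTHROPIC_API_KEY" \
#   -H "anthropic-version: 2023-06-01" \
#   -H "content-type: application/json" \
#   -d @/tmp/cached-request.json
```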
 
  ## Workflow System
 
@@ -94,29 +90,11 @@ Terminal is home base, but OmniContext CLI extends into every tool you use. The
  Custom workflows, agents, skills, and MCP servers. Everything is a markdown file or JSON config.
 
  - **Custom Workflows** - define your own system prompt and control exactly which tools are available: base tools, agents, MCP servers, and remote tools.
- - **Custom Agents** - write a markdown file with a prompt template, parameter schema, and tool permissions. It becomes a callable agent tool instantly. Add `OMX-AGENTS.md` for global agent instructions.
+ - **Custom Agentic Tools** - write a markdown file with a prompt template, parameter schema, and tool permissions. It becomes a callable agentic tool instantly. Add `AGENTS.md` for agent-specific instructions, or fall back to `OMX.md` project instructions automatically.
  - **Custom Skills** - directory-based prompt capabilities invoked as slash commands (`/skill-name`). Skills inject instructions into the current conversation for domain-specific knowledge and workflows. Compatible with Claude Code skill format.
  - **MCP Servers** - connect external tools and data sources via Model Context Protocol. Stdio and HTTP transports supported.
  - **Project Instructions** - drop an `OMX.md` or `CLAUDE.md` in your repo root and everyone on the team gets the same conventions and context.
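To make the agentic-tool bullet concrete, here is a minimal sketch of such a markdown file. The directory, filename, and frontmatter field names are assumptions for illustration, not a documented omni-context-cli schema:

```shell
# Hypothetical agent definition -- the path and frontmatter fields are
# illustrative assumptions, not omni-context-cli's documented schema.
mkdir -p .omx/agents
cat > .omx/agents/code-scout.md <<'EOF'
---
description: Search the codebase and report matches concisely
parameters:
  pattern:
    type: string
    description: Regex to search for
tools: [grep, read_file]
---
Search the project for {{pattern}} and reply with a short list of
file paths, each with a one-line summary of the match.
EOF
```

The key idea is the split the bullet describes: the prompt template, the parameter schema, and the tool allow-list all live in one file, so dropping the file in place is the whole registration step.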
 
- ## Install
-
- ```bash
- npm install -g omni-context-cli && omx
- ```
-
- ## Build & Release
-
- ```bash
- npm run release
- ```
-
- One command builds the CLI, all clients, and packages the desktop app for the current platform.
-
  ## Documentation
 
  **https://bluenoah1991.github.io/omni-context-cli-landing/docs/**
-
- ## License
-
- MIT
package/README.zh-CN.md CHANGED
@@ -1,20 +1,16 @@
  # OmniContext CLI
 
- A zero-telemetry coding assistant that runs in your terminal and extends into VS Code, Office, the browser, and mobile. Most AI coding tools just bolt a chat interface onto an LLM and call it a day. OmniContext CLI takes a different path: it treats the context window as a scarce resource and engineers every layer to use it efficiently. Lean system prompts, agent sub-delegation, automatic context editing, and native prompt caching work together so your tokens are spent solving the problem, not repeating boilerplate. Each LLM protocol (Anthropic, OpenAI, Gemini, Responses API) has its own dedicated request builder and stream handler with zero translation overhead. Custom workflows, agents, skills, and MCP servers provide full extensibility without touching the core code.
+ A context-first, zero-telemetry coding assistant that integrates into VS Code, Office, the browser, and mobile. Compatible with Anthropic, OpenAI, Gemini, and Responses API, with support for custom workflows and cross-session memory.
 
  ## Context-First Architecture
 
  Every token matters. OmniContext CLI is designed from the ground up to make maximum use of the context window.
 
- **Lean system prompts.** The built-in prompts are short, focused, and free of boilerplate. Tool descriptions are kept minimal. Your context budget goes toward actual work, not framework overhead.
-
- **Agent sub-delegation.** Exploratory tasks (searching code, surveying project structure, previewing files) are run by autonomous sub-agents on a cheaper model. Their intermediate tool calls, file contents, and reasoning never enter your main context. You get only a concise conclusion; the scratch work takes up no space.
-
- **Context editing.** As a conversation grows, earlier tool call payloads and thinking blocks become irrelevant. Context editing automatically compresses them, replacing bulky tool inputs and outputs with compact placeholders and stripping reasoning traces, so the model sees a clean, focused history instead of a bloated pile.
-
- **Auto-compaction.** When token usage reaches 80% of the model's context limit, the conversation is automatically summarized, key memories are extracted, and a new session continues seamlessly. You never have to manage context overflow by hand.
-
- **Native prompt caching.** Automatic cache control for Anthropic and Gemini, with configurable TTL (5 minutes or 1 hour). Repeated context blocks are read from cache instead of being reprocessed.
+ - **Lean system prompts** - the built-in prompts are short, focused, and free of boilerplate. Tool descriptions are kept minimal. Your context budget goes toward actual work, not framework overhead.
+ - **Agentic tools** - exploratory tasks (searching code, surveying project structure, previewing files) are run by agentic tools on a cheaper model. Intermediate tool calls, file contents, and reasoning never enter the main context. You get only a concise conclusion; the scratch work takes up no space.
+ - **Context editing** - as a conversation grows, earlier tool call payloads and thinking blocks become irrelevant. Context editing automatically compresses them, replacing bulky tool inputs and outputs with compact placeholders and stripping reasoning traces, so the model sees a clean, focused history.
+ - **Cross-session memory** - every time a conversation is compacted, the model distills key points and evaluates existing memories. Good insights accumulate weight; stale knowledge decays naturally. When the next session starts, accumulated memories are injected automatically, so the assistant knows your preferences and project background from the very first message.
+ - **Prompt caching** - you choose the cache TTL: 5 minutes for quick iteration, 1 hour for long work sessions. Cache control for Anthropic is applied automatically, so repeated context blocks are read straight from cache, cutting token usage and cost significantly.
 
  ## Workflow System
 
@@ -94,29 +90,11 @@ omx --remove-provider openrouter
  Custom workflows, agents, skills, and MCP servers. Everything is a Markdown file or JSON config.
 
  - **Custom Workflows** - define your own system prompt and control exactly which tools are available: base tools, agents, MCP servers, and remote tools.
- - **Custom Agents** - write a Markdown file with a prompt template, parameter definitions, and tool permissions, and it instantly becomes a callable agent tool. Add `OMX-AGENTS.md` for global agent instructions.
+ - **Agentic Tools** - write a Markdown file with a prompt template, parameter definitions, and tool permissions, and it instantly becomes a callable agentic tool. Add `AGENTS.md` for agent-specific instructions; when it is not found, the tool falls back to the `OMX.md` project instructions automatically.
  - **Custom Skills** - directory-based prompt capabilities invoked via slash commands (`/skill-name`). Skills inject into the current conversation, providing domain knowledge and workflows. Compatible with the Claude Code skill format.
  - **MCP Servers** - connect external tools and data sources via Model Context Protocol; stdio and HTTP transports are supported.
  - **Project Instructions** - put an `OMX.md` or `CLAUDE.md` in the repo root and everyone on the team gets the same conventions and context.
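For the MCP bullet above, a client-side config might look like the sketch below. The `.omx/mcp.json` path is an assumption; the `mcpServers` key follows the convention used by common MCP clients, and `@modelcontextprotocol/server-filesystem` is one of the MCP reference servers:

```shell
# Sketch of an MCP stdio-server config. The file path is an assumption,
# not omni-context-cli's documented location; "mcpServers" follows the
# convention used by common MCP clients.
mkdir -p .omx
cat > .omx/mcp.json <<'EOF'
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
  }
}
EOF
```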
 
- ## Install
-
- ```bash
- npm install -g omni-context-cli && omx
- ```
-
- ## Build & Release
-
- ```bash
- npm run release
- ```
-
- One command builds the CLI and all clients, and packages the desktop app for the current platform.
-
  ## Documentation
 
  **https://bluenoah1991.github.io/omni-context-cli-landing/docs/zh-Hans/**
-
- ## License
-
- MIT