@aemi-cli/aemi 0.0.7 → 0.0.13

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +165 -0
  2. package/package.json +5 -5
package/README.md ADDED
@@ -0,0 +1,165 @@
# AEMI

Agent Mirror - AI agent routing tool with Telegram and Discord bot integration.

A CLI tool that relays AI agent responses through Telegram/Discord bots.

## Origin

This project is a fork of [kstost/cokacdir](https://github.com/kstost/cokacdir). It is based on the LLM CLI routing portion of the original project, with the TUI file manager removed to focus on bot relay functionality.

## Features

- **AI Agent Routing**: Query AI agents and receive responses via `--prompt`
- **Telegram Bot**: Route an AI agent through Telegram with `--routing telegram`
- **Discord Bot**: Route an AI agent through Discord with `--routing discord`
- **Multi-Bot**: Run multiple Telegram bots (one per token) simultaneously
- **Access Control**: `--chat-id` (Telegram) / `--channel-id` (Discord) required for routing

## Usage

```bash
# Query Claude Code directly
aemi --prompt "explain this code"

# Start Telegram bot server with Claude (--chat-id required)
aemi --agent claude --routing telegram --token <TOKEN> --chat-id <CHAT_ID>

# Start Telegram bot server with Gemini
aemi --agent gemini --routing telegram --token <TOKEN> --chat-id <CHAT_ID>

# Start Discord bot server (--channel-id required)
aemi --agent claude --routing discord --token <TOKEN> --channel-id <CHANNEL_ID>

# Start Telegram bot server with Codex
aemi --agent codex --routing telegram --token <TOKEN> --chat-id <CHAT_ID>

# Start Discord bot server with Gemini
aemi --agent gemini --routing discord --token <TOKEN> --channel-id <CHANNEL_ID>

# Start Discord bot server with Codex
aemi --agent codex --routing discord --token <TOKEN> --channel-id <CHANNEL_ID>

# Start Telegram bot server with OpenCode
aemi --agent opencode --routing telegram --token <TOKEN> --chat-id <CHAT_ID>

# Start Discord bot server with OpenCode
aemi --agent opencode --routing discord --token <TOKEN> --channel-id <CHANNEL_ID>

# Run multiple Telegram bots simultaneously
aemi --agent claude --routing telegram --token <TOKEN1> <TOKEN2> <TOKEN3> --chat-id <CHAT_ID>
```

## Installation

### Prerequisites

- Install the CLI tool for the agent you want to use (see [Agent Types](#agent-types))

### Build from source

```bash
# Clone
git clone https://github.com/KyongSik-Yoon/aemi.git
cd aemi

# Build
cargo build --release

# Binary location
./target/release/aemi
```

See [build_manual.md](build_manual.md) for detailed build instructions, including cross-compilation.

## Agent Types

| Agent | CLI Flag | Status | Priority |
|-------|----------|--------|----------|
| [Claude Code](https://docs.anthropic.com/en/docs/claude-code) | `--agent claude` | Available | - |
| [Gemini CLI](https://github.com/google-gemini/gemini-cli) | `--agent gemini` | Available | - |
| [Codex CLI](https://github.com/openai/codex) | `--agent codex` | Available | - |
| [OpenCode](https://opencode.ai) | `--agent opencode` | Available | - |

### Prerequisites per Agent

Each agent requires its own CLI tool to be installed:

- **Claude**: `npm install -g @anthropic-ai/claude-code`
- **Gemini**: `npm install -g @google/gemini-cli`
- **Codex**: `npm install -g @openai/codex`
- **OpenCode**: `npm install -g opencode` (or see [opencode.ai](https://opencode.ai/docs/) for other methods)

### Integration Feasibility

Each agent CLI provides a non-interactive mode and structured JSON output, making subprocess integration possible.

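The subprocess pattern this section relies on can be sketched in Python (illustrative only; the real implementation is the project's Rust services, and the `cmd` parameter is a hypothetical seam added here so the sketch is not hard-wired to one CLI):

```python
import json
import subprocess

def query_agent(prompt: str, cmd=("gemini", "--output-format", "json")) -> str:
    """Run an agent CLI non-interactively and return its "response" field.

    The default `cmd` uses the Gemini flags listed in the next section;
    substitute another agent's command line as needed.
    """
    proc = subprocess.run([*cmd, prompt], capture_output=True, text=True)
    if proc.returncode != 0:  # Gemini, for example, uses distinct codes per failure
        raise RuntimeError(f"agent exited with {proc.returncode}: {proc.stderr}")
    return json.loads(proc.stdout)["response"]
```

The single `"response"` key matches Gemini's single-JSON output shown below; the stream-JSON modes emit JSONL instead and need line-by-line parsing.
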
#### Gemini CLI (Priority 1)

- **Non-interactive mode**: `gemini -p "prompt"` — identical pattern to Claude's `-p` flag
- **JSON output**: `--output-format json` (single JSON) / `--output-format stream-json` (JSONL stream)
- **JSON structure**: `{ "response": "...", "stats": {...}, "error": null }`
- **Stream-json events**: `init`, `message`, `tool_use`, `tool_result`, `error`, `result`
- **Stability**: Stable release channel (nightly → preview → stable), latest stable v0.29.x
- **Stdin piping**: Supported (`echo "text" | gemini`)
- **Auth for subprocess**: `GEMINI_API_KEY` env var (avoids interactive OAuth)
- **Limitations**: Non-interactive mode restricts tool execution (WriteFile and shell commands require `--yolo`/`-y`)
- **Session continuity**: Not supported in non-interactive mode (single-turn only)
- **Exit codes**: `0` (success), `1` (error), `42` (input error), `53` (turn limit exceeded)
- **Note**: the `-p` flag is deprecated in favor of a positional argument (`gemini "prompt"`), but still works

> Gemini CLI has the lowest integration barrier because its CLI interface is nearly identical to Claude Code's.
> Stream-json event types (`init`, `message`, `tool_use`, `tool_result`) map directly to Claude's `StreamMessage` enum.
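
The quoted mapping can be sketched in Python (illustrative only; the real mapping lives in the Rust services, the variant names are taken from this README, and the event payload shapes are assumptions, not Gemini's documented schema):

```python
import json

# Gemini stream-json event type → StreamMessage-like variant name
# (variant names from this README; payload shapes are assumed).
EVENT_TO_VARIANT = {
    "init": "Init",
    "message": "Text",
    "tool_use": "ToolUse",
    "tool_result": "ToolResult",
    "result": "Done",
}

def map_gemini_stream(jsonl: str):
    """Yield (variant, event) pairs for each non-empty stream-json line."""
    for line in jsonl.splitlines():
        if line.strip():
            event = json.loads(line)
            yield EVENT_TO_VARIANT.get(event.get("type"), "Unknown"), event

sample = '{"type":"init"}\n{"type":"message","text":"hi"}\n{"type":"result"}'
print([variant for variant, _ in map_gemini_stream(sample)])
# → ['Init', 'Text', 'Done']
```
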

#### Codex CLI

- **Non-interactive mode**: `codex exec "prompt"` — uses the `exec` subcommand instead of a `-p` flag
- **JSON output**: `codex exec --json "prompt"` → JSONL event stream to stdout
- **Event types**: `thread.started`, `turn.started`, `turn.completed`, `item.*`, `error`
- **Auto-approve**: `--full-auto` flag for non-interactive tool execution
- **Stdin piping**: Supported (`cat prompt.md | codex exec -`)
- **Session resume**: `codex exec --resume <SESSION_ID> "prompt"`
- **Extra**: `--output-schema` for schema-constrained responses

> Codex CLI uses a different event model (`item.started`/`item.completed` lifecycle) than Claude/Gemini,
> but it maps cleanly to `StreamMessage` via `thread.started`→`Init`, `agent_message`→`Text`, `command_execution`→`ToolUse`/`ToolResult`.
126
+
127
+ #### OpenCode CLI
128
+
129
+ - **Non-interactive mode**: `opencode run "prompt"` — uses `run` subcommand
130
+ - **JSON output**: `opencode run --format json "prompt"` → JSONL event stream to stdout
131
+ - **Event types**: `step_start`, `step_finish`, `text`, `tool_use`, `reasoning`, `error`
132
+ - **Tool state tracking**: `tool_use` events include `state.status` (`pending`, `running`, `completed`, `error`)
133
+ - **Session resume**: `opencode run --session <SESSION_ID> "prompt"` / `--continue` for last session
134
+ - **Model selection**: `--model provider/model` for flexible LLM provider switching
135
+ - **File attachment**: `--file <path>` for attaching files to the prompt
136
+
137
+ > OpenCode uses a part-based event model where `tool_use` events carry state transitions (`pending`→`running`→`completed`).
138
+ > Maps to `StreamMessage` via `text`→`Text`, `tool_use(running)`→`ToolUse`, `tool_use(completed)`→`ToolResult`, `step_finish`→`Done`.

### Implementation Status

- [x] **Extract shared types** — `StreamMessage`, `CancelToken`, `AgentResponse` in `src/services/agent.rs`
- [x] **Add `src/services/gemini.rs`** — Gemini agent with `-p` and `--output-format stream-json`
- [x] **Map StreamMessage** — Gemini JSON events → `StreamMessage` enum (`message`→`Text`, `tool_use`→`ToolUse`, `result`→`Done`)
- [x] **Agent dispatch in bots** — `telegram.rs` and `discord.rs` branch on agent type
- [x] **Update routing in `main.rs`** — `--agent gemini` accepted alongside `claude`
- [x] **Add `src/services/codex.rs`** — Codex agent with `exec --json --full-auto`, session resume support
- [x] **Add `src/services/opencode.rs`** — OpenCode agent with `run --format json`, session resume support

## Supported Platforms

- macOS (Apple Silicon & Intel)
- Linux (x86_64 & ARM64)

## License

MIT License

## Disclaimer

THIS SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.

IN NO EVENT SHALL THE AUTHORS, COPYRIGHT HOLDERS, OR CONTRIBUTORS BE LIABLE FOR ANY CLAIM, DAMAGES, OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

**USE AT YOUR OWN RISK.**
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@aemi-cli/aemi",
- "version": "0.0.7",
+ "version": "0.0.13",
  "description": "LLM CLI routing tool with Telegram and Discord bot integration",
  "license": "MIT",
  "repository": {
@@ -23,9 +23,9 @@
  "aemi": "bin/aemi"
  },
  "optionalDependencies": {
- "@aemi-cli/linux-x64": "0.0.7",
- "@aemi-cli/linux-arm64": "0.0.7",
- "@aemi-cli/darwin-x64": "0.0.7",
- "@aemi-cli/darwin-arm64": "0.0.7"
+ "@aemi-cli/linux-x64": "0.0.13",
+ "@aemi-cli/linux-arm64": "0.0.13",
+ "@aemi-cli/darwin-x64": "0.0.13",
+ "@aemi-cli/darwin-arm64": "0.0.13"
  }
  }