@aemi-cli/aemi 0.0.13 → 0.0.14
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +0 -57
- package/package.json +5 -5
package/README.md
CHANGED
@@ -90,63 +90,6 @@ Each agent requires its own CLI tool to be installed:
 - **Codex**: `npm install -g @openai/codex`
 - **OpenCode**: `npm install -g opencode` (or see [opencode.ai](https://opencode.ai/docs/) for other methods)
 
-### Integration Feasibility
-
-Each agent CLI provides a non-interactive mode and structured JSON output, making subprocess integration possible.
-
-#### Gemini CLI (Priority 1)
-
-- **Non-interactive mode**: `gemini -p "prompt"` — identical pattern to Claude's `-p` flag
-- **JSON output**: `--output-format json` (single JSON) / `--output-format stream-json` (JSONL stream)
-- **JSON structure**: `{ "response": "...", "stats": {...}, "error": null }`
-- **Stream-json events**: `init`, `message`, `tool_use`, `tool_result`, `error`, `result`
-- **Stability**: Stable release channel (nightly → preview → stable), latest stable v0.29.x
-- **Stdin piping**: Supported (`echo "text" | gemini`)
-- **Auth for subprocess**: `GEMINI_API_KEY` env var (avoids interactive OAuth)
-- **Limitations**: Non-interactive mode restricts tool execution (WriteFile and shell commands require `--yolo`/`-y`)
-- **Session continuity**: Not supported in non-interactive mode (single-turn only)
-- **Exit codes**: `0` (success), `1` (error), `42` (input error), `53` (turn limit exceeded)
-- **Note**: The `-p` flag is deprecated in favor of a positional argument (`gemini "prompt"`), but still works
-
-> Gemini CLI has the lowest integration barrier: its interface is nearly identical to Claude Code's.
-> Stream-json event types (`init`, `message`, `tool_use`, `tool_result`) map directly to Claude's `StreamMessage` enum.
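The Gemini event list just removed implies a flat, line-by-line translation into the shared `StreamMessage` enum. A minimal Python sketch of that idea, assuming each JSONL line carries a top-level `type` field named after the event; the sample stream is illustrative, not captured CLI output:

```python
import json

# Hypothetical mapping from Gemini stream-json event types to the
# StreamMessage variant names listed in the README text above.
EVENT_TO_VARIANT = {
    "init": "Init",
    "message": "Text",
    "tool_use": "ToolUse",
    "tool_result": "ToolResult",
    "result": "Done",
}

def map_events(jsonl: str) -> list[str]:
    """Translate a JSONL event stream into StreamMessage variant names."""
    out = []
    for line in jsonl.splitlines():
        if not line.strip():
            continue
        event = json.loads(line)
        variant = EVENT_TO_VARIANT.get(event.get("type"))
        if variant is not None:
            out.append(variant)  # unmapped types (e.g. error) are skipped here
    return out

# Illustrative sample, not real gemini output:
sample = "\n".join([
    '{"type": "init"}',
    '{"type": "message", "content": "hello"}',
    '{"type": "result"}',
])
print(map_events(sample))  # ['Init', 'Text', 'Done']
```

A real adapter would also surface `error` events; the table-driven shape is the point, since it mirrors how the Claude stream is already consumed.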
-
-#### Codex CLI
-
-- **Non-interactive mode**: `codex exec "prompt"` — uses an `exec` subcommand instead of a `-p` flag
-- **JSON output**: `codex exec --json "prompt"` → JSONL event stream on stdout
-- **Event types**: `thread.started`, `turn.started`, `turn.completed`, `item.*`, `error`
-- **Auto-approve**: `--full-auto` flag for non-interactive tool execution
-- **Stdin piping**: Supported (`cat prompt.md | codex exec -`)
-- **Session resume**: `codex exec --resume <SESSION_ID> "prompt"`
-- **Extra**: `--output-schema` for schema-constrained responses
-
-> Codex CLI uses a different event model (an `item.started`/`item.completed` lifecycle) than Claude/Gemini,
-> but it maps cleanly to `StreamMessage`: `thread.started`→`Init`, `agent_message`→`Text`, `command_execution`→`ToolUse`/`ToolResult`.
-
-#### OpenCode CLI
-
-- **Non-interactive mode**: `opencode run "prompt"` — uses a `run` subcommand
-- **JSON output**: `opencode run --format json "prompt"` → JSONL event stream on stdout
-- **Event types**: `step_start`, `step_finish`, `text`, `tool_use`, `reasoning`, `error`
-- **Tool state tracking**: `tool_use` events include `state.status` (`pending`, `running`, `completed`, `error`)
-- **Session resume**: `opencode run --session <SESSION_ID> "prompt"`, or `--continue` for the last session
-- **Model selection**: `--model provider/model` for flexible LLM provider switching
-- **File attachment**: `--file <path>` for attaching files to the prompt
-
-> OpenCode uses a part-based event model where `tool_use` events carry state transitions (`pending`→`running`→`completed`).
-> Maps to `StreamMessage` via `text`→`Text`, `tool_use(running)`→`ToolUse`, `tool_use(completed)`→`ToolResult`, `step_finish`→`Done`.
-
-### Implementation Status
-
-- [x] **Extract shared types** — `StreamMessage`, `CancelToken`, `AgentResponse` in `src/services/agent.rs`
-- [x] **Add `src/services/gemini.rs`** — Gemini agent with `-p` and `--output-format stream-json`
-- [x] **Map StreamMessage** — Gemini JSON events → `StreamMessage` enum (`message`→`Text`, `tool_use`→`ToolUse`, `result`→`Done`)
-- [x] **Agent dispatch in bots** — `telegram.rs` and `discord.rs` branch on agent type
-- [x] **Update routing in `main.rs`** — `--agent gemini` accepted alongside `claude`
-- [x] **Add `src/services/codex.rs`** — Codex agent with `exec --json --full-auto`, session resume support
-- [x] **Add `src/services/opencode.rs`** — OpenCode agent with `run --format json`, session resume support
-
 ## Supported Platforms
 
 - macOS (Apple Silicon & Intel)
package/package.json
CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@aemi-cli/aemi",
-  "version": "0.0.13",
+  "version": "0.0.14",
   "description": "LLM CLI routing tool with Telegram and Discord bot integration",
   "license": "MIT",
   "repository": {
@@ -23,9 +23,9 @@
     "aemi": "bin/aemi"
   },
   "optionalDependencies": {
-    "@aemi-cli/linux-x64": "0.0.
-    "@aemi-cli/linux-arm64": "0.0.
-    "@aemi-cli/darwin-x64": "0.0.
-    "@aemi-cli/darwin-arm64": "0.0.
+    "@aemi-cli/linux-x64": "0.0.14",
+    "@aemi-cli/linux-arm64": "0.0.14",
+    "@aemi-cli/darwin-x64": "0.0.14",
+    "@aemi-cli/darwin-arm64": "0.0.14"
   }
 }