opencode-agent-sdk 0.2.0__tar.gz

Metadata-Version: 2.4
Name: opencode-agent-sdk
Version: 0.2.0
Summary: Open-source Agent SDK backed by OpenCode ACP (drop-in replacement for claude_agent_sdk)
Author: OpenCode
License: MIT
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Software Development :: Libraries
Requires-Python: >=3.10
Description-Content-Type: text/markdown
Requires-Dist: anyio
Requires-Dist: httpx
Provides-Extra: opencode-ai
Requires-Dist: opencode-ai>=0.1.0a36; extra == "opencode-ai"
Provides-Extra: dev
Requires-Dist: pytest>=8.0; extra == "dev"

# OpenCode Agent SDK for Python

Python SDK for building agents backed by [OpenCode](https://github.com/nichochar/opencode). A drop-in replacement for `claude_agent_sdk` with support for any LLM provider (Anthropic, OpenAI, xAI, etc.).

<video src="https://github.com/user-attachments/assets/061fb862-3253-46aa-92a7-71dc9258960a"
       autoplay
       muted
       loop
       playsinline
       controls>
</video>

## Installation

```bash
pip install opencode-agent-sdk
```

**Prerequisites:**

- Python 3.10+
- An OpenCode server (`opencode serve`) **or** the `opencode` CLI installed locally

## Quick Start

```python
import asyncio
from opencode_agent_sdk import SDKClient, AgentOptions, AssistantMessage, TextBlock

async def main():
    client = SDKClient(options=AgentOptions(
        model="claude-haiku-4-5",
        server_url="http://localhost:54321",
    ))

    await client.connect()
    await client.query("What is 2 + 2?")

    async for message in client.receive_response():
        if isinstance(message, AssistantMessage):
            for block in message.content:
                if isinstance(block, TextBlock):
                    print(block.text)

    await client.disconnect()

asyncio.run(main())
```

## SDKClient

`SDKClient` supports bidirectional conversations with an LLM via OpenCode. It works in two transport modes:

- **HTTP mode** — communicates with a running `opencode serve` instance over REST
- **Subprocess mode** — spawns `opencode acp` locally and communicates over stdio JSON-RPC

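The selection between the two modes follows from whether `server_url` is set (see Subprocess Mode below). A tiny sketch makes the rule explicit; `choose_transport` is a hypothetical helper for illustration, not part of the SDK:

```python
def choose_transport(server_url: str) -> str:
    """Pick a transport mode as the docs describe:
    HTTP when a server URL is configured, subprocess otherwise."""
    return "http" if server_url else "subprocess"

print(choose_transport("http://localhost:54321"))  # http
print(choose_transport(""))                        # subprocess
```
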
### HTTP Mode (recommended)

Start the server, then connect:

```bash
docker compose up -d  # starts opencode serve on port 54321
```

```python
from opencode_agent_sdk import SDKClient, AgentOptions

client = SDKClient(options=AgentOptions(
    model="claude-haiku-4-5",
    server_url="http://localhost:54321",
    system_prompt="You are a helpful assistant",
))

await client.connect()
await client.query("Hello!")

async for msg in client.receive_response():
    print(msg)

await client.disconnect()
```

### Subprocess Mode

When `server_url` is not set, the SDK spawns `opencode acp` as a child process:

```python
client = SDKClient(options=AgentOptions(
    cwd="/path/to/project",
    model="claude-haiku-4-5",
))
```

### Resuming Sessions

```python
options = AgentOptions(
    resume="session-id-from-previous-run",
    server_url="http://localhost:54321",
)
```

## AgentOptions

| Field | Type | Default | Description |
|-------|------|---------|-------------|
| `cwd` | `str` | `"."` | Working directory |
| `model` | `str` | `""` | Model identifier (e.g. `"claude-haiku-4-5"`) |
| `provider_id` | `str` | `"anthropic"` | Provider identifier |
| `system_prompt` | `str` | `""` | System prompt for the LLM |
| `server_url` | `str` | `""` | OpenCode server URL; enables HTTP mode when set |
| `mcp_servers` | `dict` | `{}` | MCP server configurations |
| `allowed_tools` | `list[str]` | `[]` | Tools the agent is allowed to use |
| `permission_mode` | `str` | `""` | Permission mode for tool execution |
| `hooks` | `dict` | `{}` | Hook matchers keyed by event type |
| `max_turns` | `int` | `100` | Maximum conversation turns |
| `resume` | `str \| None` | `None` | Session ID to resume |

## Custom Tools (MCP Servers)

Define tools as Python functions and expose them as in-process MCP servers:

```python
from opencode_agent_sdk import tool, create_sdk_mcp_server, SDKClient, AgentOptions

@tool("greet", "Greet a user", {"type": "object", "properties": {"name": {"type": "string"}}})
def greet_user(args):
    return {"content": [{"type": "text", "text": f"Hello, {args['name']}!"}]}

server = create_sdk_mcp_server("my-tools", tools=[greet_user])

client = SDKClient(options=AgentOptions(
    mcp_servers={"my-tools": server},
    allowed_tools=["mcp__my-tools__greet"],
    server_url="http://localhost:54321",
))
```

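Note the `allowed_tools` entry: the example implies that tools registered through an MCP server are addressed as `mcp__<server>__<tool>`. A small helper (hypothetical, not part of the SDK) makes the convention explicit:

```python
def qualified_tool_name(server_name: str, tool_name: str) -> str:
    # Mirrors the naming seen in the example above: mcp__<server>__<tool>.
    return f"mcp__{server_name}__{tool_name}"

print(qualified_tool_name("my-tools", "greet"))  # mcp__my-tools__greet
```
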
You can mix in-process SDK servers with external MCP servers:

```python
options = AgentOptions(
    mcp_servers={
        "internal": sdk_server,  # in-process SDK server
        "external": {            # external stdio server
            "command": "external-server",
            "args": ["--port", "8080"],
        },
    }
)
```

## Hooks

Hooks let you intercept and control tool execution. They run deterministically at specific points in the agent loop.

```python
from opencode_agent_sdk import SDKClient, AgentOptions, HookMatcher

async def check_bash_command(input_data, tool_use_id, context):
    tool_input = input_data["tool_input"]
    command = tool_input.get("command", "")

    if "rm -rf" in command:
        return {
            "hookSpecificOutput": {
                "hookEventName": "PreToolUse",
                "permissionDecision": "deny",
                "permissionDecisionReason": "Destructive command blocked",
            }
        }
    return {}

options = AgentOptions(
    allowed_tools=["Bash"],
    hooks={
        "PreToolUse": [
            HookMatcher(matcher="Bash", hooks=[check_bash_command]),
        ],
    },
    server_url="http://localhost:54321",
)

client = SDKClient(options=options)
await client.connect()
await client.query("Run: echo hello")

async for msg in client.receive_response():
    print(msg)

await client.disconnect()
```

Hook event types: `"PreToolUse"`, `"Stop"`

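The deny decision above does not need a live agent to exercise: the same PreToolUse logic can be factored into a pure function and unit-tested on its own. This is a sketch, with the payload shape taken from the example above:

```python
def bash_pretooluse_decision(tool_input: dict) -> dict:
    """Return a PreToolUse deny payload for destructive commands, else {}."""
    command = tool_input.get("command", "")
    if "rm -rf" in command:
        return {
            "hookSpecificOutput": {
                "hookEventName": "PreToolUse",
                "permissionDecision": "deny",
                "permissionDecisionReason": "Destructive command blocked",
            }
        }
    return {}

print(bash_pretooluse_decision({"command": "rm -rf /tmp/x"}))   # deny payload
print(bash_pretooluse_decision({"command": "echo hello"}))      # {}
```
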
## Types

See [src/opencode_agent_sdk/types.py](src/opencode_agent_sdk/types.py) for complete type definitions:

- `AssistantMessage` — LLM response containing `TextBlock` and/or `ToolUseBlock`
- `ResultMessage` — Final message with usage stats, cost, and session info
- `SystemMessage` — Internal events (init, tool results, thoughts)
- `TextBlock` — Text content from the LLM
- `ToolUseBlock` — Tool invocation with name and input
- `HookMatcher` — Matches tool names to hook functions

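A typical consumer dispatches on these types with `isinstance`, as the Quick Start does. The sketch below uses stand-in dataclasses (not the SDK's real classes) purely to show the dispatch shape:

```python
from dataclasses import dataclass

# Stand-ins for the SDK types listed above; the real ones live in types.py.
@dataclass
class TextBlock:
    text: str

@dataclass
class ToolUseBlock:
    name: str
    input: dict

@dataclass
class AssistantMessage:
    content: list

def render(message) -> list[str]:
    """Flatten an assistant message into printable lines."""
    lines = []
    if isinstance(message, AssistantMessage):
        for block in message.content:
            if isinstance(block, TextBlock):
                lines.append(block.text)
            elif isinstance(block, ToolUseBlock):
                lines.append(f"[tool: {block.name}]")
    return lines

msg = AssistantMessage(content=[TextBlock("hi"), ToolUseBlock("Bash", {"command": "ls"})])
print(render(msg))  # ['hi', '[tool: Bash]']
```
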
## Error Handling

```python
from opencode_agent_sdk._errors import ProcessError

try:
    await client.connect()
except ProcessError as e:
    print(f"Failed with exit code: {e.exit_code}")
```

## Migrating from claude_agent_sdk

This SDK mirrors the `claude_agent_sdk` API. Migration requires only renaming imports:

```python
# Before (claude_agent_sdk)
from claude_agent_sdk import (
    ClaudeAgentOptions, ClaudeSDKClient, AssistantMessage,
    ResultMessage, SystemMessage, TextBlock, ToolUseBlock, HookMatcher,
)
from claude_agent_sdk._errors import ProcessError

# After (opencode_agent_sdk)
from opencode_agent_sdk import (
    AgentOptions, SDKClient, AssistantMessage,
    ResultMessage, SystemMessage, TextBlock, ToolUseBlock, HookMatcher,
)
from opencode_agent_sdk._errors import ProcessError
```

All method calls, message types, hooks, and tool decorators stay the same. Only the class names change:

| claude_agent_sdk | opencode_agent_sdk |
|------------------|--------------------|
| `ClaudeSDKClient` | `SDKClient` |
| `ClaudeAgentOptions` | `AgentOptions` |

## Demo: End-to-End Walkthrough

A full working demo that connects to `opencode serve`, sends a prompt to clone a GitHub repo, and streams the LLM response back through the SDK.

### 1. Configure API keys

Create a `.env` file in the project root with your provider key:

```bash
ANTHROPIC_API_KEY=sk-ant-...
```

### 2. Start the server

```bash
docker compose up -d opencode
```

This builds and starts the `opencode serve` container on port 54321.

### 3. Install dependencies

```bash
uv sync
```

### 4. Run the E2E demo

```bash
uv run python scripts/e2e_test.py
```

### Expected output

```
============================================================
E2E Test: Clone repo & explain project
============================================================
Server: http://127.0.0.1:54321

[*] Connecting ...
[*] Connected.

[>] Prompt:
    Clone the repo https://github.com/dingkwang/opencode-agent-sdk-python and then
    explain what the project does. Give a concise summary of its purpose,
    architecture, and key components.

------------------------------------------------------------

[system:init]
[system:step_start]

[assistant]
## opencode-agent-sdk-python
### Purpose
An open-source Python SDK that serves as a drop-in replacement for Anthropic's
proprietary `claude_agent_sdk`. It delegates all LLM work to OpenCode — an
open-source headless server that supports any provider ...
...

============================================================
[result] session  = ses_...
         cost     = $0.024723
         turns    = 1
         is_error = False
============================================================

[*] Message counts: {'system': 2, 'assistant': 1, 'result': 1}
[*] E2E test complete.
```

### What's happening

1. `SDKClient` creates an HTTP session against `opencode serve`
2. `query()` sends the user prompt via `POST /session/{id}/message`
3. `receive_response()` yields typed messages: `SystemMessage` (init, step events), `AssistantMessage` (LLM text/tool calls), and `ResultMessage` (cost, session ID, turn count)
4. `disconnect()` cleans up the session

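Step 2's endpoint can be spelled out as a string template. `message_url` is a hypothetical helper showing how the documented path is assembled, not an SDK function:

```python
def message_url(base_url: str, session_id: str) -> str:
    # POST /session/{id}/message, per the flow described above.
    return f"{base_url.rstrip('/')}/session/{session_id}/message"

print(message_url("http://localhost:54321", "ses_123"))
# http://localhost:54321/session/ses_123/message
```
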
### Customizing the demo

Set a custom server URL via environment variable:

```bash
OPENCODE_SERVER_URL=http://your-host:54321 uv run python scripts/e2e_test.py
```

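Inside a script, that override is a stdlib one-liner; the fallback below matches the server address shown in the expected output, though the exact handling in `scripts/e2e_test.py` may differ:

```python
import os

# Fall back to the local server address when no override is set.
server_url = os.environ.get("OPENCODE_SERVER_URL", "http://127.0.0.1:54321")
print(server_url)
```
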
## Running with Docker

```bash
# Start opencode serve
docker compose up -d

# Run the integration test
docker compose run --rm test
```

The Docker setup uses `opencode-ai` v1.2.6 and exposes the REST API on port 54321. Pass provider API keys via `.env` (e.g. `ANTHROPIC_API_KEY`).

## Development

```bash
# Install dependencies
uv sync

# Run tests
uv run pytest

# Run the demo against a running opencode serve
uv run python scripts/opencode_ai_demo.py

# Interactive multi-turn chat
uv run python scripts/chat.py
```

## License

MIT