python-durable 0.1.1__tar.gz → 0.2.1__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (26)
  1. python_durable-0.2.1/PKG-INFO +290 -0
  2. python_durable-0.2.1/README.md +255 -0
  3. python_durable-0.2.1/examples/pydantic_ai_example.py +258 -0
  4. {python_durable-0.1.1 → python_durable-0.2.1}/examples/redis_example.py +1 -2
  5. {python_durable-0.1.1 → python_durable-0.2.1}/pyproject.toml +5 -1
  6. {python_durable-0.1.1 → python_durable-0.2.1}/src/durable/__init__.py +27 -1
  7. python_durable-0.2.1/src/durable/pydantic_ai.py +342 -0
  8. {python_durable-0.1.1 → python_durable-0.2.1}/src/durable/redis_store.py +1 -1
  9. python_durable-0.2.1/tests/test_pydantic_ai.py +344 -0
  10. python_durable-0.1.1/PKG-INFO +0 -137
  11. python_durable-0.1.1/README.md +0 -105
  12. python_durable-0.1.1/examples/pydantic_ai_example.py +0 -162
  13. {python_durable-0.1.1 → python_durable-0.2.1}/.github/workflows/publish.yml +0 -0
  14. {python_durable-0.1.1 → python_durable-0.2.1}/.gitignore +0 -0
  15. {python_durable-0.1.1 → python_durable-0.2.1}/LICENSE +0 -0
  16. {python_durable-0.1.1 → python_durable-0.2.1}/examples/approval.py +0 -0
  17. {python_durable-0.1.1 → python_durable-0.2.1}/examples/examples.py +0 -0
  18. {python_durable-0.1.1 → python_durable-0.2.1}/examples/in_memory_example.py +0 -0
  19. {python_durable-0.1.1 → python_durable-0.2.1}/src/durable/backoff.py +0 -0
  20. {python_durable-0.1.1 → python_durable-0.2.1}/src/durable/context.py +0 -0
  21. {python_durable-0.1.1 → python_durable-0.2.1}/src/durable/store.py +0 -0
  22. {python_durable-0.1.1 → python_durable-0.2.1}/src/durable/workflow.py +0 -0
  23. {python_durable-0.1.1 → python_durable-0.2.1}/tests/__init__.py +0 -0
  24. {python_durable-0.1.1 → python_durable-0.2.1}/tests/test_durable.py +0 -0
  25. {python_durable-0.1.1 → python_durable-0.2.1}/tests/test_redis_store.py +0 -0
  26. {python_durable-0.1.1 → python_durable-0.2.1}/tests/test_signals.py +0 -0
@@ -0,0 +1,290 @@
+ Metadata-Version: 2.4
+ Name: python-durable
+ Version: 0.2.1
+ Summary: Lightweight workflow durability for Python — make any async workflow resumable after crashes with just a decorator.
+ Project-URL: Repository, https://github.com/WillemDeGroef/python-durable
+ Author: Willem
+ License-Expression: MIT
+ License-File: LICENSE
+ Classifier: Development Status :: 3 - Alpha
+ Classifier: Framework :: AsyncIO
+ Classifier: Intended Audience :: Developers
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Programming Language :: Python :: 3.13
+ Classifier: Typing :: Typed
+ Requires-Python: >=3.12
+ Requires-Dist: aiosqlite>=0.20
+ Provides-Extra: dev
+ Requires-Dist: fakeredis>=2.26; extra == 'dev'
+ Requires-Dist: pydantic-ai>=0.1; extra == 'dev'
+ Requires-Dist: pytest-asyncio>=0.24; extra == 'dev'
+ Requires-Dist: pytest>=8.0; extra == 'dev'
+ Requires-Dist: redis>=5.0; extra == 'dev'
+ Requires-Dist: ruff>=0.9; extra == 'dev'
+ Requires-Dist: ty>=0.0.1a7; extra == 'dev'
+ Provides-Extra: examples
+ Requires-Dist: pydantic-ai>=0.1; extra == 'examples'
+ Requires-Dist: pydantic>=2.0; extra == 'examples'
+ Provides-Extra: pydantic-ai
+ Requires-Dist: pydantic-ai>=0.1; extra == 'pydantic-ai'
+ Provides-Extra: redis
+ Requires-Dist: redis>=5.0; extra == 'redis'
+ Description-Content-Type: text/markdown
+
+ # durable
+
+ Lightweight workflow durability for Python. Make any async workflow resumable after crashes with just a decorator.
+
+ Backed by SQLite out of the box; swap in Redis or any `Store` subclass for production.
+
+ ## Install
+
+ ```bash
+ pip install python-durable
+
+ # With Redis support
+ pip install python-durable[redis]
+
+ # With Pydantic AI integration
+ pip install python-durable[pydantic-ai]
+ ```
+
+ ## Quick start
+
+ ```python
+ import httpx
+
+ from durable import Workflow
+ from durable.backoff import exponential
+
+ wf = Workflow("my-app")
+
+ @wf.task(retries=3, backoff=exponential(base=2, max=60))
+ async def fetch_data(url: str) -> dict:
+     async with httpx.AsyncClient() as client:
+         return (await client.get(url)).json()
+
+ @wf.task
+ async def save_result(data: dict) -> None:
+     await db.insert(data)  # db: your application's database client
+
+ @wf.workflow(id="pipeline-{source}")
+ async def run_pipeline(source: str) -> None:
+     data = await fetch_data(f"https://api.example.com/{source}")
+     await save_result(data)
+
+ # First call: runs all steps and checkpoints each one.
+ # If it crashes and you call it again with the same args,
+ # completed steps are replayed from SQLite instantly.
+ await run_pipeline(source="users")
+ ```
+
+ ## How it works
+
+ 1. **`@wf.task`** wraps an async function with checkpoint + retry logic. When called inside a workflow, results are persisted to the store. On re-run, completed steps return their cached result without re-executing.
+
+ 2. **`@wf.workflow`** marks the entry point of a durable run. It manages a `RunContext` (via `ContextVar`) so tasks automatically know which run they belong to. The `id` parameter is a template string resolved from function arguments at call time.
+
+ 3. **`Store`** is the persistence backend. `SQLiteStore` is the default (zero config, backed by aiosqlite). `RedisStore` is available for distributed setups. Subclass `Store` to use Postgres or anything else.
+
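The `id` template resolution in point 2 can be illustrated in plain Python. This is a hypothetical sketch of how a template like `"pipeline-{source}"` might be resolved against a function's bound arguments; `resolve_run_id` is an invented helper name, not part of the library.

```python
import inspect

# Hypothetical helper (not python-durable's code): resolve an id template
# against the arguments a function was called with.
def resolve_run_id(template: str, func, *args, **kwargs) -> str:
    bound = inspect.signature(func).bind(*args, **kwargs)
    bound.apply_defaults()
    return template.format(**bound.arguments)

async def run_pipeline(source: str) -> None: ...

print(resolve_run_id("pipeline-{source}", run_pipeline, source="users"))
# pipeline-users
```

Two calls with the same arguments map to the same run id, which is what lets a re-run find its earlier checkpoints.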
+ ## Features
+
+ - **Crash recovery** — completed steps are never re-executed after a restart
+ - **Automatic retries** — configurable per-task with `exponential`, `linear`, or `constant` backoff
+ - **Signals** — durably wait for external input (approvals, webhooks, human-in-the-loop)
+ - **Loop support** — use `step_id` to checkpoint each iteration independently
+ - **Zero magic outside workflows** — tasks work as plain async functions when called without a workflow context
+ - **Pluggable storage** — SQLite by default, Redis built-in, or bring your own `Store`
+
+ ## Signals
+
+ Workflows can pause and wait for external input using signals:
+
+ ```python
+ @wf.workflow(id="order-{order_id}")
+ async def process_order(order_id: str) -> None:
+     await prepare_order(order_id)
+     approval = await wf.signal("manager-approval")  # pauses here
+     if approval["approved"]:
+         await ship_order(order_id)
+
+ # From the outside (e.g. a web handler):
+ await wf.complete("order-42", "manager-approval", {"approved": True})
+ ```
+
+ Signals are durable — if the workflow crashes and restarts, a previously delivered signal replays instantly.
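The replay behavior can be sketched with a toy in-memory signal box. This illustrates the semantics only and assumes nothing about python-durable's internals; `SignalBox` is an invented name, and real durability would persist the payload to the store rather than a dict.

```python
import asyncio

class SignalBox:
    """Toy model: a delivered payload is kept, so a waiter that shows up
    later (e.g. after a restart) returns immediately instead of blocking."""

    def __init__(self):
        self._payloads: dict = {}
        self._events: dict = {}

    async def wait(self, name: str):
        if name in self._payloads:          # already delivered: replay instantly
            return self._payloads[name]
        ev = self._events.setdefault(name, asyncio.Event())
        await ev.wait()
        return self._payloads[name]

    def complete(self, name: str, payload):
        self._payloads[name] = payload
        self._events.setdefault(name, asyncio.Event()).set()

async def demo():
    box = SignalBox()
    waiter = asyncio.create_task(box.wait("manager-approval"))
    await asyncio.sleep(0)                  # let the waiter block first
    box.complete("manager-approval", {"approved": True})
    first = await waiter
    second = await box.wait("manager-approval")  # instant replay
    return first, second

print(asyncio.run(demo()))
```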
+
+ ## Redis store
+
+ For distributed or multi-process setups, use `RedisStore` instead of the default SQLite:
+
+ ```python
+ from durable import Workflow, RedisStore
+
+ store = RedisStore(url="redis://localhost:6379/0", ttl=86400)
+ wf = Workflow("my-app", db=store)
+ ```
+
+ Keys auto-expire based on `ttl`, in seconds (default: 24 hours).
+
+ ## Backoff strategies
+
+ ```python
+ from durable.backoff import exponential, linear, constant
+
+ @wf.task(retries=5, backoff=exponential(base=2, max=60))  # 2s, 4s, 8s, 16s, 32s
+ async def exp_task(): ...
+
+ @wf.task(retries=3, backoff=linear(start=2, step=3))  # 2s, 5s, 8s
+ async def linear_task(): ...
+
+ @wf.task(retries=3, backoff=constant(5))  # 5s, 5s, 5s
+ async def const_task(): ...
+ ```
+
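The schedules in the comments above imply simple delay arithmetic: exponential raises `base` to the attempt number (capped at `max`), linear adds `step` per retry, constant is fixed. The sketch below reproduces those numbers; it is an inference from the comments, not the library's implementation, and the function names are invented.

```python
def exponential_delays(base, max_delay, retries):
    # attempt 1 → base**1, attempt 2 → base**2, ... capped at max_delay
    return [min(base ** (i + 1), max_delay) for i in range(retries)]

def linear_delays(start, step, retries):
    return [start + i * step for i in range(retries)]

def constant_delays(delay, retries):
    return [delay] * retries

print(exponential_delays(2, 60, 5))  # [2, 4, 8, 16, 32]
print(linear_delays(2, 3, 3))        # [2, 5, 8]
print(constant_delays(5, 3))         # [5, 5, 5]
```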
+ ## Loops with step_id
+
+ When calling the same task in a loop, pass `step_id` so each iteration is checkpointed independently:
+
+ ```python
+ @wf.workflow(id="batch-{batch_id}")
+ async def process_batch(batch_id: str) -> None:
+     for i, item in enumerate(items):
+         await process_item(item, step_id=f"item-{i}")
+ ```
+
+ If the workflow crashes mid-loop, only the remaining items are processed on restart.
+
+ ## Pydantic AI integration
+
+ Make any [pydantic-ai](https://ai.pydantic.dev) agent durable with **zero infrastructure** — no Temporal server, no Prefect cloud, no Postgres. Just decorators and a SQLite file.
+
+ Pydantic AI natively supports three durable execution backends: **Temporal**, **DBOS**, and **Prefect**. All three require external infrastructure. `python-durable` is a fourth option that trades scale for simplicity:
+
+ | Feature | Temporal | DBOS | Prefect | **python-durable** |
+ |---------|----------|------|---------|--------------------|
+ | Infrastructure | Server + Worker | Postgres | Cloud/Server | **SQLite file** |
+ | Setup | Complex | Moderate | Moderate | **`pip install`** |
+ | Lines to wrap an agent | ~20 | ~10 | ~10 | **1** |
+ | Crash recovery | Yes | Yes | Yes | Yes |
+ | Retries + backoff | Yes | Yes | Yes | Yes |
+ | Human-in-the-loop signals | Yes | No | No | Yes |
+ | Multi-process / distributed | Yes | Yes | Yes | No (single process) |
+ | Production scale | Enterprise | Production | Production | **Dev / SME / CLI** |
+
+ **Best for:** prototyping, CLI tools, single-process services, SME deployments, and any situation where you want durable agents without ops overhead.
+
+ ### DurableAgent
+
+ ```python
+ from pydantic_ai import Agent
+ from durable import Workflow
+ from durable.pydantic_ai import DurableAgent, TaskConfig
+ from durable.backoff import exponential
+
+ wf = Workflow("my-app")
+ agent = Agent("openai:gpt-5.2", instructions="Be helpful.", name="assistant")
+
+ durable_agent = DurableAgent(agent, wf)
+
+ result = await durable_agent.run("What is the capital of France?")
+ print(result.output)  # Paris
+
+ # Same run_id after crash → replayed from SQLite, no LLM call
+ result = await durable_agent.run("What is the capital of France?", run_id="same-id")
+ ```
+
+ With custom retry config:
+
+ ```python
+ durable_agent = DurableAgent(
+     agent,
+     wf,
+     model_task_config=TaskConfig(retries=5, backoff=exponential(base=2, max=120)),
+     tool_task_config=TaskConfig(retries=3),
+ )
+ ```
+
+ ### @durable_tool
+
+ Make individual tool functions durable:
+
+ ```python
+ import httpx
+
+ from durable.pydantic_ai import durable_tool
+
+ @durable_tool(wf, retries=3, backoff=exponential(base=2, max=60))
+ async def web_search(query: str) -> str:
+     async with httpx.AsyncClient() as client:
+         return (await client.get(f"https://api.search.com?q={query}")).text
+ ```
+
+ ### @durable_pipeline
+
+ Multi-agent workflows with per-step checkpointing. On crash, completed steps replay from the store and only remaining work executes:
+
+ ```python
+ from durable.pydantic_ai import durable_pipeline
+
+ @durable_pipeline(wf, id="research-{topic_id}")
+ async def research(topic_id: str, topic: str) -> str:
+     plan = await plan_research(topic)
+     findings = []
+     for i, query in enumerate(plan["queries"]):
+         r = await search(query, step_id=f"q-{i}")
+         findings.append(r)
+     return await summarize(findings)
+ ```
+
+ ### Comparison with Temporal
+
+ ```python
+ # Temporal — requires server + worker + plugin
+ from temporalio import workflow
+ from pydantic_ai.durable_exec.temporal import TemporalAgent
+
+ temporal_agent = TemporalAgent(agent)
+
+ @workflow.defn
+ class MyWorkflow:
+     @workflow.run
+     async def run(self, prompt: str):
+         return await temporal_agent.run(prompt)
+
+ # python-durable
+ from durable import Workflow
+ from durable.pydantic_ai import DurableAgent
+
+ wf = Workflow("my-app")
+ durable_agent = DurableAgent(agent, wf)
+ result = await durable_agent.run("Hello")
+ ```
+
+ ### Caveats
+
+ - **Tool functions** registered on the pydantic-ai agent are NOT automatically wrapped. If they perform I/O, decorate them with `@durable_tool(wf)` or `@wf.task`.
+ - **Streaming** (`agent.run_stream()`) is not supported in durable mode (the same limitation as DBOS). Use `agent.run()`.
+ - **Single process** — unlike Temporal/DBOS, python-durable runs in-process. For distributed workloads, use the Redis store.
+
+ See [`examples/pydantic_ai_example.py`](examples/pydantic_ai_example.py) for five complete patterns.
269
+
270
+ ## Important: JSON serialization
271
+
272
+ Task return values must be JSON-serializable (dicts, lists, strings, numbers, booleans, `None`). The store uses `json.dumps` internally.
273
+
274
+ For Pydantic models, return `.model_dump()` from tasks and reconstruct with `.model_validate()` downstream:
275
+
276
+ ```python
277
+ @wf.task
278
+ async def validate_invoice(draft: InvoiceDraft) -> dict:
279
+ validated = ValidatedInvoice(...)
280
+ return validated.model_dump()
281
+
282
+ @wf.task
283
+ async def book_invoice(data: dict) -> dict:
284
+ invoice = ValidatedInvoice.model_validate(data)
285
+ ...
286
+ ```
287
+
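The JSON constraint is easy to check up front. The snippet below shows a round-trip that survives `json.dumps` and one that does not (a `datetime` raises `TypeError`); it uses only the standard library and assumes nothing about python-durable itself.

```python
import json
from datetime import datetime

# JSON-native types round-trip cleanly, so this is safe to return from a task.
ok = {"id": 42, "tags": ["a", "b"], "done": None}
assert json.loads(json.dumps(ok)) == ok

# Non-JSON types fail fast: convert (e.g. with .isoformat()) before returning.
try:
    json.dumps({"created": datetime(2024, 1, 1)})
except TypeError:
    print("not JSON-serializable: convert before returning")
```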
+ ## License
+
+ MIT