soe-ai 0.1.0-py3-none-any.whl → 0.1.2-py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- soe/broker.py +4 -5
- soe/builtin_tools/__init__.py +39 -0
- soe/builtin_tools/soe_add_signal.py +82 -0
- soe/builtin_tools/soe_call_tool.py +111 -0
- soe/builtin_tools/soe_copy_context.py +80 -0
- soe/builtin_tools/soe_explore_docs.py +290 -0
- soe/builtin_tools/soe_get_available_tools.py +42 -0
- soe/builtin_tools/soe_get_context.py +50 -0
- soe/builtin_tools/soe_get_workflows.py +63 -0
- soe/builtin_tools/soe_inject_node.py +86 -0
- soe/builtin_tools/soe_inject_workflow.py +105 -0
- soe/builtin_tools/soe_list_contexts.py +73 -0
- soe/builtin_tools/soe_remove_node.py +72 -0
- soe/builtin_tools/soe_remove_workflow.py +62 -0
- soe/builtin_tools/soe_update_context.py +54 -0
- soe/docs/_config.yml +10 -0
- soe/docs/advanced_patterns/guide_fanout_and_aggregations.md +318 -0
- soe/docs/advanced_patterns/guide_inheritance.md +435 -0
- soe/docs/advanced_patterns/hybrid_intelligence.md +237 -0
- soe/docs/advanced_patterns/index.md +49 -0
- soe/docs/advanced_patterns/operational.md +781 -0
- soe/docs/advanced_patterns/self_evolving_workflows.md +385 -0
- soe/docs/advanced_patterns/swarm_intelligence.md +211 -0
- soe/docs/builtins/context.md +164 -0
- soe/docs/builtins/explore_docs.md +135 -0
- soe/docs/builtins/tools.md +164 -0
- soe/docs/builtins/workflows.md +199 -0
- soe/docs/guide_00_getting_started.md +341 -0
- soe/docs/guide_01_tool.md +206 -0
- soe/docs/guide_02_llm.md +143 -0
- soe/docs/guide_03_router.md +146 -0
- soe/docs/guide_04_patterns.md +475 -0
- soe/docs/guide_05_agent.md +159 -0
- soe/docs/guide_06_schema.md +397 -0
- soe/docs/guide_07_identity.md +540 -0
- soe/docs/guide_08_child.md +612 -0
- soe/docs/guide_09_ecosystem.md +690 -0
- soe/docs/guide_10_infrastructure.md +427 -0
- soe/docs/guide_11_builtins.md +118 -0
- soe/docs/index.md +104 -0
- soe/docs/primitives/backends.md +281 -0
- soe/docs/primitives/context.md +256 -0
- soe/docs/primitives/node_reference.md +259 -0
- soe/docs/primitives/primitives.md +331 -0
- soe/docs/primitives/signals.md +865 -0
- soe/docs_index.py +1 -1
- soe/init.py +2 -2
- soe/lib/__init__.py +0 -0
- soe/lib/child_context.py +46 -0
- soe/lib/context_fields.py +51 -0
- soe/lib/inheritance.py +172 -0
- soe/lib/jinja_render.py +113 -0
- soe/lib/operational.py +51 -0
- soe/lib/parent_sync.py +71 -0
- soe/lib/register_event.py +75 -0
- soe/lib/schema_validation.py +134 -0
- soe/lib/yaml_parser.py +14 -0
- soe/local_backends/__init__.py +18 -0
- soe/local_backends/factory.py +124 -0
- soe/local_backends/in_memory/context.py +38 -0
- soe/local_backends/in_memory/conversation_history.py +60 -0
- soe/local_backends/in_memory/identity.py +52 -0
- soe/local_backends/in_memory/schema.py +40 -0
- soe/local_backends/in_memory/telemetry.py +38 -0
- soe/local_backends/in_memory/workflow.py +33 -0
- soe/local_backends/storage/context.py +57 -0
- soe/local_backends/storage/conversation_history.py +82 -0
- soe/local_backends/storage/identity.py +118 -0
- soe/local_backends/storage/schema.py +96 -0
- soe/local_backends/storage/telemetry.py +72 -0
- soe/local_backends/storage/workflow.py +56 -0
- soe/nodes/__init__.py +13 -0
- soe/nodes/agent/__init__.py +10 -0
- soe/nodes/agent/factory.py +134 -0
- soe/nodes/agent/lib/loop_handlers.py +150 -0
- soe/nodes/agent/lib/loop_state.py +157 -0
- soe/nodes/agent/lib/prompts.py +65 -0
- soe/nodes/agent/lib/tools.py +35 -0
- soe/nodes/agent/stages/__init__.py +12 -0
- soe/nodes/agent/stages/parameter.py +37 -0
- soe/nodes/agent/stages/response.py +54 -0
- soe/nodes/agent/stages/router.py +37 -0
- soe/nodes/agent/state.py +111 -0
- soe/nodes/agent/types.py +66 -0
- soe/nodes/agent/validation/__init__.py +11 -0
- soe/nodes/agent/validation/config.py +95 -0
- soe/nodes/agent/validation/operational.py +24 -0
- soe/nodes/child/__init__.py +3 -0
- soe/nodes/child/factory.py +61 -0
- soe/nodes/child/state.py +59 -0
- soe/nodes/child/validation/__init__.py +11 -0
- soe/nodes/child/validation/config.py +126 -0
- soe/nodes/child/validation/operational.py +28 -0
- soe/nodes/lib/conditions.py +71 -0
- soe/nodes/lib/context.py +24 -0
- soe/nodes/lib/conversation_history.py +77 -0
- soe/nodes/lib/identity.py +64 -0
- soe/nodes/lib/llm_resolver.py +142 -0
- soe/nodes/lib/output.py +68 -0
- soe/nodes/lib/response_builder.py +91 -0
- soe/nodes/lib/signal_emission.py +79 -0
- soe/nodes/lib/signals.py +54 -0
- soe/nodes/lib/tools.py +100 -0
- soe/nodes/llm/__init__.py +7 -0
- soe/nodes/llm/factory.py +103 -0
- soe/nodes/llm/state.py +76 -0
- soe/nodes/llm/types.py +12 -0
- soe/nodes/llm/validation/__init__.py +11 -0
- soe/nodes/llm/validation/config.py +89 -0
- soe/nodes/llm/validation/operational.py +23 -0
- soe/nodes/router/__init__.py +3 -0
- soe/nodes/router/factory.py +37 -0
- soe/nodes/router/state.py +32 -0
- soe/nodes/router/validation/__init__.py +11 -0
- soe/nodes/router/validation/config.py +58 -0
- soe/nodes/router/validation/operational.py +16 -0
- soe/nodes/tool/factory.py +66 -0
- soe/nodes/tool/lib/__init__.py +11 -0
- soe/nodes/tool/lib/conditions.py +35 -0
- soe/nodes/tool/lib/failure.py +28 -0
- soe/nodes/tool/lib/parameters.py +67 -0
- soe/nodes/tool/state.py +66 -0
- soe/nodes/tool/types.py +27 -0
- soe/nodes/tool/validation/__init__.py +15 -0
- soe/nodes/tool/validation/config.py +132 -0
- soe/nodes/tool/validation/operational.py +16 -0
- soe/types.py +40 -28
- soe/validation/__init__.py +18 -0
- soe/validation/config.py +195 -0
- soe/validation/jinja.py +54 -0
- soe/validation/operational.py +110 -0
- {soe_ai-0.1.0.dist-info → soe_ai-0.1.2.dist-info}/METADATA +72 -9
- soe_ai-0.1.2.dist-info/RECORD +137 -0
- {soe_ai-0.1.0.dist-info → soe_ai-0.1.2.dist-info}/WHEEL +1 -1
- soe/validation.py +0 -8
- soe_ai-0.1.0.dist-info/RECORD +0 -11
- {soe_ai-0.1.0.dist-info → soe_ai-0.1.2.dist-info}/licenses/LICENSE +0 -0
- {soe_ai-0.1.0.dist-info → soe_ai-0.1.2.dist-info}/top_level.txt +0 -0
@@ -0,0 +1,612 @@

# SOE Guide: Chapter 8 - Suborchestration

## Introduction to Child Nodes

**Child Nodes** spawn sub-workflows for modular composition. A parent workflow can delegate work to child workflows, receive signals back, and share context data.

### Why Suborchestration?

- **Modularity**: Break complex workflows into reusable components.
- **Isolation**: Child workflows have their own context namespace—no pollution of parent state.
- **Composition**: Build complex systems from simple building blocks.
- **Parallel Execution**: Multiple children can run concurrently.
- **Custom Agent Solvers**: Encapsulate agent logic as reusable sub-workflows.

### The Power of Isolation

Child workflows run in **isolated context**:

```
Parent Context: { user_id: "alice", request: "analyze data" }
        ↓ input_fields: [request]
Child Context:  { request: "analyze data" }   ← Only copied fields!
        ↓ child does work, creates temp_data, intermediate_results...
Child Context:  { request: "...", result: "done", temp_data: "...", ... }
        ↓ context_updates_to_parent: [result]
Parent Context: { user_id: "alice", request: "...", result: "done" }
```

The parent never sees `temp_data` or `intermediate_results`. This means:
- **No namespace collisions**: Child can use any field names.
- **Clean interfaces**: Explicit input/output contracts.
- **Testable units**: Test child workflows independently.
- **Safe experimentation**: Child failures don't corrupt parent state.

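As a rough sketch, the contract pictured above maps onto a child node configuration like the following (the node, workflow, and signal names here are illustrative, not taken from the package):

```yaml
parent_workflow:
  HandleRequest:                            # hypothetical node name
    node_type: child
    event_triggers: [START]
    child_workflow_name: request_workflow   # hypothetical child workflow
    child_initial_signals: [START]
    input_fields: [request]                 # parent → child: only `request` is copied in
    signals_to_parent: [REQUEST_DONE]       # illustrative signal name
    context_updates_to_parent: [result]     # child → parent: only `result` is copied back
```

Anything else the child writes, such as `temp_data` or `intermediate_results`, stays in the child's context.
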
### Domain-Specific Business Logic

Each child workflow can encapsulate its **own domain** with specialized logic:

```yaml
# E-commerce orchestration - each domain is a separate workflow
order_workflow:
  ProcessPayment:
    node_type: child
    event_triggers: [ORDER_PLACED]
    child_workflow_name: payment_domain        # Payment team owns this
    input_fields: [amount, payment_method]
    signals_to_parent: [PAYMENT_COMPLETE, PAYMENT_FAILED]
    context_updates_to_parent: [transaction_id]

  ShipOrder:
    node_type: child
    event_triggers: [PAYMENT_COMPLETE]
    child_workflow_name: shipping_domain       # Logistics team owns this
    input_fields: [items, address]
    signals_to_parent: [SHIPPED]
    context_updates_to_parent: [tracking_number]

  NotifyCustomer:
    node_type: child
    event_triggers: [SHIPPED]
    child_workflow_name: notification_domain   # Comms team owns this
    input_fields: [email, tracking_number]
    signals_to_parent: [NOTIFIED]
```

**Think in Steps, Not Monoliths:**

| Domain | Owns | Knows About |
|--------|------|-------------|
| `payment_domain` | Card processing, fraud detection, retries | Nothing about shipping |
| `shipping_domain` | Carrier selection, label generation, tracking | Nothing about payments |
| `notification_domain` | Email templates, SMS, push notifications | Nothing about logistics |

Each team develops, tests, and deploys their domain workflow independently. The parent orchestrator only knows the **interface**: input fields, output signals, context updates.

This enables:
- **Separation of concerns**: Payment logic doesn't leak into shipping.
- **Team autonomy**: Each team owns their workflow.
- **Independent evolution**: Update payment flow without touching shipping.
- **Domain expertise**: Specialists focus on their area.

## Your First Child Node

Spawn a simple child workflow:

### The Workflow

```yaml
parent_workflow:
  SpawnChild:
    node_type: child
    event_triggers: [START]
    child_workflow_name: child_workflow
    child_initial_signals: [START]
    signals_to_parent: [CHILD_DONE]

  ChildDoneHandler:
    node_type: router
    event_triggers: [CHILD_DONE]
    event_emissions:
      - signal_name: WORKFLOW_COMPLETE

child_workflow:
  DoWork:
    node_type: router
    event_triggers: [START]
    event_emissions:
      - signal_name: WORK_COMPLETE

  Finish:
    node_type: router
    event_triggers: [WORK_COMPLETE]
    event_emissions:
      - signal_name: CHILD_DONE
```

### Configuration

| Field | Required | Description |
|-------|----------|-------------|
| `child_workflow_name` | ✓ | Name of workflow to spawn |
| `child_initial_signals` | ✓ | Signals to start child with |
| `signals_to_parent` | Optional | Child signals that propagate to parent |
| `input_fields` | Optional | Context fields to copy to child |
| `context_updates_to_parent` | Optional | Child context fields to copy back |

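For reference, a single node that sets every field in the table might look like the sketch below (the node, workflow, signal, and field names are illustrative):

```yaml
SpawnReport:                              # hypothetical node name
  node_type: child
  event_triggers: [START]
  child_workflow_name: report_workflow    # required: the workflow to spawn
  child_initial_signals: [START]          # required: signals the child starts with
  input_fields: [request]                 # optional: copied parent → child
  signals_to_parent: [REPORT_DONE]        # optional: child signals surfaced to the parent
  context_updates_to_parent: [report]     # optional: copied child → parent
```
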
### How It Works

1. `SpawnChild` triggered by `START`.
2. Spawns `child_workflow` with `[START]` signal.
3. Child runs independently, emits `CHILD_DONE`.
4. `CHILD_DONE` propagates to parent via `signals_to_parent`.
5. `ChildDoneHandler` receives signal, emits `WORKFLOW_COMPLETE`.

## Passing Data to Child

Use `input_fields` to copy context to child:

### The Workflow

```yaml
parent_workflow:
  SpawnProcessor:
    node_type: child
    event_triggers: [START]
    child_workflow_name: processor_workflow
    child_initial_signals: [START]
    input_fields: [data_to_process]
    signals_to_parent: [PROCESSED]

  ProcessedHandler:
    node_type: router
    event_triggers: [PROCESSED]
    event_emissions:
      - signal_name: PARENT_COMPLETE

processor_workflow:
  ProcessData:
    node_type: router
    event_triggers: [START]
    event_emissions:
      - signal_name: PROCESSED
        condition: "{{ context.data_to_process is defined }}"
```

### Data Flow

1. Parent has `data_to_process: "important_data"` in context.
2. Child node specifies `input_fields: [data_to_process]`.
3. Child workflow receives copy of `data_to_process` in its context.
4. Child can use `{{ context.data_to_process }}` in conditions/prompts.

## Receiving Data from Child

Use `context_updates_to_parent` to propagate child data back:

### The Workflow

```yaml
parent_workflow:
  SpawnCalculator:
    node_type: child
    event_triggers: [START]
    child_workflow_name: calculator_workflow
    child_initial_signals: [START]
    input_fields: [calc_params]
    signals_to_parent: [CALCULATION_DONE]
    context_updates_to_parent: [result]

  ResultHandler:
    node_type: router
    event_triggers: [CALCULATION_DONE]
    event_emissions:
      - signal_name: PARENT_COMPLETE

calculator_workflow:
  Calculate:
    node_type: tool
    event_triggers: [START]
    tool_name: sum_numbers
    context_parameter_field: calc_params
    output_field: result
    event_emissions:
      - signal_name: CALCULATION_DONE
```

### Data Flow

1. Parent passes `calc_params` (e.g. containing `numbers: [1, 2, 3, 4, 5]`) to the child via `input_fields`.
2. Child runs the `sum_numbers` tool and stores the output in `result`.
3. `context_updates_to_parent: [result]` copies `result` back.
4. Parent context now has `result: 15`.

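In context terms, the round trip looks roughly like this; the top-level keys are just labels for the two snapshots, and the assumption that `calc_params` carries a `numbers` list consumed by `sum_numbers` is illustrative:

```yaml
parent_context_before_spawn:              # illustrative snapshot of the parent context
  calc_params:
    numbers: [1, 2, 3, 4, 5]

parent_context_after_calculation_done:    # after the child's update propagates back
  calc_params:
    numbers: [1, 2, 3, 4, 5]
  result: 15
```
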
## Child Continues After Callback

`signals_to_parent` is NOT a "done" signal—child can keep working!

### The Workflow

```yaml
parent_workflow:
  SpawnWorker:
    node_type: child
    event_triggers: [START]
    child_workflow_name: worker_workflow
    child_initial_signals: [START]
    signals_to_parent: [PROGRESS, COMPLETED]

  ProgressHandler:
    node_type: router
    event_triggers: [PROGRESS]
    event_emissions:
      - signal_name: PROGRESS_LOGGED

  CompleteHandler:
    node_type: router
    event_triggers: [COMPLETED]
    event_emissions:
      - signal_name: ALL_DONE

worker_workflow:
  Phase1:
    node_type: router
    event_triggers: [START]
    event_emissions:
      - signal_name: PHASE1_DONE

  ReportProgress:
    node_type: router
    event_triggers: [PHASE1_DONE]
    event_emissions:
      - signal_name: PROGRESS

  Phase2:
    node_type: router
    event_triggers: [PROGRESS]
    event_emissions:
      - signal_name: PHASE2_DONE

  ReportComplete:
    node_type: router
    event_triggers: [PHASE2_DONE]
    event_emissions:
      - signal_name: COMPLETED
```

### Execution Flow

1. Child starts, does Phase1.
2. Emits `PROGRESS` → propagates to parent → `PROGRESS_LOGGED`.
3. Child **continues** to Phase2 (not terminated).
4. Emits `COMPLETED` → propagates to parent → `ALL_DONE`.

This enables:
- Progress reporting
- Streaming updates
- Multi-phase child workflows

## Multiple Children (Parallel)

Multiple child nodes can spawn from the same trigger:

### The Workflow

```yaml
parent_workflow:
  SpawnWorkerA:
    node_type: child
    event_triggers: [START]
    child_workflow_name: worker_a
    child_initial_signals: [START]
    signals_to_parent: [A_DONE]

  SpawnWorkerB:
    node_type: child
    event_triggers: [START]
    child_workflow_name: worker_b
    child_initial_signals: [START]
    signals_to_parent: [B_DONE]

  WaitForBoth:
    node_type: router
    event_triggers: [A_DONE, B_DONE]
    event_emissions:
      - signal_name: ALL_WORKERS_DONE

worker_a:
  DoWorkA:
    node_type: router
    event_triggers: [START]
    event_emissions:
      - signal_name: A_DONE

worker_b:
  DoWorkB:
    node_type: router
    event_triggers: [START]
    event_emissions:
      - signal_name: B_DONE
```

### Execution

1. `START` triggers both `SpawnWorkerA` and `SpawnWorkerB`.
2. Both child workflows run concurrently.
3. `WaitForBoth` listens for `A_DONE` and `B_DONE`.
4. When both arrive, emits `ALL_WORKERS_DONE`.

Use this for:
- Fan-out/fan-in patterns
- Parallel processing
- Concurrent sub-tasks

## Nested Children (Grandchild)

Children can spawn their own children:

### The Workflow

```yaml
main_workflow:
  SpawnChild:
    node_type: child
    event_triggers: [START]
    child_workflow_name: child_workflow
    child_initial_signals: [START]
    signals_to_parent: [CHILD_COMPLETE]

  MainDone:
    node_type: router
    event_triggers: [CHILD_COMPLETE]
    event_emissions:
      - signal_name: MAIN_COMPLETE

child_workflow:
  SpawnGrandchild:
    node_type: child
    event_triggers: [START]
    child_workflow_name: grandchild_workflow
    child_initial_signals: [START]
    signals_to_parent: [GRANDCHILD_DONE]

  ChildDone:
    node_type: router
    event_triggers: [GRANDCHILD_DONE]
    event_emissions:
      - signal_name: CHILD_COMPLETE

grandchild_workflow:
  DoDeepWork:
    node_type: router
    event_triggers: [START]
    event_emissions:
      - signal_name: GRANDCHILD_DONE
```

### Signal Flow

```
main_workflow
    ↓ spawns
child_workflow
    ↓ spawns
grandchild_workflow
    ↓ emits GRANDCHILD_DONE
child_workflow (receives, emits CHILD_COMPLETE)
    ↓
main_workflow (receives CHILD_COMPLETE, emits MAIN_COMPLETE)
```

## Child with LLM

Child workflows can contain any node type:

### The Workflow

```yaml
parent_workflow:
  SpawnAnalyzer:
    node_type: child
    event_triggers: [START]
    child_workflow_name: analyzer_workflow
    child_initial_signals: [START]
    input_fields: [textToAnalyze]
    signals_to_parent: [ANALYSIS_COMPLETE]
    context_updates_to_parent: [analysisResult]

  AnalysisDone:
    node_type: router
    event_triggers: [ANALYSIS_COMPLETE]
    event_emissions:
      - signal_name: PARENT_DONE

analyzer_workflow:
  AnalyzeText:
    node_type: llm
    event_triggers: [START]
    prompt: "Analyze this text: {{ context.textToAnalyze }}"
    output_field: analysisResult
    event_emissions:
      - signal_name: ANALYSIS_COMPLETE
```

### Use Cases

- **AI Analysis Modules**: Reusable analysis sub-workflows.
- **Agent Delegation**: Parent routes to specialized agent children.
- **Tool Orchestration**: Child manages complex tool sequences.

## Suborchestration Patterns

### Pattern: Custom Agent Solvers

The most powerful use of child nodes is **encapsulating agents as reusable solvers**. Instead of embedding agent logic in your main workflow, create specialized agent workflows:

```yaml
main_workflow:
  AnalyzeRequest:
    node_type: router
    event_triggers: [START]
    event_emissions:
      - signal_name: NEEDS_RESEARCH
        condition: "{{ 'research' in context.task|lower }}"
      - signal_name: NEEDS_CODING
        condition: "{{ 'code' in context.task|lower }}"

  ResearchSolver:
    node_type: child
    event_triggers: [NEEDS_RESEARCH]
    child_workflow_name: research_agent_workflow
    child_initial_signals: [START]
    input_fields: [task, sources]
    signals_to_parent: [RESEARCH_COMPLETE]
    context_updates_to_parent: [findings]

  CodingSolver:
    node_type: child
    event_triggers: [NEEDS_CODING]
    child_workflow_name: coding_agent_workflow
    child_initial_signals: [START]
    input_fields: [task, language]
    signals_to_parent: [CODE_COMPLETE]
    context_updates_to_parent: [code_output]

# Reusable research agent - can be used by any parent workflow
research_agent_workflow:
  ResearchAgent:
    node_type: agent
    event_triggers: [START]
    system_prompt: "You are a research specialist. Use available tools to gather information."
    user_prompt: "Research task: {{ context.task }}. Sources: {{ context.sources }}"
    output_field: findings
    available_tools: [web_search, summarize]
    event_emissions:
      - signal_name: RESEARCH_COMPLETE

# Reusable coding agent - encapsulates coding expertise
coding_agent_workflow:
  CodingAgent:
    node_type: agent
    event_triggers: [START]
    system_prompt: "You are an expert {{ context.language }} developer."
    user_prompt: "Task: {{ context.task }}"
    output_field: code_output
    available_tools: [write_file, run_tests, lint_code]
    event_emissions:
      - signal_name: CODE_COMPLETE
```

**Benefits of Agent Solvers:**

- **Reusability**: Same agent workflow used by multiple parents.
- **Specialization**: Each agent has focused tools and prompts.
- **Testability**: Test agent workflows in isolation with mock tools.
- **Swappability**: Replace `coding_agent_workflow` without changing parent.
- **Identity per solver**: Each agent can have its own conversation history.

### Pattern: Specialized Agents

```yaml
parent_workflow:
  Router:
    node_type: router
    event_triggers: [START]
    event_emissions:
      - signal_name: CODING_TASK
        condition: "{{ 'code' in context.request|lower }}"
      - signal_name: WRITING_TASK

  CodingAgent:
    node_type: child
    event_triggers: [CODING_TASK]
    child_workflow_name: coding_workflow
    child_initial_signals: [START]
    input_fields: [request]
    signals_to_parent: [AGENT_DONE]
    context_updates_to_parent: [response]

  WritingAgent:
    node_type: child
    event_triggers: [WRITING_TASK]
    child_workflow_name: writing_workflow
    child_initial_signals: [START]
    input_fields: [request]
    signals_to_parent: [AGENT_DONE]
    context_updates_to_parent: [response]
```

### Pattern: Pipeline with Stages

```yaml
pipeline_workflow:
  Stage1:
    node_type: child
    event_triggers: [START]
    child_workflow_name: extraction_workflow
    child_initial_signals: [START]
    input_fields: [raw_data]
    signals_to_parent: [EXTRACTED]
    context_updates_to_parent: [extracted_data]

  Stage2:
    node_type: child
    event_triggers: [EXTRACTED]
    child_workflow_name: transformation_workflow
    child_initial_signals: [START]
    input_fields: [extracted_data]
    signals_to_parent: [TRANSFORMED]
    context_updates_to_parent: [final_data]
```

## Shared Conversation History Across Sub-Orchestration

When you spawn child workflows, conversation history is **automatically shared** with the parent. This enables powerful patterns where child workflows continue conversations started by parents.

### How It Works

SOE maintains a `main_execution_id` in the operational context:
- For the root orchestration, `main_execution_id` equals the execution ID
- For child workflows, `main_execution_id` is inherited from the parent
- Conversation history is keyed by `main_execution_id`, not the child's execution ID

This means all nodes in the orchestration tree share the same conversation history—if they use `identity`.

### Parent-Child Conversation Example

```yaml
# ERROR: File not found: tests/test_cases/workflows/guide_07_child.py
```

### Execution Flow

1. `ParentLLMCall` executes with `identity: shared_session`
2. Conversation is stored using `main_execution_id`
3. Child workflow spawns, inheriting `main_execution_id`
4. `ChildLLMCall` has same `identity: shared_session`
5. Child sees parent's conversation in `conversation_history`
6. Both exchanges are stored under the same `main_execution_id`

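The workflow example embedded above did not render in this build (the file include failed), so here is a minimal sketch of the setup the execution flow describes. `ParentLLMCall`, `ChildLLMCall`, and `identity: shared_session` come from the steps above; the surrounding node names, prompts, signals, and output fields are assumptions for illustration:

```yaml
parent_workflow:
  ParentLLMCall:
    node_type: llm
    event_triggers: [START]
    identity: shared_session        # same identity as the child's LLM node below
    prompt: "Summarize the request: {{ context.request }}"   # illustrative prompt
    output_field: parent_summary                             # illustrative field
    event_emissions:
      - signal_name: SUMMARY_READY                           # illustrative signal

  SpawnFollowUp:
    node_type: child
    event_triggers: [SUMMARY_READY]
    child_workflow_name: followup_workflow
    child_initial_signals: [START]
    signals_to_parent: [FOLLOWUP_DONE]

followup_workflow:
  ChildLLMCall:
    node_type: llm
    event_triggers: [START]
    identity: shared_session        # child inherits main_execution_id, so it sees the parent's turns
    prompt: "Ask one clarifying follow-up question about the summary above."   # illustrative
    output_field: followup_question
    event_emissions:
      - signal_name: FOLLOWUP_DONE
```

Because both LLM nodes use `identity: shared_session` and the child inherits `main_execution_id`, the child's exchange is appended to the same conversation the parent started.
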
### Nested Sub-Orchestration

The pattern extends to any depth:

```yaml
# ERROR: File not found: tests/test_cases/workflows/guide_07_child.py
```

Main → Child → Grandchild: The grandchild sees the full conversation from main.

### Use Cases

- **Progressive problem-solving**: Parent breaks down task, children solve parts while seeing full context
- **Iterative refinement**: Each sub-workflow builds on previous responses
- **Distributed agents**: Specialized child agents share knowledge with parent
- **Long-running workflows**: History persists across the entire orchestration tree

## Key Points

- **Isolated context**: Children have their own namespace—no state pollution.
- **Modular composition**: Break workflows into reusable sub-workflows.
- **Custom agent solvers**: Encapsulate agents as reusable child workflows.
- **Bidirectional data**: `input_fields` (parent→child), `context_updates_to_parent` (child→parent).
- **Signal propagation**: `signals_to_parent` specifies which child signals reach parent.
- **Child continues**: Signals to parent don't terminate child—it can keep working.
- **Nesting**: Children can spawn grandchildren indefinitely.
- **Shared conversation history**: Children share conversation history with parent via `main_execution_id`.
- **Recursive calls**: A child workflow can technically call itself! This enables recursive patterns, though use with caution and always have termination conditions (see the sketch after this list).

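A rough sketch of the recursive case with an explicit termination condition might look like this; the `depth` counter and the `decrement` tool are hypothetical, introduced only to show where the termination check would live:

```yaml
countdown_workflow:
  DecrementDepth:
    node_type: tool
    event_triggers: [START]
    tool_name: decrement              # hypothetical tool that returns depth - 1
    context_parameter_field: depth
    output_field: depth
    event_emissions:
      - signal_name: DEPTH_UPDATED

  CheckDepth:
    node_type: router
    event_triggers: [DEPTH_UPDATED]
    event_emissions:
      - signal_name: RECURSE
        condition: "{{ context.depth | int > 0 }}"     # keep recursing while depth remains
      - signal_name: DONE
        condition: "{{ context.depth | int <= 0 }}"    # termination condition

  Recurse:
    node_type: child
    event_triggers: [RECURSE]
    child_workflow_name: countdown_workflow            # the workflow spawns itself
    child_initial_signals: [START]
    input_fields: [depth]
    signals_to_parent: [DONE]
```
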
## Next Steps

Now that you understand child workflows and composition, let's explore [The Workflows Ecosystem](guide_09_ecosystem.md) for multi-workflow registries and versioning →