quantalogic 0.50.28__py3-none-any.whl → 0.51.0__py3-none-any.whl

@@ -1,419 +1,506 @@
1
- # Quantalogic Flow YAML DSL Specification
2
1
 
3
- ## 1. Introduction
2
+ # Quantalogic Flow YAML DSL Specification 🚀
4
3
 
5
- The Quantalogic Flow YAML DSL (Domain Specific Language) offers a structured and human-readable way to define workflows. As of February 23, 2025, it provides a rich set of features for complex automation, including:
4
+ ## 1. Introduction 🌟
6
5
 
7
- * **Function Execution**: Executes asynchronous Python functions, either embedded directly or sourced from PyPI packages, local files, or remote URLs.
8
- * **Execution Flow**: Supports sequential, conditional, and parallel transitions between nodes.
9
- * **Sub-Workflows**: Enables hierarchical workflows through nested sub-workflows.
10
- * **LLM Integration**: Incorporates Large Language Model (LLM) nodes with plain text (`llm_node`) or structured output (`structured_llm_node`), using configurable prompts and parameters.
11
- * **Context Management**: Maintains state across nodes using a shared context dictionary.
12
- * **Robustness**: Provides configurable retries and timeouts for fault-tolerant execution.
13
- * **Programmatic Control**: Managed via the `WorkflowManager` class for dynamic creation and execution.
6
+ Welcome to the **Quantalogic Flow YAML DSL**—a powerful, human-readable way to craft workflows with the `quantalogic.flow` package! As of **March 1, 2025**, this DSL brings a suite of exciting features to automate complex tasks with ease:
14
7
 
15
- This DSL seamlessly integrates with the `Workflow`, `WorkflowEngine`, and `WorkflowManager` classes from the `quantalogic.flow` package, and it leverages the `Nodes` class for LLM functionality to minimize redundancy.
8
+ - **Function Execution** ⚙️: Run async Python functions—embedded or sourced from PyPI, local files, or URLs.
9
+ - **Execution Flow** ➡️: Define sequential, conditional, and parallel transitions.
10
+ - **Sub-Workflows** 🌳: Build hierarchical workflows for modularity.
11
+ - **LLM Integration** 🤖: Leverage Large Language Models with plain text or structured outputs.
12
+ - **Context Management** 📦: Share state across nodes via a dynamic context.
13
+ - **Robustness** 🛡️: Add retries, delays, and timeouts for reliability.
14
+ - **Observers** 👀: Monitor execution with custom event handlers.
15
+ - **Programmatic Power** 🧑‍💻: Control everything via the `WorkflowManager`.
16
16
 
17
- ## 2. Workflow Structure
17
+ This DSL integrates seamlessly with `Workflow`, `WorkflowEngine`, and `Nodes` classes, powering everything from simple scripts to AI-driven workflows. Let’s dive in! 🎉
18
18
 
19
- A YAML workflow file consists of three main sections:
19
+ ```mermaid
20
+ graph TD
21
+ A[YAML Workflow File] -->|Defines| B[functions ⚙️]
22
+ A -->|Configures| C[nodes 🧩]
23
+ A -->|Orchestrates| D[workflow 🌐]
24
+ style A fill:#f9f9ff,stroke:#333,stroke-width:2px,stroke-dasharray:5
25
+ style B fill:#e6f3ff,stroke:#0066cc
26
+ style C fill:#e6ffe6,stroke:#009933
27
+ style D fill:#fff0e6,stroke:#cc3300
28
+ ```
29
+
30
+ ## 2. Workflow Structure 🗺️
31
+
32
+ A workflow YAML file is split into three core sections:
33
+
34
+ - **`functions`**: Your toolbox of Python functions.
35
+ - **`nodes`**: The building blocks (tasks) of your workflow.
36
+ - **`workflow`**: The roadmap tying it all together.
20
37
 
21
- * **`functions`**: Defines Python functions used by nodes, supporting both inline code and external modules.
22
- * **`nodes`**: Configures individual tasks, linking to functions, sub-workflows, or LLM setups.
23
- * **`workflow`**: Specifies the execution flow, including the start node and transitions.
38
+ Here’s the skeleton:
24
39
 
25
40
  ```yaml
26
41
  functions:
27
- # Function definitions
42
+ # Your Python magic ✨
28
43
  nodes:
29
- # Node configurations
44
+ # Tasks to execute 🎯
30
45
  workflow:
31
- # Start node and transitions
46
+ # Flow control 🚦
32
47
  ```
33
48
 
34
- ```mermaid
35
- graph LR
36
- A[YAML Workflow File] --> B(functions);
37
- A --> C(nodes);
38
- A --> D(workflow);
39
- style A fill:#f9f,stroke:#333,stroke-width:2px
40
- ```
41
-
42
- ## 3. Functions
49
+ ## 3. Functions ⚙️
43
50
 
44
- The `functions` section maps function names to their implementations, which can be embedded in the YAML or sourced externally from Python modules.
51
+ The `functions` section defines reusable Python code—either embedded in the YAML or pulled from external sources.
45
52
 
46
- **Fields**
53
+ ### Fields 📋
47
54
 
48
- * `type` (string, required): Specifies the function source. Options:
49
- * `"embedded"`: Inline Python code.
50
- * `"external"`: Module-based function.
51
- * `code` (string, optional): Multi-line asynchronous Python code for embedded functions. Required if `type: embedded`.
52
- * `module` (string, optional): Source of the external module for external functions. Can be:
53
- * A PyPI package name (e.g., `"requests"`, `"numpy"`).
54
- * A local file path (e.g., `"/path/to/module.py"`).
55
- * A remote URL (e.g., `"https://example.com/module.py"`). Required if `type: external`.
56
- * `function` (string, optional): Name of the function within the module for external functions. Required if `type: external`.
55
+ - `type` (string, required): `"embedded"` (inline code) or `"external"` (module-based).
56
+ - `code` (string, optional): Multi-line Python code for `embedded`. Use `|` for readability!
57
+ - `module` (string, optional): Source for `external`. Options:
58
+ - PyPI package (e.g., `"requests"`).
59
+ - Local path (e.g., `"/path/to/module.py"`).
60
+ - URL (e.g., `"https://example.com/script.py"`).
61
+ - `function` (string, optional): Function name in the module (for `external`).
57
62
 
58
- **Rules**
63
+ ### Rules
59
64
 
60
- * Embedded functions must be asynchronous (using `async def`) and match the dictionary key name.
61
- * External functions require both `module` and `function` fields; `code` must not be present.
62
- * For PyPI modules, ensure the package is installed in the Python environment (e.g., via `pip install <module>`).
65
+ - Embedded functions must be `async def` and match their dictionary key.
66
+ - External functions need `module` and `function`; no `code` allowed.
67
+ - PyPI modules must be installed (e.g., `pip install requests`).
63
68
 
64
- **Examples**
65
-
66
- * **Embedded Function**
69
+ ### Examples 🌈
67
70
 
71
+ #### Embedded Function
68
72
  ```yaml
69
73
  functions:
70
- validate_order:
74
+ greet:
71
75
  type: embedded
72
76
  code: |
73
- async def validate_order(order: dict) -> bool:
74
- await asyncio.sleep(1)
75
- return bool(order.get("items"))
77
+ async def greet(name: str) -> str:
78
+ return f"Hello, {name}!"
76
79
  ```
77
80
 
78
- * **External Function from PyPI**
79
-
81
+ #### External from PyPI
80
82
  ```yaml
81
83
  functions:
82
- fetch_data:
84
+ fetch:
83
85
  type: external
84
86
  module: requests
85
87
  function: get
86
88
  ```
89
+ *Note*: Run `pip install requests` first!
87
90
 
88
- Note: Requires `pip install requests` if not already installed.
89
-
90
- * **External Function from Local File**
91
-
91
+ #### Local File
92
92
  ```yaml
93
93
  functions:
94
- process_data:
94
+ analyze:
95
95
  type: external
96
- module: /path/to/my_module.py
97
- function: process
96
+ module: ./utils/analyze.py
97
+ function: process_data
98
98
  ```
99
99
 
100
- * **External Function from URL**
101
-
100
+ #### Remote URL
102
101
  ```yaml
103
102
  functions:
104
- analyze:
103
+ compute:
105
104
  type: external
106
- module: https://example.com/analyze.py
107
- function: analyze_data
105
+ module: https://example.com/compute.py
106
+ function: calculate
108
107
  ```
109
108
 
110
109
  ```mermaid
111
- graph LR
112
- A[Function Definition] --> B{Type: embedded/external};
113
- B -- embedded --> C["Code (async def ...)"];
114
- B -- external --> D[Module: PyPI, Path, or URL];
115
- D --> E[Function Name];
116
- style A fill:#ccf,stroke:#333,stroke-width:2px
110
+ graph TD
111
+ A[Function Definition] --> B{Type?}
112
+ B -->|embedded| C[Code: async def ...]
113
+ B -->|external| D[Module: PyPI, Path, URL]
114
+ D --> E[Function Name]
115
+ style A fill:#e6f3ff,stroke:#0066cc,stroke-width:2px
116
+ style B fill:#fff,stroke:#333
117
+ style C fill:#cce6ff,stroke:#0066cc
118
+ style D fill:#cce6ff,stroke:#0066cc
119
+ style E fill:#cce6ff,stroke:#0066cc
117
120
  ```
118
121
 
119
- ## 4. Nodes
120
-
121
- Nodes represent individual tasks within the workflow, configurable as function executions, sub-workflows, or LLM operations.
122
+ ## 4. Nodes 🧩
122
123
 
123
- **Fields**
124
+ Nodes are the heartbeat of your workflow—each one’s a task, powered by functions, sub-workflows, or LLMs.
124
125
 
125
- * `function` (string, optional): References a function from the `functions` section. Mutually exclusive with `sub_workflow` and `llm_config`.
126
- * `sub_workflow` (object, optional): Defines a nested workflow. Mutually exclusive with `function` and `llm_config`.
127
- * `start` (string, required): Starting node of the sub-workflow.
128
- * `transitions` (list): Transition rules (see Workflow section).
129
- * `llm_config` (object, optional): Configures an LLM-based node. Mutually exclusive with `function` and `sub_workflow`.
130
- * `model` (string, optional, default: `"gpt-3.5-turbo"`): LLM model (e.g., `"gemini/gemini-2.0-flash"`, `"gro k/xai"`).
131
- * `system_prompt` (string, optional): Defines the LLM’s role or context.
132
- * `prompt_template` (string, required, default: `"{{ input }}"`): Jinja2 template for the user prompt (e.g., `"Summarize {{ text }}"`).
133
- * `temperature` (float, optional, default: `0.7`): Randomness control (`0.0` to `1.0`).
134
- * `max_tokens` (integer, optional, default: `2000`): Maximum response tokens.
135
- * `top_p` (float, optional, default: `1.0`): Nucleus sampling (`0.0` to `1.0`).
136
- * `presence_penalty` (float, optional, default: `0.0`): Penalizes repetition (`-2.0` to `2.0`).
137
- * `frequency_penalty` (float, optional, default: `0.0`): Reduces word repetition (`-2.0` to `2.0`).
138
- * `stop` (list of strings, optional): Stop sequences (e.g., `["\n"]`).
139
- * `response_model` (string, optional): Pydantic model path for structured output (e.g., `"my_module:OrderDetails"`). If present, uses `structured_llm_node`; otherwise, uses `llm_node`.
140
- * `api_key` (string, optional): Custom API key for the LLM provider.
141
- * `output` (string, optional): Context key for the node’s result. Defaults to `"<node_name>_result"` for function or LLM nodes if unspecified.
142
- * `retries` (integer, optional, default: `3`): Number of retry attempts on failure (≥ `0`).
143
- * `delay` (float, optional, default: `1.0`): Delay between retries in seconds (≥ `0`).
144
- * `timeout` (float or null, optional, default: `null`): Execution timeout in seconds (≥ `0` or `null` for no timeout).
145
- * `parallel` (boolean, optional, default: `false`): Enables parallel execution with other nodes.
126
+ ### Fields 📋
146
127
 
147
- **Rules**
128
+ - `function` (string, optional): Links to a `functions` entry.
129
+ - `sub_workflow` (object, optional): Nested workflow definition.
130
+ - `start` (string): Starting node.
131
+ - `transitions` (list): Flow rules (see Workflow section).
132
+ - `llm_config` (object, optional): LLM setup.
133
+ - `model` (string, default: `"gpt-3.5-turbo"`): e.g., `"gemini/gemini-2.0-flash"`.
134
+ - `system_prompt` (string, optional): LLM’s role.
135
+ - `prompt_template` (string, default: `"{{ input }}"`): Jinja2 template (e.g., `"Summarize {{ text }}"`).
136
+ - `temperature` (float, default: `0.7`): Randomness (0.0–1.0).
137
+ - `max_tokens` (int, optional): Token limit (e.g., `2000`).
138
+ - `top_p` (float, default: `1.0`): Nucleus sampling (0.0–1.0).
139
+ - `presence_penalty` (float, default: `0.0`): Topic repetition (-2.0–2.0).
140
+ - `frequency_penalty` (float, default: `0.0`): Word repetition (-2.0–2.0).
141
+ - `response_model` (string, optional): Structured output model (e.g., `"my_module:OrderDetails"`).
142
+ - `output` (string, optional): Context key for results (defaults to `<node_name>_result` for function/LLM nodes).
143
+ - `retries` (int, default: `3`): Retry attempts (≥ 0).
144
+ - `delay` (float, default: `1.0`): Seconds between retries (≥ 0).
145
+ - `timeout` (float/null, default: `null`): Max runtime in seconds.
146
+ - `parallel` (bool, default: `false`): Run concurrently?
148
147
 
149
- * Each node must specify exactly one of `function`, `sub_workflow`, or `llm_config`.
150
- * For `sub_workflow`, `output` is optional if the sub-workflow sets multiple context keys; inputs are derived from the start node’s requirements.
151
- * For `llm_config`, inputs are extracted from `prompt_template` placeholders (e.g., `{{ text }}` implies `text` as an input).
148
+ ### Rules
152
149
 
153
- **Examples**
150
+ - Exactly one of `function`, `sub_workflow`, or `llm_config` per node.
151
+ - LLM inputs come from `prompt_template` placeholders (e.g., `{{ text }}` → `text`).
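+
+ One field worth a closer look: `response_model` points to a Pydantic class via a `module:Class` string. Here is a hedged, minimal sketch of what the `inventory:StockStatus` model used in the structured example below might look like (the field names are assumptions; `available` simply mirrors the `ctx['stock'].available` condition shown in section 5):
+
+ ```python
+ # inventory.py - hypothetical module matching response_model: "inventory:StockStatus"
+ from pydantic import BaseModel
+
+ class StockStatus(BaseModel):
+     items: list[str]   # assumed field: the items that were checked
+     available: bool    # assumed field: referenced by conditions like ctx['stock'].available
+ ```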
154
152
 
155
- * **Function Node**
153
+ ### Examples 🌈
156
154
 
155
+ #### Function Node
157
156
  ```yaml
158
157
  nodes:
159
158
  validate:
160
159
  function: validate_order
161
160
  output: is_valid
162
161
  retries: 2
163
- delay: 0.5
164
162
  timeout: 5.0
165
163
  ```
166
164
 
167
- * **Sub-Workflow Node**
168
-
165
+ #### Sub-Workflow Node
169
166
  ```yaml
170
167
  nodes:
171
- payment_shipping:
168
+ payment_flow:
172
169
  sub_workflow:
173
- start: payment
170
+ start: pay
174
171
  transitions:
175
- - from: payment
176
- to: shipping
177
- output: shipping_confirmation
172
+ - from: pay
173
+ to: ship
174
+ output: shipping_status
178
175
  ```
179
176
 
180
- * **Plain LLM Node**
181
-
177
+ #### Plain LLM Node
182
178
  ```yaml
183
179
  nodes:
184
180
  summarize:
185
181
  llm_config:
186
182
  model: "gro k/xai"
187
- system_prompt: "You are a concise summarizer."
188
- prompt_template: "Summarize this text: {{ text }}"
183
+ system_prompt: "You’re a concise summarizer."
184
+ prompt_template: "Summarize: {{ text }}"
189
185
  temperature: 0.5
190
- max_tokens: 50
191
186
  output: summary
192
187
  ```
193
188
 
194
- * **Structured LLM Node**
195
-
189
+ #### Structured LLM Node
196
190
  ```yaml
197
191
  nodes:
198
- check_inventory:
192
+ inventory_check:
199
193
  llm_config:
200
194
  model: "gemini/gemini-2.0-flash"
201
- system_prompt: "Check inventory status."
202
- prompt_template: "Are {{ items }} in stock?"
203
- response_model: "my_module:InventoryStatus"
204
- output: inventory_status
195
+ system_prompt: "Check stock."
196
+ prompt_template: "Items: {{ items }}"
197
+ response_model: "inventory:StockStatus"
198
+ output: stock
205
199
  ```
206
200
 
207
201
  ```mermaid
208
- graph LR
209
- A[Node Definition] --> B{Choice: function, sub_workflow, llm_config};
210
- B -- function --> C[Function Name];
211
- B -- sub_workflow --> D[Start Node & Transitions];
212
- B -- llm_config --> E[LLM Configuration];
213
- style A fill:#ccf,stroke:#333,stroke-width:2px
202
+ graph TD
203
+ A[Node] --> B{Type?}
204
+ B -->|function| C[Function Ref]
205
+ B -->|sub_workflow| D[Start + Transitions]
206
+ B -->|llm_config| E[LLM Setup]
207
+ E --> F{Structured?}
208
+ F -->|Yes| G[response_model]
209
+ F -->|No| H[Plain Text]
210
+ style A fill:#e6ffe6,stroke:#009933,stroke-width:2px
211
+ style B fill:#fff,stroke:#333
212
+ style C fill:#ccffcc,stroke:#009933
213
+ style D fill:#ccffcc,stroke:#009933
214
+ style E fill:#ccffcc,stroke:#009933
215
+ style F fill:#fff,stroke:#333
216
+ style G fill:#b3ffb3,stroke:#009933
217
+ style H fill:#b3ffb3,stroke:#009933
214
218
  ```
215
219
 
216
- ## 5. Workflow
217
-
218
- The `workflow` section defines the top-level execution flow.
219
-
220
- **Fields**
220
+ ## 5. Workflow 🌐
221
221
 
222
- * `start` (string, optional): Name of the starting node.
223
- * `transitions` (list, required): List of transition rules.
222
+ The `workflow` section maps out how nodes connect and flow.
224
223
 
225
- **Transition Fields**
224
+ ### Fields 📋
226
225
 
227
- * `from` (string, required): Source node.
228
- * `to` (string or list, required): Target node(s). String for sequential, list for parallel execution.
229
- * `condition` (string, optional): Python expression using `ctx` (e.g., `"ctx.get('in_stock')"`). Transition occurs if `True`.
226
+ - `start` (string, optional): First node to run.
227
+ - `transitions` (list): Flow rules.
228
+ - `from` (string): Source node.
229
+ - `to` (string/list): Target(s)—string for sequential, list for parallel.
230
+ - `condition` (string, optional): Python expression (e.g., `"ctx['stock'].available"`).
230
231
 
231
- **Examples**
232
-
233
- * **Sequential Transition**
232
+ ### Examples 🌈
234
233
 
234
+ #### Sequential Flow
235
235
  ```yaml
236
236
  workflow:
237
237
  start: validate
238
238
  transitions:
239
239
  - from: validate
240
- to: check_inventory
240
+ to: process
241
241
  ```
242
242
 
243
- * **Conditional Transition**
244
-
243
+ #### Conditional Flow
245
244
  ```yaml
246
245
  workflow:
247
- start: check_inventory
246
+ start: inventory_check
248
247
  transitions:
249
- - from: check_inventory
250
- to: payment_shipping
251
- condition: "ctx.get('inventory_status').in_stock"
248
+ - from: inventory_check
249
+ to: payment_flow
250
+ condition: "ctx['stock'].available"
252
251
  ```
253
252
 
254
- * **Parallel Transition**
255
-
253
+ #### Parallel Flow
256
254
  ```yaml
257
255
  workflow:
258
- start: payment_shipping
256
+ start: payment_flow
259
257
  transitions:
260
- - from: payment_shipping
261
- to: [update_status, notify_customer]
258
+ - from: payment_flow
259
+ to: [update_db, send_email]
262
260
  ```
263
261
 
264
262
  ```mermaid
265
- graph LR
266
- A[Workflow Definition] --> B(Start Node);
267
- A --> C(Transitions);
268
- C --> D{From Node};
269
- D --> E{To Nodes};
270
- E -- Sequential --> F[Single Node];
271
- E -- Parallel --> G[List of Nodes];
272
- C --> H{Condition Optional};
273
- style A fill:#ccf,stroke:#333,stroke-width:2px
263
+ graph TD
264
+ A[Workflow] --> B[Start Node]
265
+ A --> C[Transitions]
266
+ C --> D[From]
267
+ D --> E{To}
268
+ E -->|Sequential| F[Single Node]
269
+ E -->|Parallel| G[List of Nodes]
270
+ C --> H[Condition?]
271
+ H -->|Yes| I[ctx-based Logic]
272
+ style A fill:#fff0e6,stroke:#cc3300,stroke-width:2px
273
+ style B fill:#ffe6cc,stroke:#cc3300
274
+ style C fill:#ffe6cc,stroke:#cc3300
275
+ style D fill:#ffd9b3,stroke:#cc3300
276
+ style E fill:#fff,stroke:#333
277
+ style F fill:#ffd9b3,stroke:#cc3300
278
+ style G fill:#ffd9b3,stroke:#cc3300
279
+ style H fill:#fff,stroke:#333
280
+ style I fill:#ffd9b3,stroke:#cc3300
274
281
  ```
275
282
 
276
- ## 6. Context
277
-
278
- The context (`ctx`) is a dictionary shared across the workflow and sub-workflows, storing node outputs. Examples:
283
+ ## 6. Observers 👀
279
284
 
280
- * Function node: `ctx["is_valid"] = True`.
281
- * Plain LLM node: `ctx["summary"] = "Brief text"`.
282
- * Structured LLM node: `ctx["inventory_status"] = InventoryStatus(items=["item1"], in_stock=True)`.
285
+ Add observers to watch workflow events (e.g., node start, completion, failures). Define them in `functions` and list them under `observers`.
283
286
 
284
- ## 7. Execution Flow
287
+ ### Example
288
+ ```yaml
289
+ functions:
290
+ log_event:
291
+ type: embedded
292
+ code: |
293
+ async def log_event(event):
294
+ print(f"{event.event_type}: {event.node_name}")
295
+ nodes:
296
+ task:
297
+ function: greet
298
+ workflow:
299
+ start: task
300
+ transitions: []
301
+ observers:
302
+ - log_event
303
+ ```
285
304
 
286
- The `WorkflowEngine` executes the workflow as follows:
305
+ ## 7. Context 📦
287
306
 
288
- 1. Begins at `workflow.start`.
289
- 2. Executes nodes, updating `ctx`:
290
- * **Function Nodes**: Calls the referenced function, storing the result in `output`.
291
- * **Sub-Workflow Nodes**: Runs the nested workflow, merging its context into the parent’s.
292
- * **LLM Nodes**: Uses `Nodes.llm_node` for text output or `Nodes.structured_llm_node` for structured output via `litellm` and `instructor`.
293
- 3. Evaluates transitions:
294
- * Conditions (if present) are checked against `ctx`.
295
- 4. Executes the next node(s) sequentially or in parallel based on `to`.
296
- 5. Continues until no further transitions remain.
307
+ The `ctx` dictionary carries data across nodes:
308
+ - `greet` → `ctx["greeting"] = "Hello, Alice!"`
309
+ - `inventory_check` → `ctx["stock"] = StockStatus(...)`
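+
+ Conceptually (a sketch, not engine internals), a function node's parameters are filled from matching `ctx` keys, and its return value is stored under the node's `output` key:
+
+ ```python
+ # Conceptual sketch: how a function node reads from and writes to ctx.
+ ctx = {"name": "Alice"}
+
+ async def greet(name: str) -> str:   # embedded function from section 3
+     return f"Hello, {name}!"
+
+ # The engine effectively does something like:
+ #   ctx["greeting"] = await greet(name=ctx["name"])   # node output: greeting
+ ```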
297
310
 
298
- ## 8. WorkflowManager
311
+ ## 8. Execution Flow 🏃‍♂️
299
312
 
300
- The `WorkflowManager` class provides programmatic control over workflows:
313
+ The `WorkflowEngine` runs it all:
314
+ 1. Starts at `workflow.start`.
315
+ 2. Executes nodes, updating `ctx`.
316
+ 3. Follows transitions based on conditions or parallel rules.
317
+ 4. Notifies observers of events.
318
+ 5. Stops when no transitions remain.
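+
+ To see these steps in code, here is a minimal run sketch assembled from the generated-script pattern in section 9 (same imports and the same `workflow.build()` / `engine.run()` calls):
+
+ ```python
+ import anyio
+ from quantalogic.flow import Nodes, Workflow
+
+ @Nodes.define(output="greeting")
+ async def say_hello(name: str) -> str:
+     return f"Hello, {name}!"
+
+ workflow = Workflow("say_hello")
+
+ async def main():
+     engine = workflow.build()                      # build the WorkflowEngine
+     result = await engine.run({"name": "World"})   # run with an initial context
+     print(result)                                  # the run result, as logged in section 9's script
+
+ anyio.run(main)
+ ```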
301
319
 
302
- * **Node Management**: Add, update, or remove nodes.
303
- * **Transition Management**: Define execution flow.
304
- * **Function Registration**: Embed or link to external functions.
305
- * **YAML I/O**: Load/save workflows from/to YAML files.
306
- * **Instantiation**: Builds a `Workflow` object with support for PyPI modules.
320
+ ## 9. Converting Between Python and YAML 🔄
307
321
 
308
- **Example**
322
+ The `quantalogic.flow` package provides tools to bridge Python-defined workflows and YAML definitions, making your workflows portable and standalone.
309
323
 
310
- ```python
311
- manager = WorkflowManager()
312
- manager.add_function("fetch", "external", module="requests", function="get")
313
- manager.add_node("start", function="fetch", output="response")
314
- manager.set_start_node("start")
315
- manager.save_to_yaml("workflow.yaml")
316
- ```
324
+ ### From Python to YAML with `flow_extractor.py` 📜
325
+ Want to turn a Python workflow (using `Nodes` and `Workflow`) into a YAML file? Use `quantalogic/flow/flow_extractor.py`! The `extract_workflow_from_file` function parses a Python file, extracting nodes, transitions, functions, and globals into a `WorkflowDefinition`. Then, `WorkflowManager` saves it as YAML. This is perfect for sharing or archiving workflows defined programmatically.
317
326
 
318
- If `requests` is missing, the manager raises:
327
+ #### How It Works
328
+ 1. **Parse**: `WorkflowExtractor` uses Python’s `ast` module to analyze the file, identifying `@Nodes` decorators (e.g., `define`, `llm_node`) and `Workflow` chaining.
329
+ 2. **Extract**: It builds a `WorkflowDefinition` with nodes, transitions, embedded functions, and observers.
330
+ 3. **Save**: `WorkflowManager.save_to_yaml` writes it to a YAML file.
319
331
 
320
- ```text
321
- Failed to import module 'requests': No module named 'requests'. This may be a PyPI package. Ensure it is installed using 'pip install requests' or check if the module name is correct.
322
- ```
332
+ #### Example
333
+ ```python
334
+ # story_generator.py
335
+ from quantalogic.flow import Nodes, Workflow
323
336
 
324
- ```mermaid
325
- graph LR
326
- A[WorkflowManager] --> B(Add/Update/Remove Nodes & Transitions);
327
- A --> C(Load/Save YAML);
328
- A --> D(Instantiate Workflow);
329
- style A fill:#ccf,stroke:#333,stroke-width:2px
330
- ```
337
+ @Nodes.define(output="greeting")
338
+ async def say_hello(name: str) -> str:
339
+ return f"Hello, {name}!"
331
340
 
332
- ## 9. Examples
341
+ workflow = Workflow("say_hello")
333
342
 
334
- **Example 1: Simple Workflow with PyPI Module**
343
+ # Convert to YAML
344
+ from quantalogic.flow.flow_extractor import extract_workflow_from_file
345
+ from quantalogic.flow.flow_manager import WorkflowManager
335
346
 
347
+ wf_def, global_vars = extract_workflow_from_file("story_generator.py")
348
+ manager = WorkflowManager(wf_def)
349
+ manager.save_to_yaml("story_workflow.yaml")
350
+ ```
351
+ **Output (`story_workflow.yaml`)**:
336
352
  ```yaml
337
353
  functions:
338
- fetch_page:
339
- type: external
340
- module: requests
341
- function: get
354
+ say_hello:
355
+ type: embedded
356
+ code: |
357
+ @Nodes.define(output="greeting")
358
+ async def say_hello(name: str) -> str:
359
+ return f"Hello, {name}!"
342
360
  nodes:
343
- fetch:
344
- function: fetch_page
345
- output: page_content
361
+ say_hello:
362
+ function: say_hello
363
+ output: greeting
364
+ retries: 3
365
+ delay: 1.0
346
366
  workflow:
347
- start: fetch
367
+ start: say_hello
348
368
  transitions: []
349
369
  ```
350
370
 
351
- Execution with `ctx = {"url": "https://example.com"}`:
371
+ ### From YAML to Standalone Python with `flow_generator.py` 🐍
372
+ Need a self-contained Python script from a `WorkflowDefinition`? `quantalogic/flow/flow_generator.py` has you covered with `generate_executable_script`. It creates an executable file with embedded functions, dependencies, and a `main` function—ready to run anywhere with `uv run`.
373
+
374
+ #### How It Works
375
+ 1. **Generate**: Takes a `WorkflowDefinition` and global variables.
376
+ 2. **Structure**: Adds a shebang (`#!/usr/bin/env -S uv run`), dependencies, globals, functions, and workflow chaining.
377
+ 3. **Execute**: Sets permissions to make it runnable.
378
+
379
+ #### Example
380
+ ```python
381
+ from quantalogic.flow.flow_manager import WorkflowManager
382
+ from quantalogic.flow.flow_generator import generate_executable_script
383
+
384
+ manager = WorkflowManager()
385
+ manager.load_from_yaml("story_workflow.yaml")
386
+ generate_executable_script(manager.workflow, {}, "standalone_story.py")
387
+ ```
388
+ **Output (`standalone_story.py`)**:
389
+ ```python
390
+ #!/usr/bin/env -S uv run
391
+ # /// script
392
+ # requires-python = ">=3.12"
393
+ # dependencies = ["loguru", "litellm", "pydantic>=2.0", "anyio", "quantalogic>=0.35", "jinja2", "instructor[litellm]"]
394
+ # ///
395
+ import anyio
396
+ from loguru import logger
397
+ from quantalogic.flow import Nodes, Workflow
398
+
399
+ @Nodes.define(output="greeting")
400
+ async def say_hello(name: str) -> str:
401
+ return f"Hello, {name}!"
402
+
403
+ workflow = Workflow("say_hello")
404
+
405
+ async def main():
406
+ initial_context = {"name": "World"}
407
+ engine = workflow.build()
408
+ result = await engine.run(initial_context)
409
+ logger.info(f"Workflow result: {result}")
410
+
411
+ if __name__ == "__main__":
412
+ anyio.run(main)
413
+ ```
414
+ Run it with `./standalone_story.py`—no extra setup needed (assuming `uv` is installed)!
415
+
416
+ ```mermaid
417
+ graph TD
418
+ A[Python Workflow] -->|flow_extractor.py| B[WorkflowDefinition]
419
+ B -->|WorkflowManager| C[YAML File]
420
+ C -->|WorkflowManager| D[WorkflowDefinition]
421
+ D -->|flow_generator.py| E[Standalone Python Script]
422
+ style A fill:#e6f3ff,stroke:#0066cc,stroke-width:2px
423
+ style B fill:#fff,stroke:#333
424
+ style C fill:#e6ffe6,stroke:#009933,stroke-width:2px
425
+ style D fill:#fff,stroke:#333
426
+ style E fill:#fff0e6,stroke:#cc3300,stroke-width:2px
427
+ ```
428
+
429
+ ## 10. WorkflowManager 🧑‍💻
430
+
431
+ The `WorkflowManager` lets you build workflows programmatically:
432
+ - Add nodes, transitions, functions, and observers.
433
+ - Load/save YAML.
434
+ - Instantiate a `Workflow` object.
352
435
 
353
- `fetch` → `ctx["page_content"] = <Response object>` (assuming `requests` is installed).
436
+ ### Example
437
+ ```python
438
+ manager = WorkflowManager()
439
+ manager.add_function("say_hi", "embedded", code="async def say_hi(name): return f'Hi, {name}!'")
440
+ manager.add_node("start", function="say_hi")
441
+ manager.set_start_node("start")
442
+ manager.save_to_yaml("hi.yaml")
443
+ ```
354
444
 
355
- **Example 2: E-commerce Workflow**
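+ Putting sections 9 and 10 together, here is a hedged round-trip sketch that uses only the calls shown in this spec (file names are illustrative):
+
+ ```python
+ # Round-trip: build a workflow programmatically, save it as YAML,
+ # reload it, then emit a standalone script (see section 9).
+ from quantalogic.flow.flow_generator import generate_executable_script
+ from quantalogic.flow.flow_manager import WorkflowManager
+
+ manager = WorkflowManager()
+ manager.add_function("greet", "embedded", code="async def greet(name): return f'Hello, {name}!'")
+ manager.add_node("start", function="greet")
+ manager.set_start_node("start")
+ manager.save_to_yaml("greet.yaml")
+
+ reloaded = WorkflowManager()
+ reloaded.load_from_yaml("greet.yaml")
+ generate_executable_script(reloaded.workflow, {}, "standalone_greet.py")
+ ```
+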
445
+ ## 11. Full Example: Order Processing 📦🤖
356
446
 
357
447
  ```yaml
358
448
  functions:
359
- validate_order:
449
+ validate:
360
450
  type: embedded
361
451
  code: |
362
- async def validate_order(order: dict) -> bool:
363
- await asyncio.sleep(1)
364
- return bool(order.get("items"))
365
- process_payment:
366
- type: external
367
- module: stripe
368
- function: create_charge
452
+ async def validate(order: dict) -> str:
453
+ return "valid" if order["items"] else "invalid"
454
+ track_usage:
455
+ type: embedded
456
+ code: |
457
+ def track_usage(event):
458
+ if event.usage:
459
+ print(f"{event.node_name}: {event.usage['total_tokens']} tokens")
369
460
  nodes:
370
- validate:
371
- function: validate_order
372
- output: is_valid
373
- inventory:
461
+ validate_order:
462
+ function: validate
463
+ output: validity
464
+ check_stock:
374
465
  llm_config:
375
466
  model: "gemini/gemini-2.0-flash"
376
467
  system_prompt: "Check inventory."
377
- prompt_template: "Are {{ items }} in stock?"
378
- response_model: "my_module:InventoryStatus"
379
- output: inventory_status
380
- payment:
381
- function: process_payment
382
- output: payment_status
468
+ prompt_template: "Items: {{ items }}"
469
+ response_model: "shop:Stock"
470
+ output: stock
383
471
  notify:
384
472
  llm_config:
385
- prompt_template: "Notify: Order {{ order_id }} processed."
386
- output: notification
473
+ prompt_template: "Order {{ order_id }} status: {{ validity }}"
474
+ output: message
387
475
  workflow:
388
- start: validate
476
+ start: validate_order
389
477
  transitions:
390
- - from: validate
391
- to: inventory
392
- - from: inventory
393
- to: payment
394
- condition: "ctx.get('inventory_status').in_stock"
395
- - from: payment
478
+ - from: validate_order
479
+ to: check_stock
480
+ condition: "ctx['validity'] == 'valid'"
481
+ - from: check_stock
396
482
  to: notify
483
+ observers:
484
+ - track_usage
397
485
  ```
398
486
 
399
- Execution with `ctx = {"order": {"items": ["item1"], "order_id": "123"}}`:
400
-
401
- * `validate` → `ctx["is_valid"] = True`.
402
- * `inventory` → `ctx["inventory_status"] = InventoryStatus(...)`.
403
- * `payment` → `ctx["payment_status"] = <Stripe response>` (requires `pip install stripe`).
404
- * `notify` `ctx["notification"] = "Notify: Order 123 processed."`.
487
+ ### Execution
488
+ With `ctx = {"order": {"items": ["book"], "order_id": "123"}}`:
489
+ 1. `validate_order` → `ctx["validity"] = "valid"`
490
+ 2. `check_stock` → `ctx["stock"] = Stock(...)`
491
+ 3. `notify` → `ctx["message"] = "Order 123 status: valid"`
492
+ 4. `track_usage` prints token usage for LLM nodes.
405
493
 
406
494
  ```mermaid
407
- graph LR
408
- A[validate] --> B[inventory];
409
- B -- "ctx.get('inventory_status').in_stock" --> C[payment];
410
- C --> D[notify];
411
- style A fill:#afa,stroke:#333,stroke-width:2px
412
- style B fill:#afa,stroke:#333,stroke-width:2px
413
- style C fill:#afa,stroke:#333,stroke-width:2px
414
- style D fill:#afa,stroke:#333,stroke-width:2px
495
+ graph TD
496
+ A["validate_order"] -->|"ctx['validity'] == 'valid'"| B["check_stock"]
497
+ B --> C["notify"]
498
+ style A fill:#e6ffe6,stroke:#009933,stroke-width:2px
499
+ style B fill:#e6ffe6,stroke:#009933,stroke-width:2px
500
+ style C fill:#e6ffe6,stroke:#009933,stroke-width:2px
415
501
  ```
416
502
 
417
- ## 10. Conclusion
503
+ ## 12. Conclusion 🎉
504
+
505
+ The Quantalogic Flow YAML DSL (March 1, 2025) is your go-to for crafting workflows—simple or sophisticated. With tools like `flow_extractor.py` and `flow_generator.py`, you can switch between Python and YAML effortlessly, making workflows portable and standalone. Add PyPI support, sub-workflows, LLM nodes, and observers, and you’ve got a versatile framework for automation and AI tasks. Pair it with `WorkflowManager` for maximum flexibility! 🚀
418
506
 
419
- The Quantalogic Flow YAML DSL, as of February 23, 2025, provides a flexible and powerful framework for defining workflows. Enhanced support for PyPI modules via the `module` field in `functions` ensures seamless integration with external libraries, with clear error messages guiding users to install missing packages (e.g., `pip install requests`). Combined with sub-workflows, LLM nodes, and robust execution controls, it supports a wide range of applications, from simple tasks to complex, AI-driven processes, all manageable through the `WorkflowManager`.