soe-ai 0.1.0__tar.gz → 0.1.1__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: soe-ai
- Version: 0.1.0
+ Version: 0.1.1
  Summary: Signal-driven Orchestration Engine - Agent orchestration with event-driven workflow engine
  Author-email: Pedro Garcia <pedro@example.com>
  License-Expression: MIT
@@ -91,12 +91,44 @@ All workflow state flows through **context**—a shared dictionary accessible vi
  - LLM prompts can interpolate any context field
  - No hidden state—everything is inspectable

- ### 3. Deterministic + Agentic
- Mix hard-coded logic with LLM-driven behavior in the same workflow. Router nodes are pure conditionals. Agent nodes can call tools. Use what you need.
+ ### 3. Purely Deterministic or Hybrid Agentic
+ SOE is a complete orchestration solution. You can use it as a purely deterministic engine for standard business logic, or mix in LLM-driven "Agentic" behavior.
+ - **Deterministic**: Use `router` and `tool` nodes for 100% predictable workflows.
+ - **Agentic**: Add `llm` and `agent` nodes for creative, reasoning-based tasks.
+ You get the safety of code with the flexibility of AI in a single, unified system.

  ### 4. Portable
  Workflows are YAML. Run them locally, in CI, in production. Extract them, version them, share them.

+ ### 5. Self-Evolving
+ Workflows can modify themselves at runtime. Built-in tools like `inject_workflow`, `inject_node_configuration`, and `add_signal` allow agents to:
+ - Create new workflows dynamically
+ - Add or modify nodes in existing workflows
+ - Update signal routing on the fly
+
+ This enables **meta-programming**: an AI system that can extend its own capabilities without human intervention.
+
+ ---
+
+ ## What SOE Unlocks
+
+ SOE is a **Protocol for Intelligence** that unlocks new forms of intelligent behavior:
+
+ ### Self-Evolving Intelligence
+ AI systems that can rewrite and improve themselves at runtime - the ultimate evolution of software.
+
+ ### Swarm Intelligence
+ Efficient collective decision-making among multiple agents through signal-based consensus.
+
+ ### Hybrid Intelligence
+ Seamless combination of deterministic logic and AI creativity with programmatic safety rails.
+
+ ### Fractal Intelligence
+ Hierarchical agent organizations that scale complexity while remaining manageable.
+
+ ### Infrastructure Intelligence
+ AI orchestration that works everywhere - from edge devices to cloud platforms.
+
  ---

  ## Installation
@@ -117,6 +149,37 @@ cd soe && uv sync

  ## Quick Start

+ ### 1. Provide Your LLM
+
+ SOE is LLM-agnostic. You must provide a `call_llm` function that matches this signature:
+
+ ```python
+ def call_llm(
+     prompt: str,
+     config: dict,
+ ) -> str:
+     """
+     Called by SOE when a node needs LLM processing.
+
+     Args:
+         prompt: The rendered prompt string (includes instructions, context, and schemas)
+         config: The full node configuration from YAML (useful for model parameters)
+
+     Returns:
+         The raw text response from the LLM.
+     """
+     # Example with OpenAI:
+     from openai import OpenAI
+     client = OpenAI()
+     response = client.chat.completions.create(
+         model=config.get("model", "gpt-4o"),
+         messages=[{"role": "user", "content": prompt}],
+     )
+     return response.choices[0].message.content
+ ```
+
+ ### 2. Run a Workflow
+
  ```python
  from soe import orchestrate, create_all_nodes
  from soe.local_backends import create_local_backends
@@ -134,8 +197,8 @@ example_workflow:
  # Create backends (storage for context, workflows, etc.)
  backends = create_local_backends("./data")

- # Create all node handlers
- nodes, broadcast = create_all_nodes(backends)
+ # Create all node handlers (pass your call_llm function)
+ nodes, broadcast = create_all_nodes(backends, call_llm=call_llm)

  # Run the workflow
  execution_id = orchestrate(
@@ -158,8 +221,8 @@ execution_id = orchestrate(
  | Audience | Start Here |
  |----------|------------|
  | **Builders** (workflow authors) | [Documentation](docs/index.md) — Step-by-step chapters |
- | **Engineers** (infrastructure) | [ARCHITECTURE.md](ai_docs/ARCHITECTURE.md) — Design philosophy |
- | **Researchers** (advanced patterns) | [Advanced Patterns](docs/advanced_patterns/index.md) — Swarm, hybrid, self-evolving |
+ | **Engineers** (infrastructure) | [Infrastructure Guide](docs/guide_10_infrastructure.md) — Backend protocols |
+ | **Researchers** (advanced patterns) | [Advanced Patterns](docs/advanced_patterns/) — Swarm, hybrid, self-evolving |

  ---

@@ -55,12 +55,44 @@ All workflow state flows through **context**—a shared dictionary accessible vi
  - LLM prompts can interpolate any context field
  - No hidden state—everything is inspectable

- ### 3. Deterministic + Agentic
- Mix hard-coded logic with LLM-driven behavior in the same workflow. Router nodes are pure conditionals. Agent nodes can call tools. Use what you need.
+ ### 3. Purely Deterministic or Hybrid Agentic
+ SOE is a complete orchestration solution. You can use it as a purely deterministic engine for standard business logic, or mix in LLM-driven "Agentic" behavior.
+ - **Deterministic**: Use `router` and `tool` nodes for 100% predictable workflows.
+ - **Agentic**: Add `llm` and `agent` nodes for creative, reasoning-based tasks.
+ You get the safety of code with the flexibility of AI in a single, unified system.

  ### 4. Portable
  Workflows are YAML. Run them locally, in CI, in production. Extract them, version them, share them.

+ ### 5. Self-Evolving
+ Workflows can modify themselves at runtime. Built-in tools like `inject_workflow`, `inject_node_configuration`, and `add_signal` allow agents to:
+ - Create new workflows dynamically
+ - Add or modify nodes in existing workflows
+ - Update signal routing on the fly
+
+ This enables **meta-programming**: an AI system that can extend its own capabilities without human intervention.
+
+ ---
+
+ ## What SOE Unlocks
+
+ SOE is a **Protocol for Intelligence** that unlocks new forms of intelligent behavior:
+
+ ### Self-Evolving Intelligence
+ AI systems that can rewrite and improve themselves at runtime - the ultimate evolution of software.
+
+ ### Swarm Intelligence
+ Efficient collective decision-making among multiple agents through signal-based consensus.
+
+ ### Hybrid Intelligence
+ Seamless combination of deterministic logic and AI creativity with programmatic safety rails.
+
+ ### Fractal Intelligence
+ Hierarchical agent organizations that scale complexity while remaining manageable.
+
+ ### Infrastructure Intelligence
+ AI orchestration that works everywhere - from edge devices to cloud platforms.
+
  ---

  ## Installation
@@ -81,6 +113,37 @@ cd soe && uv sync

  ## Quick Start

+ ### 1. Provide Your LLM
+
+ SOE is LLM-agnostic. You must provide a `call_llm` function that matches this signature:
+
+ ```python
+ def call_llm(
+     prompt: str,
+     config: dict,
+ ) -> str:
+     """
+     Called by SOE when a node needs LLM processing.
+
+     Args:
+         prompt: The rendered prompt string (includes instructions, context, and schemas)
+         config: The full node configuration from YAML (useful for model parameters)
+
+     Returns:
+         The raw text response from the LLM.
+     """
+     # Example with OpenAI:
+     from openai import OpenAI
+     client = OpenAI()
+     response = client.chat.completions.create(
+         model=config.get("model", "gpt-4o"),
+         messages=[{"role": "user", "content": prompt}],
+     )
+     return response.choices[0].message.content
+ ```
+
+ ### 2. Run a Workflow
+
  ```python
  from soe import orchestrate, create_all_nodes
  from soe.local_backends import create_local_backends
@@ -98,8 +161,8 @@ example_workflow:
  # Create backends (storage for context, workflows, etc.)
  backends = create_local_backends("./data")

- # Create all node handlers
- nodes, broadcast = create_all_nodes(backends)
+ # Create all node handlers (pass your call_llm function)
+ nodes, broadcast = create_all_nodes(backends, call_llm=call_llm)

  # Run the workflow
  execution_id = orchestrate(
@@ -122,8 +185,8 @@ execution_id = orchestrate(
  | Audience | Start Here |
  |----------|------------|
  | **Builders** (workflow authors) | [Documentation](docs/index.md) — Step-by-step chapters |
- | **Engineers** (infrastructure) | [ARCHITECTURE.md](ai_docs/ARCHITECTURE.md) — Design philosophy |
- | **Researchers** (advanced patterns) | [Advanced Patterns](docs/advanced_patterns/index.md) — Swarm, hybrid, self-evolving |
+ | **Engineers** (infrastructure) | [Infrastructure Guide](docs/guide_10_infrastructure.md) — Backend protocols |
+ | **Researchers** (advanced patterns) | [Advanced Patterns](docs/advanced_patterns/) — Swarm, hybrid, self-evolving |

  ---

@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

  [project]
  name = "soe-ai"
- version = "0.1.0"
+ version = "0.1.1"
  description = "Signal-driven Orchestration Engine - Agent orchestration with event-driven workflow engine"
  readme = "README.md"
  requires-python = ">=3.8"
@@ -1,7 +1,6 @@
  from uuid import uuid4
- from typing import Dict, List, Any, Union, Callable, Optional
- from .types import Backends, BroadcastSignalsCaller
- from .local_backends import EventTypes
+ from typing import Dict, List, Any, Union, Optional
+ from .types import Backends, BroadcastSignalsCaller, NodeCaller, EventTypes, WorkflowValidationError
  from .lib.register_event import register_event
  from .lib.yaml_parser import parse_yaml
  from .lib.operational import add_operational_state
@@ -131,7 +130,7 @@ def orchestrate(
  def broadcast_signals(
      id: str,
      signals: List[str],
-     nodes: Dict[str, Callable[[str, Dict[str, Any]], None]],
+     nodes: Dict[str, NodeCaller],
      backends: Backends,
  ) -> None:
      """Broadcast signals to matching nodes in the current workflow"""
@@ -139,7 +138,7 @@ def broadcast_signals(

      register_event(backends, id, EventTypes.SIGNALS_BROADCAST, {"signals": signals})

-     workflows_registry = backends.workflow.soe_get_workflows_registry(id)
+     workflows_registry = backends.workflow.get_workflows_registry(id)

      workflow_name = backends.workflow.get_current_workflow_name(id)
      workflow = workflows_registry.get(workflow_name, {})