choreo-mini 0.1.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,148 @@
Choreo-Mini Source License
Version 1.2 (March 2026)

Copyright (c) 2026 Sivasathivel Kandasamy
https://www.linkedin.com/in/sivasathivelkandasamy/

This license allows broad use of Choreo-Mini, including commercial use as a
library or dependency, while restricting third-party productization of
Choreo-Mini itself.

1. Definitions

"Software" means this project, including its source code, build artifacts,
and documentation released under this license.

"Derivative Work" means any modification of the Software, or any work based
on the Software, where Software code is changed, extended, or incorporated in
a way that creates a derived codebase.

For clarity, merely importing, linking, calling public APIs, or configuring
the Software without modifying its code does not by itself create a
Derivative Work.

"Larger Work" means software that uses, links to, or depends on the Software,
but is otherwise independently developed and not itself a Derivative Work.

"Productize" means, for third parties, to do any of the following with the
Software or a Derivative Work:
(a) sell, license, or sublicense it as a product, plugin, extension, toolkit,
model service, or API offering;
(b) offer it as a hosted or managed service (including SaaS) where users
access its functionality over a network;
(c) provide paid access to the Software or a Derivative Work, including when
bundled with other services.

"Internal Use" means use within your own legal entity and its controlled
entities, where no external party receives direct access to the Software or a
Derivative Work as a standalone offering.

2. License Grant

Subject to compliance with this license, you are granted a non-exclusive,
worldwide, royalty-free right to:
(a) use, run, and reproduce the Software for any purpose, including commercial
Internal Use;
(b) modify the Software and create Derivative Works;
(c) use the Software inside Larger Works, including proprietary applications;
(d) distribute unmodified copies of the Software, provided this license and
attribution notices are included.

3. Productization Restriction

Except for the original author (Sivasathivel Kandasamy), no person or entity
may Productize the Software or any Derivative Work under this license.

This license does not provide a case-by-case approval or exception process for
third-party Productization.

Clarification: using the Software as an internal dependency, build tool, or
embedded library inside a Larger Work does not by itself constitute
Productization, so long as the Software is not offered to third parties as a
standalone product, API, plugin, extension, or hosted platform.

Safe-harbor examples (permitted under this license):
(a) using the Software in internal enterprise workflows, including commercial
business operations;
(b) shipping a proprietary Larger Work that depends on the unmodified
Software, with required license notice retained;
(c) using the Software in private build, testing, or deployment tooling.

Examples of prohibited third-party Productization:
(a) selling a modified fork, plugin, or extension based on the Software;
(b) offering a paid hosted API or SaaS where users access Software or
Derivative Work functionality over a network;
(c) charging for direct access to Software or Derivative Work capabilities,
even when bundled with other paid services.

4. Copyleft for Distributed or Hosted Derivatives

If you distribute a Derivative Work to third parties, or make a Derivative
Work available to third parties over a network (for example as a hosted API or
service), you must:
(a) provide the complete corresponding source code of that Derivative Work;
(b) license that Derivative Work under this same license;
(c) include this license text and preserve copyright notices;
(d) clearly mark modified files.

Scope limitation: this clause applies to the Software and Derivative Work
portions only. It does not require disclosure of independent source code in a
Larger Work that is not derived from the Software.

5. Citation and Attribution

Any public materials or user-facing interface that references, demonstrates,
or provides access to software using the Software or a Derivative Work must
include visible attribution containing at least:
- Project: Choreo-Mini
- Author: Sivasathivel Kandasamy
- Source: https://github.com/Sivasathivel/Choreo-mini

Examples include public repositories, documentation sites, demo pages,
benchmark publications, product websites, and service UI pages. Backend-only
private deployments with no public-facing materials are not required to
display attribution.

6. Contributions and Relicensing Rights

By submitting any contribution for inclusion in this project (including pull
requests, patches, or code changes), you agree that:
(a) your contribution may be distributed under this license; and
(b) you grant the author a perpetual, worldwide, irrevocable, royalty-free
right to use, modify, sublicense, and relicense your contribution,
including in future dual-licensed, commercial, or enterprise editions.

You represent that you have the legal right to submit the contribution under
these terms.

7. Patent and Trademark

Each contributor grants a non-exclusive, royalty-free patent license for patent
claims necessarily infringed by their contribution as included in the Software.
No trademark rights are granted by this license.

8. Reserved Author Rights and Future Licensing

The author reserves the exclusive right to:
(a) publish enterprise, dual-licensed, or otherwise differently licensed
future versions; and
(b) modify license terms for future releases.

Rights granted for previously released versions are not retroactively revoked
for recipients who remain in compliance with the license applicable to those
versions.

9. Disclaimer of Warranty and Limitation of Liability

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE, AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHOR OR
CONTRIBUTORS BE LIABLE FOR ANY CLAIM, DAMAGES, OR OTHER LIABILITY, WHETHER IN
AN ACTION OF CONTRACT, TORT, OR OTHERWISE, ARISING FROM, OUT OF, OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

10. Termination

Any material breach of this license automatically terminates rights granted
under it. Rights may be reinstated only by explicit written notice from the
author.
@@ -0,0 +1,232 @@
Metadata-Version: 2.4
Name: choreo-mini
Version: 0.1.0
Summary: A lightweight Python meta-framework for building, experimenting with, and orchestrating LLM-based agents across multiple runtimes.
Author: Sivasathivel Kandasamy
License: Choreo-Mini Source License 1.2
Project-URL: Homepage, https://github.com/Sivasathivel/Choreo-mini
Project-URL: Source, https://github.com/Sivasathivel/Choreo-mini
Project-URL: Author LinkedIn, https://www.linkedin.com/in/sivasathivelkandasamy/
Keywords: llm,agents,orchestration,langgraph,crewai,autogen,ai,workflow,agentic
Classifier: Development Status :: 3 - Alpha
Classifier: License :: Other/Proprietary License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Operating System :: OS Independent
Classifier: Intended Audience :: Developers
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: jinja2>=3.0.0
Provides-Extra: dev
Requires-Dist: pytest>=7.0; extra == "dev"
Requires-Dist: langgraph; extra == "dev"
Dynamic: license-file

# Choreo-Mini

**A lightweight Python meta-framework for building, experimenting with, and orchestrating LLM-based agents.**

Modern agent frameworks — LangGraph, CrewAI, AutoGen — each solve similar orchestration problems but introduce fragmented abstractions and steep learning curves. Choreo-Mini provides a Python-native developer experience that allows engineers to prototype agent workflows quickly while retaining the flexibility to run them on different orchestration runtimes.

Instead of forcing developers to commit to a single framework, Choreo-Mini acts as an orchestration meta-layer: you express your workflow once in plain Python, and Choreo-Mini compiles it to your target runtime.

---

## How it works

```
     Your Python Workflow
              │
              ▼
  AST Parser ── extracts agents, services, control flow, state mutations
              │
              ▼
  Intermediate Representation
              │
              ▼
   Jinja2 Template Compiler
              │
     ┌────────┴────────┐
     ▼        ▼        ▼
 LangGraph  CrewAI  AutoGen
```

---

## Status

Choreo-Mini is an actively developed prototype. The **LangGraph** backend is the most mature path, supporting branch-aware conditional routing, loop budgets, and service node dispatch. **CrewAI** and **AutoGen** backends produce structurally correct scaffolding and are being extended for deeper runtime fidelity.

---

## Installation

```bash
pip install choreo-mini
```

---

## Quick Start

**Define your workflow once in plain Python:**

```python
# examples/my_workflow.py
from choreo_mini.core.workflow import Workflow
from choreo_mini.core.nodes import AgentNode, ServiceNode

wf = Workflow("support", enable_profiling=True)

# agents auto-register with the workflow on creation
classifier = AgentNode(wf, "Classifier", role="ticket triage")
specialist = AgentNode(wf, "Specialist", role="issue resolver")


def main():
    while True:
        ticket = input("Ticket> ")
        if not ticket.strip():
            break
        category = wf.send("Classifier", ticket)
        response = wf.send("Specialist", f"{category.content}: {ticket}")
        print(response.content)


if __name__ == "__main__":
    main()
```

**Compile to any supported runtime:**

```bash
# to LangGraph
choreo_mini -f examples/my_workflow.py -b langgraph -o output/langgraph_output.py

# to CrewAI
choreo_mini -f examples/my_workflow.py -b crewai -o output/crewai_output.py

# to AutoGen
choreo_mini -f examples/my_workflow.py -b autogen -o output/autogen_output.py
```

**Run the generated LangGraph app directly:**

```python
from output.langgraph_output import app
from choreo_mini import Workflow, AgentNode
from choreo_mini.core.llm import LLM

wf = Workflow("support", enable_profiling=True)
AgentNode(wf, "Classifier", role="triage", llm=LLM.create("openai", api_key="..."))
AgentNode(wf, "Specialist", role="resolver", llm=LLM.create("openai", api_key="..."))

result = app.invoke({"wf": wf, "input": "login broken", "messages": [], "loop_budget": 1})
print(result["last_response"])
```

---

## Python API

```python
from choreo_mini import Workflow, AgentNode, ServiceNode, LLM, CustomLLM

# create a workflow — enable_profiling tracks latency and memory per agent
wf = Workflow("myflow", enable_profiling=True)

# attach agents with a real or custom LLM
A1 = AgentNode(wf, "Greeter", role="greeter", llm=LLM.create("openai", api_key="..."))
A2 = AgentNode(wf, "Responder", role="responder", llm=LLM.create("anthropic", api_key="..."))

# CustomLLM wraps any callable — great for local models or mocks
A3 = AgentNode(wf, "Fallback", role="fallback",
               llm=CustomLLM(lambda prompt, **kw: f"Fallback: {prompt}"))

# service nodes wrap arbitrary data functions
loader = ServiceNode(wf, "Loader", service_fn=lambda wf, path: open(path).read())

# send a message to an agent — history and profiling handled automatically
response = wf.send("Greeter", "Hello")
print(response.content)

# inspect profiling
print(wf.get_profile("Greeter"))  # {"calls": 1, "total_latency": ..., "total_memory": ...}
```

---

## Observability

When `enable_profiling=True`, Choreo-Mini instruments every agent call automatically:

| Metric | Description |
|--------|-------------|
| `call_count` | Number of times the agent was invoked |
| `total_latency` | Cumulative wall-clock inference time (seconds) |
| `total_memory` | Cumulative memory delta across calls (bytes) |
| `history` | Full conversation history per agent |

This makes it straightforward to compare runtimes or detect bottlenecks before committing to a specific framework.

---

## Supported LLM Providers

| Provider | Class | Notes |
|----------|-------|-------|
| OpenAI | `LLM.create("openai", api_key=...)` | Stub — wire the real `openai` client in `generate()` |
| Anthropic | `LLM.create("anthropic", api_key=...)` | Stub — wire the real `anthropic` client |
| Gemini | `LLM.create("gemini", api_key=...)` | Stub — wire the real Google client |
| Custom | `CustomLLM(fn)` | Wraps any `(prompt, **kw) -> str` callable |

The provider stubs make scaffolding easy; replace `generate()` with the real SDK call for production.

---

## Development

```bash
git clone https://github.com/Sivasathivel/Choreo-mini
cd choreo-mini
python -m venv .venv && source .venv/bin/activate
pip install -e ".[dev]"
pytest tests/
```

CI runs the full regression suite against Python 3.10, 3.11, and 3.12 on every push.

---

## Author

**Sivasathivel Kandasamy** — [LinkedIn](https://www.linkedin.com/in/sivasathivelkandasamy/)

---

## License

This project is released under the [Choreo-Mini Source License](LICENSE).

**What is allowed:**
- Use choreo-mini as a library or dependency inside any project, including commercial applications and internal enterprise deployments — no restriction.
- Modify the source and contribute back.
- Keep your larger application closed source when it only depends on choreo-mini and is not itself a derivative of choreo-mini.
- Ship a proprietary larger product that uses unmodified choreo-mini as a component, with license notices preserved.

**What is not allowed:**
- Building and selling a product, plugin, extension, or SaaS where choreo-mini is the core value being offered by a third party.
- Distributing or hosting a modified derivative of choreo-mini without releasing the derivative source under the same license.
- Selling paid access to choreo-mini or a derivative API/service, even when bundled with other paid features.

**Other terms:** citation is required in public materials and user-facing interfaces (for example docs, demos, public repos, benchmark reports, websites, or service UI); contributors grant the author relicensing rights; and the author reserves the right to publish enterprise/commercial editions. See [CONTRIBUTING.md](CONTRIBUTING.md) for contribution terms.
@@ -0,0 +1,204 @@
# Choreo-Mini

**A lightweight Python meta-framework for building, experimenting with, and orchestrating LLM-based agents.**

Modern agent frameworks — LangGraph, CrewAI, AutoGen — each solve similar orchestration problems but introduce fragmented abstractions and steep learning curves. Choreo-Mini provides a Python-native developer experience that allows engineers to prototype agent workflows quickly while retaining the flexibility to run them on different orchestration runtimes.

Instead of forcing developers to commit to a single framework, Choreo-Mini acts as an orchestration meta-layer: you express your workflow once in plain Python, and Choreo-Mini compiles it to your target runtime.

---

## How it works

```
     Your Python Workflow
              │
              ▼
  AST Parser ── extracts agents, services, control flow, state mutations
              │
              ▼
  Intermediate Representation
              │
              ▼
   Jinja2 Template Compiler
              │
     ┌────────┴────────┐
     ▼        ▼        ▼
 LangGraph  CrewAI  AutoGen
```
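The AST-parsing stage can be sketched in miniature with the standard library's `ast` module. This is an illustration of the idea only, not Choreo-Mini's actual parser; the `extract_nodes_and_sends` helper and the inlined workflow fragment are hypothetical:

```python
import ast

# a fragment of a Choreo-Mini-style workflow, held as source text
WORKFLOW_SRC = '''
wf = Workflow("support", enable_profiling=True)
classifier = AgentNode(wf, "Classifier", role="ticket triage")
specialist = AgentNode(wf, "Specialist", role="issue resolver")
category = wf.send("Classifier", ticket)
response = wf.send("Specialist", f"{category.content}: {ticket}")
'''

def extract_nodes_and_sends(source):
    """Collect declared agent names and wf.send(...) targets from workflow source."""
    agents, sends = [], []
    for node in ast.walk(ast.parse(source)):
        if not isinstance(node, ast.Call):
            continue
        # AgentNode(wf, "Name", ...) declares an agent
        if isinstance(node.func, ast.Name) and node.func.id == "AgentNode":
            if len(node.args) >= 2 and isinstance(node.args[1], ast.Constant):
                agents.append(node.args[1].value)
        # wf.send("Name", ...) marks a dispatch to that agent
        elif (isinstance(node.func, ast.Attribute) and node.func.attr == "send"
                and node.args and isinstance(node.args[0], ast.Constant)):
            sends.append(node.args[0].value)
    return agents, sends

agents, sends = extract_nodes_and_sends(WORKFLOW_SRC)
print(agents)  # ['Classifier', 'Specialist']
print(sends)   # ['Classifier', 'Specialist']
```

A real compiler would additionally capture the control flow around these calls (loops, branches) before lowering everything to the intermediate representation consumed by the Jinja2 templates.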

---

## Status

Choreo-Mini is an actively developed prototype. The **LangGraph** backend is the most mature path, supporting branch-aware conditional routing, loop budgets, and service node dispatch. **CrewAI** and **AutoGen** backends produce structurally correct scaffolding and are being extended for deeper runtime fidelity.

---

## Installation

```bash
pip install choreo-mini
```

---

## Quick Start

**Define your workflow once in plain Python:**

```python
# examples/my_workflow.py
from choreo_mini.core.workflow import Workflow
from choreo_mini.core.nodes import AgentNode, ServiceNode

wf = Workflow("support", enable_profiling=True)

# agents auto-register with the workflow on creation
classifier = AgentNode(wf, "Classifier", role="ticket triage")
specialist = AgentNode(wf, "Specialist", role="issue resolver")


def main():
    while True:
        ticket = input("Ticket> ")
        if not ticket.strip():
            break
        category = wf.send("Classifier", ticket)
        response = wf.send("Specialist", f"{category.content}: {ticket}")
        print(response.content)


if __name__ == "__main__":
    main()
```

**Compile to any supported runtime:**

```bash
# to LangGraph
choreo_mini -f examples/my_workflow.py -b langgraph -o output/langgraph_output.py

# to CrewAI
choreo_mini -f examples/my_workflow.py -b crewai -o output/crewai_output.py

# to AutoGen
choreo_mini -f examples/my_workflow.py -b autogen -o output/autogen_output.py
```

**Run the generated LangGraph app directly:**

```python
from output.langgraph_output import app
from choreo_mini import Workflow, AgentNode
from choreo_mini.core.llm import LLM

wf = Workflow("support", enable_profiling=True)
AgentNode(wf, "Classifier", role="triage", llm=LLM.create("openai", api_key="..."))
AgentNode(wf, "Specialist", role="resolver", llm=LLM.create("openai", api_key="..."))

result = app.invoke({"wf": wf, "input": "login broken", "messages": [], "loop_budget": 1})
print(result["last_response"])
```

---

## Python API

```python
from choreo_mini import Workflow, AgentNode, ServiceNode, LLM, CustomLLM

# create a workflow — enable_profiling tracks latency and memory per agent
wf = Workflow("myflow", enable_profiling=True)

# attach agents with a real or custom LLM
A1 = AgentNode(wf, "Greeter", role="greeter", llm=LLM.create("openai", api_key="..."))
A2 = AgentNode(wf, "Responder", role="responder", llm=LLM.create("anthropic", api_key="..."))

# CustomLLM wraps any callable — great for local models or mocks
A3 = AgentNode(wf, "Fallback", role="fallback",
               llm=CustomLLM(lambda prompt, **kw: f"Fallback: {prompt}"))

# service nodes wrap arbitrary data functions
loader = ServiceNode(wf, "Loader", service_fn=lambda wf, path: open(path).read())

# send a message to an agent — history and profiling handled automatically
response = wf.send("Greeter", "Hello")
print(response.content)

# inspect profiling
print(wf.get_profile("Greeter"))  # {"calls": 1, "total_latency": ..., "total_memory": ...}
```

---

## Observability

When `enable_profiling=True`, Choreo-Mini instruments every agent call automatically:

| Metric | Description |
|--------|-------------|
| `call_count` | Number of times the agent was invoked |
| `total_latency` | Cumulative wall-clock inference time (seconds) |
| `total_memory` | Cumulative memory delta across calls (bytes) |
| `history` | Full conversation history per agent |

This makes it straightforward to compare runtimes or detect bottlenecks before committing to a specific framework.
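What the profiler records can be mimicked in a few lines of plain Python. The `MiniProfiler` below is a hypothetical stand-in for illustration, not Choreo-Mini's implementation; it accumulates the same kind of `call_count` and `total_latency` metrics around any callable:

```python
import time
from collections import defaultdict

class MiniProfiler:
    """Toy per-agent profiler mirroring the metrics in the table above."""

    def __init__(self):
        self.stats = defaultdict(lambda: {"call_count": 0, "total_latency": 0.0})

    def record(self, agent_name, fn, *args, **kwargs):
        # time the call and fold the result into the agent's running totals
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        entry = self.stats[agent_name]
        entry["call_count"] += 1
        entry["total_latency"] += time.perf_counter() - start
        return result

profiler = MiniProfiler()
reply = profiler.record("Greeter", lambda prompt: f"Hello, {prompt}!", "world")
print(reply)                                    # Hello, world!
print(profiler.stats["Greeter"]["call_count"])  # 1
```

Running the same workflow against two generated backends and comparing the accumulated totals is one way to choose a runtime empirically.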

---

## Supported LLM Providers

| Provider | Class | Notes |
|----------|-------|-------|
| OpenAI | `LLM.create("openai", api_key=...)` | Stub — wire the real `openai` client in `generate()` |
| Anthropic | `LLM.create("anthropic", api_key=...)` | Stub — wire the real `anthropic` client |
| Gemini | `LLM.create("gemini", api_key=...)` | Stub — wire the real Google client |
| Custom | `CustomLLM(fn)` | Wraps any `(prompt, **kw) -> str` callable |

The provider stubs make scaffolding easy; replace `generate()` with the real SDK call for production.
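The custom-provider row shows the underlying contract: an LLM here is anything that turns a prompt into a string. As a rough sketch of that pattern (using a hypothetical `EchoLLM` wrapper, not the actual `CustomLLM` internals):

```python
class EchoLLM:
    """Wraps any (prompt, **kw) -> str callable behind a generate() method."""

    def __init__(self, fn):
        self.fn = fn

    def generate(self, prompt, **kwargs):
        return self.fn(prompt, **kwargs)

# a deterministic "model", handy for offline tests before wiring a real SDK
mock = EchoLLM(lambda prompt, **kw: f"echo[{kw.get('temperature', 0)}]: {prompt}")
print(mock.generate("ping", temperature=0.2))  # echo[0.2]: ping
```

Because `CustomLLM(fn)` accepts the same shape of callable, a mock like this can be swapped for a real provider without touching workflow code.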

---

## Development

```bash
git clone https://github.com/Sivasathivel/Choreo-mini
cd choreo-mini
python -m venv .venv && source .venv/bin/activate
pip install -e ".[dev]"
pytest tests/
```

CI runs the full regression suite against Python 3.10, 3.11, and 3.12 on every push.

---

## Author

**Sivasathivel Kandasamy** — [LinkedIn](https://www.linkedin.com/in/sivasathivelkandasamy/)

---

## License

This project is released under the [Choreo-Mini Source License](LICENSE).

**What is allowed:**
- Use choreo-mini as a library or dependency inside any project, including commercial applications and internal enterprise deployments — no restriction.
- Modify the source and contribute back.
- Keep your larger application closed source when it only depends on choreo-mini and is not itself a derivative of choreo-mini.
- Ship a proprietary larger product that uses unmodified choreo-mini as a component, with license notices preserved.

**What is not allowed:**
- Building and selling a product, plugin, extension, or SaaS where choreo-mini is the core value being offered by a third party.
- Distributing or hosting a modified derivative of choreo-mini without releasing the derivative source under the same license.
- Selling paid access to choreo-mini or a derivative API/service, even when bundled with other paid features.

**Other terms:** citation is required in public materials and user-facing interfaces (for example docs, demos, public repos, benchmark reports, websites, or service UI); contributors grant the author relicensing rights; and the author reserves the right to publish enterprise/commercial editions. See [CONTRIBUTING.md](CONTRIBUTING.md) for contribution terms.
@@ -0,0 +1,18 @@
"""Choreo-Mini — Python-native LLM agent workflow orchestration."""

from choreo_mini.core.workflow import Workflow, AgentState
from choreo_mini.core.nodes import AgentNode, ServiceNode
from choreo_mini.core.llm import LLM, CustomLLM, Message

__version__ = "0.1.0"
__author__ = "Sivasathivel Kandasamy"

__all__ = [
    "Workflow",
    "AgentState",
    "AgentNode",
    "ServiceNode",
    "LLM",
    "CustomLLM",
    "Message",
]