auto-workflow 0.1.0__tar.gz → 0.1.1__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (30)
  1. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/PKG-INFO +63 -52
  2. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/README.md +62 -51
  3. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/pyproject.toml +1 -1
  4. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/LICENSE +0 -0
  5. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/assets/logo.svg +0 -0
  6. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/__init__.py +0 -0
  7. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/__main__.py +0 -0
  8. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/artifacts.py +0 -0
  9. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/build.py +0 -0
  10. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/cache.py +0 -0
  11. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/cli.py +0 -0
  12. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/config.py +0 -0
  13. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/context.py +0 -0
  14. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/dag.py +0 -0
  15. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/events.py +0 -0
  16. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/exceptions.py +0 -0
  17. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/execution.py +0 -0
  18. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/fanout.py +0 -0
  19. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/flow.py +0 -0
  20. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/lifecycle.py +0 -0
  21. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/logging_middleware.py +0 -0
  22. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/metrics_provider.py +0 -0
  23. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/middleware.py +0 -0
  24. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/py.typed +0 -0
  25. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/scheduler.py +0 -0
  26. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/secrets.py +0 -0
  27. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/task.py +0 -0
  28. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/tracing.py +0 -0
  29. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/types.py +0 -0
  30. {auto_workflow-0.1.0 → auto_workflow-0.1.1}/auto_workflow/utils.py +0 -0
{auto_workflow-0.1.0 → auto_workflow-0.1.1}/PKG-INFO
@@ -1,6 +1,6 @@
  Metadata-Version: 2.1
  Name: auto-workflow
- Version: 0.1.0
+ Version: 0.1.1
  Summary: A lightweight, developer-first workflow & task orchestration engine for Python.
  Home-page: https://github.com/stoiandl/auto-workflow
  License: GPL-3.0-or-later
@@ -36,7 +36,6 @@ Description-Content-Type: text/markdown
  [![Docs build](https://github.com/stoiandl/auto-workflow/actions/workflows/docs.yml/badge.svg?branch=main&event=push)](https://github.com/stoiandl/auto-workflow/actions/workflows/docs.yml)
  [![Coverage Status](https://img.shields.io/codecov/c/github/stoiandl/auto-workflow/main?logo=codecov&label=coverage)](https://app.codecov.io/gh/stoiandl/auto-workflow)
  [![PyPI](https://img.shields.io/pypi/v/auto-workflow.svg?logo=pypi&label=PyPI)](https://pypi.org/project/auto-workflow/)
- [![PyPI - Downloads](https://img.shields.io/pypi/dm/auto-workflow.svg?label=downloads)](https://pypi.org/project/auto-workflow/)
  [![Docs](https://img.shields.io/badge/docs-GitHub%20Pages-blue)](https://stoiandl.github.io/auto-workflow/) [![License: GPLv3](https://img.shields.io/badge/License-GPLv3-blue.svg)](LICENSE)

  _A lightweight, zero-bloat, developer‑first workflow & task orchestration engine for Python._
@@ -103,7 +102,18 @@ Core values: **No mandatory DB**, **no daemon**, **no CLI bureaucracy**, **opt


  ## Feature Overview
- Planned / partially implemented capabilities:
+ Core capabilities:
+
+ - **Task Definition**: `@task` decorator with retry, timeout, caching, and execution mode options
+ - **Flow Orchestration**: `@flow` decorator for building DAGs with automatic dependency resolution
+ - **Dynamic Fan-Out**: `fan_out()` for runtime task creation based on upstream results
+ - **Multiple Execution Modes**: async, thread pool, and process pool execution
+ - **Caching & Artifacts**: Task result caching and large result persistence
+ - **Observability**: Built-in logging, metrics, tracing, and event system
+ - **Configuration**: Environment-based config with structured logging
+ - **CLI Tools**: Run, describe, and list flows via command line
+ - **Secrets Management**: Pluggable secrets providers
+ - **Failure Handling**: Configurable retry policies and failure propagation



@@ -124,7 +134,7 @@ poetry run pytest --cov=auto_workflow --cov-report=term-missing

  ### Define Tasks
  ```python
- from auto_workflow import task, flow
+ from auto_workflow import task, flow, fan_out

  @task
  def load_numbers() -> list[int]:
@@ -141,8 +151,8 @@ def aggregate(values: list[int]) -> int:
  @flow
  def pipeline():
      nums = load_numbers()
-     # Fan-out map (dynamic child tasks)
-     squared = [square(n) for n in nums]  # Under the hood becomes dynamic tasks
+     # Dynamic fan-out: create tasks for each number
+     squared = fan_out(square, nums)
      return aggregate(squared)

  if __name__ == "__main__":
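The pipeline change above swaps an implicit list comprehension for an explicit `fan_out` call. Its semantics, one dynamic child task per upstream item, executed concurrently and then aggregated, can be sketched in plain Python. This is a stand-in using a thread pool, not the library's implementation:

```python
# Hypothetical stand-in for the diff's `fan_out(square, nums)`:
# dynamic fan-out is "one unit of work per upstream item, run concurrently".
from concurrent.futures import ThreadPoolExecutor

def load_numbers() -> list[int]:
    return [1, 2, 3, 4]

def square(n: int) -> int:
    return n * n

def aggregate(values: list[int]) -> int:
    return sum(values)

def pipeline() -> int:
    nums = load_numbers()
    # fan_out(square, nums) equivalent: schedule one task per item
    with ThreadPoolExecutor() as pool:
        squared = list(pool.map(square, nums))
    return aggregate(squared)

print(pipeline())  # 1 + 4 + 9 + 16 = 30
```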
@@ -196,27 +206,18 @@ Mode selection:


  ## Building Flows & DAGs
- Two equivalent approaches (both may be supported):
-
- 1. **Imperative Functional** (Python execution builds nodes):
- ```python
- @flow
- def my_flow():
-     a = task_a()
-     b = task_b(a)
-     c = task_c(a, b)
-     return c
- ```
- 2. **Explicit Builder** (defer evaluation):
- ```python
- from auto_workflow import FlowBuilder
- fb = FlowBuilder(name="my_flow")
- a = fb.task(task_a)
- b = fb.task(task_b, a)
- c = fb.task(task_c, a, b)
- flow = fb.build()
- flow.run()
- ```
+ Flows are defined using the `@flow` decorator:
+
+ ```python
+ @flow
+ def my_flow():
+     a = task_a()
+     b = task_b(a)
+     c = task_c(a, b)
+     return c
+ ```
+
+ Task dependencies are determined automatically by passing task invocation results as arguments to other tasks.


  ## Dynamic Fan-Out / Conditional Branching
@@ -250,13 +251,16 @@ def conditional_flow(flag: bool):


  ## Result Handling, Caching & Idempotency
- Strategies (planned):
+ Tasks support caching with TTL and artifact persistence for large results:

- Example (concept):
  ```python
- @task(cache_ttl=3600)
+ @task(cache_ttl=3600)  # Cache for 1 hour
  def expensive(x: int) -> int:
      return do_work(x)
+
+ @task(persist=True)  # Store large results via artifact store
+ def produce_large_dataset() -> dict:
+     return {"data": list(range(1000000))}
  ```


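The `cache_ttl` option added in this hunk can be illustrated with a minimal TTL-memoization decorator. This is a hypothetical sketch of the mechanism, not the library's code:

```python
# Minimal sketch of what a cache_ttl-style option could do:
# memoize by positional arguments, expire entries after `ttl` seconds.
import time
from functools import wraps

def cache_ttl(ttl: float):
    def decorator(fn):
        store: dict = {}  # args -> (expiry, value)

        @wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit is not None and hit[0] > now:
                return hit[1]  # still fresh: serve cached result
            value = fn(*args)
            store[args] = (now + ttl, value)
            return value
        return wrapper
    return decorator

calls = 0

@cache_ttl(3600)
def expensive(x: int) -> int:
    global calls
    calls += 1
    return x * 2

assert expensive(21) == 42 and expensive(21) == 42
print(calls)  # second call is served from cache
```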
@@ -266,7 +270,9 @@ Per-task configuration:
  @task(retries=3, retry_backoff=2.0, retry_jitter=0.3, timeout=30)
  def flaky(): ...
  ```
- Failure policy options (proposed):
+ Failure policy options:
+ - `FAIL_FAST`: Stop on first error (default)
+ - `CONTINUE`: Continue executing independent tasks


  ## Hooks, Events & Middleware
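The retry knobs in the hunk above (`retries`, `retry_backoff`, `retry_jitter`) describe exponential backoff with jitter between attempts. A self-contained sketch of that mechanism, under assumed semantics rather than the library's actual implementation:

```python
# Sketch of retry-with-backoff semantics: up to `retries` re-attempts,
# delay multiplied by `backoff` each time, with random jitter added.
import random
import time
from functools import wraps

def retry(retries: int = 3, backoff: float = 2.0, jitter: float = 0.3):
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            delay = 0.01  # base delay, kept tiny for the example
            for attempt in range(retries + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == retries:
                        raise  # retries exhausted: propagate the error
                    time.sleep(delay * (1 + random.uniform(0, jitter)))
                    delay *= backoff
        return wrapper
    return decorator

attempts = 0

@retry(retries=3)
def flaky() -> str:
    global attempts
    attempts += 1
    if attempts < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(flaky(), attempts)  # succeeds on the third attempt
```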
@@ -285,35 +291,40 @@ def timing_middleware(next_call):
      return wrapper
  ```

- Event bus emission (planned): structured events -> pluggable sinks (stdout logger, OTLP exporter, WebSocket UI).
+ Event bus: structured events with pluggable subscribers for custom logging and monitoring.


  ## Configuration & Environment
- Minimal first-class configuration (future `pyproject.toml` block):
- ```toml
- [tool.auto_workflow]
- default-executor = "async"
- log-level = "INFO"
- max-dynamic-tasks = 2048
- ```
- Environment overrides are available for documented keys (see docs/configuration.md).
+ Configuration via environment variables:
+ - `AUTO_WORKFLOW_LOG_LEVEL`: Set logging level (default: INFO)
+ - `AUTO_WORKFLOW_DISABLE_STRUCTURED_LOGS`: Disable structured logging
+ - `AUTO_WORKFLOW_MAX_DYNAMIC_TASKS`: Limit dynamic task expansion
+
+ See docs/configuration.md for full details.


  ## Observability (Logging, Metrics, Tracing)
- Implemented surface + extensions you can plug in:
+ Built-in observability features:
+
+ - **Structured Logging**: Automatic JSON-formatted logging with task/flow context
+ - **Metrics**: Pluggable metrics providers (in-memory and custom backends)
+ - **Tracing**: Task and flow execution spans for performance monitoring
+ - **Events**: Pub/sub event system for task lifecycle hooks
+ - **Middleware**: Chain custom logic around task execution


- ## Extensibility Roadmap
+ ## Extensibility
  | Extension | Interface | Status |
  |-----------|-----------|--------|
- | Executor plugins | `BaseExecutor` | Planned |
- | Storage backend | `ArtifactStore` | Planned |
- | Cache backend | `ResultCache` | Planned |
- | Metrics provider | `MetricsProvider` | Planned |
- | Tracing adapter | `Tracer` | Planned |
- | Retry policy | Strategy object | Planned |
- | Scheduling layer | External module | Backlog |
- | UI / API | Optional service | Backlog |
+ | Storage backend | `ArtifactStore` | Implemented |
+ | Cache backend | `ResultCache` | Implemented |
+ | Metrics provider | `MetricsProvider` | Implemented |
+ | Tracing adapter | `Tracer` | Implemented |
+ | Secrets provider | `SecretsProvider` | Implemented |
+ | Event middleware | Middleware chain | Implemented |
+ | Executor plugins | `BaseExecutor` | Future |
+ | Scheduling layer | External module | Future |
+ | UI / API | Optional service | Future |


  ## Security & Isolation Considerations
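The middleware chain described in the hunk above (each middleware receives the next callable and returns a wrapper around it) can be sketched as follows. `apply_chain` and both middlewares are hypothetical illustrations of the pattern, not part of the package:

```python
# Middleware chain sketch: each middleware wraps the next callable,
# so wrapping innermost-last makes the first middleware run first.
import time

def logging_middleware(next_call):
    def wrapper(*args, **kwargs):
        print("calling task")
        return next_call(*args, **kwargs)
    return wrapper

def timing_middleware(next_call):
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return next_call(*args, **kwargs)
        finally:
            print(f"took {time.perf_counter() - start:.6f}s")
    return wrapper

def apply_chain(task, middlewares):
    for mw in reversed(middlewares):
        task = mw(task)
    return task

wrapped = apply_chain(lambda x: x + 1, [logging_middleware, timing_middleware])
print(wrapped(41))  # logs, times, then returns 42
```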
@@ -365,7 +376,7 @@ docs/

  **Q: Do I need decorators?** No; you can manually wrap callables into Tasks if you prefer pure functional style.

- **Q: How does it serialize arguments across processes?** Planned fallback: cloudpickle; user can register custom serializer.
+ **Q: How does it serialize arguments across processes?** Uses cloudpickle for process execution mode.

  **Q: Scheduling / cron?** Out of core scope—provide a thin adapter so external schedulers (cron, systemd timers, GitHub Actions) can invoke flows.

@@ -400,7 +411,7 @@ Contributions are welcome once the core API draft solidifies. Until then:
  3. Adhere to Ruff formatting & lint rules (pre-commit enforced).
  4. Add or update examples & docs for new features.

- Planned contribution guides: `CONTRIBUTING.md`, `CODE_OF_CONDUCT.md`.
+ See `CONTRIBUTING.md` for detailed contribution guidelines.


  ## Versioning & Stability
{auto_workflow-0.1.0 → auto_workflow-0.1.1}/README.md
@@ -11,7 +11,6 @@
  [![Docs build](https://github.com/stoiandl/auto-workflow/actions/workflows/docs.yml/badge.svg?branch=main&event=push)](https://github.com/stoiandl/auto-workflow/actions/workflows/docs.yml)
  [![Coverage Status](https://img.shields.io/codecov/c/github/stoiandl/auto-workflow/main?logo=codecov&label=coverage)](https://app.codecov.io/gh/stoiandl/auto-workflow)
  [![PyPI](https://img.shields.io/pypi/v/auto-workflow.svg?logo=pypi&label=PyPI)](https://pypi.org/project/auto-workflow/)
- [![PyPI - Downloads](https://img.shields.io/pypi/dm/auto-workflow.svg?label=downloads)](https://pypi.org/project/auto-workflow/)
  [![Docs](https://img.shields.io/badge/docs-GitHub%20Pages-blue)](https://stoiandl.github.io/auto-workflow/) [![License: GPLv3](https://img.shields.io/badge/License-GPLv3-blue.svg)](LICENSE)

  _A lightweight, zero-bloat, developer‑first workflow & task orchestration engine for Python._
@@ -78,7 +77,18 @@ Core values: **No mandatory DB**, **no daemon**, **no CLI bureaucracy**, **opt


  ## Feature Overview
- Planned / partially implemented capabilities:
+ Core capabilities:
+
+ - **Task Definition**: `@task` decorator with retry, timeout, caching, and execution mode options
+ - **Flow Orchestration**: `@flow` decorator for building DAGs with automatic dependency resolution
+ - **Dynamic Fan-Out**: `fan_out()` for runtime task creation based on upstream results
+ - **Multiple Execution Modes**: async, thread pool, and process pool execution
+ - **Caching & Artifacts**: Task result caching and large result persistence
+ - **Observability**: Built-in logging, metrics, tracing, and event system
+ - **Configuration**: Environment-based config with structured logging
+ - **CLI Tools**: Run, describe, and list flows via command line
+ - **Secrets Management**: Pluggable secrets providers
+ - **Failure Handling**: Configurable retry policies and failure propagation



@@ -99,7 +109,7 @@ poetry run pytest --cov=auto_workflow --cov-report=term-missing

  ### Define Tasks
  ```python
- from auto_workflow import task, flow
+ from auto_workflow import task, flow, fan_out

  @task
  def load_numbers() -> list[int]:
@@ -116,8 +126,8 @@ def aggregate(values: list[int]) -> int:
  @flow
  def pipeline():
      nums = load_numbers()
-     # Fan-out map (dynamic child tasks)
-     squared = [square(n) for n in nums]  # Under the hood becomes dynamic tasks
+     # Dynamic fan-out: create tasks for each number
+     squared = fan_out(square, nums)
      return aggregate(squared)

  if __name__ == "__main__":
@@ -171,27 +181,18 @@ Mode selection:


  ## Building Flows & DAGs
- Two equivalent approaches (both may be supported):
-
- 1. **Imperative Functional** (Python execution builds nodes):
- ```python
- @flow
- def my_flow():
-     a = task_a()
-     b = task_b(a)
-     c = task_c(a, b)
-     return c
- ```
- 2. **Explicit Builder** (defer evaluation):
- ```python
- from auto_workflow import FlowBuilder
- fb = FlowBuilder(name="my_flow")
- a = fb.task(task_a)
- b = fb.task(task_b, a)
- c = fb.task(task_c, a, b)
- flow = fb.build()
- flow.run()
- ```
+ Flows are defined using the `@flow` decorator:
+
+ ```python
+ @flow
+ def my_flow():
+     a = task_a()
+     b = task_b(a)
+     c = task_c(a, b)
+     return c
+ ```
+
+ Task dependencies are determined automatically by passing task invocation results as arguments to other tasks.


  ## Dynamic Fan-Out / Conditional Branching
@@ -225,13 +226,16 @@ def conditional_flow(flag: bool):


  ## Result Handling, Caching & Idempotency
- Strategies (planned):
+ Tasks support caching with TTL and artifact persistence for large results:

- Example (concept):
  ```python
- @task(cache_ttl=3600)
+ @task(cache_ttl=3600)  # Cache for 1 hour
  def expensive(x: int) -> int:
      return do_work(x)
+
+ @task(persist=True)  # Store large results via artifact store
+ def produce_large_dataset() -> dict:
+     return {"data": list(range(1000000))}
  ```


@@ -241,7 +245,9 @@ Per-task configuration:
  @task(retries=3, retry_backoff=2.0, retry_jitter=0.3, timeout=30)
  def flaky(): ...
  ```
- Failure policy options (proposed):
+ Failure policy options:
+ - `FAIL_FAST`: Stop on first error (default)
+ - `CONTINUE`: Continue executing independent tasks


  ## Hooks, Events & Middleware
@@ -260,35 +266,40 @@ def timing_middleware(next_call):
      return wrapper
  ```

- Event bus emission (planned): structured events -> pluggable sinks (stdout logger, OTLP exporter, WebSocket UI).
+ Event bus: structured events with pluggable subscribers for custom logging and monitoring.


  ## Configuration & Environment
- Minimal first-class configuration (future `pyproject.toml` block):
- ```toml
- [tool.auto_workflow]
- default-executor = "async"
- log-level = "INFO"
- max-dynamic-tasks = 2048
- ```
- Environment overrides are available for documented keys (see docs/configuration.md).
+ Configuration via environment variables:
+ - `AUTO_WORKFLOW_LOG_LEVEL`: Set logging level (default: INFO)
+ - `AUTO_WORKFLOW_DISABLE_STRUCTURED_LOGS`: Disable structured logging
+ - `AUTO_WORKFLOW_MAX_DYNAMIC_TASKS`: Limit dynamic task expansion
+
+ See docs/configuration.md for full details.


  ## Observability (Logging, Metrics, Tracing)
- Implemented surface + extensions you can plug in:
+ Built-in observability features:
+
+ - **Structured Logging**: Automatic JSON-formatted logging with task/flow context
+ - **Metrics**: Pluggable metrics providers (in-memory and custom backends)
+ - **Tracing**: Task and flow execution spans for performance monitoring
+ - **Events**: Pub/sub event system for task lifecycle hooks
+ - **Middleware**: Chain custom logic around task execution


- ## Extensibility Roadmap
+ ## Extensibility
  | Extension | Interface | Status |
  |-----------|-----------|--------|
- | Executor plugins | `BaseExecutor` | Planned |
- | Storage backend | `ArtifactStore` | Planned |
- | Cache backend | `ResultCache` | Planned |
- | Metrics provider | `MetricsProvider` | Planned |
- | Tracing adapter | `Tracer` | Planned |
- | Retry policy | Strategy object | Planned |
- | Scheduling layer | External module | Backlog |
- | UI / API | Optional service | Backlog |
+ | Storage backend | `ArtifactStore` | Implemented |
+ | Cache backend | `ResultCache` | Implemented |
+ | Metrics provider | `MetricsProvider` | Implemented |
+ | Tracing adapter | `Tracer` | Implemented |
+ | Secrets provider | `SecretsProvider` | Implemented |
+ | Event middleware | Middleware chain | Implemented |
+ | Executor plugins | `BaseExecutor` | Future |
+ | Scheduling layer | External module | Future |
+ | UI / API | Optional service | Future |


  ## Security & Isolation Considerations
@@ -340,7 +351,7 @@ docs/

  **Q: Do I need decorators?** No; you can manually wrap callables into Tasks if you prefer pure functional style.

- **Q: How does it serialize arguments across processes?** Planned fallback: cloudpickle; user can register custom serializer.
+ **Q: How does it serialize arguments across processes?** Uses cloudpickle for process execution mode.

  **Q: Scheduling / cron?** Out of core scope—provide a thin adapter so external schedulers (cron, systemd timers, GitHub Actions) can invoke flows.

@@ -375,7 +386,7 @@ Contributions are welcome once the core API draft solidifies. Until then:
  3. Adhere to Ruff formatting & lint rules (pre-commit enforced).
  4. Add or update examples & docs for new features.

- Planned contribution guides: `CONTRIBUTING.md`, `CODE_OF_CONDUCT.md`.
+ See `CONTRIBUTING.md` for detailed contribution guidelines.


  ## Versioning & Stability
{auto_workflow-0.1.0 → auto_workflow-0.1.1}/pyproject.toml
@@ -1,6 +1,6 @@
  [tool.poetry]
  name = "auto-workflow"
- version = "0.1.0"
+ version = "0.1.1"
  description = "A lightweight, developer-first workflow & task orchestration engine for Python."
  authors = ["andreistoica <andreilst@yahoo.ro>"]
  readme = "README.md"