dotflow 0.13.2.dev1__tar.gz → 0.14.0.dev1__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (73)
  1. dotflow-0.14.0.dev1/PKG-INFO +837 -0
  2. dotflow-0.14.0.dev1/README.md +773 -0
  3. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/__init__.py +1 -1
  4. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/abc/flow.py +2 -0
  5. dotflow-0.14.0.dev1/dotflow/abc/scheduler.py +16 -0
  6. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/cli/commands/__init__.py +2 -1
  7. dotflow-0.14.0.dev1/dotflow/cli/commands/init.py +19 -0
  8. dotflow-0.14.0.dev1/dotflow/cli/commands/schedule.py +58 -0
  9. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/cli/commands/start.py +21 -7
  10. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/cli/setup.py +80 -5
  11. dotflow-0.14.0.dev1/dotflow/cli/validators/start.py +17 -0
  12. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/action.py +51 -15
  13. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/config.py +28 -2
  14. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/dotflow.py +29 -7
  15. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/exception.py +10 -0
  16. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/execution.py +3 -3
  17. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/serializers/task.py +34 -16
  18. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/serializers/workflow.py +9 -0
  19. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/task.py +52 -21
  20. dotflow-0.14.0.dev1/dotflow/core/types/__init__.py +15 -0
  21. dotflow-0.14.0.dev1/dotflow/core/types/overlap.py +31 -0
  22. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/types/storage.py +2 -0
  23. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/workflow.py +77 -6
  24. dotflow-0.14.0.dev1/dotflow/providers/__init__.py +39 -0
  25. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/providers/log_default.py +1 -5
  26. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/providers/notify_telegram.py +2 -0
  27. dotflow-0.14.0.dev1/dotflow/providers/scheduler_cron.py +224 -0
  28. dotflow-0.14.0.dev1/dotflow/providers/scheduler_default.py +13 -0
  29. dotflow-0.14.0.dev1/dotflow/providers/storage_gcs.py +119 -0
  30. dotflow-0.14.0.dev1/dotflow/providers/storage_s3.py +121 -0
  31. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/utils/tools.py +5 -4
  32. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/pyproject.toml +51 -5
  33. dotflow-0.13.2.dev1/PKG-INFO +0 -421
  34. dotflow-0.13.2.dev1/README.md +0 -370
  35. dotflow-0.13.2.dev1/dotflow/cli/commands/init.py +0 -18
  36. dotflow-0.13.2.dev1/dotflow/cli/validators/start.py +0 -13
  37. dotflow-0.13.2.dev1/dotflow/core/types/__init__.py +0 -7
  38. dotflow-0.13.2.dev1/dotflow/providers/__init__.py +0 -15
  39. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/LICENSE +0 -0
  40. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/abc/__init__.py +0 -0
  41. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/abc/api.py +0 -0
  42. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/abc/file.py +0 -0
  43. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/abc/http.py +0 -0
  44. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/abc/log.py +0 -0
  45. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/abc/notify.py +0 -0
  46. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/abc/storage.py +0 -0
  47. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/abc/tcp.py +0 -0
  48. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/cli/__init__.py +0 -0
  49. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/cli/command.py +0 -0
  50. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/cli/commands/log.py +0 -0
  51. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/cli/validators/__init__.py +0 -0
  52. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/__init__.py +0 -0
  53. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/context.py +0 -0
  54. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/decorators/__init__.py +0 -0
  55. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/decorators/time.py +0 -0
  56. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/module.py +0 -0
  57. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/serializers/__init__.py +0 -0
  58. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/serializers/transport.py +0 -0
  59. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/types/execution.py +0 -0
  60. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/core/types/status.py +0 -0
  61. /dotflow-0.13.2.dev1/dotflow/core/types/worflow.py → /dotflow-0.14.0.dev1/dotflow/core/types/workflow.py +0 -0
  62. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/logging.py +0 -0
  63. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/main.py +0 -0
  64. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/providers/api_default.py +0 -0
  65. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/providers/notify_default.py +0 -0
  66. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/providers/storage_default.py +0 -0
  67. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/providers/storage_file.py +0 -0
  68. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/settings.py +0 -0
  69. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/storage.py +0 -0
  70. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/types.py +0 -0
  71. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/utils/__init__.py +0 -0
  72. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/utils/basic_functions.py +0 -0
  73. {dotflow-0.13.2.dev1 → dotflow-0.14.0.dev1}/dotflow/utils/error_handler.py +0 -0
@@ -0,0 +1,837 @@
Metadata-Version: 2.4
Name: dotflow
Version: 0.14.0.dev1
Summary: 🎲 Dotflow turns an idea into flow! Lightweight Python library for execution pipelines with retry, parallel, cron and async support.
License: MIT License

Copyright (c) 2025 Fernando Celmer

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
License-File: LICENSE
Keywords: pipeline,workflow,etl,task-runner,orchestration,automation,parallel,async,retry,data-pipeline,scheduler,cron,cron-job
Author: Fernando Celmer
Author-email: email@fernandocelmer.com
Requires-Python: >=3.9
Classifier: Development Status :: 4 - Beta
Classifier: Operating System :: OS Independent
Classifier: Intended Audience :: Developers
Classifier: Natural Language :: English
Classifier: License :: OSI Approved :: MIT License
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: System :: Distributed Computing
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Provides-Extra: aws
Provides-Extra: gcp
Provides-Extra: mongodb
Provides-Extra: scheduler
Requires-Dist: boto3 ; extra == "aws"
Requires-Dist: cookiecutter (>=2.0)
Requires-Dist: croniter ; extra == "scheduler"
Requires-Dist: dotflow-mongodb ; extra == "mongodb"
Requires-Dist: google-cloud-storage ; extra == "gcp"
Requires-Dist: pydantic
Requires-Dist: requests
Requires-Dist: rich
Requires-Dist: typing-extensions
Project-URL: Changelog, https://dotflow-io.github.io/dotflow/nav/development/release-notes/
Project-URL: Documentation, https://dotflow-io.github.io/dotflow/
Project-URL: Homepage, https://github.com/dotflow-io/dotflow
Project-URL: Issues, https://github.com/dotflow-io/dotflow/issues
Project-URL: Repository, https://github.com/dotflow-io/dotflow
Description-Content-Type: text/markdown

<div align="center">
  <a aria-label="Dotflow Website" href="https://dotflow.io">Website</a>
  &nbsp;•&nbsp;
  <a aria-label="Dotflow Documentation" href="https://dotflow-io.github.io/dotflow/">Documentation</a>
  &nbsp;•&nbsp;
  <a aria-label="PyPI" href="https://pypi.org/project/dotflow/">PyPI</a>
</div>

<br/>

<div align="center">

![](https://raw.githubusercontent.com/FernandoCelmer/dotflow/master/docs/assets/dotflow.gif)

![GitHub Org's stars](https://img.shields.io/github/stars/dotflow-io?label=Dotflow&style=flat-square)
![GitHub last commit](https://img.shields.io/github/last-commit/dotflow-io/dotflow?style=flat-square)
![PyPI](https://img.shields.io/pypi/v/dotflow?style=flat-square)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/dotflow?style=flat-square)
![PyPI - Downloads](https://img.shields.io/pypi/dm/dotflow?style=flat-square)

</div>

# Welcome to Dotflow

Dotflow is a lightweight Python library for building execution pipelines. Define tasks with decorators, chain them together, and run workflows in sequential, parallel, or background mode — with built-in retry, timeout, storage, notifications, and more.

> **[Read the full documentation](https://dotflow-io.github.io/dotflow/)**

## Table of Contents

<details>
<summary>Click to expand</summary>

- [Getting Help](#getting-help)
- [Installation](#installation)
- [Quick Start](#quick-start)
- [Features](#features)
- [Execution Modes](#execution-modes)
- [Retry, Timeout & Backoff](#retry-timeout--backoff)
- [Context System](#context-system)
- [Checkpoint & Resume](#checkpoint--resume)
- [Storage Providers](#storage-providers)
- [Notifications](#notifications)
- [Class-Based Steps](#class-based-steps)
- [Task Groups](#task-groups)
- [Callbacks](#callbacks)
- [Error Handling](#error-handling)
- [Async Support](#async-support)
- [Scheduler / Cron](#scheduler--cron)
- [CLI](#cli)
- [Dependency Injection via Config](#dependency-injection-via-config)
- [More Examples](#more-examples)
- [Commit Style](#commit-style)
- [License](#license)

</details>

## Getting Help

We use GitHub issues for tracking bugs and feature requests.

- [Bug Report](https://github.com/dotflow-io/dotflow/issues/new/choose)
- [Documentation](https://github.com/dotflow-io/dotflow/issues/new/choose)
- [Feature Request](https://github.com/dotflow-io/dotflow/issues/new/choose)
- [Security Issue](https://github.com/dotflow-io/dotflow/issues/new/choose)
- [General Question](https://github.com/dotflow-io/dotflow/issues/new/choose)

## Installation

```bash
pip install dotflow
```

**Optional extras:**

```bash
pip install dotflow[aws]        # AWS S3 storage
pip install dotflow[gcp]        # Google Cloud Storage
pip install dotflow[scheduler]  # Cron-based scheduler
```

## Quick Start

```python
from dotflow import DotFlow, action

@action
def extract():
    return {"users": 150}

@action
def transform(previous_context):
    total = previous_context.storage["users"]
    return {"users": total, "active": int(total * 0.8)}

@action
def load(previous_context):
    print(f"Loaded {previous_context.storage['active']} active users")

workflow = DotFlow()
workflow.task.add(step=extract)
workflow.task.add(step=transform)
workflow.task.add(step=load)

workflow.start()
```

## Features

### Execution Modes

> [Process Mode docs](https://dotflow-io.github.io/dotflow/nav/learn/process-mode-sequential/)

Dotflow supports four execution strategies out of the box:

#### Sequential (default)

Tasks run one after another. The context from each task flows to the next.

```python
workflow.task.add(step=task_a)
workflow.task.add(step=task_b)

workflow.start()  # or mode="sequential"
```

```mermaid
flowchart LR
    A[task_a] --> B[task_b] --> C[Finish]
```

#### Background

Same as sequential, but runs in a background thread — non-blocking.

```python
workflow.start(mode="background")
```

#### Parallel

Every task runs simultaneously in its own process.

```python
workflow.task.add(step=task_a)
workflow.task.add(step=task_b)
workflow.task.add(step=task_c)

workflow.start(mode="parallel")
```

```mermaid
flowchart TD
    S[Start] --> A[task_a] & B[task_b] & C[task_c]
    A & B & C --> F[Finish]
```

#### Parallel Groups

Assign tasks to named groups. Groups run in parallel, but tasks within each group run sequentially.

```python
workflow.task.add(step=fetch_users, group_name="users")
workflow.task.add(step=save_users, group_name="users")
workflow.task.add(step=fetch_orders, group_name="orders")
workflow.task.add(step=save_orders, group_name="orders")

workflow.start()
```

```mermaid
flowchart TD
    S[Start] --> G1[Group: users] & G2[Group: orders]
    G1 --> A[fetch_users] --> B[save_users]
    G2 --> C[fetch_orders] --> D[save_orders]
    B & D --> F[Finish]
```

---

### Retry, Timeout & Backoff

> [Retry docs](https://dotflow-io.github.io/dotflow/nav/tutorial/task-retry/) | [Backoff docs](https://dotflow-io.github.io/dotflow/nav/tutorial/task-backoff/) | [Timeout docs](https://dotflow-io.github.io/dotflow/nav/tutorial/task-timeout/)

The `@action` decorator supports built-in resilience options:

```python
import requests
from dotflow import action

@action(retry=3, timeout=10, retry_delay=2, backoff=True)
def unreliable_api_call():
    response = requests.get("https://api.example.com/data")
    response.raise_for_status()
    return response.json()
```

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `retry` | `int` | `1` | Number of attempts before failing |
| `timeout` | `int` | `0` | Max seconds per attempt (0 = no limit) |
| `retry_delay` | `int` | `1` | Seconds to wait between retries |
| `backoff` | `bool` | `False` | Exponential backoff (delay doubles each retry) |

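To make the `backoff` column concrete, here is a small standalone sketch of how an exponential backoff schedule is commonly derived from these parameters. The function name is illustrative and the exact timing inside Dotflow may differ:

```python
# Illustrative only: derive the wait times between attempts from the
# retry parameters in the table above. With backoff=True the delay
# doubles after each failed attempt; otherwise it stays constant.
def retry_delays(retry: int, retry_delay: int, backoff: bool) -> list:
    # One wait between consecutive attempts, so retry - 1 delays in total.
    if backoff:
        return [retry_delay * (2 ** attempt) for attempt in range(retry - 1)]
    return [retry_delay for _ in range(retry - 1)]

print(retry_delays(4, 2, True))   # [2, 4, 8]
print(retry_delays(4, 2, False))  # [2, 2, 2]
```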
---

### Context System

> [Context docs](https://dotflow-io.github.io/dotflow/nav/tutorial/initial-context/) | [Previous Context](https://dotflow-io.github.io/dotflow/nav/tutorial/previous-context/) | [Many Contexts](https://dotflow-io.github.io/dotflow/nav/tutorial/many-contexts/)

Tasks communicate through a context chain. Each task receives the previous task's output and can access its own initial context.

```python
from dotflow import DotFlow, action

@action
def step_one():
    return "Hello"

@action
def step_two(previous_context, initial_context):
    greeting = previous_context.storage  # "Hello"
    name = initial_context.storage       # "World"
    return f"{greeting}, {name}!"

workflow = DotFlow()
workflow.task.add(step=step_one)
workflow.task.add(step=step_two, initial_context="World")
workflow.start()
```

Each `Context` object contains:
- **`storage`** — the return value from the task
- **`task_id`** — the task identifier
- **`workflow_id`** — the workflow identifier
- **`time`** — timestamp of execution

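For illustration only, the shape of those four fields can be sketched as a plain dataclass — Dotflow's real `Context` class lives in `dotflow/core/context.py` and may carry more than this; the field types below are assumptions:

```python
from dataclasses import dataclass
from typing import Any

# Shape sketch of the four fields listed above; types are assumptions.
@dataclass
class ContextSketch:
    storage: Any       # return value from the task
    task_id: int       # task identifier
    workflow_id: str   # workflow identifier
    time: str          # timestamp of execution

ctx = ContextSketch(storage={"users": 150}, task_id=0,
                    workflow_id="wf-1", time="2025-01-01T00:00:00")
print(ctx.storage["users"])  # 150
```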
---

### Checkpoint & Resume

> [Checkpoint docs](https://dotflow-io.github.io/dotflow/nav/tutorial/checkpoint/)

Resume a workflow from where it left off. Requires a persistent storage provider and a fixed `workflow_id`.

```python
from dotflow import DotFlow, Config, action
from dotflow.providers import StorageFile

config = Config(storage=StorageFile())

workflow = DotFlow(config=config, workflow_id="my-pipeline-v1")
workflow.task.add(step=step_a)
workflow.task.add(step=step_b)
workflow.task.add(step=step_c)

# First run — executes all tasks and saves checkpoints
workflow.start()

# If step_c failed, fix and re-run — skips step_a and step_b
workflow.start(resume=True)
```

---

### Storage Providers

> [Storage docs](https://dotflow-io.github.io/dotflow/nav/tutorial/storage-default/)

Choose where task results are persisted:

#### In-Memory (default)

```python
from dotflow import DotFlow

workflow = DotFlow()  # uses StorageDefault (in-memory)
```

#### File System

```python
from dotflow import DotFlow, Config
from dotflow.providers import StorageFile

config = Config(storage=StorageFile(path=".output"))
workflow = DotFlow(config=config)
```

#### AWS S3

```bash
pip install dotflow[aws]
```

```python
from dotflow import DotFlow, Config
from dotflow.providers import StorageS3

config = Config(storage=StorageS3(bucket="my-bucket", prefix="pipelines/", region="us-east-1"))
workflow = DotFlow(config=config)
```

#### Google Cloud Storage

```bash
pip install dotflow[gcp]
```

```python
from dotflow import DotFlow, Config
from dotflow.providers import StorageGCS

config = Config(storage=StorageGCS(bucket="my-bucket", prefix="pipelines/", project="my-project"))
workflow = DotFlow(config=config)
```

---

### Notifications

> [Telegram docs](https://dotflow-io.github.io/dotflow/nav/tutorial/notify-telegram/)

Get notified about task status changes via Telegram.

```python
from dotflow import DotFlow, Config
from dotflow.providers import NotifyTelegram
from dotflow.core.types.status import TypeStatus

notify = NotifyTelegram(
    token="YOUR_BOT_TOKEN",
    chat_id=123456789,
    notification_type=TypeStatus.FAILED,  # only notify on failures (optional)
)

config = Config(notify=notify)
workflow = DotFlow(config=config)
```

Status types: `NOT_STARTED`, `IN_PROGRESS`, `COMPLETED`, `PAUSED`, `RETRY`, `FAILED`

---

### Class-Based Steps

Return a class instance from a task, and Dotflow will automatically discover and execute all `@action`-decorated methods in source order.

```python
from dotflow import DotFlow, action

class ETLPipeline:
    @action
    def extract(self):
        return {"raw": [1, 2, 3]}

    @action
    def transform(self, previous_context):
        data = previous_context.storage["raw"]
        return {"processed": [x * 2 for x in data]}

    @action
    def load(self, previous_context):
        print(f"Loaded: {previous_context.storage['processed']}")

@action
def run_pipeline():
    return ETLPipeline()

workflow = DotFlow()
workflow.task.add(step=run_pipeline)
workflow.start()
```

---

### Task Groups

> [Groups docs](https://dotflow-io.github.io/dotflow/nav/tutorial/groups/)

Organize tasks into named groups for parallel group execution.

```python
workflow.task.add(step=scrape_site_a, group_name="scraping")
workflow.task.add(step=scrape_site_b, group_name="scraping")
workflow.task.add(step=process_data, group_name="processing")
workflow.task.add(step=save_results, group_name="processing")

workflow.start()  # groups run in parallel, tasks within each group run sequentially
```

---

### Callbacks

> [Task Callback docs](https://dotflow-io.github.io/dotflow/nav/tutorial/task-callback/) | [Workflow Callback docs](https://dotflow-io.github.io/dotflow/nav/tutorial/workflow-callback/)

Execute a function after each task completes — useful for logging, alerting, or side effects.

```python
def on_task_done(task):
    print(f"Task {task.task_id} finished with status: {task.status}")

workflow.task.add(step=my_step, callback=on_task_done)
```

Workflow-level callbacks for success and failure:

```python
def on_success(*args, **kwargs):
    print("All tasks completed!")

def on_failure(*args, **kwargs):
    print("Something went wrong.")

workflow.start(on_success=on_success, on_failure=on_failure)
```

---

### Error Handling

> [Error Handling docs](https://dotflow-io.github.io/dotflow/nav/tutorial/error-handling/) | [Keep Going docs](https://dotflow-io.github.io/dotflow/nav/tutorial/keep-going/)

Control whether the workflow stops or continues when a task fails:

```python
# Stop on first failure (default)
workflow.start(keep_going=False)

# Continue executing remaining tasks even if one fails
workflow.start(keep_going=True)
```

Each task tracks its errors with full detail:
- Attempt number
- Exception type and message
- Traceback

Access results after execution:

```python
for task in workflow.result_task():
    print(f"Task {task.task_id}: {task.status}")
    if task.errors:
        print(f"  Errors: {task.errors}")
```

---

### Async Support

> [Async docs](https://dotflow-io.github.io/dotflow/nav/tutorial/async-actions/)

`@action` automatically detects and handles async functions:

```python
import httpx
from dotflow import DotFlow, action

@action(timeout=30)
async def fetch_data():
    async with httpx.AsyncClient() as client:
        response = await client.get("https://api.example.com/data")
        return response.json()

workflow = DotFlow()
workflow.task.add(step=fetch_data)
workflow.start()
```

---

### Scheduler / Cron

Schedule workflows to run automatically using cron expressions.

```bash
pip install dotflow[scheduler]
```

```python
from dotflow import DotFlow, Config, action
from dotflow.providers import SchedulerCron

@action
def sync_data():
    return {"synced": True}

config = Config(scheduler=SchedulerCron(cron="*/5 * * * *"))

workflow = DotFlow(config=config)
workflow.task.add(step=sync_data)
workflow.schedule()
```

#### Overlap Strategies

Control what happens when a new execution triggers while the previous one is still running:

| Strategy | Description |
|----------|-------------|
| `skip` | Drops the new run if the previous is still active (default) |
| `queue` | Buffers one pending run, executes when the current finishes |
| `parallel` | Runs up to 10 concurrent executions via semaphore |

```python
from dotflow.providers import SchedulerCron

# Queue overlapping executions
scheduler = SchedulerCron(cron="*/5 * * * *", overlap="queue")

# Allow parallel executions
scheduler = SchedulerCron(cron="*/5 * * * *", overlap="parallel")
```

The scheduler handles graceful shutdown via `SIGINT`/`SIGTERM` signals automatically.

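The `skip` strategy in the table above amounts to a non-blocking lock around each run. A minimal standalone sketch of that idea — not Dotflow's actual implementation, and the class and method names below are illustrative:

```python
import threading

# Minimal sketch of the "skip" overlap strategy: a new trigger is dropped
# when the previous run still holds the lock.
class SkipOverlap:
    def __init__(self):
        self._running = threading.Lock()

    def trigger(self, run) -> bool:
        if not self._running.acquire(blocking=False):
            return False  # previous execution still active: drop this run
        try:
            run()
            return True
        finally:
            self._running.release()

calls = []
overlap = SkipOverlap()
overlap.trigger(lambda: calls.append("ran"))  # executes: lock was free
print(calls)  # ['ran']
```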
---

### CLI

> [CLI docs](https://dotflow-io.github.io/dotflow/nav/learn/cli/simple-start/)

Run workflows directly from the command line:

```bash
# Simple execution
dotflow start --step my_module.my_task

# With initial context
dotflow start --step my_module.my_task --initial-context '{"key": "value"}'

# With callback
dotflow start --step my_module.my_task --callback my_module.on_done

# With execution mode
dotflow start --step my_module.my_task --mode parallel

# With file storage
dotflow start --step my_module.my_task --storage file --path .output

# With S3 storage
dotflow start --step my_module.my_task --storage s3

# With GCS storage
dotflow start --step my_module.my_task --storage gcs

# Schedule with cron
dotflow schedule --step my_module.my_task --cron "*/5 * * * *"

# Schedule with overlap strategy
dotflow schedule --step my_module.my_task --cron "0 * * * *" --overlap queue

# Schedule with resume
dotflow schedule --step my_module.my_task --cron "0 */6 * * *" --storage file --resume
```

Available CLI commands:

| Command | Description |
|---------|-------------|
| `dotflow init` | Initialize a new Dotflow project |
| `dotflow start` | Run a workflow |
| `dotflow schedule` | Run a workflow on a cron schedule |
| `dotflow log` | View execution logs |

---

### Dependency Injection via Config

The `Config` class lets you swap providers for storage, notifications, logging, and scheduling:

```python
from dotflow import DotFlow, Config
from dotflow.providers import StorageFile, NotifyTelegram, LogDefault, SchedulerCron

config = Config(
    storage=StorageFile(path=".output"),
    notify=NotifyTelegram(token="...", chat_id=123),
    log=LogDefault(),
    scheduler=SchedulerCron(cron="0 * * * *"),
)

workflow = DotFlow(config=config)
```

Extend Dotflow by implementing the abstract base classes:

| ABC | Methods | Purpose |
|-----|---------|---------|
| `Storage` | `post`, `get`, `key` | Custom storage backends |
| `Notify` | `send` | Custom notification channels |
| `Log` | `info`, `error` | Custom logging |
| `Scheduler` | `start`, `stop` | Custom scheduling strategies |

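As a shape sketch of the `Storage` contract — method names taken from the table above; the real base class lives under `dotflow/abc/storage.py` and its exact signatures may differ — a dict-backed provider could look like:

```python
# Illustrative dict-backed storage provider implementing the three methods
# from the table above. Kept self-contained on purpose: in a real provider
# you would subclass the ABC from dotflow.abc.storage instead, and the
# key scheme below is an assumption.
class StorageDict:
    def __init__(self):
        self._data = {}

    def key(self, workflow_id, task_id) -> str:
        # Build a unique key per task.
        return f"{workflow_id}-{task_id}"

    def post(self, key: str, context) -> None:
        # Persist a task's context under its key.
        self._data[key] = context

    def get(self, key: str):
        # Retrieve a previously stored context (None if absent).
        return self._data.get(key)

storage = StorageDict()
k = storage.key("wf-1", 0)
storage.post(k, {"users": 150})
print(storage.get(k))  # {'users': 150}
```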
---

### Results & Inspection

After execution, inspect results directly from the workflow object:

```python
workflow.start()

# List of Task objects
tasks = workflow.result_task()

# List of Context objects (one per task)
contexts = workflow.result_context()

# List of storage values (raw return values)
storages = workflow.result_storage()

# Serialized result (Pydantic model)
result = workflow.result()
```

Task builder utilities:

```python
workflow.task.count()    # Number of tasks
workflow.task.clear()    # Remove all tasks
workflow.task.reverse()  # Reverse execution order
workflow.task.schema()   # Pydantic schema of the workflow
```

---

### Dynamic Module Import

Reference tasks and callbacks by their module path string instead of importing them directly:

```python
workflow.task.add(step="my_package.tasks.process_data")
workflow.task.add(step="my_package.tasks.save_results", callback="my_package.callbacks.notify")
```

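Under the hood this kind of string reference is resolved with a dynamic import. A generic `importlib` sketch of the mechanism — not Dotflow's actual loader, which lives in `dotflow/core/module.py`:

```python
import importlib

# Resolve a dotted path such as "my_package.tasks.process_data" into the
# object it names: import the module part, then fetch the attribute.
def resolve(path: str):
    module_name, _, attr = path.rpartition(".")
    return getattr(importlib.import_module(module_name), attr)

sqrt = resolve("math.sqrt")
print(sqrt(9.0))  # 3.0
```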
697
+ ---
698
+
699
+ ## More Examples
700
+
701
+ All examples are available in the [`docs_src/`](https://github.com/dotflow-io/dotflow/tree/develop/docs_src) directory.
702
+
703
+ #### Basic
704
+
705
+ | Example | Description | Command |
706
+ |---------|-------------|---------|
707
+ | [first_steps](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/first_steps/first_steps.py) | Minimal workflow with callback | `python docs_src/first_steps/first_steps.py` |
708
+ | [simple_function_workflow](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/basic/simple_function_workflow.py) | Simple function-based workflow | `python docs_src/basic/simple_function_workflow.py` |
709
+ | [simple_class_workflow](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/basic/simple_class_workflow.py) | Class-based step with retry | `python docs_src/basic/simple_class_workflow.py` |
710
+ | [simple_function_workflow_with_error](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/basic/simple_function_workflow_with_error.py) | Error inspection after failure | `python docs_src/basic/simple_function_workflow_with_error.py` |
711
+
712
+ #### Async
713
+
714
+ | Example | Description | Command |
715
+ |---------|-------------|---------|
716
+ | [async_action](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/async/async_action.py) | Async task functions | `python docs_src/async/async_action.py` |
717
+
718
+ #### Context
719
+
720
+ | Example | Description | Command |
721
+ |---------|-------------|---------|
722
+ | [context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/context/context.py) | Creating and inspecting a Context | `python docs_src/context/context.py` |
723
+ | [initial_context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/initial_context/initial_context.py) | Passing initial context per task | `python docs_src/initial_context/initial_context.py` |
724
+ | [previous_context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/previous_context/previous_context.py) | Chaining context between tasks | `python docs_src/previous_context/previous_context.py` |
725
+ | [many_contexts](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/context/many_contexts.py) | Using both initial and previous context | `python docs_src/context/many_contexts.py` |
726
+
727
+ #### Process Modes
+
+ | Example | Description | Command |
+ |---------|-------------|---------|
+ | [sequential](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/process_mode/sequential.py) | Sequential execution | `python docs_src/process_mode/sequential.py` |
+ | [background](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/process_mode/background.py) | Background (non-blocking) execution | `python docs_src/process_mode/background.py` |
+ | [parallel](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/process_mode/parallel.py) | Parallel execution | `python docs_src/process_mode/parallel.py` |
+ | [parallel_group](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/process_mode/parallel_group.py) | Parallel execution with groups | `python docs_src/process_mode/parallel_group.py` |
+ | [sequential_group_mode](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/workflow/sequential_group_mode.py) | Sequential execution with named groups | `python docs_src/workflow/sequential_group_mode.py` |
+
+ #### Resilience (Retry, Backoff, Timeout)
+
+ | Example | Description | Command |
+ |---------|-------------|---------|
+ | [retry](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/retry/retry.py) | Retry on function and class steps | `python docs_src/retry/retry.py` |
+ | [retry_delay](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/retry/retry_delay.py) | Retry with delay between attempts | `python docs_src/retry/retry_delay.py` |
+ | [backoff](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/backoff/backoff.py) | Exponential backoff on retries | `python docs_src/backoff/backoff.py` |
+ | [timeout](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/timeout/timeout.py) | Timeout per task execution | `python docs_src/timeout/timeout.py` |
+
746
+ #### Callbacks
+
+ | Example | Description | Command |
+ |---------|-------------|---------|
+ | [task_callback](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/callback/task_callback.py) | Per-task callback on completion | `python docs_src/callback/task_callback.py` |
+ | [workflow_callback_success](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/callback/workflow_callback_success.py) | Workflow-level success callback | `python docs_src/callback/workflow_callback_success.py` |
+ | [workflow_callback_failure](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/callback/workflow_callback_failure.py) | Workflow-level failure callback | `python docs_src/callback/workflow_callback_failure.py` |
+
+ #### Error Handling
+
+ | Example | Description | Command |
+ |---------|-------------|---------|
+ | [errors](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/errors/errors.py) | Inspecting task errors and retry count | `python docs_src/errors/errors.py` |
+ | [keep_going_true](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/workflow/keep_going_true.py) | Continuing the workflow after a task failure | `python docs_src/workflow/keep_going_true.py` |
+
761
+ #### Groups
+
+ | Example | Description | Command |
+ |---------|-------------|---------|
+ | [step_with_groups](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/group/step_with_groups.py) | Tasks in named parallel groups | `python docs_src/group/step_with_groups.py` |
+
+ #### Storage
+
+ | Example | Description | Command |
+ |---------|-------------|---------|
+ | [storage_file](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/storage/storage_file.py) | File-based JSON storage | `python docs_src/storage/storage_file.py` |
+ | [storage_s3](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/storage/storage_s3.py) | AWS S3 storage | `python docs_src/storage/storage_s3.py` |
+ | [storage_gcs](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/storage/storage_gcs.py) | Google Cloud Storage | `python docs_src/storage/storage_gcs.py` |
+
775
+ #### Checkpoint & Resume
+
+ | Example | Description | Command |
+ |---------|-------------|---------|
+ | [checkpoint](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/checkpoint/checkpoint.py) | Resuming a workflow from the last checkpoint | `python docs_src/checkpoint/checkpoint.py` |
+
+ #### Notifications
+
+ | Example | Description | Command |
+ |---------|-------------|---------|
+ | [notify_telegram](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/notify/notify_telegram.py) | Telegram notifications on failure | `python docs_src/notify/notify_telegram.py` |
+
787
+ #### Config & Providers
+
+ | Example | Description | Command |
+ |---------|-------------|---------|
+ | [config](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/config/config.py) | Full Config with storage, notify, and log | `python docs_src/config/config.py` |
+ | [storage_provider](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/config/storage_provider.py) | Swapping storage providers | `python docs_src/config/storage_provider.py` |
+ | [notify_provider](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/config/notify_provider.py) | Swapping notification providers | `python docs_src/config/notify_provider.py` |
+ | [log_provider](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/config/log_provider.py) | Custom log provider | `python docs_src/config/log_provider.py` |
+
+ #### Results & Output
+
+ | Example | Description | Command |
+ |---------|-------------|---------|
+ | [step_function_result_task](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/output/step_function_result_task.py) | Inspecting task results (function) | `python docs_src/output/step_function_result_task.py` |
+ | [step_function_result_context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/output/step_function_result_context.py) | Inspecting context results (function) | `python docs_src/output/step_function_result_context.py` |
+ | [step_function_result_storage](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/output/step_function_result_storage.py) | Inspecting storage results (function) | `python docs_src/output/step_function_result_storage.py` |
+ | [step_class_result_task](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/output/step_class_result_task.py) | Inspecting task results (class) | `python docs_src/output/step_class_result_task.py` |
+ | [step_class_result_context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/output/step_class_result_context.py) | Inspecting context results (class) | `python docs_src/output/step_class_result_context.py` |
+ | [step_class_result_storage](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/output/step_class_result_storage.py) | Inspecting storage results (class) | `python docs_src/output/step_class_result_storage.py` |
+
807
+ #### CLI
+
+ | Example | Description | Command |
+ |---------|-------------|---------|
+ | [simple_cli](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/basic/simple_cli.py) | Basic CLI execution | `dotflow start --step docs_src.basic.simple_cli.simple_step` |
+ | [cli_with_callback](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/cli/cli_with_callback.py) | CLI with a callback function | `dotflow start --step docs_src.cli.cli_with_callback.simple_step --callback docs_src.cli.cli_with_callback.callback` |
+ | [cli_with_initial_context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/cli/cli_with_initial_context.py) | CLI with an initial context | `dotflow start --step docs_src.cli.cli_with_initial_context.simple_step --initial-context abc` |
+ | [cli_with_mode](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/cli/cli_with_mode.py) | CLI with an execution mode | `dotflow start --step docs_src.cli.cli_with_mode.simple_step --mode sequential` |
+ | [cli_with_output_context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/cli/cli_with_output_context.py) | CLI with file storage output | `dotflow start --step docs_src.cli.cli_with_output_context.simple_step --storage file` |
+ | [cli_with_path](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/cli/cli_with_path.py) | CLI with a custom storage path | `dotflow start --step docs_src.cli.cli_with_path.simple_step --path .storage --storage file` |
+
818
+ ## Commit Style
+
+ | Icon | Type | Description |
+ |------|-----------|--------------------------------------------|
+ | ⚙️ | FEATURE | New feature |
+ | 📝 | PEP8 | Formatting fixes following PEP8 |
+ | 📌 | ISSUE | Reference to an issue |
+ | 🪲 | BUG | Bug fix |
+ | 📘 | DOCS | Documentation changes |
+ | 📦 | PyPI | PyPI releases |
+ | ❤️ | TEST | Automated tests |
+ | ⬆️ | CI/CD | Changes in continuous integration/delivery |
+ | ⚠️ | SECURITY | Security improvements |
+
832
+ ## License
+
+ ![GitHub License](https://img.shields.io/github/license/dotflow-io/dotflow)
+
+ This project is licensed under the terms of the MIT License.
+