ddeutil-workflow 0.0.15-py3-none-any.whl → 0.0.17-py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
ddeutil_workflow-0.0.17.dist-info/METADATA
@@ -1,6 +1,6 @@
  Metadata-Version: 2.1
  Name: ddeutil-workflow
- Version: 0.0.15
+ Version: 0.0.17
  Summary: Lightweight workflow orchestration with less dependencies
  Author-email: ddeutils <korawich.anu@gmail.com>
  License: MIT
@@ -22,8 +22,8 @@ Classifier: Programming Language :: Python :: 3.13
  Requires-Python: >=3.9.13
  Description-Content-Type: text/markdown
  License-File: LICENSE
- Requires-Dist: ddeutil >=0.4.0
- Requires-Dist: ddeutil-io >=0.1.13
+ Requires-Dist: ddeutil >=0.4.3
+ Requires-Dist: ddeutil-io[toml,yaml] >=0.2.3
  Requires-Dist: python-dotenv ==1.0.1
  Requires-Dist: typer <1.0.0,==0.12.5
  Requires-Dist: schedule <2.0.0,==1.2.2
@@ -33,6 +33,7 @@ Requires-Dist: fastapi <1.0.0,>=0.115.0 ; extra == 'api'
  # Workflow

  [![test](https://github.com/ddeutils/ddeutil-workflow/actions/workflows/tests.yml/badge.svg?branch=main)](https://github.com/ddeutils/ddeutil-workflow/actions/workflows/tests.yml)
+ [![codecov](https://codecov.io/gh/ddeutils/ddeutil-workflow/graph/badge.svg?token=3NDPN2I0H9)](https://codecov.io/gh/ddeutils/ddeutil-workflow)
  [![pypi version](https://img.shields.io/pypi/v/ddeutil-workflow)](https://pypi.org/project/ddeutil-workflow/)
  [![python support version](https://img.shields.io/pypi/pyversions/ddeutil-workflow)](https://pypi.org/project/ddeutil-workflow/)
  [![size](https://img.shields.io/github/languages/code-size/ddeutils/ddeutil-workflow)](https://github.com/ddeutils/ddeutil-workflow)
@@ -74,8 +75,9 @@ configuration. It called **Metadata Driven Data Workflow**.

  ## :round_pushpin: Installation

- This project need `ddeutil-io` extension namespace packages. If you want to install
- this package with application add-ons, you should add `app` in installation;
+ This project need `ddeutil` and `ddeutil-io` extension namespace packages.
+ If you want to install this package with application add-ons, you should add
+ `app` in installation;

  | Usecase           | Install Optional                         | Support            |
  |-------------------|------------------------------------------|--------------------|
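As a rough illustration of the installation note above (the install-optional table is cut off by this hunk), pinning the new release with and without the application add-on might look like this; the `app` extra name is taken from the README text, and the package metadata separately defines an `api` extra for FastAPI:

```shell
# Plain install of the 0.0.17 release.
$ pip install ddeutil-workflow==0.0.17

# With the application add-on described above.
$ pip install "ddeutil-workflow[app]==0.0.17"
```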
@@ -179,29 +181,32 @@ The main configuration that use to dynamic changing with your propose of this
  application. If any configuration values do not set yet, it will use default value
  and do not raise any error to you.

- | Environment                         | Component | Default                          | Description                                                                  |
- |-------------------------------------|-----------|----------------------------------|------------------------------------------------------------------------------|
- | `WORKFLOW_ROOT_PATH`                | Core      | .                                | The root path of the workflow application                                    |
- | `WORKFLOW_CORE_REGISTRY`            | Core      | src.ddeutil.workflow,tests.utils | List of importable string for the hook stage                                 |
- | `WORKFLOW_CORE_REGISTRY_FILTER`     | Core      | ddeutil.workflow.utils           | List of importable string for the filter template                            |
- | `WORKFLOW_CORE_PATH_CONF`           | Core      | conf                             | The config path that keep all template `.yaml` files                         |
- | `WORKFLOW_CORE_TIMEZONE`            | Core      | Asia/Bangkok                     | A Timezone string value that will pass to `ZoneInfo` object                  |
- | `WORKFLOW_CORE_STAGE_DEFAULT_ID`    | Core      | true                             | A flag that enable default stage ID that use for catch an execution output   |
- | `WORKFLOW_CORE_STAGE_RAISE_ERROR`   | Core      | true                             | A flag that all stage raise StageException from stage execution              |
- | `WORKFLOW_CORE_MAX_NUM_POKING`      | Core      | 4                                |                                                                              |
- | `WORKFLOW_CORE_MAX_JOB_PARALLEL`    | Core      | 2                                | The maximum job number that able to run parallel in workflow executor        |
- | `WORKFLOW_LOG_DEBUG_MODE`           | Log       | true                             | A flag that enable logging with debug level mode                             |
- | `WORKFLOW_LOG_ENABLE_WRITE`         | Log       | true                             | A flag that enable logging object saving log to its destination              |
- | `WORKFLOW_APP_PROCESS_WORKER`       | Schedule  | 2                                | The maximum process worker number that run in scheduler app module           |
- | `WORKFLOW_APP_SCHEDULE_PER_PROCESS` | Schedule  | 100                              | A schedule per process that run parallel                                     |
- | `WORKFLOW_APP_STOP_BOUNDARY_DELTA`  | Schedule  | '{"minutes": 5, "seconds": 20}'  | A time delta value that use to stop scheduler app in json string format      |
+ | Environment                             | Component | Default                          | Description                                                                                                          | Remark |
+ |:----------------------------------------|-----------|----------------------------------|----------------------------------------------------------------------------------------------------------------------|--------|
+ | `WORKFLOW_ROOT_PATH`                    | Core      | .                                | The root path of the workflow application.                                                                           |        |
+ | `WORKFLOW_CORE_REGISTRY`                | Core      | src.ddeutil.workflow,tests.utils | List of importable string for the hook stage.                                                                        |        |
+ | `WORKFLOW_CORE_REGISTRY_FILTER`         | Core      | ddeutil.workflow.utils           | List of importable string for the filter template.                                                                   |        |
+ | `WORKFLOW_CORE_PATH_CONF`               | Core      | conf                             | The config path that keep all template `.yaml` files.                                                                |        |
+ | `WORKFLOW_CORE_TIMEZONE`                | Core      | Asia/Bangkok                     | A Timezone string value that will pass to `ZoneInfo` object.                                                         |        |
+ | `WORKFLOW_CORE_STAGE_DEFAULT_ID`        | Core      | true                             | A flag that enable default stage ID that use for catch an execution output.                                          |        |
+ | `WORKFLOW_CORE_STAGE_RAISE_ERROR`       | Core      | false                            | A flag that all stage raise StageException from stage execution.                                                     |        |
+ | `WORKFLOW_CORE_JOB_DEFAULT_ID`          | Core      | false                            | A flag that enable default job ID that use for catch an execution output. The ID that use will be sequence number.   |        |
+ | `WORKFLOW_CORE_JOB_RAISE_ERROR`         | Core      | true                             | A flag that all job raise JobException from job strategy execution.                                                  |        |
+ | `WORKFLOW_CORE_MAX_NUM_POKING`          | Core      | 4                                | .                                                                                                                    |        |
+ | `WORKFLOW_CORE_MAX_JOB_PARALLEL`        | Core      | 2                                | The maximum job number that able to run parallel in workflow executor.                                               |        |
+ | `WORKFLOW_CORE_WORKFLOW_ID_SIMPLE_MODE` | Core      | true                             | .                                                                                                                    |        |
+ | `WORKFLOW_LOG_DEBUG_MODE`               | Log       | true                             | A flag that enable logging with debug level mode.                                                                    |        |
+ | `WORKFLOW_LOG_ENABLE_WRITE`             | Log       | true                             | A flag that enable logging object saving log to its destination.                                                     |        |
+ | `WORKFLOW_APP_MAX_PROCESS`              | Schedule  | 2                                | The maximum process worker number that run in scheduler app module.                                                  |        |
+ | `WORKFLOW_APP_MAX_SCHEDULE_PER_PROCESS` | Schedule  | 100                              | A schedule per process that run parallel.                                                                            |        |
+ | `WORKFLOW_APP_STOP_BOUNDARY_DELTA`      | Schedule  | '{"minutes": 5, "seconds": 20}'  | A time delta value that use to stop scheduler app in json string format.                                             |        |

  **API Application**:

- | Environment                          | Component | Default | Description                                                                         |
- |--------------------------------------|-----------|---------|-------------------------------------------------------------------------------------|
- | `WORKFLOW_API_ENABLE_ROUTE_WORKFLOW` | API       | true    | A flag that enable workflow route to manage execute manually and workflow logging  |
- | `WORKFLOW_API_ENABLE_ROUTE_SCHEDULE` | API       | true    | A flag that enable run scheduler                                                    |
+ | Environment                           | Component | Default | Description                                                                          | Remark |
+ |:--------------------------------------|-----------|---------|---------------------------------------------------------------------------------------|--------|
+ | `WORKFLOW_API_ENABLE_ROUTE_WORKFLOW`  | API       | true    | A flag that enable workflow route to manage execute manually and workflow logging.  |        |
+ | `WORKFLOW_API_ENABLE_ROUTE_SCHEDULE`  | API       | true    | A flag that enable run scheduler.                                                    |        |

  ## :rocket: Deployment

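For example, the settings documented in the configuration tables above can be overridden through plain environment variables before the scheduler or API app starts; a minimal sketch reusing the documented names and defaults (values shown are only illustrative):

```shell
# Core paths and timezone (documented defaults: ".", "conf", "Asia/Bangkok").
export WORKFLOW_ROOT_PATH=.
export WORKFLOW_CORE_PATH_CONF=conf
export WORKFLOW_CORE_TIMEZONE="Asia/Bangkok"

# Scheduler sizing and stop boundary, as documented in the table above.
export WORKFLOW_APP_MAX_PROCESS=2
export WORKFLOW_APP_MAX_SCHEDULE_PER_PROCESS=100
export WORKFLOW_APP_STOP_BOUNDARY_DELTA='{"minutes": 5, "seconds": 20}'
```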
@@ -224,3 +229,17 @@ like crontab job but via Python API.
  > [!NOTE]
  > If this package already deploy, it able to use
  > `uvicorn ddeutil.workflow.api:app --host 127.0.0.1 --port 80 --workers 4`
+
+ ### Docker Container
+
+ Create Docker image;
+
+ ```shell
+ $ docker build -t ddeutil-workflow:latest -f .container/Dockerfile .
+ ```
+
+ Run the above Docker image;
+
+ ```shell
+ $ docker run -i ddeutil-workflow:latest
+ ```
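Because all of the settings shown in the configuration tables are read from the environment, they can be handed to this container at run time as well; a small sketch, assuming the variables are kept in a local `.env` file:

```shell
$ docker run -i --env-file .env ddeutil-workflow:latest
```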
ddeutil_workflow-0.0.17.dist-info/RECORD ADDED
@@ -0,0 +1,21 @@
+ ddeutil/workflow/__about__.py,sha256=z3f1GAF3VbZK1m4FWAXXMsWplP_jSe-X-wVlshvlDWU,28
+ ddeutil/workflow/__cron.py,sha256=ZiuV4ASkXvAyFJYxEb9PKiAFNYnUt4AJozu_kH3pI4U,25777
+ ddeutil/workflow/__init__.py,sha256=RNKME4FPMAjqtrBR-IBwQVEKeoY5yBAiHYcZw0k9cI4,729
+ ddeutil/workflow/__types.py,sha256=yizLXzjQpBt_WPaof2pIyncitJvYeksw4Q1zYJeuCLA,3707
+ ddeutil/workflow/api.py,sha256=vUT2RVS9sF3hvY-IrzAEnahxwq4ZFYP0G3xfctHbNsw,4701
+ ddeutil/workflow/cli.py,sha256=baHhvtI8snbHYHeThoX401Cd6SMB2boyyCbCtTrIl3E,3278
+ ddeutil/workflow/conf.py,sha256=SV4GMtjUc-Bor9BPi0yOtTIsiZ0FImsoRbuJysUIE9w,15395
+ ddeutil/workflow/exceptions.py,sha256=Uf1-Tn8rAzj0aiVHSqo4fBqO80W0za7UFZgKv24E-tg,706
+ ddeutil/workflow/job.py,sha256=dW9NXR_bttDGLwelVi7qXXlLd96KX-TKG8xnHejA6u0,24041
+ ddeutil/workflow/on.py,sha256=rneZB5HyFWTBWriGef999bovA3glQIK6LTgC996q9Gc,7334
+ ddeutil/workflow/repeat.py,sha256=9uKku5uMcQgzY5fWyaJMwJ0wPFX0oTwmu7vXKdgB_ec,4923
+ ddeutil/workflow/route.py,sha256=JALwOH6xKu5rnII7DgA1Lbp_E5ehCoBbOW_eKqB_Olk,6753
+ ddeutil/workflow/scheduler.py,sha256=Oa6bZpphjlGp0mXdBuLMk1m6G-dezaBNQxQX-SB3WJ0,47032
+ ddeutil/workflow/stage.py,sha256=fMv_oFkoqpfoewzPUMdl3-BQcrJ8SE53cF7es8yGxfs,25525
+ ddeutil/workflow/utils.py,sha256=lpnqGGd_Rw7eZo2wDbZ-NZNItBooFooPjwM4_40Csh8,25152
+ ddeutil_workflow-0.0.17.dist-info/LICENSE,sha256=nGFZ1QEhhhWeMHf9n99_fdt4vQaXS29xWKxt-OcLywk,1085
+ ddeutil_workflow-0.0.17.dist-info/METADATA,sha256=btmCr-yjy4gzhnZppfXjANfPH-3tKUJFGon2aOMUK30,13574
+ ddeutil_workflow-0.0.17.dist-info/WHEEL,sha256=OVMc5UfuAQiSplgO0_WdW7vXVGAt9Hdd6qtN4HotdyA,91
+ ddeutil_workflow-0.0.17.dist-info/entry_points.txt,sha256=0BVOgO3LdUdXVZ-CiHHDKxzEk2c8J30jEwHeKn2YCWI,62
+ ddeutil_workflow-0.0.17.dist-info/top_level.txt,sha256=m9M6XeSWDwt_yMsmH6gcOjHZVK5O0-vgtNBuncHjzW4,8
+ ddeutil_workflow-0.0.17.dist-info/RECORD,,
ddeutil_workflow-0.0.17.dist-info/WHEEL
@@ -1,5 +1,5 @@
  Wheel-Version: 1.0
- Generator: setuptools (75.1.0)
+ Generator: setuptools (75.2.0)
  Root-Is-Purelib: true
  Tag: py3-none-any

ddeutil/workflow/log.py DELETED
@@ -1,198 +0,0 @@
- # ------------------------------------------------------------------------------
- # Copyright (c) 2022 Korawich Anuttra. All rights reserved.
- # Licensed under the MIT License. See LICENSE in the project root for
- # license information.
- # ------------------------------------------------------------------------------
- from __future__ import annotations
-
- import json
- import logging
- import os
- from abc import ABC, abstractmethod
- from datetime import datetime
- from functools import lru_cache
- from pathlib import Path
- from typing import ClassVar, Optional, Union
-
- from ddeutil.core import str2bool
- from pydantic import BaseModel, Field
- from pydantic.functional_validators import model_validator
- from typing_extensions import Self
-
- from .__types import DictData
- from .utils import load_config
-
-
- @lru_cache
- def get_logger(name: str):
-     """Return logger object with an input module name.
-
-     :param name: A module name that want to log.
-     """
-     logger = logging.getLogger(name)
-     formatter = logging.Formatter(
-         fmt=(
-             "%(asctime)s.%(msecs)03d (%(name)-10s, %(process)-5d, "
-             "%(thread)-5d) [%(levelname)-7s] %(message)-120s "
-             "(%(filename)s:%(lineno)s)"
-         ),
-         datefmt="%Y-%m-%d %H:%M:%S",
-     )
-     stream = logging.StreamHandler()
-     stream.setFormatter(formatter)
-     logger.addHandler(stream)
-
-     debug: bool = str2bool(os.getenv("WORKFLOW_LOG_DEBUG_MODE", "true"))
-     logger.setLevel(logging.DEBUG if debug else logging.INFO)
-     return logger
-
-
- class BaseLog(BaseModel, ABC):
-     """Base Log Pydantic Model with abstraction class property that implement
-     only model fields. This model should to use with inherit to logging
-     sub-class like file, sqlite, etc.
-     """
-
-     name: str = Field(description="A workflow name.")
-     on: str = Field(description="A cronjob string of this piepline schedule.")
-     release: datetime = Field(description="A release datetime.")
-     context: DictData = Field(
-         default_factory=dict,
-         description=(
-             "A context data that receive from a workflow execution result.",
-         ),
-     )
-     parent_run_id: Optional[str] = Field(default=None)
-     run_id: str
-     update: datetime = Field(default_factory=datetime.now)
-
-     @model_validator(mode="after")
-     def __model_action(self) -> Self:
-         """Do before the Log action with WORKFLOW_LOG_ENABLE_WRITE env variable.
-
-         :rtype: Self
-         """
-         if str2bool(os.getenv("WORKFLOW_LOG_ENABLE_WRITE", "false")):
-             self.do_before()
-         return self
-
-     def do_before(self) -> None:
-         """To something before end up of initial log model."""
-
-     @abstractmethod
-     def save(self, excluded: list[str] | None) -> None:
-         """Save this model logging to target logging store."""
-         raise NotImplementedError("Log should implement ``save`` method.")
-
-
- class FileLog(BaseLog):
-     """File Log Pydantic Model that use to saving log data from result of
-     workflow execution. It inherit from BaseLog model that implement the
-     ``self.save`` method for file.
-     """
-
-     filename: ClassVar[str] = (
-         "./logs/workflow={name}/release={release:%Y%m%d%H%M%S}"
-     )
-
-     def do_before(self) -> None:
-         """Create directory of release before saving log file."""
-         self.pointer().mkdir(parents=True, exist_ok=True)
-
-     @classmethod
-     def find_logs(cls, name: str):
-         pointer: Path = (
-             load_config().engine.paths.root / f"./logs/workflow={name}"
-         )
-         for file in pointer.glob("./release=*/*.log"):
-             with file.open(mode="r", encoding="utf-8") as f:
-                 yield json.load(f)
-
-     @classmethod
-     def find_log(cls, name: str, release: datetime | None = None):
-         if release is not None:
-             pointer: Path = (
-                 load_config().engine.paths.root
-                 / f"./logs/workflow={name}/release={release:%Y%m%d%H%M%S}"
-             )
-             if not pointer.exists():
-                 raise FileNotFoundError(
-                     f"Pointer: ./logs/workflow={name}/"
-                     f"release={release:%Y%m%d%H%M%S} does not found."
-                 )
-             return cls.model_validate(
-                 obj=json.loads(pointer.read_text(encoding="utf-8"))
-             )
-         raise NotImplementedError("Find latest log does not implement yet.")
-
-     @classmethod
-     def is_pointed(
-         cls,
-         name: str,
-         release: datetime,
-         *,
-         queue: list[datetime] | None = None,
-     ) -> bool:
-         """Check this log already point in the destination.
-
-         :param name: A workflow name.
-         :param release: A release datetime.
-         :param queue: A list of queue of datetime that already run in the
-             future.
-         """
-         # NOTE: Check environ variable was set for real writing.
-         if not str2bool(os.getenv("WORKFLOW_LOG_ENABLE_WRITE", "false")):
-             return False
-
-         # NOTE: create pointer path that use the same logic of pointer method.
-         pointer: Path = load_config().engine.paths.root / cls.filename.format(
-             name=name, release=release
-         )
-
-         if not queue:
-             return pointer.exists()
-         return pointer.exists() or (release in queue)
-
-     def pointer(self) -> Path:
-         """Return release directory path that was generated from model data.
-
-         :rtype: Path
-         """
-         return load_config().engine.paths.root / self.filename.format(
-             name=self.name, release=self.release
-         )
-
-     def save(self, excluded: list[str] | None) -> Self:
-         """Save logging data that receive a context data from a workflow
-         execution result.
-
-         :param excluded: An excluded list of key name that want to pass in the
-             model_dump method.
-         :rtype: Self
-         """
-         # NOTE: Check environ variable was set for real writing.
-         if not str2bool(os.getenv("WORKFLOW_LOG_ENABLE_WRITE", "false")):
-             return self
-
-         log_file: Path = self.pointer() / f"{self.run_id}.log"
-         log_file.write_text(
-             json.dumps(
-                 self.model_dump(exclude=excluded),
-                 default=str,
-                 indent=2,
-             ),
-             encoding="utf-8",
-         )
-         return self
-
-
- class SQLiteLog(BaseLog):
-
-     def save(self, excluded: list[str] | None) -> None:
-         raise NotImplementedError("SQLiteLog does not implement yet.")
-
-
- Log = Union[
-     FileLog,
-     SQLiteLog,
- ]
ddeutil_workflow-0.0.15.dist-info/RECORD DELETED
@@ -1,22 +0,0 @@
- ddeutil/workflow/__about__.py,sha256=w_vBOopUg1crMbDyfdE0LgsxsncnhGYp0D39LSSnSVI,28
- ddeutil/workflow/__init__.py,sha256=-DIy8SGFsD7_wqp-V-K8v8jTxacmqrcyj_SFx1WS6qg,687
- ddeutil/workflow/__types.py,sha256=WWugALcayRiP0IQO-eBWK767_XxK7KGlY7SuVgyaJnk,3196
- ddeutil/workflow/api.py,sha256=cwju_qhY6m0kLtaoa77QLglC9tl7RjjZ4UnJYV3SlQQ,4810
- ddeutil/workflow/cli.py,sha256=Ikcq526WeIl-737-v55T0PwAZ2pNiZFxlN0Y-DjhDbQ,3374
- ddeutil/workflow/conf.py,sha256=D0g7rHXilpGwOD36QwVd9I5kEwqsAUA0Z3tAINS2Pws,1287
- ddeutil/workflow/cron.py,sha256=naWefHc3EnVo41Yf1zQeXOzF27YlTlnfj0XnQ6_HO-U,25514
- ddeutil/workflow/exceptions.py,sha256=Uf1-Tn8rAzj0aiVHSqo4fBqO80W0za7UFZgKv24E-tg,706
- ddeutil/workflow/job.py,sha256=9H_2C0ikD5y6jLVdIBj8de4CdSpS632XOfqYVhM4bHI,21582
- ddeutil/workflow/log.py,sha256=Ev-Szi0KC_MmbFY4g4BWv6tUSmcLKWKZ03ZInmYPmgU,6490
- ddeutil/workflow/on.py,sha256=vsZG19mNoztDSB_ObD_4ZWPKgHYpBDJMWw97ZiTavNE,7237
- ddeutil/workflow/repeat.py,sha256=e3dekPTlMlxCCizfBYsZ8dD8Juy4rtfqDZJU3Iky2oA,5011
- ddeutil/workflow/route.py,sha256=ABEk-WlVo9XGFc7zCPbckX33URCNH7woQFU1keX_8PQ,6970
- ddeutil/workflow/scheduler.py,sha256=12Dd5CVphOVKjUwoiB8dCHt4WpYRPG3dSOt-pR6NNxc,46167
- ddeutil/workflow/stage.py,sha256=Avz1Mbb8WAP6kFn0bnN0p14-EnQ_AzdKr435JRxjkao,23844
- ddeutil/workflow/utils.py,sha256=XUD5hoygAyxi4xo1spTacoDNGKN2TlRob_o8qfCj4Pc,30993
- ddeutil_workflow-0.0.15.dist-info/LICENSE,sha256=nGFZ1QEhhhWeMHf9n99_fdt4vQaXS29xWKxt-OcLywk,1085
- ddeutil_workflow-0.0.15.dist-info/METADATA,sha256=6JvY9y-cT3WnirRva45NS582Iz7ZuXJZpsiCtN57OoA,11653
- ddeutil_workflow-0.0.15.dist-info/WHEEL,sha256=GV9aMThwP_4oNCtvEC2ec3qUYutgWeAzklro_0m4WJQ,91
- ddeutil_workflow-0.0.15.dist-info/entry_points.txt,sha256=0BVOgO3LdUdXVZ-CiHHDKxzEk2c8J30jEwHeKn2YCWI,62
- ddeutil_workflow-0.0.15.dist-info/top_level.txt,sha256=m9M6XeSWDwt_yMsmH6gcOjHZVK5O0-vgtNBuncHjzW4,8
- ddeutil_workflow-0.0.15.dist-info/RECORD,,