ddeutil-workflow 0.0.5__py3-none-any.whl → 0.0.7__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
ddeutil_workflow-{0.0.5 → 0.0.7}.dist-info/METADATA

@@ -1,6 +1,6 @@
  Metadata-Version: 2.1
  Name: ddeutil-workflow
- Version: 0.0.5
+ Version: 0.0.7
  Summary: Data Developer & Engineer Workflow Utility Objects
  Author-email: ddeutils <korawich.anu@gmail.com>
  License: MIT
@@ -24,36 +24,42 @@ License-File: LICENSE
  Requires-Dist: fmtutil
  Requires-Dist: ddeutil-io
  Requires-Dist: python-dotenv ==1.0.1
- Requires-Dist: schedule ==1.2.2
+ Provides-Extra: api
+ Requires-Dist: fastapi[standard] ==0.112.0 ; extra == 'api'
+ Requires-Dist: apscheduler[sqlalchemy] <4.0.0,==3.10.4 ; extra == 'api'
+ Requires-Dist: croniter ==3.0.3 ; extra == 'api'
+ Provides-Extra: app
+ Requires-Dist: schedule <2.0.0,==1.2.2 ; extra == 'app'

- # Data Utility: _Workflow_
+ # Workflow

  [![test](https://github.com/ddeutils/ddeutil-workflow/actions/workflows/tests.yml/badge.svg?branch=main)](https://github.com/ddeutils/ddeutil-workflow/actions/workflows/tests.yml)
  [![python support version](https://img.shields.io/pypi/pyversions/ddeutil-workflow)](https://pypi.org/project/ddeutil-workflow/)
  [![size](https://img.shields.io/github/languages/code-size/ddeutils/ddeutil-workflow)](https://github.com/ddeutils/ddeutil-workflow)
  [![gh license](https://img.shields.io/github/license/ddeutils/ddeutil-workflow)](https://github.com/ddeutils/ddeutil-workflow/blob/main/LICENSE)

-
  **Table of Contents**:

  - [Installation](#installation)
  - [Getting Started](#getting-started)
-   - [Connection](#connection)
-   - [Dataset](#dataset)
-   - [Schedule](#schedule)
- - [Pipeline Examples](#examples)
-   - [Python & Shell](#python--shell)
-   - [Tasks (EL)](#tasks-extract--load)
-   - [Hooks (T)](#tasks-transform)
+   - [On](#on)
+   - [Pipeline](#pipeline)
+ - [Usage](#usage)
+   - [Python & Bash](#python--bash)
+   - [Hook (EL)](#hook-extract--load)
+   - [Hook (T)](#hook-transform)
  - [Configuration](#configuration)
+ - [Deployment](#deployment)

- This **Utility Workflow** objects was created for easy to make a simple metadata
- driven pipeline that able to **ETL, T, EL, or ELT** by `.yaml` file.
+ This **Workflow** objects was created for easy to make a simple metadata
+ driven for data pipeline orchestration that able to use for **ETL, T, EL, or
+ ELT** by a `.yaml` file template.

- I think we should not create the multiple pipeline per use-case if we able to
- write some dynamic pipeline that just change the input parameters per use-case
- instead. This way we can handle a lot of pipelines in our orgs with metadata only.
- It called **Metadata Driven**.
+ In my opinion, I think it should not create duplicate pipeline codes if I can
+ write with dynamic input parameters on the one template pipeline that just change
+ the input parameters per use-case instead.
+ This way I can handle a lot of logical pipelines in our orgs with only metadata
+ configuration. It called **Metadata Driven Data Pipeline**.

  Next, we should get some monitoring tools for manage logging that return from
  pipeline running. Because it not show us what is a use-case that running data
@@ -70,7 +76,16 @@ pipeline.
  pip install ddeutil-workflow
  ```

- This project need `ddeutil-io`, `ddeutil-model` extension namespace packages.
+ This project need `ddeutil-io` extension namespace packages. If you want to install
+ this package with application add-ons, you should add `app` in installation;
+
+ ```shell
+ pip install ddeutil-workflow[app]
+ ```
+
+ ```shell
+ pip install ddeutil-workflow[api]
+ ```

  ## Getting Started

@@ -87,38 +102,42 @@ will passing parameters and catching the output for re-use it to next step.
  > dynamic registries instead of main features because it have a lot of maintain
  > vendor codes and deps. (I do not have time to handle this features)

- ---
+ ### On

- ### Schedule
+ The **On** is schedule object.

  ```yaml
- schd_for_node:
-   type: schedule.Schedule
+ on_every_5_min:
+   type: on.On
    cron: "*/5 * * * *"
  ```

  ```python
- from ddeutil.workflow.on import Schedule
+ from ddeutil.workflow.on import On

- scdl = Schedule.from_loader(name='schd_for_node', externals={})
- assert '*/5 * * * *' == str(scdl.cronjob)
+ schedule = On.from_loader(name='on_every_5_min', externals={})
+ assert '*/5 * * * *' == str(schedule.cronjob)

- cron_iterate = scdl.generate('2022-01-01 00:00:00')
- assert '2022-01-01 00:05:00' f"{cron_iterate.next:%Y-%m-%d %H:%M:%S}"
- assert '2022-01-01 00:10:00' f"{cron_iterate.next:%Y-%m-%d %H:%M:%S}"
- assert '2022-01-01 00:15:00' f"{cron_iterate.next:%Y-%m-%d %H:%M:%S}"
- assert '2022-01-01 00:20:00' f"{cron_iterate.next:%Y-%m-%d %H:%M:%S}"
- assert '2022-01-01 00:25:00' f"{cron_iterate.next:%Y-%m-%d %H:%M:%S}"
+ cron_iter = schedule.generate('2022-01-01 00:00:00')
+ assert '2022-01-01 00:05:00' f"{cron_iter.next:%Y-%m-%d %H:%M:%S}"
+ assert '2022-01-01 00:10:00' f"{cron_iter.next:%Y-%m-%d %H:%M:%S}"
+ assert '2022-01-01 00:15:00' f"{cron_iter.next:%Y-%m-%d %H:%M:%S}"
+ assert '2022-01-01 00:20:00' f"{cron_iter.next:%Y-%m-%d %H:%M:%S}"
  ```

- ---
-
  ### Pipeline

+ The **Pipeline** object that is the core feature of this project.
+
  ```yaml
  run_py_local:
    type: ddeutil.workflow.pipeline.Pipeline
-   ...
+   on: 'on_every_5_min'
+   params:
+     author-run:
+       type: str
+     run-date:
+       type: datetime
  ```

  ```python
@@ -128,27 +147,39 @@ pipe = Pipeline.from_loader(name='run_py_local', externals={})
  pipe.execute(params={'author-run': 'Local Workflow', 'run-date': '2024-01-01'})
  ```

- ## Examples
+ > [!NOTE]
+ > The above parameter use short declarative statement. You can pass a parameter
+ > type to the key of a parameter name.
+ > ```yaml
+ > params:
+ >   author-run: str
+ >   run-date: datetime
+ > ```
+ >
+ > And for the type, you can remove `ddeutil.workflow` prefix because we can find
+ > it by looping search from `WORKFLOW_CORE_REGISTRY` value.
+
+ ## Usage

  This is examples that use workflow file for running common Data Engineering
  use-case.

- ### Python & Shell
+ > [!IMPORTANT]
+ > I recommend you to use `task` stage for all actions that you want to do with
+ > pipeline object.

- The state of doing lists that worker should to do. It be collection of the stage.
+ ### Python & Bash

  ```yaml
  run_py_local:
-   type: ddeutil.workflow.pipeline.Pipeline
+   type: pipeline.Pipeline
    params:
-     author-run:
-       type: str
-     run-date:
-       type: datetime
+     author-run: str
+     run-date: datetime
    jobs:
      first-job:
        stages:
-       - name: Printing Information
+       - name: "Printing Information"
          id: define-func
          run: |
            x = '${{ params.author-run }}'
@@ -157,7 +188,7 @@ run_py_local:
            def echo(name: str):
                print(f'Hello {name}')

-       - name: Run Sequence and use var from Above
+       - name: "Run Sequence and use var from Above"
          vars:
            x: ${{ params.author-run }}
          run: |
@@ -165,16 +196,16 @@ run_py_local:
            # Change x value
            x: int = 1

-       - name: Call Function
+       - name: "Call Function"
          vars:
            echo: ${{ stages.define-func.outputs.echo }}
          run: |
            echo('Caller')
      second-job:
        stages:
-       - name: Echo Shell Script
+       - name: "Echo Bash Script"
          id: shell-echo
-         shell: |
+         bash: |
            echo "Hello World from Shell"
  ```

@@ -192,24 +223,20 @@ pipe.execute(params={'author-run': 'Local Workflow', 'run-date': '2024-01-01'})
  > Hello World from Shell
  ```

- ---
-
- ### Tasks (Extract & Load)
+ ### Hook (Extract & Load)

  ```yaml
  pipe_el_pg_to_lake:
-   type: ddeutil.workflow.pipeline.Pipeline
+   type: pipeline.Pipeline
    params:
-     run-date:
-       type: datetime
-     author-email:
-       type: str
+     run-date: datetime
+     author-email: str
    jobs:
      extract-load:
        stages:
        - name: "Extract Load from Postgres to Lake"
          id: extract-load
-         task: tasks/postgres-to-delta@polars
+         uses: tasks/postgres-to-delta@polars
          with:
            source:
              conn: conn_postgres_url
@@ -221,15 +248,23 @@ pipe_el_pg_to_lake:
            endpoint: "/${{ params.name }}"
  ```

- ---
+ Implement hook:

- ### Tasks (Transform)
+ ```python
+ from ddeutil.workflow.utils import tag

- > I recommend you to use task for all actions that you want to do.
+ @tag('polars', alias='postgres-to-delta')
+ def postgres_to_delta(source, sink):
+     return {
+         "source": source, "sink": sink
+     }
+ ```
+
+ ### Hook (Transform)

  ```yaml
- pipe_hook_mssql_proc:
-   type: ddeutil.workflow.pipeline.Pipeline
+ pipeline_hook_mssql_proc:
+   type: pipeline.Pipeline
    params:
      run_date: datetime
      sp_name: str
@@ -240,7 +275,7 @@ pipe_hook_mssql_proc:
        stages:
        - name: "Transform Data in MS SQL Server"
          id: transform
-         task: tasks/mssql-proc@odbc
+         uses: tasks/mssql-proc@odbc
          with:
            exec: ${{ params.sp_name }}
            params:
@@ -250,16 +285,57 @@ pipe_hook_mssql_proc:
            target: ${{ params.target_name }}
  ```

- > [!NOTE]
- > The above parameter use short declarative statement. You can pass a parameter
- > type to the key of a parameter name.
+ Implement hook:
+
+ ```python
+ from ddeutil.workflow.utils import tag
+
+ @tag('odbc', alias='mssql-proc')
+ def odbc_mssql_procedure(_exec: str, params: dict):
+     return {
+         "exec": _exec, "params": params
+     }
+ ```

  ## Configuration

- ```text
+ ```bash
+ export WORKFLOW_ROOT_PATH=.
+ export WORKFLOW_CORE_REGISTRY=ddeutil.workflow,tests.utils
+ export WORKFLOW_CORE_REGISTRY_FILTER=ddeutil.workflow.utils
+ export WORKFLOW_CORE_PATH_CONF=conf
+ export WORKFLOW_CORE_TIMEZONE=Asia/Bangkok
+ export WORKFLOW_CORE_DEFAULT_STAGE_ID=true

+ export WORKFLOW_CORE_MAX_PIPELINE_POKING=4
+ export WORKFLOW_CORE_MAX_JOB_PARALLEL=2
  ```

- ## License
+ Application config:

- This project was licensed under the terms of the [MIT license](LICENSE).
+ ```bash
+ export WORKFLOW_APP_DB_URL=postgresql+asyncpg://user:pass@localhost:5432/schedule
+ export WORKFLOW_APP_INTERVAL=10
+ ```
+
+ ## Deployment
+
+ This package able to run as a application service for receive manual trigger
+ from the master node via RestAPI or use to be Scheduler background service
+ like crontab job but via Python API.
+
+ ### Schedule Service
+
+ ```shell
+ (venv) $ python src.ddeutil.workflow.app
+ ```
+
+ ### API Server
+
+ ```shell
+ (venv) $ uvicorn src.ddeutil.workflow.api:app --host 0.0.0.0 --port 80 --reload
+ ```
+
+ > [!NOTE]
+ > If this package already deploy, it able to use
+ > `uvicorn ddeutil.workflow.api:app --host 0.0.0.0 --port 80`
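The "Implement hook" snippets in the README diff above register plain functions with `@tag(<tag-name>, alias=<alias>)` so that a stage's `uses: tasks/<alias>@<tag-name>` can find them. The package's real decorator lives in `ddeutil.workflow.utils` and its internals are not shown in this diff; the following is a hypothetical stdlib-only sketch of how such a registry could behave:

```python
from typing import Any, Callable

# Hypothetical registry keyed by (alias, tag-name); the actual storage used
# by ddeutil.workflow.utils.tag is not shown in this diff.
REGISTRY: dict[tuple[str, str], Callable[..., Any]] = {}

def tag(name: str, alias: str) -> Callable[[Callable[..., Any]], Callable[..., Any]]:
    """Register a hook function under (alias, name), mirroring the
    `tasks/<alias>@<name>` reference syntax used by `uses:` in stages."""
    def decorator(func: Callable[..., Any]) -> Callable[..., Any]:
        REGISTRY[(alias, name)] = func
        return func
    return decorator

@tag('polars', alias='postgres-to-delta')
def postgres_to_delta(source, sink):
    # Same shape as the hook shown in the README diff.
    return {"source": source, "sink": sink}

# Resolving `uses: tasks/postgres-to-delta@polars` then becomes a dict lookup.
hook = REGISTRY[('postgres-to-delta', 'polars')]
result = hook('conn_postgres_url', 'conn_az_lake')
```

A registry keyed on both alias and tag name is what lets two implementations share one alias, as the removed `tasks/dummy.py` later in this diff does with `el-csv-to-parquet` under two different tags.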
ddeutil_workflow-0.0.7.dist-info/RECORD (added)

@@ -0,0 +1,20 @@
+ ddeutil/workflow/__about__.py,sha256=b23XabBwtuoPOLmS_Hj_gSA4LZ0fRfAkACM6c3szVoc,27
+ ddeutil/workflow/__init__.py,sha256=4PEL3RdHmUowK0Dz-tK7fO0wvFX4u9CLd0Up7b3lrAQ,760
+ ddeutil/workflow/__types.py,sha256=SYMoxbENQX8uPsiCZkjtpHAqqHOh8rUrarAFicAJd0E,1773
+ ddeutil/workflow/api.py,sha256=d2Mmv9jTtN3FITIy-2mivyAKdBOGZxtkNWRMPbCLlFI,3341
+ ddeutil/workflow/app.py,sha256=GbdwvUkE8lO2Ze4pZ0-J-7p9mcZAaORfjkHwW_oZIP0,1076
+ ddeutil/workflow/exceptions.py,sha256=BH7COn_3uz3z7oJBZOQGiuo8osBFgeXL8HYymnjCOPQ,671
+ ddeutil/workflow/loader.py,sha256=_ZD-XP5P7VbUeqItrUVPaKIZu6dMUZ2aywbCbReW1hQ,2778
+ ddeutil/workflow/log.py,sha256=_GJEdJr7bqpcQDxZjrqHd-hkiW3NKFaVoR6voE6Ty0o,952
+ ddeutil/workflow/on.py,sha256=YoEqDbzJUwqOA3JRltbvlYr0rNTtxdmb7cWMxl8U19k,6717
+ ddeutil/workflow/pipeline.py,sha256=dKF09TFS_v5TCD-5o8tp1UhB5sGuWIQu4zl_UFtlIC0,25951
+ ddeutil/workflow/repeat.py,sha256=sNoRfbOR4cYm_edrSvlVy9N8Dk_osLIq9FC5GMZz32M,4621
+ ddeutil/workflow/route.py,sha256=Ck_O1xJwI-vKkMJr37El0-1PGKlwKF8__DDNWVQrf0A,2079
+ ddeutil/workflow/scheduler.py,sha256=FqmkvWCqwJ4eRf8aDn5Ce4FcNWqmcvu2aTTfL34lfgs,22184
+ ddeutil/workflow/stage.py,sha256=z05bKk2QFQDXjidSnQYCVOdceSpSO13sHXE0B1UH6XA,14978
+ ddeutil/workflow/utils.py,sha256=pDM2jaYVP-USH0pLd_XmHOguxVPGVzZ76hOh1AZdINU,18495
+ ddeutil_workflow-0.0.7.dist-info/LICENSE,sha256=nGFZ1QEhhhWeMHf9n99_fdt4vQaXS29xWKxt-OcLywk,1085
+ ddeutil_workflow-0.0.7.dist-info/METADATA,sha256=ba2nH57cpHB2P4ldQCRT8ZWDj3r1OPx9a1dgcB0a2Ws,9702
+ ddeutil_workflow-0.0.7.dist-info/WHEEL,sha256=HiCZjzuy6Dw0hdX5R3LCFPDmFS4BWl8H-8W39XfmgX4,91
+ ddeutil_workflow-0.0.7.dist-info/top_level.txt,sha256=m9M6XeSWDwt_yMsmH6gcOjHZVK5O0-vgtNBuncHjzW4,8
+ ddeutil_workflow-0.0.7.dist-info/RECORD,,
ddeutil_workflow-{0.0.5 → 0.0.7}.dist-info/WHEEL

@@ -1,5 +1,5 @@
  Wheel-Version: 1.0
- Generator: setuptools (72.1.0)
+ Generator: setuptools (72.2.0)
  Root-Is-Purelib: true
  Tag: py3-none-any

ddeutil/workflow/__regex.py (removed)

@@ -1,44 +0,0 @@
- # -------------------------------------------------------------------------
- # Copyright (c) 2022 Korawich Anuttra. All rights reserved.
- # Licensed under the MIT License. See LICENSE in the project root for
- # license information.
- # --------------------------------------------------------------------------
- import re
- from re import (
-     IGNORECASE,
-     MULTILINE,
-     UNICODE,
-     VERBOSE,
-     Pattern,
- )
-
-
- class RegexConf:
-     """Regular expression config."""
-
-     # NOTE: Search caller
-     __re_caller: str = r"""
-         \$
-         {{
-             \s*(?P<caller>
-                 [a-zA-Z0-9_.\s'\"\[\]\(\)\-\{}]+?
-             )\s*
-         }}
-     """
-     RE_CALLER: Pattern = re.compile(
-         __re_caller, MULTILINE | IGNORECASE | UNICODE | VERBOSE
-     )
-
-     # NOTE: Search task
-     __re_task_fmt: str = r"""
-         ^
-         (?P<path>[^/@]+)
-         /
-         (?P<func>[^@]+)
-         @
-         (?P<tag>.+)
-         $
-     """
-     RE_TASK_FMT: Pattern = re.compile(
-         __re_task_fmt, MULTILINE | IGNORECASE | UNICODE | VERBOSE
-     )
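The removed `__regex.py` above defined `RE_TASK_FMT` for splitting a `path/func@tag` task reference into its parts. The pattern still describes the `uses:` values in the 0.0.7 README (e.g. `tasks/postgres-to-delta@polars`), and can be exercised directly with the stdlib:

```python
import re

# RE_TASK_FMT reproduced from the removed __regex.py shown above.
RE_TASK_FMT = re.compile(
    r"""
    ^
    (?P<path>[^/@]+)
    /
    (?P<func>[^@]+)
    @
    (?P<tag>.+)
    $
    """,
    re.MULTILINE | re.IGNORECASE | re.UNICODE | re.VERBOSE,
)

# Parse a `uses:` reference from the README diff into (path, func, tag).
m = RE_TASK_FMT.match("tasks/postgres-to-delta@polars")
print(m.group("path"), m.group("func"), m.group("tag"))
# → tasks postgres-to-delta polars
```

The named groups line up with the hook registry convention in the README: `func` is the `alias` passed to `@tag`, and `tag` is the tag name.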
ddeutil/workflow/tasks/__init__.py (removed)

@@ -1,6 +0,0 @@
- # ------------------------------------------------------------------------------
- # Copyright (c) 2022 Korawich Anuttra. All rights reserved.
- # Licensed under the MIT License. See LICENSE in the project root for
- # license information.
- # ------------------------------------------------------------------------------
- from .dummy import *
ddeutil/workflow/tasks/dummy.py (removed)

@@ -1,52 +0,0 @@
- # ------------------------------------------------------------------------------
- # Copyright (c) 2022 Korawich Anuttra. All rights reserved.
- # Licensed under the MIT License. See LICENSE in the project root for
- # license information.
- # ------------------------------------------------------------------------------
- from __future__ import annotations
-
- from typing import Any
-
- from ddeutil.workflow.utils import tag
-
-
- @tag("polars-dir", name="el-csv-to-parquet")
- def dummy_task_1(
-     source: str,
-     sink: str,
-     conversion: dict[str, Any] | None = None,
- ) -> dict[str, int]:
-     """Extract Load data from CSV to Parquet file.
-
-     :param source:
-     :param sink:
-     :param conversion:
-     """
-     print("Start EL for CSV to Parquet with Polars Engine")
-     print("---")
-     print(f"Reading data from {source}")
-
-     conversion: dict[str, Any] = conversion or {}
-     if conversion:
-         print("Start Schema Conversion ...")
-
-     print(f"Writing data to {sink}")
-     return {"records": 1}
-
-
- @tag("polars-dir-scan", name="el-csv-to-parquet")
- def dummy_task_2(
-     source: str,
-     sink: str,
-     conversion: dict[str, Any] | None = None,
- ) -> dict[str, int]:
-     print("Start EL for CSV to Parquet with Polars Engine")
-     print("---")
-     print(f"Reading data from {source}")
-
-     conversion: dict[str, Any] = conversion or {}
-     if conversion:
-         print("Start Schema Conversion ...")
-
-     print(f"Writing data to {sink}")
-     return {"records": 1}
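Stages feed tasks like the removed dummy tasks above through `${{ ... }}` template callers (for example `x = '${{ params.author-run }}'` in the README diff). A small sketch of how such a caller could be substituted, using a simplified pattern in the spirit of `RE_CALLER` from the removed `__regex.py` (the real pattern allows a wider character class, and the package's actual substitution logic is not shown in this diff):

```python
import re

# Simplified stand-in for RE_CALLER: match `${{ <caller> }}` and capture
# the dotted caller path, e.g. "params.author-run".
RE_CALLER = re.compile(r"\$\{\{\s*(?P<caller>[^}]+?)\s*\}\}")

def substitute(template: str, params: dict) -> str:
    """Replace each `${{ params.<name> }}` caller with its parameter value."""
    def repl(m: re.Match) -> str:
        caller = m.group("caller")        # e.g. "params.author-run"
        _, _, key = caller.partition(".") # drop the "params" namespace
        return str(params[key])
    return RE_CALLER.sub(repl, template)

print(substitute("x = '${{ params.author-run }}'", {"author-run": "Local Workflow"}))
# → x = 'Local Workflow'
```

The same substitution applies to `with:` values such as `endpoint: "/${{ params.name }}"` in the Hook (Extract & Load) example.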
ddeutil_workflow-0.0.5.dist-info/RECORD (removed)

@@ -1,17 +0,0 @@
- ddeutil/workflow/__about__.py,sha256=jgkUUyo8sKJnE1-6McC_AbxbZqvAFoYRfSE3HCAexlk,27
- ddeutil/workflow/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
- ddeutil/workflow/__regex.py,sha256=bOngaQ0zJgy3vfNwF2MlI8XhLu_Ei1Vz8y50iLj8ao4,1061
- ddeutil/workflow/__scheduler.py,sha256=wSzv6EN2Nx99lLxzrA80qfqQ_AxOFJAOk_EZYgafVzk,20965
- ddeutil/workflow/__types.py,sha256=AkpQq6QlrclpurCZZVY9RMxoyS9z2WGzhaz_ikeTaCU,453
- ddeutil/workflow/exceptions.py,sha256=XAq82VHSMLNb4UjGatp7hYfjxFtMiKFtBqJyAhwTl-s,434
- ddeutil/workflow/loader.py,sha256=xiOtxluhLXfryMp3q1OIJggykr01WENKV7zBkcJ9-Yc,5763
- ddeutil/workflow/on.py,sha256=vYNcvh74MN2ttd9KOehngOhJDNqDheFbbsXeEhDSqyk,5280
- ddeutil/workflow/pipeline.py,sha256=tnzeAQrXb_-OIo53IGv6LxZoMiOJyMPWXAbisdBCXHI,19298
- ddeutil/workflow/utils.py,sha256=A-k3L4OUFFq6utlsbpEVMGyofiLbDrvThM8IqFDE9gE,6093
- ddeutil/workflow/tasks/__init__.py,sha256=HcQ7xNETFOKovMOs4lL2Pl8hXFZ515jU5Mc3LFZcSGE,336
- ddeutil/workflow/tasks/dummy.py,sha256=b_y6eHGxj4aQ-ZmvcNL7aBHu3eIzL6BeXgqj0MDqSPw,1460
- ddeutil_workflow-0.0.5.dist-info/LICENSE,sha256=nGFZ1QEhhhWeMHf9n99_fdt4vQaXS29xWKxt-OcLywk,1085
- ddeutil_workflow-0.0.5.dist-info/METADATA,sha256=dcZadiwnhPD6DnoBaugdPHlOw108lRCmhDD7KF2s-Dg,7717
- ddeutil_workflow-0.0.5.dist-info/WHEEL,sha256=R0nc6qTxuoLk7ShA2_Y-UWkN8ZdfDBG2B6Eqpz2WXbs,91
- ddeutil_workflow-0.0.5.dist-info/top_level.txt,sha256=m9M6XeSWDwt_yMsmH6gcOjHZVK5O0-vgtNBuncHjzW4,8
- ddeutil_workflow-0.0.5.dist-info/RECORD,,