task-pipeliner 0.3.2 → 0.3.4
This diff shows the changes between two published versions of the package as they appear in their public registry. It is provided for informational purposes only.
- package/README.ko.md +77 -38
- package/README.md +81 -42
- package/dist/index.cjs +169 -77
- package/package.json +1 -1
package/README.ko.md
CHANGED
@@ -2,17 +2,31 @@

 > 조건 기반 실행과 아름다운 CLI 출력을 제공하는 강력한 워크플로우 오케스트레이션 도구

-**버전:** 0.3.2
+**버전:** 0.3.4



 [](https://www.npmjs.com/package/task-pipeliner)
 [](https://opensource.org/licenses/MIT)

-**task-pipeliner**는 간단한 YAML 또는 JSON 파일로 복잡한 워크플로우를 정의, 조율, 실행할 수 있는 현대적인 워크플로우 오케스트레이션 도구입니다. 조건부 실행, 병렬 작업, 대화형 프롬프트, 그리고 아름다운 터미널 출력을 제공하여 빌드 스크립트, 배포 워크플로우, CI/CD 파이프라인에 완벽합니다.
+**task-pipeliner**는 간단한 YAML 또는 JSON 파일로 복잡한 워크플로우를 정의, 조율, 실행할 수 있는 현대적인 워크플로우 오케스트레이션 도구입니다. 조건부 실행, 병렬 작업, 대화형 프롬프트, 그리고 아름다운 터미널 출력을 제공하여 빌드 스크립트, 배포 워크플로우, CI/CD 파이프라인에 완벽합니다. *아직 베타 버전이라 인터페이스가 조금씩 수정될 수 있습니다.*

 **README-Language-Map** [KR [한국어 버전]](https://github.com/racgoo/task-pipeliner/blob/main/README.ko.md) / [EN [English Version]](https://github.com/racgoo/task-pipeliner)

+## 리소스
+
+### 문서 및 도구
+
+- 📚 **[문서](https://task-pipeliner.racgoo.com/)** - 완전한 DSL 참조 및 가이드
+- 🎨 **[워크플로우 생성기](https://task-pipeliner-generator.racgoo.com/)** - 브라우저에서 시각적으로 워크플로우 생성
+
+### 저장소 및 패키지 관리자
+
+- 💻 **[GitHub](https://github.com/racgoo/task-pipeliner)** - 소스 코드 및 이슈 추적
+- 📦 **[npm](https://www.npmjs.com/package/task-pipeliner)** - npm 레지스트리 패키지
+- 🍺 **[Homebrew](https://github.com/racgoo/homebrew-task-pipeliner)** - macOS/Linux용 Homebrew 탭
+- 🪟 **[Scoop](https://github.com/racgoo/scoop-task-pipeliner)** - Windows용 Scoop 버킷
+
 ## ✨ 주요 기능

 - **조건 기반 실행** - 파일 존재 여부, 사용자 선택, 환경 변수 등을 기반으로 단계 실행

@@ -31,26 +45,18 @@

 - **워크플로우 스케줄링** - cron 표현식을 사용하여 지정된 시간에 워크플로우 자동 실행

-## 리소스
-
-### 문서 및 도구
-
-- 📚 **[문서](https://task-pipeliner.racgoo.com/)** - 완전한 DSL 참조 및 가이드
-- 🎨 **[워크플로우 생성기](https://task-pipeliner-generator.racgoo.com/)** - 브라우저에서 시각적으로 워크플로우 생성
-
-### 저장소 및 패키지 관리자
-
-- 💻 **[GitHub](https://github.com/racgoo/task-pipeliner)** - 소스 코드 및 이슈 추적
-- 📦 **[npm](https://www.npmjs.com/package/task-pipeliner)** - npm 레지스트리 패키지
-- 🍺 **[Homebrew](https://github.com/racgoo/homebrew-task-pipeliner)** - macOS/Linux용 Homebrew 탭
-- 🪟 **[Scoop](https://github.com/racgoo/scoop-task-pipeliner)** - Windows용 Scoop 버킷
-
 ### CLI 명령어

+**프로젝트 셋업 (신규 프로젝트 권장):**
+```bash
+tp setup # tp/, tp/workflows, tp/schedules 생성 및 예시 워크플로우 2개·예시 스케줄 2개 추가 (echo 기반; choose, when, profiles, prompt 포함)
+```
+프로젝트 루트에서 실행. `tp/workflows/`, `tp/schedules/`를 만들고, choose·when·profiles·prompt를 쓴 예시 워크플로우와 cron·프로필을 쓴 예시 스케줄을 넣습니다. 이미 있는 파일은 덮어쓰지 않습니다.
+
 **워크플로우 실행:**
 ```bash
 tp run workflow.yaml # 워크플로우 실행
-tp run # 가장 가까운 tp 디렉토리에서 워크플로우 선택하여 실행
+tp run # 가장 가까운 tp/workflows 디렉토리에서 워크플로우 선택하여 실행
 tp run workflow.yaml --profile Test # 프로필로 실행 (프로필에 설정된 변수는 choose/prompt 생략)
 tp run workflow.yaml -p Test # 프로필 짧은 형식
 tp run workflow.yaml --silent # 사일런트 모드로 실행 (모든 콘솔 출력 억제)

@@ -74,16 +80,18 @@ tp history remove-all # 모든 히스토리 삭제
 **워크플로우 스케줄링:**
 ```bash
 tp schedule # 모든 스케줄 보기 (tp schedule list와 동일)
-tp schedule list # 스케줄 목록 및 데몬 상태
-tp schedule add schedules.yaml # 스케줄 파일에서
-tp schedule
+tp schedule list # 스케줄 목록 및 데몬 상태 (각 스케줄을 카드로 표시: cron, 언제 실행되는지 설명, 다음 실행 시각 등)
+tp schedule add schedules.yaml # 스케줄 파일에서 추가; 파일 경로 생략 시 가장 가까운 tp/schedules에서 선택
+tp schedule add # 가장 가까운 tp/schedules 디렉토리에서 스케줄 파일 선택
+tp schedule remove # 스케줄 삭제; 삭제 후 삭제된 스케줄을 list와 동일한 카드 형식으로 표시
 tp schedule remove-all # 모든 스케줄 삭제
-tp schedule toggle # 스케줄
+tp schedule toggle # 스케줄 활성화/비활성화; 토글 후 ENABLED/DISABLED를 굵게·색상으로 강조하고 스케줄 카드 표시
 tp schedule start # 포그라운드 모드로 스케줄러 시작
 tp schedule start -d # 백그라운드 데몬 모드로 스케줄러 시작
 tp schedule stop # 스케줄러 데몬 종료
 tp schedule status # 데몬·스케줄 상태 확인 (실시간 모드; Ctrl+C는 화면만 종료, 데몬은 계속 실행)
 ```
+`tp schedule add`, `toggle`, `remove` 후에는 해당 스케줄이 `tp schedule list`와 같은 카드 레이아웃(크론, 언제 실행되는지 설명, 다음 실행, 활성 여부)으로 표시됩니다. 토글 후에는 ENABLED/DISABLED가 강조되어 새 상태를 한눈에 알 수 있습니다.

 **데이터 및 업그레이드:**
 ```bash

@@ -114,6 +122,13 @@ task-pipeliner run workflow.yaml
 tp run workflow.yaml
 ```

+설치 확인:
+```bash
+task-pipeliner --version
+# 또는
+tp --version
+```
+
 **업데이트:**
 ```bash
 # 먼저 Homebrew의 패키지 레지스트리를 업데이트합니다

@@ -144,6 +159,13 @@ task-pipeliner run workflow.yaml
 tp run workflow.yaml
 ```

+설치 확인:
+```bash
+task-pipeliner --version
+# 또는
+tp --version
+```
+
 **업데이트:**
 ```bash
 scoop update task-pipeliner

@@ -168,6 +190,13 @@ task-pipeliner run workflow.yaml
 tp run workflow.yaml
 ```

+설치 확인:
+```bash
+task-pipeliner --version
+# 또는
+tp --version
+```
+
 #### 프로젝트 설치 (개발 모드)

 프로젝트에 devDependency로 설치하면 `npx`로 사용할 수 있습니다:

@@ -280,26 +309,31 @@ tp run workflow.yaml -s

 **`tp` 디렉토리 사용하기 (권장):**

-
+권장 프로젝트 구조는 `tp` 디렉토리 아래에 두 개의 하위 디렉토리를 두는 방식입니다.
+
+- **`tp/workflows/`** – 워크플로우 파일(YAML 또는 JSON). 파일 없이 `tp run`을 실행하면 task-pipeliner가 가장 가까운 `tp` 디렉토리를 찾고, **`tp/workflows/`** 안에서 실행할 워크플로우를 선택할 수 있게 합니다.
+- **`tp/schedules/`** – 스케줄 파일(YAML 또는 JSON). `tp schedule add` 실행 시 파일 경로를 주지 않으면 가장 가까운 **`tp/schedules/`** 안에서 스케줄 파일을 선택할 수 있습니다.
+
+**빠른 셋업:** 프로젝트 루트에서 `tp setup`을 실행하면 `tp/`, `tp/workflows/`, `tp/schedules/`가 생성되고, 예시 워크플로우 2개와 예시 스케줄 파일 2개(echo 기반, choose·when·profiles·prompt 및 스케줄 프로필 사용 예시 포함)가 추가됩니다. 이미 있는 파일은 덮어쓰지 않습니다.

 ```bash
-# tp
-
-mv workflow.yaml tp/
+# 방법 1: tp setup 사용 (tp/workflows, tp/schedules + 예시 생성)
+tp setup

-#
+# 방법 2: 수동으로 구조 만들기
+mkdir -p tp/workflows tp/schedules
+mv workflow.yaml tp/workflows/
+
+# 파일 없이 실행 - tp/workflows에서 대화형 선택
 tp run
 ```

-
-1. 가장 가까운 `tp` 디렉토리를 찾습니다 (현재 디렉토리 또는 상위
-2.
-3.
-- 타이핑하여 실시간으로 워크플로우 필터링
-- 화살표 키 (↑↓)로 탐색
-- Enter를 눌러 선택하고 실행
+파일을 지정하지 않고 `tp run`을 실행하면:
+1. 가장 가까운 `tp` 디렉토리를 찾습니다 (현재 디렉토리 또는 상위).
+2. **`tp/workflows/`** 안의 모든 워크플로우 파일 (`.yaml`, `.yml`, `.json`)을 나열합니다.
+3. 타이핑으로 필터, 화살표(↑↓)로 이동, Enter로 선택·실행하는 대화형 검색 메뉴를 띄웁니다.

-
+메뉴에는 파일 이름과 워크플로우의 `name`(YAML/JSON)이 함께 표시되어 구분하기 쉽습니다.

 **사일런트 모드:**
 `--silent` (또는 `-s`) 플래그는 워크플로우 실행 중 모든 콘솔 출력을 억제합니다. 다음 경우에 유용합니다:

@@ -1429,9 +1463,11 @@ schedules:

 ```bash
 tp schedule add schedules.yaml
+# 또는 경로 없이: 가장 가까운 tp/schedules/에서 파일 선택
+tp schedule add
 ```

-각 스케줄에 대해 별칭을 확인하거나 변경할 수
+각 스케줄에 대해 별칭을 확인하거나 변경할 수 있습니다. 추가 후에는 추가된 스케줄이 `tp schedule list`와 같은 카드 형식(크론, 언제 실행되는지 설명, 다음 실행, 활성 여부)으로 표시됩니다.

 **Cron 표현식 형식:**


@@ -1464,19 +1500,21 @@ tp schedule add schedules.yaml
 ### 스케줄 관리

 ```bash
-# 모든 스케줄 목록 보기
+# 모든 스케줄 목록 보기 (카드 레이아웃: cron, "언제" 설명, 다음 실행 등)
 tp schedule list

-# 스케줄 삭제
+# 스케줄 삭제 (삭제 후 삭제된 스케줄을 동일한 카드 형식으로 표시)
 tp schedule remove

 # 모든 스케줄 삭제
 tp schedule remove-all

-# 스케줄 활성화/비활성화
+# 스케줄 활성화/비활성화 (토글 후 ENABLED/DISABLED를 굵게·색상으로 강조하고 스케줄 카드 표시)
 tp schedule toggle
 ```

+**통일된 스케줄 UI:** list, add, toggle, remove 모두 동일한 스케줄 카드 레이아웃을 사용합니다. 각 카드에는 크론 표현식, 언제 실행되는지 사람이 읽기 쉬운 설명(예: "매분"), 타임존, 워크플로우 경로, 프로필(설정 시), 마지막 실행, 다음 실행이 표시됩니다. `tp schedule toggle` 후에는 새 상태(ENABLED는 초록, DISABLED는 회색)가 강조되어 한눈에 구분할 수 있습니다.
+
 ### 스케줄러 실행

 예약된 시간에 워크플로우를 실행하려면 스케줄러를 시작하세요. 두 가지 모드로 실행할 수 있습니다:

@@ -1531,6 +1569,7 @@ tp schedule status -n # 한 번만 표시 후 종료 (갱신 없음)

 완전한 프로젝트 예제는 `examples/` 디렉토리를 확인하세요:

+- **`tp setup`** – 프로젝트 루트에서 `tp setup`을 실행하면 `tp/workflows/`, `tp/schedules/`와 예시 워크플로우 2개(choose, when, profiles, prompt), 예시 스케줄 파일 2개(프로필 사용 포함)가 생성됩니다. 모든 단계는 `echo`로 되어 있어 안전하게 실행한 뒤 실제 명령으로 바꿀 수 있습니다.
 - **`monorepo-example/`** - 여러 프로젝트가 있는 모노레포 워크플로우
 - **`simple-project/`** - 간단한 단일 프로젝트 워크플로우
 - **`react-app/`** - React 애플리케이션 빌드 및 배포
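The README changes above (and their English equivalents below) describe `tp setup` generating echo-based example workflows that combine `choose`, `when`, `profiles`, and `prompt`. As a rough sketch of how those pieces fit together in the workflow DSL, the YAML below is illustrative only: the file name, variable names, and step values are assumptions, not the actual example files that `tp setup` ships.

```yaml
# Illustrative sketch only - file name, variable names, and echo commands are
# assumptions; they are not the actual examples generated by `tp setup`.
name: example-deploy

profiles:
  - name: Test          # `tp run example.yaml --profile Test` pre-sets these variables,
    var:                # so the matching choose/prompt steps below are skipped
      target: staging
      tag: latest

steps:
  - choose:
      message: "Where do you want to deploy?"
      options:
        - id: staging
          label: "Staging"
        - id: production
          label: "Production"
      as: target

  - prompt:
      message: "Image tag to deploy?"
      as: tag
      default: latest

  - run: echo "Deploying {{tag}} to {{target}}"

  - run: echo "Extra production-only step"
    when:
      var:
        target: production
```

Run interactively with `tp run example.yaml`, or non-interactively with `tp run example.yaml --profile Test` (or `-p Test`), which answers both questions from the profile, matching the profile behaviour described in the CLI command list above.
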
package/README.md
CHANGED
@@ -2,35 +2,17 @@

 > A powerful workflow orchestration tool with condition-based execution and beautiful CLI output

-**Version:** 0.3.2
+**Version:** 0.3.4



 [](https://www.npmjs.com/package/task-pipeliner)
 [](https://opensource.org/licenses/MIT)

-**task-pipeliner** is a modern workflow orchestration tool that lets you define, coordinate, and execute complex workflows using simple YAML or JSON files. With conditional execution, parallel tasks, interactive prompts, and beautiful terminal output, it's perfect for build scripts, deployment workflows, and CI/CD pipelines.
+**task-pipeliner** is a modern workflow orchestration tool that lets you define, coordinate, and execute complex workflows using simple YAML or JSON files. With conditional execution, parallel tasks, interactive prompts, and beautiful terminal output, it's perfect for build scripts, deployment workflows, and CI/CD pipelines. *This project is still in beta; the interface may change slightly.*

 **README-Language-Map** [KR [한국어 버전]](https://github.com/racgoo/task-pipeliner/blob/main/README.ko.md) / [EN [English Version]](https://github.com/racgoo/task-pipeliner)

-## ✨ Features
-
-- **Condition-based execution** - Run steps based on file existence, user choices, environment variables, and more
-
-- **Parallel execution** - Run multiple tasks simultaneously
-
-- **Interactive prompts** - Ask users for input and choices during execution
-
-- **YAML & JSON support** - Declarative pipelining in YAML & JSON formats
-
-- **Variable substitution** - Use `{{variables}}` throughout your workflows
-
-- **Profiles** - Run workflows non-interactively with pre-set variables (`tp run --profile <name>`); choose/prompt steps are skipped when the variable is set in the profile
-
-- **Execution history** - Track and review past workflow executions with detailed step-by-step records
-
-- **Workflow scheduling** - Schedule workflows to run automatically at specified times using cron expressions
-
 ## 🔗 Resources

 ### Documentation & Tools

@@ -47,10 +29,16 @@

 ### CLI Commands

+**Project setup (recommended for new projects):**
+```bash
+tp setup # Create tp/, tp/workflows, tp/schedules and add 2 example workflows + 2 example schedules (echo-based; includes choose, when, profiles, prompt)
+```
+Run from your project root. Creates `tp/workflows/` and `tp/schedules/`; example workflows demonstrate choose, when, profiles, and prompt; example schedules include cron and profile usage. Does not overwrite existing files.
+
 **Run Workflows:**
 ```bash
 tp run workflow.yaml # Run a workflow
-tp run # Select and run a workflow from nearest tp directory
+tp run # Select and run a workflow from nearest tp/workflows directory
 tp run workflow.yaml --profile Test # Run with profile (skip choose/prompt for variables set in profile)
 tp run workflow.yaml -p Test # Short form for profile
 tp run workflow.yaml --silent # Run in silent mode (suppress all console output)

@@ -74,16 +62,18 @@ tp history remove-all # Remove all histories
 **Workflow Scheduling:**
 ```bash
 tp schedule # View all schedules (same as tp schedule list)
-tp schedule list # List schedules with daemon status
-tp schedule add schedules.yaml # Add schedules from a
-tp schedule
+tp schedule list # List schedules with daemon status (each schedule shown as a card: cron, human "when" description, next run, etc.)
+tp schedule add schedules.yaml # Add schedules from a file; if no file given, select from nearest tp/schedules
+tp schedule add # Select a schedule file from nearest tp/schedules directory
+tp schedule remove # Remove a schedule; after removal, the removed schedule is shown in the same card format as list
 tp schedule remove-all # Remove all schedules
-tp schedule toggle # Enable/disable a schedule
+tp schedule toggle # Enable/disable a schedule; after toggle, shows clear ENABLED/DISABLED state (bold, colored) and the schedule card
 tp schedule start # Start scheduler in foreground mode
 tp schedule start -d # Start scheduler daemon in background
 tp schedule stop # Stop the scheduler daemon
 tp schedule status # Check daemon status (real-time mode; Ctrl+C exits the view only, daemon keeps running)
 ```
+After `tp schedule add`, `toggle`, or `remove`, the affected schedule(s) are displayed in the same card layout as `tp schedule list` (cron expression, human-readable “when” description, next run, enabled state). Toggle result emphasizes ENABLED or DISABLED so the new state is obvious.

 **Data & upgrades:**
 ```bash

@@ -91,6 +81,24 @@ tp clean # Remove all data in ~/.pipeliner (schedules, daemon state, workflow
 ```
 After upgrading to a new version, if you see compatibility issues (e.g. schedules or daemon not working), run `tp clean` to reset local data. The daemon is stopped first if it is running.

+## ✨ Features
+
+- **Condition-based execution** - Run steps based on file existence, user choices, environment variables, and more
+
+- **Parallel execution** - Run multiple tasks simultaneously
+
+- **Interactive prompts** - Ask users for input and choices during execution
+
+- **YAML & JSON support** - Declarative pipelining in YAML & JSON formats
+
+- **Variable substitution** - Use `{{variables}}` throughout your workflows
+
+- **Profiles** - Run workflows non-interactively with pre-set variables (`tp run --profile <name>`); choose/prompt steps are skipped when the variable is set in the profile
+
+- **Execution history** - Track and review past workflow executions with detailed step-by-step records
+
+- **Workflow scheduling** - Schedule workflows to run automatically at specified times using cron expressions
+
 ## 🚀 Quick Start

 ### Installation

@@ -114,6 +122,13 @@ task-pipeliner run workflow.yaml
 tp run workflow.yaml
 ```

+Verify installation:
+```bash
+task-pipeliner --version
+# or
+tp --version
+```
+
 **Updating:**
 ```bash
 # Update Homebrew's package registry first

@@ -144,6 +159,13 @@ task-pipeliner run workflow.yaml
 tp run workflow.yaml
 ```

+Verify installation:
+```bash
+task-pipeliner --version
+# or
+tp --version
+```
+
 **Updating:**
 ```bash
 scoop update task-pipeliner

@@ -168,6 +190,13 @@ task-pipeliner run workflow.yaml
 tp run workflow.yaml
 ```

+Verify installation:
+```bash
+task-pipeliner --version
+# or
+tp --version
+```
+
 #### Project Installation (Development)

 Install as a dev dependency to use with `npx`:

@@ -280,26 +309,31 @@ tp run workflow.yaml -s

 **Using the `tp` Directory (Recommended):**

-
+The recommended project layout uses a `tp` directory with two subdirectories:
+
+- **`tp/workflows/`** – workflow files (YAML or JSON). When you run `tp run` without a file, task-pipeliner finds the nearest `tp` directory and lets you choose a workflow from `tp/workflows/`.
+- **`tp/schedules/`** – schedule files (YAML or JSON). When you run `tp schedule add` without a file path, you can select a schedule file from the nearest `tp/schedules/`.
+
+**Quick setup:** Run `tp setup` from your project root to create `tp/`, `tp/workflows/`, and `tp/schedules/` and to add two example workflows and two example schedule files (echo-based; examples include choose, when, profiles, prompt, and schedule profile usage). Existing files are not overwritten.

 ```bash
-#
-
-
+# Option 1: Use tp setup (creates tp/workflows and tp/schedules + examples)
+tp setup

-# Run without specifying a file - interactive selection
+# Option 2: Create the structure manually
+mkdir -p tp/workflows tp/schedules
+mv workflow.yaml tp/workflows/
+
+# Run without specifying a file - interactive selection from tp/workflows
 tp run
 ```

-
-1.
-2.
-3.
-- Type to filter workflows in real-time
-- Use arrow keys (↑↓) to navigate
-- Press Enter to select and run
+When you run `tp run` without a file:
+1. The nearest `tp` directory is found (current directory or any parent).
+2. All workflow files (`.yaml`, `.yml`, `.json`) in **`tp/workflows/`** are listed.
+3. An interactive, searchable menu is shown: type to filter, use arrow keys (↑↓) to move, Enter to select and run.

-The
+The menu shows both the filename and the workflow `name` from the YAML/JSON for easy identification.

 **Silent Mode:**
 The `--silent` (or `-s`) flag suppresses all console output during workflow execution. This is useful for:

@@ -1430,9 +1464,11 @@ Then add all schedules from the file:

 ```bash
 tp schedule add schedules.yaml
+# Or, with no path: select a file from the nearest tp/schedules/ directory
+tp schedule add
 ```

-You'll be prompted to confirm or override the alias for each schedule
+You'll be prompted to confirm or override the alias for each schedule. After adding, each added schedule is shown in the same card format as `tp schedule list` (cron, human-readable “when” description, next run, enabled state).

 **Cron Expression Format:**


@@ -1465,19 +1501,21 @@ You'll be prompted to confirm or override the alias for each schedule
 ### Managing Schedules

 ```bash
-# List all schedules
+# List all schedules (card layout: cron, "when" description, next run, etc.)
 tp schedule list

-# Remove a schedule
+# Remove a schedule (after removal, the removed schedule is shown in the same card format)
 tp schedule remove

 # Remove all schedules
 tp schedule remove-all

-# Enable/disable a schedule
+# Enable/disable a schedule (after toggle, ENABLED/DISABLED is shown clearly in bold/color and the schedule card is displayed)
 tp schedule toggle
 ```

+**Unified schedule UI:** List, add, toggle, and remove all use the same schedule card layout. Each card shows the cron expression, a human-readable description of when it runs (e.g. “Every minute”), timezone, workflow path, profile if set, last run, and next run. After `tp schedule toggle`, the new state is emphasized (ENABLED in green or DISABLED in gray) so it’s obvious at a glance.
+
 ### Running the Scheduler

 Start the scheduler to run workflows at their scheduled times. You can run it in two modes:

@@ -1532,6 +1570,7 @@ All scheduled workflow executions are logged to the same history directory as ma

 Check out the `examples/` directory for complete project examples:

+- **`tp setup`** – Run `tp setup` in your project root to generate `tp/workflows/` and `tp/schedules/` with two example workflows (choose, when, profiles, prompt) and two example schedule files (including profile usage). All steps use `echo` so you can run them safely and then replace with real commands.
 - **`monorepo-example/`** - Monorepo workflow with multiple projects
 - **`simple-project/`** - Simple single-project workflow
 - **`react-app/`** - React application build and deployment
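The scheduling changes above revolve around schedule files (`tp schedule add schedules.yaml`, the new `tp/schedules/` directory, and the cron-based cards). For orientation, here is a hedged sketch of what such a file can look like; the field names follow the schedule validator visible in the dist/index.cjs diff below (name, cron, workflow, plus optional baseDir, timezone, silent, and profile), but the specific names, cron strings, and paths are made-up examples, not the schedules that `tp setup` creates.

```yaml
# Illustrative sketch only - schedule names, cron strings, and workflow paths
# are assumptions for illustration.
schedules:
  - name: hello-every-minute
    cron: "* * * * *"                  # every minute
    workflow: tp/workflows/hello.yaml

  - name: nightly-build
    cron: "0 2 * * *"                  # every day at 02:00
    workflow: tp/workflows/build.yaml
    profile: Test                      # optional: run non-interactively with this workflow profile
    silent: true                       # optional: suppress console output for this schedule
    timezone: "+9"                     # optional: UTC offset (an assumption based on the bundled offset parser)
```

Register it with `tp schedule add schedules.yaml` (or just `tp schedule add` to pick a file from the nearest `tp/schedules/`), then start the daemon with `tp schedule start -d` and review the cards with `tp schedule list`, as described in the README changes above.
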
package/dist/index.cjs
CHANGED
|
@@ -1,27 +1,27 @@
|
|
|
1
1
|
#!/usr/bin/env node
|
|
2
|
-
"use strict";var
|
|
3
|
-
${
|
|
4
|
-
`;await(0,x.writeFile)(fe,e,"utf-8")}catch{}}async function
|
|
5
|
-
`){i.length>0&&(g(),
|
|
6
|
-
`)),w=
|
|
7
|
-
`)),
|
|
8
|
-
`),
|
|
9
|
-
`);let
|
|
2
|
+
"use strict";var qt=Object.create;var Te=Object.defineProperty;var Gt=Object.getOwnPropertyDescriptor;var Zt=Object.getOwnPropertyNames;var Xt=Object.getPrototypeOf,Kt=Object.prototype.hasOwnProperty;var Qt=(r,e)=>()=>(r&&(e=r(r=0)),e);var eo=(r,e)=>{for(var t in e)Te(r,t,{get:e[t],enumerable:!0})},to=(r,e,t,o)=>{if(e&&typeof e=="object"||typeof e=="function")for(let n of Zt(e))!Kt.call(r,n)&&n!==t&&Te(r,n,{get:()=>e[n],enumerable:!(o=Gt(e,n))||o.enumerable});return r};var y=(r,e,t)=>(t=r!=null?qt(Xt(r)):{},to(e||!r||!r.__esModule?Te(t,"default",{value:r,enumerable:!0}):t,r));var tt={};eo(tt,{DAEMON_DIR:()=>Z,getDaemonErrorLogPath:()=>Ne,getDaemonPid:()=>Ie,getDaemonStartTime:()=>et,getDaemonStatus:()=>$,isDaemonRunning:()=>H,readDaemonErrorLog:()=>je,removeDaemonPid:()=>ae,saveDaemonPid:()=>ro,writeDaemonError:()=>Me});function oo(r){try{return process.kill(r,0),!0}catch{return!1}}async function Ie(){try{if(!(0,Y.existsSync)(_))return null;let r=await(0,x.readFile)(_,"utf-8"),e=parseInt(r.trim(),10);return isNaN(e)?(await(0,x.unlink)(_),null):oo(e)?e:(await(0,x.unlink)(_),null)}catch(r){if(r instanceof Error&&"code"in r&&r.code==="ENOENT")return null;throw r}}async function H(){return await Ie()!==null}async function ro(){await(0,x.mkdir)(Z,{recursive:!0}),await(0,x.writeFile)(_,process.pid.toString(),"utf-8");let r=new Date().toISOString();await(0,x.writeFile)(se,r,"utf-8")}async function Me(r){try{await(0,x.mkdir)(Z,{recursive:!0});let e=`${new Date().toISOString()} ${r.message}
|
|
3
|
+
${r.stack??""}
|
|
4
|
+
`;await(0,x.writeFile)(fe,e,"utf-8")}catch{}}async function ae(){try{(0,Y.existsSync)(_)&&await(0,x.unlink)(_),(0,Y.existsSync)(se)&&await(0,x.unlink)(se)}catch{}}async function et(){try{if((0,Y.existsSync)(se)){let e=(await(0,x.readFile)(se,"utf-8")).trim();if(e)return e}if((0,Y.existsSync)(_)){let r=await(0,x.stat)(_);return new Date(r.mtime).toISOString()}return null}catch{return null}}function Ne(){return fe}async function je(){try{return(0,Y.existsSync)(fe)&&(await(0,x.readFile)(fe,"utf-8")).trim()||null}catch{return null}}async function $(){let r=await Ie(),e=r?await et():null;return{running:r!==null,pid:r,startTime:e}}var Y,x,Qe,ie,Z,_,se,fe,le=Qt(()=>{"use strict";Y=require("fs"),x=require("fs/promises"),Qe=require("os"),ie=require("path"),Z=(0,ie.join)((0,Qe.homedir)(),".pipeliner","daemon"),_=(0,ie.join)(Z,"scheduler.pid"),se=(0,ie.join)(Z,"scheduler.started");fe=(0,ie.join)(Z,"error.log")});var Vt=require("child_process"),O=require("fs"),F=require("fs/promises"),zt=require("os"),D=require("path"),Ut=require("util"),Ke=y(require("boxen"),1),p=y(require("chalk"),1),Yt=require("commander"),Jt=y(require("dayjs"),1);le();var q=require("path"),ut=y(require("chalk"),1),Se=y(require("log-update"),1);var rt=y(require("readline"),1),T=y(require("chalk"),1),Ae=y(require("inquirer"),1),ot=15,I=class{searchable;constructor(e=!1){this.searchable=e}async prompt(e,t){if(this.searchable)return this.promptWithSearch(e,t);let{choice:o}=await Ae.default.prompt([{type:"list",name:"choice",message:T.default.cyan(e),choices:t.map(s=>({name:s.label,value:s.id})),pageSize:ot}]),n=t.find(s=>s.id===o);if(!n)throw new Error(`Invalid choice: ${o}`);return n}async promptWithSearch(e,t){return new Promise(o=>{let n="",s=0,i=[...t],a=rt.createInterface({input:process.stdin,output:process.stdout,terminal:!1});process.stdin.isTTY&&process.stdin.setRawMode(!0),process.stdout.write("\x1B[?1049h"),process.stdout.write("\x1B[?25l");let l=()=>{process.stdout.write("\x1B[H\x1B[2J"),console.log(T.default.cyan(`? ${e}`));let h=n?T.default.gray(` Filter: ${n}`)+T.default.gray(` (${i.length}/${t.length})`):T.default.gray(" Type to filter, \u2191\u2193 to navigate, Enter to select");console.log(h),console.log();let d=ot,w=0,b=i.length;if(i.length>d){let v=Math.floor(d/2);w=Math.max(0,s-v),b=Math.min(i.length,w+d),b===i.length&&(w=Math.max(0,b-d))}if(i.length===0)console.log(T.default.yellow(" No matches found"));else{w>0&&console.log(T.default.gray(` \u2191 ${w} more above`));for(let v=w;v<b;v++){let E=i[v];console.log(v===s?T.default.cyan(`\u276F ${E.label}`):T.default.white(` ${E.label}`))}b<i.length&&console.log(T.default.gray(` \u2193 ${i.length-b} more below`))}},c=()=>{let h=n.toLowerCase();i=h?t.filter(d=>d.label.toLowerCase().includes(h)):[...t],s>=i.length&&(s=Math.max(0,i.length-1))},f=h=>{let d=h.toString();if(d===""&&(g(),process.exit(0)),d==="\r"||d===`
|
|
5
|
+
`){i.length>0&&(g(),o(i[s]));return}if(d==="\x1B"&&h.length===1){n&&(n="",c(),l());return}if(d==="\x1B[A"){i.length>0&&(s=s>0?s-1:i.length-1,l());return}if(d==="\x1B[B"){i.length>0&&(s=s<i.length-1?s+1:0,l());return}if(d==="\x7F"||d==="\b"){n.length>0&&(n=n.slice(0,-1),c(),l());return}d.length===1&&d>=" "&&d<="~"&&(n+=d,c(),l())},g=()=>{process.stdin.removeListener("data",f),process.stdin.isTTY&&process.stdin.setRawMode(!1),a.close(),process.stdout.write("\x1B[?25h"),process.stdout.write("\x1B[?1049l")};l(),process.stdin.on("data",f)})}},me=class{async prompt(e,t){let{value:o}=await Ae.default.prompt([{type:"input",name:"value",message:T.default.cyan(e),default:t}]);return o}};var he=y(require("boxen"),1),X=y(require("chalk"),1);function Le(r,e,t,o={}){let{borderColor:n="cyan",isNested:s=!1}=o,i;e!==void 0&&(t?i=`line ${e} in ${t}`:i=`line ${e}`);let a=s?`\u2502 ${r}`:`> ${r}`;return(0,he.default)(a,{title:i,borderStyle:"round",padding:{top:0,bottom:0,left:1,right:1},margin:{top:0,bottom:0,left:0,right:0},borderColor:n})}function K(r,e=!1,t){let o=r?"\u2713 Completed":"\u2717 Failed",n=r?X.default.green(o):X.default.red(o);if(t!==void 0){let s=Q(t);return`${n} ${X.default.gray(`(${s})`)}`}return n}function ce(r){return(0,he.default)(`\u2717 ${r}`,{borderStyle:"round",padding:{top:0,bottom:0,left:1,right:1},margin:{top:0,bottom:0,left:0,right:0},borderColor:"red"})}function nt(r){return(0,he.default)(`> Starting parallel execution (${r} branches)`,{borderStyle:"round",padding:{top:0,bottom:0,left:1,right:1},margin:{top:0,bottom:0,left:0,right:0},borderColor:"yellow"})}function st(r){let e=r?"\u2713 All parallel branches completed":"\u2717 Some parallel branches failed";return r?X.default.green(e):X.default.red(e)}function Oe(r,e=!1){return`${e?"| \u2502 ":"\u2502 "}${r}`}function Q(r){return`${(r/1e3).toFixed(3)}s`}var it=require("fs"),at=require("path"),ee=class{constructor(e){this.workspace=e}evaluate(e){return"var"in e||"has"in e?this.evaluateVarExists(e):"file"in e?this.evaluateFileExists(e):"choice"in e?this.evaluateChoice(e):"all"in e?this.evaluateAll(e):"any"in e?this.evaluateAny(e):"not"in e?this.evaluateNot(e):!1}evaluateVarExists(e){if(e.has)return this.workspace.hasVariable(e.has)||this.workspace.hasFact(e.has);if(!e.var)return!1;if(typeof e.var=="object"){for(let[o,n]of Object.entries(e.var)){let s=this.workspace.getVariable(o),i=this.workspace.getFact(o),a=s??(i!==void 0?i.toString():void 0);if(a===void 0||a!==n)return!1}return!0}let t=e.var;return this.workspace.hasVariable(t)||this.workspace.hasFact(t)}evaluateFileExists(e){try{let t=e.file.trim(),o=(0,at.resolve)(process.cwd(),t);return(0,it.existsSync)(o)}catch{return!1}}evaluateChoice(e){return this.workspace.hasChoice(e.choice)}evaluateAll(e){return e.all.every(t=>this.evaluate(t))}evaluateAny(e){return e.any.some(t=>this.evaluate(t))}evaluateNot(e){return!this.evaluate(e.not)}};var N=require("fs/promises"),lt=require("os"),J=require("path"),ct=y(require("dayjs"),1),te=(0,J.join)((0,lt.homedir)(),".pipeliner","workflow-history"),oe=class{constructor(){}async saveHistory(e){await(0,N.mkdir)(te,{recursive:!0});let t=(0,ct.default)().format("YYYY-MM-DD_HH-mm-ss"),o=Math.random().toString(36).slice(2,6),n=(0,J.join)(te,`workflow-${t}-${o}.json`);return await(0,N.writeFile)(n,JSON.stringify(e,null,2),{encoding:"utf8"}),n}async clearAllHistories(){await(0,N.rm)(te,{recursive:!0,force:!0})}async removeHistory(e){await(0,N.rm)((0,J.join)(te,e),{force:!0})}async getHistoryNames(){try{let 
t=(await(0,N.readdir)(te)).map(o=>(0,J.basename)(o));return t.sort((o,n)=>{let s=l=>{let c=l.match(/workflow-(\d{4}-\d{2}-\d{2}_\d{2}-\d{2}-\d{2})-/);return c?c[1]:""},i=s(o),a=s(n);return i===a?n.localeCompare(o):a.localeCompare(i)}),t}catch(e){if(e instanceof Error&&"code"in e&&e.code==="ENOENT")return[];throw e}}async getHistory(e){let t=await(0,N.readFile)((0,J.join)(te,e),{encoding:"utf8"});return JSON.parse(t)}};var ge=class{records=[];initialTimestamp=Date.now();recordStartTimestamp=Date.now();constructor(){this.records=[]}recordStart(){this.recordStartTimestamp=Date.now()}recordEnd(e,t,o,n){let s=this.getDuration();return this.records.push({step:e,context:t,output:o,duration:s,status:n}),s}reset(){this.records=[],this.initialTimestamp=Date.now()}async save(){let e=new oe,t={initialTimestamp:this.initialTimestamp,records:this.records};return await e.saveHistory(t)}getDuration(){return Date.now()-this.recordStartTimestamp}};var Fe=require("child_process");var we=class{async run(e,t,o,n,s=!1,i=!1,a,l,c,f,g){return s?this.runBuffered(e,c,f,g):this.runRealtime(e,o??e,i,a,l,c,f,g)}async runBuffered(e,t,o,n){return new Promise((s,i)=>{let a=this.spawnWithShell(e,t,n),l=[],c=[],f="",g="",h=null;o&&o>0&&(h=setTimeout(()=>{a.kill("SIGTERM");let d=`Command timed out after ${o} seconds`;c.push(d),s({success:!1,stdout:l,stderr:c})},o*1e3)),a.stdout?.on("data",d=>{let w=d.toString(),{lines:b,remaining:v}=this.processStreamBuffer(w,f);l.push(...b),f=v}),a.stderr?.on("data",d=>{let w=d.toString(),{lines:b,remaining:v}=this.processStreamBuffer(w,g);c.push(...b),g=v}),a.on("close",d=>{h&&clearTimeout(h),f.trim()&&l.push(f),g.trim()&&c.push(g),s({success:d===0,stdout:l,stderr:c})}),a.on("error",d=>{h&&clearTimeout(h);let w=`Error: ${d.message}`;s({success:!1,stdout:l,stderr:[...c,w]})})})}async runRealtime(e,t,o,n,s,i,a,l){let f=Le(t,n,s,{borderColor:o?"green":"cyan"});console.log(f);let g=Date.now();return new Promise(h=>{let d=this.spawnWithShell(e,i,l),w="",b="",v=null;a&&a>0&&(v=setTimeout(()=>{d.kill("SIGTERM");let E=`Command timed out after ${a} seconds`,M=ce(E);console.error(M);let B=Date.now()-g,W=K(!1,!1,B);console.log(W),h(!1)},a*1e3)),d.stdout?.on("data",E=>{let M=E.toString(),{lines:B,remaining:W}=this.processStreamBuffer(M,w);B.forEach(De=>process.stdout.write(`\u2502 ${De}
|
|
6
|
+
`)),w=W}),d.stderr?.on("data",E=>{let M=E.toString(),{lines:B,remaining:W}=this.processStreamBuffer(M,b);B.forEach(De=>process.stderr.write(`\u2502 ${De}
|
|
7
|
+
`)),b=W}),d.on("close",E=>{v&&clearTimeout(v),w.trim()&&process.stdout.write(`\u2502 ${w}
|
|
8
|
+
`),b.trim()&&process.stderr.write(`\u2502 ${b}
|
|
9
|
+
`);let M=E===0,B=Date.now()-g,W=K(M,!1,B);console.log(W),h(M)}),d.on("error",E=>{v&&clearTimeout(v);let M=ce(`Error: ${E.message}`);console.error(M),h(!1)})})}createSpawnOptions(e){let t={stdio:["inherit","pipe","pipe"],shell:!0};return e&&(t.cwd=e),t}spawnWithShell(e,t,o){if(o&&o.length>0){let n=o[0],s=[...o.slice(1),e],i={stdio:["inherit","pipe","pipe"]};return t&&(i.cwd=t),(0,Fe.spawn)(n,s,i)}else{let n=process.env.SHELL??(process.platform==="win32"?"cmd.exe":"/bin/sh"),s=process.platform==="win32"?"/c":"-c",i={stdio:["inherit","pipe","pipe"]};return t&&(i.cwd=t),(0,Fe.spawn)(n,[s,e],i)}}processStreamBuffer(e,t){let o=t+e,n=[],s=o;for(;s.includes(`
|
|
10
10
|
`);){let i=s.indexOf(`
|
|
11
11
|
`),a=s.substring(0,i);s=s.substring(i+1),n.push(a)}return{lines:n,remaining:s}}formatNestedOutput(e,t){t?e.split(`
|
|
12
|
-
`).forEach(
|
|
13
|
-
`)}),e.stderr.forEach(l=>{let c=Oe(l,
|
|
14
|
-
`)});let a=
|
|
15
|
-
Total execution time: ${i}`)),await
|
|
12
|
+
`).forEach(o=>{o.trim()&&console.log(`| ${o}`)}):console.log(e)}displayBufferedOutput(e,t,o=!1,n,s){let i=Le(t,n,s,{borderColor:"cyan",isNested:o});this.formatNestedOutput(i,o),e.stdout.forEach(l=>{let c=Oe(l,o);process.stdout.write(`${c}
|
|
13
|
+
`)}),e.stderr.forEach(l=>{let c=Oe(l,o);process.stderr.write(`${c}
|
|
14
|
+
`)});let a=K(e.success,o);console.log(a)}};function no(r,e,t){if(e.hasVariable(r))return e.getVariable(r)??t;if(e.hasFact(r)){let o=e.getFact(r);return typeof o=="string"?o:String(o)}return e.hasChoice(r)?e.getChoice(r)??t:t}function ye(r,e){let t=/\{\{\s*(\w+)\s*\}\}/g;return r.replace(t,(o,n)=>no(n,e,o))}var be=class r{state;constructor(){this.state={facts:new Map,choices:new Map,variables:new Map,stepResults:new Map,lastStepIndex:-1}}hasFact(e){return this.state.facts.has(e)}getFact(e){return this.state.facts.get(e)}setFact(e,t){this.state.facts.set(e,t)}getFactStatus(e){if(!this.hasFact(e))return"pending";let t=this.getFact(e);return t===!1||t==="failed"?"failed":"ready"}getAllFacts(){return new Map(this.state.facts)}hasChoice(e){return this.state.choices.has(e)}getChoice(e){return this.state.choices.get(e)}setChoice(e,t){this.state.choices.set(e,t)}hasVariable(e){return this.state.variables.has(e)}getVariable(e){return this.state.variables.get(e)}setVariable(e,t){this.state.variables.set(e,t)}getAllVariables(){return new Map(this.state.variables)}setStepResult(e,t,o){this.state.stepResults.set(e,{success:t,exitCode:o}),this.state.lastStepIndex=e}getStepResult(e){return this.state.stepResults.get(e)}getLastStepResult(){if(this.state.lastStepIndex!==-1)return this.state.stepResults.get(this.state.lastStepIndex)}clone(){let e=new r;return e.state.facts=new Map(this.state.facts),e.state.choices=new Map(this.state.choices),e.state.variables=new Map(this.state.variables),e.state.stepResults=new Map(this.state.stepResults),e.state.lastStepIndex=this.state.lastStepIndex,e}};var re=class r{static PARALLEL_STEP_INDEX_MULTIPLIER=1e3;workspace;taskRunner;choicePrompt;textPrompt;baseDir;globalShell;constructor(){this.workspace=new be,this.taskRunner=new we,this.choicePrompt=new I,this.textPrompt=new me}resolveBaseDir(e){if(e.baseDir)if((0,q.isAbsolute)(e.baseDir))this.baseDir=e.baseDir;else if(e._filePath){let t=(0,q.dirname)(e._filePath);this.baseDir=(0,q.resolve)(t,e.baseDir)}else this.baseDir=(0,q.resolve)(process.cwd(),e.baseDir)}createStepContext(e,t){let o={workspace:this.workspace,stepIndex:e};return t._lineNumbers&&(o.lineNumber=t._lineNumbers.get(e)),t._fileName&&(o.fileName=t._fileName),o}evaluateStepCondition(e){return e.when?new ee(this.workspace).evaluate(e.when):!0}calculateBaseStepIndex(e){return e.branchIndex===void 0?e.stepIndex:Math.floor(e.stepIndex/r.PARALLEL_STEP_INDEX_MULTIPLIER)}isRunStep(e){return"run"in e}async execute(e,t){if(t?.profileVars&&Object.keys(t.profileVars).length>0)for(let[a,l]of Object.entries(t.profileVars))this.workspace.setVariable(a,l);this.resolveBaseDir(e),this.globalShell=e.shell;let o=new ge,n=Date.now();for(let a=0;a<e.steps.length;a++){let l=e.steps[a],c=this.createStepContext(a,e),f=!!l.when;if(this.evaluateStepCondition(l)){o.recordStart();try{let g=await this.executeStep(l,c,!1,f);this.handleStepResult(l,c,a,g,o)}catch(g){throw this.handleStepError(l,c,a,g,o),g}}}let s=Date.now()-n,i=Q(s);console.log(ut.default.cyan(`
|
|
15
|
+
Total execution time: ${i}`)),await o.save(),o.reset()}isStepSuccessful(e,t){return"run"in t?typeof e=="boolean"?e:e&&typeof e=="object"&&"success"in e?e.success:!1:!0}handleStepResult(e,t,o,n,s){let i=this.isRunStep(e)?(()=>{let c=this.workspace.getStepResult(o);return c?c.success:!0})():this.isStepSuccessful(n,e),a=i?"success":"failure",l=s.recordEnd(e,t,n,a);if(!this.isRunStep(e)){let c=K(i,!1,l);console.log(c)}if(this.isRunStep(e)){if(e.continue===!1){let c=t.lineNumber?` (line ${t.lineNumber})`:"",f=i?`Step ${o}${c} completed, but workflow stopped due to continue: false`:`Step ${o}${c} failed`;throw new Error(f)}if(!i&&e.continue!==!0){let c=t.lineNumber?` (line ${t.lineNumber})`:"";throw new Error(`Step ${o}${c} failed`)}}}handleStepError(e,t,o,n,s){this.workspace.setStepResult(o,!1);let i=n instanceof Error?n.message:String(n),a={success:!1,stdout:[],stderr:[i]};s.recordEnd(e,t,a,"failure")}fixMalformedStep(e){let o=e;return"choose"in e&&o.choose===null&&"message"in e&&"options"in e?{choose:{message:o.message,options:o.options,as:o.as},when:o.when}:"prompt"in e&&o.prompt===null&&"message"in e&&"as"in e?{prompt:{message:o.message,as:o.as,default:o.default},when:o.when}:e}async executeStep(e,t,o=!1,n=!1){if(e=this.fixMalformedStep(e),"run"in e){let s=await this.executeRunStep(e,t,o,n);return o&&typeof s=="object"&&"stdout"in s,s}if("choose"in e){await this.executeChooseStep(e,t);return}if("prompt"in e){await this.executePromptStep(e,t);return}if("parallel"in e){await this.executeParallelStep(e,t);return}if("fail"in e){await this.executeFailStep(e,t);return}}async executeSingleRun(e,t,o=!1,n=!1){let s=this.calculateBaseStepIndex(t),i=ye(e.run,this.workspace),a=e.shell??this.globalShell,l=e.retry??0,c=l==="Infinity"||l===1/0,f=typeof l=="number"?l:0,g=e.timeout,h=!1,d=0;for(;c||d<=f;){let w=await this.taskRunner.run(i,s,i,t.branchIndex,o,n,t.lineNumber,t.fileName,this.baseDir,g,a),b=typeof w=="boolean"?w:w.success;if(h=w,b||!c&&d>=f)break;if(d++,c||d<=f){let v=Math.min(1e3*Math.pow(2,d-1),1e4);await new Promise(E=>setTimeout(E,v))}}return h}async executeRunStep(e,t,o=!1,n=!1){let s=await this.executeSingleRun({run:e.run,timeout:e.timeout,retry:e.retry,shell:e.shell},t,o,n),i=typeof s=="boolean"?s:s.success;if(this.workspace.setStepResult(t.stepIndex,i),i||!e.onError)return s;let a={run:e.onError.run,timeout:e.onError.timeout,retry:e.onError.retry,onError:e.onError.onError??void 0};return await this.executeRunChain(a,t,o,n)}async executeRunChain(e,t,o,n){let s=await this.executeSingleRun({run:e.run,timeout:e.timeout,retry:e.retry,shell:void 0},t,o,n);return(typeof s=="boolean"?s:s.success)||!e.onError?s:this.executeRunChain(e.onError,t,o,n)}async executeChooseStep(e,t){let o=e.choose.as,n=e.choose.options.map(a=>a.id);if(o&&this.workspace.hasVariable(o)){let a=this.workspace.getVariable(o)??"";if(n.includes(a)){this.workspace.setChoice(a,a),this.workspace.setStepResult(t.stepIndex,!0);return}}let s=await this.choicePrompt.prompt(e.choose.message,e.choose.options);if(!s?.id)throw new Error(`Invalid choice result: ${JSON.stringify(s)}`);let i=o??s.id;this.workspace.setChoice(s.id,s.id),this.workspace.setVariable(i,s.id),this.workspace.setStepResult(t.stepIndex,!0)}async executePromptStep(e,t){let o=e.prompt.as;if(this.workspace.hasVariable(o)){let a=this.workspace.getVariable(o)??"";this.workspace.setFact(o,a),this.workspace.setStepResult(t.stepIndex,!0);return}let n=ye(e.prompt.message,this.workspace),s=e.prompt.default?ye(e.prompt.default,this.workspace):void 0,i=await 
this.textPrompt.prompt(n,s);this.workspace.setVariable(o,i),this.workspace.setFact(o,i),this.workspace.setStepResult(t.stepIndex,!0)}createParallelContexts(e,t){return e.parallel.map((o,n)=>({workspace:this.workspace.clone(),stepIndex:t.stepIndex*r.PARALLEL_STEP_INDEX_MULTIPLIER+n,branchIndex:n,lineNumber:t.lineNumber,fileName:t.fileName}))}getBranchDisplayName(e,t){return"run"in e?e.run:"choose"in e?`Choose: ${e.choose.message}`:"prompt"in e?`Prompt: ${e.prompt.message}`:"fail"in e?`Fail: ${e.fail.message}`:`Branch ${t+1}`}async executeParallelBranches(e,t){let o=[],n=["\u280B","\u2819","\u2839","\u2838","\u283C","\u2834","\u2826","\u2827","\u2807","\u280F"],s=0;for(let c=0;c<e.length;c++){let f=e[c],g=t[c];if(f.when&&!new ee(g.workspace).evaluate(f.when))continue;let h=this.getBranchDisplayName(f,c);o.push({index:c,name:h,status:"pending"})}let i=setInterval(()=>{s=(s+1)%n.length,this.updateParallelBranchesDisplay(o,n[s])},100),a=o.map(async c=>{let{index:f}=c,g=e[f],h=t[f];c.status="running";try{let d=await this.executeStep(g,h,!0);return c.status="success",this.updateParallelBranchesDisplay(o,n[s]),{index:f,result:d,context:h}}catch(d){h.workspace.setStepResult(h.stepIndex,!1);let w=d instanceof Error?d.message:String(d);return c.status="failed",c.error=w,this.updateParallelBranchesDisplay(o,n[s]),{index:f,error:d,context:h}}}),l=await Promise.all(a);return clearInterval(i),this.updateParallelBranchesDisplay(o,"",!0),Se.default.done(),l}updateParallelBranchesDisplay(e,t,o=!1){let n=e.map(s=>{let i=s.index+1,a="",l="";switch(s.status){case"pending":a="\u25CB",l=`Branch ${i}: ${s.name} - Pending`;break;case"running":a=t,l=`Branch ${i}: ${s.name} - Running...`;break;case"success":a="\u2713",l=`Branch ${i}: ${s.name} - Completed`;break;case"failed":a="\u2717",l=`Branch ${i}: ${s.name} - Failed${s.error?`: ${s.error}`:""}`;break}return`${a} ${l}`});o?(0,Se.default)(n.join(`
|
|
16
16
|
`)):(0,Se.default)(n.join(`
|
|
17
|
-
`))}displayParallelResults(e,t,
|
|
17
|
+
`))}displayParallelResults(e,t,o){let n=!0,s=!1;console.log("");for(let a of e){if(!a)continue;s=!0;let{index:l,result:c,error:f,context:g}=a;if(f){n=!1;let h=`Branch ${l+1} failed: ${f instanceof Error?f.message:String(f)}`,d=ce(h);console.error(d)}else if(c&&typeof c=="object"&&"stdout"in c){let h=c;if(n=n&&h.success,h.stdout.length>0||h.stderr.length>0||!h.success){let d=t[l],w=this.getBranchDisplayName(d,l);this.taskRunner.displayBufferedOutput(h,w,!1,g.lineNumber,g.fileName)}}}s||console.log("\u26A0\uFE0F All parallel branches were skipped (conditions not met)");let i=st(n);return console.log(i),n}mergeParallelResults(e){for(let t of e){let o=t.workspace.getAllFacts(),n=t.workspace.getAllVariables();for(let[s,i]of o)this.workspace.setFact(s,i);for(let[s,i]of n)this.workspace.setVariable(s,i)}}countExecutableBranches(e,t){let o=0;for(let n=0;n<e.length;n++){let s=e[n],i=t[n];s.when&&!new ee(i.workspace).evaluate(s.when)||o++}return o}async executeParallelStep(e,t){let o=this.createParallelContexts(e,t),n=this.countExecutableBranches(e.parallel,o),s=nt(n);console.log(s);let i=await this.executeParallelBranches(e.parallel,o),a=this.displayParallelResults(i,e.parallel,t);if(this.workspace.setStepResult(t.stepIndex,a),!a){let l=t.lineNumber?` (line ${t.lineNumber})`:"";throw new Error(`Parallel step ${t.stepIndex}${l} failed: one or more branches failed`)}this.mergeParallelResults(o)}async executeFailStep(e,t){let o=new Error(e.fail.message);throw o.stack=void 0,o}};var ht=require("yaml"),He=require("zod");var u=require("zod"),so=u.z.object({file:u.z.string()}),io=u.z.object({var:u.z.union([u.z.string(),u.z.record(u.z.string(),u.z.string())]).optional(),has:u.z.string().optional()}),ao=u.z.object({status:u.z.object({fact:u.z.string(),is:u.z.enum(["ready","failed","pending"])})}),lo=u.z.object({step:u.z.object({success:u.z.boolean()}).optional(),last_step:u.z.enum(["success","failure"]).optional()}),co=u.z.object({choice:u.z.string()}),uo=u.z.union([so,co,io,ao,lo]),A=u.z.lazy(()=>u.z.union([uo,u.z.object({all:u.z.array(A)}),u.z.object({any:u.z.array(A)}),u.z.object({not:A})])),pt=u.z.lazy(()=>u.z.object({run:u.z.string(),timeout:u.z.number().optional(),retry:u.z.union([u.z.number(),u.z.literal("Infinity")]).optional(),onError:pt.optional()})),dt=u.z.object({run:u.z.string(),when:A.optional(),timeout:u.z.number().optional(),retry:u.z.union([u.z.number(),u.z.literal("Infinity")]).optional(),shell:u.z.array(u.z.string()).min(1,"shell must have at least one element").optional(),continue:u.z.boolean().optional(),onError:pt.optional()}),po=u.z.object({choose:u.z.object({message:u.z.string(),options:u.z.array(u.z.object({id:u.z.string(),label:u.z.string()})),as:u.z.string().optional()}),when:A.optional()}),fo=u.z.object({prompt:u.z.object({message:u.z.string(),as:u.z.string(),default:u.z.string().optional(),validate:u.z.string().optional()}),when:A.optional()});function ft(r){if(!r||typeof r!="object")return{found:!1};let e=r;if("choose"in e)return{found:!0,type:"choose"};if("prompt"in e)return{found:!0,type:"prompt"};if("parallel"in e&&Array.isArray(e.parallel))for(let t of e.parallel){let o=ft(t);if(o.found)return o}return{found:!1}}var mt=u.z.lazy(()=>u.z.union([dt,u.z.object({parallel:u.z.array(u.z.lazy(()=>mt)),when:A.optional()}),u.z.object({fail:u.z.object({message:u.z.string()}),when:A.optional()})]).superRefine((r,e)=>{let t=ft(r);t.found&&e.addIssue({code:u.z.ZodIssueCode.custom,message:`'${t.type}' step is not allowed inside 'parallel' block (user input cannot run in 
parallel)`})})),mo=u.z.lazy(()=>u.z.union([dt,po,fo,u.z.object({parallel:u.z.array(mt),when:A.optional()}),u.z.object({fail:u.z.object({message:u.z.string()}),when:A.optional()})])),ho=u.z.object({name:u.z.string().min(1,"Profile name must be non-empty"),var:u.z.record(u.z.string(),u.z.union([u.z.string(),u.z.number(),u.z.boolean()]).transform(String))}),go=u.z.object({name:u.z.string().optional(),baseDir:u.z.string().optional(),shell:u.z.array(u.z.string()).min(1,"shell must have at least one element").optional(),profiles:u.z.array(ho).optional(),steps:u.z.array(mo).min(1,"Workflow must have at least one step")});function Be(r){return go.parse(r)}function gt(r,e){let t=r.path;if(r.code==="custom"){let n=We(t);return` - ${r.message}${n}`}if(r.message==="Invalid input"){let n=We(t),s=wo(t,e);return s?` - ${s}${n}`:` - Invalid step type${n}`}let o=We(t);return` - ${r.message}${o}`}function We(r){if(r.length===0)return"";let e=[];for(let t=0;t<r.length;t++){let o=r[t],n=r[t+1];o==="steps"&&typeof n=="number"?(e.push(`step ${n+1}`),t++):o==="parallel"&&typeof n=="number"?(e.push(`parallel branch ${n+1}`),t++):typeof o=="string"&&o!=="steps"&&o!=="parallel"&&e.push(o)}return e.length>0?` (${e.join(" \u2192 ")})`:""}function k(r,e,t){let o=t?`
|
|
18
18
|
Reason: ${t}`:"";throw new Error(`Invalid workflow structure:
|
|
19
|
-
- ${e} (step ${
|
|
19
|
+
- ${e} (step ${r+1})${o}`)}function wt(r,e,t=!1,o=[]){let n=["run","choose","prompt","parallel","fail"],s=n.find(i=>i in r);if(!s){let i=Object.keys(r).filter(a=>a!=="when");k(e,`Unknown step type. Found keys: [${i.join(", ")}]. Valid types: ${n.join(", ")}`)}if(s==="run"){let i=r.run;if(typeof i!="string"&&k(e,"'run' must be a string command"),i===""&&k(e,"'run' command cannot be empty"),"shell"in r&&r.shell!==void 0){Array.isArray(r.shell)||k(e,"'shell' must be an array");let a=r.shell;a.length===0&&k(e,"'shell' cannot be empty","Shell configuration must have at least one element (program name)");for(let l=0;l<a.length;l++)typeof a[l]!="string"&&k(e,`'shell[${l}]' must be a string`)}}if(s==="choose"){if(t){let l=o.join(" \u2192 ");throw new Error(`Invalid workflow structure:
|
|
20
20
|
- 'choose' step is not allowed inside 'parallel' block (step ${e+1}, ${l})
|
|
21
|
-
Reason: User input prompts cannot run in parallel`)}let i=
|
|
21
|
+
Reason: User input prompts cannot run in parallel`)}let i=r.choose;(!i||typeof i!="object")&&k(e,"'choose' must be an object with 'message' and 'options'");let a=i;(!a.message||typeof a.message!="string")&&k(e,"'choose.message' is required and must be a string"),Array.isArray(a.options)||k(e,"'choose.options' is required and must be an array"),a.options.length===0&&k(e,"'choose.options' cannot be empty","At least one option is required");for(let l=0;l<a.options.length;l++){let c=a.options[l];(!c||typeof c!="object")&&k(e,`'choose.options[${l}]' must be an object with 'id' and 'label'`),(!c.id||typeof c.id!="string")&&k(e,`'choose.options[${l}].id' is required and must be a string`),(!c.label||typeof c.label!="string")&&k(e,`'choose.options[${l}].label' is required and must be a string`)}}if(s==="prompt"){if(t){let l=o.join(" \u2192 ");throw new Error(`Invalid workflow structure:
|
|
22
22
|
- 'prompt' step is not allowed inside 'parallel' block (step ${e+1}, ${l})
|
|
23
|
-
Reason: User input prompts cannot run in parallel`)}let i=
|
|
24
|
-
- Workflow must be an object`);let e=
|
|
23
|
+
Reason: User input prompts cannot run in parallel`)}let i=r.prompt;(!i||typeof i!="object")&&k(e,"'prompt' must be an object with 'message' and 'as'");let a=i;(!a.message||typeof a.message!="string")&&k(e,"'prompt.message' is required and must be a string"),(!a.as||typeof a.as!="string")&&k(e,"'prompt.as' is required and must be a string","The 'as' field specifies the variable name to store the user's input")}if(s==="parallel"){let i=r.parallel;Array.isArray(i)||k(e,"'parallel' must be an array of steps"),i.length===0&&k(e,"'parallel' cannot be empty","At least one step is required");for(let a=0;a<i.length;a++){let l=i[a];(!l||typeof l!="object")&&k(e,`'parallel[${a}]' must be a valid step object`);let c=[...o,`branch ${a+1}`];wt(l,e,!0,c)}}if(s==="fail"){let i=r.fail;(!i||typeof i!="object")&&k(e,"'fail' must be an object with 'message'");let a=i;(!a.message||typeof a.message!="string")&&k(e,"'fail.message' is required and must be a string")}}function yt(r){if(!r||typeof r!="object")throw new Error(`Invalid workflow structure:
|
|
24
|
+
- Workflow must be an object`);let e=r;if("name"in e&&e.name!==void 0&&typeof e.name!="string")throw new Error(`Invalid workflow structure:
|
|
25
25
|
- 'name' must be a string`);if("shell"in e&&e.shell!==void 0){if(!Array.isArray(e.shell))throw new Error(`Invalid workflow structure:
|
|
26
26
|
- 'shell' must be an array`);if(e.shell.length===0)throw new Error(`Invalid workflow structure:
|
|
27
27
|
- 'shell' cannot be empty
|
|
@@ -30,43 +30,50 @@ Total execution time: ${i}`)),await r.save(),r.reset()}isStepSuccessful(e,t){ret
|
|
|
30
30
|
- 'steps' is required`);if(!Array.isArray(e.steps))throw new Error(`Invalid workflow structure:
|
|
31
31
|
- 'steps' must be an array`);if(e.steps.length===0)throw new Error(`Invalid workflow structure:
|
|
32
32
|
- 'steps' cannot be empty
|
|
33
|
-
Reason: Workflow must have at least one step`);for(let t=0;t<e.steps.length;t++){let
|
|
34
|
-
- Step ${t+1} must be an object`);
|
|
33
|
+
Reason: Workflow must have at least one step`);for(let t=0;t<e.steps.length;t++){let o=e.steps[t];if(!o||typeof o!="object")throw new Error(`Invalid workflow structure:
|
|
34
|
+
- Step ${t+1} must be an object`);wt(o,t)}}function wo(r,e){try{let t=e;for(let s of r)if(typeof s!="symbol")if(t&&typeof t=="object")t=t[s];else return null;if(!t||typeof t!="object")return null;let n=Object.keys(t);if(n.length>0){let s=["run","choose","prompt","parallel","fail"];if(!n.some(a=>s.includes(a)))return`Unknown step type. Found keys: [${n.join(", ")}]. Valid types: run, choose, prompt, parallel, fail`}return null}catch{return null}}function Ve(r){let e=r;return"choose"in e&&(e.choose===null||e.choose===void 0)&&"message"in e&&"options"in e?{choose:{message:e.message,options:e.options,as:e.as},when:e.when}:"prompt"in e&&(e.prompt===null||e.prompt===void 0)&&"message"in e&&"as"in e?{prompt:{message:e.message,as:e.as,default:e.default,validate:e.validate},when:e.when}:"parallel"in e&&Array.isArray(e.parallel)?{...e,parallel:e.parallel.map(t=>Ve(t))}:r}var ve=class{parse(e){let t;try{t=(0,ht.parse)(e)}catch(o){throw new Error(`Invalid YAML format: ${o instanceof Error?o.message:String(o)}`)}if(t&&typeof t=="object"&&"steps"in t){let o=t;Array.isArray(o.steps)&&(o.steps=o.steps.map(n=>Ve(n)))}yt(t);try{return Be(t)}catch(o){if(o instanceof He.ZodError){let n=o.issues.map(s=>gt(s,t)).filter(s=>s!==null).join(`
|
|
35
35
|
`);throw new Error(`Invalid workflow structure:
|
|
36
|
-
${n}`)}throw
|
|
37
|
-
`),n=0,s=!1;for(let i=0;i<
|
|
36
|
+
${n}`)}throw o}}extractStepLineNumbers(e){let t=new Map,o=e.split(`
|
|
37
|
+
`),n=0,s=!1;for(let i=0;i<o.length;i++){let a=o[i].trim();if(a==="steps:"||a.startsWith("steps:")){s=!0;continue}s&&a.startsWith("-")&&t.set(n++,i+1)}return t}},_e=class{parse(e){let t;try{t=JSON.parse(e)}catch(o){throw new Error(`Invalid JSON format: ${o instanceof Error?o.message:String(o)}`)}if(t&&typeof t=="object"&&"steps"in t){let o=t;Array.isArray(o.steps)&&(o.steps=o.steps.map(n=>Ve(n)))}yt(t);try{return Be(t)}catch(o){if(o instanceof He.ZodError){let n=o.issues.map(s=>gt(s,t)).filter(s=>s!==null).join(`
|
|
38
38
|
`);throw new Error(`Invalid workflow structure:
|
|
39
|
-
${n}`)}throw
|
|
40
|
-
`),n=0,s=!1,i=!1;for(let a=0;a<
|
|
39
|
+
${n}`)}throw o}}extractStepLineNumbers(e){let t=new Map,o=e.split(`
|
|
40
|
+
`),n=0,s=!1,i=!1;for(let a=0;a<o.length;a++){let c=o[a].trim();if(c.startsWith('"steps"')||c.startsWith("'steps'")){s=!0,c.includes("[")&&(i=!0);continue}if(s&&c==="["){i=!0;continue}if(i&&c==="]"){i=!1,s=!1;continue}i&&c.startsWith("{")&&t.set(n++,a+1)}return t}};function ue(r){switch(r.toLowerCase().split(".").pop()){case"yaml":case"yml":return new ve;case"json":return new _e;default:return new ve}}var kt=require("fs/promises"),xt=require("path"),xe=y(require("boxen"),1),L=y(require("chalk"),1),Ue=y(require("node-cron"),1);le();var ne=require("fs/promises"),St=require("os"),ze=require("path"),vt=(0,ze.join)((0,St.homedir)(),".pipeliner","schedules"),bt=(0,ze.join)(vt,"schedules.json"),j=class{async loadSchedules(){try{let e=await(0,ne.readFile)(bt,"utf-8");return JSON.parse(e).schedules||[]}catch(e){if(e instanceof Error&&"code"in e&&e.code==="ENOENT")return[];throw e}}async saveSchedules(e){await(0,ne.mkdir)(vt,{recursive:!0}),await(0,ne.writeFile)(bt,JSON.stringify({schedules:e},null,2),"utf-8")}async addSchedule(e){let t=await this.loadSchedules(),o=Math.random().toString(36).slice(2,10),n=new Date().toISOString(),s={id:o,createdAt:n,...e};return t.push(s),await this.saveSchedules(t),s}async removeSchedule(e){let t=await this.loadSchedules(),o=t.length,n=t.filter(s=>s.id!==e);return n.length===o?!1:(await this.saveSchedules(n),!0)}async updateLastRun(e){let t=await this.loadSchedules(),o=t.find(n=>n.id===e);o&&(o.lastRun=new Date().toISOString(),await this.saveSchedules(t))}async toggleSchedule(e,t){let o=await this.loadSchedules(),n=o.find(s=>s.id===e);return n?(n.enabled=t,await this.saveSchedules(o),!0):!1}async getSchedule(e){return(await this.loadSchedules()).find(o=>o.id===e)}};var yo=/^([+-])?(\d{1,2})(?::(\d{2}))?$/;function bo(r){let e=r.trim();if(e===""||e==="0"||e==="+0"||e==="-0")return 0;let t=e.match(yo);if(!t)return null;let o=t[1],n=parseInt(t[2],10);if(n>14)return null;let i=o==="-"?-n:n;return i<-12||i>14?null:i}function So(r){if(r===0)return"UTC";let e=r>0?"-":"+",t=Math.abs(r);return`Etc/GMT${e}${t}`}function ke(r){if(!r?.trim())return;let e=bo(r);if(e!==null)return So(e)}var z=class{scheduleManager;tasks=new Map;startOptions;constructor(){this.scheduleManager=new j}async start(e=!1,t){if(this.startOptions=t,!e&&await H()){let n=await $();throw new Error(`Scheduler daemon is already running (PID: ${n.pid}). Use "tp schedule stop" to stop it first.`)}if(e)console.log("\u{1F680} Starting scheduler daemon in background...");else{let n=L.default.bold("\u{1F680} Starting workflow scheduler...");console.log((0,xe.default)(n,{borderStyle:"round",padding:{top:0,bottom:0,left:1,right:1},margin:{top:0,bottom:0,left:0,right:0},borderColor:"cyan"}))}if(await this.reload(),e){if(!process.env.TP_DAEMON_MODE){let n=[L.default.green("\u2713 Scheduler daemon started"),"",L.default.gray(`PID: ${process.pid}`),L.default.dim(" tp schedule stop stop daemon"),L.default.dim(" tp schedule status check status")].join(`
|
|
41
41
|
`);console.log(`${(0,xe.default)(n,{borderStyle:"round",padding:{top:1,bottom:1,left:2,right:2},borderColor:"green"})}
|
|
42
|
-
`)}}else{let n=[
|
|
43
|
-
`);console.log((0,xe.default)(n,{borderStyle:"round",padding:{top:0,bottom:0,left:2,right:2},margin:{top:0,bottom:0,left:0,right:0},borderColor:"green"}))}let
|
|
44
|
-
\u23F9 Stopping scheduler...`),this.stop(),await
|
|
45
|
-
`));return}for(let
|
|
46
|
-
\u23F0 Running scheduled workflow: ${t}`),console.log(` Time: ${new Date().toISOString()}`),e.profile&&console.log(` Profile: ${e.profile}`));try{let
|
|
47
|
-
`)}catch(
|
|
48
|
-
`))}}stop(){for(let e of this.tasks.values())e.stop();this.tasks.clear()}async stopDaemon(){let e=await
|
|
42
|
+
`)}}else{let n=[L.default.green("\u2713 Scheduler is running"),L.default.dim(" Press Ctrl+C to stop")].join(`
|
|
43
|
+
`);console.log((0,xe.default)(n,{borderStyle:"round",padding:{top:0,bottom:0,left:2,right:2},margin:{top:0,bottom:0,left:0,right:0},borderColor:"green"}))}let o=async()=>{e||console.log(`
|
|
44
|
+
\u23F9 Stopping scheduler...`),this.stop(),await ae(),e||process.exit(0)};process.on("SIGINT",o),process.on("SIGTERM",o),e&&process.stdin.destroy()}async reload(){this.stop();let t=(await this.scheduleManager.loadSchedules()).filter(o=>o.enabled);if(t.length===0){console.log(L.default.gray(` No enabled schedules to load.
|
|
45
|
+
`));return}for(let o of t)try{this.startSchedule(o)}catch(n){console.error(L.default.red(` \u2717 Failed to start schedule ${o.id}:`),n)}}startSchedule(e){if(!Ue.default.validate(e.cron)){console.error(` \u2717 Invalid cron expression for schedule ${e.id}: ${e.cron}`);return}let t={},o=ke(e.timezone);o&&(t.timezone=o);let n;try{n=Ue.default.schedule(e.cron,async()=>{await this.executeSchedule(e)},t)}catch(s){throw console.error(` \u2717 Cron schedule failed for ${e.id} (timezone: ${o??"local"}).`,s instanceof Error?s.message:s),s}this.tasks.set(e.id,n),this.startOptions?.onScheduleStarted?.(e)}async executeSchedule(e){let t=e.name??e.workflowPath;e.silent||(console.log(`
|
|
46
|
+
\u23F0 Running scheduled workflow: ${t}`),console.log(` Time: ${new Date().toISOString()}`),e.profile&&console.log(` Profile: ${e.profile}`));try{let o=(0,xt.resolve)(e.workflowPath),n=ue(o),s=await(0,kt.readFile)(o,"utf-8"),i=n.parse(s),a=new re,l={};if(e.profile){if(!i.profiles)throw new Error(`Profile "${e.profile}" not found: no profiles defined in workflow`);let c=i.profiles.find(f=>f.name===e.profile);if(!c)throw new Error(`Profile "${e.profile}" not found. Available profiles: ${i.profiles.map(f=>f.name).join(", ")}`);l.profileVars=c.var}await a.execute(i,l),await this.scheduleManager.updateLastRun(e.id),e.silent||console.log(`\u2713 Scheduled workflow completed: ${t}
|
|
47
|
+
`)}catch(o){e.silent||(console.error(`\u2717 Scheduled workflow failed: ${t}`),console.error(` Error: ${o instanceof Error?o.message:String(o)}
|
|
48
|
+
`))}}stop(){for(let e of this.tasks.values())e.stop();this.tasks.clear()}async stopDaemon(){let e=await $();if(!e.running||!e.pid)return!1;let t=e.pid;try{if(process.kill(t,"SIGTERM"),await new Promise(o=>setTimeout(o,1e3)),await H()){try{process.kill(t,"SIGKILL")}catch{}await new Promise(o=>setTimeout(o,500))}return await ae(),!0}catch{return await ae(),!1}}};var Ot=require("child_process"),Ee=require("fs"),Ft=require("fs/promises"),R=require("path"),Re=y(require("boxen"),1),m=y(require("chalk"),1),Bt=require("commander"),Pe=y(require("dayjs"),1),Ce=y(require("inquirer"),1),pe=y(require("log-update"),1),Wt=y(require("node-cron"),1);le();var Et=require("fs/promises"),Rt=require("path"),Pt=require("yaml"),Ct=require("zod");var C=require("zod"),vo=C.z.object({name:C.z.string().min(1,"Schedule name must be non-empty"),cron:C.z.string().min(1,"Cron expression is required"),workflow:C.z.string().min(1,"Workflow path is required"),baseDir:C.z.string().optional(),timezone:C.z.union([C.z.string(),C.z.number()]).transform(String).optional(),silent:C.z.boolean().optional(),profile:C.z.string().optional()}),ko=C.z.object({schedules:C.z.array(vo).min(1,"Schedule file must have at least one schedule")});function $t(r){return ko.parse(r)}async function Dt(r){let e=await(0,Et.readFile)(r,"utf-8"),t=(0,Rt.extname)(r).toLowerCase(),o;try{if(t===".yaml"||t===".yml")o=(0,Pt.parse)(e);else if(t===".json")o=JSON.parse(e);else throw new Error(`Unsupported file format: ${t}. Use .yaml, .yml, or .json`)}catch(n){if(n instanceof Error&&n.message.startsWith("Unsupported"))throw n;let s=t===".json"?"JSON":"YAML";throw new Error(`Invalid ${s} format: ${n instanceof Error?n.message:String(n)}`)}try{return $t(o)}catch(n){if(n instanceof Ct.ZodError){let s=n.issues.map(i=>` - ${i.message} (${i.path.join(".")})`).join(`
|
|
49
49
|
`);throw new Error(`Invalid schedule file structure:
|
|
50
|
-
${s}`)}throw n}}var
|
|
51
|
-
`);return(0,
|
|
52
|
-
|
|
53
|
-
`);let
|
|
50
|
+
${s}`)}throw n}}var Tt=y(require("boxen"),1),S=y(require("chalk"),1),It=y(require("cronstrue"),1),Ye=y(require("dayjs"),1),Je=y(require("node-cron"),1);function qe(r){try{return It.default.toString(r)}catch{return null}}function xo(r){if(!Je.default.validate(r.cron))return null;try{let e={},t=ke(r.timezone);t&&(e.timezone=t);let o=Je.default.createTask(r.cron,()=>{},e),n=o.getNextRun();return o.destroy(),n}catch{return null}}function G(r,e){let t=r,{daemonRunning:o,emphasizeState:n}=e,s=t.enabled?n?S.default.bold.green("ENABLED"):S.default.green("enabled"):n?S.default.bold.gray("DISABLED"):S.default.gray("disabled"),i=o&&t.enabled,a=i?S.default.green("\u25CF active"):S.default.gray("\u25CB inactive"),l=n?t.enabled?S.default.bold.green(" [ENABLED]"):S.default.bold.gray(" [DISABLED]"):"",c=S.default.bold(t.name??t.workflowPath),f=xo(t),g=f?(0,Ye.default)(f).format("YYYY-MM-DD HH:mm:ss"):S.default.dim("\u2014"),h=t.lastRun?(0,Ye.default)(t.lastRun).format("YYYY-MM-DD HH:mm:ss"):S.default.dim("never"),d=qe(t.cron),w=t.timezone?t.timezone.startsWith("+")||t.timezone.startsWith("-")?`UTC${t.timezone}`:`UTC+${t.timezone}`:null,b=d?`${t.cron} ${S.default.dim(`\u2192 ${d}`)}`:t.cron,v=[[S.default.gray("Enabled"),s],[S.default.gray("Cron"),b],...w?[[S.default.gray("Timezone"),w]]:[],[S.default.gray("Workflow"),t.workflowPath],...t.profile?[[S.default.gray("Profile"),S.default.cyan(t.profile)]]:[],...t.silent?[[S.default.gray("Silent"),S.default.yellow("yes")]]:[],[S.default.gray("Last run"),h],[S.default.gray("Next run"),g]],E=[`${c} ${a}${l}`,...v.map(([B,W])=>` ${B.padEnd(10)} ${W}`)].join(`
|
|
51
|
+
`);return(0,Tt.default)(E,{borderStyle:"round",padding:{top:0,bottom:0,left:1,right:1},margin:{top:0,bottom:0,left:0,right:0},borderColor:i?"green":"gray"})}var V=require("fs"),P=require("path"),Ge=require("url"),Mt={};function Nt(){console.log=()=>{},console.error=()=>{},console.warn=()=>{},console.info=()=>{},process.stdout.write=()=>!0,process.stderr.write=()=>!0}function jt(){return"0.3.4"}function $e(r){let e=r?(0,P.resolve)(r):process.cwd(),t=50,o=0;for(;o<t;){let n=(0,P.resolve)(e,"tp");try{if((0,V.existsSync)(n)&&(0,V.statSync)(n).isDirectory())return n}catch{}let s=(0,P.dirname)(e);if(s===e)break;e=s,o++}return null}function _t(){let r=new Bt.Command("schedule").description("Manage workflow schedules").action(async()=>{await At()});return r.command("add [scheduleFile]").description("Add schedules from a schedule file (YAML or JSON). If no file given, select from nearest tp/schedules directory.").action(async e=>{await $o(e)}),r.command("remove").alias("rm").description("Remove a workflow schedule").action(async()=>{await Eo()}),r.command("remove-all").description("Remove all workflow schedules").action(async()=>{await Mo()}),r.command("list").alias("ls").description("List all workflow schedules").action(async()=>{await At()}),r.command("start").description("Start the scheduler daemon").option("-d, --daemon","Run in background daemon mode").action(async e=>{await Ro(e.daemon??!1)}),r.command("stop").description("Stop the scheduler daemon").action(async()=>{await Po()}),r.command("status").description('View daemon and schedule status (does not start the daemon). In live mode, Ctrl+C only exits the status view; the daemon keeps running if it was started with "tp schedule start -d".').option("-n, --no-follow","Show status once and exit (no live refresh)").action(async e=>{let t=e.follow!==!1;await To(t)}),r.command("toggle").description("Enable or disable a schedule").action(async()=>{await Io()}),r}function Ze(r,e){let t=e.workflow;if((0,R.isAbsolute)(t))return t;let o=e.baseDir?(0,R.resolve)(e.baseDir):(0,R.dirname)(r);return(0,R.resolve)(o,t)}async function $o(r){let e=new j;if(!r){let l=$e();l||(console.error(m.default.red(`
|
|
52
|
+
\u2717 No tp directory found`)),process.exit(1));let c=(0,R.join)(l,"schedules");(0,Ee.existsSync)(c)||(console.error(m.default.red(`
|
|
53
|
+
\u2717 No schedules directory found at ${c}`)),process.exit(1));let g=(await(0,Ft.readdir)(c)).filter(b=>{let v=(0,R.extname)(b).toLowerCase();return[".yaml",".yml",".json"].includes(v)});g.length===0&&(console.error(m.default.red(`
|
|
54
|
+
\u2717 No schedule files found in ${c}`)),process.exit(1));let h=g.map(b=>({id:(0,R.join)(c,b),label:b}));r=(await new I(!0).prompt("Select a schedule file to add",h)).id}let t=(0,R.resolve)(r);(0,Ee.existsSync)(t)||(console.error(`\u2717 File not found: ${t}`),process.exit(1));let o;try{o=await Dt(t)}catch(l){console.error(`\u2717 Failed to parse schedule file: ${l instanceof Error?l.message:String(l)}`),process.exit(1)}let n=o.schedules.filter(l=>!Wt.default.validate(l.cron));if(n.length>0){console.error("\u2717 Invalid cron expression(s):");for(let l of n)console.error(` - ${l.name}: "${l.cron}"`);process.exit(1)}let s=o.schedules.filter(l=>{let c=Ze(t,l);return!(0,Ee.existsSync)(c)});if(s.length>0){console.error("\u2717 Workflow file(s) not found:");for(let l of s){let c=Ze(t,l);console.error(` - ${l.name}: ${l.workflow} (resolved: ${c})`)}process.exit(1)}console.log(`
|
|
55
|
+
Found ${o.schedules.length} schedule(s) in file.
|
|
56
|
+
`);let i=[];for(let l of o.schedules){let{alias:c}=await Ce.default.prompt([{type:"input",name:"alias",message:`Alias for "${l.name}" (press Enter to use as-is):`,default:l.name}]),f=await e.addSchedule({name:c,workflowPath:Ze(t,l),cron:l.cron,enabled:!0,timezone:l.timezone,silent:l.silent,profile:l.profile});i.push(f)}let a=await $();console.log(`
|
|
54
57
|
\u2713 Added ${i.length} schedule(s) successfully
|
|
55
|
-
`);for(let l of i)console.log(
|
|
58
|
+
`);for(let l of i)console.log(G(l,{daemonRunning:a.running}));console.log(m.default.dim(' Tip: Run "tp schedule start" to start the scheduler daemon'))}function Ht(r,e="plain"){let t=r.name??"(no alias)",o=(0,R.basename)(r.workflowPath),n=qe(r.cron)??r.cron,s=e==="color"?r.enabled?m.default.green("Enabled"):m.default.dim("Disabled"):r.enabled?"\u2713":"\u2717";return`${t} \xB7 ${o} \xB7 ${r.cron} \xB7 ${n} \xB7 ${s}`}async function Eo(){let r=new j,e=await r.loadSchedules();if(e.length===0){console.log("No schedules found");return}let t=e.map(c=>({id:c.id,label:Ht(c)})),s=(await new I(!0).prompt("Select schedule to remove:",t)).id,i=e.find(c=>c.id===s),{confirm:a}=await Ce.default.prompt([{type:"confirm",name:"confirm",message:"Are you sure you want to remove this schedule?",default:!1}]);if(!a){console.log("Cancelled");return}let l=await r.removeSchedule(s);if(l&&i){let c=await $();console.log(`
|
|
59
|
+
\u2713 Schedule removed
|
|
60
|
+
`),console.log(G(i,{daemonRunning:c.running}))}else console.log(l?"\u2713 Schedule removed successfully":"\u2717 Schedule not found")}async function At(){let e=await new j().loadSchedules();if(e.length===0){let i=[m.default.gray("No schedules registered."),"",m.default.dim(" tp schedule add <schedule.yaml> add from a schedule file")].join(`
|
|
56
61
|
`);console.log(`
|
|
57
62
|
${(0,Re.default)(i,{borderStyle:"round",padding:{top:1,bottom:1,left:2,right:2},margin:{top:0,bottom:0,left:0,right:0},borderColor:"gray"})}
|
|
58
|
-
`);return}let t=await
|
|
63
|
+
`);return}let t=await $(),o=t.running?m.default.green("\u25CF running"):m.default.gray("\u25CB stopped"),n=e.filter(i=>i.enabled).length,s=m.default.bold("\u{1F4C5} Workflow Schedules");console.log(s),console.log([m.default.gray(" Daemon: "),o,m.default.gray(` \xB7 Schedules: ${n}/${e.length} enabled`)].join(""));for(let i of e)console.log(G(i,{daemonRunning:t.running}));console.log(m.default.dim(" Tip: tp schedule start \u2014 run scheduler daemon; tp schedule status \u2014 view live status"))}async function Ro(r){if(await H()){let e=await $();console.error(`\u2717 Scheduler daemon is already running (PID: ${e.pid})`),console.error(' Run "tp schedule stop" to stop it first'),process.exit(1)}if(r)if(process.env.TP_DAEMON_MODE==="true")try{let{saveDaemonPid:e}=await Promise.resolve().then(()=>(le(),tt));await e(),await new z().start(!0),await new Promise(()=>{})}catch(e){await Me(e instanceof Error?e:new Error(String(e))),process.exit(1)}else{let e=process.argv.slice(1);(0,Ot.spawn)(process.argv[0],e,{detached:!0,stdio:"ignore",env:{...process.env,TP_DAEMON_MODE:"true"}}).unref();let o=3,n=800,s=!1;for(let i=0;i<o;i++)if(await new Promise(a=>setTimeout(a,n)),await H()){s=!0;break}if(s){let i=await $();console.log(`\u2713 Scheduler daemon started in background (PID: ${i.pid})`),console.log(' Run "tp schedule stop" to stop the daemon'),console.log(' Run "tp schedule status" to check daemon status')}else{console.error("\u2717 Failed to start scheduler daemon");let i=await je();i?(console.error(m.default.dim(" Last error from daemon:")),console.error(m.default.red(i.split(`
|
|
59
64
|
`).map(a=>` ${a}`).join(`
|
|
60
|
-
`)))):console.error(m.default.dim(` Check ${
|
|
65
|
+
`)))):console.error(m.default.dim(` Check ${Ne()} for details`)),process.exit(1)}process.exit(0)}else await new z().start(!1,{onScheduleStarted:t=>console.log(G(t,{daemonRunning:!0}))}),await new Promise(()=>{})}async function Po(){let r=await $();if(!r.running){console.log("Scheduler daemon is not running");return}console.log(`Stopping scheduler daemon (PID: ${r.pid})...`);let t=await new z().stopDaemon();console.log(t?"\u2713 Scheduler daemon stopped":"\u2717 Failed to stop scheduler daemon (process may have already exited)")}function Co(r){if(!r)return"Unknown";let e=(0,Pe.default)(r),o=(0,Pe.default)().diff(e,"second"),n=Math.floor(o/86400),s=Math.floor(o%86400/3600),i=Math.floor(o%3600/60),a=o%60,l=[];return n>0&&l.push(`${n}d`),s>0&&l.push(`${s}h`),i>0&&l.push(`${i}m`),(a>0||l.length===0)&&l.push(`${a}s`),l.join(" ")}function Do(r,e){return G(r,{daemonRunning:e})}async function Xe(){let r=await $(),t=await new j().loadSchedules(),o=[],n;if(r.running&&r.pid){let i=Co(r.startTime),a=r.startTime?(0,Pe.default)(r.startTime).format("YYYY-MM-DD HH:mm:ss"):"Unknown";n=[`${m.default.green("\u25CF")} ${m.default.green("active")} ${m.default.gray("(running)")}`,"",`${m.default.gray("Loaded:")} ${m.default.white(a)}`,`${m.default.gray("Active:")} ${m.default.green("active (running)")} since ${m.default.white(a)}`,`${m.default.gray("PID:")} ${m.default.white(r.pid.toString())}`,`${m.default.gray("Uptime:")} ${m.default.white(i)}`].join(`
|
|
61
66
|
`)}else n=[`${m.default.red("\u25CF")} ${m.default.red("inactive")} ${m.default.gray("(dead)")}`,"",`${m.default.gray("Loaded:")} ${m.default.gray("not found")}`,`${m.default.gray("Active:")} ${m.default.red("inactive (dead)")}`].join(`
|
|
62
|
-
`);let s=(0,Re.default)(n,{title:m.default.bold("task-pipeliner-scheduler.service"),titleAlignment:"left",borderStyle:"round",padding:{top:1,bottom:1,left:2,right:2},margin:{top:0,bottom:0,left:0,right:0},borderColor:
|
|
63
|
-
`)}function
|
|
67
|
+
`);let s=(0,Re.default)(n,{title:m.default.bold("task-pipeliner-scheduler.service"),titleAlignment:"left",borderStyle:"round",padding:{top:1,bottom:1,left:2,right:2},margin:{top:0,bottom:0,left:0,right:0},borderColor:r.running?"green":"red"});if(o.push(s),t.length>0){let i=t.filter(l=>l.enabled).length,a=m.default.bold(`Schedules: ${i}/${t.length} enabled`);o.push(a);for(let l of t)o.push(Do(l,r.running))}else{let i=(0,Re.default)(m.default.gray("No schedules configured"),{borderStyle:"round",padding:{top:0,bottom:0,left:1,right:1},margin:{top:0,bottom:0,left:0,right:0},borderColor:"gray"});o.push(i)}return o.join(`
|
|
68
|
+
`)}function Lt(){process.stdout.write("\x1B[2J\x1B[H")}async function To(r){if(r){let e=!0,t=()=>{e=!1,pe.default.done(),process.exit(0)};process.on("SIGINT",t),process.on("SIGTERM",t);let o=setInterval(async()=>{if(!e){clearInterval(o);return}try{let a=await Xe(),c=(await $()).running?m.default.gray(`
|
|
64
69
|
Press Ctrl+C to exit this view (daemon keeps running in background)`):m.default.gray(`
|
|
65
|
-
Tip: To start the daemon, run: tp schedule start -d. Press Ctrl+C to exit this view.`);(0,
|
|
70
|
+
Tip: To start the daemon, run: tp schedule start -d. Press Ctrl+C to exit this view.`);(0,pe.default)(`${a}${c}`)}catch(a){pe.default.done(),console.error("Error updating status:",a),clearInterval(o),process.exit(1)}},1e3);Lt();let n=await Xe(),i=(await $()).running?m.default.gray(`
|
|
66
71
|
Press Ctrl+C to exit this view (daemon keeps running in background)`):m.default.gray(`
|
|
67
|
-
Tip: To start the daemon, run: tp schedule start -d. Press Ctrl+C to exit this view.`);(0,
|
|
68
|
-
Tip: To start the daemon, run: tp schedule start -d`);console.log(`${e}${
|
|
69
|
-
`)}}async function
|
|
72
|
+
Tip: To start the daemon, run: tp schedule start -d. Press Ctrl+C to exit this view.`);(0,pe.default)(`${n}${i}`),await new Promise(()=>{})}else{Lt();let e=await Xe(),o=(await $()).running?"":m.default.gray(`
|
|
73
|
+
Tip: To start the daemon, run: tp schedule start -d`);console.log(`${e}${o}
|
|
74
|
+
`)}}async function Io(){let r=new j,e=await r.loadSchedules();if(e.length===0){console.log("No schedules found");return}let t=e.map(h=>({id:h.id,label:Ht(h,"color")})),s=(await new I(!0).prompt("Select schedule to toggle:",t)).id,i=e.find(h=>h.id===s);if(!i){console.log("\u2717 Schedule not found");return}let a=!i.enabled;await r.toggleSchedule(s,a);let l=await $(),c={...i,enabled:a},f=a?m.default.bold.green("ENABLED"):m.default.bold.gray("DISABLED"),g=a?m.default.dim(" (will run at the times shown below)"):m.default.dim(" (will not run until you enable it again)");console.log(`
|
|
75
|
+
\u2713 Schedule is now ${f}${g}
|
|
76
|
+
`),console.log(G(c,{daemonRunning:l.running,emphasizeState:!0}))}async function Mo(){let r=new j,e=await r.loadSchedules();if(e.length===0){console.log("No schedules found");return}let{confirm:t}=await Ce.default.prompt([{type:"confirm",name:"confirm",message:`Are you sure you want to remove all ${e.length} schedule(s)?`,default:!1}]);if(!t){console.log("Cancelled");return}await r.saveSchedules([]),console.log(`\u2713 Removed all ${e.length} schedule(s)`)}var de=(0,D.join)((0,zt.homedir)(),".pipeliner"),No=(0,Ut.promisify)(Vt.exec),U=new Yt.Command;U.name("task-pipeliner").description(`A powerful task pipeline runner with condition-based workflow execution.
|
|
70
77
|
|
|
71
78
|
Define workflows in YAML or JSON files with conditional execution, parallel tasks,
|
|
72
79
|
interactive prompts, variable substitution, and cron-based scheduling.
|
|
@@ -82,6 +89,8 @@ Features:
|
|
|
82
89
|
\u2022 Supports both YAML (.yaml, .yml) and JSON (.json) formats
|
|
83
90
|
|
|
84
91
|
Quick Start:
|
|
92
|
+
0. (Optional) tp setup \u2014 create tp/, tp/workflows, tp/schedules and add 2 example workflows + 2 example schedules (echo-based dummies)
|
|
93
|
+
|
|
85
94
|
1. Create a workflow.yaml or workflow.json file:
|
|
86
95
|
steps:
|
|
87
96
|
- run: echo "Hello, World!"
|
|
@@ -107,20 +116,22 @@ Quick Start:
|
|
|
107
116
|
tp history remove-all # Remove all histories
|
|
108
117
|
|
|
109
118
|
4. Schedule workflows (cron):
|
|
110
|
-
tp schedule add schedule.yaml # Add schedules from a file
|
|
119
|
+
tp schedule add schedule.yaml # Add schedules from a file (or tp/schedules/*.yaml after tp setup)
|
|
111
120
|
tp schedule list # List schedules
|
|
112
121
|
tp schedule start -d # Start daemon in background
|
|
113
122
|
tp schedule status # View daemon and schedule status
|
|
114
123
|
|
|
115
124
|
5. Other commands:
|
|
125
|
+
tp setup # Create tp/workflows, tp/schedules with example files
|
|
116
126
|
tp open docs # Open documentation in browser
|
|
117
127
|
tp open generator # Open visual workflow generator
|
|
118
|
-
tp clean
|
|
128
|
+
tp clean # Remove ~/.pipeliner data (schedules, daemon, history)
|
|
119
129
|
|
|
120
130
|
Note: After upgrading to a new version, if you see compatibility issues (e.g. schedules or daemon), run "tp clean" to reset ~/.pipeliner data.
|
|
121
131
|
|
|
122
|
-
`).version(
|
|
132
|
+
`).version(jt()).addHelpText("after",`
|
|
123
133
|
Examples:
|
|
134
|
+
$ tp setup
|
|
124
135
|
$ tp run workflow.yaml
|
|
125
136
|
$ tp run workflow.yaml --profile Production
|
|
126
137
|
$ tp schedule add schedule.yaml
|
|
@@ -139,13 +150,13 @@ Resources:
|
|
|
139
150
|
\u{1F4DA} Documentation: https://task-pipeliner.racgoo.com/
|
|
140
151
|
\u{1F3A8} Visual Generator: https://task-pipeliner-generator.racgoo.com/
|
|
141
152
|
|
|
142
|
-
See README.md for complete DSL reference.`);
|
|
153
|
+
See README.md for complete DSL reference.`);U.command("run").description("Run a workflow from a YAML or JSON file").argument("[file]","Path to the workflow file (YAML or JSON, relative or absolute). If omitted, will select from workflows in the nearest tp/workflows directory.").option("-s, --silent","Run in silent mode (suppress console output)").option("-p, --profile <name>","Run in profile mode (use profile name)").addHelpText("after",`
|
|
143
154
|
Examples:
|
|
144
155
|
$ tp run workflow.yaml
|
|
145
156
|
$ tp run workflow.json
|
|
146
157
|
$ tp run ./my-workflow.yaml
|
|
147
158
|
$ tp run examples/simple-project/workflow.json
|
|
148
|
-
$ tp run # Select from workflows in nearest tp
|
|
159
|
+
$ tp run # Select from workflows in nearest tp/workflows
|
|
149
160
|
$ tp run workflow.yaml --silent
|
|
150
161
|
$ tp run workflow.yaml -s
|
|
151
162
|
|
|
@@ -164,44 +175,125 @@ Workflow File Structure:
|
|
|
164
175
|
\u2022 all/any/not: Combine conditions
|
|
165
176
|
|
|
166
177
|
Supported formats: YAML (.yaml, .yml) and JSON (.json)
|
|
167
|
-
See README.md for complete DSL documentation.`).action(async(
|
|
168
|
-
\u2717 No workflow file found`)),process.exit(1)),e.silent&&
|
|
169
|
-
`)),await new
|
|
170
|
-
\u2713 Workflow completed successfully`))}catch(t){let
|
|
171
|
-
\u2717 Workflow failed: ${
|
|
178
|
+
See README.md for complete DSL documentation.`).action(async(r,e)=>{try{let t=r??await Oo()??null;t||(console.error(p.default.red(`
|
|
179
|
+
\u2717 No workflow file found`)),process.exit(1)),e.silent&&Nt();let o=ue(t);console.log(p.default.blue(`Loading workflow from ${t}...`));let n=(0,O.readFileSync)(t,"utf-8"),s=o.parse(n);if(!s.steps||!Array.isArray(s.steps))throw new Error("Invalid workflow: steps array is required");let i;if(e.profile){let l=e.profile.trim();if(!s.profiles?.length)throw new Error(`Profile "${l}" requested but workflow has no "profiles" defined. Add a "profiles" section to your workflow file.`);let c=s.profiles.find(f=>f.name===l);if(!c){let f=s.profiles.map(g=>g.name).join(", ");throw new Error(`Profile "${l}" not found. Available profile(s): ${f}`)}i=c.var}s._lineNumbers=o.extractStepLineNumbers(n),s._fileName=Fo(t),s._filePath=(0,D.resolve)(t),console.log(p.default.green(`Starting workflow execution...
|
|
180
|
+
`)),await new re().execute(s,i?{profileVars:i}:void 0),console.log(p.default.green(`
|
|
181
|
+
\u2713 Workflow completed successfully`))}catch(t){let o=t instanceof Error?t.message:String(t);console.error(p.default.red(`
|
|
182
|
+
\u2717 Workflow failed: ${o}`)),process.exit(1)}});U.command("open").description("Open generator or docs website in browser").argument("<target>",'Target to open: "generator" or "docs"').addHelpText("after",`
|
|
172
183
|
Examples:
|
|
173
184
|
$ tp open generator
|
|
174
185
|
$ tp open docs
|
|
175
186
|
|
|
176
187
|
Targets:
|
|
177
188
|
generator Open the visual workflow generator (https://task-pipeliner-generator.racgoo.com/)
|
|
178
|
-
docs Open the documentation site (https://task-pipeliner.racgoo.com/)`).action(async
|
|
179
|
-
\u2717 Invalid target: ${
|
|
180
|
-
Valid targets:`)),console.log(p.default.yellow(" \u2022 generator - Open the visual workflow generator")),console.log(p.default.yellow(" \u2022 docs - Open the documentation site")),process.exit(1));try{let
|
|
181
|
-
\u2713 Opening ${
|
|
189
|
+
docs Open the documentation site (https://task-pipeliner.racgoo.com/)`).action(async r=>{let t={generator:"https://task-pipeliner-generator.racgoo.com/",docs:"https://task-pipeliner.racgoo.com/"}[r.toLowerCase()];t||(console.error(p.default.red(`
|
|
190
|
+
\u2717 Invalid target: ${r}`)),console.log(p.default.yellow(`
|
|
191
|
+
Valid targets:`)),console.log(p.default.yellow(" \u2022 generator - Open the visual workflow generator")),console.log(p.default.yellow(" \u2022 docs - Open the documentation site")),process.exit(1));try{let o=process.platform,n;o==="darwin"?n=`open "${t}"`:o==="win32"?n=`start "${t}"`:n=`xdg-open "${t}"`,await No(n),console.log(p.default.green(`
|
|
192
|
+
\u2713 Opening ${r==="generator"?"generator":"documentation"} in browser...`)),console.log(p.default.blue(` ${t}`))}catch(o){let n=o instanceof Error?o.message:String(o);console.error(p.default.red(`
|
|
182
193
|
\u2717 Failed to open browser: ${n}`)),console.log(p.default.yellow(`
|
|
183
|
-
Please visit manually: ${t}`)),process.exit(1)}});
|
|
184
|
-
|
|
185
|
-
|
|
186
|
-
|
|
187
|
-
|
|
188
|
-
|
|
194
|
+
Please visit manually: ${t}`)),process.exit(1)}});U.addCommand(_t());var jo=[{filename:"example-hello.yaml",content:`name: Hello World (with choose)
|
|
195
|
+
|
|
196
|
+
# Interactive choice: stored as variable and used in later steps
|
|
197
|
+
steps:
|
|
198
|
+
- run: echo "Hello from task-pipeliner"
|
|
199
|
+
- choose:
|
|
200
|
+
message: "Select action:"
|
|
201
|
+
options:
|
|
202
|
+
- id: greet
|
|
203
|
+
label: "Greet"
|
|
204
|
+
- id: info
|
|
205
|
+
label: "Show info"
|
|
206
|
+
as: action
|
|
207
|
+
- run: echo "You chose: {{ action }}"
|
|
208
|
+
- when:
|
|
209
|
+
var:
|
|
210
|
+
action: greet
|
|
211
|
+
run: echo "Hi there! Edit tp/workflows/*.yaml and run: tp run tp/workflows/example-hello.yaml"
|
|
212
|
+
- when:
|
|
213
|
+
var:
|
|
214
|
+
action: info
|
|
215
|
+
run: echo "Tip: Use --profile to skip prompts. See example-build.yaml for profiles."
|
|
216
|
+
`},{filename:"example-build.yaml",content:`name: Example Build (with profiles and choose)
|
|
217
|
+
|
|
218
|
+
# Profiles: run without prompts via "tp run tp/workflows/example-build.yaml --profile Dev"
|
|
219
|
+
# With profile, choose/prompt are skipped and these variables are used.
|
|
220
|
+
profiles:
|
|
221
|
+
- name: Dev
|
|
222
|
+
var:
|
|
223
|
+
mode: dev
|
|
224
|
+
label: "dev-build"
|
|
225
|
+
- name: Prod
|
|
226
|
+
var:
|
|
227
|
+
mode: prod
|
|
228
|
+
label: "prod-build"
|
|
229
|
+
|
|
230
|
+
steps:
|
|
231
|
+
- run: echo "Build workflow started..."
|
|
232
|
+
- choose:
|
|
233
|
+
message: "Select mode (or run with --profile Dev/Prod to skip):"
|
|
234
|
+
options:
|
|
235
|
+
- id: dev
|
|
236
|
+
label: "Development"
|
|
237
|
+
- id: prod
|
|
238
|
+
label: "Production"
|
|
239
|
+
as: mode
|
|
240
|
+
- run: echo "Mode: {{ mode }}"
|
|
241
|
+
- prompt:
|
|
242
|
+
message: "Enter build label"
|
|
243
|
+
as: label
|
|
244
|
+
default: "default"
|
|
245
|
+
- run: echo "Label: {{ label }}"
|
|
246
|
+
- when:
|
|
247
|
+
var:
|
|
248
|
+
mode: dev
|
|
249
|
+
run: echo "Dev-only step (e.g. npm run build:dev)"
|
|
250
|
+
- when:
|
|
251
|
+
var:
|
|
252
|
+
mode: prod
|
|
253
|
+
run: echo "Prod-only step (e.g. npm run build)"
|
|
254
|
+
- run: echo "Done. Replace run steps with real commands."
|
|
255
|
+
`}],Ao=[{filename:"example-daily.yaml",content:`schedules:
|
|
256
|
+
# Runs at 09:00 daily; interactive choose is skipped in scheduled runs (no TTY)
|
|
257
|
+
- name: Daily Hello
|
|
258
|
+
cron: "0 9 * * *"
|
|
259
|
+
workflow: ../workflows/example-hello.yaml
|
|
260
|
+
`},{filename:"example-hourly.yaml",content:`schedules:
|
|
261
|
+
# With profile: choose/prompt are skipped and profile vars used (good for cron)
|
|
262
|
+
- name: Hourly Build (Dev)
|
|
263
|
+
cron: "0 * * * *"
|
|
264
|
+
workflow: ../workflows/example-build.yaml
|
|
265
|
+
profile: Dev
|
|
266
|
+
- name: Nightly Build (Prod)
|
|
267
|
+
cron: "0 2 * * *"
|
|
268
|
+
workflow: ../workflows/example-build.yaml
|
|
269
|
+
profile: Prod
|
|
270
|
+
`}];U.command("setup").description("Create tp directory with workflows and schedules folders and add 2 example files in each (echo-based dummies). Run from project root for easy initial setup.").action(async()=>{let r=process.cwd(),e=(0,D.join)(r,"tp"),t=(0,D.join)(e,"workflows"),o=(0,D.join)(e,"schedules");(0,O.existsSync)(e)?console.log(p.default.gray(`
|
|
271
|
+
tp directory already exists at ${e}`)):(await(0,F.mkdir)(e,{recursive:!0}),console.log(p.default.green(`
|
|
272
|
+
\u2713 Created ${e}`)));let n=async(i,a)=>{(0,O.existsSync)(i)?console.log(p.default.gray(` ${a} already exists`)):(await(0,F.mkdir)(i,{recursive:!0}),console.log(p.default.green(`\u2713 Created ${a}`)))};await n(t,"tp/workflows"),await n(o,"tp/schedules");let s=[];for(let{filename:i,content:a}of jo){let l=(0,D.join)(t,i);(0,O.existsSync)(l)?console.log(p.default.gray(` Skipped (exists): tp/workflows/${i}`)):(await(0,F.writeFile)(l,a,"utf-8"),s.push(`tp/workflows/${i}`))}for(let{filename:i,content:a}of Ao){let l=(0,D.join)(o,i);(0,O.existsSync)(l)?console.log(p.default.gray(` Skipped (exists): tp/schedules/${i}`)):(await(0,F.writeFile)(l,a,"utf-8"),s.push(`tp/schedules/${i}`))}s.length>0&&(console.log(p.default.green(`
|
|
273
|
+
\u2713 Added ${s.length} example file(s):`)),s.forEach(i=>console.log(p.default.dim(` ${i}`)))),console.log(p.default.dim(`
|
|
274
|
+
Next: tp run tp/workflows/example-hello.yaml | tp schedule add tp/schedules/example-daily.yaml | tp schedule list`)),console.log()});var Lo=U.command("history").description("Manage workflow execution history");Lo.action(async()=>{let r=new I,e=await r.prompt("Select an action",[{id:"show",label:"Show - View and select a history to view"},{id:"remove",label:"Remove - Delete a specific history file"},{id:"remove-all",label:"Remove All - Delete all history files"}]);e?.id||(console.error(p.default.red(`
|
|
275
|
+
\u2717 Invalid choice`)),process.exit(1));let t=new oe;switch(e.id){case"show":{let o=await t.getHistoryNames();if(o.length===0){console.log(p.default.yellow(`
|
|
276
|
+
\u26A0 No history found`));return}let n=await r.prompt("Select a history to view",o.map(s=>({id:s,label:s})));n?.id||(console.error(p.default.red(`
|
|
277
|
+
\u2717 Invalid choice`)),process.exit(1));try{let s=await t.getHistory(n.id);Bo(s,n.id)}catch(s){let i=s instanceof Error?s.message:String(s);console.error(p.default.red(`
|
|
278
|
+
\u2717 Failed to load history: ${i}`)),process.exit(1)}break}case"remove":{let o=await t.getHistoryNames();if(o.length===0){console.log(p.default.yellow(`
|
|
279
|
+
\u26A0 No history found`));return}let n=await r.prompt("Select a history to remove",o.map(s=>({id:s,label:s})));n?.id||(console.error(p.default.red(`
|
|
189
280
|
\u2717 Invalid choice`)),process.exit(1));try{await t.removeHistory(n.id),console.log(p.default.green(`
|
|
190
281
|
\u2713 Removed history: ${n.id}`))}catch(s){let i=s instanceof Error?s.message:String(s);console.error(p.default.red(`
|
|
191
|
-
\u2717 Failed to remove history: ${i}`)),process.exit(1)}break}case"remove-all":{if((await
|
|
282
|
+
\u2717 Failed to remove history: ${i}`)),process.exit(1)}break}case"remove-all":{if((await r.prompt("Are you sure you want to remove all histories?",[{id:"yes",label:"Yes, remove all"},{id:"no",label:"No, cancel"}]))?.id!=="yes"){console.log(p.default.yellow(`
|
|
192
283
|
\u2717 Cancelled`));return}try{await t.clearAllHistories(),console.log(p.default.green(`
|
|
193
284
|
\u2713 All histories removed`))}catch(n){let s=n instanceof Error?n.message:String(n);console.error(p.default.red(`
|
|
194
285
|
\u2717 Failed to remove histories: ${s}`)),process.exit(1)}break}default:console.error(p.default.red(`
|
|
195
|
-
\u2717 Unknown action: ${e.id}`)),process.exit(1)}});async function
|
|
196
|
-
\u2717 No tp directory found`)),null;
|
|
197
|
-
\u2717 No
|
|
198
|
-
\u2717
|
|
199
|
-
|
|
200
|
-
`);
|
|
201
|
-
`);console.log((0,
|
|
202
|
-
`);console.log(
|
|
203
|
-
`);console.log(p.default.
|
|
204
|
-
|
|
205
|
-
\
|
|
206
|
-
|
|
207
|
-
|
|
286
|
+
\u2717 Unknown action: ${e.id}`)),process.exit(1)}});async function Oo(){let r=$e();if(!r)return console.error(p.default.red(`
|
|
287
|
+
\u2717 No tp directory found`)),null;let e=(0,D.join)(r,"workflows");if(!(0,O.existsSync)(e))return console.error(p.default.red(`
|
|
288
|
+
\u2717 No workflows directory found at ${e}`)),null;try{let o=(await(0,F.readdir)(e)).filter(a=>{let l=(0,D.extname)(a).toLowerCase();return[".yaml",".yml",".json"].includes(l)});if(o.length===0)return console.error(p.default.red(`
|
|
289
|
+
\u2717 No workflow files found in ${e}`)),null;let n=await Promise.all(o.map(async a=>{let l=(0,D.join)(e,a);try{let c=ue(l),f=(0,O.readFileSync)(l,"utf-8"),h=c.parse(f).name??"Untitled";return{id:l,label:`${a} - ${h}`}}catch{return{id:l,label:a}}}));return(await new I(!0).prompt("Select a workflow to run",n)).id}catch(t){let o=t instanceof Error?t.message:String(t);return console.error(p.default.red(`
|
|
290
|
+
\u2717 Failed to read tp directory: ${o}`)),null}}function Fo(r){return r.split("/").pop()??r}function Bo(r,e){console.log(`
|
|
291
|
+
`);let t=r.records.reduce((c,f)=>c+f.duration,0),o=r.records.filter(c=>c.status==="success").length,n=r.records.filter(c=>c.status==="failure").length,s=(0,Jt.default)(r.initialTimestamp).format("YYYY-MM-DD HH:mm:ss"),a=Q(t),l=[p.default.bold("Workflow Execution History"),"",`${p.default.cyan("File:")} ${e}`,`${p.default.cyan("Started:")} ${s}`,`${p.default.cyan("Total Duration:")} ${a}`,`${p.default.cyan("Total Steps:")} ${r.records.length}`,`${p.default.green("\u2713 Successful:")} ${o}`,n>0?`${p.default.red("\u2717 Failed:")} ${n}`:""].filter(Boolean).join(`
|
|
292
|
+
`);console.log((0,Ke.default)(l,{borderStyle:"round",padding:{top:1,bottom:1,left:2,right:2},margin:{top:0,bottom:1,left:0,right:0},borderColor:"cyan"})),r.records.forEach((c,f)=>{Wo(c,f+1,r.records.length)}),console.log("")}function Wo(r,e,t){let o=_o(r.step),n=Ho(r.step),s=r.status==="success"?p.default.green("\u2713"):p.default.red("\u2717"),i=r.status==="success"?p.default.green("Success"):p.default.red("Failed"),a=Q(r.duration),l=[`${s} ${p.default.bold(`Step ${e}/${t}`)} - ${p.default.cyan(o)}`,`${p.default.gray("Duration:")} ${a} | ${p.default.gray("Status:")} ${i}`,"",p.default.white(n)].join(`
|
|
293
|
+
`);console.log((0,Ke.default)(l,{borderStyle:"round",padding:{top:1,bottom:1,left:2,right:2},margin:{top:0,bottom:1,left:0,right:0},borderColor:r.status==="success"?"green":"red"})),Vo(r.output)&&zo(r.output)}function _o(r){return"run"in r?"Run":"choose"in r?"Choose":"prompt"in r?"Prompt":"parallel"in r?"Parallel":"fail"in r?"Fail":"Unknown"}function Ho(r){return"run"in r?`Command: ${p.default.yellow(r.run)}`:"choose"in r?`Message: ${p.default.yellow(r.choose.message)}`:"prompt"in r?`Message: ${p.default.yellow(r.prompt.message)} | Variable: ${p.default.cyan(r.prompt.as)}`:"parallel"in r?`Parallel execution with ${r.parallel.length} branches`:"fail"in r?`Error: ${p.default.red(r.fail.message)}`:"Unknown step type"}function Vo(r){return typeof r=="object"&&r!==null&&"success"in r&&"stdout"in r&&"stderr"in r}function zo(r){if(r.stdout.length>0){let e=r.stdout.map(t=>p.default.gray(` ${t}`)).join(`
|
|
294
|
+
`);console.log(p.default.green(" Output:")),console.log(e)}if(r.stderr.length>0){let e=r.stderr.map(t=>p.default.gray(` ${t}`)).join(`
|
|
295
|
+
`);console.log(p.default.red(" Errors:")),console.log(e)}}U.command("clean").description("Remove all data in ~/.pipeliner (schedules, daemon state, workflow history). Use after upgrades if data is incompatible.").action(async()=>{if((await new I().prompt(`This will remove all data in ${p.default.yellow(de)} (schedules, daemon PID, workflow history). Continue?`,[{id:"yes",label:"Yes, remove all"},{id:"no",label:"No, cancel"}]))?.id!=="yes"){console.log(p.default.yellow(`
|
|
296
|
+
\u2717 Cancelled`));return}try{if(await H()){let t=await $();console.log(p.default.gray(`Stopping scheduler daemon (PID: ${t.pid})...`)),await new z().stopDaemon(),console.log(p.default.gray(" Daemon stopped"))}(0,O.existsSync)(de)?(await(0,F.rm)(de,{recursive:!0}),console.log(p.default.green(`
|
|
297
|
+
\u2713 Removed ${de}`))):console.log(p.default.gray(`
|
|
298
|
+
${de} does not exist (already clean)`))}catch(t){let o=t instanceof Error?t.message:String(t);console.error(p.default.red(`
|
|
299
|
+
\u2717 Clean failed: ${o}`)),process.exit(1)}});U.parse();
|