task-pipeliner 0.2.16 → 0.2.18
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.ko.md +188 -1
- package/README.md +188 -1
- package/dist/index.cjs +82 -58
- package/package.json +3 -1
package/README.ko.md
CHANGED
@@ -2,7 +2,7 @@
 
 > 조건 기반 작업 파이프라인 실행기로 아름다운 CLI 출력을 제공합니다
 
-**버전:** 0.2.16
+**버전:** 0.2.18
 
 
 

@@ -29,6 +29,8 @@
 
 - **실행 히스토리** - 상세한 단계별 기록으로 과거 워크플로우 실행 추적 및 검토
 
+- **워크플로우 스케줄링** - cron 표현식을 사용하여 지정된 시간에 워크플로우 자동 실행
+
 ## 리소스
 
 ### 문서 및 도구

@@ -69,6 +71,19 @@ tp history remove # 특정 히스토리 삭제
 tp history remove-all          # 모든 히스토리 삭제
 ```
 
+**워크플로우 스케줄링:**
+```bash
+tp schedule                    # 모든 스케줄 보기
+tp schedule add schedules.yaml # 스케줄 파일에서 스케줄 추가
+tp schedule remove             # 스케줄 삭제
+tp schedule remove-all         # 모든 스케줄 삭제
+tp schedule toggle             # 스케줄 활성화/비활성화
+tp schedule start              # 포그라운드 모드로 스케줄러 시작
+tp schedule start -d           # 백그라운드 데몬 모드로 스케줄러 시작
+tp schedule stop               # 스케줄러 데몬 종료
+tp schedule status             # 데몬 상태 확인 (실시간 모드, Ctrl+C로 종료)
+```
+
 ## 🚀 빠른 시작
 
 ### 설치

@@ -1318,6 +1333,178 @@ tp history remove-all -y # 확인 건너뛰기
 
 ---
 
+## ⏰ 워크플로우 스케줄링
+
+cron 표현식을 사용하여 지정된 시간에 워크플로우를 자동으로 실행하도록 예약할 수 있습니다.
+
+### 스케줄 추가
+
+스케줄을 정의하는 스케줄 파일(YAML 또는 JSON)을 생성하세요:
+
+**YAML (`schedules.yaml`):**
+```yaml
+schedules:
+  - name: Daily Build           # 스케줄 별칭 (구분용)
+    cron: "0 9 * * *"           # Cron 표현식
+    workflow: ./build.yaml      # 스케줄 파일 기준 상대 경로
+
+  - name: Nightly Test
+    cron: "0 2 * * *"
+    workflow: ./test.yaml
+    silent: true                # 선택사항: 무음 모드로 실행
+
+  - name: Production Deploy
+    cron: "0 18 * * 5"          # 매주 금요일 오후 6시
+    workflow: ./deploy.yaml
+    profile: Production         # 선택사항: 특정 프로필 사용
+
+  - name: Hourly Check
+    cron: "0 * * * *"
+    workflow: simple.yaml
+    baseDir: /path/to/workflows # 선택사항: 워크플로우 경로의 기준 디렉토리
+```
+
+**필드 설명:**
+- `name`: 스케줄을 구분하기 위한 별칭
+- `cron`: 실행 시간 (cron 표현식)
+- `workflow`: 워크플로우 파일 경로 (스케줄 파일 또는 `baseDir` 기준 상대 경로, 또는 절대 경로)
+- `baseDir`: (선택사항) 워크플로우 경로의 기준 디렉토리 (기본값: 스케줄 파일 디렉토리)
+- `silent`: (선택사항) 무음 모드로 실행 (콘솔 출력 억제)
+- `profile`: (선택사항) 사용할 프로필 이름 (프로필이 있는 워크플로우용)
+
+**경로 해석 방식:**
+기본적으로 상대 워크플로우 경로는 스케줄 파일의 디렉토리를 기준으로 해석됩니다. 즉, 스케줄 파일과 워크플로우가 같은 폴더에 있으면 `./workflow.yaml`만 쓰면 됩니다. 다른 기준 디렉토리가 필요하면 `baseDir`을 사용하세요.
+
+**JSON (`schedules.json`):**
+```json
+{
+  "schedules": [
+    {
+      "name": "Daily Build",
+      "cron": "0 9 * * *",
+      "workflow": "./build.yaml"
+    },
+    {
+      "name": "Nightly Test",
+      "cron": "0 2 * * *",
+      "workflow": "./test.yaml",
+      "silent": true
+    },
+    {
+      "name": "Production Deploy",
+      "cron": "0 18 * * 5",
+      "workflow": "./deploy.yaml",
+      "profile": "Production"
+    }
+  ]
+}
+```
+
+파일에서 모든 스케줄을 추가:
+
+```bash
+tp schedule add schedules.yaml
+```
+
+각 스케줄에 대해 별칭을 확인하거나 변경할 수 있습니다.
+
+**Cron 표현식 형식:**
+
+5자리(표준) 또는 **6자리(초 포함, node-cron 확장)** 지원:
+
+```
+# 6자리 (초 선택사항)
+# ┌────────────── 초 (0-59, 선택)
+# │ ┌──────────── 분 (0-59)
+# │ │ ┌────────── 시 (0-23)
+# │ │ │ ┌──────── 일 (1-31)
+# │ │ │ │ ┌────── 월 (1-12)
+# │ │ │ │ │ ┌──── 요일 (0-7)
+# │ │ │ │ │ │
+# * * * * * *
+```
+
+**일반적인 예시 (5자리):**
+- `0 9 * * *` - 매일 오전 9시
+- `0 0 * * 1` - 매주 월요일 자정
+- `*/15 * * * *` - 15분마다
+- `0 */2 * * *` - 2시간마다
+- `0 9 * * 1-5` - 평일 오전 9시
+
+**초 포함 (6자리):**
+- `* * * * * *` - 매초
+- `*/5 * * * * *` - 5초마다
+- `0 * * * * *` - 매분 (5자리 `* * * * *`와 동일)
+
+### 스케줄 관리
+
+```bash
+# 모든 스케줄 목록 보기
+tp schedule list
+
+# 스케줄 삭제
+tp schedule remove
+
+# 모든 스케줄 삭제
+tp schedule remove-all
+
+# 스케줄 활성화/비활성화
+tp schedule toggle
+```
+
+### 스케줄러 실행
+
+예약된 시간에 워크플로우를 실행하려면 스케줄러를 시작하세요. 두 가지 모드로 실행할 수 있습니다:
+
+**포그라운드 모드:**
+```bash
+tp schedule start
+```
+- 포그라운드에서 실행됩니다 (터미널에 연결됨)
+- `Ctrl+C`를 눌러 스케줄러를 중지합니다
+- 테스트나 임시 스케줄링에 유용합니다
+
+**데몬 모드 (백그라운드):**
+```bash
+tp schedule start -d
+```
+- 백그라운드 데몬 프로세스로 실행됩니다
+- 터미널을 닫아도 계속 실행됩니다
+- 한 번에 하나의 데몬 인스턴스만 실행 가능합니다 (중복 실행 방지)
+- `tp schedule stop`으로 데몬을 종료합니다
+
+**데몬 상태 확인:**
+```bash
+tp schedule status
+```
+- systemctl 스타일의 실시간 데몬 상태를 표시합니다
+- 다음 정보를 표시합니다:
+  - 데몬 상태 (active/inactive)
+  - 프로세스 ID (PID)
+  - 시작 시간 및 업타임
+  - 모든 스케줄과 상태 (active/inactive)
+  - 각 스케줄의 마지막 실행 시간
+- 1초마다 자동으로 업데이트됩니다
+- `Ctrl+C`를 눌러 종료합니다 (데몬은 계속 실행됩니다)
+
+스케줄러는:
+- 예약된 시간에 워크플로우를 실행합니다
+- 모든 실행을 `~/.pipeliner/workflow-history/`에 기록합니다
+- 중복 데몬 인스턴스 실행을 방지합니다 (한 번에 하나만 실행 가능)
+
+### 스케줄 저장
+
+스케줄은 `~/.pipeliner/schedules/schedules.json`에 저장됩니다. 각 스케줄은 다음을 포함합니다:
+- 고유 ID
+- 워크플로우 경로
+- Cron 표현식
+- 활성화/비활성화 상태
+- 마지막 실행 시간
+
+예약된 모든 워크플로우 실행은 수동 실행과 동일한 히스토리 디렉토리(`~/.pipeliner/workflow-history/`)에 기록되므로, `tp history`를 사용하여 검토할 수 있습니다.
+
+---
+
 ## 📚 예제
 
 ### 프로젝트 예제
package/README.md
CHANGED
@@ -2,7 +2,7 @@
 
 > A powerful, condition-based task pipeline runner with beautiful CLI output
 
-**Version:** 0.2.16
+**Version:** 0.2.18
 
 
 

@@ -29,6 +29,8 @@
 
 - **Execution history** - Track and review past workflow executions with detailed step-by-step records
 
+- **Workflow scheduling** - Schedule workflows to run automatically at specified times using cron expressions
+
 ## 🔗 Resources
 
 ### Documentation & Tools

@@ -69,6 +71,19 @@ tp history remove # Remove a specific history
 tp history remove-all          # Remove all histories
 ```
 
+**Workflow Scheduling:**
+```bash
+tp schedule                    # View all schedules
+tp schedule add schedules.yaml # Add schedules from a schedule file
+tp schedule remove             # Remove a schedule
+tp schedule remove-all         # Remove all schedules
+tp schedule toggle             # Enable/disable a schedule
+tp schedule start              # Start scheduler in foreground mode
+tp schedule start -d           # Start scheduler daemon in background
+tp schedule stop               # Stop the scheduler daemon
+tp schedule status             # Check daemon status (real-time mode, press Ctrl+C to exit)
+```
+
 ## 🚀 Quick Start
 
 ### Installation

@@ -1319,6 +1334,178 @@ Each record contains:
 
 ---
 
+## ⏰ Workflow Scheduling
+
+Schedule workflows to run automatically at specified times using cron expressions.
+
+### Adding Schedules
+
+Create a schedule file (YAML or JSON) defining your schedules:
+
+**YAML (`schedules.yaml`):**
+```yaml
+schedules:
+  - name: Daily Build           # Schedule alias (for identification)
+    cron: "0 9 * * *"           # Cron expression
+    workflow: ./build.yaml      # Path relative to schedule file
+
+  - name: Nightly Test
+    cron: "0 2 * * *"
+    workflow: ./test.yaml
+    silent: true                # Optional: run in silent mode
+
+  - name: Production Deploy
+    cron: "0 18 * * 5"          # Every Friday at 6 PM
+    workflow: ./deploy.yaml
+    profile: Production         # Optional: use specific profile
+
+  - name: Hourly Check
+    cron: "0 * * * *"
+    workflow: simple.yaml
+    baseDir: /path/to/workflows # Optional: base directory for workflow path
+```
+
+**Field Descriptions:**
+- `name`: Alias to identify the schedule
+- `cron`: Execution time (cron expression)
+- `workflow`: Path to workflow file (relative to schedule file or `baseDir`, or absolute)
+- `baseDir`: (Optional) Base directory for workflow path (defaults to schedule file's directory)
+- `silent`: (Optional) Run in silent mode, suppressing console output
+- `profile`: (Optional) Profile name to use (for workflows with profiles)
+
+**Path Resolution:**
+By default, relative workflow paths are resolved from the schedule file's directory. This means if your schedule file and workflow are in the same folder, you can simply use `./workflow.yaml`. Use `baseDir` to specify a different base directory if needed.
+
+**JSON (`schedules.json`):**
+```json
+{
+  "schedules": [
+    {
+      "name": "Daily Build",
+      "cron": "0 9 * * *",
+      "workflow": "./build.yaml"
+    },
+    {
+      "name": "Nightly Test",
+      "cron": "0 2 * * *",
+      "workflow": "./test.yaml",
+      "silent": true
+    },
+    {
+      "name": "Production Deploy",
+      "cron": "0 18 * * 5",
+      "workflow": "./deploy.yaml",
+      "profile": "Production"
+    }
+  ]
+}
+```
+
+Then add all schedules from the file:
+
+```bash
+tp schedule add schedules.yaml
+```
+
+You'll be prompted to confirm or override the alias for each schedule.
+
+**Cron Expression Format:**
+
+5 fields (standard) or **6 fields with seconds** (node-cron extension):
+
+```
+# 6 fields (optional seconds)
+# ┌────────────── second (0-59, optional)
+# │ ┌──────────── minute (0-59)
+# │ │ ┌────────── hour (0-23)
+# │ │ │ ┌──────── day of month (1-31)
+# │ │ │ │ ┌────── month (1-12)
+# │ │ │ │ │ ┌──── day of week (0-7)
+# │ │ │ │ │ │
+# * * * * * *
+```
+
+**Common Examples (5 fields):**
+- `0 9 * * *` - Daily at 9:00 AM
+- `0 0 * * 1` - Weekly on Monday at midnight
+- `*/15 * * * *` - Every 15 minutes
+- `0 */2 * * *` - Every 2 hours
+- `0 9 * * 1-5` - Weekdays at 9:00 AM
+
+**With seconds (6 fields):**
+- `* * * * * *` - Every second
+- `*/5 * * * * *` - Every 5 seconds
+- `0 * * * * *` - Every minute (same as `* * * * *`)
+
+### Managing Schedules
+
+```bash
+# List all schedules
+tp schedule list
+
+# Remove a schedule
+tp schedule remove
+
+# Remove all schedules
+tp schedule remove-all
+
+# Enable/disable a schedule
+tp schedule toggle
+```
+
+### Running the Scheduler
+
+Start the scheduler to run workflows at their scheduled times. You can run it in two modes:
+
+**Foreground Mode:**
+```bash
+tp schedule start
+```
+- Runs in the foreground (attached to your terminal)
+- Press `Ctrl+C` to stop the scheduler
+- Useful for testing or temporary scheduling
+
+**Daemon Mode (Background):**
+```bash
+tp schedule start -d
+```
+- Runs as a background daemon process
+- Continues running even after closing the terminal
+- Only one daemon instance can run at a time (duplicate execution is prevented)
+- Use `tp schedule stop` to stop the daemon
+
+**Checking Daemon Status:**
+```bash
+tp schedule status
+```
+- Shows real-time daemon status with a systemctl-style display
+- Displays:
+  - Daemon state (active/inactive)
+  - Process ID (PID)
+  - Start time and uptime
+  - All schedules with their status (active/inactive)
+  - Last run time for each schedule
+- Updates every second automatically
+- Press `Ctrl+C` to exit (the daemon continues running)
+
+The scheduler will:
+- Execute workflows at their scheduled times
+- Log all executions to `~/.pipeliner/workflow-history/`
+- Prevent duplicate daemon instances (only one can run at a time)
+
+### Schedule Storage
+
+Schedules are stored in `~/.pipeliner/schedules/schedules.json`. Each schedule includes:
+- Unique ID
+- Workflow path
+- Cron expression
+- Enabled/disabled status
+- Last execution time
+
+All scheduled workflow executions are logged to the same history directory as manual runs (`~/.pipeliner/workflow-history/`), so you can review them using `tp history`.
+
+---
+
 ## 📚 Examples
 
 ### Project Examples
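The 5-field/6-field equivalence the README describes (a standard 5-field expression behaves like a 6-field one whose seconds field is `0`, e.g. `0 * * * * *` is the same as `* * * * *`) can be sketched as follows. `normalizeCron` is a hypothetical helper for illustration, not a task-pipeliner or node-cron function:

```javascript
// Hypothetical helper (not part of task-pipeliner): normalizes a standard
// 5-field cron expression to the 6-field form with a leading seconds field,
// so both forms can be compared or validated uniformly.
function normalizeCron(expr) {
  const fields = expr.trim().split(/\s+/);
  if (fields.length === 5) return ['0', ...fields].join(' '); // fire at second 0
  if (fields.length === 6) return fields.join(' ');           // already extended
  throw new Error(`Expected 5 or 6 cron fields, got ${fields.length}`);
}

console.log(normalizeCron('0 9 * * *'));     // -> "0 0 9 * * *" (daily at 09:00:00)
console.log(normalizeCron('*/5 * * * * *')); // -> "*/5 * * * * *" (unchanged)
```

Under this reading, `0 9 * * 1-5` (weekdays at 9 AM) normalizes to `0 0 9 * * 1-5`, which is why the same schedule can be written in either style.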
package/dist/index.cjs
CHANGED
@@ -1,25 +1,25 @@
 #!/usr/bin/env node
-"use strict";var
-`){
-`)),
-`)),
-`),
-`);let
-`);){let
-`),
-`).forEach(
-`)}),e.stderr.forEach(
-`)});let
-Total execution time: ${
-`)):(0,
-`))}displayParallelResults(e,t,
+
"use strict";var Dt=Object.create;var Se=Object.defineProperty;var Mt=Object.getOwnPropertyDescriptor;var It=Object.getOwnPropertyNames;var jt=Object.getPrototypeOf,Tt=Object.prototype.hasOwnProperty;var Nt=(r,e)=>()=>(r&&(e=r(r=0)),e);var At=(r,e)=>{for(var t in e)Se(r,t,{get:e[t],enumerable:!0})},Ft=(r,e,t,o)=>{if(e&&typeof e=="object"||typeof e=="function")for(let s of It(e))!Tt.call(r,s)&&s!==t&&Se(r,s,{get:()=>e[s],enumerable:!(o=Mt(e,s))||o.enumerable});return r};var b=(r,e,t)=>(t=r!=null?Dt(jt(r)):{},Ft(e||!r||!r.__esModule?Se(t,"default",{value:r,enumerable:!0}):t,r));var rt={};At(rt,{DAEMON_DIR:()=>he,getDaemonPid:()=>Me,getDaemonStartTime:()=>ot,getDaemonStatus:()=>P,isDaemonRunning:()=>W,removeDaemonPid:()=>ne,saveDaemonPid:()=>Xt});function Gt(r){try{return process.kill(r,0),!0}catch{return!1}}async function Me(){try{if(!(0,Z.existsSync)(j))return null;let r=await(0,k.readFile)(j,"utf-8"),e=parseInt(r.trim(),10);return isNaN(e)?(await(0,k.unlink)(j),null):Gt(e)?e:(await(0,k.unlink)(j),null)}catch(r){if(r instanceof Error&&"code"in r&&r.code==="ENOENT")return null;throw r}}async function W(){return await Me()!==null}async function Xt(){await(0,k.mkdir)(he,{recursive:!0}),await(0,k.writeFile)(j,process.pid.toString(),"utf-8");let r=new Date().toISOString();await(0,k.writeFile)(re,r,"utf-8")}async function ne(){try{(0,Z.existsSync)(j)&&await(0,k.unlink)(j),(0,Z.existsSync)(re)&&await(0,k.unlink)(re)}catch{}}async function ot(){try{if((0,Z.existsSync)(re)){let e=(await(0,k.readFile)(re,"utf-8")).trim();if(e)return e}if((0,Z.existsSync)(j)){let r=await(0,k.stat)(j);return new Date(r.mtime).toISOString()}return null}catch{return null}}async function P(){let r=await Me(),e=r?await ot():null;return{running:r!==null,pid:r,startTime:e}}var Z,k,tt,me,he,j,re,ge=Nt(()=>{"use 
strict";Z=require("fs"),k=require("fs/promises"),tt=require("os"),me=require("path"),he=(0,me.join)((0,tt.homedir)(),".pipeliner","daemon"),j=(0,me.join)(he,"scheduler.pid"),re=(0,me.join)(he,"scheduler.started")});var $t=require("child_process"),Fe=require("fs"),Rt=require("fs/promises"),Q=require("path"),Pt=require("util"),Le=b(require("boxen"),1),p=b(require("chalk"),1),Et=require("commander"),Ct=b(require("dayjs"),1);var B=require("path"),Je=b(require("chalk"),1),de=b(require("log-update"),1);var Be=b(require("readline"),1),R=b(require("chalk"),1),ve=b(require("inquirer"),1),Oe=15,L=class{searchable;constructor(e=!1){this.searchable=e}async prompt(e,t){if(this.searchable)return this.promptWithSearch(e,t);let{choice:o}=await ve.default.prompt([{type:"list",name:"choice",message:R.default.cyan(e),choices:t.map(n=>({name:n.label,value:n.id})),pageSize:Oe}]),s=t.find(n=>n.id===o);if(!s)throw new Error(`Invalid choice: ${o}`);return s}async promptWithSearch(e,t){return new Promise(o=>{let s="",n=0,a=[...t],i=Be.createInterface({input:process.stdin,output:process.stdout,terminal:!1});process.stdin.isTTY&&process.stdin.setRawMode(!0),process.stdout.write("\x1B[?1049h"),process.stdout.write("\x1B[?25l");let l=()=>{process.stdout.write("\x1B[H\x1B[2J"),console.log(R.default.cyan(`? 
${e}`));let m=s?R.default.gray(` Filter: ${s}`)+R.default.gray(` (${a.length}/${t.length})`):R.default.gray(" Type to filter, \u2191\u2193 to navigate, Enter to select");console.log(m),console.log();let d=Oe,w=0,S=a.length;if(a.length>d){let v=Math.floor(d/2);w=Math.max(0,n-v),S=Math.min(a.length,w+d),S===a.length&&(w=Math.max(0,S-d))}if(a.length===0)console.log(R.default.yellow(" No matches found"));else{w>0&&console.log(R.default.gray(` \u2191 ${w} more above`));for(let v=w;v<S;v++){let $=a[v];console.log(v===n?R.default.cyan(`\u276F ${$.label}`):R.default.white(` ${$.label}`))}S<a.length&&console.log(R.default.gray(` \u2193 ${a.length-S} more below`))}},c=()=>{let m=s.toLowerCase();a=m?t.filter(d=>d.label.toLowerCase().includes(m)):[...t],n>=a.length&&(n=Math.max(0,a.length-1))},f=m=>{let d=m.toString();if(d===""&&(g(),process.exit(0)),d==="\r"||d===`
+
`){a.length>0&&(g(),o(a[n]));return}if(d==="\x1B"&&m.length===1){s&&(s="",c(),l());return}if(d==="\x1B[A"){a.length>0&&(n=n>0?n-1:a.length-1,l());return}if(d==="\x1B[B"){a.length>0&&(n=n<a.length-1?n+1:0,l());return}if(d==="\x7F"||d==="\b"){s.length>0&&(s=s.slice(0,-1),c(),l());return}d.length===1&&d>=" "&&d<="~"&&(s+=d,c(),l())},g=()=>{process.stdin.removeListener("data",f),process.stdin.isTTY&&process.stdin.setRawMode(!1),i.close(),process.stdout.write("\x1B[?25h"),process.stdout.write("\x1B[?1049l")};l(),process.stdin.on("data",f)})}},ie=class{async prompt(e,t){let{value:o}=await ve.default.prompt([{type:"input",name:"value",message:R.default.cyan(e),default:t}]);return o}};var ae=b(require("boxen"),1),V=b(require("chalk"),1);function ke(r,e,t,o={}){let{borderColor:s="cyan",isNested:n=!1}=o,a;e!==void 0&&(t?a=`line ${e} in ${t}`:a=`line ${e}`);let i=n?`\u2502 ${r}`:`> ${r}`;return(0,ae.default)(i,{title:a,borderStyle:"round",padding:{top:0,bottom:0,left:1,right:1},margin:{top:0,bottom:0,left:0,right:0},borderColor:s})}function H(r,e=!1,t){let o=r?"\u2713 Completed":"\u2717 Failed",s=r?V.default.green(o):V.default.red(o);if(t!==void 0){let n=Y(t);return`${s} ${V.default.gray(`(${n})`)}`}return s}function te(r){return(0,ae.default)(`\u2717 ${r}`,{borderStyle:"round",padding:{top:0,bottom:0,left:1,right:1},margin:{top:0,bottom:0,left:0,right:0},borderColor:"red"})}function We(r){return(0,ae.default)(`> Starting parallel execution (${r} branches)`,{borderStyle:"round",padding:{top:0,bottom:0,left:1,right:1},margin:{top:0,bottom:0,left:0,right:0},borderColor:"yellow"})}function _e(r){let e=r?"\u2713 All parallel branches completed":"\u2717 Some parallel branches failed";return r?V.default.green(e):V.default.red(e)}function xe(r,e=!1){return`${e?"| \u2502 ":"\u2502 "}${r}`}function Y(r){return`${(r/1e3).toFixed(3)}s`}var Ve=require("fs"),He=require("path"),U=class{constructor(e){this.workspace=e}evaluate(e){return"var"in e||"has"in e?this.evaluateVarExists(e):"file"in 
e?this.evaluateFileExists(e):"choice"in e?this.evaluateChoice(e):"all"in e?this.evaluateAll(e):"any"in e?this.evaluateAny(e):"not"in e?this.evaluateNot(e):!1}evaluateVarExists(e){if(e.has)return this.workspace.hasVariable(e.has)||this.workspace.hasFact(e.has);if(!e.var)return!1;if(typeof e.var=="object"){for(let[o,s]of Object.entries(e.var)){let n=this.workspace.getVariable(o),a=this.workspace.getFact(o),i=n??(a!==void 0?a.toString():void 0);if(i===void 0||i!==s)return!1}return!0}let t=e.var;return this.workspace.hasVariable(t)||this.workspace.hasFact(t)}evaluateFileExists(e){try{let t=e.file.trim(),o=(0,He.resolve)(process.cwd(),t);return(0,Ve.existsSync)(o)}catch{return!1}}evaluateChoice(e){return this.workspace.hasChoice(e.choice)}evaluateAll(e){return e.all.every(t=>this.evaluate(t))}evaluateAny(e){return e.any.some(t=>this.evaluate(t))}evaluateNot(e){return!this.evaluate(e.not)}};var E=require("fs/promises"),Ye=require("os"),O=require("path"),Ue=b(require("dayjs"),1),J=(0,O.join)((0,Ye.homedir)(),".pipeliner","workflow-history"),z=class{constructor(){}async saveHistory(e){await(0,E.mkdir)(J,{recursive:!0});let t=(0,Ue.default)().format("YYYY-MM-DD_HH-mm-ss"),o=Math.random().toString(36).slice(2,6),s=(0,O.join)(J,`workflow-${t}-${o}.json`);return await(0,E.writeFile)(s,JSON.stringify(e,null,2),{encoding:"utf8"}),s}async clearAllHistories(){await(0,E.rm)(J,{recursive:!0,force:!0})}async removeHistory(e){await(0,E.rm)((0,O.join)(J,e),{force:!0})}async getHistoryNames(){try{let t=(await(0,E.readdir)(J)).map(o=>(0,O.basename)(o));return t.sort((o,s)=>{let n=l=>{let c=l.match(/workflow-(\d{4}-\d{2}-\d{2}_\d{2}-\d{2}-\d{2})-/);return c?c[1]:""},a=n(o),i=n(s);return a===i?s.localeCompare(o):i.localeCompare(a)}),t}catch(e){if(e instanceof Error&&"code"in e&&e.code==="ENOENT")return[];throw e}}async getHistory(e){let t=await(0,E.readFile)((0,O.join)(J,e),{encoding:"utf8"});return JSON.parse(t)}};var 
le=class{records=[];initialTimestamp=Date.now();recordStartTimestamp=Date.now();constructor(){this.records=[]}recordStart(){this.recordStartTimestamp=Date.now()}recordEnd(e,t,o,s){let n=this.getDuration();return this.records.push({step:e,context:t,output:o,duration:n,status:s}),n}reset(){this.records=[],this.initialTimestamp=Date.now()}async save(){let e=new z,t={initialTimestamp:this.initialTimestamp,records:this.records};return await e.saveHistory(t)}getDuration(){return Date.now()-this.recordStartTimestamp}};var $e=require("child_process");var ce=class{async run(e,t,o,s,n=!1,a=!1,i,l,c,f,g){return n?this.runBuffered(e,c,f,g):this.runRealtime(e,o||e,a,i,l,c,f,g)}async runBuffered(e,t,o,s){return new Promise((n,a)=>{let i=this.spawnWithShell(e,t,s),l=[],c=[],f="",g="",m=null;o&&o>0&&(m=setTimeout(()=>{i.kill("SIGTERM");let d=`Command timed out after ${o} seconds`;c.push(d),n({success:!1,stdout:l,stderr:c})},o*1e3)),i.stdout?.on("data",d=>{let w=d.toString(),{lines:S,remaining:v}=this.processStreamBuffer(w,f);l.push(...S),f=v}),i.stderr?.on("data",d=>{let w=d.toString(),{lines:S,remaining:v}=this.processStreamBuffer(w,g);c.push(...S),g=v}),i.on("close",d=>{m&&clearTimeout(m),f.trim()&&l.push(f),g.trim()&&c.push(g),n({success:d===0,stdout:l,stderr:c})}),i.on("error",d=>{m&&clearTimeout(m);let w=`Error: ${d.message}`;n({success:!1,stdout:l,stderr:[...c,w]})})})}async runRealtime(e,t,o,s,n,a,i,l){let f=ke(t,s,n,{borderColor:o?"green":"cyan"});console.log(f);let g=Date.now();return new Promise(m=>{let d=this.spawnWithShell(e,a,l),w="",S="",v=null;i&&i>0&&(v=setTimeout(()=>{d.kill("SIGTERM");let $=`Command timed out after ${i} seconds`,D=te($);console.error(D);let A=Date.now()-g,F=H(!1,!1,A);console.log(F),m(!1)},i*1e3)),d.stdout?.on("data",$=>{let D=$.toString(),{lines:A,remaining:F}=this.processStreamBuffer(D,w);A.forEach(be=>process.stdout.write(`\u2502 ${be}
+
`)),w=F}),d.stderr?.on("data",$=>{let D=$.toString(),{lines:A,remaining:F}=this.processStreamBuffer(D,S);A.forEach(be=>process.stderr.write(`\u2502 ${be}
+
`)),S=F}),d.on("close",$=>{v&&clearTimeout(v),w.trim()&&process.stdout.write(`\u2502 ${w}
+
`),S.trim()&&process.stderr.write(`\u2502 ${S}
+
`);let D=$===0,A=Date.now()-g,F=H(D,!1,A);console.log(F),m(D)}),d.on("error",$=>{v&&clearTimeout(v);let D=te(`Error: ${$.message}`);console.error(D),m(!1)})})}createSpawnOptions(e){let t={stdio:["inherit","pipe","pipe"],shell:!0};return e&&(t.cwd=e),t}spawnWithShell(e,t,o){if(o&&o.length>0){let s=o[0],n=[...o.slice(1),e],a={stdio:["inherit","pipe","pipe"]};return t&&(a.cwd=t),(0,$e.spawn)(s,n,a)}else{let s=process.env.SHELL||(process.platform==="win32"?"cmd.exe":"/bin/sh"),n=process.platform==="win32"?"/c":"-c",a={stdio:["inherit","pipe","pipe"]};return t&&(a.cwd=t),(0,$e.spawn)(s,[n,e],a)}}processStreamBuffer(e,t){let o=t+e,s=[],n=o;for(;n.includes(`
+
`);){let a=n.indexOf(`
+
`),i=n.substring(0,a);n=n.substring(a+1),s.push(i)}return{lines:s,remaining:n}}formatNestedOutput(e,t){t?e.split(`
+
`).forEach(o=>{o.trim()&&console.log(`| ${o}`)}):console.log(e)}displayBufferedOutput(e,t,o=!1,s,n){let a=ke(t,s,n,{borderColor:"cyan",isNested:o});this.formatNestedOutput(a,o),e.stdout.forEach(l=>{let c=xe(l,o);process.stdout.write(`${c}
+
`)}),e.stderr.forEach(l=>{let c=xe(l,o);process.stderr.write(`${c}
+
`)});let i=H(e.success,o);console.log(i)}};function Lt(r,e,t){if(e.hasVariable(r)){let o=e.getVariable(r);return o??t}if(e.hasFact(r)){let o=e.getFact(r);return typeof o=="string"?o:String(o)}if(e.hasChoice(r)){let o=e.getChoice(r);return o??t}return t}function ue(r,e){let t=/\{\{\s*(\w+)\s*\}\}/g;return r.replace(t,(o,s)=>Lt(s,e,o))}var pe=class r{state;constructor(){this.state={facts:new Map,choices:new Map,variables:new Map,stepResults:new Map,lastStepIndex:-1}}hasFact(e){return this.state.facts.has(e)}getFact(e){return this.state.facts.get(e)}setFact(e,t){this.state.facts.set(e,t)}getFactStatus(e){if(!this.hasFact(e))return"pending";let t=this.getFact(e);return t===!1||t==="failed"?"failed":"ready"}getAllFacts(){return new Map(this.state.facts)}hasChoice(e){return this.state.choices.has(e)}getChoice(e){return this.state.choices.get(e)}setChoice(e,t){this.state.choices.set(e,t)}hasVariable(e){return this.state.variables.has(e)}getVariable(e){return this.state.variables.get(e)}setVariable(e,t){this.state.variables.set(e,t)}getAllVariables(){return new Map(this.state.variables)}setStepResult(e,t,o){this.state.stepResults.set(e,{success:t,exitCode:o}),this.state.lastStepIndex=e}getStepResult(e){return this.state.stepResults.get(e)}getLastStepResult(){if(this.state.lastStepIndex!==-1)return this.state.stepResults.get(this.state.lastStepIndex)}clone(){let e=new r;return e.state.facts=new Map(this.state.facts),e.state.choices=new Map(this.state.choices),e.state.variables=new Map(this.state.variables),e.state.stepResults=new Map(this.state.stepResults),e.state.lastStepIndex=this.state.lastStepIndex,e}};var q=class r{static PARALLEL_STEP_INDEX_MULTIPLIER=1e3;workspace;taskRunner;choicePrompt;textPrompt;baseDir;globalShell;constructor(){this.workspace=new pe,this.taskRunner=new ce,this.choicePrompt=new L,this.textPrompt=new ie}resolveBaseDir(e){if(e.baseDir)if((0,B.isAbsolute)(e.baseDir))this.baseDir=e.baseDir;else if(e._filePath){let 
t=(0,B.dirname)(e._filePath);this.baseDir=(0,B.resolve)(t,e.baseDir)}else this.baseDir=(0,B.resolve)(process.cwd(),e.baseDir)}createStepContext(e,t){let o={workspace:this.workspace,stepIndex:e};return t._lineNumbers&&(o.lineNumber=t._lineNumbers.get(e)),t._fileName&&(o.fileName=t._fileName),o}evaluateStepCondition(e){return e.when?new U(this.workspace).evaluate(e.when):!0}calculateBaseStepIndex(e){return e.branchIndex===void 0?e.stepIndex:Math.floor(e.stepIndex/r.PARALLEL_STEP_INDEX_MULTIPLIER)}isRunStep(e){return"run"in e}async execute(e,t){if(t?.profileVars&&Object.keys(t.profileVars).length>0)for(let[i,l]of Object.entries(t.profileVars))this.workspace.setVariable(i,l);this.resolveBaseDir(e),this.globalShell=e.shell;let o=new le,s=Date.now();for(let i=0;i<e.steps.length;i++){let l=e.steps[i],c=this.createStepContext(i,e),f=!!l.when;if(this.evaluateStepCondition(l)){o.recordStart();try{let g=await this.executeStep(l,c,!1,f);this.handleStepResult(l,c,i,g,o)}catch(g){throw this.handleStepError(l,c,i,g,o),g}}}let n=Date.now()-s,a=Y(n);console.log(Je.default.cyan(`
+
Total execution time: ${a}`)),await o.save(),o.reset()}isStepSuccessful(e,t){return"run"in t?typeof e=="boolean"?e:e&&typeof e=="object"&&"success"in e?e.success:!1:!0}handleStepResult(e,t,o,s,n){let a=this.isRunStep(e)?(()=>{let c=this.workspace.getStepResult(o);return c?c.success:!0})():this.isStepSuccessful(s,e),i=a?"success":"failure",l=n.recordEnd(e,t,s,i);if(!this.isRunStep(e)){let c=H(a,!1,l);console.log(c)}if(this.isRunStep(e)){if(e.continue===!1){let c=t.lineNumber?` (line ${t.lineNumber})`:"",f=a?`Step ${o}${c} completed, but workflow stopped due to continue: false`:`Step ${o}${c} failed`;throw new Error(f)}if(!a&&e.continue!==!0){let c=t.lineNumber?` (line ${t.lineNumber})`:"";throw new Error(`Step ${o}${c} failed`)}}}handleStepError(e,t,o,s,n){this.workspace.setStepResult(o,!1);let a=s instanceof Error?s.message:String(s),i={success:!1,stdout:[],stderr:[a]};n.recordEnd(e,t,i,"failure")}fixMalformedStep(e){let o=e;return"choose"in e&&o.choose===null&&"message"in e&&"options"in e?{choose:{message:o.message,options:o.options,as:o.as},when:o.when}:"prompt"in e&&o.prompt===null&&"message"in e&&"as"in e?{prompt:{message:o.message,as:o.as,default:o.default},when:o.when}:e}async executeStep(e,t,o=!1,s=!1){if(e=this.fixMalformedStep(e),"run"in e){let n=await this.executeRunStep(e,t,o,s);return o&&typeof n=="object"&&"stdout"in n,n}if("choose"in e){await this.executeChooseStep(e,t);return}if("prompt"in e){await this.executePromptStep(e,t);return}if("parallel"in e){await this.executeParallelStep(e,t);return}if("fail"in e){await this.executeFailStep(e,t);return}}async executeSingleRun(e,t,o=!1,s=!1){let n=this.calculateBaseStepIndex(t),a=ue(e.run,this.workspace),i=e.shell||this.globalShell,l=e.retry??0,c=e.timeout,f=!1,g=0;for(;g<=l;){let m=await this.taskRunner.run(a,n,a,t.branchIndex,o,s,t.lineNumber,t.fileName,this.baseDir,c,i),d=typeof m=="boolean"?m:m.success;if(f=m,d||g>=l)break;if(g++,g<=l){let w=Math.min(1e3*Math.pow(2,g-1),1e4);await new 
Promise(S=>setTimeout(S,w))}}return f}async executeRunStep(e,t,o=!1,s=!1){let n=await this.executeSingleRun({run:e.run,timeout:e.timeout,retry:e.retry,shell:e.shell},t,o,s),a=typeof n=="boolean"?n:n.success;if(this.workspace.setStepResult(t.stepIndex,a),a||!e.onError)return n;let i={run:e.onError.run,timeout:e.onError.timeout,retry:e.onError.retry,onError:e.onError.onError??void 0};return await this.executeRunChain(i,t,o,s)}async executeRunChain(e,t,o,s){let n=await this.executeSingleRun({run:e.run,timeout:e.timeout,retry:e.retry,shell:void 0},t,o,s);return(typeof n=="boolean"?n:n.success)||!e.onError?n:this.executeRunChain(e.onError,t,o,s)}async executeChooseStep(e,t){let o=e.choose.as,s=e.choose.options.map(i=>i.id);if(o&&this.workspace.hasVariable(o)){let i=this.workspace.getVariable(o)??"";if(s.includes(i)){this.workspace.setChoice(i,i),this.workspace.setStepResult(t.stepIndex,!0);return}}let n=await this.choicePrompt.prompt(e.choose.message,e.choose.options);if(!n?.id)throw new Error(`Invalid choice result: ${JSON.stringify(n)}`);let a=o??n.id;this.workspace.setChoice(n.id,n.id),this.workspace.setVariable(a,n.id),this.workspace.setStepResult(t.stepIndex,!0)}async executePromptStep(e,t){let o=e.prompt.as;if(this.workspace.hasVariable(o)){let i=this.workspace.getVariable(o)??"";this.workspace.setFact(o,i),this.workspace.setStepResult(t.stepIndex,!0);return}let s=ue(e.prompt.message,this.workspace),n=e.prompt.default?ue(e.prompt.default,this.workspace):void 0,a=await this.textPrompt.prompt(s,n);this.workspace.setVariable(o,a),this.workspace.setFact(o,a),this.workspace.setStepResult(t.stepIndex,!0)}createParallelContexts(e,t){return e.parallel.map((o,s)=>({workspace:this.workspace.clone(),stepIndex:t.stepIndex*r.PARALLEL_STEP_INDEX_MULTIPLIER+s,branchIndex:s,lineNumber:t.lineNumber,fileName:t.fileName}))}getBranchDisplayName(e,t){return"run"in e?e.run:"choose"in e?`Choose: ${e.choose.message}`:"prompt"in e?`Prompt: ${e.prompt.message}`:"fail"in e?`Fail: 
${e.fail.message}`:`Branch ${t+1}`}async executeParallelBranches(e,t){let o=[],s=["\u280B","\u2819","\u2839","\u2838","\u283C","\u2834","\u2826","\u2827","\u2807","\u280F"],n=0;for(let c=0;c<e.length;c++){let f=e[c],g=t[c];if(f.when&&!new U(g.workspace).evaluate(f.when))continue;let m=this.getBranchDisplayName(f,c);o.push({index:c,name:m,status:"pending"})}let a=setInterval(()=>{n=(n+1)%s.length,this.updateParallelBranchesDisplay(o,s[n])},100),i=o.map(async c=>{let{index:f}=c,g=e[f],m=t[f];c.status="running";try{let d=await this.executeStep(g,m,!0);return c.status="success",this.updateParallelBranchesDisplay(o,s[n]),{index:f,result:d,context:m}}catch(d){m.workspace.setStepResult(m.stepIndex,!1);let w=d instanceof Error?d.message:String(d);return c.status="failed",c.error=w,this.updateParallelBranchesDisplay(o,s[n]),{index:f,error:d,context:m}}}),l=await Promise.all(i);return clearInterval(a),this.updateParallelBranchesDisplay(o,"",!0),de.default.done(),l}updateParallelBranchesDisplay(e,t,o=!1){let s=e.map(n=>{let a=n.index+1,i="",l="";switch(n.status){case"pending":i="\u25CB",l=`Branch ${a}: ${n.name} - Pending`;break;case"running":i=t,l=`Branch ${a}: ${n.name} - Running...`;break;case"success":i="\u2713",l=`Branch ${a}: ${n.name} - Completed`;break;case"failed":i="\u2717",l=`Branch ${a}: ${n.name} - Failed${n.error?`: ${n.error}`:""}`;break}return`${i} ${l}`});o?(0,de.default)(s.join(`
`)):(0,de.default)(s.join(`
`))}displayParallelResults(e,t,o){let s=!0,n=!1;console.log("");for(let i of e){if(!i)continue;n=!0;let{index:l,result:c,error:f,context:g}=i;if(f){s=!1;let m=`Branch ${l+1} failed: ${f instanceof Error?f.message:String(f)}`,d=te(m);console.error(d)}else if(c&&typeof c=="object"&&"stdout"in c){let m=c;if(s=s&&m.success,m.stdout.length>0||m.stderr.length>0||!m.success){let d=t[l],w=this.getBranchDisplayName(d,l);this.taskRunner.displayBufferedOutput(m,w,!1,g.lineNumber,g.fileName)}}}n||console.log("\u26A0\uFE0F All parallel branches were skipped (conditions not met)");let a=_e(s);return console.log(a),s}mergeParallelResults(e){for(let t of e){let o=t.workspace.getAllFacts(),s=t.workspace.getAllVariables();for(let[n,a]of o)this.workspace.setFact(n,a);for(let[n,a]of s)this.workspace.setVariable(n,a)}}countExecutableBranches(e,t){let o=0;for(let s=0;s<e.length;s++){let n=e[s],a=t[s];n.when&&!new U(a.workspace).evaluate(n.when)||o++}return o}async executeParallelStep(e,t){let o=this.createParallelContexts(e,t),s=this.countExecutableBranches(e.parallel,o),n=We(s);console.log(n);let a=await this.executeParallelBranches(e.parallel,o),i=this.displayParallelResults(a,e.parallel,t);if(this.workspace.setStepResult(t.stepIndex,i),!i){let l=t.lineNumber?` (line ${t.lineNumber})`:"";throw new Error(`Parallel step ${t.stepIndex}${l} failed: one or more branches failed`)}this.mergeParallelResults(o)}async executeFailStep(e,t){let o=new Error(e.fail.message);throw o.stack=void 0,o}};var Xe=require("yaml"),Ce=require("zod");var 
u=require("zod"),Ot=u.z.object({file:u.z.string()}),Bt=u.z.object({var:u.z.union([u.z.string(),u.z.record(u.z.string(),u.z.string())]).optional(),has:u.z.string().optional()}),Wt=u.z.object({status:u.z.object({fact:u.z.string(),is:u.z.enum(["ready","failed","pending"])})}),_t=u.z.object({step:u.z.object({success:u.z.boolean()}).optional(),last_step:u.z.enum(["success","failure"]).optional()}),Vt=u.z.object({choice:u.z.string()}),Ht=u.z.union([Ot,Vt,Bt,Wt,_t]),M=u.z.lazy(()=>u.z.union([Ht,u.z.object({all:u.z.array(M)}),u.z.object({any:u.z.array(M)}),u.z.object({not:M})])),ze=u.z.lazy(()=>u.z.object({run:u.z.string(),timeout:u.z.number().optional(),retry:u.z.number().optional(),onError:ze.optional()})),qe=u.z.object({run:u.z.string(),when:M.optional(),timeout:u.z.number().optional(),retry:u.z.number().optional(),shell:u.z.array(u.z.string()).min(1,"shell must have at least one element").optional(),continue:u.z.boolean().optional(),onError:ze.optional()}),Yt=u.z.object({choose:u.z.object({message:u.z.string(),options:u.z.array(u.z.object({id:u.z.string(),label:u.z.string()})),as:u.z.string().optional()}),when:M.optional()}),Ut=u.z.object({prompt:u.z.object({message:u.z.string(),as:u.z.string(),default:u.z.string().optional(),validate:u.z.string().optional()}),when:M.optional()});function Ze(r){if(!r||typeof r!="object")return{found:!1};let e=r;if("choose"in e)return{found:!0,type:"choose"};if("prompt"in e)return{found:!0,type:"prompt"};if("parallel"in e&&Array.isArray(e.parallel))for(let t of e.parallel){let o=Ze(t);if(o.found)return o}return{found:!1}}var Ge=u.z.lazy(()=>u.z.union([qe,u.z.object({parallel:u.z.array(u.z.lazy(()=>Ge)),when:M.optional()}),u.z.object({fail:u.z.object({message:u.z.string()}),when:M.optional()})]).superRefine((r,e)=>{let t=Ze(r);t.found&&e.addIssue({code:u.z.ZodIssueCode.custom,message:`'${t.type}' step is not allowed inside 'parallel' block (user input cannot run in 
parallel)`})})),Jt=u.z.lazy(()=>u.z.union([qe,Yt,Ut,u.z.object({parallel:u.z.array(Ge),when:M.optional()}),u.z.object({fail:u.z.object({message:u.z.string()}),when:M.optional()})])),zt=u.z.object({name:u.z.string().min(1,"Profile name must be non-empty"),var:u.z.record(u.z.string(),u.z.union([u.z.string(),u.z.number(),u.z.boolean()]).transform(String))}),qt=u.z.object({name:u.z.string().optional(),baseDir:u.z.string().optional(),shell:u.z.array(u.z.string()).min(1,"shell must have at least one element").optional(),profiles:u.z.array(zt).optional(),steps:u.z.array(Jt).min(1,"Workflow must have at least one step")});function Re(r){return qt.parse(r)}function Ke(r,e){let t=r.path;if(r.code==="custom"){let s=Pe(t);return` - ${r.message}${s}`}if(r.message==="Invalid input"){let s=Pe(t),n=Zt(t,e);return n?` - ${n}${s}`:` - Invalid step type${s}`}let o=Pe(t);return` - ${r.message}${o}`}function Pe(r){if(r.length===0)return"";let e=[];for(let t=0;t<r.length;t++){let o=r[t],s=r[t+1];o==="steps"&&typeof s=="number"?(e.push(`step ${s+1}`),t++):o==="parallel"&&typeof s=="number"?(e.push(`parallel branch ${s+1}`),t++):typeof o=="string"&&o!=="steps"&&o!=="parallel"&&e.push(o)}return e.length>0?` (${e.join(" \u2192 ")})`:""}function y(r,e,t){let o=t?`
Reason: ${t}`:"";throw new Error(`Invalid workflow structure:
- ${e} (step ${r+1})${o}`)}function Qe(r,e,t=!1,o=[]){let s=["run","choose","prompt","parallel","fail"],n=s.find(a=>a in r);if(!n){let a=Object.keys(r).filter(i=>i!=="when");y(e,`Unknown step type. Found keys: [${a.join(", ")}]. Valid types: ${s.join(", ")}`)}if(n==="run"){let a=r.run;if(typeof a!="string"&&y(e,"'run' must be a string command"),a===""&&y(e,"'run' command cannot be empty"),"shell"in r&&r.shell!==void 0){Array.isArray(r.shell)||y(e,"'shell' must be an array");let i=r.shell;i.length===0&&y(e,"'shell' cannot be empty","Shell configuration must have at least one element (program name)");for(let l=0;l<i.length;l++)typeof i[l]!="string"&&y(e,`'shell[${l}]' must be a string`)}}if(n==="choose"){if(t){let l=o.join(" \u2192 ");throw new Error(`Invalid workflow structure:
- 'choose' step is not allowed inside 'parallel' block (step ${e+1}, ${l})
Reason: User input prompts cannot run in parallel`)}let a=r.choose;(!a||typeof a!="object")&&y(e,"'choose' must be an object with 'message' and 'options'");let i=a;(!i.message||typeof i.message!="string")&&y(e,"'choose.message' is required and must be a string"),Array.isArray(i.options)||y(e,"'choose.options' is required and must be an array"),i.options.length===0&&y(e,"'choose.options' cannot be empty","At least one option is required");for(let l=0;l<i.options.length;l++){let c=i.options[l];(!c||typeof c!="object")&&y(e,`'choose.options[${l}]' must be an object with 'id' and 'label'`),(!c.id||typeof c.id!="string")&&y(e,`'choose.options[${l}].id' is required and must be a string`),(!c.label||typeof c.label!="string")&&y(e,`'choose.options[${l}].label' is required and must be a string`)}}if(n==="prompt"){if(t){let l=o.join(" \u2192 ");throw new Error(`Invalid workflow structure:
- 'prompt' step is not allowed inside 'parallel' block (step ${e+1}, ${l})
Reason: User input prompts cannot run in parallel`)}let a=r.prompt;(!a||typeof a!="object")&&y(e,"'prompt' must be an object with 'message' and 'as'");let i=a;(!i.message||typeof i.message!="string")&&y(e,"'prompt.message' is required and must be a string"),(!i.as||typeof i.as!="string")&&y(e,"'prompt.as' is required and must be a string","The 'as' field specifies the variable name to store the user's input")}if(n==="parallel"){let a=r.parallel;Array.isArray(a)||y(e,"'parallel' must be an array of steps"),a.length===0&&y(e,"'parallel' cannot be empty","At least one step is required");for(let i=0;i<a.length;i++){let l=a[i];(!l||typeof l!="object")&&y(e,`'parallel[${i}]' must be a valid step object`);let c=[...o,`branch ${i+1}`];Qe(l,e,!0,c)}}if(n==="fail"){let a=r.fail;(!a||typeof a!="object")&&y(e,"'fail' must be an object with 'message'");let i=a;(!i.message||typeof i.message!="string")&&y(e,"'fail.message' is required and must be a string")}}function et(r){if(!r||typeof r!="object")throw new Error(`Invalid workflow structure:
- Workflow must be an object`);let e=r;if("name"in e&&e.name!==void 0&&typeof e.name!="string")throw new Error(`Invalid workflow structure:
- 'name' must be a string`);if("shell"in e&&e.shell!==void 0){if(!Array.isArray(e.shell))throw new Error(`Invalid workflow structure:
- 'shell' must be an array`);if(e.shell.length===0)throw new Error(`Invalid workflow structure:
- 'shell' cannot be empty
@@ -28,14 +28,38 @@ Total execution time: ${i}`)),await r.save(),r.reset()}isStepSuccessful(e,t){ret
- 'steps' is required`);if(!Array.isArray(e.steps))throw new Error(`Invalid workflow structure:
- 'steps' must be an array`);if(e.steps.length===0)throw new Error(`Invalid workflow structure:
- 'steps' cannot be empty
Reason: Workflow must have at least one step`);for(let t=0;t<e.steps.length;t++){let o=e.steps[t];if(!o||typeof o!="object")throw new Error(`Invalid workflow structure:
- Step ${t+1} must be an object`);Qe(o,t)}}function Zt(r,e){try{let t=e;for(let n of r)if(typeof n!="symbol")if(t&&typeof t=="object")t=t[n];else return null;if(!t||typeof t!="object")return null;let s=Object.keys(t);if(s.length>0){let n=["run","choose","prompt","parallel","fail"];if(!s.some(i=>n.includes(i)))return`Unknown step type. Found keys: [${s.join(", ")}]. Valid types: run, choose, prompt, parallel, fail`}return null}catch{return null}}function De(r){let e=r;return"choose"in e&&(e.choose===null||e.choose===void 0)&&"message"in e&&"options"in e?{choose:{message:e.message,options:e.options,as:e.as},when:e.when}:"prompt"in e&&(e.prompt===null||e.prompt===void 0)&&"message"in e&&"as"in e?{prompt:{message:e.message,as:e.as,default:e.default,validate:e.validate},when:e.when}:"parallel"in e&&Array.isArray(e.parallel)?{...e,parallel:e.parallel.map(t=>De(t))}:r}var fe=class{parse(e){let t;try{t=(0,Xe.parse)(e)}catch(o){throw new Error(`Invalid YAML format: ${o instanceof Error?o.message:String(o)}`)}if(t&&typeof t=="object"&&"steps"in t){let o=t;Array.isArray(o.steps)&&(o.steps=o.steps.map(s=>De(s)))}et(t);try{return Re(t)}catch(o){if(o instanceof Ce.ZodError){let s=o.issues.map(n=>Ke(n,t)).filter(n=>n!==null).join(`
`);throw new Error(`Invalid workflow structure:
${s}`)}throw o}}extractStepLineNumbers(e){let t=new Map,o=e.split(`
`),s=0,n=!1;for(let a=0;a<o.length;a++){let i=o[a].trim();if(i==="steps:"||i.startsWith("steps:")){n=!0;continue}n&&i.startsWith("-")&&t.set(s++,a+1)}return t}},Ee=class{parse(e){let t;try{t=JSON.parse(e)}catch(o){throw new Error(`Invalid JSON format: ${o instanceof Error?o.message:String(o)}`)}if(t&&typeof t=="object"&&"steps"in t){let o=t;Array.isArray(o.steps)&&(o.steps=o.steps.map(s=>De(s)))}et(t);try{return Re(t)}catch(o){if(o instanceof Ce.ZodError){let s=o.issues.map(n=>Ke(n,t)).filter(n=>n!==null).join(`
`);throw new Error(`Invalid workflow structure:
${s}`)}throw o}}extractStepLineNumbers(e){let t=new Map,o=e.split(`
`),s=0,n=!1,a=!1;for(let i=0;i<o.length;i++){let c=o[i].trim();if(c.startsWith('"steps"')||c.startsWith("'steps'")){n=!0,c.includes("[")&&(a=!0);continue}if(n&&c==="["){a=!0;continue}if(a&&c==="]"){a=!1,n=!1;continue}a&&c.startsWith("{")&&t.set(s++,i+1)}return t}};function oe(r){switch(r.toLowerCase().split(".").pop()){case"yaml":case"yml":return new fe;case"json":return new Ee;default:return new fe}}var gt=require("child_process"),we=require("fs"),T=require("path"),ye=b(require("boxen"),1),h=b(require("chalk"),1),wt=require("commander"),K=b(require("dayjs"),1),_=b(require("inquirer"),1),se=b(require("log-update"),1),yt=b(require("node-cron"),1);ge();var st=require("fs/promises"),it=require("path"),at=require("yaml"),lt=require("zod");var I=require("zod"),Kt=I.z.object({name:I.z.string().min(1,"Schedule name must be non-empty"),cron:I.z.string().min(1,"Cron expression is required"),workflow:I.z.string().min(1,"Workflow path is required"),baseDir:I.z.string().optional(),silent:I.z.boolean().optional(),profile:I.z.string().optional()}),Qt=I.z.object({schedules:I.z.array(Kt).min(1,"Schedule file must have at least one schedule")});function nt(r){return Qt.parse(r)}async function ct(r){let e=await(0,st.readFile)(r,"utf-8"),t=(0,it.extname)(r).toLowerCase(),o;try{if(t===".yaml"||t===".yml")o=(0,at.parse)(e);else if(t===".json")o=JSON.parse(e);else throw new Error(`Unsupported file format: ${t}. Use .yaml, .yml, or .json`)}catch(s){if(s instanceof Error&&s.message.startsWith("Unsupported"))throw s;let n=t===".json"?"JSON":"YAML";throw new Error(`Invalid ${n} format: ${s instanceof Error?s.message:String(s)}`)}try{return nt(o)}catch(s){if(s instanceof lt.ZodError){let n=s.issues.map(a=>` - ${a.message} (${a.path.join(".")})`).join(`
`);throw new Error(`Invalid schedule file structure:
${n}`)}throw s}}var G=require("fs/promises"),pt=require("os"),Ie=require("path"),dt=(0,Ie.join)((0,pt.homedir)(),".pipeliner","schedules"),ut=(0,Ie.join)(dt,"schedules.json"),C=class{async loadSchedules(){try{let e=await(0,G.readFile)(ut,"utf-8");return JSON.parse(e).schedules||[]}catch(e){if(e instanceof Error&&"code"in e&&e.code==="ENOENT")return[];throw e}}async saveSchedules(e){await(0,G.mkdir)(dt,{recursive:!0}),await(0,G.writeFile)(ut,JSON.stringify({schedules:e},null,2),"utf-8")}async addSchedule(e){let t=await this.loadSchedules(),o=Math.random().toString(36).slice(2,10),s=new Date().toISOString(),n={id:o,createdAt:s,...e};return t.push(n),await this.saveSchedules(t),n}async removeSchedule(e){let t=await this.loadSchedules(),o=t.length,s=t.filter(n=>n.id!==e);return s.length===o?!1:(await this.saveSchedules(s),!0)}async updateLastRun(e){let t=await this.loadSchedules(),o=t.find(s=>s.id===e);o&&(o.lastRun=new Date().toISOString(),await this.saveSchedules(t))}async toggleSchedule(e,t){let o=await this.loadSchedules(),s=o.find(n=>n.id===e);return s?(s.enabled=t,await this.saveSchedules(o),!0):!1}async getSchedule(e){return(await this.loadSchedules()).find(o=>o.id===e)}};var ft=require("fs/promises"),mt=require("path"),je=b(require("node-cron"),1);ge();var X=class{scheduleManager;tasks=new Map;constructor(){this.scheduleManager=new C}async start(e=!1){if(await W()){let o=await P();throw new Error(`Scheduler daemon is already running (PID: ${o.pid}). Use "tp schedule stop" to stop it first.`)}console.log(e?"\u{1F680} Starting scheduler daemon in background...":"\u{1F680} Starting workflow scheduler..."),await this.reload(),e?process.env.TP_DAEMON_MODE||(console.log(`\u2713 Scheduler daemon started (PID: ${process.pid})`),console.log(' Run "tp schedule stop" to stop the daemon'),console.log(' Run "tp schedule status" to check daemon status')):(console.log("\u2713 Scheduler is running"),console.log(" Press Ctrl+C to stop"));let t=async()=>{e||console.log(`
\u23F9 Stopping scheduler...`),this.stop(),await ne(),e||process.exit(0)};process.on("SIGINT",t),process.on("SIGTERM",t),e&&process.stdin.destroy()}async reload(){this.stop();let t=(await this.scheduleManager.loadSchedules()).filter(o=>o.enabled);if(t.length===0){console.log(" No active schedules found");return}console.log(` Loading ${t.length} schedule(s)...`);for(let o of t)try{this.startSchedule(o)}catch(s){console.error(` \u2717 Failed to start schedule ${o.id}:`,s)}}startSchedule(e){if(!je.default.validate(e.cron)){console.error(` \u2717 Invalid cron expression for schedule ${e.id}: ${e.cron}`);return}let t=je.default.schedule(e.cron,async()=>{await this.executeSchedule(e)});this.tasks.set(e.id,t);let o=e.name??e.workflowPath;console.log(` \u2713 Scheduled: ${o}`),console.log(` Cron: ${e.cron}`),console.log(` Workflow: ${e.workflowPath}`)}async executeSchedule(e){let t=e.name??e.workflowPath;e.silent||(console.log(`
\u23F0 Running scheduled workflow: ${t}`),console.log(` Time: ${new Date().toISOString()}`),e.profile&&console.log(` Profile: ${e.profile}`));try{let o=(0,mt.resolve)(e.workflowPath),s=oe(o),n=await(0,ft.readFile)(o,"utf-8"),a=s.parse(n),i=new q,l={};if(e.profile){if(!a.profiles)throw new Error(`Profile "${e.profile}" not found: no profiles defined in workflow`);let c=a.profiles.find(f=>f.name===e.profile);if(!c)throw new Error(`Profile "${e.profile}" not found. Available profiles: ${a.profiles.map(f=>f.name).join(", ")}`);l.profileVars=c.var}await i.execute(a,l),await this.scheduleManager.updateLastRun(e.id),e.silent||console.log(`\u2713 Scheduled workflow completed: ${t}
`)}catch(o){e.silent||(console.error(`\u2717 Scheduled workflow failed: ${t}`),console.error(` Error: ${o instanceof Error?o.message:String(o)}
`))}}stop(){for(let e of this.tasks.values())e.stop();this.tasks.clear()}async stopDaemon(){let e=await P();if(!e.running||!e.pid)return!1;let t=e.pid;try{if(process.kill(t,"SIGTERM"),await new Promise(o=>setTimeout(o,1e3)),await W()){try{process.kill(t,"SIGKILL")}catch{}await new Promise(o=>setTimeout(o,500))}return await ne(),!0}catch{return await ne(),!1}}};function bt(){let r=new wt.Command("schedule").description("Manage workflow schedules").action(async()=>{await ht()});return r.command("add [scheduleFile]").description("Add schedules from a schedule file (YAML or JSON)").action(async e=>{await eo(e)}),r.command("remove").alias("rm").description("Remove a workflow schedule").action(async()=>{await to()}),r.command("remove-all").description("Remove all workflow schedules").action(async()=>{await lo()}),r.command("list").alias("ls").description("List all workflow schedules").action(async()=>{await ht()}),r.command("start").description("Start the scheduler daemon").option("-d, --daemon","Run in background daemon mode").action(async e=>{await oo(e.daemon??!1)}),r.command("stop").description("Stop the scheduler daemon").action(async()=>{await ro()}),r.command("status").description("Check scheduler daemon status (real-time mode, press Ctrl+C to exit)").action(async()=>{await io(!0)}),r.command("toggle").description("Enable or disable a schedule").action(async()=>{await ao()}),r}function Te(r,e){let t=e.workflow;if((0,T.isAbsolute)(t))return t;let o=e.baseDir?(0,T.resolve)(e.baseDir):(0,T.dirname)(r);return(0,T.resolve)(o,t)}async function eo(r){let e=new C;if(!r){let{path:i}=await _.default.prompt([{type:"input",name:"path",message:"Schedule file path (YAML or JSON):",validate:l=>{let c=(0,T.resolve)(l);return(0,we.existsSync)(c)?!0:`File not found: ${c}`}}]);r=i}let t=(0,T.resolve)(r);(0,we.existsSync)(t)||(console.error(`\u2717 File not found: ${t}`),process.exit(1));let o;try{o=await ct(t)}catch(i){console.error(`\u2717 Failed to parse schedule file: ${i 
instanceof Error?i.message:String(i)}`),process.exit(1)}let s=o.schedules.filter(i=>!yt.default.validate(i.cron));if(s.length>0){console.error("\u2717 Invalid cron expression(s):");for(let i of s)console.error(` - ${i.name}: "${i.cron}"`);process.exit(1)}let n=o.schedules.filter(i=>{let l=Te(t,i);return!(0,we.existsSync)(l)});if(n.length>0){console.error("\u2717 Workflow file(s) not found:");for(let i of n){let l=Te(t,i);console.error(` - ${i.name}: ${i.workflow} (resolved: ${l})`)}process.exit(1)}console.log(`
Found ${o.schedules.length} schedule(s) in file.
`);let a=[];for(let i of o.schedules){let{alias:l}=await _.default.prompt([{type:"input",name:"alias",message:`Alias for "${i.name}" (press Enter to use as-is):`,default:i.name}]),c=await e.addSchedule({name:l,workflowPath:Te(t,i),cron:i.cron,enabled:!0,silent:i.silent,profile:i.profile});a.push(c)}console.log(`
\u2713 Added ${a.length} schedule(s) successfully
`);for(let i of a)console.log(` - ${i.name??"N/A"}`),console.log(` Cron: ${i.cron}`),console.log(` Workflow: ${i.workflowPath}`),i.silent&&console.log(" Silent: Yes"),i.profile&&console.log(` Profile: ${i.profile}`),console.log(` Status: ${i.enabled?"Enabled":"Disabled"}`),console.log();console.log('Run "tp schedule start" to start the scheduler daemon')}async function to(){let r=new C,e=await r.loadSchedules();if(e.length===0){console.log("No schedules found");return}let{scheduleId:t}=await _.default.prompt([{type:"list",name:"scheduleId",message:"Select schedule to remove:",choices:e.map(n=>({name:`${n.name??n.workflowPath} (${n.cron}) ${n.enabled?"\u2713":"\u2717"}`,value:n.id}))}]),{confirm:o}=await _.default.prompt([{type:"confirm",name:"confirm",message:"Are you sure you want to remove this schedule?",default:!1}]);if(!o){console.log("Cancelled");return}let s=await r.removeSchedule(t);console.log(s?"\u2713 Schedule removed successfully":"\u2717 Schedule not found")}async function ht(){let e=await new C().loadSchedules();if(e.length===0){console.log("No schedules found"),console.log(`
Run "tp schedule add" to create a new schedule`);return}console.log(`
\u{1F4C5} Workflow Schedules
`);for(let t of e){let o=t.enabled?"\u2713 Enabled":"\u2717 Disabled",s=t.name??t.workflowPath,n=t.lastRun?(0,K.default)(t.lastRun).format("YYYY-MM-DD HH:mm:ss"):"Never";console.log(` ${o} ${s}`),console.log(` ID: ${t.id}`),console.log(` Cron: ${t.cron}`),console.log(` Workflow: ${t.workflowPath}`),console.log(` Last run: ${n}`),console.log()}}async function oo(r){if(await W()){let e=await P();console.error(`\u2717 Scheduler daemon is already running (PID: ${e.pid})`),console.error(' Run "tp schedule stop" to stop it first'),process.exit(1)}if(r)if(process.env.TP_DAEMON_MODE==="true"){let{saveDaemonPid:e}=await Promise.resolve().then(()=>(ge(),rt));await e(),await new X().start(!0),await new Promise(()=>{})}else{let e=process.argv.slice(1);if((0,gt.spawn)(process.argv[0],e,{detached:!0,stdio:"ignore",env:{...process.env,TP_DAEMON_MODE:"true"}}).unref(),await new Promise(o=>setTimeout(o,1e3)),await W()){let o=await P();console.log(`\u2713 Scheduler daemon started in background (PID: ${o.pid})`),console.log(' Run "tp schedule stop" to stop the daemon'),console.log(' Run "tp schedule status" to check daemon status')}else console.error("\u2717 Failed to start scheduler daemon"),process.exit(1);process.exit(0)}else await new X().start(!1),await new Promise(()=>{})}async function ro(){let r=await P();if(!r.running){console.log("Scheduler daemon is not running");return}console.log(`Stopping scheduler daemon (PID: ${r.pid})...`);let t=await new X().stopDaemon();console.log(t?"\u2713 Scheduler daemon stopped":"\u2717 Failed to stop scheduler daemon (process may have already exited)")}function no(r){if(!r)return"Unknown";let e=(0,K.default)(r),o=(0,K.default)().diff(e,"second"),s=Math.floor(o/86400),n=Math.floor(o%86400/3600),a=Math.floor(o%3600/60),i=o%60,l=[];return s>0&&l.push(`${s}d`),n>0&&l.push(`${n}h`),a>0&&l.push(`${a}m`),(i>0||l.length===0)&&l.push(`${i}s`),l.join(" ")}function so(r){let e=r.name??r.workflowPath,t=r.enabled?h.default.green("\u25CF 
active"):h.default.gray("\u25CB inactive"),o=r.lastRun?(0,K.default)(r.lastRun).format("YYYY-MM-DD HH:mm:ss"):h.default.gray("never"),s=r.profile?h.default.cyan(` [profile: ${r.profile}]`):"",n=r.silent?h.default.gray(" [silent]"):"",a=[`${t} ${h.default.bold(e)}${s}${n}`,`${h.default.gray("Cron:")} ${r.cron}`,`${h.default.gray("Last run:")} ${o}`].join(`
`);return(0,ye.default)(a,{borderStyle:"round",padding:{top:0,bottom:0,left:1,right:1},margin:{top:0,bottom:1,left:0,right:0},borderColor:r.enabled?"green":"gray"})}async function Ne(){let r=await P(),t=await new C().loadSchedules(),o=[],s;if(r.running&&r.pid){let a=no(r.startTime),i=r.startTime?(0,K.default)(r.startTime).format("YYYY-MM-DD HH:mm:ss"):"Unknown";s=[`${h.default.green("\u25CF")} ${h.default.green("active")} ${h.default.gray("(running)")}`,"",`${h.default.gray("Loaded:")} ${h.default.white(i)}`,`${h.default.gray("Active:")} ${h.default.green("active (running)")} since ${h.default.white(i)}`,`${h.default.gray("PID:")} ${h.default.white(r.pid.toString())}`,`${h.default.gray("Uptime:")} ${h.default.white(a)}`].join(`
`)}else s=[`${h.default.red("\u25CF")} ${h.default.red("inactive")} ${h.default.gray("(dead)")}`,"",`${h.default.gray("Loaded:")} ${h.default.gray("not found")}`,`${h.default.gray("Active:")} ${h.default.red("inactive (dead)")}`].join(`
`);let n=(0,ye.default)(s,{title:h.default.bold("task-pipeliner-scheduler.service"),titleAlignment:"left",borderStyle:"round",padding:{top:1,bottom:1,left:2,right:2},margin:{top:0,bottom:1,left:0,right:0},borderColor:r.running?"green":"red"});if(o.push(n),t.length>0){let a=t.filter(l=>l.enabled).length,i=h.default.bold(`Schedules: ${a}/${t.length} active`);o.push(i),o.push("");for(let l of t)o.push(so(l))}else{let a=(0,ye.default)(h.default.gray("No schedules configured"),{borderStyle:"round",padding:{top:0,bottom:0,left:1,right:1},margin:{top:0,bottom:0,left:0,right:0},borderColor:"gray"});o.push(a)}return o.join(`
`)}async function io(r){if(r){let e=!0,t=()=>{e=!1,se.default.done(),process.exit(0)};process.on("SIGINT",t),process.on("SIGTERM",t);let o=setInterval(async()=>{if(!e){clearInterval(o);return}try{let i=await Ne(),c=(await P()).running?h.default.gray(`
Press Ctrl+C to exit (daemon will continue running)`):h.default.gray(`
Run "tp schedule start -d" to start the daemon`);(0,se.default)(`${i}${c}`)}catch(i){se.default.done(),console.error("Error updating status:",i),clearInterval(o),process.exit(1)}},1e3),s=await Ne(),a=(await P()).running?h.default.gray(`
Press Ctrl+C to exit (daemon will continue running)`):h.default.gray(`
Run "tp schedule start -d" to start the daemon`);(0,se.default)(`${s}${a}`),await new Promise(()=>{})}else{let e=await Ne(),o=(await P()).running?"":h.default.gray(`
Run "tp schedule start -d" to start the daemon`);console.log(`
${e}${o}
`)}}async function ao(){let r=new C,e=await r.loadSchedules();if(e.length===0){console.log("No schedules found");return}let{scheduleId:t}=await _.default.prompt([{type:"list",name:"scheduleId",message:"Select schedule to toggle:",choices:e.map(n=>({name:`${n.name??n.workflowPath} (${n.cron}) ${n.enabled?"\u2713":"\u2717"}`,value:n.id}))}]),o=e.find(n=>n.id===t);if(!o){console.log("\u2717 Schedule not found");return}let s=!o.enabled;await r.toggleSchedule(t,s),console.log(`\u2713 Schedule ${s?"enabled":"disabled"}: ${o.name??o.workflowPath}`)}async function lo(){let r=new C,e=await r.loadSchedules();if(e.length===0){console.log("No schedules found");return}let{confirm:t}=await _.default.prompt([{type:"confirm",name:"confirm",message:`Are you sure you want to remove all ${e.length} schedule(s)?`,default:!1}]);if(!t){console.log("Cancelled");return}await r.saveSchedules([]),console.log(`\u2713 Removed all ${e.length} schedule(s)`)}var N=require("fs"),x=require("path"),Ae=require("url"),St={};function vt(){console.log=()=>{},console.error=()=>{},console.warn=()=>{},console.info=()=>{},process.stdout.write=()=>!0,process.stderr.write=()=>!0}function kt(){return"0.2.18"}function xt(r){let e=r?(0,x.resolve)(r):process.cwd(),t=50,o=0;for(;o<t;){let s=(0,x.resolve)(e,"tp");try{if((0,N.existsSync)(s)&&(0,N.statSync)(s).isDirectory())return s}catch{}let n=(0,x.dirname)(e);if(n===e)break;e=n,o++}return null}var co=(0,Pt.promisify)($t.exec),ee=new Et.Command;ee.name("task-pipeliner").description(`A powerful task pipeline runner with condition-based workflow execution.
Define workflows in YAML or JSON files with conditional execution, parallel tasks,
interactive prompts, and variable substitution.
@@ -73,7 +97,7 @@ Quick Start:
tp history remove # Remove a specific history
tp history remove-all # Remove all histories
`).version(kt()).addHelpText("after",`
Examples:
$ tp run workflow.yaml
$ tp run examples/simple-project/workflow.yaml
@@ -86,7 +110,7 @@ Resources:
\u{1F4DA} Documentation: https://task-pipeliner.racgoo.com/
\u{1F3A8} Visual Generator: https://task-pipeliner-generator.racgoo.com/
See README.md for complete DSL reference.`);ee.command("run").description("Run a workflow from a YAML or JSON file").argument("[file]","Path to the workflow file (YAML or JSON, relative or absolute). If omitted, will search for workflows in the nearest tp directory.").option("-s, --silent","Run in silent mode (suppress console output)").option("-p, --profile <name>","Run in profile mode (use profile name)").addHelpText("after",`
|
|
90
114
|
Examples:
|
|
91
115
|
$ tp run workflow.yaml
|
|
92
116
|
$ tp run workflow.json
|
|
@@ -111,40 +135,40 @@ Workflow File Structure:
|
|
|
111
135
|
\u2022 all/any/not: Combine conditions
|
|
112
136
|
|
|
113
137
|
Supported formats: YAML (.yaml, .yml) and JSON (.json)
|
|
114
|
-
See README.md for complete DSL documentation.`).action(async(
|
|
115
|
-
\u2717 No workflow file found`)),process.exit(1)),e.silent&&
|
|
116
|
-
`)),await new
|
|
117
|
-
\u2713 Workflow completed successfully`))}catch(t){let
|
|
118
|
-
\u2717 Workflow failed: ${
|
|
138
|
+
See README.md for complete DSL documentation.`).action(async(r,e)=>{try{let t=r??await po()??null;t||(console.error(p.default.red(`
|
|
139
|
+
\u2717 No workflow file found`)),process.exit(1)),e.silent&&vt();let o=oe(t);console.log(p.default.blue(`Loading workflow from ${t}...`));let s=(0,Fe.readFileSync)(t,"utf-8"),n=o.parse(s);if(!n.steps||!Array.isArray(n.steps))throw new Error("Invalid workflow: steps array is required");let a;if(e.profile){let l=e.profile.trim();if(!n.profiles?.length)throw new Error(`Profile "${l}" requested but workflow has no "profiles" defined. Add a "profiles" section to your workflow file.`);let c=n.profiles.find(f=>f.name===l);if(!c){let f=n.profiles.map(g=>g.name).join(", ");throw new Error(`Profile "${l}" not found. Available profile(s): ${f}`)}a=c.var}n._lineNumbers=o.extractStepLineNumbers(s),n._fileName=fo(t),n._filePath=(0,Q.resolve)(t),console.log(p.default.green(`Starting workflow execution...
|
|
140
|
+
`)),await new q().execute(n,a?{profileVars:a}:void 0),console.log(p.default.green(`
|
|
141
|
+
\u2713 Workflow completed successfully`))}catch(t){let o=t instanceof Error?t.message:String(t);console.error(p.default.red(`
|
|
142
|
+
\u2717 Workflow failed: ${o}`)),process.exit(1)}});ee.command("open").description("Open generator or docs website in browser").argument("<target>",'Target to open: "generator" or "docs"').addHelpText("after",`
|
|
119
143
|
Examples:
|
|
120
144
|
$ tp open generator
|
|
121
145
|
$ tp open docs
|
|
122
146
|
|
|
123
147
|
Targets:
|
|
124
148
|
generator Open the visual workflow generator (https://task-pipeliner-generator.racgoo.com/)
|
|
125
|
-
docs Open the documentation site (https://task-pipeliner.racgoo.com/)`).action(async
|
|
126
|
-
\u2717 Invalid target: ${
|
|
127
|
-
Valid targets:`)),console.log(p.default.yellow(" \u2022 generator - Open the visual workflow generator")),console.log(p.default.yellow(" \u2022 docs - Open the documentation site")),process.exit(1));try{let
|
|
128
|
-
\u2713 Opening ${
|
|
129
|
-
\u2717 Failed to open browser: ${
|
|
130
|
-
Please visit manually: ${t}`)),process.exit(1)}});var
|
|
131
|
-
\u2717 Invalid choice`)),process.exit(1));let t=new
|
|
132
|
-
\u26A0 No history found`));return}let
|
|
133
|
-
\u2717 Invalid choice`)),process.exit(1));try{let
|
|
134
|
-
\u2717 Failed to load history: ${
|
|
135
|
-
\u26A0 No history found`));return}let
|
|
136
|
-
\u2717 Invalid choice`)),process.exit(1));try{await t.removeHistory(
|
|
137
|
-
\u2713 Removed history: ${
|
|
138
|
-
\u2717 Failed to remove history: ${
|
|
149
|
+
docs Open the documentation site (https://task-pipeliner.racgoo.com/)`).action(async r=>{let t={generator:"https://task-pipeliner-generator.racgoo.com/",docs:"https://task-pipeliner.racgoo.com/"}[r.toLowerCase()];t||(console.error(p.default.red(`
|
|
150
|
+
\u2717 Invalid target: ${r}`)),console.log(p.default.yellow(`
|
|
151
|
+
Valid targets:`)),console.log(p.default.yellow(" \u2022 generator - Open the visual workflow generator")),console.log(p.default.yellow(" \u2022 docs - Open the documentation site")),process.exit(1));try{let o=process.platform,s;o==="darwin"?s=`open "${t}"`:o==="win32"?s=`start "${t}"`:s=`xdg-open "${t}"`,await co(s),console.log(p.default.green(`
|
|
152
|
+
\u2713 Opening ${r==="generator"?"generator":"documentation"} in browser...`)),console.log(p.default.blue(` ${t}`))}catch(o){let s=o instanceof Error?o.message:String(o);console.error(p.default.red(`
|
|
153
|
+
\u2717 Failed to open browser: ${s}`)),console.log(p.default.yellow(`
|
|
154
|
+
Please visit manually: ${t}`)),process.exit(1)}});ee.addCommand(bt());var uo=ee.command("history").description("Manage workflow execution history");uo.action(async()=>{let r=new L,e=await r.prompt("Select an action",[{id:"show",label:"Show - View and select a history to view"},{id:"remove",label:"Remove - Delete a specific history file"},{id:"remove-all",label:"Remove All - Delete all history files"}]);e?.id||(console.error(p.default.red(`
|
|
155
|
+
\u2717 Invalid choice`)),process.exit(1));let t=new z;switch(e.id){case"show":{let o=await t.getHistoryNames();if(o.length===0){console.log(p.default.yellow(`
|
|
156
|
+
\u26A0 No history found`));return}let s=await r.prompt("Select a history to view",o.map(n=>({id:n,label:n})));s?.id||(console.error(p.default.red(`
|
|
157
|
+
\u2717 Invalid choice`)),process.exit(1));try{let n=await t.getHistory(s.id);mo(n,s.id)}catch(n){let a=n instanceof Error?n.message:String(n);console.error(p.default.red(`
|
|
158
|
+
\u2717 Failed to load history: ${a}`)),process.exit(1)}break}case"remove":{let o=await t.getHistoryNames();if(o.length===0){console.log(p.default.yellow(`
|
|
159
|
+
\u26A0 No history found`));return}let s=await r.prompt("Select a history to remove",o.map(n=>({id:n,label:n})));s?.id||(console.error(p.default.red(`
|
|
160
|
+
\u2717 Invalid choice`)),process.exit(1));try{await t.removeHistory(s.id),console.log(p.default.green(`
|
|
161
|
+
\u2713 Removed history: ${s.id}`))}catch(n){let a=n instanceof Error?n.message:String(n);console.error(p.default.red(`
|
|
162
|
+
\u2717 Failed to remove history: ${a}`)),process.exit(1)}break}case"remove-all":{if((await r.prompt("Are you sure you want to remove all histories?",[{id:"yes",label:"Yes, remove all"},{id:"no",label:"No, cancel"}]))?.id!=="yes"){console.log(p.default.yellow(`
|
|
139
163
|
\u2717 Cancelled`));return}try{await t.clearAllHistories(),console.log(p.default.green(`
|
|
140
|
-
\u2713 All histories removed`))}catch(
|
|
141
|
-
\u2717 Failed to remove histories: ${
|
|
142
|
-
\u2717 Unknown action: ${e.id}`)),process.exit(1)}});async function
|
|
143
|
-
\u2717 No tp directory found`)),null;try{let t=(await(0,
|
|
144
|
-
\u2717 No workflow files found in ${
|
|
145
|
-
\u2717 Failed to read tp directory: ${t}`)),null}}function
|
|
146
|
-
`);let t=
|
|
147
|
-
`);console.log((0,
|
|
148
|
-
`);console.log((0,
|
|
149
|
-
`);console.log(p.default.green(" Output:")),console.log(e)}if(
|
|
150
|
-
`);console.log(p.default.red(" Errors:")),console.log(e)}}
|
|
164
|
+
\u2713 All histories removed`))}catch(s){let n=s instanceof Error?s.message:String(s);console.error(p.default.red(`
|
|
165
|
+
\u2717 Failed to remove histories: ${n}`)),process.exit(1)}break}default:console.error(p.default.red(`
|
|
166
|
+
\u2717 Unknown action: ${e.id}`)),process.exit(1)}});async function po(){let r=xt();if(!r)return console.error(p.default.red(`
|
|
167
|
+
\u2717 No tp directory found`)),null;try{let t=(await(0,Rt.readdir)(r)).filter(a=>{let i=(0,Q.extname)(a).toLowerCase();return[".yaml",".yml",".json"].includes(i)});if(t.length===0)return console.error(p.default.red(`
|
|
168
|
+
\u2717 No workflow files found in ${r}`)),null;let o=await Promise.all(t.map(async a=>{let i=(0,Q.join)(r,a);try{let l=oe(i),c=(0,Fe.readFileSync)(i,"utf-8"),g=l.parse(c).name??"Untitled";return{id:i,label:`${a} - ${g}`}}catch{return{id:i,label:a}}}));return(await new L(!0).prompt("Select a workflow to run",o)).id}catch(e){let t=e instanceof Error?e.message:String(e);return console.error(p.default.red(`
|
|
169
|
+
\u2717 Failed to read tp directory: ${t}`)),null}}function fo(r){return r.split("/").pop()??r}function mo(r,e){console.log(`
|
|
170
|
+
`);let t=r.records.reduce((c,f)=>c+f.duration,0),o=r.records.filter(c=>c.status==="success").length,s=r.records.filter(c=>c.status==="failure").length,n=(0,Ct.default)(r.initialTimestamp).format("YYYY-MM-DD HH:mm:ss"),i=Y(t),l=[p.default.bold("Workflow Execution History"),"",`${p.default.cyan("File:")} ${e}`,`${p.default.cyan("Started:")} ${n}`,`${p.default.cyan("Total Duration:")} ${i}`,`${p.default.cyan("Total Steps:")} ${r.records.length}`,`${p.default.green("\u2713 Successful:")} ${o}`,s>0?`${p.default.red("\u2717 Failed:")} ${s}`:""].filter(Boolean).join(`
|
|
171
|
+
`);console.log((0,Le.default)(l,{borderStyle:"round",padding:{top:1,bottom:1,left:2,right:2},margin:{top:0,bottom:1,left:0,right:0},borderColor:"cyan"})),r.records.forEach((c,f)=>{ho(c,f+1,r.records.length)}),console.log("")}function ho(r,e,t){let o=go(r.step),s=wo(r.step),n=r.status==="success"?p.default.green("\u2713"):p.default.red("\u2717"),a=r.status==="success"?p.default.green("Success"):p.default.red("Failed"),i=Y(r.duration),l=[`${n} ${p.default.bold(`Step ${e}/${t}`)} - ${p.default.cyan(o)}`,`${p.default.gray("Duration:")} ${i} | ${p.default.gray("Status:")} ${a}`,"",p.default.white(s)].join(`
|
|
172
|
+
`);console.log((0,Le.default)(l,{borderStyle:"round",padding:{top:1,bottom:1,left:2,right:2},margin:{top:0,bottom:1,left:0,right:0},borderColor:r.status==="success"?"green":"red"})),yo(r.output)&&bo(r.output)}function go(r){return"run"in r?"Run":"choose"in r?"Choose":"prompt"in r?"Prompt":"parallel"in r?"Parallel":"fail"in r?"Fail":"Unknown"}function wo(r){return"run"in r?`Command: ${p.default.yellow(r.run)}`:"choose"in r?`Message: ${p.default.yellow(r.choose.message)}`:"prompt"in r?`Message: ${p.default.yellow(r.prompt.message)} | Variable: ${p.default.cyan(r.prompt.as)}`:"parallel"in r?`Parallel execution with ${r.parallel.length} branches`:"fail"in r?`Error: ${p.default.red(r.fail.message)}`:"Unknown step type"}function yo(r){return typeof r=="object"&&r!==null&&"success"in r&&"stdout"in r&&"stderr"in r}function bo(r){if(r.stdout.length>0){let e=r.stdout.map(t=>p.default.gray(` ${t}`)).join(`
|
|
173
|
+
`);console.log(p.default.green(" Output:")),console.log(e)}if(r.stderr.length>0){let e=r.stderr.map(t=>p.default.gray(` ${t}`)).join(`
|
|
174
|
+
`);console.log(p.default.red(" Errors:")),console.log(e)}}ee.parse();
|
package/package.json
CHANGED
@@ -1,6 +1,6 @@
{
"name": "task-pipeliner",
-"version": "0.2.
+"version": "0.2.18",
"description": "A task pipeline runner with condition-based workflow execution",
"type": "module",
"main": "./dist/index.cjs",
@@ -35,6 +35,7 @@
"dayjs": "^1.11.19",
"inquirer": "^8.2.6",
"log-update": "^4.0.0",
+"node-cron": "^4.2.1",
"ora": "^5.4.1",
"yaml": "^2.3.4",
"zod": "^4.3.5"
@@ -43,6 +44,7 @@
"@napi-rs/cli": "^2.16.0",
"@types/inquirer": "^9.0.7",
"@types/node": "^20.10.0",
+"@types/node-cron": "^3.0.11",
"@typescript-eslint/eslint-plugin": "^8.53.1",
"@typescript-eslint/parser": "^8.53.1",
"eslint": "^9.39.2",