@livingdata/pipex 0.0.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md ADDED
@@ -0,0 +1,261 @@
1
+ # Pipex
2
+
3
+ Execution engine for containerized steps via Docker CLI.
4
+
5
+ Runs containers with explicit volume mounts and manages artifacts through a staging/commit lifecycle. Designed to be driven by different orchestrators (CLI included, AI agent planned).
6
+
7
+ ## Installation
8
+
9
+ ```bash
10
+ npm install
11
+ cp .env.example .env
12
+ # Edit .env to set PIPEX_WORKDIR if needed (defaults to ./workdir)
13
+ ```
14
+
15
+ ## Prerequisites
16
+
17
+ - Node.js 24+
18
+ - Docker CLI installed and accessible
19
+
20
+ ## Usage
21
+
22
+ ### Running a pipeline
23
+
24
+ ```bash
25
+ # Interactive mode (default)
26
+ npm start -- run pipeline.example.json
27
+
28
+ # With workspace name (enables caching)
29
+ npm start -- run pipeline.example.json --workspace my-build
30
+
31
+ # JSON mode (for CI/CD)
32
+ npm start -- run pipeline.example.json --json
33
+
34
+ # Custom workdir
35
+ npm start -- run pipeline.example.json --workdir /tmp/builds
36
+ ```
37
+
38
+ ### Managing workspaces
39
+
40
+ ```bash
41
+ # List workspaces (with artifact/cache counts)
42
+ npm start -- list
43
+ npm start -- ls --json
44
+
45
+ # Remove specific workspaces
46
+ npm start -- rm my-build other-build
47
+
48
+ # Remove all workspaces
49
+ npm start -- clean
50
+ ```
51
+
52
+ ### Via npx
53
+
54
+ ```bash
55
+ # Build first
56
+ npm run build
57
+
58
+ # Run locally via npx
59
+ npx . run example/pipeline.json --workspace my-build
60
+ npx . list
61
+ ```
62
+
63
+ ### Commands
64
+
65
+ | Command | Description |
66
+ |---------|-------------|
67
+ | `run <pipeline>` | Execute a pipeline |
68
+ | `list` (alias `ls`) | List workspaces |
69
+ | `rm <workspace...>` | Remove one or more workspaces |
70
+ | `clean` | Remove all workspaces |
71
+
72
+ ### Global Options
73
+
74
+ | Option | Description |
75
+ |--------|-------------|
76
+ | `--workdir <path>` | Workspaces root directory (default: `./workdir`) |
77
+ | `--json` | Structured JSON logs instead of interactive UI |
78
+
79
+ ### Run Options
80
+
81
+ | Option | Alias | Description |
82
+ |--------|-------|-------------|
83
+ | `--workspace <name>` | `-w` | Workspace name for caching |
84
+ | `--force [steps]` | `-f` | Skip cache for all steps, or a comma-separated list |
85
+
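Internally, a bare `--force` arrives from the option parser as `true`, while `--force step1,step2` arrives as a raw string; a minimal sketch of how such a value can be normalized (helper name is illustrative):

```javascript
// Normalize a --force value: `true` means skip cache for every step,
// a string is a comma-separated list of step ids, anything else means
// normal caching. (Helper name is illustrative.)
function parseForce(raw) {
  if (raw === true) return true;
  if (typeof raw === 'string') return raw.split(',');
  return undefined;
}
```

A step then bypasses the cache when the result is `true` or when the array contains its id.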
86
+ ## Pipeline Format
87
+
88
+ Minimal example:
89
+
90
+ ```json
91
+ {
92
+ "name": "my-pipeline",
93
+ "steps": [
94
+ {
95
+ "id": "download",
96
+ "image": "alpine:3.19",
97
+ "cmd": ["sh", "-c", "echo hello > /output/hello.txt"]
98
+ },
99
+ {
100
+ "id": "process",
101
+ "image": "alpine:3.19",
102
+ "cmd": ["cat", "/input/download/hello.txt"],
103
+ "inputs": [{"step": "download"}]
104
+ }
105
+ ]
106
+ }
107
+ ```
108
+
109
+ ### Step Options
110
+
111
+ | Field | Type | Description |
112
+ |-------|------|-------------|
113
+ | `id` | string | Step identifier (required) |
114
+ | `image` | string | Docker image (required) |
115
+ | `cmd` | string[] | Command to execute (required) |
116
+ | `inputs` | InputSpec[] | Previous steps to mount as read-only |
117
+ | `env` | Record<string, string> | Environment variables |
118
+ | `outputPath` | string | Output mount point (default: `/output`) |
119
+ | `mounts` | MountSpec[] | Host directories to bind mount (read-only) |
120
+ | `caches` | CacheSpec[] | Persistent caches to mount |
121
+ | `timeoutSec` | number | Execution timeout |
122
+ | `allowFailure` | boolean | Continue pipeline if step fails |
123
+ | `allowNetwork` | boolean | Enable network access |
124
+
125
+ ### Inputs
126
+
127
+ Mount previous steps as read-only:
128
+
129
+ ```json
130
+ "inputs": [
131
+ {"step": "step1"},
132
+ {"step": "step2", "copyToOutput": true}
133
+ ]
134
+ ```
135
+
136
+ - Mounted under `/input/{stepName}/`
137
+ - `copyToOutput: true` copies the artifact's contents into the step's output directory before the step runs
138
+
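For `copyToOutput`, the runner recursively copies the input artifact into the step's output staging directory before the container starts; roughly (function name and paths are illustrative):

```javascript
import { cp } from 'node:fs/promises';

// Sketch: recursively copy an input artifact's directory into the step's
// output staging directory, as done when `copyToOutput: true` is set.
// (Function name is illustrative.)
async function copyInputToStaging(artifactDir, stagingDir) {
  await cp(artifactDir, stagingDir, { recursive: true });
}
```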
139
+ ### Host Mounts
140
+
141
+ Mount host directories into containers as **read-only**:
142
+
143
+ ```json
144
+ "mounts": [
145
+ {"host": "src/app", "container": "/app"},
146
+ {"host": "config", "container": "/config"}
147
+ ]
148
+ ```
149
+
150
+ - `host` must be a **relative** path (resolved from the pipeline file's directory)
151
+ - `container` must be an **absolute** path
152
+ - Neither path can contain `..`
153
+ - Always mounted read-only; containers cannot modify host files
154
+
155
+ This means a pipeline at `/project/ci/pipeline.json` can only mount subdirectories of `/project/ci/`. Use `/tmp` or `/output` inside the container for writes.
156
+
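The rules above amount to a small predicate (a sketch mirroring the loader's checks; function name is illustrative):

```javascript
// Sketch of the host-mount path rules: relative host path, absolute
// container path, and no '..' in either. (Function name is illustrative.)
function isValidMount({ host, container }) {
  return typeof host === 'string'
    && !host.startsWith('/')
    && !host.includes('..')
    && typeof container === 'string'
    && container.startsWith('/')
    && !container.includes('..');
}
```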
157
+ ### Caches
158
+
159
+ Persistent read-write directories shared across steps and executions:
160
+
161
+ ```json
162
+ "caches": [
163
+ {"name": "pnpm-store", "path": "/root/.local/share/pnpm/store"},
164
+ {"name": "build-cache", "path": "/tmp/cache"}
165
+ ]
166
+ ```
167
+
168
+ - **Persistent**: Caches survive across pipeline executions
169
+ - **Shared**: Multiple steps can use the same cache
170
+ - **Mutable**: Steps can read and write to caches
171
+
172
+ Common use cases:
173
+ - Package manager caches (pnpm, npm, cargo, maven)
174
+ - Build caches (gradle, ccache)
175
+ - Downloaded assets
176
+
177
+ **Note**: Caches are workspace-scoped (not global). Different workspaces have isolated caches.
178
+
179
+ ## Example
180
+
181
+ The `example/` directory contains a multi-language pipeline that chains Node.js and Python steps:
182
+
183
+ ```
184
+ example/
185
+ ├── pipeline.json
186
+ └── scripts/
187
+ ├── nodejs/ # lodash-based data analysis
188
+ │ ├── package.json
189
+ │ ├── analyze.js
190
+ │ └── transform.js
191
+ └── python/ # pyyaml-based enrichment
192
+ ├── pyproject.toml
193
+ ├── analyze.py
194
+ └── transform.py
195
+ ```
196
+
197
+ The pipeline runs 4 steps: `node-analyze` → `node-transform` → `python-analyze` → `python-transform`. Each step mounts its scripts directory as read-only and passes artifacts to the next step via `/input`.
198
+
199
+ ```bash
200
+ npm start -- run example/pipeline.json --workspace example-test
201
+ ```
202
+
203
+ ## Caching & Workspaces
204
+
205
+ Workspaces enable caching across runs. The workspace name is determined by the first match among:
206
+ 1. CLI flag `--workspace` (highest priority)
207
+ 2. Config `"name"` field
208
+ 3. Filename (e.g., `build.json` → `build`)
209
+ 4. Auto-generated timestamp
210
+
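The first three tiers map directly onto nullish coalescing plus a sanitizing fallback; a sketch (the timestamp fallback is omitted here, and the function name is illustrative):

```javascript
import { basename } from 'node:path';

// Sketch of workspace-name resolution: CLI flag, then config "name",
// then the pipeline filename with non [A-Za-z0-9_-] characters
// replaced by '-'. (Function name is illustrative.)
function resolveWorkspaceId(cliName, configName, pipelineFile) {
  return cliName
    ?? configName
    ?? basename(pipelineFile, '.json').replaceAll(/[^\w-]/g, '-');
}
```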
211
+ **Cache behavior**: Steps are skipped if image, cmd, env, inputs, and mounts haven't changed. See code documentation for details.
212
+
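One way to realize such a check is a stable hash over the cache-relevant fields; a minimal sketch, not necessarily the exact scheme used internally:

```javascript
import { createHash } from 'node:crypto';

// Sketch: hash the cache-relevant step fields. Any change to image, cmd,
// env, input artifact ids, or mounts yields a different fingerprint,
// which forces the step to re-run. (Function name is illustrative.)
function fingerprint({ image, cmd, env, inputArtifactIds, mounts }) {
  const payload = JSON.stringify({ image, cmd, env, inputArtifactIds, mounts });
  return createHash('sha256').update(payload).digest('hex');
}
```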
213
+ ## Troubleshooting
214
+
215
+ ### Docker not found
216
+
217
+ ```bash
218
+ # Verify Docker is accessible
219
+ docker --version
220
+ docker ps
221
+ ```
222
+
223
+ ### Permission denied (Linux)
224
+
225
+ ```bash
226
+ sudo usermod -aG docker $USER
227
+ newgrp docker
228
+ ```
229
+
230
+ ### Workspace disk full
231
+
232
+ Clean old workspaces:
233
+
234
+ ```bash
235
+ npm start -- list
236
+ npm start -- rm old-workspace-id
237
+ # Or remove all at once
238
+ npm start -- clean
239
+ ```
240
+
241
+ ### Cached step with missing artifact
242
+
243
+ Force re-execution:
244
+
245
+ ```bash
246
+ rm $PIPEX_WORKDIR/{workspace-id}/state.json
247
+ ```
248
+
249
+ ## Development
250
+
251
+ ```bash
252
+ npm run build
253
+ npm run lint
254
+ npm run lint:fix
255
+ ```
256
+
257
+ ## Architecture
258
+
259
+ For implementation details, see code documentation in:
260
+ - `src/engine/` - Low-level container execution (workspace, executor)
261
+ - `src/cli/` - Pipeline orchestration (runner, loader, state)
@@ -0,0 +1,126 @@
1
+ #!/usr/bin/env node
2
+ import 'dotenv/config';
3
+ import process from 'node:process';
4
+ import { resolve } from 'node:path';
5
+ import chalk from 'chalk';
6
+ import { Command } from 'commander';
7
+ import { Workspace } from '../engine/workspace.js';
8
+ import { DockerCliExecutor } from '../engine/docker-executor.js';
9
+ import { PipelineLoader } from './pipeline-loader.js';
10
+ import { PipelineRunner } from './pipeline-runner.js';
11
+ import { ConsoleReporter, InteractiveReporter } from './reporter.js';
12
+ function getGlobalOptions(cmd) {
13
+ return cmd.optsWithGlobals();
14
+ }
15
+ async function main() {
16
+ const program = new Command();
17
+ program
18
+ .name('pipex')
19
+ .description('Execution engine for containerized steps')
20
+ .version('0.0.1')
21
+ .option('--workdir <path>', 'Workspaces root directory', process.env.PIPEX_WORKDIR ?? './workdir')
22
+ .option('--json', 'Output structured JSON logs');
23
+ program
24
+ .command('run')
25
+ .description('Execute a pipeline')
26
+ .argument('<pipeline>', 'Pipeline JSON file to execute')
27
+ .option('-w, --workspace <name>', 'Workspace name (for caching)')
28
+ .option('-f, --force [steps]', 'Skip cache for all steps, or a comma-separated list (e.g. --force step1,step2)')
29
+ .action(async (pipelineFile, options, cmd) => {
30
+ const { workdir, json } = getGlobalOptions(cmd);
31
+ const workdirRoot = resolve(workdir);
32
+ const loader = new PipelineLoader();
33
+ const runtime = new DockerCliExecutor();
34
+ const reporter = json ? new ConsoleReporter() : new InteractiveReporter();
35
+ const runner = new PipelineRunner(loader, runtime, reporter, workdirRoot);
36
+ try {
37
+ const force = options.force === true
38
+ ? true
39
+ : (typeof options.force === 'string' ? options.force.split(',') : undefined);
40
+ await runner.run(pipelineFile, { workspace: options.workspace, force });
41
+ if (json) {
42
+ console.log('Pipeline completed');
43
+ }
44
+ }
45
+ catch (error) {
46
+ if (json) {
47
+ console.error('Pipeline failed:', error instanceof Error ? error.message : error);
48
+ }
49
+ throw error;
50
+ }
51
+ });
52
+ program
53
+ .command('list')
54
+ .alias('ls')
55
+ .description('List workspaces')
56
+ .action(async (_options, cmd) => {
57
+ const { workdir, json } = getGlobalOptions(cmd);
58
+ const workdirRoot = resolve(workdir);
59
+ const names = await Workspace.list(workdirRoot);
60
+ if (json) {
61
+ console.log(JSON.stringify(names));
62
+ return;
63
+ }
64
+ if (names.length === 0) {
65
+ console.log(chalk.gray('No workspaces found.'));
66
+ return;
67
+ }
68
+ const rows = [];
69
+ for (const name of names) {
70
+ const ws = await Workspace.open(workdirRoot, name);
71
+ const artifacts = await ws.listArtifacts();
72
+ const caches = await ws.listCaches();
73
+ rows.push({ name, artifacts: artifacts.length, caches: caches.length });
74
+ }
75
+ const nameWidth = Math.max('WORKSPACE'.length, ...rows.map(r => r.name.length));
76
+ const header = `${'WORKSPACE'.padEnd(nameWidth)} ARTIFACTS CACHES`;
77
+ console.log(chalk.bold(header));
78
+ for (const row of rows) {
79
+ console.log(`${row.name.padEnd(nameWidth)} ${String(row.artifacts).padStart(9)} ${String(row.caches).padStart(6)}`);
80
+ }
81
+ });
82
+ program
83
+ .command('rm')
84
+ .description('Remove one or more workspaces')
85
+ .argument('<workspace...>', 'Workspace names to remove')
86
+ .action(async (workspaces, _options, cmd) => {
87
+ const { workdir } = getGlobalOptions(cmd);
88
+ const workdirRoot = resolve(workdir);
89
+ const existing = await Workspace.list(workdirRoot);
90
+ for (const name of workspaces) {
91
+ if (!existing.includes(name)) {
92
+ console.error(chalk.red(`Workspace not found: ${name}`));
93
+ process.exitCode = 1;
94
+ return;
95
+ }
96
+ }
97
+ for (const name of workspaces) {
98
+ await Workspace.remove(workdirRoot, name);
99
+ console.log(chalk.green(`Removed ${name}`));
100
+ }
101
+ });
102
+ program
103
+ .command('clean')
104
+ .description('Remove all workspaces')
105
+ .action(async (_options, cmd) => {
106
+ const { workdir } = getGlobalOptions(cmd);
107
+ const workdirRoot = resolve(workdir);
108
+ const names = await Workspace.list(workdirRoot);
109
+ if (names.length === 0) {
110
+ console.log(chalk.gray('No workspaces to clean.'));
111
+ return;
112
+ }
113
+ for (const name of names) {
114
+ await Workspace.remove(workdirRoot, name);
115
+ }
116
+ console.log(chalk.green(`Removed ${names.length} workspace${names.length > 1 ? 's' : ''}.`));
117
+ });
118
+ await program.parseAsync();
119
+ }
120
+ try {
121
+ await main();
122
+ }
123
+ catch (error) {
124
+ console.error('Fatal error:', error);
125
+ throw error;
126
+ }
@@ -0,0 +1,87 @@
1
+ import { readFile } from 'node:fs/promises';
2
+ export class PipelineLoader {
3
+ async load(filePath) {
4
+ const content = await readFile(filePath, 'utf8');
5
+ const config = JSON.parse(content);
6
+ if (!Array.isArray(config.steps) || config.steps.length === 0) {
7
+ throw new Error('Invalid pipeline: steps must be a non-empty array');
8
+ }
9
+ for (const step of config.steps) {
10
+ this.validateStep(step);
11
+ }
12
+ return config;
13
+ }
14
+ validateStep(step) {
15
+ if (!step.id || typeof step.id !== 'string') {
16
+ throw new Error('Invalid step: id is required');
17
+ }
18
+ this.validateIdentifier(step.id, 'step id');
19
+ if (!step.image || typeof step.image !== 'string') {
20
+ throw new Error(`Invalid step ${step.id}: image is required`);
21
+ }
22
+ if (!Array.isArray(step.cmd) || step.cmd.length === 0) {
23
+ throw new Error(`Invalid step ${step.id}: cmd must be a non-empty array`);
24
+ }
25
+ if (step.inputs) {
26
+ for (const input of step.inputs) {
27
+ if (!input.step || typeof input.step !== 'string') {
+ throw new Error(`Invalid step ${step.id}: input.step is required and must be a string`);
+ }
+ this.validateIdentifier(input.step, `input step name in step ${step.id}`);
28
+ }
29
+ }
30
+ if (step.mounts) {
31
+ this.validateMounts(step.id, step.mounts);
32
+ }
33
+ if (step.caches) {
34
+ this.validateCaches(step.id, step.caches);
35
+ }
36
+ }
37
+ validateMounts(stepId, mounts) {
38
+ if (!Array.isArray(mounts)) {
39
+ throw new TypeError(`Step ${stepId}: mounts must be an array`);
40
+ }
41
+ for (const mount of mounts) {
42
+ if (!mount.host || typeof mount.host !== 'string') {
43
+ throw new Error(`Step ${stepId}: mount.host is required and must be a string`);
44
+ }
45
+ if (mount.host.startsWith('/')) {
46
+ throw new Error(`Step ${stepId}: mount.host '${mount.host}' must be a relative path`);
47
+ }
48
+ if (mount.host.includes('..')) {
49
+ throw new Error(`Step ${stepId}: mount.host '${mount.host}' must not contain '..'`);
50
+ }
51
+ if (!mount.container || typeof mount.container !== 'string') {
52
+ throw new Error(`Step ${stepId}: mount.container is required and must be a string`);
53
+ }
54
+ if (!mount.container.startsWith('/')) {
55
+ throw new Error(`Step ${stepId}: mount.container '${mount.container}' must be an absolute path`);
56
+ }
57
+ if (mount.container.includes('..')) {
58
+ throw new Error(`Step ${stepId}: mount.container '${mount.container}' must not contain '..'`);
59
+ }
60
+ }
61
+ }
62
+ validateCaches(stepId, caches) {
63
+ if (!Array.isArray(caches)) {
64
+ throw new TypeError(`Step ${stepId}: caches must be an array`);
65
+ }
66
+ for (const cache of caches) {
67
+ if (!cache.name || typeof cache.name !== 'string') {
68
+ throw new Error(`Step ${stepId}: cache.name is required and must be a string`);
69
+ }
70
+ this.validateIdentifier(cache.name, `cache name in step ${stepId}`);
71
+ if (!cache.path || typeof cache.path !== 'string') {
72
+ throw new Error(`Step ${stepId}: cache.path is required and must be a string`);
73
+ }
74
+ if (!cache.path.startsWith('/')) {
75
+ throw new Error(`Step ${stepId}: cache.path '${cache.path}' must be an absolute path`);
76
+ }
77
+ }
78
+ }
79
+ validateIdentifier(id, context) {
80
+ if (!/^[\w-]+$/.test(id)) {
81
+ throw new Error(`Invalid ${context}: '${id}' must contain only alphanumeric characters, underscore, and hyphen`);
82
+ }
83
+ if (id.includes('..')) {
84
+ throw new Error(`Invalid ${context}: '${id}' cannot contain '..'`);
85
+ }
86
+ }
87
+ }
@@ -0,0 +1,193 @@
1
+ import { cp } from 'node:fs/promises';
2
+ import { basename, dirname, resolve } from 'node:path';
3
+ import { Workspace } from '../engine/index.js';
4
+ import { StateManager } from './state.js';
5
+ /**
6
+ * Orchestrates pipeline execution with dependency resolution and caching.
7
+ *
8
+ * ## Workflow
9
+ *
10
+ * 1. **Workspace Resolution**: Determines workspace ID from CLI flag, config, or filename
11
+ * 2. **State Loading**: Loads cached fingerprints from state.json
12
+ * 3. **Step Execution**: For each step:
13
+ * a. Computes fingerprint (image + cmd + env + input artifact IDs)
14
+ * b. Checks cache (fingerprint match + artifact exists)
15
+ * c. If cached: skips execution
16
+ * d. If not cached: resolves inputs, prepares staging, executes container
17
+ * e. On success: commits artifact, saves state
18
+ * f. On failure: discards artifact, halts pipeline (unless allowFailure)
19
+ * 4. **Completion**: Reports final pipeline status
20
+ *
21
+ * ## Dependencies
22
+ *
23
+ * Steps declare dependencies via `inputs: [{step: "stepId"}]`.
24
+ * The runner:
25
+ * - Mounts input artifacts as read-only volumes
26
+ * - Optionally copies inputs to output staging (if `copyToOutput: true`)
27
+ * - Tracks execution order to resolve step names to artifact IDs
28
+ *
29
+ * ## Caching
30
+ *
31
+ * Cache invalidation is automatic:
32
+ * - Changing a step's configuration re-runs it
33
+ * - Re-running a step invalidates all dependent steps
34
+ */
35
+ export class PipelineRunner {
36
+ loader;
37
+ runtime;
38
+ reporter;
39
+ workdirRoot;
40
+ constructor(loader, runtime, reporter, workdirRoot) {
41
+ this.loader = loader;
42
+ this.runtime = runtime;
43
+ this.reporter = reporter;
44
+ this.workdirRoot = workdirRoot;
45
+ }
46
+ async run(pipelineFilePath, options) {
47
+ const { workspace: workspaceName, force } = options ?? {};
48
+ const config = await this.loader.load(pipelineFilePath);
49
+ const pipelineRoot = dirname(resolve(pipelineFilePath));
50
+ // Workspace ID priority: CLI arg > config.name > filename
51
+ const workspaceId = workspaceName
52
+ ?? config.name
53
+ ?? basename(pipelineFilePath, '.json').replaceAll(/[^\w-]/g, '-');
54
+ let workspace;
55
+ try {
56
+ workspace = await Workspace.open(this.workdirRoot, workspaceId);
57
+ }
58
+ catch {
59
+ workspace = await Workspace.create(this.workdirRoot, workspaceId);
60
+ }
61
+ await workspace.cleanupStaging();
62
+ await this.runtime.check();
63
+ const state = new StateManager(workspace.root);
64
+ await state.load();
65
+ const stepArtifacts = new Map();
66
+ this.reporter.state(workspace.id, 'PIPELINE_START');
67
+ for (const step of config.steps) {
68
+ const inputArtifactIds = step.inputs
69
+ ?.map(i => stepArtifacts.get(i.step))
70
+ .filter((id) => id !== undefined);
71
+ const resolvedMounts = step.mounts?.map(m => ({
72
+ hostPath: resolve(pipelineRoot, m.host),
73
+ containerPath: m.container
74
+ }));
75
+ const currentFingerprint = StateManager.fingerprint({
76
+ image: step.image,
77
+ cmd: step.cmd,
78
+ env: step.env,
79
+ inputArtifactIds,
80
+ mounts: resolvedMounts
81
+ });
82
+ const skipCache = force === true || (Array.isArray(force) && force.includes(step.id));
83
+ if (!skipCache && await this.tryUseCache({ workspace, state, step, currentFingerprint, stepArtifacts })) {
84
+ continue;
85
+ }
86
+ this.reporter.state(workspace.id, 'STEP_STARTING', step.id);
87
+ const artifactId = workspace.generateArtifactId();
88
+ const stagingPath = await workspace.prepareArtifact(artifactId);
89
+ await this.prepareStagingWithInputs(workspace, step, stagingPath, stepArtifacts);
90
+ // Prepare caches
91
+ if (step.caches) {
92
+ for (const cache of step.caches) {
93
+ await workspace.prepareCache(cache.name);
94
+ }
95
+ }
96
+ const { inputs, output, caches, mounts } = this.buildMounts(step, artifactId, stepArtifacts, pipelineRoot);
97
+ const result = await this.runtime.run(workspace, {
98
+ name: `pipex-${workspace.id}-${step.id}-${Date.now()}`,
99
+ image: step.image,
100
+ cmd: step.cmd,
101
+ env: step.env,
102
+ inputs,
103
+ output,
104
+ caches,
105
+ mounts,
106
+ network: step.allowNetwork ? 'bridge' : 'none',
107
+ timeoutSec: step.timeoutSec
108
+ }, ({ stream, line }) => {
109
+ this.reporter.log(workspace.id, step.id, stream, line);
110
+ });
111
+ this.reporter.result(workspace.id, step.id, result);
112
+ if (result.exitCode === 0 || step.allowFailure) {
113
+ await workspace.commitArtifact(artifactId);
114
+ stepArtifacts.set(step.id, artifactId);
115
+ state.setStep(step.id, artifactId, currentFingerprint);
116
+ await state.save();
117
+ this.reporter.state(workspace.id, 'STEP_FINISHED', step.id, { artifactId });
118
+ }
119
+ else {
120
+ await workspace.discardArtifact(artifactId);
121
+ this.reporter.state(workspace.id, 'STEP_FAILED', step.id, { exitCode: result.exitCode });
122
+ this.reporter.state(workspace.id, 'PIPELINE_FAILED');
123
+ throw new Error(`Step ${step.id} failed with exit code ${result.exitCode}`);
124
+ }
125
+ }
126
+ this.reporter.state(workspace.id, 'PIPELINE_FINISHED');
127
+ }
128
+ async tryUseCache({ workspace, state, step, currentFingerprint, stepArtifacts }) {
129
+ const cached = state.getStep(step.id);
130
+ if (cached?.fingerprint === currentFingerprint) {
131
+ try {
132
+ const artifacts = await workspace.listArtifacts();
133
+ if (artifacts.includes(cached.artifactId)) {
134
+ stepArtifacts.set(step.id, cached.artifactId);
135
+ this.reporter.state(workspace.id, 'STEP_SKIPPED', step.id, { artifactId: cached.artifactId, reason: 'cached' });
136
+ return true;
137
+ }
138
+ }
139
+ catch {
140
+ // Artifact missing, proceed with execution
141
+ }
142
+ }
143
+ return false;
144
+ }
145
+ async prepareStagingWithInputs(workspace, step, stagingPath, stepArtifacts) {
146
+ if (!step.inputs) {
147
+ return;
148
+ }
149
+ for (const input of step.inputs) {
150
+ const inputArtifactId = stepArtifacts.get(input.step);
151
+ if (!inputArtifactId) {
152
+ throw new Error(`Step ${step.id}: input step '${input.step}' not found or not yet executed`);
153
+ }
154
+ if (input.copyToOutput) {
155
+ await cp(workspace.artifactPath(inputArtifactId), stagingPath, { recursive: true });
156
+ }
157
+ }
158
+ }
159
+ buildMounts(step, outputArtifactId, stepArtifacts, pipelineRoot) {
160
+ const inputs = [];
161
+ if (step.inputs) {
162
+ for (const input of step.inputs) {
163
+ const inputArtifactId = stepArtifacts.get(input.step);
164
+ if (inputArtifactId) {
165
+ inputs.push({
166
+ artifactId: inputArtifactId,
167
+ containerPath: `/input/${input.step}`
168
+ });
169
+ }
170
+ }
171
+ }
172
+ const output = {
173
+ stagingArtifactId: outputArtifactId,
174
+ containerPath: step.outputPath ?? '/output'
175
+ };
176
+ // Build cache mounts
177
+ let caches;
178
+ if (step.caches) {
179
+ caches = step.caches.map(c => ({
180
+ name: c.name,
181
+ containerPath: c.path
182
+ }));
183
+ }
184
+ let mounts;
185
+ if (step.mounts) {
186
+ mounts = step.mounts.map(m => ({
187
+ hostPath: resolve(pipelineRoot, m.host),
188
+ containerPath: m.container
189
+ }));
190
+ }
191
+ return { inputs, output, caches, mounts };
192
+ }
193
+ }