builderman 1.2.0 → 1.4.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,129 +1,361 @@
1
- # **builderman**
2
-
3
- #### _A simple task runner for building and developing projects._
4
-
5
- <br />
1
+ # builderman
2
+
3
+ #### A dependency-aware task runner for building, developing, and orchestrating complex workflows.
4
+
5
+ **builderman** lets you define tasks with explicit dependencies, lifecycle hooks, and multiple execution modes (`dev`, `build`, `deploy`, etc.), then compose them into pipelines that run **deterministically**, **observably**, and **safely**.
6
+
7
+ It is designed for monorepos, long-running development processes, and CI/CD pipelines where **cleanup, cancellation, and failure handling matter**.
8
+
9
+ ---
10
+
11
+ ## Table of Contents
12
+
13
+ > - [Key Features](#key-features)
14
+ > - [Installation](#installation)
15
+ > - [Quick Start](#quick-start)
16
+ > - [Core Concepts](#core-concepts)
17
+ > - [Tasks](#tasks)
18
+ > - [Commands & Modes](#commands--modes)
19
+ > - [Environment Variables](#environment-variables)
20
+ > - [Dependencies](#dependencies)
21
+ > - [Pipelines](#pipelines)
22
+ > - [Pipeline Composition](#pipeline-composition)
23
+ > - [Error Handling Guarantees](#error-handling-guarantees)
24
+ > - [Cancellation](#cancellation)
25
+ > - [Teardown](#teardown)
26
+ > - [Basic Teardown](#basic-teardown)
27
+ > - [Teardown Callbacks](#teardown-callbacks)
28
+ > - [Teardown Execution Rules](#teardown-execution-rules)
29
+ > - [Skipping Tasks](#skipping-tasks)
30
+ > - [Strict Mode](#strict-mode)
31
+ > - [Task-Level Skip Override](#task-level-skip-override)
32
+ > - [Execution Statistics](#execution-statistics)
33
+ > - [Pipeline Statistics](#pipeline-statistics)
34
+ > - [Task Statistics](#task-statistics)
35
+ > - [When Should I Use builderman?](#when-should-i-use-builderman)
36
+
37
+ ## Key Features
38
+
39
+ - 🧩 **Explicit dependency graph** — tasks run only when their dependencies are satisfied
40
+ - 🔁 **Multi-mode commands** — `dev`, `build`, `deploy`, or any custom mode
41
+ - ⏳ **Readiness detection** — wait for long-running processes to become “ready”
42
+ - 🧹 **Guaranteed teardown** — automatic cleanup in reverse dependency order
43
+ - 🛑 **Cancellation support** — abort pipelines using `AbortSignal`
44
+ - 📊 **Rich execution statistics** — always available, even on failure
45
+ - ❌ **Never throws** — failures are returned as structured results
46
+ - 🧱 **Composable pipelines** — pipelines can be converted into tasks
47
+
48
+ ---
6
49
 
7
50
  ## Installation
8
51
 
9
- ```bash
52
+ ```sh
10
53
  npm install builderman
11
54
  ```
12
55
 
13
- ## Usage
56
+ ---
57
+
58
+ ## Quick Start
14
59
 
15
60
  ```ts
16
61
  import { task, pipeline } from "builderman"
17
62
 
18
- const task1 = task({
63
+ const build = task({
64
+ name: "build",
65
+ commands: { build: "tsc" },
66
+ cwd: "packages/my-package", // Optional: defaults to "."
67
+ })
68
+
69
+ const test = task({
70
+ name: "test",
71
+ commands: { build: "npm test" },
72
+ dependencies: [build],
73
+ cwd: "packages/my-package",
74
+ })
75
+
76
+ const result = await pipeline([build, test]).run({
77
+ command: "build",
78
+ })
79
+
80
+ if (!result.ok) {
81
+ console.error("Pipeline failed:", result.error.message)
82
+ }
83
+ ```
84
+
85
+ This defines a simple dependency graph where `test` runs only after `build` completes successfully.
86
+
87
+ ---
88
+
89
+ ## Core Concepts
90
+
91
+ ### Tasks
92
+
93
+ A **task** represents a unit of work. Each task:
94
+
95
+ - Has a unique name
96
+ - Defines commands for one or more modes
97
+ - May depend on other tasks
98
+ - May register teardown logic
99
+ - Has an optional working directory (`cwd`, defaults to `"."`)
100
+
101
+ ```ts
102
+ import { task } from "builderman"
103
+
104
+ const libTask = task({
19
105
  name: "lib:build",
20
106
  commands: {
21
107
  build: "tsc",
22
108
  dev: {
23
109
  run: "tsc --watch",
24
- readyWhen: (stdout) => {
25
- // mark this task as ready when the process is watching for file changes
26
- return stdout.includes("Watching for file changes.")
27
- },
110
+ readyWhen: (stdout) => stdout.includes("Watching for file changes."),
28
111
  },
29
112
  },
30
113
  cwd: "packages/lib",
31
114
  })
115
+ ```
32
116
 
33
- const task2 = task({
34
- name: "consumer:dev",
117
+ ---
118
+
119
+ ### Commands & Modes
120
+
121
+ Each task can define commands for different **modes** (for example `dev`, `build`, `deploy`).
122
+
123
+ When running a pipeline:
124
+
125
+ - If `command` is provided, that mode is used
126
+ - Otherwise:
127
+ - `"build"` is used when `NODE_ENV === "production"`
128
+ - `"dev"` is used in all other cases
129
+
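
The selection rules above can be sketched in a few lines (an illustrative sketch only, not builderman's actual implementation):

```typescript
// Simplified sketch of the mode-selection rules described above.
// `command` is the explicit mode passed to pipeline.run(), if any.
function resolveMode(
  command: string | undefined,
  env: Record<string, string | undefined>,
): string {
  // An explicit mode always wins.
  if (command !== undefined) return command
  // Otherwise NODE_ENV decides.
  return env.NODE_ENV === "production" ? "build" : "dev"
}

resolveMode("deploy", {})                          // "deploy"
resolveMode(undefined, { NODE_ENV: "production" }) // "build"
resolveMode(undefined, {})                         // "dev"
```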
130
+ Commands may be:
131
+
132
+ - A string (executed directly), or
133
+ - An object with:
134
+ - `run`: the command to execute
135
+ - `readyWhen`: a predicate that marks the task as ready
136
+ - `teardown`: cleanup logic to run after completion
137
+ - `env`: environment variables specific to this command
138
+
139
+ ---
140
+
141
+ ### Environment Variables
142
+
143
+ Environment variables can be provided at multiple levels, with more specific levels overriding less specific ones:
144
+
145
+ **Precedence order (highest to lowest):**
146
+
147
+ 1. Command-level `env` (in command config)
148
+ 2. Task-level `env` (in task config)
149
+ 3. Pipeline-level `env` (in `pipeline.run()`)
150
+ 4. Process environment variables
151
+
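
The precedence order can be pictured as a chain of object spreads, where later (more specific) layers override earlier ones. This is an illustrative sketch, not builderman's internals:

```typescript
// Sketch of the precedence order above: each spread overrides the
// previous one, so the command-level value wins on conflict.
type Env = Record<string, string>

function effectiveEnv(
  processEnv: Env,
  pipelineEnv: Env = {},
  taskEnv: Env = {},
  commandEnv: Env = {},
): Env {
  return { ...processEnv, ...pipelineEnv, ...taskEnv, ...commandEnv }
}

const env = effectiveEnv(
  { NODE_ENV: "development" }, // process environment
  { PORT: "8080" },            // pipeline-level
  { PORT: "4000" },            // task-level
  { PORT: "3000" },            // command-level
)
// env.PORT === "3000": the command-level value wins
```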
152
+ #### Command-Level Environment Variables
153
+
154
+ ```ts
155
+ const apiTask = task({
156
+ name: "api",
157
+ commands: {
158
+ dev: {
159
+ run: "npm run dev",
160
+ env: {
161
+ PORT: "3000",
162
+ NODE_ENV: "development",
163
+ },
164
+ },
165
+ },
166
+ })
167
+ ```
168
+
169
+ #### Task-Level Environment Variables
170
+
171
+ ```ts
172
+ const apiTask = task({
173
+ name: "api",
35
174
  commands: {
36
- build: "npm run build",
37
175
  dev: "npm run dev",
38
- deploy: "npm run deploy",
176
+ build: "npm run build",
177
+ },
178
+ env: {
179
+ API_URL: "http://localhost:3000",
180
+ LOG_LEVEL: "debug",
39
181
  },
40
- cwd: "packages/consumer",
41
- dependencies: [task1],
42
182
  })
183
+ ```
184
+
185
+ #### Pipeline-Level Environment Variables
43
186
 
44
- await pipeline([task1, task2]).run({
45
- // default command is "build" if process.NODE_ENV is "production", otherwise "dev".
46
- command: "deploy",
47
- onTaskBegin: (taskName) => {
48
- console.log(`[${taskName}] Starting...`)
187
+ ```ts
188
+ const result = await pipeline([apiTask]).run({
189
+ env: {
190
+ DATABASE_URL: "postgres://localhost/mydb",
191
+ REDIS_URL: "redis://localhost:6379",
49
192
  },
50
- onTaskComplete: (taskName) => {
51
- console.log(`[${taskName}] Complete!`)
193
+ })
194
+ ```
195
+
196
+ #### Nested Pipeline Environment Variables
197
+
198
+ When converting a pipeline to a task, you can provide environment variables that will be merged with the outer pipeline's environment:
199
+
200
+ ```ts
201
+ const innerPipeline = pipeline([
202
+ /* ... */
203
+ ])
204
+ const innerTask = innerPipeline.toTask({
205
+ name: "inner",
206
+ env: {
207
+ INNER_VAR: "inner-value",
52
208
  },
53
- onPipelineComplete: () => {
54
- console.log("All tasks complete! 🎉")
209
+ })
210
+
211
+ const outerPipeline = pipeline([innerTask])
212
+ const result = await outerPipeline.run({
213
+ env: {
214
+ OUTER_VAR: "outer-value",
55
215
  },
56
- onPipelineError: (error) => {
57
- console.error(`Pipeline error: ${error.message}`)
216
+ })
217
+ ```
218
+
219
+ In this example, tasks in `innerPipeline` will receive both `INNER_VAR` and `OUTER_VAR`, with `INNER_VAR` taking precedence if there's a conflict.
220
+
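
To picture the merge, here is a minimal sketch (the `SHARED` variable is hypothetical, added only to show the conflict case):

```typescript
// Hypothetical illustration of how outer and inner env layers combine.
const outerEnv = { OUTER_VAR: "outer-value", SHARED: "outer" }
const innerEnv = { INNER_VAR: "inner-value", SHARED: "inner" }

// Inner pipeline tasks see both layers; inner values win on conflict.
const merged = { ...outerEnv, ...innerEnv }
// merged.SHARED === "inner"
```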
221
+ ---
222
+
223
+ ### Dependencies
224
+
225
+ Tasks may depend on other tasks. A task will not start until all its dependencies have completed (or been skipped).
226
+
227
+ ```ts
228
+ const consumerTask = task({
229
+ name: "consumer:dev",
230
+ commands: {
231
+ build: "npm run build",
232
+ dev: "npm run dev",
58
233
  },
234
+ cwd: "packages/consumer",
235
+ dependencies: [libTask],
59
236
  })
60
237
  ```
61
238
 
62
- ## Error Handling
239
+ ---
240
+
241
+ ### Pipelines
63
242
 
64
- Pipeline errors are provided as `PipelineError` instances with error codes for easier handling:
243
+ A **pipeline** executes a set of tasks according to their dependency graph.
65
244
 
66
245
  ```ts
67
- import { pipeline, PipelineError } from "builderman"
246
+ import { pipeline } from "builderman"
68
247
 
69
- await pipeline([task1, task2]).run({
70
- onPipelineError: (error) => {
71
- switch (error.code) {
72
- case PipelineError.Aborted:
73
- console.error("Pipeline was cancelled")
74
- break
75
- case PipelineError.TaskFailed:
76
- console.error(`Task failed: ${error.message}`)
77
- break
78
- case PipelineError.ProcessTerminated:
79
- console.error("Process was terminated")
80
- break
81
- case PipelineError.InvalidTask:
82
- console.error(`Invalid task configuration: ${error.message}`)
83
- break
84
- case PipelineError.InvalidSignal:
85
- console.error("Invalid abort signal")
86
- break
87
- }
248
+ const result = await pipeline([libTask, consumerTask]).run({
249
+ command: "dev",
250
+ onTaskBegin: (name) => {
251
+ console.log(`[${name}] starting`)
252
+ },
253
+ onTaskComplete: (name) => {
254
+ console.log(`[${name}] complete`)
88
255
  },
89
256
  })
90
257
  ```
91
258
 
92
- ## Cancellation
259
+ ---
260
+
261
+ ### Pipeline Composition
93
262
 
94
- You can cancel a running pipeline by providing an `AbortSignal`:
263
+ Pipelines can be converted into tasks and composed like any other unit of work.
264
+
265
+ ```ts
266
+ const build = pipeline([
267
+ /* ... */
268
+ ])
269
+ const test = pipeline([
270
+ /* ... */
271
+ ])
272
+ const deploy = pipeline([
273
+ /* ... */
274
+ ])
275
+
276
+ const buildTask = build.toTask({ name: "build" })
277
+ const testTask = test.toTask({
278
+ name: "test",
279
+ dependencies: [buildTask],
280
+ env: { TEST_ENV: "test-value" }, // Optional: env for nested pipeline
281
+ })
282
+ const deployTask = deploy.toTask({ name: "deploy", dependencies: [testTask] })
283
+
284
+ const ci = pipeline([buildTask, testTask, deployTask])
285
+ const result = await ci.run()
286
+ ```
287
+
288
+ When a pipeline is converted to a task, it becomes a **single node** in the dependency graph. The nested pipeline must fully complete before dependents can start.
289
+
290
+ ---
291
+
292
+ ## Error Handling Guarantees
293
+
294
+ **builderman pipelines never throw.**
295
+
296
+ All failures — including task errors, invalid configuration, cancellation, and process termination — are reported through a structured `RunResult`.
95
297
 
96
298
  ```ts
97
299
  import { pipeline, PipelineError } from "builderman"
98
300
 
99
- const abortController = new AbortController()
301
+ const result = await pipeline([libTask, consumerTask]).run()
100
302
 
101
- const runPromise = pipeline([task1, task2]).run({
102
- signal: abortController.signal,
103
- onPipelineError: (error) => {
104
- if (error.code === PipelineError.Aborted) {
303
+ if (!result.ok) {
304
+ switch (result.error.code) {
305
+ case PipelineError.Aborted:
105
306
  console.error("Pipeline was cancelled")
106
- }
107
- },
307
+ break
308
+ case PipelineError.TaskFailed:
309
+ console.error(`Task failed: ${result.error.message}`)
310
+ break
311
+ case PipelineError.ProcessTerminated:
312
+ console.error("Process was terminated")
313
+ break
314
+ case PipelineError.InvalidTask:
315
+ console.error(`Invalid task configuration: ${result.error.message}`)
316
+ break
317
+ case PipelineError.InvalidSignal:
318
+ console.error("Invalid abort signal")
319
+ break
320
+ }
321
+ }
322
+ ```
323
+
324
+ Execution statistics are **always available**, even on failure.
325
+
326
+ ---
327
+
328
+ ## Cancellation
329
+
330
+ You can cancel a running pipeline using an `AbortSignal`.
331
+
332
+ ```ts
333
+ const controller = new AbortController()
334
+
335
+ const runPromise = pipeline([libTask, consumerTask]).run({
336
+ signal: controller.signal,
108
337
  })
109
338
 
110
- // Cancel the pipeline after 5 seconds
339
+ // Cancel after 5 seconds
111
340
  setTimeout(() => {
112
- abortController.abort()
341
+ controller.abort()
113
342
  }, 5000)
114
343
 
115
- try {
116
- await runPromise
117
- } catch (error) {
118
- if (error instanceof PipelineError && error.code === PipelineError.Aborted) {
119
- // Pipeline was cancelled
120
- }
344
+ const result = await runPromise
345
+
346
+ if (!result.ok && result.error.code === PipelineError.Aborted) {
347
+ console.error("Pipeline was cancelled")
348
+ console.log(`Tasks still running: ${result.stats.summary.running}`)
121
349
  }
122
350
  ```
123
351
 
352
+ ---
353
+
124
354
  ## Teardown
125
355
 
126
- Tasks can specify teardown commands that run automatically when the task completes or fails. Teardowns are executed in reverse dependency order (dependents before dependencies) to ensure proper cleanup.
356
+ Tasks may specify teardown commands that run automatically when a task completes or fails.
357
+
358
+ Teardowns are executed **in reverse dependency order** (dependents before dependencies) to ensure safe cleanup.
127
359
 
128
360
  ### Basic Teardown
129
361
 
@@ -137,92 +369,63 @@ const dbTask = task({
137
369
  },
138
370
  build: "echo build",
139
371
  },
140
- cwd: ".",
141
372
  })
142
373
  ```
143
374
 
375
+ ---
376
+
144
377
  ### Teardown Callbacks
145
378
 
146
- You can monitor teardown execution with callbacks. Note that teardown failures do not cause the pipeline to fail - they are fire-and-forget cleanup operations:
379
+ You can observe teardown execution using callbacks. Teardown failures do **not** cause the pipeline to fail — they are best-effort cleanup operations.
147
380
 
148
381
  ```ts
149
- await pipeline([dbTask]).run({
382
+ const result = await pipeline([dbTask]).run({
150
383
  onTaskTeardown: (taskName) => {
151
- console.log(`[${taskName}] Starting teardown...`)
384
+ console.log(`[${taskName}] starting teardown`)
152
385
  },
153
386
  onTaskTeardownError: (taskName, error) => {
154
- console.error(`[${taskName}] Teardown failed: ${error.message}`)
155
- // error is a regular Error instance (not a PipelineError)
156
- // Teardown failures do not affect pipeline success/failure
387
+ console.error(`[${taskName}] teardown failed: ${error.message}`)
157
388
  },
158
389
  })
159
390
  ```
160
391
 
392
+ Teardown results are recorded in task statistics.
393
+
394
+ ---
395
+
161
396
  ### Teardown Execution Rules
162
397
 
163
398
  Teardowns run when:
164
399
 
165
- - ✅ The command entered the running state (regardless of success or failure)
166
- - ✅ The pipeline completes successfully
167
- - ✅ The pipeline fails after tasks have started
400
+ - The command entered the running state
401
+ - The pipeline completes successfully
402
+ - The pipeline fails after tasks have started
168
403
 
169
404
  Teardowns do **not** run when:
170
405
 
171
- - ❌ The task was skipped (no command for the current mode)
172
- - ❌ The task failed before starting (spawn error)
173
- - ❌ The pipeline never began execution
406
+ - The task was skipped
407
+ - The task failed before starting (spawn error)
408
+ - The pipeline never began execution
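
The reverse ordering can be illustrated with a small sketch. The task names and dependency graph here are hypothetical, and the traversal is a simplified stand-in for builderman's scheduler:

```typescript
// Minimal sketch of reverse-dependency teardown ordering.
// Maps each task to the tasks it depends on (hypothetical graph).
const deps: Record<string, string[]> = {
  db: [],
  api: ["db"],    // api depends on db
  worker: ["db"], // worker depends on db
}

// Depth-first post-order yields a valid startup order (dependencies first).
function startupOrder(graph: Record<string, string[]>): string[] {
  const order: string[] = []
  const seen = new Set<string>()
  const visit = (name: string): void => {
    if (seen.has(name)) return
    seen.add(name)
    for (const dep of graph[name]) visit(dep)
    order.push(name)
  }
  for (const name of Object.keys(graph)) visit(name)
  return order
}

const up = startupOrder(deps)  // ["db", "api", "worker"]
const down = [...up].reverse() // ["worker", "api", "db"]: db is cleaned up last
```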
174
409
 
175
- ### Reverse Dependency Order
176
-
177
- Teardowns execute in reverse dependency order to ensure dependents are cleaned up before their dependencies:
178
-
179
- ```ts
180
- const db = task({
181
- name: "db",
182
- commands: {
183
- dev: { run: "docker-compose up", teardown: "docker-compose down" },
184
- build: "echo build",
185
- },
186
- cwd: ".",
187
- })
188
-
189
- const api = task({
190
- name: "api",
191
- commands: {
192
- dev: { run: "npm run dev", teardown: "echo stopping api" },
193
- build: "echo build",
194
- },
195
- cwd: ".",
196
- dependencies: [db], // api depends on db
197
- })
198
-
199
- // Teardown order: api first, then db
200
- await pipeline([db, api]).run()
201
- ```
410
+ ---
202
411
 
203
412
  ## Skipping Tasks
204
413
 
205
- Tasks can be automatically skipped when they don't have a command for the current mode. This is useful for multi-mode pipelines where some tasks are only relevant in certain contexts.
414
+ If a task does not define a command for the current mode, it is **skipped** by default.
206
415
 
207
- ### Default Behavior
416
+ Skipped tasks:
208
417
 
209
- If a task has no command for the current mode, it is **skipped**:
210
-
211
- - ✅ The task participates in the dependency graph
212
- - ✅ The task resolves immediately (satisfies dependencies)
213
- - ✅ Dependents are unblocked
214
- - ❌ No command is executed
215
- - ❌ No teardown is registered
216
- - ❌ No readiness is waited for
418
+ - Participate in the dependency graph
419
+ - Resolve immediately
420
+ - Unblock dependent tasks
421
+ - Do not execute commands or teardowns
217
422
 
218
423
  ```ts
219
424
  const dbTask = task({
220
425
  name: "database",
221
426
  commands: {
222
427
  dev: "docker-compose up",
223
- // No build command - will be skipped in build mode
224
428
  },
225
- cwd: ".",
226
429
  })
227
430
 
228
431
  const apiTask = task({
@@ -231,131 +434,108 @@ const apiTask = task({
231
434
  dev: "npm run dev",
232
435
  build: "npm run build",
233
436
  },
234
- cwd: ".",
235
- dependencies: [dbTask], // dbTask will be skipped, but apiTask will still run
437
+ dependencies: [dbTask],
236
438
  })
237
439
 
238
- await pipeline([dbTask, apiTask]).run({
440
+ const result = await pipeline([dbTask, apiTask]).run({
239
441
  command: "build",
240
442
  onTaskSkipped: (taskName, mode) => {
241
- console.log(`[${taskName}] skipped (no command for mode "${mode}")`)
443
+ console.log(`[${taskName}] skipped (no "${mode}" command)`)
242
444
  },
243
445
  })
244
446
  ```
245
447
 
448
+ ---
449
+
246
450
  ### Strict Mode
247
451
 
248
- In strict mode, missing commands cause the pipeline to fail. Use this for CI/release pipelines where every task is expected to participate:
452
+ In **strict mode**, missing commands cause the pipeline to fail. This is useful for CI and release pipelines.
249
453
 
250
454
  ```ts
251
- await pipeline([dbTask, apiTask]).run({
455
+ const result = await pipeline([dbTask, apiTask]).run({
252
456
  command: "build",
253
- strict: true, // Missing commands will cause pipeline to fail
457
+ strict: true,
254
458
  })
459
+
460
+ if (!result.ok) {
461
+ console.error("Pipeline failed in strict mode:", result.error.message)
462
+ }
255
463
  ```
256
464
 
257
- ### Task-Level Override
465
+ ---
466
+
467
+ ### Task-Level Skip Override
258
468
 
259
- Even with global strict mode, you can explicitly allow a task to be skipped:
469
+ Tasks may explicitly allow skipping, even when strict mode is enabled.
260
470
 
261
471
  ```ts
262
472
  const dbTask = task({
263
473
  name: "database",
264
474
  commands: {
265
475
  dev: "docker-compose up",
266
- // No build command, but explicitly allowed to skip
267
476
  },
268
- cwd: ".",
269
- allowSkip: true, // Explicitly allow skipping even in strict mode
477
+ allowSkip: true,
270
478
  })
271
479
 
272
- await pipeline([dbTask]).run({
480
+ const result = await pipeline([dbTask]).run({
273
481
  command: "build",
274
- strict: true, // Global strict mode
275
- // dbTask will still be skipped because allowSkip: true
482
+ strict: true,
276
483
  })
277
484
  ```
278
485
 
279
- ### Nested Pipeline Behavior
280
-
281
- When a pipeline is converted to a task, skip behavior is preserved:
486
+ ---
282
487
 
283
- - If **all** inner tasks are skipped → outer task is skipped
284
- - If **some** run, some skip → outer task is completed
285
- - If **any** fail → outer task fails
488
+ ## Execution Statistics
286
489
 
287
- ```ts
288
- const innerPipeline = pipeline([
289
- task({ name: "inner1", commands: { dev: "..." }, cwd: "." }),
290
- task({ name: "inner2", commands: { dev: "..." }, cwd: "." }),
291
- ])
490
+ Every pipeline run returns detailed execution statistics.
292
491
 
293
- const outerTask = innerPipeline.toTask({ name: "outer" })
492
+ ### Pipeline Statistics
294
493
 
295
- // If all inner tasks are skipped in build mode, outer task is also skipped
296
- await pipeline([outerTask]).run({ command: "build" })
494
+ ```ts
495
+ console.log(result.stats.status) // "success" | "failed" | "aborted"
496
+ console.log(result.stats.command) // Executed mode
497
+ console.log(result.stats.durationMs) // Total execution time
498
+ console.log(result.stats.summary.total)
499
+ console.log(result.stats.summary.completed)
500
+ console.log(result.stats.summary.failed)
501
+ console.log(result.stats.summary.skipped)
502
+ console.log(result.stats.summary.running)
297
503
  ```
298
504
 
299
- ## Pipeline Composition
300
-
301
- Build complex workflows by composing tasks and pipelines together.
505
+ ---
302
506
 
303
- ### Task Chaining
507
+ ### Task Statistics
304
508
 
305
- Chain tasks together using `andThen()` to create a pipeline that will run the tasks in order, automatically adding the previous task as a dependency:
509
+ Each task provides detailed per-task data:
306
510
 
307
511
  ```ts
308
- import { task } from "builderman"
512
+ for (const task of Object.values(result.stats.tasks)) {
513
+ console.log(task.name, task.status)
514
+ console.log(task.durationMs)
309
515
 
310
- const build = task({
311
- name: "compile",
312
- commands: {
313
- build: "tsc",
314
- dev: {
315
- run: "tsc --watch",
316
- readyWhen: (output) => output.includes("Watching for file changes."),
317
- },
318
- },
319
- cwd: "packages/lib",
320
- }).andThen({
321
- name: "bundle",
322
- commands: {
323
- build: "rollup",
324
- dev: {
325
- run: "rollup --watch",
326
- readyWhen: (output) => output.includes("Watching for file changes."),
327
- },
328
- },
329
- cwd: "packages/lib",
330
- })
516
+ if (task.status === "failed") {
517
+ console.error(task.error?.message)
518
+ console.error(task.exitCode)
519
+ }
331
520
 
332
- await build.run()
521
+ if (task.teardown) {
522
+ console.log("Teardown:", task.teardown.status)
523
+ }
524
+ }
333
525
  ```
334
526
 
335
- ### Composing Pipelines as Tasks
527
+ ---
336
528
 
337
- Convert pipelines to tasks and compose them with explicit dependencies:
529
+ ## When Should I Use builderman?
338
530
 
339
- ```ts
340
- const build = pipeline([
341
- /* ... */
342
- ])
343
- const test = pipeline([
344
- /* ... */
345
- ])
346
- const deploy = pipeline([
347
- /* ... */
348
- ])
531
+ **builderman** is a good fit when:
349
532
 
350
- // Convert to tasks first
351
- const buildTask = build.toTask({ name: "build" })
352
- const testTask = test.toTask({ name: "test", dependencies: [buildTask] })
353
- const deployTask = deploy.toTask({ name: "deploy", dependencies: [testTask] })
533
+ - You have dependent tasks that must run in a strict order
534
+ - You run long-lived dev processes that need readiness detection
535
+ - Cleanup matters (databases, containers, servers)
536
+ - You want structured results instead of log-scraping
354
537
 
355
- // Compose into final pipeline
356
- const ci = pipeline([buildTask, testTask, deployTask])
357
-
358
- await ci.run()
359
- ```
538
+ It may be overkill if:
360
539
 
361
- **Note:** When a pipeline is converted to a task, it becomes a single unit in the dependency graph. The nested pipeline will execute completely before any dependent tasks can start.
540
+ - You only need a few linear npm scripts
541
+ - You do not need dependency graphs or teardown guarantees