@openfn/cli 0.4.16 → 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -15,6 +15,7 @@ The CLI includes:
15
15
 
16
16
  - [Installation](#installation)
17
17
  - [Updating](#updating)
18
+ - [Terminology](#terminology)
18
19
  - [Migrating from devtools](#migrating-from-devtools)
19
20
  - [Basic Usage](#basic-usage)
20
21
  - [Advanced Usage](#advanced-usage)
@@ -71,27 +72,36 @@ npm uninstall -g @openfn/cli
71
72
 
72
73
  And then re-installing.
73
74
 
74
- ## Migrating from devtools
75
+ ## Terminology
75
76
 
76
- If you're coming to the CLI from the old openfn devtools, here are a couple of key points to be aware of:
77
+ The CLI (and the wider OpenFn stack) uses some very particular terminology:
77
78
 
78
- - The CLI has a shorter, sleeker syntax, so your command should be much shorter
79
- - The CLI will automatically install adaptors for you (with full version control)
79
+ - An **Expression** is a string of JavaScript (or JavaScript-like code) written to be run in the CLI or Lightning.
80
+ - A **Job** is an expression plus some metadata required to run it - typically an adaptor and credentials.
81
+ The terms Job and Expression are often used interchangeably.
82
+ - A **Workflow** is a series of steps to be executed in sequence. Steps are usually Jobs (and so job and step are often used
83
+ interchangeably), but can be Triggers.
84
+ - An **Execution Plan** is a Workflow plus some options which inform how it should be executed (ie, start node, timeout).
85
+ The term "Execution plan" is mostly used internally and not exposed to users, and is usually interchangeable with Workflow.
86
+
87
+ Note that an expression is not generally portable (ie, cannot run in other environments) unless it is compiled.
88
+ A compiled expression has imports and exports and, so long as packages are available, can run in a simple
89
+ JavaScript runtime.
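The compiled form described above can be sketched with plain Node. Note this is an illustration only: `fn` below is a local stand-in for the adaptor function of the same name (the real one comes from an adaptor like `@openfn/language-common`), and `run` is a toy reducer, not the actual `@openfn/runtime`.

```javascript
// Local stand-in for an adaptor operation factory (assumption for illustration).
const fn = (callback) => (state) => callback(state);

// A compiled expression's default export is, in effect, an array of operations,
// each a function of state.
const operations = [
  fn((state) => ({ ...state, data: { count: 1 } })),
  fn((state) => ({ ...state, data: { count: state.data.count + 1 } })),
];

// A toy runtime: reduce state through each operation in sequence.
const run = async (ops, initialState) =>
  ops.reduce(async (state, op) => op(await state), initialState);

run(operations, { data: {} }).then((final) => console.log(final.data.count)); // 2
```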
80
90
 
81
91
  ## Basic Usage
82
92
 
83
- You're probably here to run jobs (expressions) or workflows, which the CLI makes easy:
93
+ You're probably here to run Workflows (or individual jobs), which the CLI makes easy:
84
94
 
85
95
  ```
86
96
  openfn path/to/workflow.json
87
- openfn path/to/job.js -ia adaptor-name
97
+ openfn path/to/job.js -a adaptor-name
88
98
  ```
89
99
 
90
100
  If running a single job, you MUST specify which adaptor to use.
91
101
 
92
- Pass the `-i` flag to auto-install any required adaptors (it's safe to do this redundantly, although the run will be a little slower).
102
+ If the requested adaptor (or a matching version) is not already installed, it will be installed automatically. To disable this behaviour, pass the `--no-autoinstall` flag.
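Short adaptor names passed with `-a` are expanded to full package specifiers before installation. A hypothetical reimplementation of that expansion, consistent with the `expand` helper visible elsewhere in this diff (the exact edge cases may differ):

```javascript
// Expand a short adaptor name to its full package specifier,
// preserving any version suffix. Names already under @openfn/ pass through.
const expand = (name) => {
  if (typeof name === 'string' && !name.startsWith('@openfn/')) {
    const [shortName, version] = name.split('@');
    return version
      ? `@openfn/language-${shortName}@${version}`
      : `@openfn/language-${shortName}`;
  }
  return name;
};

console.log(expand('common'));              // @openfn/language-common
console.log(expand('common@1.7.5'));        // @openfn/language-common@1.7.5
console.log(expand('@openfn/language-http')); // @openfn/language-http
```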
93
103
 
94
- When the finished, the CLI will write the resulting state to disk. By default the CLI will create an `output.json` next to the job file. You can pass a path to output by passing `-o path/to/output.json` and state by adding `-s path/to/state.json`. You can use `-S` and `-O` to pass state through stdin and return the output through stdout.
104
+ When finished, the CLI will write the resulting state to disk. By default the CLI will create an `output.json` next to the job file. You can pass a path to output by passing `-o path/to/output.json` and state by adding `-s path/to/state.json`. You can use `-S` and `-O` to pass state through stdin and return the output through stdout.
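Credentials never reach the output file: the `configuration` key is stripped from final state before serialization, as the `serializeOutput` code later in this diff shows. A minimal sketch of that behaviour:

```javascript
// Final state as a job might leave it.
const finalState = {
  configuration: { token: 'secret' }, // credentials: never written to output.json
  data: { id: 1 },
  references: [],
};

// Strip configuration before writing, mirroring serializeOutput in this diff.
const { configuration, ...output } = finalState;
console.log(JSON.stringify(output)); // {"data":{"id":1},"references":[]}
```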
95
105
 
96
106
  The CLI maintains a repo for auto-installed adaptors. Run `openfn repo list` to see where the repo is, and what's in it. Set the `OPENFN_REPO_DIR` env var to specify the repo folder. When autoinstalling, the CLI will check to see if a matching version is found in the repo. `openfn repo clean` will remove all adaptors from the repo. The repo also includes any documentation and metadata built with the CLI.
97
107
 
@@ -103,14 +113,16 @@ You can pass `--log info` to get more feedback about what's happening, or `--log
103
113
 
104
114
  ## Advanced Usage
105
115
 
106
- The CLI has a number of commands (the first argument after openfn)
116
+ The CLI has a number of commands (the first argument after `openfn`):
107
117
 
108
118
  - execute - run a job
109
- - compile - compile a job to a .js file
119
+ - compile - compile a job to a .js file (prints to stdout by default)
110
120
  - docs - show documentation for an adaptor function
111
121
  - repo - manage the repo of installed modules
112
122
  - docgen - generate JSON documentation for an adaptor based on its typescript
113
123
 
124
+ For example, `openfn compile job.js -a common` will compile the code at `job.js` with the common adaptor.
125
+
114
126
  If no command is specified, execute will run.
115
127
 
116
128
  To get more information about a command, including usage examples, run `openfn <command> help`, ie, `openfn compile help`.
@@ -253,38 +265,43 @@ Pass `--log-json` to the CLI to do this. You can also set the OPENFN_LOG_JSON en
253
265
 
254
266
  ## Workflows
255
267
 
256
- As of v0.0.35 the CLI supports running workflows as well as jobs.
257
-
258
- A workflow is in execution plan for running several jobs in a sequence. It is defined as a JSON structure.
268
+ A workflow is an execution plan for running several jobs in a sequence. It is defined as a JSON structure.
259
269
 
260
270
  To see an example workflow, run the test command with `openfn test`.
261
271
 
262
- A workflow has a structure like this (better documentation is coming soon):
272
+ A workflow has a structure like this:
263
273
 
264
274
  ```json
265
275
  {
266
- "start": "a", // optionally specify the start node (defaults to jobs[0])
267
- "jobs": [
268
- {
269
- "id": "a",
270
- "expression": "fn((state) => state)", // code or a path
271
- "adaptor": "@openfn/language-common@1.75", // specifiy the adaptor to use (version optional)
272
- "data": {}, // optionally pre-populate the data object (this will be overriden by keys in in previous state)
273
- "configuration": {}, // Use this to pass credentials
274
- "next": {
275
- // This object defines which jobs to call next
276
- // All edges returning true will run
277
- // If there are no next edges, the workflow will end
278
- "b": true,
279
- "c": {
280
- "condition": "!state.error" // Note that this is an expression, not a function
276
+ "workflow": {
277
+ "name": "my-workflow", // human readable name used in logging
278
+ "steps": [
279
+ {
280
+ "name": "a", // human readable name used in logging
281
+ "expression": "fn((state) => state)", // code or a path to an expression.js file
282
+ "adaptor": "@openfn/language-common@1.7.5", // specify the adaptor to use (version optional)
283
+ "data": {}, // optionally pre-populate the data object (this will be overridden by keys in previous state)
284
+ "configuration": {}, // Use this to pass credentials
285
+ "next": {
286
+ // This object defines which jobs to call next
287
+ // All edges returning true will run
288
+ // If there are no next edges, the workflow will end
289
+ "b": true,
290
+ "c": {
291
+ "condition": "!state.error" // Note that this is a strict JavaScript expression, not a function, and has no adaptor support
292
+ }
281
293
  }
282
294
  }
283
- }
284
- ]
295
+ ]
296
+ },
297
+ "options": {
298
+ "start": "a" // optionally specify the start node (defaults to steps[0])
299
+ }
285
300
  }
286
301
  ```
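The `condition` strings on edges are plain JavaScript expressions evaluated against state. One way such a string might be evaluated (an illustrative assumption, not the actual runtime mechanism):

```javascript
// Compile a condition string into a function of state and call it.
// This is a sketch; the real runtime's evaluation strategy may differ.
const evaluateCondition = (condition, state) =>
  new Function('state', `return ${condition};`)(state);

console.log(evaluateCondition('!state.error', { data: {} }));        // true
console.log(evaluateCondition('!state.error', { error: 'boom' }));   // false
```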
287
302
 
303
+ See `packages/lexicon` for type definitions (the workflow format is covered by the `ExecutionPlan` type).
304
+
288
305
  ## Compilation
289
306
 
290
307
  The CLI will compile your job code into regular Javascript. It does a number of things to make your code robust and portable:
@@ -298,8 +315,6 @@ The result of this is a lightweight, modern JS module. It can be executed in any
298
315
 
299
316
  The CLI uses openfn's own runtime to execute jobs in a safe environment.
300
317
 
301
- All jobs which work against `@openfn/core` will work in the new CLI and runtime environment (note: although this is a work in progress and we are actively looking for help to test this!).
302
-
303
318
  If you want to see how the compiler is changing your job, run `openfn compile path/to/job -a <adaptor>` to return the compiled code to stdout. Add `-o path/to/output.js` to save the result to disk.
304
319
 
305
320
  ## Contributing
@@ -355,10 +370,10 @@ export OPENFN_ADAPTORS_REPO=~/repo/openfn/adaptors
355
370
 
356
371
  ### Contributing changes
357
372
 
358
- Open a PR at https://github.com/openfn/kit. Include a changeset and a description of your change.
359
-
360
- See the root readme for more details about changests,
373
+ Include a changeset and a description of your change. Run this command and follow the interactive prompt (it's really easy, promise!):
361
374
 
362
375
  ```
363
-
376
+ pnpm changeset
364
377
  ```
378
+
379
+ Commit the changeset files and open a PR at https://github.com/openfn/kit.
package/dist/index.js CHANGED
@@ -49,19 +49,18 @@ var expand = (name) => {
49
49
  }
50
50
  return name;
51
51
  };
52
- var expand_adaptors_default = (opts2) => {
53
- const { adaptors: adaptors2, workflow } = opts2;
54
- if (adaptors2) {
55
- opts2.adaptors = adaptors2?.map(expand);
56
- }
57
- if (workflow) {
58
- Object.values(workflow.jobs).forEach((job) => {
59
- if (job.adaptor) {
60
- job.adaptor = expand(job.adaptor);
61
- }
62
- });
63
- }
64
- return opts2;
52
+ var expand_adaptors_default = (input) => {
53
+ if (Array.isArray(input)) {
54
+ return input?.map(expand);
55
+ }
56
+ const plan = input;
57
+ Object.values(plan.workflow.steps).forEach((step) => {
58
+ const job = step;
59
+ if (job.adaptor) {
60
+ job.adaptor = expand(job.adaptor);
61
+ }
62
+ });
63
+ return plan;
65
64
  };
66
65
 
67
66
  // src/util/logger.ts
@@ -157,7 +156,7 @@ var adaptors = {
157
156
  opts2.adaptors = [];
158
157
  }
159
158
  if (opts2.expandAdaptors) {
160
- expand_adaptors_default(opts2);
159
+ opts2.adaptors = expand_adaptors_default(opts2.adaptors);
161
160
  }
162
161
  delete opts2.adaptor;
163
162
  delete opts2.a;
@@ -168,8 +167,8 @@ var autoinstall = {
168
167
  yargs: {
169
168
  alias: ["i"],
170
169
  boolean: true,
171
- description: "Auto-install the language adaptor",
172
- default: false
170
+ description: "Auto-install the language adaptor(s)",
171
+ default: true
173
172
  }
174
173
  };
175
174
  var compile = {
@@ -263,12 +262,12 @@ var inputPath = {
263
262
  ensure: (opts2) => {
264
263
  const { path: basePath } = opts2;
265
264
  if (basePath?.endsWith(".json")) {
266
- opts2.workflowPath = basePath;
265
+ opts2.planPath = basePath;
267
266
  } else if (basePath?.endsWith(".js")) {
268
- opts2.jobPath = basePath;
267
+ opts2.expressionPath = basePath;
269
268
  } else {
270
269
  const base = getBaseDir(opts2);
271
- setDefaultValue(opts2, "jobPath", path2.join(base, "job.js"));
270
+ setDefaultValue(opts2, "expressionPath", path2.join(base, "job.js"));
272
271
  }
273
272
  }
274
273
  };
@@ -345,33 +344,6 @@ var start = {
345
344
  description: "Specifiy the start node in a workflow"
346
345
  }
347
346
  };
348
- var strictOutput = {
349
- name: "no-strict-output",
350
- yargs: {
351
- deprecated: true,
352
- hidden: true,
353
- boolean: true
354
- },
355
- ensure: (opts2) => {
356
- if (!opts2.hasOwnProperty("strict")) {
357
- opts2.strict = opts2.strictOutput;
358
- }
359
- delete opts2.strictOutput;
360
- }
361
- };
362
- var strict = {
363
- name: "strict",
364
- yargs: {
365
- default: false,
366
- boolean: true,
367
- description: "Enables strict state handling, meaning only state.data is returned from a job."
368
- },
369
- ensure: (opts2) => {
370
- if (!opts2.hasOwnProperty("strictOutput")) {
371
- setDefaultValue(opts2, "strict", false);
372
- }
373
- }
374
- };
375
347
  var skipAdaptorValidation = {
376
348
  name: "skip-adaptor-validation",
377
349
  yargs: {
@@ -573,18 +545,14 @@ var options4 = [
573
545
  start,
574
546
  statePath,
575
547
  stateStdin,
576
- strict,
577
- strictOutput,
578
548
  timeout,
579
549
  useAdaptorsMonorepo
580
550
  ];
581
551
  var executeCommand = {
582
552
  command: "execute [path]",
583
- describe: `Run an openfn job or workflow. Get more help by running openfn <command> help.
584
-
585
- Execute will run a job/workflow at the path and write the output state to disk (to ./state.json unless otherwise specified)
553
+ describe: `Run an openfn expression or workflow. Get more help by running openfn <command> help.
586
554
 
587
- By default only state.data will be returned fron a job. Include --no-strict to write the entire state object.
555
+ Execute will run an expression/workflow at the path and write the output state to disk (to ./state.json unless otherwise specified)
588
556
 
589
557
  Remember to include the adaptor name with -a. Auto install adaptors with the -i flag.`,
590
558
  aliases: ["$0"],
@@ -603,7 +571,7 @@ Remember to include the adaptor name with -a. Auto install adaptors with the -i
603
571
  "Execute job.js with common adaptor and info-level logging"
604
572
  ).example(
605
573
  "openfn compile job.js -a http",
606
- "Compile job.js with the http adaptor and print the code to stdout"
574
+ "Compile the expression at job.js with the http adaptor and print the code to stdout"
607
575
  )
608
576
  };
609
577
  var command_default5 = executeCommand;
@@ -32,18 +32,15 @@ var logger_default = createLogger;
32
32
  var createNullLogger = () => createLogger(void 0, { log: { default: "none" } });
33
33
 
34
34
  // src/execute/execute.ts
35
- var execute_default = async (input, state, opts) => {
35
+ var execute_default = async (plan, input, opts) => {
36
36
  try {
37
- const result = await run(input, state, {
38
- strict: opts.strict,
39
- start: opts.start,
40
- timeout: opts.timeout,
37
+ const result = await run(plan, input, {
41
38
  immutableState: opts.immutable,
42
39
  logger: logger_default(RUNTIME, opts),
43
40
  jobLogger: logger_default(JOB, opts),
44
41
  linker: {
45
42
  repo: opts.repoDir,
46
- modules: parseAdaptors(opts)
43
+ modules: parseAdaptors(plan)
47
44
  }
48
45
  });
49
46
  return result;
@@ -52,7 +49,7 @@ var execute_default = async (input, state, opts) => {
52
49
  throw e;
53
50
  }
54
51
  };
55
- function parseAdaptors(opts) {
52
+ function parseAdaptors(plan) {
56
53
  const extractInfo = (specifier) => {
57
54
  const [module, path7] = specifier.split("=");
58
55
  const { name, version } = getNameAndVersion(module);
@@ -68,21 +65,13 @@ function parseAdaptors(opts) {
68
65
  return info;
69
66
  };
70
67
  const adaptors = {};
71
- if (opts.adaptors) {
72
- opts.adaptors.reduce((obj, exp) => {
73
- const { name, ...maybeVersionAndPath } = extractInfo(exp);
74
- obj[name] = { ...maybeVersionAndPath };
75
- return obj;
76
- }, adaptors);
77
- }
78
- if (opts.workflow) {
79
- Object.values(opts.workflow.jobs).forEach((job) => {
80
- if (job.adaptor) {
81
- const { name, ...maybeVersionAndPath } = extractInfo(job.adaptor);
82
- adaptors[name] = { ...maybeVersionAndPath };
83
- }
84
- });
85
- }
68
+ Object.values(plan.workflow.steps).forEach((step) => {
69
+ const job = step;
70
+ if (job.adaptor) {
71
+ const { name, ...maybeVersionAndPath } = extractInfo(job.adaptor);
72
+ adaptors[name] = maybeVersionAndPath;
73
+ }
74
+ });
86
75
  return adaptors;
87
76
  }
88
77
 
@@ -91,15 +80,8 @@ import { writeFile } from "node:fs/promises";
91
80
  var serializeOutput = async (options, result, logger) => {
92
81
  let output = result;
93
82
  if (output && (output.configuration || output.data)) {
94
- if (options.strict) {
95
- output = { data: output.data };
96
- if (result.errors) {
97
- output.errors = result.errors;
98
- }
99
- } else {
100
- const { configuration, ...rest } = result;
101
- output = rest;
102
- }
83
+ const { configuration, ...rest } = result;
84
+ output = rest;
103
85
  }
104
86
  if (output === void 0) {
105
87
  output = "";
@@ -119,20 +101,15 @@ var serializeOutput = async (options, result, logger) => {
119
101
  var serialize_output_default = serializeOutput;
120
102
 
121
103
  // src/execute/get-autoinstall-targets.ts
122
- var getAutoinstallTargets = (options) => {
123
- if (options.workflow) {
124
- const adaptors = {};
125
- Object.values(options.workflow.jobs).forEach((job) => {
126
- if (job.adaptor) {
127
- adaptors[job.adaptor] = true;
128
- }
129
- });
130
- return Object.keys(adaptors);
131
- }
132
- if (options.adaptors) {
133
- return options.adaptors?.filter((a) => !/=/.test(a));
134
- }
135
- return [];
104
+ var getAutoinstallTargets = (plan) => {
105
+ const adaptors = {};
106
+ Object.values(plan.workflow.steps).forEach((step) => {
107
+ const job = step;
108
+ if (job.adaptor && !/=/.test(job.adaptor)) {
109
+ adaptors[job.adaptor] = true;
110
+ }
111
+ });
112
+ return Object.keys(adaptors);
136
113
  };
137
114
  var get_autoinstall_targets_default = getAutoinstallTargets;
138
115
 
@@ -232,20 +209,15 @@ var abort_default = (logger, reason, error, help) => {
232
209
  };
233
210
 
234
211
  // src/compile/compile.ts
235
- var compile_default = async (opts, log) => {
236
- log.debug("Compiling...");
237
- let job;
238
- if (opts.workflow) {
239
- job = compileWorkflow(opts.workflow, opts, log);
240
- } else {
241
- job = await compileJob(opts.job || opts.jobPath, opts, log);
242
- }
243
- if (opts.jobPath) {
244
- log.success(`Compiled from ${opts.jobPath}`);
245
- } else {
246
- log.success("Compilation complete");
212
+ var compile_default = async (planOrPath, opts, log) => {
213
+ if (typeof planOrPath === "string") {
214
+ const result = await compileJob(planOrPath, opts, log);
215
+ log.success(`Compiled expression from ${opts.expressionPath}`);
216
+ return result;
247
217
  }
248
- return job;
218
+ const compiledPlan = compileWorkflow(planOrPath, opts, log);
219
+ log.success("Compiled all expressions in workflow");
220
+ return compiledPlan;
249
221
  };
250
222
  var compileJob = async (job, opts, log, jobName) => {
251
223
  try {
@@ -258,10 +230,12 @@ var compileJob = async (job, opts, log, jobName) => {
258
230
  e,
259
231
  "Check the syntax of the job expression:\n\n" + job
260
232
  );
233
+ return "";
261
234
  }
262
235
  };
263
- var compileWorkflow = async (workflow, opts, log) => {
264
- for (const job of workflow.jobs) {
236
+ var compileWorkflow = async (plan, opts, log) => {
237
+ for (const step of plan.workflow.steps) {
238
+ const job = step;
265
239
  const jobOpts = {
266
240
  ...opts
267
241
  };
@@ -277,7 +251,7 @@ var compileWorkflow = async (workflow, opts, log) => {
277
251
  );
278
252
  }
279
253
  }
280
- return workflow;
254
+ return plan;
281
255
  };
282
256
  var stripVersionSpecifier = (specifier) => {
283
257
  const idx = specifier.lastIndexOf("@");
@@ -379,15 +353,16 @@ var validateAdaptors = async (options, logger) => {
379
353
  if (options.skipAdaptorValidation) {
380
354
  return;
381
355
  }
356
+ const isPlan = options.planPath || options.workflowPath;
382
357
  const hasDeclaredAdaptors = options.adaptors && options.adaptors.length > 0;
383
- if (options.workflowPath && hasDeclaredAdaptors) {
358
+ if (isPlan && hasDeclaredAdaptors) {
384
359
  logger.error("ERROR: adaptor and workflow provided");
385
360
  logger.error(
386
361
  "This is probably not what you meant to do. A workflow should declare an adaptor for each job."
387
362
  );
388
363
  throw new Error("adaptor and workflow provided");
389
364
  }
390
- if (!options.workflowPath && !hasDeclaredAdaptors) {
365
+ if (!isPlan && !hasDeclaredAdaptors) {
391
366
  logger.warn("WARNING: No adaptor provided!");
392
367
  logger.warn(
393
368
  "This job will probably fail. Pass an adaptor with the -a flag, eg:"
@@ -399,118 +374,10 @@ var validateAdaptors = async (options, logger) => {
399
374
  };
400
375
  var validate_adaptors_default = validateAdaptors;
401
376
 
402
- // src/util/load-input.ts
403
- import path from "node:path";
377
+ // src/util/load-plan.ts
404
378
  import fs2 from "node:fs/promises";
379
+ import path2 from "node:path";
405
380
  import { isPath } from "@openfn/compiler";
406
- var load_input_default = async (opts, log) => {
407
- const { job, workflow, jobPath, workflowPath } = opts;
408
- if (workflow || workflowPath) {
409
- return loadWorkflow(opts, log);
410
- }
411
- if (job) {
412
- return job;
413
- }
414
- if (jobPath) {
415
- try {
416
- log.debug(`Loading job from ${jobPath}`);
417
- opts.job = await fs2.readFile(jobPath, "utf8");
418
- return opts.job;
419
- } catch (e) {
420
- abort_default(
421
- log,
422
- "Job not found",
423
- void 0,
424
- `Failed to load the job from ${jobPath}`
425
- );
426
- }
427
- }
428
- };
429
- var loadWorkflow = async (opts, log) => {
430
- const { workflowPath, workflow } = opts;
431
- const readWorkflow = async () => {
432
- try {
433
- const text = await fs2.readFile(workflowPath, "utf8");
434
- return text;
435
- } catch (e) {
436
- abort_default(
437
- log,
438
- "Workflow not found",
439
- void 0,
440
- `Failed to load a workflow from ${workflowPath}`
441
- );
442
- }
443
- };
444
- const parseWorkflow = (contents) => {
445
- try {
446
- return JSON.parse(contents);
447
- } catch (e) {
448
- abort_default(
449
- log,
450
- "Invalid JSON in workflow",
451
- e,
452
- `Check the syntax of the JSON at ${workflowPath}`
453
- );
454
- }
455
- };
456
- const fetchWorkflowFile = async (jobId, rootDir = "", filePath) => {
457
- try {
458
- const fullPath = filePath.startsWith("~") ? filePath : path.resolve(rootDir, filePath);
459
- const result = await fs2.readFile(fullPath, "utf8");
460
- return result;
461
- } catch (e) {
462
- abort_default(
463
- log,
464
- `File not found for job ${jobId}: ${filePath}`,
465
- void 0,
466
- `This workflow references a file which cannot be found at ${filePath}
467
-
468
- Paths inside the workflow are relative to the workflow.json`
469
- );
470
- }
471
- };
472
- log.debug(`Loading workflow from ${workflowPath}`);
473
- try {
474
- let wf;
475
- let rootDir = opts.baseDir;
476
- if (workflowPath) {
477
- let workflowRaw = await readWorkflow();
478
- wf = parseWorkflow(workflowRaw);
479
- if (!rootDir) {
480
- rootDir = path.dirname(workflowPath);
481
- }
482
- } else {
483
- wf = workflow;
484
- }
485
- let idx = 0;
486
- for (const job of wf.jobs) {
487
- idx += 1;
488
- const expressionStr = typeof job.expression === "string" && job.expression?.trim();
489
- const configurationStr = typeof job.configuration === "string" && job.configuration?.trim();
490
- if (expressionStr && isPath(expressionStr)) {
491
- job.expression = await fetchWorkflowFile(
492
- job.id || `${idx}`,
493
- rootDir,
494
- expressionStr
495
- );
496
- }
497
- if (configurationStr && isPath(configurationStr)) {
498
- const configString = await fetchWorkflowFile(
499
- job.id || `${idx}`,
500
- rootDir,
501
- configurationStr
502
- );
503
- job.configuration = JSON.parse(configString);
504
- }
505
- }
506
- opts.workflow = wf;
507
- log.debug("Workflow loaded!");
508
- return opts.workflow;
509
- } catch (e) {
510
- log.error(`Error loading workflow from ${workflowPath}`);
511
- throw e;
512
- }
513
- };
514
381
 
515
382
  // src/util/expand-adaptors.ts
516
383
  var expand = (name) => {
@@ -523,24 +390,23 @@ var expand = (name) => {
523
390
  }
524
391
  return name;
525
392
  };
526
- var expand_adaptors_default = (opts) => {
527
- const { adaptors, workflow } = opts;
528
- if (adaptors) {
529
- opts.adaptors = adaptors?.map(expand);
393
+ var expand_adaptors_default = (input) => {
394
+ if (Array.isArray(input)) {
395
+ return input?.map(expand);
530
396
  }
531
- if (workflow) {
532
- Object.values(workflow.jobs).forEach((job) => {
533
- if (job.adaptor) {
534
- job.adaptor = expand(job.adaptor);
535
- }
536
- });
537
- }
538
- return opts;
397
+ const plan = input;
398
+ Object.values(plan.workflow.steps).forEach((step) => {
399
+ const job = step;
400
+ if (job.adaptor) {
401
+ job.adaptor = expand(job.adaptor);
402
+ }
403
+ });
404
+ return plan;
539
405
  };
540
406
 
541
407
  // src/util/map-adaptors-to-monorepo.ts
542
408
  import { readFile } from "node:fs/promises";
543
- import path2 from "node:path";
409
+ import path from "node:path";
544
410
  import assert from "node:assert";
545
411
  import { getNameAndVersion as getNameAndVersion2 } from "@openfn/runtime";
546
412
  var validateMonoRepo = async (repoPath, log) => {
@@ -564,33 +430,193 @@ var updatePath = (adaptor, repoPath, log) => {
564
430
  );
565
431
  }
566
432
  const shortName = name.replace("@openfn/language-", "");
567
- const abspath = path2.resolve(repoPath, "packages", shortName);
433
+ const abspath = path.resolve(repoPath, "packages", shortName);
434
+ log.info(`Mapped adaptor ${name} to monorepo: ${abspath}`);
568
435
  return `${name}=${abspath}`;
569
436
  };
570
- var mapAdaptorsToMonorepo = async (options, log) => {
571
- const { adaptors, monorepoPath, workflow } = options;
437
+ var mapAdaptorsToMonorepo = (monorepoPath = "", input = [], log) => {
572
438
  if (monorepoPath) {
573
- await validateMonoRepo(monorepoPath, log);
574
- log.success(`Loading adaptors from monorepo at ${monorepoPath}`);
575
- if (adaptors) {
576
- options.adaptors = adaptors.map((a) => {
577
- const p = updatePath(a, monorepoPath, log);
578
- log.info(`Mapped adaptor ${a} to monorepo: ${p.split("=")[1]}`);
579
- return p;
580
- });
581
- }
582
- if (workflow) {
583
- Object.values(workflow.jobs).forEach((job) => {
584
- if (job.adaptor) {
585
- job.adaptor = updatePath(job.adaptor, monorepoPath, log);
586
- }
587
- });
439
+ if (Array.isArray(input)) {
440
+ const adaptors = input;
441
+ return adaptors.map((a) => updatePath(a, monorepoPath, log));
588
442
  }
443
+ const plan = input;
444
+ Object.values(plan.workflow.steps).forEach((step) => {
445
+ const job = step;
446
+ if (job.adaptor) {
447
+ job.adaptor = updatePath(job.adaptor, monorepoPath, log);
448
+ }
449
+ });
450
+ return plan;
589
451
  }
590
- return options;
452
+ return input;
591
453
  };
592
454
  var map_adaptors_to_monorepo_default = mapAdaptorsToMonorepo;
593
455
 
456
+ // src/util/load-plan.ts
457
+ var loadPlan = async (options, logger) => {
458
+ const { workflowPath, planPath, expressionPath } = options;
459
+ if (expressionPath) {
460
+ return loadExpression(options, logger);
461
+ }
462
+ const jsonPath = planPath || workflowPath;
463
+ if (!options.baseDir) {
464
+ options.baseDir = path2.dirname(jsonPath);
465
+ }
466
+ const json = await loadJson(jsonPath, logger);
467
+ const defaultName = path2.parse(jsonPath).name;
468
+ if (json.workflow) {
469
+ return loadXPlan(json, options, logger, defaultName);
470
+ } else {
471
+ return loadOldWorkflow(json, options, logger, defaultName);
472
+ }
473
+ };
474
+ var load_plan_default = loadPlan;
475
+ var loadJson = async (workflowPath, logger) => {
476
+ let text;
477
+ try {
478
+ text = await fs2.readFile(workflowPath, "utf8");
479
+ } catch (e) {
480
+ return abort_default(
481
+ logger,
482
+ "Workflow not found",
483
+ void 0,
484
+ `Failed to load a workflow from ${workflowPath}`
485
+ );
486
+ }
487
+ let json;
488
+ try {
489
+ json = JSON.parse(text);
490
+ } catch (e) {
491
+ return abort_default(
492
+ logger,
493
+ "Invalid JSON in workflow",
494
+ e,
495
+ `Check the syntax of the JSON at ${workflowPath}`
496
+ );
497
+ }
498
+ return json;
499
+ };
500
+ var maybeAssign = (a, b, keys) => {
501
+ keys.forEach((key) => {
502
+ if (a.hasOwnProperty(key)) {
503
+ b[key] = a[key];
504
+ }
505
+ });
506
+ };
507
+ var loadExpression = async (options, logger) => {
508
+ const expressionPath = options.expressionPath;
509
+ logger.debug(`Loading expression from ${expressionPath}`);
510
+ try {
511
+ const expression = await fs2.readFile(expressionPath, "utf8");
512
+ const name = path2.parse(expressionPath).name;
513
+ const step = { expression };
514
+ if (options.adaptors) {
515
+ const [adaptor] = options.adaptors;
516
+ if (adaptor) {
517
+ step.adaptor = adaptor;
518
+ }
519
+ }
520
+ const wfOptions = {};
521
+ maybeAssign(options, wfOptions, ["timeout"]);
522
+ const plan = {
523
+ workflow: {
524
+ name,
525
+ steps: [step]
526
+ },
527
+ options: wfOptions
528
+ };
529
+ return loadXPlan(plan, options, logger);
530
+ } catch (e) {
531
+ abort_default(
532
+ logger,
533
+ "Expression not found",
534
+ void 0,
535
+ `Failed to load the expression from ${expressionPath}`
536
+ );
537
+ return {};
538
+ }
539
+ };
540
+ var loadOldWorkflow = async (workflow, options, logger, defaultName = "") => {
541
+ const plan = {
542
+ workflow: {
543
+ steps: workflow.jobs
544
+ },
545
+ options: {
546
+ start: workflow.start
547
+ }
548
+ };
549
+ if (workflow.id) {
550
+ plan.id = workflow.id;
551
+ }
552
+ const final = await loadXPlan(plan, options, logger, defaultName);
553
+ logger.warn("Converted workflow into new format:");
554
+ logger.warn(final);
555
+ return final;
556
+ };
557
+ var fetchFile = async (jobId, rootDir = "", filePath, log) => {
558
+ try {
559
+ const fullPath = filePath.startsWith("~") ? filePath : path2.resolve(rootDir, filePath);
560
+ const result = await fs2.readFile(fullPath, "utf8");
561
+ return result;
562
+ } catch (e) {
563
+ abort_default(
564
+ log,
565
+ `File not found for job ${jobId}: ${filePath}`,
566
+ void 0,
567
+ `This workflow references a file which cannot be found at ${filePath}
568
+
569
+ Paths inside the workflow are relative to the workflow.json`
570
+ );
571
+ return ".";
572
+ }
573
+ };
574
+ var importExpressions = async (plan, rootDir, log) => {
575
+ let idx = 0;
576
+ for (const step of plan.workflow.steps) {
577
+ const job = step;
578
+ if (!job.expression) {
579
+ continue;
580
+ }
581
+ idx += 1;
582
+ const expressionStr = typeof job.expression === "string" && job.expression?.trim();
583
+ const configurationStr = typeof job.configuration === "string" && job.configuration?.trim();
584
+ if (expressionStr && isPath(expressionStr)) {
585
+ job.expression = await fetchFile(
586
+ job.id || `${idx}`,
587
+ rootDir,
588
+ expressionStr,
589
+ log
590
+ );
591
+ }
592
+ if (configurationStr && isPath(configurationStr)) {
593
+ const configString = await fetchFile(
594
+ job.id || `${idx}`,
595
+ rootDir,
596
+ configurationStr,
597
+ log
598
+ );
599
+ job.configuration = JSON.parse(configString);
600
+ }
601
+ }
602
+ };
603
+ var loadXPlan = async (plan, options, logger, defaultName = "") => {
604
+ if (!plan.options) {
605
+ plan.options = {};
606
+ }
607
+ if (!plan.workflow.name && defaultName) {
608
+ plan.workflow.name = defaultName;
609
+ }
610
+ await importExpressions(plan, options.baseDir, logger);
611
+ if (options.expandAdaptors) {
612
+ expand_adaptors_default(plan);
613
+ }
614
+ await map_adaptors_to_monorepo_default(options.monorepoPath, plan, logger);
615
+ maybeAssign(options, plan.options, ["timeout", "start"]);
616
+ logger.info(`Loaded workflow ${plan.workflow.name ?? ""}`);
617
+ return plan;
618
+ };
619
+
594
620
  // src/util/assert-path.ts
595
621
  var assert_path_default = (path7) => {
596
622
  if (!path7) {
@@ -608,20 +634,13 @@ var executeHandler = async (options, logger) => {
  const start = (/* @__PURE__ */ new Date()).getTime();
  assert_path_default(options.path);
  await validate_adaptors_default(options, logger);
- let input = await load_input_default(options, logger);
- if (options.workflow) {
-   expand_adaptors_default(options);
-   await map_adaptors_to_monorepo_default(
-     options,
-     logger
-   );
- }
+ let plan = await load_plan_default(options, logger);
  const { repoDir, monorepoPath, autoinstall } = options;
  if (autoinstall) {
    if (monorepoPath) {
      logger.warn("Skipping auto-install as monorepo is being used");
    } else {
-     const autoInstallTargets = get_autoinstall_targets_default(options);
+     const autoInstallTargets = get_autoinstall_targets_default(plan);
      if (autoInstallTargets.length) {
        logger.info("Auto-installing language adaptors");
        await install({ packages: autoInstallTargets, repoDir }, logger);
@@ -630,12 +649,12 @@ var executeHandler = async (options, logger) => {
  }
  const state = await load_state_default(options, logger);
  if (options.compile) {
-   input = await compile_default(options, logger);
+   plan = await compile_default(plan, options, logger);
  } else {
    logger.info("Skipping compilation as noCompile is set");
  }
  try {
-   const result = await execute_default(input, state, options);
+   const result = await execute_default(plan, state, options);
    await serialize_output_default(options, result, logger);
    const duration = printDuration((/* @__PURE__ */ new Date()).getTime() - start);
    if (result?.errors) {
@@ -661,21 +680,16 @@ var handler_default = executeHandler;
  import { writeFile as writeFile2 } from "node:fs/promises";
  var compileHandler = async (options, logger) => {
    assert_path_default(options.path);
-   await load_input_default(options, logger);
-   if (options.workflow) {
-     expand_adaptors_default(options);
-     await map_adaptors_to_monorepo_default(
-       options,
-       logger
-     );
-   }
-   let result = await compile_default(options, logger);
-   if (options.workflow) {
-     result = JSON.stringify(result);
+   let result;
+   if (options.expressionPath) {
+     result = await compile_default(options.expressionPath, options, logger);
+   } else {
+     const plan = await load_plan_default(options, logger);
+     result = await compile_default(plan, options, logger);
+     result = JSON.stringify(result, null, 2);
    }
    if (options.outputStdout) {
-     logger.success("Compiled code:");
-     logger.success("\n" + result);
+     logger.success("Result:\n\n" + result);
    } else {
      await writeFile2(options.outputPath, result);
      logger.success(`Compiled to ${options.outputPath}`);
@@ -685,37 +699,41 @@ var handler_default2 = compileHandler;
 
  // src/test/handler.ts
  var testHandler = async (options, logger) => {
-   logger.log("Running test job...");
+   logger.log("Running test workflow...");
    const opts = { ...options };
    opts.compile = true;
    opts.adaptors = [];
-   opts.workflow = {
-     start: "start",
-     jobs: [
-       {
-         id: "start",
-         state: { data: { defaultAnswer: 42 } },
-         expression: "const fn = () => (state) => { console.log('Starting computer...'); return state; }; fn()",
-         next: {
-           calculate: "!state.error"
-         }
-       },
-       {
-         id: "calculate",
-         expression: "const fn = () => (state) => { console.log('Calculating to life, the universe, and everything..'); return state }; fn()",
-         next: {
-           result: true
+   const plan = {
+     options: {
+       start: "start"
+     },
+     workflow: {
+       steps: [
+         {
+           id: "start",
+           state: { data: { defaultAnswer: 42 } },
+           expression: "const fn = () => (state) => { console.log('Starting computer...'); return state; }; fn()",
+           next: {
+             calculate: "!state.error"
+           }
+         },
+         {
+           id: "calculate",
+           expression: "const fn = () => (state) => { console.log('Calculating to life, the universe, and everything..'); return state }; fn()",
+           next: {
+             result: true
+           }
+         },
+         {
+           id: "result",
+           expression: "const fn = () => (state) => ({ data: { answer: state.data.answer || state.data.defaultAnswer } }); fn()"
          }
-       },
-       {
-         id: "result",
-         expression: "const fn = () => (state) => ({ data: { answer: state.data.answer || state.data.defaultAnswer } }); fn()"
-       }
-     ]
+       ]
+     }
    };
    logger.break();
-   logger.info("Workflow object:");
-   logger.info(JSON.stringify(opts.workflow, null, 2));
+   logger.info("Execution plan:");
+   logger.info(JSON.stringify(plan, null, 2));
    logger.break();
    if (!opts.stateStdin) {
      logger.debug(
@@ -724,8 +742,8 @@ var testHandler = async (options, logger) => {
    logger.debug('eg: -S "{ "data": { "answer": 33 } }"');
  }
  const state = await load_state_default(opts, createNullLogger());
- const code = await compile_default(opts, logger);
- const result = await execute_default(code, state, opts);
+ const compiledPlan = await compile_default(plan, opts, logger);
+ const result = await execute_default(compiledPlan, state, opts);
  logger.success(`Result: ${result.data.answer}`);
  return result;
};
@@ -909,7 +927,7 @@ ${data.functions.map((fn) => ` ${fn.name}(${fn.parameters.map((p) => p.name).jo
  `;
  var docsHandler = async (options, logger) => {
    const { adaptor, operation, repoDir } = options;
-   const { adaptors } = expand_adaptors_default({ adaptors: [adaptor] });
+   const adaptors = expand_adaptors_default([adaptor]);
    const [adaptorName] = adaptors;
    let { name, version } = getNameAndVersion4(adaptorName);
    if (!version) {
@@ -1177,7 +1195,7 @@ var loadVersionFromPath = (adaptorPath) => {
    return "unknown";
  }
};
- var printVersions = async (logger, options = {}) => {
+ var printVersions = async (logger, options = {}, includeComponents = false) => {
    const { adaptors, logJson } = options;
    let adaptor = "";
    if (adaptors && adaptors.length) {
@@ -1190,6 +1208,9 @@ var printVersions = async (logger, options = {}) => {
      const [namePart, pathPart] = adaptor.split("=");
      adaptorVersion = loadVersionFromPath(pathPart);
      adaptorName = getNameAndVersion5(namePart).name;
+   } else if (options.monorepoPath) {
+     adaptorName = getNameAndVersion5(adaptor).name;
+     adaptorVersion = "monorepo";
    } else {
      const { name, version: version2 } = getNameAndVersion5(adaptor);
      adaptorName = name;
@@ -1210,22 +1231,29 @@ var printVersions = async (logger, options = {}) => {
    output = {
      versions: {
        "node.js": process.version.substring(1),
-       cli: version,
-       runtime: runtimeVersion,
-       compiler: compilerVersion
+       cli: version
      }
    };
+   if (includeComponents) {
+     output.versions.runtime = runtimeVersion;
+     output.versions.compiler = compilerVersion;
+   }
    if (adaptorName) {
      output.versions[adaptorName] = adaptorVersion;
    }
  } else {
-   const adaptorVersionString = adaptorName ? `
- ${prefix(adaptorName)}${adaptorVersion}` : "";
    output = `Versions:
 ${prefix(NODE)}${process.version.substring(1)}
- ${prefix(CLI2)}${version}
+ ${prefix(CLI2)}${version}`;
+   if (includeComponents) {
+     output += `
 ${prefix(RUNTIME2)}${runtimeVersion}
- ${prefix(COMPILER2)}${compilerVersion}${adaptorVersionString}`;
+ ${prefix(COMPILER2)}${compilerVersion}`;
+   }
+   if (adaptorName) {
+     output += `
+ ${prefix(adaptorName)}${adaptorVersion}`;
+   }
  }
  logger.always(output);
};
@@ -1245,23 +1273,27 @@ var handlers = {
    ["repo-install"]: install,
    ["repo-pwd"]: pwd,
    ["repo-list"]: list,
-   version: async (opts, logger) => print_versions_default(logger, opts)
+   version: async (opts, logger) => print_versions_default(logger, opts, true)
  };
  var parse = async (options, log) => {
    const logger = log || logger_default(CLI, options);
    if (options.command === "execute" || options.command === "test") {
      await print_versions_default(logger, options);
    }
-   if (options.monorepoPath) {
-     if (options.monorepoPath === "ERR") {
+   const { monorepoPath } = options;
+   if (monorepoPath) {
+     if (monorepoPath === "ERR") {
        logger.error(
          "ERROR: --use-adaptors-monorepo was passed, but OPENFN_ADAPTORS_REPO env var is undefined"
        );
        logger.error("Set OPENFN_ADAPTORS_REPO to a path pointing to the repo");
        process.exit(9);
      }
-     await map_adaptors_to_monorepo_default(
-       options,
+     await validateMonoRepo(monorepoPath, logger);
+     logger.success(`Loading adaptors from monorepo at ${monorepoPath}`);
+     options.adaptors = map_adaptors_to_monorepo_default(
+       monorepoPath,
+       options.adaptors,
        logger
      );
    }
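
The biggest structural change in this diff is the move from the old flat workflow object (`{ start, jobs }`) to a 1.0-style execution plan (`{ options, workflow: { steps } }`), as shown in the rewritten test handler above. A minimal sketch of how the two shapes relate; the `toExecutionPlan` helper is hypothetical and not part of the package:

```javascript
// Hypothetical helper, NOT part of @openfn/cli: illustrates how a
// pre-1.0 workflow object maps onto a 1.0-style execution plan.
const toExecutionPlan = (oldWorkflow, name = "") => {
  const { start, jobs = [] } = oldWorkflow;
  return {
    // run options (start node, timeout, etc) move out of the workflow
    options: start ? { start } : {},
    workflow: {
      ...(name ? { name } : {}),
      // "jobs" become generic "steps"
      steps: jobs,
    },
  };
};

const legacy = {
  start: "a",
  jobs: [{ id: "a", expression: "fn((state) => state)" }],
};

console.log(JSON.stringify(toExecutionPlan(legacy, "demo"), null, 2));
```

Note that per the Terminology section, steps need not be jobs, so a real migration may carry extra fields; this sketch only covers the common case.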
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@openfn/cli",
-   "version": "0.4.16",
+   "version": "1.0.0",
    "description": "CLI devtools for the openfn toolchain.",
    "engines": {
      "node": ">=18",
@@ -35,7 +35,8 @@
    "ts-node": "^10.9.1",
    "tslib": "^2.4.0",
    "tsup": "^7.2.0",
-   "typescript": "^5.1.6"
+   "typescript": "^5.1.6",
+   "@openfn/lexicon": "^1.0.0"
  },
  "dependencies": {
    "@inquirer/prompts": "^1.1.4",
@@ -44,11 +45,11 @@
    "rimraf": "^3.0.2",
    "treeify": "^1.1.0",
    "yargs": "^17.7.2",
-   "@openfn/compiler": "0.0.39",
-   "@openfn/deploy": "0.4.1",
    "@openfn/describe-package": "0.0.18",
-   "@openfn/logger": "0.0.20",
-   "@openfn/runtime": "0.2.6"
+   "@openfn/compiler": "0.0.40",
+   "@openfn/logger": "1.0.0",
+   "@openfn/runtime": "1.0.0",
+   "@openfn/deploy": "0.4.2"
  },
  "files": [
    "dist",