@openfn/cli 0.0.34 → 0.0.38

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,13 +1,13 @@
1
1
  # @openfn/cli
2
2
 
3
- This package contains a new devtools CLI for running openfn jobs.
3
+ This package contains a new devtools CLI for running OpenFn jobs.
4
4
 
5
- The new CLI includes:
5
+ The CLI includes:
6
6
 
7
- * A new runtime for executing openfn jobs
8
- * A new compiler for making openfn jobs runnable
9
- * Improved, customisable logging output
10
- * Auto installation of language adaptors
7
+ * A secure runtime for executing OpenFn jobs and workflows
8
+ * A compiler for making OpenFn jobs runnable
9
+ * Configurable logging output
10
+ * Auto-installation of language adaptors
11
11
  * Support for the adaptors monorepo
12
12
 
13
13
  ## Getting Started
@@ -36,6 +36,22 @@ Get help:
36
36
  openfn help
37
37
  ```
38
38
 
39
+ ## Updating
40
+
41
+ You should be able to install a new version straight on top of your current installation:
42
+
43
+ ```
44
+ npm install -g @openfn/cli
45
+ ```
46
+
47
+ If this fails, try uninstalling the current version first:
48
+
49
+ ```
50
+ npm uninstall -g @openfn/cli
51
+ ```
52
+
53
+ And then re-installing.
54
+
39
55
  ## Migrating from devtools
40
56
 
41
57
  If you're coming to the CLI from the old openfn devtools, here are a couple of key points to be aware of:
@@ -46,31 +62,37 @@ If you're coming to the CLI from the old openfn devtools, here are a couple of k
46
62
 
47
63
  ## Basic Usage
48
64
 
49
- You're probably here to run jobs (expressions), which the CLI makes easy:
65
+ You're probably here to run jobs (expressions) or workflows, which the CLI makes easy:
66
+
50
67
 
51
68
  ```
69
+ openfn path/to/workflow.json
52
70
  openfn path/to/job.js -ia adaptor-name
53
71
  ```
54
72
 
55
- You MUST specify which adaptor to use. Pass the `-i` flag to auto-install that adaptor (it's safe to do this redundantly).
73
+ If running a single job, you MUST specify which adaptor to use.
74
+
75
+ Pass the `-i` flag to auto-install any required adaptors (it's safe to do this redundantly, although the run will be a little slower).
56
76
 
57
- When the job is finished, the CLI will write the `data` property of your state to disk. By default the CLI will create an `output.json` next to the job file. You can pass a path to output by passing `-o path/to/output.json` and state by adding `-s path/to/state.json`. You can use `-S` and `-O` to pass state through stdin and return the output through stdout. To write the entire state object (not just `data`), pass `--no-strict-output`.
77
+ When the job is finished, the CLI will write the resulting state to disk. By default the CLI will create an `output.json` next to the job file. You can set the output path with `-o path/to/output.json` and the input state path with `-s path/to/state.json`. You can use `-S` and `-O` to pass state through stdin and return the output through stdout.
58
78
 
59
- The CLI can auto-install language adaptors to its own privately maintained repo, just include the `-i` flag in the command and your adaptors will be forever fully managed. Run `openfn repo list` to see where the repo is, and what's in it. Set the `OPENFN_REPO_DIR` env var to specify the repo folder. When autoinstalling, the CLI will check to see if a matching version is found in the repo.
79
+ Note that the CLI will only include the `state.data` key in the output. To write the entire state object (not just `data`), pass `--no-strict-output`.
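The strict-output rule can be sketched as follows (an illustrative sketch of the behaviour, not the CLI's actual serializer; the function name `serialize` is hypothetical):

```javascript
// Sketch of strict vs non-strict output serialization.
// In strict mode (the default) only the `data` key of state is kept;
// with --no-strict-output the whole state object is written.
const serialize = (state, strict = true) => {
  const output = strict ? { data: state.data } : state;
  return JSON.stringify(output, undefined, 2);
};

const state = { data: { answer: 42 }, configuration: { secret: "xyz" } };
console.log(serialize(state));        // only the "data" key survives
console.log(serialize(state, false)); // the full state object
```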
80
+
81
+ The CLI maintains a repo for auto-installed adaptors. Run `openfn repo list` to see where the repo is, and what's in it. Set the `OPENFN_REPO_DIR` env var to specify the repo folder. When autoinstalling, the CLI will check to see if a matching version is found in the repo. `openfn repo clean` will remove all adaptors from the repo. The repo also includes any documentation and metadata built with the CLI.
60
82
 
61
83
  You can specify adaptors with a shorthand (`http`) or use the full package name (`@openfn/language-http`). You can add a specific version like `http@2.0.0`. You can pass a path to a locally installed adaptor like `http=/repo/openfn/adaptors/my-http-build`.
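The shorthand rules can be sketched as below (this mirrors the `expand` helper in the bundled source; treat it as illustrative rather than a published API):

```javascript
// Expand a shorthand adaptor specifier into a full package name.
// Path-like specifiers (containing "/" or ending in ".js") and
// already-scoped names are passed through untouched.
const expand = (name) => {
  if (typeof name === "string") {
    // Ignore any "=/local/path" suffix when inspecting the name
    const [left] = name.split("=");
    if (left.match("/") || left.endsWith(".js")) {
      return name;
    }
    return `@openfn/language-${name}`;
  }
  return name;
};

console.log(expand("http"));                  // @openfn/language-http
console.log(expand("http@2.0.0"));            // @openfn/language-http@2.0.0
console.log(expand("@openfn/language-http")); // @openfn/language-http
```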
62
84
 
63
- If you have the adaptors monorepo set up on your machine, you can also run adaptors straight from source. Pass the `-m <path>` flag to load from the monorepo. You can also set the monorepo location by setting the `OPENFN_ADAPTORS_REPO` env var to a valid path. After that just include `-m` to load from the monorepo. Remember that adaptors will be loaded from the BUILT package in `dist`, so remember to build an adaptor before running!
85
+ If you have the adaptors monorepo set up on your machine, you can also run adaptors straight from the local build. Pass the `-m <path>` flag to load from the monorepo. You can also set the monorepo location by setting the `OPENFN_ADAPTORS_REPO` env var to a valid path. After that, just include `-m` to load from the monorepo. Adaptors will be loaded from the BUILT package in `dist`, so remember to build an adaptor before running!
64
86
 
65
87
  You can pass `--log info` to get more feedback about what's happening, or `--log debug` for more details than you could ever use.
66
88
 
67
89
  ## Advanced Usage
68
90
 
69
- The CLI has actually has a number of commands (the first argument after openfn)
91
+ The CLI has a number of commands (the first argument after openfn):
70
92
 
71
93
  * execute - run a job
72
94
  * compile - compile a job to a .js file
73
- * doc - show documentation for an adaptor function
95
+ * docs - show documentation for an adaptor function
74
96
  * repo - manage the repo of installed modules
75
97
  * docgen - generate JSON documentation for an adaptor based on its typescript
76
98
 
@@ -109,9 +131,42 @@ For a more structured output, you can emit logs as JSON objects with `level`, `n
109
131
  ```
110
132
  { level: 'info', name: 'CLI', message: ['Loaded adaptor'] }
111
133
  ```
112
-
113
134
  Pass `--log-json` to the CLI to do this. You can also set the OPENFN_LOG_JSON env var (and use `--no-log-json` to disable).
114
135
 
136
+ ## Workflows
137
+
138
+ As of v0.0.35 the CLI supports running workflows as well as jobs.
139
+
140
+ A workflow is an execution plan for running several jobs in a sequence. It is defined as a JSON structure.
141
+
142
+ To see an example workflow, run the test command with `openfn test`.
143
+
144
+ A workflow has a structure like this (better documentation is coming soon):
145
+
146
+ ```
147
+ {
148
+ "start": "a", // optionally specify the start node (defaults to jobs[0])
149
+ "jobs": [
150
+ {
151
+ "id": "a",
152
+ "expression": "fn((state) => state)", // code or a path
153
+ "adaptor": "@openfn/language-common@1.75", // specify the adaptor to use (version optional)
154
+ "data": {}, // optionally pre-populate the data object (this will be overridden by keys in previous state)
155
+ "configuration": {}, // Use this to pass credentials
156
+ "next": {
157
+ // This object defines which jobs to call next
158
+ // All edges returning true will run
159
+ // If there are no next edges, the workflow will end
160
+ "b": true,
161
+ "c": {
162
+ "condition": "!state.error" // Note that this is an expression, not a function
163
+ }
164
+ }
165
+ },
166
+ ]
167
+ }
168
+ ```
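Edge conditions like `!state.error` are JavaScript expressions evaluated against the upstream job's final state. A minimal sketch of how an edge might be decided (illustrative only; the `shouldRun` helper is hypothetical and the real runtime's evaluation is more involved):

```javascript
// Decide whether an edge should fire, given its definition and the
// upstream job's final state. `true` always fires; an object with a
// "condition" expression fires when the expression is truthy.
const shouldRun = (edge, state) => {
  if (edge === true) return true;
  if (edge && typeof edge.condition === "string") {
    // Evaluate the expression with `state` in scope (sketch only)
    return Boolean(new Function("state", `return ${edge.condition};`)(state));
  }
  return false;
};

console.log(shouldRun(true, {}));                                         // true
console.log(shouldRun({ condition: "!state.error" }, {}));                // true
console.log(shouldRun({ condition: "!state.error" }, { error: "oops" })); // false
```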
169
+
115
170
  ## Compilation
116
171
 
117
172
  The CLI will attempt to compile your job code into normalized Javascript. It will do a number of things to make your code robust and portable:
@@ -129,12 +184,6 @@ All jobs which work against `@openfn/core` will work in the new CLI and runtime
129
184
 
130
185
  If you want to see how the compiler is changing your job, run `openfn compile path/to/job -a <adaptor>` to return the compiled code to stdout. Add `-o path/to/output.js` to save the result to disk.
131
186
 
132
- ## New Runtime notes
133
-
134
- The new OpenFn runtime will create a secure sandboxed environemtn which loads a Javascript Module, finds the default export, and execute the functions held within it.
135
-
136
- So long as your job has an array of functions as its default export, it will run in the new runtime.
137
-
138
187
  # Contributing
139
188
 
140
189
  First of all, thanks for helping! You're contributing to a digital public good that will always be free and open source and aimed at serving innovative NGOs, governments, and social impact organizations the world over! You rock. heart
@@ -170,8 +219,8 @@ The CLI will save and load adaptors from an arbitrary folder on your system.
170
219
 
171
220
  You should set the OPENFN_REPO_DIR env var to something sensible.
172
221
 
222
+ In `~/.bashrc` (or whatever you use), add:
173
223
  ```
174
- # In ~/.bashc or whatever
175
224
  export OPENFN_REPO_DIR=~/repo/openfn/cli-repo
176
225
  ```
177
226
 
@@ -1,5 +1,5 @@
1
1
  // src/util/expand-adaptors.ts
2
- var expand_adaptors_default = (names) => names?.map((name) => {
2
+ var expand = (name) => {
3
3
  if (typeof name === "string") {
4
4
  const [left] = name.split("=");
5
5
  if (left.match("/") || left.endsWith(".js")) {
@@ -8,7 +8,21 @@ var expand_adaptors_default = (names) => names?.map((name) => {
8
8
  return `@openfn/language-${name}`;
9
9
  }
10
10
  return name;
11
- });
11
+ };
12
+ var expand_adaptors_default = (opts) => {
13
+ const { adaptors, workflow } = opts;
14
+ if (adaptors) {
15
+ opts.adaptors = adaptors?.map(expand);
16
+ }
17
+ if (workflow) {
18
+ Object.values(workflow.jobs).forEach((job) => {
19
+ if (job.adaptor) {
20
+ job.adaptor = expand(job.adaptor);
21
+ }
22
+ });
23
+ }
24
+ return opts;
25
+ };
12
26
 
13
27
  // src/util/logger.ts
14
28
  import actualCreateLogger, { printDuration } from "@openfn/logger";
@@ -24,13 +38,15 @@ var namespaces = {
24
38
  [JOB]: "JOB"
25
39
  };
26
40
  var createLogger = (name = "", options) => {
27
- const logOptions = options.log;
41
+ const logOptions = options.log || {};
42
+ let json = false;
28
43
  let level = logOptions[name] || logOptions.default || "default";
29
44
  if (options.logJson) {
30
- logOptions.json = true;
45
+ json = true;
31
46
  }
32
47
  return actualCreateLogger(namespaces[name] || name, {
33
48
  level,
49
+ json,
34
50
  ...logOptions
35
51
  });
36
52
  };
package/dist/index.js CHANGED
@@ -2,7 +2,7 @@
2
2
  import {
3
3
  DEFAULT_REPO_DIR,
4
4
  expand_adaptors_default
5
- } from "./chunk-XYZNU5CH.js";
5
+ } from "./chunk-DOKL2XM5.js";
6
6
 
7
7
  // src/process/spawn.ts
8
8
  import path from "node:path";
@@ -128,7 +128,7 @@ var adaptors = {
128
128
  opts2.adaptors = [];
129
129
  }
130
130
  if (opts2.expandAdaptors) {
131
- opts2.adaptors = expand_adaptors_default(opts2.adaptors);
131
+ expand_adaptors_default(opts2);
132
132
  }
133
133
  delete opts2.adaptor;
134
134
  delete opts2.a;
@@ -193,19 +193,21 @@ var ignoreImports = {
193
193
  };
194
194
  var getBaseDir = (opts2) => {
195
195
  const basePath = opts2.path ?? ".";
196
- if (basePath.endsWith(".js")) {
196
+ if (/\.(jso?n?)$/.test(basePath)) {
197
197
  return path2.dirname(basePath);
198
198
  }
199
199
  return basePath;
200
200
  };
201
- var jobPath = {
202
- name: "job-path",
201
+ var inputPath = {
202
+ name: "input-path",
203
203
  yargs: {
204
204
  hidden: true
205
205
  },
206
206
  ensure: (opts2) => {
207
207
  const { path: basePath } = opts2;
208
- if (basePath?.endsWith(".js")) {
208
+ if (basePath?.endsWith(".json")) {
209
+ opts2.workflowPath = basePath;
210
+ } else if (basePath?.endsWith(".js")) {
209
211
  opts2.jobPath = basePath;
210
212
  } else {
211
213
  const base = getBaseDir(opts2);
@@ -262,6 +264,13 @@ var repoDir = {
262
264
  default: process.env.OPENFN_REPO_DIR || DEFAULT_REPO_DIR
263
265
  }
264
266
  };
267
+ var start = {
268
+ name: "start",
269
+ yargs: {
270
+ string: true,
271
+ description: "Specify the start node in a workflow"
272
+ }
273
+ };
265
274
  var strictOutput = {
266
275
  name: "no-strict-output",
267
276
  yargs: {
@@ -325,12 +334,13 @@ var options = [
325
334
  compile,
326
335
  immutable,
327
336
  ignoreImports,
328
- jobPath,
337
+ inputPath,
329
338
  logJson,
330
339
  outputPath,
331
340
  outputStdout,
332
341
  repoDir,
333
342
  skipAdaptorValidation,
343
+ start,
334
344
  statePath,
335
345
  stateStdin,
336
346
  strictOutput,
@@ -339,9 +349,9 @@ var options = [
339
349
  ];
340
350
  var executeCommand = {
341
351
  command: "execute [path]",
342
- desc: `Run an openfn job. Get more help by running openfn <command> help.
352
+ desc: `Run an openfn job or workflow. Get more help by running openfn <command> help.
343
353
 
344
- Execute will run a job/expression and write the output state to disk (to ./state.json unless otherwise specified)
354
+ Execute will run a job/workflow at the path and write the output state to disk (to ./state.json unless otherwise specified)
345
355
 
346
356
  By default only state.data will be written to the output. Include --no-strict-output to write the entire state object.
347
357
 
@@ -349,14 +359,14 @@ Remember to include the adaptor name with -a. Auto install adaptors with the -i
349
359
  aliases: ["$0"],
350
360
  handler: ensure("execute", options),
351
361
  builder: (yargs2) => build(options, yargs2).positional("path", {
352
- describe: "The path to load the job from (a .js file or a dir containing a job.js file)",
362
+ describe: "The path to load the job or workflow from (a .js or .json file or a dir containing a job.js file)",
353
363
  demandOption: true
354
364
  }).example(
355
365
  "openfn foo/job.js",
356
366
  "Execute foo/job.js with no adaptor and write the final state to foo/job.json"
357
367
  ).example(
358
- "openfn job.js -ia common",
359
- "Execute job.js using @openfn/language-commom , with autoinstall enabled)"
368
+ "openfn workflow.json -ia common",
369
+ "Execute workflow.json using @openfn/language-common (with autoinstall enabled)"
360
370
  ).example(
361
371
  "openfn job.js -a common --log info",
362
372
  "Execute job.js with common adaptor and info-level logging"
@@ -372,7 +382,7 @@ var options2 = [
372
382
  expandAdaptors,
373
383
  adaptors,
374
384
  ignoreImports,
375
- jobPath,
385
+ inputPath,
376
386
  logJson,
377
387
  override(outputStdout, {
378
388
  default: true
@@ -383,32 +393,28 @@ var options2 = [
383
393
  ];
384
394
  var compileCommand = {
385
395
  command: "compile [path]",
386
- desc: "Compile an openfn job and print or save the resulting JavaScript.",
396
+ desc: "Compile an openfn job or workflow and print or save the resulting JavaScript.",
387
397
  handler: ensure("compile", options2),
388
398
  builder: (yargs2) => build(options2, yargs2).positional("path", {
389
- describe: "The path to load the job from (a .js file or a dir containing a job.js file)",
399
+ describe: "The path to load the job or workflow from (a .js or .json file or a dir containing a job.js file)",
390
400
  demandOption: true
391
401
  }).example(
392
- "compile foo/job.js -O",
393
- "Compiles foo/job.js and prints the result to stdout"
402
+ "compile foo/job.js",
403
+ "Compiles the job at foo/job.js and prints the result to stdout"
394
404
  ).example(
395
- "compile foo/job.js -o foo/job-compiled.js",
396
- "Compiles foo/job.js and saves the result to foo/job-compiled.js"
405
+ "compile foo/workflow.json -o foo/workflow-compiled.json",
406
+ "Compiles the workflow at foo/workflow.json and saves the result to foo/workflow-compiled.json"
397
407
  )
398
408
  };
399
409
  var command_default2 = compileCommand;
400
410
 
401
411
  // src/test/command.ts
412
+ var options3 = [stateStdin];
402
413
  var command_default3 = {
403
414
  command: "test",
404
415
  desc: "Compiles and runs a test job, printing the result to stdout",
405
- handler: (argv) => {
406
- argv.command = "test";
407
- },
408
- builder: (yargs2) => yargs2.option("state-stdin", {
409
- alias: "S",
410
- description: "Read state from stdin (instead of a file)"
411
- }).example("test", "run the test script").example("test -S 42", "run the test script with state 42")
416
+ handler: ensure("test", options3),
417
+ builder: (yargs2) => build(options3, yargs2).example("test", "Run the test script")
412
418
  };
413
419
 
414
420
  // src/docgen/command.ts
@@ -435,7 +441,7 @@ var command_default5 = {
435
441
  };
436
442
 
437
443
  // src/metadata/command.ts
438
- var options3 = [
444
+ var options4 = [
439
445
  expandAdaptors,
440
446
  adaptors,
441
447
  force,
@@ -448,8 +454,8 @@ var options3 = [
448
454
  var command_default6 = {
449
455
  command: "metadata",
450
456
  desc: "Generate metadata for an adaptor config",
451
- handler: ensure("metadata", options3),
452
- builder: (yargs2) => build(options3, yargs2).example(
457
+ handler: ensure("metadata", options4),
458
+ builder: (yargs2) => build(options4, yargs2).example(
453
459
  "metadata -a salesforce -s tmp/state.json",
454
460
  "Generate salesforce metadata from config in state.json"
455
461
  )
@@ -10,54 +10,13 @@ import {
10
10
  expand_adaptors_default,
11
11
  logger_default,
12
12
  printDuration
13
- } from "../chunk-XYZNU5CH.js";
14
-
15
- // src/execute/handler.ts
16
- import { readFile } from "node:fs/promises";
17
-
18
- // src/execute/load-state.ts
19
- import fs from "node:fs/promises";
20
- var load_state_default = async (opts, log) => {
21
- const { stateStdin, statePath } = opts;
22
- log.debug("Load state...");
23
- if (stateStdin) {
24
- try {
25
- const json = JSON.parse(stateStdin);
26
- log.success("Read state from stdin");
27
- log.debug("state:", json);
28
- return json;
29
- } catch (e) {
30
- log.error("Failed to load state from stdin");
31
- log.error(stateStdin);
32
- log.error(e);
33
- process.exit(1);
34
- }
35
- }
36
- if (statePath) {
37
- try {
38
- const str = await fs.readFile(statePath, "utf8");
39
- const json = JSON.parse(str);
40
- log.success(`Loaded state from ${statePath}`);
41
- log.debug("state:", json);
42
- return json;
43
- } catch (e) {
44
- log.warn(`Error loading state from ${statePath}`);
45
- log.warn(e);
46
- }
47
- }
48
- log.info(
49
- "No state provided - using default state { data: {}, configuration: {}"
50
- );
51
- return {
52
- data: {},
53
- configuration: {}
54
- };
55
- };
13
+ } from "../chunk-DOKL2XM5.js";
56
14
 
57
15
  // src/execute/execute.ts
58
16
  import run, { getNameAndVersion } from "@openfn/runtime";
59
- var execute_default = (code, state, opts) => {
60
- return run(code, state, {
17
+ var execute_default = (input, state, opts) => {
18
+ return run(input, state, {
19
+ start: opts.start,
61
20
  timeout: opts.timeout,
62
21
  immutableState: opts.immutable,
63
22
  logger: logger_default(RUNTIME, opts),
@@ -69,92 +28,41 @@ var execute_default = (code, state, opts) => {
69
28
  });
70
29
  };
71
30
  function parseAdaptors(opts) {
72
- const adaptors = {};
73
- opts.adaptors.reduce((obj, exp) => {
74
- const [module, path5] = exp.split("=");
31
+ const extractInfo = (specifier) => {
32
+ const [module, path6] = specifier.split("=");
75
33
  const { name, version } = getNameAndVersion(module);
76
- const info = {};
77
- if (path5) {
78
- info.path = path5;
34
+ const info = {
35
+ name
36
+ };
37
+ if (path6) {
38
+ info.path = path6;
79
39
  }
80
40
  if (version) {
81
41
  info.version = version;
82
42
  }
83
- obj[name] = info;
84
- return obj;
85
- }, adaptors);
86
- return adaptors;
87
- }
88
-
89
- // src/compile/compile.ts
90
- import compile, { preloadAdaptorExports } from "@openfn/compiler";
91
- import { getModulePath } from "@openfn/runtime";
92
- var compile_default = async (opts, log) => {
93
- log.debug("Loading job...");
94
- const compilerOptions = await loadTransformOptions(opts, log);
95
- const job = compile(opts.jobSource || opts.jobPath, compilerOptions);
96
- if (opts.jobPath) {
97
- log.success(`Compiled job from ${opts.jobPath}`);
98
- } else {
99
- log.success("Compiled job");
100
- }
101
- return job;
102
- };
103
- var stripVersionSpecifier = (specifier) => {
104
- const idx = specifier.lastIndexOf("@");
105
- if (idx > 0) {
106
- return specifier.substring(0, idx);
107
- }
108
- return specifier;
109
- };
110
- var resolveSpecifierPath = async (pattern, repoDir, log) => {
111
- const [specifier, path5] = pattern.split("=");
112
- if (path5) {
113
- log.debug(`Resolved ${specifier} to path: ${path5}`);
114
- return path5;
115
- }
116
- const repoPath = await getModulePath(specifier, repoDir, log);
117
- if (repoPath) {
118
- return repoPath;
119
- }
120
- return null;
121
- };
122
- var loadTransformOptions = async (opts, log) => {
123
- const options = {
124
- logger: log || logger_default(COMPILER, opts)
43
+ return info;
125
44
  };
126
- if (opts.adaptors?.length && opts.ignoreImports != true) {
127
- let exports;
128
- const [pattern] = opts.adaptors;
129
- const [specifier] = pattern.split("=");
130
- log.debug(`Attempting to preload types for ${specifier}`);
131
- const path5 = await resolveSpecifierPath(pattern, opts.repoDir, log);
132
- if (path5) {
133
- try {
134
- exports = await preloadAdaptorExports(path5, log);
135
- } catch (e) {
136
- log.error(`Failed to load adaptor typedefs from path ${path5}`);
137
- log.error(e);
138
- }
139
- }
140
- if (!exports || exports.length === 0) {
141
- log.debug(`No module exports found for ${pattern}`);
142
- }
143
- options["add-imports"] = {
144
- ignore: opts.ignoreImports,
145
- adaptor: {
146
- name: stripVersionSpecifier(specifier),
147
- exports,
148
- exportAll: true
45
+ const adaptors = {};
46
+ if (opts.adaptors) {
47
+ opts.adaptors.reduce((obj, exp) => {
48
+ const { name, ...maybeVersionAndPath } = extractInfo(exp);
49
+ obj[name] = { ...maybeVersionAndPath };
50
+ return obj;
51
+ }, adaptors);
52
+ }
53
+ if (opts.workflow) {
54
+ Object.values(opts.workflow.jobs).forEach((job) => {
55
+ if (job.adaptor) {
56
+ const { name, ...maybeVersionAndPath } = extractInfo(job.adaptor);
57
+ adaptors[name] = { ...maybeVersionAndPath };
149
58
  }
150
- };
59
+ });
151
60
  }
152
- return options;
153
- };
61
+ return adaptors;
62
+ }
154
63
 
155
64
  // src/execute/serialize-output.ts
156
65
  import { writeFile } from "node:fs/promises";
157
- import stringify from "fast-safe-stringify";
158
66
  var serializeOutput = async (options, result, logger) => {
159
67
  let output = result;
160
68
  if (output && (output.configuration || output.data)) {
@@ -171,7 +79,7 @@ var serializeOutput = async (options, result, logger) => {
171
79
  if (output === void 0) {
172
80
  output = "";
173
81
  } else {
174
- output = stringify(output, void 0, 2);
82
+ output = JSON.stringify(output, void 0, 2);
175
83
  }
176
84
  if (options.outputStdout) {
177
85
  logger.success(`Result: `);
@@ -184,6 +92,24 @@ var serializeOutput = async (options, result, logger) => {
184
92
  };
185
93
  var serialize_output_default = serializeOutput;
186
94
 
95
+ // src/execute/get-autoinstall-targets.ts
96
+ var getAutoinstallTargets = (options) => {
97
+ if (options.workflow) {
98
+ const adaptors = {};
99
+ Object.values(options.workflow.jobs).forEach((job) => {
100
+ if (job.adaptor) {
101
+ adaptors[job.adaptor] = true;
102
+ }
103
+ });
104
+ return Object.keys(adaptors);
105
+ }
106
+ if (options.adaptors) {
107
+ return options.adaptors?.filter((a) => !/=/.test(a));
108
+ }
109
+ return [];
110
+ };
111
+ var get_autoinstall_targets_default = getAutoinstallTargets;
112
+
187
113
  // src/repo/handler.ts
188
114
  import { exec } from "node:child_process";
189
115
  import treeify from "treeify";
@@ -195,9 +121,10 @@ var install = async (opts, log = defaultLogger) => {
195
121
  log.success("Installing packages...");
196
122
  log.debug("repoDir is set to:", repoDir);
197
123
  if (adaptor) {
198
- packages = expand_adaptors_default(packages);
124
+ const expanded = expand_adaptors_default({ adaptors: packages });
125
+ packages = expanded.adaptors;
199
126
  }
200
- await rtInstall(packages, repoDir, log);
127
+ await rtInstall(packages ?? [], repoDir, log);
201
128
  const duration = log.timer("install");
202
129
  log.success(`Installation complete in ${duration}`);
203
130
  }
@@ -254,12 +181,149 @@ var list = async (options, logger) => {
254
181
  logger.success("Installed packages:\n\n" + treeify.asTree(output));
255
182
  };
256
183
 
184
+ // src/compile/compile.ts
185
+ import compile, { preloadAdaptorExports } from "@openfn/compiler";
186
+ import { getModulePath } from "@openfn/runtime";
187
+ var compile_default = async (opts, log) => {
188
+ log.debug("Compiling...");
189
+ let job;
190
+ if (opts.workflow) {
191
+ job = compileWorkflow(opts.workflow, opts, log);
192
+ } else {
193
+ const compilerOptions = await loadTransformOptions(opts, log);
194
+ job = compile(opts.job || opts.jobPath, compilerOptions);
195
+ }
196
+ if (opts.jobPath) {
197
+ log.success(`Compiled job from ${opts.jobPath}`);
198
+ } else {
199
+ log.success("Compiled job");
200
+ }
201
+ return job;
202
+ };
203
+ var compileWorkflow = async (workflow, opts, log) => {
204
+ for (const job of workflow.jobs) {
205
+ const jobOpts = {
206
+ ...opts
207
+ };
208
+ if (job.adaptor) {
209
+ jobOpts.adaptors = [job.adaptor];
210
+ }
211
+ const compilerOptions = await loadTransformOptions(jobOpts, log);
212
+ if (job.expression) {
213
+ job.expression = compile(job.expression, compilerOptions);
214
+ }
215
+ }
216
+ return workflow;
217
+ };
218
+ var stripVersionSpecifier = (specifier) => {
219
+ const idx = specifier.lastIndexOf("@");
220
+ if (idx > 0) {
221
+ return specifier.substring(0, idx);
222
+ }
223
+ return specifier;
224
+ };
225
+ var resolveSpecifierPath = async (pattern, repoDir, log) => {
226
+ const [specifier, path6] = pattern.split("=");
227
+ if (path6) {
228
+ log.debug(`Resolved ${specifier} to path: ${path6}`);
229
+ return path6;
230
+ }
231
+ const repoPath = await getModulePath(specifier, repoDir, log);
232
+ if (repoPath) {
233
+ return repoPath;
234
+ }
235
+ return null;
236
+ };
237
+ var loadTransformOptions = async (opts, log) => {
238
+ const options = {
239
+ logger: log || logger_default(COMPILER, opts)
240
+ };
241
+ if (opts.adaptors?.length && opts.ignoreImports != true) {
242
+ let exports;
243
+ const [pattern] = opts.adaptors;
244
+ const [specifier] = pattern.split("=");
245
+ log.debug(`Attempting to preload types for ${specifier}`);
246
+ const path6 = await resolveSpecifierPath(pattern, opts.repoDir, log);
247
+ if (path6) {
248
+ try {
249
+ exports = await preloadAdaptorExports(
250
+ path6,
251
+ opts.useAdaptorsMonorepo,
252
+ log
253
+ );
254
+ } catch (e) {
255
+ log.error(`Failed to load adaptor typedefs from path ${path6}`);
256
+ log.error(e);
257
+ }
258
+ }
259
+ if (!exports || exports.length === 0) {
260
+ log.debug(`No module exports found for ${pattern}`);
261
+ }
262
+ options["add-imports"] = {
263
+ ignore: opts.ignoreImports,
264
+ adaptor: {
265
+ name: stripVersionSpecifier(specifier),
266
+ exports,
267
+ exportAll: true
268
+ }
269
+ };
270
+ }
271
+ return options;
272
+ };
273
+
274
+ // src/util/load-state.ts
275
+ import fs from "node:fs/promises";
276
+ var load_state_default = async (opts, log) => {
277
+ const { stateStdin, statePath } = opts;
278
+ log.debug("Loading state...");
279
+ if (stateStdin) {
280
+ try {
281
+ const json = JSON.parse(stateStdin);
282
+ log.success("Read state from stdin");
283
+ log.debug("state:", json);
284
+ return json;
285
+ } catch (e) {
286
+ log.error("Failed to load state from stdin");
287
+ log.error(stateStdin);
288
+ log.error(e);
289
+ process.exit(1);
290
+ }
291
+ }
292
+ if (statePath) {
293
+ try {
294
+ const str = await fs.readFile(statePath, "utf8");
295
+ const json = JSON.parse(str);
296
+ log.success(`Loaded state from ${statePath}`);
297
+ log.debug("state:", json);
298
+ return json;
299
+ } catch (e) {
300
+ log.warn(`Error loading state from ${statePath}`);
301
+ log.warn(e);
302
+ }
303
+ }
304
+ log.info(
305
+ "No state provided - using default state { data: {}, configuration: {} }"
306
+ );
307
+ return {
308
+ data: {},
309
+ configuration: {}
310
+ };
311
+ };
312
+
257
313
  // src/util/validate-adaptors.ts
258
314
  var validateAdaptors = async (options, logger) => {
259
315
  if (options.skipAdaptorValidation) {
260
316
  return;
261
317
  }
262
- if (!options.adaptors || options.adaptors.length === 0) {
318
+ const hasDeclaredAdaptors = options.adaptors && options.adaptors.length > 0;
319
+ if (options.workflowPath && hasDeclaredAdaptors) {
320
+ logger.error("ERROR: adaptor and workflow provided");
321
+ logger.error(
322
+ "This is probably not what you meant to do. A workflow should declare an adaptor for each job."
323
+ );
324
+ throw new Error("adaptor and workflow provided");
325
+ }
326
+ if (!options.workflowPath && !hasDeclaredAdaptors) {
263
327
  logger.warn("WARNING: No adaptor provided!");
264
328
  logger.warn(
265
329
  "This job will probably fail. Pass an adaptor with the -a flag, eg:"
@@ -271,22 +335,76 @@ var validateAdaptors = async (options, logger) => {
271
335
  };
272
336
  var validate_adaptors_default = validateAdaptors;
273
337
 
274
- // src/execute/handler.ts
275
- var getAutoinstallTargets = (options) => {
276
- if (options.adaptors) {
277
- return options.adaptors?.filter((a) => !/=/.test(a));
338
+ // src/util/load-input.ts
339
+ import path from "node:path";
340
+ import fs2 from "node:fs/promises";
341
+ import { isPath } from "@openfn/compiler";
342
+ var load_input_default = async (opts, log) => {
343
+ log.debug("Loading input...");
344
+ const { job, workflow, jobPath, workflowPath } = opts;
345
+ if (workflow || workflowPath) {
346
+ return loadWorkflow(opts, log);
347
+ }
348
+ if (job) {
349
+ return job;
350
+ }
351
+ if (jobPath) {
+ log.debug(`Loading job from ${jobPath}`);
+ opts.job = await fs2.readFile(jobPath, "utf8");
+ return opts.job;
+ }
+ };
+ var fetchFile = (rootDir, filePath) => {
+ const jobPath = filePath.startsWith("~") ? filePath : path.resolve(rootDir, filePath);
+ return fs2.readFile(jobPath, "utf8");
+ };
+ var loadWorkflow = async (opts, log) => {
+ const { workflowPath, workflow } = opts;
+ log.debug(`Loading workflow from ${workflowPath}`);
+ try {
+ let wf;
+ let rootDir = opts.baseDir;
+ if (workflowPath) {
+ const workflowRaw = await fs2.readFile(workflowPath, "utf8");
+ wf = JSON.parse(workflowRaw);
+ if (!rootDir) {
+ rootDir = path.dirname(workflowPath);
+ }
+ } else {
+ wf = workflow;
+ }
+ for (const job of wf.jobs) {
+ if (typeof job.expression === "string" && isPath(job.expression)) {
+ job.expression = await fetchFile(rootDir, job.expression);
+ }
+ if (typeof job.configuration === "string" && isPath(job.configuration)) {
+ const configString = await fetchFile(rootDir, job.configuration);
+ job.configuration = JSON.parse(configString);
+ }
+ }
+ opts.workflow = wf;
+ log.debug("Workflow loaded!");
+ return opts.workflow;
+ } catch (e) {
+ log.error(`Error loading workflow from ${workflowPath}`);
+ throw e;
  }
- return [];
  };
+
+ // src/execute/handler.ts
  var executeHandler = async (options, logger) => {
  const start = new Date().getTime();
  await validate_adaptors_default(options, logger);
+ let input = await load_input_default(options, logger);
+ if (options.workflow) {
+ expand_adaptors_default(options);
+ }
  const { repoDir, monorepoPath, autoinstall } = options;
  if (autoinstall) {
  if (monorepoPath) {
  logger.warn("Skipping auto-install as monorepo is being used");
  } else {
- const autoInstallTargets = getAutoinstallTargets(options);
+ const autoInstallTargets = get_autoinstall_targets_default(options);
  if (autoInstallTargets.length) {
  logger.info("Auto-installing language adaptors");
  await install({ packages: autoInstallTargets, repoDir }, logger);
@@ -294,21 +412,17 @@ var executeHandler = async (options, logger) => {
  }
  }
  const state = await load_state_default(options, logger);
- let code = "";
  if (options.compile) {
- code = await compile_default(options, logger);
+ input = await compile_default(options, logger);
  } else {
  logger.info("Skipping compilation as noCompile is set");
- if (options.jobPath) {
- code = await readFile(options.jobPath, "utf8");
- logger.success(`Loaded job from ${options.jobPath} (no compilation)`);
- }
  }
  try {
- const result = await execute_default(code, state, options);
+ const result = await execute_default(input, state, options);
  await serialize_output_default(options, result, logger);
  const duration = printDuration(new Date().getTime() - start);
  logger.success(`Done in ${duration}! \u2728`);
+ return result;
  } catch (error) {
  logger.error(error);
  const duration = printDuration(new Date().getTime() - start);
@@ -321,33 +435,65 @@ var handler_default = executeHandler;
  // src/compile/handler.ts
  import { writeFile as writeFile2 } from "node:fs/promises";
  var compileHandler = async (options, logger) => {
- const code = await compile_default(options, logger);
+ await load_input_default(options, logger);
+ let result = await compile_default(options, logger);
+ if (options.workflow) {
+ result = JSON.stringify(result);
+ }
  if (options.outputStdout) {
  logger.success("Compiled code:");
- logger.success("\n" + code);
+ logger.success("\n" + result);
  } else {
- await writeFile2(options.outputPath, code);
+ await writeFile2(options.outputPath, result);
  logger.success(`Compiled to ${options.outputPath}`);
  }
  };
  var handler_default2 = compileHandler;
 
  // src/test/handler.ts
- var sillyMessage = "Calculating the answer to life, the universe, and everything...";
  var testHandler = async (options, logger) => {
  logger.log("Running test job...");
  options.compile = true;
- options.jobSource = `const fn = () => state => { console.log('${sillyMessage}'); return state * 2; } ; fn()`;
- delete options.jobPath;
+ options.adaptors = [];
+ options.workflow = {
+ start: "start",
+ jobs: [
+ {
+ id: "start",
+ data: { defaultAnswer: 42 },
+ expression: "const fn = () => (state) => { console.log('Starting computer...'); return state; }; fn()",
+ next: {
+ calculate: "!state.error"
+ }
+ },
+ {
+ id: "calculate",
+ expression: "const fn = () => (state) => { console.log('Calculating to life, the universe, and everything..'); return state }; fn()",
+ next: {
+ result: true
+ }
+ },
+ {
+ id: "result",
+ expression: "const fn = () => (state) => ({ data: { answer: state.data.answer || state.data.defaultAnswer } }); fn()"
+ }
+ ]
+ };
+ logger.break();
+ logger.info("Workflow object:");
+ logger.info(JSON.stringify(options.workflow, null, 2));
+ logger.break();
  if (!options.stateStdin) {
- logger.debug("No state provided: try -S <number> to provide some state");
- options.stateStdin = "21";
+ logger.debug(
+ "No state provided: pass an object with state.data.answer to provide custom input"
+ );
+ logger.debug('eg: -S "{ "data": { "answer": 33 } }"');
  }
  const silentLogger = createNullLogger();
  const state = await load_state_default(options, silentLogger);
  const code = await compile_default(options, logger);
  const result = await execute_default(code, state, options);
- logger.success(`Result: ${result}`);
+ logger.success(`Result: ${result.data.answer}`);
  return result;
  };
  var handler_default3 = testHandler;
@@ -355,28 +501,28 @@ var handler_default3 = testHandler;
  // src/docgen/handler.ts
  import { writeFile as writeFile3 } from "node:fs/promises";
  import { readFileSync, writeFileSync, mkdirSync, rmSync } from "node:fs";
- import path from "node:path";
+ import path2 from "node:path";
  import { describePackage } from "@openfn/describe-package";
  import { getNameAndVersion as getNameAndVersion2 } from "@openfn/runtime";
  var RETRY_DURATION = 500;
  var RETRY_COUNT = 20;
  var TIMEOUT_MS = 1e3 * 60;
  var actualDocGen = (specifier) => describePackage(specifier, {});
- var ensurePath = (filePath) => mkdirSync(path.dirname(filePath), { recursive: true });
- var generatePlaceholder = (path5) => {
- writeFileSync(path5, `{ "loading": true, "timestamp": ${Date.now()}}`);
+ var ensurePath = (filePath) => mkdirSync(path2.dirname(filePath), { recursive: true });
+ var generatePlaceholder = (path6) => {
+ writeFileSync(path6, `{ "loading": true, "timestamp": ${Date.now()}}`);
  };
  var finish = (logger, resultPath) => {
  logger.success("Done! Docs can be found at:\n");
- logger.print(` ${path.resolve(resultPath)}`);
+ logger.print(` ${path2.resolve(resultPath)}`);
  };
- var generateDocs = async (specifier, path5, docgen, logger) => {
+ var generateDocs = async (specifier, path6, docgen, logger) => {
  const result = await docgen(specifier);
- await writeFile3(path5, JSON.stringify(result, null, 2));
- finish(logger, path5);
- return path5;
+ await writeFile3(path6, JSON.stringify(result, null, 2));
+ finish(logger, path6);
+ return path6;
  };
- var waitForDocs = async (docs, path5, logger, retryDuration = RETRY_DURATION) => {
+ var waitForDocs = async (docs, path6, logger, retryDuration = RETRY_DURATION) => {
  try {
  if (docs.hasOwnProperty("loading")) {
  logger.info("Docs are being loaded by another process. Waiting.");
@@ -388,19 +534,19 @@ var waitForDocs = async (docs, path5, logger, retryDuration = RETRY_DURATION) =>
  clearInterval(i);
  reject(new Error("Timed out waiting for docs to load"));
  }
- const updated = JSON.parse(readFileSync(path5, "utf8"));
+ const updated = JSON.parse(readFileSync(path6, "utf8"));
  if (!updated.hasOwnProperty("loading")) {
  logger.info("Docs found!");
  clearInterval(i);
- resolve(path5);
+ resolve(path6);
  }
  count++;
  }, retryDuration);
  });
  } else {
- logger.info(`Docs already written to cache at ${path5}`);
- finish(logger, path5);
- return path5;
+ logger.info(`Docs already written to cache at ${path6}`);
+ finish(logger, path6);
+ return path6;
  }
  } catch (e) {
  logger.error("Existing doc JSON corrupt. Aborting");
@@ -417,28 +563,28 @@ var docgenHandler = (options, logger, docgen = actualDocGen, retryDuration = RET
  process.exit(9);
  }
  logger.success(`Generating docs for ${specifier}`);
- const path5 = `${repoDir}/docs/${specifier}.json`;
- ensurePath(path5);
+ const path6 = `${repoDir}/docs/${specifier}.json`;
+ ensurePath(path6);
  const handleError = () => {
  logger.info("Removing placeholder");
- rmSync(path5);
+ rmSync(path6);
  };
  try {
- const existing = readFileSync(path5, "utf8");
+ const existing = readFileSync(path6, "utf8");
  const json = JSON.parse(existing);
  if (json && json.timeout && Date.now() - json.timeout >= TIMEOUT_MS) {
  logger.info(`Expired placeholder found. Removing.`);
- rmSync(path5);
+ rmSync(path6);
  throw new Error("TIMEOUT");
  }
- return waitForDocs(json, path5, logger, retryDuration);
+ return waitForDocs(json, path6, logger, retryDuration);
  } catch (e) {
  if (e.message !== "TIMEOUT") {
- logger.info(`Docs JSON not found at ${path5}`);
+ logger.info(`Docs JSON not found at ${path6}`);
  }
  logger.debug("Generating placeholder");
- generatePlaceholder(path5);
- return generateDocs(specifier, path5, docgen, logger).catch((e2) => {
+ generatePlaceholder(path6);
+ return generateDocs(specifier, path6, docgen, logger).catch((e2) => {
  logger.error("Error generating documentation");
  logger.error(e2);
  handleError();
@@ -448,7 +594,7 @@ var docgenHandler = (options, logger, docgen = actualDocGen, retryDuration = RET
  var handler_default4 = docgenHandler;
 
  // src/docs/handler.ts
- import { readFile as readFile2 } from "node:fs/promises";
+ import { readFile } from "node:fs/promises";
  import { getNameAndVersion as getNameAndVersion3, getLatestVersion } from "@openfn/runtime";
  var describeFn = (adaptorName, fn) => `## ${fn.name}(${fn.parameters.map(({ name }) => name).join(",")})
 
@@ -477,7 +623,8 @@ ${data.functions.map((fn) => ` ${fn.name}(${fn.parameters.map((p) => p.name).jo
  `;
  var docsHandler = async (options, logger) => {
  const { adaptor, operation, repoDir } = options;
- const [adaptorName] = expand_adaptors_default([adaptor], logger);
+ const { adaptors } = expand_adaptors_default({ adaptors: [adaptor] });
+ const [adaptorName] = adaptors;
  let { name, version } = getNameAndVersion3(adaptorName);
  if (!version) {
  logger.info("No version number provided, looking for latest...");
@@ -486,29 +633,38 @@ var docsHandler = async (options, logger) => {
  logger.success(`Showing docs for ${adaptorName} v${version}`);
  }
  logger.info("Generating/loading documentation...");
- const path5 = await handler_default4(
+ const path6 = await handler_default4(
  {
  specifier: `${name}@${version}`,
  repoDir
  },
  createNullLogger()
  );
- if (path5) {
- const source = await readFile2(path5, "utf8");
+ let didError = false;
+ if (path6) {
+ const source = await readFile(path6, "utf8");
  const data = JSON.parse(source);
  let desc;
  if (operation) {
  const fn = data.functions.find(({ name: name2 }) => name2 === operation);
- logger.debug("Operation schema:", fn);
- logger.success(`Documentation for ${name}.${operation} v${version}:
+ if (fn) {
+ logger.debug("Operation schema:", fn);
+ logger.success(`Documentation for ${name}.${operation} v${version}:
  `);
- desc = describeFn(name, fn);
+ desc = describeFn(name, fn);
+ } else {
+ logger.error(`Failed to find ${operation} in ${name}`);
+ }
  } else {
  logger.debug("No operation provided, listing available operations");
  desc = describeLib(name, data);
  }
  logger.print(desc);
- logger.success("Done!");
+ if (didError) {
+ logger.error("Error");
+ } else {
+ logger.success("Done!");
+ }
  } else {
  logger.error("Not found");
  }
@@ -518,13 +674,29 @@ var handler_default5 = docsHandler;
  // src/metadata/cache.ts
  import { createHash } from "node:crypto";
  import { readFileSync as readFileSync2 } from "node:fs";
- import path2 from "node:path";
+ import path3 from "node:path";
  import { writeFile as writeFile4, mkdir } from "node:fs/promises";
  var getPath = (repoDir, key) => `${repoDir}/meta/${key}.json`;
- var generateKey = (config) => createHash("sha256").update(JSON.stringify(config)).digest("hex");
+ var sortKeys = (obj) => {
+ const newObj = {};
+ Object.keys(obj).sort().forEach((k) => {
+ const v = obj[k];
+ if (!v || typeof v == "string" || !isNaN(v) || Array.isArray(v)) {
+ newObj[k] = v;
+ } else {
+ newObj[k] = sortKeys(v);
+ }
+ });
+ return newObj;
+ };
+ var generateKey = (config, adaptor) => {
+ const sorted = sortKeys(config);
+ const key = `${JSON.stringify(sorted)}-${adaptor}}`;
+ return createHash("sha256").update(key).digest("hex");
+ };
  var get = (repoPath, key) => {
  try {
- const data = readFileSync2(getPath(repoPath, key));
+ const data = readFileSync2(getPath(repoPath, key), "utf8");
  const json = JSON.parse(data);
  return json;
  } catch (e) {
@@ -533,10 +705,10 @@ var get = (repoPath, key) => {
  };
  var set = async (repoPath, key, data) => {
  const fullPath = getPath(repoPath, key);
- await mkdir(path2.dirname(fullPath), { recursive: true });
+ await mkdir(path3.dirname(fullPath), { recursive: true });
  await writeFile4(fullPath, JSON.stringify(data));
  };
- var cache_default = { get, set, generateKey, getPath };
+ var cache_default = { get, set, generateKey, getPath, sortKeys };
 
  // src/metadata/handler.ts
  import { getModuleEntryPoint } from "@openfn/runtime";
@@ -576,8 +748,8 @@ var metadataHandler = async (options, logger) => {
  const adaptor = adaptors[0];
  const state = await load_state_default(options, logger);
  logger.success(`Generating metadata`);
+ logger.info("config:", state);
  const config = state.configuration;
- logger.info("config:", config);
  if (!config || Object.keys(config).length === 0) {
  logger.error("ERROR: Invalid configuration passed");
  process.exit(1);
@@ -586,7 +758,7 @@ var metadataHandler = async (options, logger) => {
  logger.success("Done!");
  logger.print(cache_default.getPath(repoDir, id));
  };
- const id = cache_default.generateKey(config);
+ const id = cache_default.generateKey(config, adaptor);
  if (!options.force) {
  logger.debug("config hash: ", id);
  const cached = await cache_default.get(repoDir, id);
@@ -617,13 +789,13 @@ var metadataHandler = async (options, logger) => {
  var handler_default6 = metadataHandler;
 
  // src/util/use-adaptors-repo.ts
- import { readFile as readFile3 } from "node:fs/promises";
- import path3 from "node:path";
+ import { readFile as readFile2 } from "node:fs/promises";
+ import path4 from "node:path";
  import assert from "node:assert";
- import { getNameAndVersion as getNameAndVersion4 } from "@openfn/runtime";
+ import { getNameAndVersion as getNameAndVersion5 } from "@openfn/runtime";
  var validateMonoRepo = async (repoPath, log) => {
  try {
- const raw = await readFile3(`${repoPath}/package.json`, "utf8");
+ const raw = await readFile2(`${repoPath}/package.json`, "utf8");
  const pkg = JSON.parse(raw);
  assert(pkg.name === "adaptors");
  } catch (e) {
@@ -635,14 +807,14 @@ var updatePath = (adaptor, repoPath, log) => {
  if (adaptor.match("=")) {
  return adaptor;
  }
- const { name, version } = getNameAndVersion4(adaptor);
+ const { name, version } = getNameAndVersion5(adaptor);
  if (version) {
  log.warn(
  `Warning: Ignoring version specifier on ${adaptor} as loading from the adaptors monorepo`
  );
  }
  const shortName = name.replace("@openfn/language-", "");
- const abspath = path3.resolve(repoPath, "packages", shortName);
+ const abspath = path4.resolve(repoPath, "packages", shortName);
  return `${name}=${abspath}`;
  };
  var useAdaptorsRepo = async (adaptors, repoPath, log) => {
@@ -659,8 +831,8 @@ var use_adaptors_repo_default = useAdaptorsRepo;
 
  // src/util/print-versions.ts
  import { readFileSync as readFileSync3 } from "node:fs";
- import path4 from "node:path";
- import { getNameAndVersion as getNameAndVersion5 } from "@openfn/runtime";
+ import path5 from "node:path";
+ import { getNameAndVersion as getNameAndVersion6 } from "@openfn/runtime";
  import { mainSymbols } from "figures";
  var NODE = "node.js";
  var CLI2 = "cli";
@@ -669,7 +841,7 @@ var COMPILER2 = "compiler";
  var { triangleRightSmall: t } = mainSymbols;
  var loadVersionFromPath = (adaptorPath) => {
  try {
- const pkg = JSON.parse(readFileSync3(path4.resolve(adaptorPath, "package.json"), "utf8"));
+ const pkg = JSON.parse(readFileSync3(path5.resolve(adaptorPath, "package.json"), "utf8"));
  return pkg.version;
  } catch (e) {
  return "unknown";
@@ -687,9 +859,9 @@ var printVersions = async (logger, options = {}) => {
  if (adaptor.match("=")) {
  const [namePart, pathPart] = adaptor.split("=");
  adaptorVersion = loadVersionFromPath(pathPart);
- adaptorName = getNameAndVersion5(namePart).name;
+ adaptorName = getNameAndVersion6(namePart).name;
  } else {
- const { name, version: version2 } = getNameAndVersion5(adaptor);
+ const { name, version: version2 } = getNameAndVersion6(adaptor);
  adaptorName = name;
  adaptorVersion = version2 || "latest";
  }
@@ -746,7 +918,7 @@ var handlers = {
  ["repo-list"]: list,
  version: async (opts, logger) => print_versions_default(logger, opts)
  };
- var maybeEnsureOpts = (basePath, options) => /^(execute|compile)$/.test(options.command) ? ensureLogOpts(options) : ensureOpts(basePath, options);
+ var maybeEnsureOpts = (basePath, options) => /^(execute|compile|test)$/.test(options.command) ? ensureLogOpts(options) : ensureOpts(basePath, options);
  var parse = async (basePath, options, log) => {
  const opts = maybeEnsureOpts(basePath, options);
  const logger = log || logger_default(CLI, opts);
@@ -767,9 +939,9 @@ var parse = async (basePath, options, log) => {
  logger
  );
  } else if (opts.adaptors && opts.expandAdaptors) {
- opts.adaptors = expand_adaptors_default(opts.adaptors);
+ expand_adaptors_default(opts);
  }
- if (/^(test|version)$/.test(opts.command) && !opts.repoDir) {
+ if (!/^(test|version)$/.test(opts.command) && !opts.repoDir) {
  logger.warn(
  "WARNING: no repo module dir found! Using the default (/tmp/repo)"
  );
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@openfn/cli",
- "version": "0.0.34",
+ "version": "0.0.38",
  "description": "CLI devtools for the openfn toolchain.",
  "engines": {
  "node": ">=18",
@@ -36,15 +36,14 @@
  "typescript": "^4.7.4"
  },
  "dependencies": {
- "fast-safe-stringify": "^2.1.1",
+ "@openfn/compiler": "0.0.31",
+ "@openfn/describe-package": "0.0.16",
+ "@openfn/logger": "0.0.12",
+ "@openfn/runtime": "0.0.23",
  "figures": "^5.0.0",
  "rimraf": "^3.0.2",
  "treeify": "^1.1.0",
- "yargs": "^17.5.1",
- "@openfn/compiler": "0.0.28",
- "@openfn/describe-package": "0.0.15",
- "@openfn/logger": "0.0.12",
- "@openfn/runtime": "0.0.21"
+ "yargs": "^17.5.1"
  },
  "files": [
  "dist",