@datapos/datapos-development 0.3.102 → 0.3.113

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,9 +1,10 @@
1
1
  # Data Positioning Development Library
2
2
 
3
+ [![Quality Gate Status](https://sonarcloud.io/api/project_badges/measure?project=data-positioning_datapos-development&metric=alert_status)](https://sonarcloud.io/summary/new_code?id=data-positioning_datapos-development)
3
4
  [![npm version](https://img.shields.io/npm/v/@datapos/datapos-development.svg)](https://www.npmjs.com/package/@datapos/datapos-development)
4
5
  [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](./LICENSE)
5
6
 
6
- A TypeScript library of utilities for managing the Data Positioning repositories.
7
+ A library of utilities for managing the Data Positioning repositories.
7
8
 
8
9
  ## Requirements
9
10
 
@@ -33,57 +34,74 @@ registry=https://registry.npmjs.org/
33
34
 
34
35
  The `src/index.ts` file exposes the following utilities:
35
36
 
36
- | Name | Notes |
37
- | ------------------------- | ------------------------------------------------------------------------ |
38
- | buildConfig | Build the config.json file for the repository. |
39
- | buildConnectorConfig | Build the connector config.json file for the repository. |
40
- | buildContextConfig | Build the context config.json file for the repository. |
41
- | buildInformerConfig | Build the informer config.json file for the repository. |
42
- | buildPresenterConfig | Build the presenter config.json file for the repository. |
43
- | buildPublicDirectoryIndex | Build an index for the repositories public directory. |
44
- | bumpVersion | Bump the repositories version number. |
45
- | echoScriptNotImplemented | Echo script not implemented message to console.. |
46
- | sendDeploymentNotice | Send a deployment notice for the repository. |
47
- | syncWithGitHub | Synchronise the local repository with the main GitHub repository. |
48
- | uploadDirectoryToR2 | Upload a directory to Cloudflare R2 storage. |
49
- | uploadModuleConfigToDO | Upload a modules configuration to the Cloudflare `state` durable object. |
50
- | uploadModuleToR2 | Upload a module to Cloudflare R2 storage. |
37
+ | Name | Notes |
38
+ | ----------------------------------------- | ---------------------------------------------------------------------------- |
39
+ | buildConfig | Build the config.json file for the repository. |
40
+ | buildConnectorConfig | Build the connector config.json file for the repository. |
41
+ | buildContextConfig | Build the context config.json file for the repository. |
42
+ | buildPresenterConfig | Build the presenter config.json file for the repository. |
43
+ | buildPublicDirectoryIndex                 | Build an index for the repository's public directory.                         |
44
+ | bumpVersion                               | Bump the repository's version number.                                         |
45
+ | echoScriptNotImplemented                  | Echo a 'script not implemented' message to the console.                       |
46
+ | insertLicensesIntoReadme | Insert the licenses for all production dependencies into the README.md file. |
47
+ | insertOWASPDependencyCheckBadgeIntoReadme | Insert the OWASP Dependency Check badge into the README.md file.              |
48
+ | sendDeploymentNotice | Send a deployment notice for the repository. |
49
+ | syncWithGitHub | Synchronise the local repository with the main GitHub repository. |
50
+ | uploadDirectoryToR2 | Upload a directory to Cloudflare R2 storage. |
51
+ | uploadModuleConfigToDO                    | Upload a module's configuration to the Cloudflare `state` durable object.     |
52
+ | uploadModuleToR2 | Upload a module to Cloudflare R2 storage. |
53
+
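The patch-increment behaviour described for `bumpVersion` can be sketched roughly as follows. This is a minimal illustration assuming a plain `major.minor.patch` version string; `bumpPatch` is a hypothetical helper name, not an export of `@datapos/datapos-development`.

```javascript
// Minimal sketch of the patch-increment step performed by `bumpVersion`.
// `bumpPatch` is an illustrative helper, not part of the package's API.
function bumpPatch(version) {
    const [major, minor, patch] = version.split(".");
    // Only the patch segment is incremented; major and minor pass through.
    return `${major}.${minor}.${Number(patch) + 1}`;
}

console.log(bumpPatch("0.3.112")); // prints 0.3.113
```

The real utility additionally reads and rewrites `package.json`, and initialises the version when none is present.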
54
+ ## Reports & Compliance
55
+
56
+ ### Dependency Check Report
57
+
58
+ The OWASP Dependency Check Report identifies known vulnerabilities in project dependencies. It is generated automatically on each release using the npm package `owasp-dependency-check`.
59
+
60
+ [View the OWASP Dependency Check Report](https://data-positioning.github.io/datapos-tool-micromark/dependency-check-reports/dependency-check-report.html)
61
+
62
+ ### Dependency Licenses
63
+
64
+ The following table lists top-level production and peer dependencies only. All dependency licenses (including transitive dependencies) have been recursively verified to conform to Apache-2.0, CC0-1.0, MIT, or n/a. Developers cloning this repository should independently verify dev and optional dependencies; users of the uploaded library are covered by these checks.
65
+
66
+ <!-- DEPENDENCY_LICENSES_START -->
67
+ | Name | Type | Installed | Latest | Latest Modified |
68
+ | :---------------------- | :--: | :-------: | :-----: | :----------------------- |
69
+ | @datapos/datapos-shared | MIT | 0.3.252 | 0.3.252 | 2025-11-25T16:48:28.532Z |
70
+ <!-- DEPENDENCY_LICENSES_END -->
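The `insertLicensesIntoReadme` utility splices the generated `licenses.md` table between the two HTML-comment markers shown above. A rough sketch of that splice step, under the assumption that both markers are present exactly once; `spliceBetweenMarkers` is an illustrative name, not a package export.

```javascript
// Sketch of the marker-splice step used by `insertLicensesIntoReadme`:
// whatever sits between the two comments is replaced with the fresh table.
// `spliceBetweenMarkers` is a hypothetical helper, not a package export.
const START = "<!-- DEPENDENCY_LICENSES_START -->";
const END = "<!-- DEPENDENCY_LICENSES_END -->";

function spliceBetweenMarkers(readme, table) {
    const startIndex = readme.indexOf(START);
    const endIndex = readme.indexOf(END);
    if (startIndex === -1 || endIndex === -1) {
        throw new Error("Markers not found in README.md");
    }
    // Keep everything up to and including the start marker, insert the new
    // table, then resume from the end marker onwards.
    return (
        readme.substring(0, startIndex + START.length) +
        "\n" + table + "\n" +
        readme.substring(endIndex)
    );
}
```

Because the splice is idempotent over the marker pair, the script can run on every release without accumulating stale rows.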
71
+
72
+ ### Bundle Analysis Report
73
+
74
+ The Bundle Analysis Report provides a detailed breakdown of the bundle's composition and module sizes, helping to identify which modules contribute most to the final build. It is generated automatically on each release using the npm package `rollup-plugin-visualizer`.
75
+
76
+ [View the Bundle Analysis Report](https://data-positioning.github.io/datapos-development/stats/index.html)
51
77
 
52
78
  ## Repository Common Management Commands
53
79
 
54
80
  The table below lists the repository management commands available in this project.
55
81
  For detailed implementation, see the `scripts` section in the `package.json` file.
56
82
 
57
- | Name | Key Code | Notes |
83
+ | Name               | VS Code Key Binding | Notes                                                                                                                                           |
58
84
  | ------------------ | ---------------- | ----------------------------------------------------------------------------------------------------------------------------------------------- |
59
85
  | audit | alt+ctrl+shift+a | Audit the project's dependencies for known security vulnerabilities. |
60
86
  | build | alt+ctrl+shift+b | Build the package using Vite. Output to '/dist' directory. |
61
87
  | check | alt+ctrl+shift+c | Identify outdated dependencies using npm `outdated` and `npm-check-updates` with option to install latest versions. Also runs `retire` scanner. |
62
88
  | document | alt+ctrl+shift+d | Identify licenses of the project's production and peer dependencies. See [LICENSES.json](./LICENSES.json). |
63
- | format | alt+ctrl+shift+f | Use `prettier`to enforce formatting style rules. |
64
- | lint | alt+ctrl+shift+l | Use `eslint`to check the code for potential errors and enforces coding style rules. |
89
+ | format | alt+ctrl+shift+f | Use `prettier` to enforce formatting style rules. |
90
+ | lint               | alt+ctrl+shift+l | Use `eslint` to check the code for potential errors and enforce coding style rules.                                                             |
65
91
  | publish | alt+ctrl+shift+p | Publish the package to `npm`. |
66
92
  | release | alt+ctrl+shift+r | Bump version, build library, synchronise with `GitHub` and publish to `npm`. |
67
93
  | sync:withGitHub | alt+ctrl+shift+s | Synchronise local repository with the main GitHub repository. |
68
94
  | test | alt+ctrl+shift+t | ❌ Not implemented. |
69
95
  | update:dataPosDeps | alt+ctrl+shift+u | Install the latest version of all Data Positioning dependencies. |
70
96
 
71
- ## Bundle Analysis
72
-
73
- View the [bundle report](https://data-positioning.github.io/datapos-development/stats/index.html) to analyze the bundle composition and module sizes (generated with rollup-plugin-visualizer).
74
-
75
97
  ## TODO
76
98
 
77
99
  1. Enhance `uploadDirectoryToR2` to batch-upload files for greater efficiency and performance.
78
100
  1. Replace regex with TypeScript AST in `buildConnectorConfig`. Extracting method names from index.ts via regex is fragile — breaks with decorators, comments, or new syntax (e.g., class fields). AST parsing (via @babel/parser or typescript) would be safer.
79
101
  1. Implement zod to validate config schemas.
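The second TODO concerns the method-name regex used by `buildConnectorConfig`. Its current behaviour can be exercised on a small sample class (the class below is illustrative, not from the codebase); note that it depends on exactly four spaces of indentation and on modifiers appearing in a fixed order, which is why an AST-based approach would be more robust.

```javascript
// The method-extraction regex as it appears in the built output: it matches
// 4-space-indented method signatures and skips private methods and the
// constructor. The sample source is illustrative only.
const METHOD_REGEX = /^\s{4}(?:async\s+)?(private\s+)?(?:public\s+|protected\s+)?([A-Za-z_]\w*)\s*\(/gm;

const sample = [
    "class Connector {",
    "    constructor() {}",
    "    async listNodes(parentId) {}",
    "    private helper() {}",
    "    upsertRecords(records) {}",
    "}"
].join("\n");

const names = [...sample.matchAll(METHOD_REGEX)]
    .filter((m) => m[1] == null && m[2] !== "constructor")
    .map((m) => m[2]);

console.log(names); // → ["listNodes", "upsertRecords"]
```

A decorated method (`    @log listNodes(...)`) or a two-space-indented class body would silently be dropped by this regex, which is the fragility the TODO describes.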
80
102
 
81
- ## Compliance
82
-
83
- The following badge reflects FOSSA's assessment of this repository's open-source license compliance.
84
-
85
- [![FOSSA Status](https://app.fossa.com/api/projects/git%2Bgithub.com%2Fdata-positioning%2Fdatapos-development.svg?type=large&issueType=license)](https://app.fossa.com/projects/git%2Bgithub.com%2Fdata-positioning%2Fdatapos-development?ref=badge_large&issueType=license)
86
-
87
103
  ## License
88
104
 
105
+ This project is licensed under the MIT License, allowing free use, modification, and distribution.
106
+
89
107
  [MIT](./LICENSE) © 2026 Data Positioning Pty Ltd
@@ -1,13 +1,13 @@
1
1
  import { exec as m } from "node:child_process";
2
- import { promises as e } from "node:fs";
3
- import { nanoid as w } from "nanoid";
4
- import { promisify as y } from "node:util";
5
- const h = ["createObject", "dropObject", "removeRecords", "upsertRecords"], b = ["findObject", "getRecord", "listNodes", "previewObject", "retrieveRecords"], g = y(m);
2
+ import { promises as n } from "node:fs";
3
+ import { nanoid as y } from "nanoid";
4
+ import { promisify as w } from "node:util";
5
+ const h = ["createObject", "dropObject", "removeRecords", "upsertRecords"], b = ["findObject", "getRecord", "listNodes", "previewObject", "retrieveRecords"], g = w(m);
6
6
  async function S() {
7
7
  try {
8
8
  console.info("🚀 Building configuration...");
9
- const o = JSON.parse(await e.readFile("package.json", "utf8")), n = JSON.parse(await e.readFile("config.json", "utf8"));
10
- o.name != null && (n.id = o.name.replace("@datapos/", "").replace("@data-positioning/", "")), o.version != null && (n.version = o.version), await e.writeFile("config.json", JSON.stringify(n, void 0, 4), "utf8"), console.info("✅ Configuration built.");
9
+ const o = JSON.parse(await n.readFile("package.json", "utf8")), e = JSON.parse(await n.readFile("config.json", "utf8"));
10
+ o.name != null && (e.id = o.name.replace("@datapos/", "").replace("@data-positioning/", "")), o.version != null && (e.version = o.version), await n.writeFile("config.json", JSON.stringify(e, void 0, 4), "utf8"), console.info("✅ Configuration built.");
11
11
  } catch (o) {
12
12
  console.error("❌ Error building configuration.", o);
13
13
  }
@@ -15,21 +15,21 @@ async function S() {
15
15
  async function O(o) {
16
16
  try {
17
17
  console.info(`🚀 Building public directory index for identifier '${o}'...`);
18
- const n = {};
19
- async function t(r, s) {
18
+ const e = {};
19
+ async function i(r, s) {
20
20
  console.info(`⚙️ Processing directory '${r}'...`);
21
21
  const d = [], a = r.substring(`public/${o}`.length);
22
- n[a] = d;
22
+ e[a] = d;
23
23
  for (const c of s) {
24
24
  const l = `${r}/${c}`;
25
25
  try {
26
- const f = await e.stat(l);
26
+ const f = await n.stat(l);
27
27
  if (f.isDirectory()) {
28
- const p = await e.readdir(l), u = { childCount: p.length, name: `${c}`, typeId: "folder" };
29
- d.push(u), await t(l, p);
28
+ const u = await n.readdir(l), p = { childCount: u.length, name: `${c}`, typeId: "folder" };
29
+ d.push(p), await i(l, u);
30
30
  } else {
31
- const p = { id: w(), lastModifiedAt: f.mtimeMs, name: c, size: f.size, typeId: "object" };
32
- d.push(p);
31
+ const u = { id: y(), lastModifiedAt: f.mtimeMs, name: c, size: f.size, typeId: "object" };
32
+ d.push(u);
33
33
  }
34
34
  } catch (f) {
35
35
  throw new Error(`Unable to get information for '${c}' in 'buildPublicDirectoryIndex'. ${String(f)}`);
@@ -40,24 +40,24 @@ async function O(o) {
40
40
  return f === 0 ? c.name.localeCompare(l.name) : f;
41
41
  });
42
42
  }
43
- const i = await e.readdir(`public/${o}`);
44
- await t(`public/${o}`, i), await e.writeFile(`./public/${o}Index.json`, JSON.stringify(n), "utf8"), console.info("✅ Public directory index built.");
45
- } catch (n) {
46
- console.error("❌ Error building public directory index.", n);
43
+ const t = await n.readdir(`public/${o}`);
44
+ await i(`public/${o}`, t), await n.writeFile(`./public/${o}Index.json`, JSON.stringify(e), "utf8"), console.info("✅ Public directory index built.");
45
+ } catch (e) {
46
+ console.error("❌ Error building public directory index.", e);
47
47
  }
48
48
  }
49
49
  async function J() {
50
50
  try {
51
51
  console.info("🚀 Building connector configuration...");
52
- const o = JSON.parse(await e.readFile("package.json", "utf8")), n = JSON.parse(await e.readFile("config.json", "utf8")), t = await e.readFile("src/index.ts", "utf8");
53
- let i = !1, r = !1;
54
- const s = /^\s{4}(?:async\s+)?(private\s+)?(?:public\s+|protected\s+)?([A-Za-z_]\w*)\s*\(/gm, d = [...t.matchAll(s)].filter((c) => !c[1] && c[2] !== "constructor").map((c) => {
52
+ const o = JSON.parse(await n.readFile("package.json", "utf8")), e = JSON.parse(await n.readFile("config.json", "utf8")), i = await n.readFile("src/index.ts", "utf8");
53
+ let t = !1, r = !1;
54
+ const s = /^\s{4}(?:async\s+)?(private\s+)?(?:public\s+|protected\s+)?([A-Za-z_]\w*)\s*\(/gm, d = [...i.matchAll(s)].filter((c) => c[1] == null && c[2] !== "constructor").map((c) => {
55
55
  const l = c[2];
56
- return i = i || h.includes(l), r = r || b.includes(l), l;
56
+ return t = t || h.includes(l), r = r || b.includes(l), l;
57
57
  });
58
58
  d.length > 0 ? console.info(`ℹ️ Implements ${d.length} operations.`) : console.warn("⚠️ Implements no operations.");
59
- const a = r && i ? "bidirectional" : r ? "source" : i ? "destination" : "unknown";
60
- a && console.info(`ℹ️ Supports ${a} usage.`), o.name != null && (n.id = o.name), n.operations = d, n.usageId = a, o.version != null && (n.version = o.version), await e.writeFile("config.json", JSON.stringify(n, void 0, 4), "utf8"), console.info("✅ Connector configuration built.");
59
+ const a = r && t ? "bidirectional" : r ? "source" : t ? "destination" : "unknown";
60
+ a && console.info(`ℹ️ Supports ${a} usage.`), o.name != null && (e.id = o.name), e.operations = d, e.usageId = a, o.version != null && (e.version = o.version), await n.writeFile("config.json", JSON.stringify(e, void 0, 4), "utf8"), console.info("✅ Connector configuration built.");
61
61
  } catch (o) {
62
62
  console.error("❌ Error building connector configuration.", o);
63
63
  }
@@ -65,8 +65,8 @@ async function J() {
65
65
  async function x() {
66
66
  try {
67
67
  console.info("🚀 Building context configuration...");
68
- const o = JSON.parse(await e.readFile("package.json", "utf8")), n = JSON.parse(await e.readFile("config.json", "utf8")), t = await e.readFile("src/index.ts", "utf8"), i = /^\s{4}(?:async\s+)?(private\s+)?(?:public\s+|protected\s+)?([A-Za-z_]\w*)\s*\(/gm, r = [...t.matchAll(i)].filter((s) => !s[1] && s[2] !== "constructor").map((s) => s[2]);
69
- o.name != null && (n.id = o.name), n.operations = r, o.version != null && (n.version = o.version), await e.writeFile("config.json", JSON.stringify(n, void 0, 4), "utf8"), console.info("✅ Context configuration built.");
68
+ const o = JSON.parse(await n.readFile("package.json", "utf8")), e = JSON.parse(await n.readFile("config.json", "utf8")), i = await n.readFile("src/index.ts", "utf8"), t = /^\s{4}(?:async\s+)?(private\s+)?(?:public\s+|protected\s+)?([A-Za-z_]\w*)\s*\(/gm, r = [...i.matchAll(t)].filter((s) => s[1] == null && s[2] !== "constructor").map((s) => s[2]);
69
+ o.name != null && (e.id = o.name), e.operations = r, o.version != null && (e.version = o.version), await n.writeFile("config.json", JSON.stringify(e, void 0, 4), "utf8"), console.info("✅ Context configuration built.");
70
70
  } catch (o) {
71
71
  console.error("❌ Error building context configuration.", o);
72
72
  }
@@ -74,50 +74,63 @@ async function x() {
74
74
  async function j() {
75
75
  try {
76
76
  console.info("🚀 Building presenter configuration...");
77
- const o = JSON.parse(await e.readFile("package.json", "utf8")), n = JSON.parse(await e.readFile("config.json", "utf8")), t = await e.readFile("src/index.ts", "utf8"), i = /^\s{4}(?:async\s+)?(private\s+)?(?:public\s+|protected\s+)?([A-Za-z_]\w*)\s*\(/gm, r = [...t.matchAll(i)].filter((s) => !s[1] && s[2] !== "constructor").map((s) => s[2]);
78
- o.name != null && (n.id = o.name), n.operations = r, o.version != null && (n.version = o.version), await e.writeFile("config.json", JSON.stringify(n, void 0, 4), "utf8"), console.info("✅ Presenter configuration built.");
77
+ const o = JSON.parse(await n.readFile("package.json", "utf8")), e = JSON.parse(await n.readFile("config.json", "utf8")), i = await n.readFile("src/index.ts", "utf8"), t = /^\s{4}(?:async\s+)?(private\s+)?(?:public\s+|protected\s+)?([A-Za-z_]\w*)\s*\(/gm, r = [...i.matchAll(t)].filter((s) => !s[1] && s[2] !== "constructor").map((s) => s[2]);
78
+ o.name != null && (e.id = o.name), e.operations = r, o.version != null && (e.version = o.version), await n.writeFile("config.json", JSON.stringify(e, void 0, 4), "utf8"), console.info("✅ Presenter configuration built.");
79
79
  } catch (o) {
80
80
  console.error("❌ Error building presenter configuration.", o);
81
81
  }
82
82
  }
83
- async function C() {
83
+ async function C(o = "./") {
84
84
  try {
85
85
  console.info("🚀 Bumping version...");
86
- const o = JSON.parse(await e.readFile("package.json", "utf8"));
87
- if (o.version != null) {
88
- const n = o.version, t = o.version.split(".");
89
- o.version = `${t[0]}.${t[1]}.${Number(t[2]) + 1}`, await e.writeFile("package.json", JSON.stringify(o, void 0, 4), "utf8"), console.info(`✅ Version bumped from ${n} to ${o.version}.`);
90
- } else
91
- o.version = "0.0.001", await e.writeFile("package.json", JSON.stringify(o, void 0, 4), "utf8"), console.warn(`⚠️ Version initialised to ${o.version}.`);
92
- } catch (o) {
93
- console.error("❌ Error bumping package version.", o);
86
+ const e = JSON.parse(await n.readFile(`${o}package.json`, "utf8"));
87
+ if (e.version == null)
88
+ e.version = "0.0.001", await n.writeFile(`${o}package.json`, JSON.stringify(e, void 0, 4), "utf8"), console.warn(`⚠️ Version initialised to ${e.version}.`);
89
+ else {
90
+ const i = e.version, t = e.version.split(".");
91
+ e.version = `${t[0]}.${t[1]}.${Number(t[2]) + 1}`, await n.writeFile(`${o}package.json`, JSON.stringify(e, void 0, 4), "utf8"), console.info(`✅ Version bumped from ${i} to ${e.version}.`);
92
+ }
93
+ } catch (e) {
94
+ console.error("❌ Error bumping package version.", e);
94
95
  }
95
96
  }
96
97
  function F(o) {
97
98
  console.error(`❌ ${o} script not implemented.`);
98
99
  }
99
100
  async function R() {
100
- const o = "<!-- DEPENDENCY_LICENSES_START -->", n = "<!-- DEPENDENCY_LICENSES_END -->";
101
+ const o = "<!-- DEPENDENCY_LICENSES_START -->", e = "<!-- DEPENDENCY_LICENSES_END -->";
101
102
  try {
102
- const t = (await e.readFile("./licenses.md", "utf8")).trim(), i = await e.readFile("./README.md", "utf8"), r = i.indexOf(o), s = i.indexOf(n);
103
+ const i = (await n.readFile("./licenses.md", "utf8")).trim(), t = await n.readFile("./README.md", "utf8"), r = t.indexOf(o), s = t.indexOf(e);
103
104
  (r === -1 || s === -1) && (console.error("Error: Markers not found in README.md"), process.exit(1));
104
- const d = i.substring(0, r + o.length) + `
105
- ` + t + `
106
- ` + i.substring(s);
107
- await e.writeFile("README.md", d, "utf8"), console.log("✓ README.md updated with license information");
108
- } catch (t) {
109
- console.error("Error updating README:", t), process.exit(1);
105
+ const d = t.substring(0, r + o.length) + `
106
+ ` + i + `
107
+ ` + t.substring(s);
108
+ await n.writeFile("README.md", d, "utf8"), console.log("✓ README.md updated with license information");
109
+ } catch (i) {
110
+ console.error("Error updating README:", i), process.exit(1);
110
111
  }
111
112
  }
112
113
  async function k() {
114
+ try {
115
+ const o = JSON.parse(await n.readFile("./dependency-check-report.json", "utf-8"));
116
+ let e = 0;
117
+ for (const t of o.dependencies)
118
+ e += t.vulnerabilities.length;
119
+ console.log("vulnerabilityCount", e);
120
+ const i = await n.readFile("./README.md", "utf8");
121
+ } catch (o) {
122
+ console.error("Error updating README:", o), process.exit(1);
123
+ }
124
+ }
125
+ async function D() {
113
126
  try {
114
127
  console.info("🚀 Sending deployment notice...");
115
- const o = JSON.parse(await e.readFile("config.json", "utf8")), n = {
128
+ const o = JSON.parse(await n.readFile("config.json", "utf8")), e = {
116
129
  body: JSON.stringify(o),
117
130
  headers: { "Content-Type": "application/json" },
118
131
  method: "PUT"
119
- }, t = await fetch(`https://api.datapos.app/states/${o.id}`, n);
120
- if (!t.ok) throw new Error(await t.text());
132
+ }, i = await fetch(`https://api.datapos.app/states/${o.id}`, e);
133
+ if (!i.ok) throw new Error(await i.text());
121
134
  console.info("✅ Deployment notice sent.");
122
135
  } catch (o) {
123
136
  console.error("❌ Error sending deployment notice.", o);
@@ -126,43 +139,43 @@ async function k() {
126
139
  async function I() {
127
140
  try {
128
141
  console.info("🚀 Synchronising with GitHub....");
129
- const o = JSON.parse(await e.readFile("package.json", "utf8"));
142
+ const o = JSON.parse(await n.readFile("package.json", "utf8"));
130
143
  await g("git add ."), await g(`git commit -m "v${o.version}"`), await g("git push origin main:main"), console.info(`✅ Synchronised version ${o.version} with GitHub.`);
131
144
  } catch (o) {
132
145
  console.error("❌ Error synchronising with GitHub.", o);
133
146
  }
134
147
  }
135
- async function D(o, n) {
148
+ async function A(o, e) {
136
149
  try {
137
150
  console.info("🚀 Uploading directory to R2....");
138
- async function t(r, s, d) {
151
+ async function i(r, s, d) {
139
152
  for (const a of d) {
140
153
  const c = `${r}/${a}`, l = `${s}/${a}`;
141
- if ((await e.stat(c)).isDirectory()) {
142
- const p = await e.readdir(c);
143
- await t(c, l, p);
154
+ if ((await n.stat(c)).isDirectory()) {
155
+ const u = await n.readdir(c);
156
+ await i(c, l, u);
144
157
  } else {
145
158
  console.info(`⚙️ Uploading '${r}/${a}'...`);
146
- const p = `wrangler r2 object put "datapos-sample-data-eu/${s}/${a}" --file="${r}/${a}" --jurisdiction=eu --remote`, u = await g(p);
147
- if (u.stderr) throw new Error(u.stderr);
159
+ const u = `wrangler r2 object put "datapos-sample-data-eu/${s}/${a}" --file="${r}/${a}" --jurisdiction=eu --remote`, p = await g(u);
160
+ if (p.stderr) throw new Error(p.stderr);
148
161
  }
149
162
  }
150
163
  }
151
- const i = await e.readdir(`${o}/${n}/`);
152
- await t(`${o}/${n}`, n, i), console.info("✅ Directory uploaded to R2.");
153
- } catch (t) {
154
- console.error("❌ Error uploading directory to R2.", t);
164
+ const t = await n.readdir(`${o}/${e}/`);
165
+ await i(`${o}/${e}`, e, t), console.info("✅ Directory uploaded to R2.");
166
+ } catch (i) {
167
+ console.error("❌ Error uploading directory to R2.", i);
155
168
  }
156
169
  }
157
- async function A() {
170
+ async function M() {
158
171
  try {
159
172
  console.info("🚀 Uploading module configuration....");
160
- const o = JSON.parse(await e.readFile("config.json", "utf8")), n = o.id, t = {
173
+ const o = JSON.parse(await n.readFile("config.json", "utf8")), e = o.id, i = {
161
174
  body: JSON.stringify(o),
162
175
  headers: { "Content-Type": "application/json" },
163
176
  method: "PUT"
164
- }, i = await fetch(`https://api.datapos.app/states/${n}`, t);
165
- if (!i.ok) throw new Error(await i.text());
177
+ }, t = await fetch(`https://api.datapos.app/states/${e}`, i);
178
+ if (!t.ok) throw new Error(await t.text());
166
179
  console.info("✅ Module configuration uploaded.");
167
180
  } catch (o) {
168
181
  console.error("❌ Error uploading module configuration.", o);
@@ -171,22 +184,22 @@ async function A() {
171
184
  async function P(o) {
172
185
  try {
173
186
  console.info("🚀 Uploading module to R2...");
174
- const t = `v${JSON.parse(await e.readFile("package.json", "utf8")).version}`;
175
- async function i(r, s = "") {
176
- const d = await e.readdir(r, { withFileTypes: !0 });
187
+ const i = `v${JSON.parse(await n.readFile("package.json", "utf8")).version}`;
188
+ async function t(r, s = "") {
189
+ const d = await n.readdir(r, { withFileTypes: !0 });
177
190
  for (const a of d) {
178
191
  const c = `${r}/${a.name}`, l = s ? `${s}/${a.name}` : a.name;
179
192
  if (!a.isDirectory()) {
180
- const f = `${o}_${t}/${l}`.replace(/\\/g, "/"), p = a.name.endsWith(".js") ? "application/javascript" : a.name.endsWith(".css") ? "text/css" : "application/octet-stream";
193
+ const f = `${o}_${i}/${l}`.replace(/\\/g, "/"), u = a.name.endsWith(".js") ? "application/javascript" : a.name.endsWith(".css") ? "text/css" : "application/octet-stream";
181
194
  console.info(`⚙️ Uploading '${l}' → '${f}'...`);
182
- const { stderr: u } = await g(`wrangler r2 object put "${f}" --file="${c}" --content-type ${p} --jurisdiction=eu --remote`);
183
- if (u) throw new Error(u);
195
+ const { stderr: p } = await g(`wrangler r2 object put "${f}" --file="${c}" --content-type ${u} --jurisdiction=eu --remote`);
196
+ if (p) throw new Error(p);
184
197
  }
185
198
  }
186
199
  }
187
- await i("dist"), console.info("✅ Module uploaded to R2.");
188
- } catch (n) {
189
- console.error("❌ Error uploading module to R2.", n);
200
+ await t("dist"), console.info("✅ Module uploaded to R2.");
201
+ } catch (e) {
202
+ console.error("❌ Error uploading module to R2.", e);
190
203
  }
191
204
  }
192
205
  export {
@@ -198,9 +211,10 @@ export {
198
211
  C as bumpVersion,
199
212
  F as echoScriptNotImplemented,
200
213
  R as insertLicensesIntoReadme,
201
- k as sendDeploymentNotice,
214
+ k as insertOWASPDependencyCheckBadgeIntoReadme,
215
+ D as sendDeploymentNotice,
202
216
  I as syncWithGitHub,
203
- D as uploadDirectoryToR2,
204
- A as uploadModuleConfigToDO,
217
+ A as uploadDirectoryToR2,
218
+ M as uploadModuleConfigToDO,
205
219
  P as uploadModuleToR2
206
220
  };
@@ -6,12 +6,13 @@ declare function buildPublicDirectoryIndex(id: string): Promise<void>;
6
6
  declare function buildConnectorConfig(): Promise<void>;
7
7
  declare function buildContextConfig(): Promise<void>;
8
8
  declare function buildPresenterConfig(): Promise<void>;
9
- declare function bumpVersion(): Promise<void>;
9
+ declare function bumpVersion(path?: string): Promise<void>;
10
10
  declare function echoScriptNotImplemented(name: string): void;
11
11
  declare function insertLicensesIntoReadme(): Promise<void>;
12
+ declare function insertOWASPDependencyCheckBadgeIntoReadme(): Promise<void>;
12
13
  declare function sendDeploymentNotice(): Promise<void>;
13
14
  declare function syncWithGitHub(): Promise<void>;
14
15
  declare function uploadDirectoryToR2(sourceDirectory: string, uploadDirectory: string): Promise<void>;
15
16
  declare function uploadModuleConfigToDO(): Promise<void>;
16
17
  declare function uploadModuleToR2(uploadDirPath: string): Promise<void>;
17
- export { buildConfig, buildConnectorConfig, buildContextConfig, buildPresenterConfig, buildPublicDirectoryIndex, bumpVersion, echoScriptNotImplemented, insertLicensesIntoReadme, sendDeploymentNotice, syncWithGitHub, uploadDirectoryToR2, uploadModuleConfigToDO, uploadModuleToR2 };
18
+ export { buildConfig, buildConnectorConfig, buildContextConfig, buildPresenterConfig, buildPublicDirectoryIndex, bumpVersion, echoScriptNotImplemented, insertLicensesIntoReadme, insertOWASPDependencyCheckBadgeIntoReadme, sendDeploymentNotice, syncWithGitHub, uploadDirectoryToR2, uploadModuleConfigToDO, uploadModuleToR2 };
@@ -0,0 +1,4 @@
1
+ /**
2
+ * Tests.
3
+ */
4
+ export {};
package/package.json CHANGED
@@ -1,7 +1,7 @@
1
1
  {
2
2
  "name": "@datapos/datapos-development",
3
- "version": "0.3.102",
4
- "description": "A TypeScript library of utilities for managing the Data Positioning repositories.",
3
+ "version": "0.3.113",
4
+ "description": "A library of utilities for managing the Data Positioning repositories.",
5
5
  "license": "MIT",
6
6
  "author": "Jonathan Terrell <terrell.jm@gmail.com>",
7
7
  "private": false,
@@ -37,8 +37,10 @@
37
37
  "jiti": "^2.6.1",
38
38
  "license-report": "^6.8.1",
39
39
  "license-report-check": "^0.1.2",
40
+ "license-report-recursive": "^6.8.2",
40
41
  "nanoid": "^5.1.6",
41
42
  "npm-check-updates": "^19.1.2",
43
+ "owasp-dependency-check": "^1.0.0",
42
44
  "prettier": "^3.6.2",
43
45
  "retire": "^5.3.0",
44
46
  "rollup-plugin-visualizer": "^6.0.5",
@@ -47,23 +49,29 @@
47
49
  "typescript": "^5.9.3",
48
50
  "vite": "^7.2.4",
49
51
  "vite-plugin-dts": "^4.5.4",
52
+ "vitest": "^4.0.14",
50
53
  "zod": "^4.1.13"
51
54
  },
52
55
  "scripts": {
53
56
  "audit": "npm audit",
54
57
  "build": "vite build",
55
58
  "check": "npm outdated; npm-check-updates -i && retire",
56
- "document": "npm run _document:licenceReport && npm run _document:licenceCheck",
59
+ "document": "npm run _document:licenceReportJSON && npm run _document:licenceReportMarkdown && npm run _document:licenceReportCheck && npm run _document:insertLicensesIntoReadme && npm run _document:licenceTree && npm run _document:licenceTreeCheck",
57
60
  "format": "prettier --write src/",
58
61
  "lint": "eslint .",
59
62
  "publish:toNPM": "npm publish --access public",
60
63
  "release": "npm run _bump:version && npm run build && npm run _sync:withGitHub && npm run publish:toNPM",
61
64
  "sync": "npm run _bump:version && npm run _sync:withGitHub",
62
- "test": "node -e \"import('./dist/datapos-development.es.js').then(m => m.echoScriptNotImplemented('Test'))\"",
65
+ "test": "vitest",
63
66
  "update:dataPosDeps": "npm run _update:sharedDep",
64
67
  "_bump:version": "node -e \"import('./dist/datapos-development.es.js').then(m => m.bumpVersion())\"",
65
- "_document:licenceReport": "license-report --only=prod,peer > LICENSES.json",
66
- "_document:licenceCheck": "license-report-check --source ./LICENSES.json --allowed 'MIT' --allowed 'n/a' --allowed 'Apache-2.0' --output=table",
68
+ "_check:owasp": "set -a && source .env && set +a && owasp-dependency-check --project \"@datapos/datapos-tool-micromark\" --enableRetired --nvdApiKey \"$NVD_API_KEY\"",
69
+ "_document:licenceReportJSON": "license-report --only=prod,peer --department.value=n/a --licensePeriod=n/a --material=n/a --relatedTo.value=n/a > licenses.json",
70
+ "_document:licenceReportMarkdown": "license-report --config license-report-config.json --only=prod,peer --output=markdown > licenses.md",
71
+ "_document:licenceReportCheck": "license-report-check --source ./licenses.json --allowed 'MIT' --allowed 'n/a' --allowed 'Apache-2.0' --allowed 'CC0-1.0' --output=table",
72
+ "_document:licenceTree": "license-report-recursive --only=prod,peer --department.value=n/a --licensePeriod=n/a --material=n/a --relatedTo.value=n/a --recurse --output=tree > licenseTree.json",
73
+ "_document:licenceTreeCheck": "license-report-check --source ./licenseTree.json --allowed 'MIT' --allowed 'n/a' --allowed 'Apache-2.0' --allowed 'CC0-1.0' --output=table",
74
+ "_document:insertLicensesIntoReadme": "node -e \"import('./dist/datapos-development.es.js').then(m => m.insertLicensesIntoReadme())\"",
67
75
  "_sync:withGitHub": "node -e \"import('./dist/datapos-development.es.js').then(m => m.syncWithGitHub())\"",
68
76
  "_update:sharedDep": "npm install @datapos/datapos-shared@latest"
69
77
  },