@datapos/datapos-development 0.3.101 → 0.3.110
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +31 -14
- package/dist/datapos-development.es.js +52 -52
- package/dist/types/src/index.d.ts +1 -1
- package/dist/types/tests/index.test.d.ts +4 -0
- package/package.json +14 -6
package/README.md
CHANGED
@@ -1,9 +1,10 @@
 # Data Positioning Development Library

+[](https://sonarcloud.io/summary/new_code?id=data-positioning_datapos-development)
 [](https://www.npmjs.com/package/@datapos/datapos-development)
 [](./LICENSE)

-A
+A library of utilities for managing the Data Positioning repositories.

 ## Requirements

@@ -49,41 +50,57 @@ The `src/index.ts' file exposes the following utilities:
 | uploadModuleConfigToDO | Upload a modules configuration to the Cloudflare `state` durable object. |
 | uploadModuleToR2 | Upload a module to Cloudflare R2 storage. |

+## Reports & Compliance
+
+### Dependency Check Report
+
+The OWASP Dependency Check Report identifies known vulnerabilities in project dependencies. It is generated automatically on each release using the npm package `owasp-dependency-check`.
+
+[View the OWASP Dependency Check Report](https://data-positioning.github.io/datapos-tool-micromark/dependency-check-reports/dependency-check-report.html)
+
+### Dependency Licenses
+
+The following table lists top-level production and peer dependencies only. All dependency licenses (including transitive dependencies) have been recursively verified to conform to Apache-2.0, CC0-1.0, MIT, or n/a. Developers cloning this repository should independently verify dev and optional dependencies; users of the uploaded library are covered by these checks.
+
+<!-- DEPENDENCY_LICENSES_START -->
+| Name | Type | Installed | Latest | Latest Modified |
+| :---------------------- | :--: | :-------: | :-----: | :----------------------- |
+| @datapos/datapos-shared | MIT | 0.3.252 | 0.3.252 | 2025-11-25T16:48:28.532Z |
+<!-- DEPENDENCY_LICENSES_END -->
+
+### Bundle Analysis Report
+
+The Bundle Analysis Report provides a detailed breakdown of the bundle's composition and module sizes, helping to identify which modules contribute most to the final build. It is generated automatically on each release using the npm package `rollup-plugin-visualizer`.
+
+[View the Bundle Analysis Report](https://data-positioning.github.io/datapos-development/stats/index.html)
+
 ## Repository Common Management Commands

 The table below lists the repository management commands available in this project.
 For detailed implementation, see the `scripts` section in the `package.json` file.

-| Name | Key Code
+| Name | VS Key Code | Notes |
 | ------------------ | ---------------- | ----------------------------------------------------------------------------------------------------------------------------------------------- |
 | audit | alt+ctrl+shift+a | Audit the project's dependencies for known security vulnerabilities. |
 | build | alt+ctrl+shift+b | Build the package using Vite. Output to '/dist' directory. |
 | check | alt+ctrl+shift+c | Identify outdated dependencies using npm `outdated` and `npm-check-updates` with option to install latest versions. Also runs `retire` scanner. |
 | document | alt+ctrl+shift+d | Identify licenses of the project's production and peer dependencies. See [LICENSES.json](./LICENSES.json). |
-| format | alt+ctrl+shift+f | Use `prettier`to enforce formatting style rules.
-| lint | alt+ctrl+shift+l | Use `eslint`to check the code for potential errors and enforces coding style rules.
+| format | alt+ctrl+shift+f | Use `prettier` to enforce formatting style rules. |
+| lint | alt+ctrl+shift+l | Use `eslint` to check the code for potential errors and enforces coding style rules. |
 | publish | alt+ctrl+shift+p | Publish the package to `npm`. |
 | release | alt+ctrl+shift+r | Bump version, build library, synchronise with `GitHub` and publish to `npm`. |
 | sync:withGitHub | alt+ctrl+shift+s | Synchronise local repository with the main GitHub repository. |
 | test | alt+ctrl+shift+t | ❌ Not implemented. |
 | update:dataPosDeps | alt+ctrl+shift+u | Install the latest version of all Data Positioning dependencies. |

-## Bundle Analysis
-
-View the [bundle report](https://data-positioning.github.io/datapos-development/stats/index.html) to analyze the bundle composition and module sizes (generated with rollup-plugin-visualizer).
-
 ## TODO

 1. Enhance `uploadDirectoryToR2`to batch upload files so more efficient and performant.
 1. Replace regex with TypeScript AST in `buildConnectorConfig`. Extracting method names from index.ts via regex is fragile — breaks with decorators, comments, or new syntax (e.g., class fields). AST parsing (via @babel/parser or typescript) would be safer.
 1. Implement zod to validate config schemas.

-## Compliance
-
-The following badge reflects FOSSA's assessment of this repository's open-source license compliance.
-
-[](https://app.fossa.com/projects/git%2Bgithub.com%2Fdata-positioning%2Fdatapos-development?ref=badge_large&issueType=license)
-
 ## License

+This project is licensed under the MIT License, allowing free use, modification, and distribution.
+
 [MIT](./LICENSE) © 2026 Data Positioning Pty Ltd
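The README's TODO notes that `buildConnectorConfig` extracts method names from `src/index.ts` with a regex rather than an AST. The regex and the operation lists below are taken from the bundled code in this diff; the `sample` connector source is a hypothetical illustration, and `classify` is an invented helper name, not the package's API:

```typescript
// Method-extraction regex from the bundled buildConnectorConfig: matches
// 4-space-indented method declarations, capturing an optional `private`
// modifier (group 1) and the method name (group 2).
const METHOD_RE = /^\s{4}(?:async\s+)?(private\s+)?(?:public\s+|protected\s+)?([A-Za-z_]\w*)\s*\(/gm;

// Operation lists mirroring the `$` (write) and `h` (read) arrays in the bundle.
const WRITE_OPS = ["createObject", "dropObject", "removeRecords", "upsertRecords"];
const READ_OPS = ["findObject", "getRecord", "listNodes", "previewObject", "retrieveRecords"];

// Extract public method names (skipping private methods and the constructor)
// and classify the connector's usage from the operations it implements.
function classify(source: string): { operations: string[]; usageId: string } {
    const operations = [...source.matchAll(METHOD_RE)]
        .filter((m) => m[1] == null && m[2] !== "constructor")
        .map((m) => m[2]);
    const writes = operations.some((op) => WRITE_OPS.includes(op));
    const reads = operations.some((op) => READ_OPS.includes(op));
    const usageId = reads && writes ? "bidirectional" : reads ? "source" : writes ? "destination" : "unknown";
    return { operations, usageId };
}

// Hypothetical connector source; only indentation and modifiers matter here.
const sample = [
    "class SampleConnector {",
    "    constructor() {}",
    "    async listNodes() {}",
    "    private helper() {}",
    "    upsertRecords() {}",
    "}"
].join("\n");

const result = classify(sample);
// operations: ["listNodes", "upsertRecords"], usageId: "bidirectional"
```

This also shows the fragility the TODO describes: a method indented with tabs, preceded by a decorator line, or declared as a class field (`listNodes = async () => {}`) would not match the pattern.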
package/dist/datapos-development.es.js
CHANGED

@@ -1,8 +1,8 @@
-import { exec as m } from "child_process";
-import { promises as e } from "fs";
+import { exec as m } from "node:child_process";
+import { promises as e } from "node:fs";
 import { nanoid as w } from "nanoid";
-import { promisify as y } from "util";
-const
+import { promisify as y } from "node:util";
+const $ = ["createObject", "dropObject", "removeRecords", "upsertRecords"], h = ["findObject", "getRecord", "listNodes", "previewObject", "retrieveRecords"], g = y(m);
 async function S() {
 try {
 console.info("🚀 Building configuration...");
@@ -16,7 +16,7 @@ async function O(o) {
 try {
 console.info(`🚀 Building public directory index for identifier '${o}'...`);
 const n = {};
-async function
+async function i(r, s) {
 console.info(`⚙️ Processing directory '${r}'...`);
 const d = [], a = r.substring(`public/${o}`.length);
 n[a] = d;
@@ -25,11 +25,11 @@ async function O(o) {
 try {
 const f = await e.stat(l);
 if (f.isDirectory()) {
-const
-d.push(
+const u = await e.readdir(l), p = { childCount: u.length, name: `${c}`, typeId: "folder" };
+d.push(p), await i(l, u);
 } else {
-const
-d.push(
+const u = { id: w(), lastModifiedAt: f.mtimeMs, name: c, size: f.size, typeId: "object" };
+d.push(u);
 }
 } catch (f) {
 throw new Error(`Unable to get information for '${c}' in 'buildPublicDirectoryIndex'. ${String(f)}`);
@@ -40,8 +40,8 @@ async function O(o) {
 return f === 0 ? c.name.localeCompare(l.name) : f;
 });
 }
-const
-await
+const t = await e.readdir(`public/${o}`);
+await i(`public/${o}`, t), await e.writeFile(`./public/${o}Index.json`, JSON.stringify(n), "utf8"), console.info("✅ Public directory index built.");
 } catch (n) {
 console.error("❌ Error building public directory index.", n);
 }
@@ -49,14 +49,14 @@ async function O(o) {
 async function J() {
 try {
 console.info("🚀 Building connector configuration...");
-const o = JSON.parse(await e.readFile("package.json", "utf8")), n = JSON.parse(await e.readFile("config.json", "utf8")),
-let
-const s = /^\s{4}(?:async\s+)?(private\s+)?(?:public\s+|protected\s+)?([A-Za-z_]\w*)\s*\(/gm, d = [...
+const o = JSON.parse(await e.readFile("package.json", "utf8")), n = JSON.parse(await e.readFile("config.json", "utf8")), i = await e.readFile("src/index.ts", "utf8");
+let t = !1, r = !1;
+const s = /^\s{4}(?:async\s+)?(private\s+)?(?:public\s+|protected\s+)?([A-Za-z_]\w*)\s*\(/gm, d = [...i.matchAll(s)].filter((c) => c[1] == null && c[2] !== "constructor").map((c) => {
 const l = c[2];
-return
+return t = t || $.includes(l), r = r || h.includes(l), l;
 });
 d.length > 0 ? console.info(`ℹ️ Implements ${d.length} operations.`) : console.warn("⚠️ Implements no operations.");
-const a = r &&
+const a = r && t ? "bidirectional" : r ? "source" : t ? "destination" : "unknown";
 a && console.info(`ℹ️ Supports ${a} usage.`), o.name != null && (n.id = o.name), n.operations = d, n.usageId = a, o.version != null && (n.version = o.version), await e.writeFile("config.json", JSON.stringify(n, void 0, 4), "utf8"), console.info("✅ Connector configuration built.");
 } catch (o) {
 console.error("❌ Error building connector configuration.", o);
@@ -65,7 +65,7 @@ async function J() {
 async function x() {
 try {
 console.info("🚀 Building context configuration...");
-const o = JSON.parse(await e.readFile("package.json", "utf8")), n = JSON.parse(await e.readFile("config.json", "utf8")),
+const o = JSON.parse(await e.readFile("package.json", "utf8")), n = JSON.parse(await e.readFile("config.json", "utf8")), i = await e.readFile("src/index.ts", "utf8"), t = /^\s{4}(?:async\s+)?(private\s+)?(?:public\s+|protected\s+)?([A-Za-z_]\w*)\s*\(/gm, r = [...i.matchAll(t)].filter((s) => s[1] == null && s[2] !== "constructor").map((s) => s[2]);
 o.name != null && (n.id = o.name), n.operations = r, o.version != null && (n.version = o.version), await e.writeFile("config.json", JSON.stringify(n, void 0, 4), "utf8"), console.info("✅ Context configuration built.");
 } catch (o) {
 console.error("❌ Error building context configuration.", o);
@@ -74,23 +74,23 @@ async function x() {
 async function j() {
 try {
 console.info("🚀 Building presenter configuration...");
-const o = JSON.parse(await e.readFile("package.json", "utf8")), n = JSON.parse(await e.readFile("config.json", "utf8")),
+const o = JSON.parse(await e.readFile("package.json", "utf8")), n = JSON.parse(await e.readFile("config.json", "utf8")), i = await e.readFile("src/index.ts", "utf8"), t = /^\s{4}(?:async\s+)?(private\s+)?(?:public\s+|protected\s+)?([A-Za-z_]\w*)\s*\(/gm, r = [...i.matchAll(t)].filter((s) => !s[1] && s[2] !== "constructor").map((s) => s[2]);
 o.name != null && (n.id = o.name), n.operations = r, o.version != null && (n.version = o.version), await e.writeFile("config.json", JSON.stringify(n, void 0, 4), "utf8"), console.info("✅ Presenter configuration built.");
 } catch (o) {
 console.error("❌ Error building context configuration.", o);
 }
 }
-async function C() {
+async function C(o = "./") {
 try {
 console.info("🚀 Bumping version...");
-const
-if (
-const
-
+const n = JSON.parse(await e.readFile(`${o}package.json`, "utf8"));
+if (n.version != null) {
+const i = n.version, t = n.version.split(".");
+n.version = `${t[0]}.${t[1]}.${Number(t[2]) + 1}`, await e.writeFile(`${o}package.json`, JSON.stringify(n, void 0, 4), "utf8"), console.info(`✅ Version bumped from ${i} to ${n.version}.`);
 } else
-
+n.version = "0.0.001", await e.writeFile(`${o}package.json`, JSON.stringify(n, void 0, 4), "utf8"), console.warn(`⚠️ Version initialised to ${n.version}.`);
-} catch (
+} catch (n) {
-console.error("❌ Error bumping package version.",
+console.error("❌ Error bumping package version.", n);
 }
 }
 function F(o) {
@@ -99,14 +99,14 @@ function F(o) {
 async function R() {
 const o = "<!-- DEPENDENCY_LICENSES_START -->", n = "<!-- DEPENDENCY_LICENSES_END -->";
 try {
-const
+const i = (await e.readFile("./licenses.md", "utf8")).trim(), t = await e.readFile("./README.md", "utf8"), r = t.indexOf(o), s = t.indexOf(n);
 (r === -1 || s === -1) && (console.error("Error: Markers not found in README.md"), process.exit(1));
-const d =
-` +
-` +
+const d = t.substring(0, r + o.length) + `
+` + i + `
+` + t.substring(s);
 await e.writeFile("README.md", d, "utf8"), console.log("✓ README.md updated with license information");
-} catch (
-console.error("Error updating README:",
+} catch (i) {
+console.error("Error updating README:", i), process.exit(1);
 }
 }
 async function k() {
@@ -116,8 +116,8 @@ async function k() {
 body: JSON.stringify(o),
 headers: { "Content-Type": "application/json" },
 method: "PUT"
-},
-if (!
+}, i = await fetch(`https://api.datapos.app/states/${o.id}`, n);
+if (!i.ok) throw new Error(await i.text());
 console.info("✅ Deployment notice sent.");
 } catch (o) {
 console.error("❌ Error sending deployment notice.", o);
@@ -135,34 +135,34 @@ async function I() {
 async function D(o, n) {
 try {
 console.info("🚀 Uploading directory to R2....");
-async function
+async function i(r, s, d) {
 for (const a of d) {
 const c = `${r}/${a}`, l = `${s}/${a}`;
 if ((await e.stat(c)).isDirectory()) {
-const
-await
+const u = await e.readdir(c);
+await i(c, l, u);
 } else {
 console.info(`⚙️ Uploading '${r}/${a}'...`);
-const
-if (
+const u = `wrangler r2 object put "datapos-sample-data-eu/${s}/${a}" --file="${r}/${a}" --jurisdiction=eu --remote`, p = await g(u);
+if (p.stderr) throw new Error(p.stderr);
 }
 }
 }
-const
-await
-} catch (
-console.error("❌ Error uploading directory to R2.",
+const t = await e.readdir(`${o}/${n}/`);
+await i(`${o}/${n}`, n, t), console.info("✅ Directory uploaded to R2.");
+} catch (i) {
+console.error("❌ Error uploading directory to R2.", i);
 }
 }
 async function A() {
 try {
 console.info("🚀 Uploading module configuration....");
-const o = JSON.parse(await e.readFile("config.json", "utf8")), n = o.id,
+const o = JSON.parse(await e.readFile("config.json", "utf8")), n = o.id, i = {
 body: JSON.stringify(o),
 headers: { "Content-Type": "application/json" },
 method: "PUT"
-},
-if (!
+}, t = await fetch(`https://api.datapos.app/states/${n}`, i);
+if (!t.ok) throw new Error(await t.text());
 console.info("✅ Module configuration uploaded.");
 } catch (o) {
 console.error("❌ Error uploading module configuration.", o);
@@ -171,20 +171,20 @@ async function A() {
 async function P(o) {
 try {
 console.info("🚀 Uploading module to R2...");
-const
-async function
+const i = `v${JSON.parse(await e.readFile("package.json", "utf8")).version}`;
+async function t(r, s = "") {
 const d = await e.readdir(r, { withFileTypes: !0 });
 for (const a of d) {
 const c = `${r}/${a.name}`, l = s ? `${s}/${a.name}` : a.name;
 if (!a.isDirectory()) {
-const f = `${o}_${
+const f = `${o}_${i}/${l}`.replace(/\\/g, "/"), u = a.name.endsWith(".js") ? "application/javascript" : a.name.endsWith(".css") ? "text/css" : "application/octet-stream";
 console.info(`⚙️ Uploading '${l}' → '${f}'...`);
-const { stderr:
-if (
+const { stderr: p } = await g(`wrangler r2 object put "${f}" --file="${c}" --content-type ${u} --jurisdiction=eu --remote`);
+if (p) throw new Error(p);
 }
 }
 }
-await
+await t("dist"), console.info("✅ Module uploaded to R2.");
 } catch (n) {
 console.error("❌ Error uploading module to R2.", n);
 }
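The `R()` change in the bundle above splices the generated `licenses.md` table between two HTML-comment markers in `README.md`. A minimal sketch of that splice as a pure function — the marker strings are the ones used by the bundle, while `spliceLicenses` and its inputs are invented for illustration (the real code reads the files and calls `process.exit(1)` when a marker is missing):

```typescript
const START = "<!-- DEPENDENCY_LICENSES_START -->";
const END = "<!-- DEPENDENCY_LICENSES_END -->";

// Replace everything between the markers with `table`, keeping both markers.
// Returns null when either marker is missing, mirroring the bundle's
// "markers not found" error path.
function spliceLicenses(readme: string, table: string): string | null {
    const start = readme.indexOf(START);
    const end = readme.indexOf(END);
    if (start === -1 || end === -1) return null;
    return readme.substring(0, start + START.length) + "\n" + table.trim() + "\n" + readme.substring(end);
}

// Hypothetical README content with a stale table between the markers.
const before = "# Pkg\n" + START + "\nold table\n" + END + "\ntail";
const after = spliceLicenses(before, "| Name |\n| ---- |");
```

Because the splice keeps the markers themselves, the operation is idempotent with respect to the surrounding README text: running it again with a new table only replaces the region in between.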
package/dist/types/src/index.d.ts
CHANGED

@@ -6,7 +6,7 @@ declare function buildPublicDirectoryIndex(id: string): Promise<void>;
 declare function buildConnectorConfig(): Promise<void>;
 declare function buildContextConfig(): Promise<void>;
 declare function buildPresenterConfig(): Promise<void>;
-declare function bumpVersion(): Promise<void>;
+declare function bumpVersion(path?: string): Promise<void>;
 declare function echoScriptNotImplemented(name: string): void;
 declare function insertLicensesIntoReadme(): Promise<void>;
 declare function sendDeploymentNotice(): Promise<void>;
package/package.json
CHANGED
@@ -1,7 +1,7 @@
 {
 "name": "@datapos/datapos-development",
-"version": "0.3.
-"description": "A
+"version": "0.3.110",
+"description": "A library of utilities for managing the Data Positioning repositories.",
 "license": "MIT",
 "author": "Jonathan Terrell <terrell.jm@gmail.com>",
 "private": false,
@@ -37,8 +37,10 @@
 "jiti": "^2.6.1",
 "license-report": "^6.8.1",
 "license-report-check": "^0.1.2",
+"license-report-recursive": "^6.8.2",
 "nanoid": "^5.1.6",
 "npm-check-updates": "^19.1.2",
+"owasp-dependency-check": "^1.0.0",
 "prettier": "^3.6.2",
 "retire": "^5.3.0",
 "rollup-plugin-visualizer": "^6.0.5",
@@ -47,23 +49,29 @@
 "typescript": "^5.9.3",
 "vite": "^7.2.4",
 "vite-plugin-dts": "^4.5.4",
+"vitest": "^4.0.14",
 "zod": "^4.1.13"
 },
 "scripts": {
 "audit": "npm audit",
 "build": "vite build",
 "check": "npm outdated; npm-check-updates -i && retire",
-"document": "npm run _document:
+"document": "npm run _document:licenceReportJSON && npm run _document:licenceReportMarkdown && npm run _document:licenceReportCheck && npm run _document:insertLicensesIntoReadme && npm run _document:licenceTree && npm run _document:licenceTreeCheck",
 "format": "prettier --write src/",
 "lint": "eslint .",
 "publish:toNPM": "npm publish --access public",
 "release": "npm run _bump:version && npm run build && npm run _sync:withGitHub && npm run publish:toNPM",
 "sync": "npm run _bump:version && npm run _sync:withGitHub",
-"test": "
+"test": "vitest",
 "update:dataPosDeps": "npm run _update:sharedDep",
 "_bump:version": "node -e \"import('./dist/datapos-development.es.js').then(m => m.bumpVersion())\"",
-"
-"_document:
+"_check:owasp": "set -a && source .env && set +a && owasp-dependency-check --project \"@datapos/datapos-tool-micromark\" --enableRetired --nvdApiKey \"$NVD_API_KEY\"",
+"_document:licenceReportJSON": "license-report --only=prod,peer --department.value=n/a --licensePeriod=n/a --material=n/a --relatedTo.value=n/a > licenses.json",
+"_document:licenceReportMarkdown": "license-report --config license-report-config.json --only=prod,peer --output=markdown > licenses.md",
+"_document:licenceReportCheck": "license-report-check --source ./licenses.json --allowed 'MIT' --allowed 'n/a' --allowed 'Apache-2.0' --allowed 'CC0-1.0' --output=table",
+"_document:licenceTree": "license-report-recursive --only=prod,peer,dev --recurse --output=tree > licenseTree.json",
+"_document:licenceTreeCheck": "license-report-check --source ./licenseTree.json --allowed 'MIT' --allowed 'n/a' --allowed 'Apache-2.0' --allowed 'CC0-1.0' --output=table",
+"_document:insertLicensesIntoReadme": "node -e \"import('./dist/datapos-development.es.js').then(m => m.insertLicensesIntoReadme())\"",
 "_sync:withGitHub": "node -e \"import('./dist/datapos-development.es.js').then(m => m.syncWithGitHub())\"",
 "_update:sharedDep": "npm install @datapos/datapos-shared@latest"
 },