@ziacik/upgrade-verify 0.2.0 → 1.0.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +9 -1
- package/README.md +56 -11
- package/executors.json +5 -0
- package/package.json +1 -1
- package/src/executors/check-issues/executor.d.ts +5 -0
- package/src/executors/check-issues/executor.js +67 -0
- package/src/executors/check-issues/executor.js.map +1 -0
- package/src/executors/check-issues/schema.d.js +4 -0
- package/src/executors/check-issues/schema.d.js.map +1 -0
- package/src/executors/check-issues/schema.d.ts +1 -0
- package/src/executors/check-issues/schema.json +9 -0
- package/src/executors/verify-build/__fixtures__/expected-stats-nohash.json +15 -0
- package/src/executors/verify-build/dist-stats.d.ts +1 -1
- package/src/executors/verify-build/dist-stats.js +11 -7
- package/src/executors/verify-build/dist-stats.js.map +1 -1
- package/src/executors/verify-build/executor.js +5 -4
- package/src/executors/verify-build/executor.js.map +1 -1
- package/src/executors/verify-build/schema.d.js +0 -1
- package/src/executors/verify-build/schema.d.js.map +1 -1
- package/src/executors/verify-build/schema.d.ts +3 -1
- package/src/executors/verify-build/schema.json +7 -1

package/CHANGELOG.md
CHANGED

@@ -7,6 +7,13 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## [Unreleased]
 
+## [1.0.0] - 2023-10-21
+
+## Added
+
+- `removeHashes` option added to the `verify-build` executor, with a default of `true`, which removes hashes from file names, making the comparisons more deterministic.
+- `check-issues` executor added.
+
 ## [0.2.0] - 2023-10-15
 
 ### Changed
@@ -44,7 +51,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - READMEs updated and props added to package.json.
 - A _Package subpath './package.json' is not defined by "exports"_ error hopefully fixed.
 
-[unreleased]: https://github.com/ziacik/nx-tools/compare/upgrade-verify-0.
+[unreleased]: https://github.com/ziacik/nx-tools/compare/upgrade-verify-1.0.0...HEAD
+[1.0.0]: https://github.com/ziacik/nx-tools/compare/upgrade-verify-0.2.0...upgrade-verify-1.0.0
 [0.2.0]: https://github.com/ziacik/nx-tools/compare/upgrade-verify-0.1.1...upgrade-verify-0.2.0
 [0.1.1]: https://github.com/ziacik/nx-tools/compare/upgrade-verify-0.1.0...upgrade-verify-0.1.1
 [0.1.0]: https://github.com/ziacik/nx-tools/compare/upgrade-verify-0.0.4...upgrade-verify-0.1.0

package/README.md
CHANGED

@@ -1,6 +1,22 @@
 # Nx Upgrade Verify Plugin
 
-This plugin provides functionality to verify
+This plugin provides functionality to verify various aspects of a workspace after NX upgrade.
+
+## Installation
+
+To install the plugin, run the following command:
+
+```bash
+npm install -D @ziacik/upgrade-verify
+```
+
+## Usage
+
+Once the plugin is installed, you can use its executors in your project's configuration.
+
+### verify-build executor
+
+This executor provides functionality to verify the build of a project after NX upgrade by comparing dist file statistics and detecting any significant differences.
 
 On each run, the executor builds the project for each configuration from the build target. At the first run, the executor generates stats in the `.stats` directory of the project from the built files.
 
@@ -10,27 +26,56 @@ The stats can be committed to the repository for future use.
 
 If the percentage differences cross a threshold of 10%, the executor will report a failure.
 
-
+#### Example configuration
 
-
+```json
+{
+	"name": "my-app",
+	...
+	"targets": {
+		"verify-build": {
+			"executor": "@ziacik/upgrade-verify:verify-build",
+			"options": {
+				"removeHashes": true
+			}
+		},
+		...
+	},
+	...
+}
+```
 
-
+Then to use the executor, run
 
 ```bash
-
+nx verify-build my-app
 ```
 
-
+### check-issues executor
+
+This executor serves as a watchdog for external issues tracked on GitHub that are listed in the workspace root's **ISSUES.md** file. This is useful when, for example, a workaround has been introduced because of a bug, and we would like to remove that workaround once the issue is resolved.
+
+When this executor is run, it takes all issues listed in the **ISSUES.md** file, checks whether they are still active, and lists all that have already been closed. That allows us to take action and remove those issues from the list.
 
-
+#### Example contents of `ISSUES.md`
+
+```markdown
+# Issues to be watched
+
+https://github.com/ziacik/nx-tools/issues/1
+[https://github.com/ziacik/nx-tools/issues/2](https://github.com/ziacik/nx-tools/issues/2)
+[Some issue](https://github.com/ziacik/nx-tools/issues/3)
+```
+
+#### Example configuration
 
 ```json
 {
 	"name": "my-app",
 	...
 	"targets": {
-		"
-		"executor": "@ziacik/upgrade-verify:
+		"check-issues": {
+			"executor": "@ziacik/upgrade-verify:check-issues"
 		},
 		...
 	},
@@ -38,10 +83,10 @@ Once the plugin is installed, you can use it as a custom executor in your projec
 }
 ```
 
-Then to use the
+Then to use the executor, run
 
 ```bash
-nx
+nx check-issues my-app
 ```
 
 ## License
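As a quick illustration of the new `check-issues` executor described above, the sketch below applies the same GitHub issue-link pattern (copied from the compiled `executor.js` included in this diff) to some invented `ISSUES.md` lines; this is not part of the package itself.

```javascript
// Sketch only: classify ISSUES.md lines with the issue-link regex used by
// the check-issues executor (pattern copied from the compiled executor.js).
const GITHUB_ISSUE_REGEX = /https:\/\/github.com\/(?<owner>[^/]+)\/(?<repo>[^/]+)\/issues\/(?<issueNumber>\d+)/;

// Hypothetical ISSUES.md contents.
const lines = [
	'# Issues to be watched',
	'https://github.com/ziacik/nx-tools/issues/1',
	'[Some issue](https://github.com/ziacik/nx-tools/issues/3)',
];

const parsed = lines.map((line) => {
	const match = line.match(GITHUB_ISSUE_REGEX);
	return match ? match.groups : null; // null corresponds to the 'noissue' case
});

console.log(parsed[1].owner, parsed[1].issueNumber); // ziacik 1
```

Note that plain URLs and markdown links both match, since the regex only looks for the URL anywhere in the line.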

package/executors.json
CHANGED

@@ -4,6 +4,11 @@
 			"implementation": "./src/executors/verify-build/executor",
 			"schema": "./src/executors/verify-build/schema.json",
 			"description": "Verifies that the build does not differ too much from the previous one after NX upgrade. Or, if this is the first run, creates stats for future verifications."
+		},
+		"check-issues": {
+			"implementation": "./src/executors/check-issues/executor",
+			"schema": "./src/executors/check-issues/schema.json",
+			"description": "Checks a list of issue links to see if they have been closed already."
 		}
 	}
 }

package/src/executors/check-issues/executor.js
ADDED

@@ -0,0 +1,67 @@
+"use strict";
+Object.defineProperty(exports, "default", {
+    enumerable: true,
+    get: function() {
+        return runExecutor;
+    }
+});
+const _devkit = require("@nx/devkit");
+const _promises = require("fs/promises");
+const _path = require("path");
+const GITHUB_ISSUE_REGEX = RegExp("https:\\/\\/github.com\\/(?<owner>[^/]+)\\/(?<repo>[^/]+)\\/issues\\/(?<issueNumber>\\d+)");
+async function runExecutor(options, context) {
+    const issuesMd = await tryLoadIssuesMd(context.root);
+    if (issuesMd != null) {
+        const issueMdLines = issuesMd.split('\n');
+        await printClosedIssueLines(issueMdLines);
+    } else {
+        _devkit.logger.info("There is no 'ISSUES.md' file in the workspace root.");
+    }
+    return {
+        success: true
+    };
+}
+async function printClosedIssueLines(issueMdLines) {
+    const resolvedLines = await Promise.all(issueMdLines.map(resolveIssueMdLine));
+    const closedLines = resolvedLines.filter((resolvedLine)=>resolvedLine.result === 'closed');
+    if (closedLines.length > 0) {
+        _devkit.logger.info('Issues which are closed now:');
+        for (const closedLine of closedLines){
+            _devkit.logger.info(`- ${closedLine.issueMdLine}`);
+        }
+    } else {
+        _devkit.logger.info('No issues have been closed.');
+    }
+}
+async function resolveIssueMdLine(issueMdLine) {
+    const githubLink = issueMdLine.match(GITHUB_ISSUE_REGEX);
+    if (!githubLink) {
+        return {
+            issueMdLine,
+            result: 'noissue'
+        };
+    }
+    const [, owner, repo, issueNumber] = githubLink;
+    const response = await fetch(`https://api.github.com/repos/${owner}/${repo}/issues/${issueNumber}`);
+    const { state } = await response.json();
+    return {
+        issueMdLine,
+        result: state === 'closed' ? 'closed' : 'active'
+    };
+}
+async function tryLoadIssuesMd(root) {
+    try {
+        return await (0, _promises.readFile)((0, _path.join)(root, 'ISSUES.md'), 'utf8');
+    } catch (e) {
+        if (isNodeError(e) && e.code === 'ENOENT') {
+            return undefined;
+        } else {
+            throw e;
+        }
+    }
+}
+function isNodeError(error) {
+    return error instanceof Error && 'code' in error;
+}
+
+//# sourceMappingURL=executor.js.map
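The executor above resolves each line against the GitHub REST API. Below is a minimal sketch of that step with the HTTP call injected so it can be exercised offline; `resolveLine` and `stubFetch` are hypothetical names, and the stub's responses are invented.

```javascript
// Sketch of the resolveIssueMdLine logic above, with fetch injected so no
// network call is made. The stub pretends issue #1 is closed.
const ISSUE_RE = /https:\/\/github.com\/(?<owner>[^/]+)\/(?<repo>[^/]+)\/issues\/(?<issueNumber>\d+)/;

async function resolveLine(line, fetchImpl) {
	const match = line.match(ISSUE_RE);
	if (!match) return { line, result: 'noissue' };
	const { owner, repo, issueNumber } = match.groups;
	const response = await fetchImpl(`https://api.github.com/repos/${owner}/${repo}/issues/${issueNumber}`);
	const { state } = await response.json(); // GitHub reports 'open' or 'closed'
	return { line, result: state === 'closed' ? 'closed' : 'active' };
}

// Invented stand-in for fetch: issue 1 is closed, everything else open.
const stubFetch = async (url) => ({
	json: async () => ({ state: url.endsWith('/1') ? 'closed' : 'open' }),
});

resolveLine('https://github.com/ziacik/nx-tools/issues/1', stubFetch).then((r) => console.log(r.result)); // closed
```

Injecting the fetch implementation like this is one way to unit-test the classification without depending on api.github.com.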

package/src/executors/check-issues/executor.js.map
ADDED

@@ -0,0 +1 @@
+
{"version":3,"sources":["../../../../../../packages/upgrade-verify/src/executors/check-issues/executor.ts"],"sourcesContent":["import { ExecutorContext, logger } from '@nx/devkit';\nimport { readFile } from 'fs/promises';\nimport { join } from 'path';\nimport { CheckIssuesExecutorSchema } from './schema';\n\nconst GITHUB_ISSUE_REGEX = /https:\\/\\/github.com\\/(?<owner>[^/]+)\\/(?<repo>[^/]+)\\/issues\\/(?<issueNumber>\\d+)/;\n\nexport default async function runExecutor(options: CheckIssuesExecutorSchema, context: ExecutorContext) {\n\tconst issuesMd = await tryLoadIssuesMd(context.root);\n\n\tif (issuesMd != null) {\n\t\tconst issueMdLines = issuesMd.split('\\n');\n\t\tawait printClosedIssueLines(issueMdLines);\n\t} else {\n\t\tlogger.info(\"There is no 'ISSUES.md' file in the workspace root.\");\n\t}\n\n\treturn {\n\t\tsuccess: true,\n\t};\n}\n\nasync function printClosedIssueLines(issueMdLines: string[]) {\n\tconst resolvedLines = await Promise.all(issueMdLines.map(resolveIssueMdLine));\n\tconst closedLines = resolvedLines.filter((resolvedLine) => resolvedLine.result === 'closed');\n\n\tif (closedLines.length > 0) {\n\t\tlogger.info('Issues which are closed now:');\n\n\t\tfor (const closedLine of closedLines) {\n\t\t\tlogger.info(`- ${closedLine.issueMdLine}`);\n\t\t}\n\t} else {\n\t\tlogger.info('No issues have been closed.');\n\t}\n}\n\nasync function resolveIssueMdLine(issueMdLine: string): Promise<{ issueMdLine: string; result: 'noissue' | 'active' | 'closed' }> {\n\tconst githubLink = issueMdLine.match(GITHUB_ISSUE_REGEX);\n\n\tif (!githubLink) {\n\t\treturn { issueMdLine, result: 'noissue' };\n\t}\n\n\tconst [, owner, repo, issueNumber] = githubLink;\n\n\tconst response = await fetch(`https://api.github.com/repos/${owner}/${repo}/issues/${issueNumber}`);\n\tconst { state } = await response.json();\n\n\treturn {\n\t\tissueMdLine,\n\t\tresult: state === 'closed' ? 
'closed' : 'active',\n\t};\n}\n\nasync function tryLoadIssuesMd(root: string): Promise<string | undefined> {\n\ttry {\n\t\treturn await readFile(join(root, 'ISSUES.md'), 'utf8');\n\t} catch (e) {\n\t\tif (isNodeError(e) && e.code === 'ENOENT') {\n\t\t\treturn undefined;\n\t\t} else {\n\t\t\tthrow e;\n\t\t}\n\t}\n}\n\nfunction isNodeError(error: unknown): error is NodeJS.ErrnoException {\n\treturn error instanceof Error && 'code' in error;\n}\n"],"names":["runExecutor","GITHUB_ISSUE_REGEX","options","context","issuesMd","tryLoadIssuesMd","root","issueMdLines","split","printClosedIssueLines","logger","info","success","resolvedLines","Promise","all","map","resolveIssueMdLine","closedLines","filter","resolvedLine","result","length","closedLine","issueMdLine","githubLink","match","owner","repo","issueNumber","response","fetch","state","json","readFile","join","e","isNodeError","code","undefined","error","Error"],"mappings":";+BAOA;;;eAA8BA;;;wBAPU;0BACf;sBACJ;AAGrB,MAAMC,qBAAqB;AAEZ,eAAeD,YAAYE,OAAkC,EAAEC,OAAwB;IACrG,MAAMC,WAAW,MAAMC,gBAAgBF,QAAQG,IAAI;IAEnD,IAAIF,YAAY,MAAM;QACrB,MAAMG,eAAeH,SAASI,KAAK,CAAC;QACpC,MAAMC,sBAAsBF;IAC7B,OAAO;QACNG,cAAM,CAACC,IAAI,CAAC;IACb;IAEA,OAAO;QACNC,SAAS;IACV;AACD;AAEA,eAAeH,sBAAsBF,YAAsB;IAC1D,MAAMM,gBAAgB,MAAMC,QAAQC,GAAG,CAACR,aAAaS,GAAG,CAACC;IACzD,MAAMC,cAAcL,cAAcM,MAAM,CAAC,CAACC,eAAiBA,aAAaC,MAAM,KAAK;IAEnF,IAAIH,YAAYI,MAAM,GAAG,GAAG;QAC3BZ,cAAM,CAACC,IAAI,CAAC;QAEZ,KAAK,MAAMY,cAAcL,YAAa;YACrCR,cAAM,CAACC,IAAI,CAAC,CAAC,EAAE,EAAEY,WAAWC,WAAW,CAAC,CAAC;QAC1C;IACD,OAAO;QACNd,cAAM,CAACC,IAAI,CAAC;IACb;AACD;AAEA,eAAeM,mBAAmBO,WAAmB;IACpD,MAAMC,aAAaD,YAAYE,KAAK,CAACzB;IAErC,IAAI,CAACwB,YAAY;QAChB,OAAO;YAAED;YAAaH,QAAQ;QAAU;IACzC;IAEA,MAAM,GAAGM,OAAOC,MAAMC,YAAY,GAAGJ;IAErC,MAAMK,WAAW,MAAMC,MAAM,CAAC,6BAA6B,EAAEJ,MAAM,CAAC,EAAEC,KAAK,QAAQ,EAAEC,YAAY,CAAC;IAClG,MAAM,EAAEG,KAAK,EAAE,GAAG,MAAMF,SAASG,IAAI;IAErC,OAAO;QACNT;QACAH,QAAQW,UAAU,WAAW,WAAW;IACzC;AACD;AAEA,eAAe3B,gBAAgBC,IAAY;IAC1C,IAAI;QACH,OAAO,MAAM4B,IAAAA,kBAAQ,EAACC,IAAAA,U
AAI,EAAC7B,MAAM,cAAc;IAChD,EAAE,OAAO8B,GAAG;QACX,IAAIC,YAAYD,MAAMA,EAAEE,IAAI,KAAK,UAAU;YAC1C,OAAOC;QACR,OAAO;YACN,MAAMH;QACP;IACD;AACD;AAEA,SAASC,YAAYG,KAAc;IAClC,OAAOA,iBAAiBC,SAAS,UAAUD;AAC5C"}

package/src/executors/check-issues/schema.d.js.map
ADDED

@@ -0,0 +1 @@
+{"version":3,"sources":["../../../../../../packages/upgrade-verify/src/executors/check-issues/schema.d.ts"],"sourcesContent":["export interface CheckIssuesExecutorSchema {} // eslint-disable-line\n"],"names":[],"mappings":";CAA8C,sBAAsB"}

package/src/executors/check-issues/schema.d.ts
ADDED

@@ -0,0 +1 @@
+export interface CheckIssuesExecutorSchema {} // eslint-disable-line

package/src/executors/verify-build/__fixtures__/expected-stats-nohash.json
ADDED

@@ -0,0 +1,15 @@
+{
+	"name": "",
+	"size": 42926,
+	"items": [
+		{ "name": "3rdpartylicenses.txt", "size": 19, "items": [] },
+		{ "name": "assets", "size": 0, "items": [{ "name": ".gitkeep", "size": 0, "items": [] }] },
+		{ "name": "favicon.ico", "size": 15086, "items": [] },
+		{ "name": "index.html", "size": 608, "items": [] },
+		{ "name": "main.css", "size": 6168, "items": [] },
+		{ "name": "main.js", "size": 18760, "items": [] },
+		{ "name": "runtime.js", "size": 2109, "items": [] },
+		{ "name": "styles.js", "size": 176, "items": [] },
+		{ "name": "styles.css", "size": 0, "items": [] }
+	]
+}
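In this fixture, a directory's `size` is the sum of its items' sizes (that is how the stats calculation in `dist-stats.js` aggregates folders). A quick check that the root total is consistent:

```javascript
// Sketch: the root "size" in the fixture above equals the sum of its
// top-level items' sizes.
const itemSizes = [19, 0, 15086, 608, 6168, 18760, 2109, 176, 0];
const total = itemSizes.reduce((sum, size) => sum + size, 0);
console.log(total); // 42926, matching the fixture's root size
```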

package/src/executors/verify-build/dist-stats.d.ts
CHANGED

@@ -4,4 +4,4 @@ export type Stat = {
 	readonly items: Stat[];
 };
 export declare function loadExistingDistStats(statsPath: string): Promise<Stat | undefined>;
-export declare function calculateDistStats(distDir: string): Promise<Stat>;
+export declare function calculateDistStats(distDir: string, removeHashes: boolean): Promise<Stat>;

package/src/executors/verify-build/dist-stats.js
CHANGED

@@ -23,24 +23,24 @@ async function loadExistingDistStats(statsPath) {
         return undefined;
     }
 }
-async function calculateDistStats(distDir) {
-    return getFolderStats(distDir, '');
+async function calculateDistStats(distDir, removeHashes) {
+    return getFolderStats(distDir, '', removeHashes);
 }
-async function getFolderStats(root, relative) {
-    return calculateStats(root, relative);
+async function getFolderStats(root, relative, removeHashes) {
+    return calculateStats(root, relative, removeHashes);
 }
-async function calculateStats(root, relative) {
+async function calculateStats(root, relative, removeHashes) {
     const path = (0, _path.join)(root, relative);
     const stats = await (0, _promises.stat)(path);
     if (stats.isFile()) {
         return {
-            name: relative,
+            name: removeHashes ? withHashRemoved(relative) : relative,
             size: stats.size,
             items: []
         };
     } else if (stats.isDirectory()) {
         const nestedFiles = await (0, _promises.readdir)(path);
-        const nestedStats = await Promise.all(nestedFiles.map((nestedFile)=>calculateStats((0, _path.join)(root, relative), nestedFile)));
+        const nestedStats = await Promise.all(nestedFiles.map((nestedFile)=>calculateStats((0, _path.join)(root, relative), nestedFile, removeHashes)));
         return {
             name: relative,
             size: nestedStats.reduce((result, stat)=>result + stat.size, 0),
@@ -50,5 +50,9 @@ async function calculateStats(root, relative) {
         throw new Error('What: ' + path);
     }
 }
+const HASH_REGEX = /\.[a-f0-9]{16}\./g;
+function withHashRemoved(fileName) {
+    return fileName.replace(HASH_REGEX, '.');
+}
 
 //# sourceMappingURL=dist-stats.js.map
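The hash-stripping heuristic added above treats a 16-character lowercase-hex segment between two dots as a content hash. A small sketch of its effect (the file names are invented):

```javascript
// Sketch of the withHashRemoved heuristic added in dist-stats.js.
const HASH_REGEX = /\.[a-f0-9]{16}\./g;

function withHashRemoved(fileName) {
	return fileName.replace(HASH_REGEX, '.');
}

console.log(withHashRemoved('main.0123456789abcdef.js')); // main.js
console.log(withHashRemoved('main.js')); // main.js (no 16-char hex run, unchanged)
```

Note the heuristic only matches exactly 16 hex characters, so hashes of other lengths would pass through unchanged.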

package/src/executors/verify-build/dist-stats.js.map
CHANGED

@@ -1 +1 @@
-
{"version":3,"sources":["../../../../../../packages/upgrade-verify/src/executors/verify-build/dist-stats.ts"],"sourcesContent":["import { readFile, readdir, stat } from 'fs/promises';\nimport { join } from 'path';\n\nexport type Stat = {\n\treadonly name: string;\n\treadonly size: number;\n\treadonly items: Stat[];\n};\n\nexport async function loadExistingDistStats(statsPath: string): Promise<Stat | undefined> {\n\ttry {\n\t\tconst statsStr = await readFile(statsPath, 'utf-8');\n\t\treturn JSON.parse(statsStr);\n\t} catch {\n\t\treturn undefined;\n\t}\n}\n\nexport async function calculateDistStats(distDir: string): Promise<Stat> {\n\treturn getFolderStats(distDir, '');\n}\n\nasync function getFolderStats(root: string, relative: string): Promise<Stat> {\n\treturn calculateStats(root, relative);\n}\n\nasync function calculateStats(root: string, relative: string): Promise<Stat> {\n\tconst path = join(root, relative);\n\tconst stats = await stat(path);\n\n\tif (stats.isFile()) {\n\t\treturn {\n\t\t\tname: relative,\n\t\t\tsize: stats.size,\n\t\t\titems: [],\n\t\t};\n\t} else if (stats.isDirectory()) {\n\t\tconst nestedFiles = await readdir(path);\n\t\tconst nestedStats = await Promise.all(nestedFiles.map((nestedFile) => calculateStats(join(root, relative), nestedFile)));\n\t\treturn {\n\t\t\tname: relative,\n\t\t\tsize: nestedStats.reduce((result, stat) => result + stat.size, 0),\n\t\t\titems: nestedStats,\n\t\t};\n\t} else {\n\t\tthrow new Error('What: ' + 
path);\n\t}\n}\n"],"names":["loadExistingDistStats","calculateDistStats","statsPath","statsStr","readFile","JSON","parse","undefined","distDir","getFolderStats","root","relative","calculateStats","path","join","stats","stat","isFile","name","size","items","isDirectory","nestedFiles","readdir","nestedStats","Promise","all","map","nestedFile","reduce","result","Error"],"mappings":";;;;;;;;IASsBA,qBAAqB;eAArBA;;IASAC,kBAAkB;eAAlBA;;;0BAlBkB;sBACnB;AAQd,eAAeD,sBAAsBE,SAAiB;IAC5D,IAAI;QACH,MAAMC,WAAW,MAAMC,IAAAA,kBAAQ,EAACF,WAAW;QAC3C,OAAOG,KAAKC,KAAK,CAACH;IACnB,EAAE,UAAM;QACP,OAAOI;IACR;AACD;AAEO,eAAeN,mBAAmBO,OAAe;
+
{"version":3,"sources":["../../../../../../packages/upgrade-verify/src/executors/verify-build/dist-stats.ts"],"sourcesContent":["import { readFile, readdir, stat } from 'fs/promises';\nimport { join } from 'path';\n\nexport type Stat = {\n\treadonly name: string;\n\treadonly size: number;\n\treadonly items: Stat[];\n};\n\nexport async function loadExistingDistStats(statsPath: string): Promise<Stat | undefined> {\n\ttry {\n\t\tconst statsStr = await readFile(statsPath, 'utf-8');\n\t\treturn JSON.parse(statsStr);\n\t} catch {\n\t\treturn undefined;\n\t}\n}\n\nexport async function calculateDistStats(distDir: string, removeHashes: boolean): Promise<Stat> {\n\treturn getFolderStats(distDir, '', removeHashes);\n}\n\nasync function getFolderStats(root: string, relative: string, removeHashes: boolean): Promise<Stat> {\n\treturn calculateStats(root, relative, removeHashes);\n}\n\nasync function calculateStats(root: string, relative: string, removeHashes: boolean): Promise<Stat> {\n\tconst path = join(root, relative);\n\tconst stats = await stat(path);\n\n\tif (stats.isFile()) {\n\t\treturn {\n\t\t\tname: removeHashes ? 
withHashRemoved(relative) : relative,\n\t\t\tsize: stats.size,\n\t\t\titems: [],\n\t\t};\n\t} else if (stats.isDirectory()) {\n\t\tconst nestedFiles = await readdir(path);\n\t\tconst nestedStats = await Promise.all(nestedFiles.map((nestedFile) => calculateStats(join(root, relative), nestedFile, removeHashes)));\n\t\treturn {\n\t\t\tname: relative,\n\t\t\tsize: nestedStats.reduce((result, stat) => result + stat.size, 0),\n\t\t\titems: nestedStats,\n\t\t};\n\t} else {\n\t\tthrow new Error('What: ' + path);\n\t}\n}\n\nconst HASH_REGEX = /\\.[a-f0-9]{16}\\./g;\n\nfunction withHashRemoved(fileName: string): string {\n\treturn fileName.replace(HASH_REGEX, '.');\n}\n"],"names":["loadExistingDistStats","calculateDistStats","statsPath","statsStr","readFile","JSON","parse","undefined","distDir","removeHashes","getFolderStats","root","relative","calculateStats","path","join","stats","stat","isFile","name","withHashRemoved","size","items","isDirectory","nestedFiles","readdir","nestedStats","Promise","all","map","nestedFile","reduce","result","Error","HASH_REGEX","fileName","replace"],"mappings":";;;;;;;;IASsBA,qBAAqB;eAArBA;;IASAC,kBAAkB;eAAlBA;;;0BAlBkB;sBACnB;AAQd,eAAeD,sBAAsBE,SAAiB;IAC5D,IAAI;QACH,MAAMC,WAAW,MAAMC,IAAAA,kBAAQ,EAACF,WAAW;QAC3C,OAAOG,KAAKC,KAAK,CAACH;IACnB,EAAE,UAAM;QACP,OAAOI;IACR;AACD;AAEO,eAAeN,mBAAmBO,OAAe,EAAEC,YAAqB;IAC9E,OAAOC,eAAeF,SAAS,IAAIC;AACpC;AAEA,eAAeC,eAAeC,IAAY,EAAEC,QAAgB,EAAEH,YAAqB;IAClF,OAAOI,eAAeF,MAAMC,UAAUH;AACvC;AAEA,eAAeI,eAAeF,IAAY,EAAEC,QAAgB,EAAEH,YAAqB;IAClF,MAAMK,OAAOC,IAAAA,UAAI,EAACJ,MAAMC;IACxB,MAAMI,QAAQ,MAAMC,IAAAA,cAAI,EAACH;IAEzB,IAAIE,MAAME,MAAM,IAAI;QACnB,OAAO;YACNC,MAAMV,eAAeW,gBAAgBR,YAAYA;YACjDS,MAAML,MAAMK,IAAI;YAChBC,OAAO,EAAE;QACV;IACD,OAAO,IAAIN,MAAMO,WAAW,IAAI;QAC/B,MAAMC,cAAc,MAAMC,IAAAA,iBAAO,EAACX;QAClC,MAAMY,cAAc,MAAMC,QAAQC,GAAG,CAACJ,YAAYK,GAAG,CAAC,CAACC,aAAejB,eAAeE,IAAAA,UAAI,EAACJ,MAAMC,WAAWkB,YAAYrB;QACvH,OAAO;YACNU,MAAMP;YACNS,MAAMK,YAAYK,MAAM,CAAC,CAACC,QAAQf,OAASe,SAASf,KAAKI,IAAI,EAAE;YAC/DC,OAAOI
;QACR;IACD,OAAO;QACN,MAAM,IAAIO,MAAM,WAAWnB;IAC5B;AACD;AAEA,MAAMoB,aAAa;AAEnB,SAASd,gBAAgBe,QAAgB;IACxC,OAAOA,SAASC,OAAO,CAACF,YAAY;AACrC"}

package/src/executors/verify-build/executor.js
CHANGED

@@ -35,13 +35,13 @@ async function verifyBuild(options, context) {
         retainEnv(envBackup);
         const runContext = JSON.parse(JSON.stringify(context));
         var _runContext_projectName;
-        const
+        const results = await (0, _devkit.runExecutor)({
             project: (_runContext_projectName = runContext.projectName) != null ? _runContext_projectName : '',
             target: 'build',
             configuration: configurationName
         }, {}, runContext);
-        for await (const
-        if (!
+        for await (const result of results){
+            if (!result.success) {
                 return {
                     success: false
                 };
@@ -49,7 +49,8 @@ async function verifyBuild(options, context) {
         }
         const statsPath = (0, _path.join)(statsDir, configurationName + '.json');
         const existingStats = await (0, _diststats.loadExistingDistStats)(statsPath);
-
+        var _options_removeHashes;
+        const newStats = await (0, _diststats.calculateDistStats)(distDir, (_options_removeHashes = options.removeHashes) != null ? _options_removeHashes : true);
         await (0, _promises.writeFile)(statsPath, JSON.stringify(newStats, null, '\t'));
         if (existingStats != null) {
             const comparison = (0, _diststatcomparer.compareStats)(existingStats, newStats);
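The compiled `(_options_removeHashes = options.removeHashes) != null ? _options_removeHashes : true` above is downlevelled nullish coalescing; the TypeScript source embedded in the adjacent source map reads `options.removeHashes ?? true`. A tiny sketch of the resulting default behaviour:

```javascript
// Sketch: removeHashes defaults to true unless explicitly set to a
// non-nullish value, matching the schema.json default.
const effectiveRemoveHashes = (options) => options.removeHashes ?? true;

console.log(effectiveRemoveHashes({})); // true
console.log(effectiveRemoveHashes({ removeHashes: false })); // false
```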

package/src/executors/verify-build/executor.js.map
CHANGED

@@ -1 +1 @@
-
{"version":3,"sources":["../../../../../../packages/upgrade-verify/src/executors/verify-build/executor.ts"],"sourcesContent":["import { ExecutorContext, logger, runExecutor } from '@nx/devkit';\nimport { mkdir, writeFile } from 'fs/promises';\nimport { join } from 'path';\nimport { env } from 'process';\nimport { compareStats } from './dist-stat-comparer';\nimport { calculateDistStats, loadExistingDistStats } from './dist-stats';\nimport { VerifyBuildExecutorSchema } from './schema';\n\nexport default async function verifyBuild(options: VerifyBuildExecutorSchema, context: ExecutorContext) {\n\tif (context.workspace == null) {\n\t\tthrow new Error('Workspace context info not available.');\n\t}\n\n\tif (context.projectName == null) {\n\t\tthrow new Error('Project name not specified in context info.');\n\t}\n\n\tconst projectConfig = context.workspace.projects[context.projectName];\n\n\tif (projectConfig.targets == null) {\n\t\tthrow new Error('Target info not available for the project in context info.');\n\t}\n\n\tconst distDir = join(context.root, projectConfig.targets['build'].options.outputPath);\n\tconst statsDir = join(context.root, projectConfig.root, '.stats');\n\tawait tryMkdir(statsDir);\n\n\tif (projectConfig.targets['build'].configurations == null) {\n\t\tthrow new Error('Configurations info not available for the project, target \"build\", in context info.');\n\t}\n\n\tlet success = true;\n\tconst envBackup = { ...env };\n\n\tfor (const configurationName of Object.keys(projectConfig.targets['build'].configurations)) {\n\t\tretainEnv(envBackup);\n\t\tconst runContext: ExecutorContext = JSON.parse(JSON.stringify(context));\n\n\t\tconst
+
{"version":3,"sources":["../../../../../../packages/upgrade-verify/src/executors/verify-build/executor.ts"],"sourcesContent":["import { ExecutorContext, logger, runExecutor } from '@nx/devkit';\nimport { mkdir, writeFile } from 'fs/promises';\nimport { join } from 'path';\nimport { env } from 'process';\nimport { compareStats } from './dist-stat-comparer';\nimport { calculateDistStats, loadExistingDistStats } from './dist-stats';\nimport { VerifyBuildExecutorSchema } from './schema';\n\nexport default async function verifyBuild(options: VerifyBuildExecutorSchema, context: ExecutorContext) {\n\tif (context.workspace == null) {\n\t\tthrow new Error('Workspace context info not available.');\n\t}\n\n\tif (context.projectName == null) {\n\t\tthrow new Error('Project name not specified in context info.');\n\t}\n\n\tconst projectConfig = context.workspace.projects[context.projectName];\n\n\tif (projectConfig.targets == null) {\n\t\tthrow new Error('Target info not available for the project in context info.');\n\t}\n\n\tconst distDir = join(context.root, projectConfig.targets['build'].options.outputPath);\n\tconst statsDir = join(context.root, projectConfig.root, '.stats');\n\tawait tryMkdir(statsDir);\n\n\tif (projectConfig.targets['build'].configurations == null) {\n\t\tthrow new Error('Configurations info not available for the project, target \"build\", in context info.');\n\t}\n\n\tlet success = true;\n\tconst envBackup = { ...env };\n\n\tfor (const configurationName of Object.keys(projectConfig.targets['build'].configurations)) {\n\t\tretainEnv(envBackup);\n\t\tconst runContext: ExecutorContext = JSON.parse(JSON.stringify(context));\n\n\t\tconst results = await runExecutor(\n\t\t\t{\n\t\t\t\tproject: runContext.projectName ?? 
'',\n\t\t\t\ttarget: 'build',\n\t\t\t\tconfiguration: configurationName,\n\t\t\t},\n\t\t\t{},\n\t\t\trunContext\n\t\t);\n\n\t\tfor await (const result of results) {\n\t\t\tif (!result.success) {\n\t\t\t\treturn { success: false };\n\t\t\t}\n\t\t}\n\n\t\tconst statsPath = join(statsDir, configurationName + '.json');\n\t\tconst existingStats = await loadExistingDistStats(statsPath);\n\t\tconst newStats = await calculateDistStats(distDir, options.removeHashes ?? true);\n\n\t\tawait writeFile(statsPath, JSON.stringify(newStats, null, '\\t'));\n\n\t\tif (existingStats != null) {\n\t\t\tconst comparison = compareStats(existingStats, newStats);\n\t\t\tlogger.info(\n\t\t\t\t`Stats for ${runContext.projectName}/${configurationName}: ${comparison.totalSizeDifferencePercentage}% total size difference, ${comparison.fileCountDifferencePercentage}% file count difference, ${comparison.newFilesPercentage}% new files, ${comparison.deletedFilesPercentage}% deleted files`\n\t\t\t);\n\n\t\t\tif (\n\t\t\t\tMath.abs(comparison.deletedFilesPercentage) > 10 ||\n\t\t\t\tMath.abs(comparison.fileCountDifferencePercentage) > 10 ||\n\t\t\t\tMath.abs(comparison.newFilesPercentage) > 10 ||\n\t\t\t\tMath.abs(comparison.totalSizeDifferencePercentage) > 10\n\t\t\t) {\n\t\t\t\tsuccess = false;\n\t\t\t}\n\t\t}\n\t}\n\n\treturn { success };\n}\n\nasync function tryMkdir(statsDir: string) {\n\ttry {\n\t\tawait mkdir(statsDir);\n\t} catch {\n\t\t// ignore\n\t}\n}\n\nfunction retainEnv(envBackup: Record<string, unknown>): void {\n\tfor (const key of Object.keys(env)) {\n\t\tdelete env[key];\n\t}\n\tObject.assign(env, 
envBackup);\n}\n"],"names":["verifyBuild","options","context","workspace","Error","projectName","projectConfig","projects","targets","distDir","join","root","outputPath","statsDir","tryMkdir","configurations","success","envBackup","env","configurationName","Object","keys","retainEnv","runContext","JSON","parse","stringify","results","runExecutor","project","target","configuration","result","statsPath","existingStats","loadExistingDistStats","newStats","calculateDistStats","removeHashes","writeFile","comparison","compareStats","logger","info","totalSizeDifferencePercentage","fileCountDifferencePercentage","newFilesPercentage","deletedFilesPercentage","Math","abs","mkdir","key","assign"],"mappings":";+BAQA;;;eAA8BA;;;;wBARuB;0BACpB;sBACZ;yBACD;kCACS;2BAC6B;AAG3C,eAAeA,YAAYC,OAAkC,EAAEC,OAAwB;IACrG,IAAIA,QAAQC,SAAS,IAAI,MAAM;QAC9B,MAAM,IAAIC,MAAM;IACjB;IAEA,IAAIF,QAAQG,WAAW,IAAI,MAAM;QAChC,MAAM,IAAID,MAAM;IACjB;IAEA,MAAME,gBAAgBJ,QAAQC,SAAS,CAACI,QAAQ,CAACL,QAAQG,WAAW,CAAC;IAErE,IAAIC,cAAcE,OAAO,IAAI,MAAM;QAClC,MAAM,IAAIJ,MAAM;IACjB;IAEA,MAAMK,UAAUC,IAAAA,UAAI,EAACR,QAAQS,IAAI,EAAEL,cAAcE,OAAO,CAAC,QAAQ,CAACP,OAAO,CAACW,UAAU;IACpF,MAAMC,WAAWH,IAAAA,UAAI,EAACR,QAAQS,IAAI,EAAEL,cAAcK,IAAI,EAAE;IACxD,MAAMG,SAASD;IAEf,IAAIP,cAAcE,OAAO,CAAC,QAAQ,CAACO,cAAc,IAAI,MAAM;QAC1D,MAAM,IAAIX,MAAM;IACjB;IAEA,IAAIY,UAAU;IACd,MAAMC,YAAY,eAAKC,YAAG;IAE1B,KAAK,MAAMC,qBAAqBC,OAAOC,IAAI,CAACf,cAAcE,OAAO,CAAC,QAAQ,CAACO,cAAc,EAAG;QAC3FO,UAAUL;QACV,MAAMM,aAA8BC,KAAKC,KAAK,CAACD,KAAKE,SAAS,CAACxB;YAInDqB;QAFX,MAAMI,UAAU,MAAMC,IAAAA,mBAAW,EAChC;YACCC,SAASN,CAAAA,0BAAAA,WAAWlB,WAAW,YAAtBkB,0BAA0B;YACnCO,QAAQ;YACRC,eAAeZ;QAChB,GACA,CAAC,GACDI;QAGD,WAAW,MAAMS,UAAUL,QAAS;YACnC,IAAI,CAACK,OAAOhB,OAAO,EAAE;gBACpB,OAAO;oBAAEA,SAAS;gBAAM;YACzB;QACD;QAEA,MAAMiB,YAAYvB,IAAAA,UAAI,EAACG,UAAUM,oBAAoB;QACrD,MAAMe,gBAAgB,MAAMC,IAAAA,gCAAqB,EAACF;YACChC;QAAnD,MAAMmC,WAAW,MAAMC,IAAAA,6BAAkB,EAAC5B,SAASR,CAAAA,wBAAAA,QAAQqC,YAAY,YAApBrC,wBAAwB;QAE3E,MAAMsC,IAAAA,mBAAS,EAACN,WAAWT,KAAKE,SAAS,CAACU,UAAU,MAAM;QAE
1D,IAAIF,iBAAiB,MAAM;YAC1B,MAAMM,aAAaC,IAAAA,8BAAY,EAACP,eAAeE;YAC/CM,cAAM,CAACC,IAAI,CACV,CAAC,UAAU,EAAEpB,WAAWlB,WAAW,CAAC,CAAC,EAAEc,kBAAkB,EAAE,EAAEqB,WAAWI,6BAA6B,CAAC,yBAAyB,EAAEJ,WAAWK,6BAA6B,CAAC,yBAAyB,EAAEL,WAAWM,kBAAkB,CAAC,aAAa,EAAEN,WAAWO,sBAAsB,CAAC,eAAe,CAAC;YAGrS,IACCC,KAAKC,GAAG,CAACT,WAAWO,sBAAsB,IAAI,MAC9CC,KAAKC,GAAG,CAACT,WAAWK,6BAA6B,IAAI,MACrDG,KAAKC,GAAG,CAACT,WAAWM,kBAAkB,IAAI,MAC1CE,KAAKC,GAAG,CAACT,WAAWI,6BAA6B,IAAI,IACpD;gBACD5B,UAAU;YACX;QACD;IACD;IAEA,OAAO;QAAEA;IAAQ;AAClB;AAEA,eAAeF,SAASD,QAAgB;IACvC,IAAI;QACH,MAAMqC,IAAAA,eAAK,EAACrC;IACb,EAAE,UAAM;IACP,SAAS;IACV;AACD;AAEA,SAASS,UAAUL,SAAkC;IACpD,KAAK,MAAMkC,OAAO/B,OAAOC,IAAI,CAACH,YAAG,EAAG;QACnC,OAAOA,YAAG,CAACiC,IAAI;IAChB;IACA/B,OAAOgC,MAAM,CAAClC,YAAG,EAAED;AACpB"}

package/src/executors/verify-build/schema.d.js.map
CHANGED

@@ -1 +1 @@
-{"version":3,"sources":["../../../../../../packages/upgrade-verify/src/executors/verify-build/schema.d.ts"],"
+{"version":3,"sources":["../../../../../../packages/upgrade-verify/src/executors/verify-build/schema.d.ts"],"names":[],"mappings":""}

package/src/executors/verify-build/schema.json
CHANGED

@@ -4,6 +4,12 @@
 	"title": "VerifyBuild executor",
 	"description": "",
 	"type": "object",
-	"properties": {
+	"properties": {
+		"removeHashes": {
+			"type": "boolean",
+			"description": "If true, remove hashes from file names. The hash part is detected by heuristics.",
+			"default": true
+		}
+	},
 	"required": []
 }