@nestia/benchmark 11.0.0-dev.20260314 → 11.0.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/LICENSE +21 -21
- package/README.md +93 -93
- package/package.json +2 -2
- package/src/DynamicBenchmarker.ts +436 -436
- package/src/internal/DynamicBenchmarkReporter.ts +104 -104
package/LICENSE
CHANGED
@@ -1,21 +1,21 @@

All 21 lines were removed and re-added with identical content; the change is not visible here (likely line endings or trailing whitespace). The license text:

MIT License

Copyright (c) 2024 Jeongho Nam

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
package/README.md
CHANGED
@@ -1,93 +1,93 @@

All 93 lines were removed and re-added with identical content; the change is not visible here (likely line endings or trailing whitespace). The README:

# Nestia


[](https://github.com/samchon/nestia/blob/master/LICENSE)
[](https://www.npmjs.com/package/@nestia/fetcher)
[](https://www.npmjs.com/package/@nestia/fetcher)
[](https://github.com/samchon/nestia/actions?query=workflow%3Atest)
[](https://nestia.io/docs/)
[](https://gurubase.io/g/nestia)
[](https://discord.gg/E94XhzrUCZ)

Nestia is a set of helper libraries for NestJS, supporting below features:

- `@nestia/core`:
  - Super-fast/easy decorators
  - Advanced WebSocket routes
- `@nestia/sdk`:
  - Swagger generator, more evolved than ever
  - SDK library generator for clients
  - Mockup Simulator for client applications
  - Automatic E2E test functions generator
- `@nestia/e2e`: Test program utilizing e2e test functions
- `@nestia/benchmark`: Benchmark program using e2e test functions
- `@nestia/editor`: Swagger-UI with Online TypeScript Editor
- [`@agentica`](https://github.com/wrtnlabs/agentica): Agentic AI library specialized in LLM function calling
- [`@autobe`](https://github.com/wrtnlabs/autobe): Vibe coding agent generating NestJS application
- `nestia`: Just CLI (command line interface) tool

> [!NOTE]
>
> - **Only one line** required, with pure TypeScript type
> - Enhance performance **30x** up
>   - Runtime validator is **20,000x faster** than `class-validator`
>   - JSON serialization is **200x faster** than `class-transformer`
> - Software Development Kit
>   - Collection of typed `fetch` functions with DTO structures like [tRPC](https://trpc.io/)
>   - Mockup simulator means embedded backend simulator in the SDK
>     - similar with [msw](https://mswjs.io/), but fully automated



> Left is NestJS server code, and right is client (frontend) code utilizing SDK




## Sponsors and Backers
Thanks for your support.

Your donation would encourage `nestia` development.

[](https://opencollective.com/nestia)




## Guide Documents
Check out the document in the [website](https://nestia.io/docs/):

### 🏠 Home
- [Introduction](https://nestia.io/docs/)
- [Setup](https://nestia.io/docs/setup/)
- [Pure TypeScript](https://nestia.io/docs/pure)

### 📖 Features
- Core Library
  - [`@WebSocketRoute`](https://nestia.io/docs/core/WebSocketRoute)
  - [`@TypedRoute`](https://nestia.io/docs/core/TypedRoute/)
  - [**`@TypedBody`**](https://nestia.io/docs/core/TypedBody/)
  - [`@TypedParam`](https://nestia.io/docs/core/TypedParam/)
  - [`@TypedQuery`](https://nestia.io/docs/core/TypedQuery/)
  - [`@TypedFormData`](https://nestia.io/docs/core/TypedFormData/)
  - [`@TypedHeaders`](https://nestia.io/docs/core/TypedHeaders/)
  - [`@TypedException`](https://nestia.io/docs/core/TypedException/)
- Software Development Kit
  - [SDK Builder](https://nestia.io/docs/sdk/)
  - [Mockup Simulator](https://nestia.io/docs/sdk/simulate/)
  - [E2E Test Functions](https://nestia.io/docs/sdk/e2e/)
  - [Distribution](https://nestia.io/docs/sdk/distribute/)
- Swagger Document
  - [Swagger Builder](https://nestia.io/docs/swagger/)
  - [**AI Chatbot Development**](https://nestia.io/docs/swagger/chat/)
  - [Cloud Swagger Editor](https://nestia.io/docs/swagger/editor/)
  - [Documentation Strategy](https://nestia.io/docs/swagger/strategy/)
- E2E Testing
  - [Why E2E Test?](https://nestia.io/docs/e2e/why/)
  - [Test Program Development](https://nestia.io/docs/e2e/development/)
  - [Performance Benchmark](https://nestia.io/docs/e2e/benchmark/)

### 🔗 Appendix
- [API Documents](https://nestia.io/api)
- [⇲ Benchmark Result](https://github.com/samchon/nestia/tree/master/benchmark/results/11th%20Gen%20Intel(R)%20Core(TM)%20i5-1135G7%20%40%202.40GHz)
- [⇲ `dev.to` Articles](https://dev.to/samchon/series/22751)
package/package.json
CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@nestia/benchmark",
-  "version": "11.0.0
+  "version": "11.0.0",
   "description": "NestJS Performance Benchmark Program",
   "main": "lib/index.js",
   "exports": {
@@ -30,7 +30,7 @@
   "dependencies": {
     "tgrid": "^1.1.0",
     "tstl": "^3.0.0",
-    "@nestia/fetcher": "^11.0.0
+    "@nestia/fetcher": "^11.0.0"
   },
   "devDependencies": {
     "@types/node": "^25.3.3",
@@ -1,436 +1,436 @@
|
|
|
1
|
-
import { IConnection } from "@nestia/fetcher";
|
|
2
|
-
import fs from "fs";
|
|
3
|
-
import { Driver, WorkerConnector, WorkerServer } from "tgrid";
|
|
4
|
-
import { HashMap, hash, sleep_for } from "tstl";
|
|
5
|
-
|
|
6
|
-
import { IBenchmarkEvent } from "./IBenchmarkEvent";
|
|
7
|
-
import { DynamicBenchmarkReporter } from "./internal/DynamicBenchmarkReporter";
|
|
8
|
-
import { IBenchmarkMaster } from "./internal/IBenchmarkMaster";
|
|
9
|
-
import { IBenchmarkServant } from "./internal/IBenchmarkServant";
|
|
10
|
-
|
|
11
|
-
/**
|
|
12
|
-
* Dynamic benchmark executor running prefixed functions.
|
|
13
|
-
*
|
|
14
|
-
* `DynamicBenchmarker` is composed with two programs,
|
|
15
|
-
* {@link DynamicBenchmarker.master} and
|
|
16
|
-
* {@link DynamicBenchmarker.servant servants}. The master program creates
|
|
17
|
-
* multiple servant programs, and the servant programs execute the prefixed
|
|
18
|
-
* functions in parallel. When the pre-congirued count of requests are all
|
|
19
|
-
* completed, the master program collects the results and returns them.
|
|
20
|
-
*
|
|
21
|
-
* Therefore, when you want to benchmark the performance of a backend server,
|
|
22
|
-
* you have to make two programs; one for calling the
|
|
23
|
-
* {@link DynamicBenchmarker.master} function, and the other for calling the
|
|
24
|
-
* {@link DynamicBenchmarker.servant} function. Also, never forget to write the
|
|
25
|
-
* path of the servant program to the
|
|
26
|
-
* {@link DynamicBenchmarker.IMasterProps.servant} property.
|
|
27
|
-
*
|
|
28
|
-
* Also, you when you complete the benchmark execution through the
|
|
29
|
-
* {@link DynamicBenchmarker.master} and {@link DynamicBenchmarker.servant}
|
|
30
|
-
* functions, you can convert the result to markdown content by using the
|
|
31
|
-
* {@link DynamicBenchmarker.markdown} function.
|
|
32
|
-
*
|
|
33
|
-
* Additionally, if you hope to see some utilization cases, see the below
|
|
34
|
-
* example tagged links.
|
|
35
|
-
*
|
|
36
|
-
* @author Jeongho Nam - https://github.com/samchon
|
|
37
|
-
* @example
|
|
38
|
-
* https://github.com/samchon/nestia-start/blob/master/test/benchmaark/index.ts
|
|
39
|
-
*
|
|
40
|
-
* @example
|
|
41
|
-
* https://github.com/samchon/backend/blob/master/test/benchmark/index.ts
|
|
42
|
-
*/
|
|
43
|
-
export namespace DynamicBenchmarker {
|
|
44
|
-
/** Properties of the master program. */
|
|
45
|
-
export interface IMasterProps {
|
|
46
|
-
/** Total count of the requests. */
|
|
47
|
-
count: number;
|
|
48
|
-
|
|
49
|
-
/**
|
|
50
|
-
* Number of threads.
|
|
51
|
-
*
|
|
52
|
-
* The number of threads to be executed as parallel servant.
|
|
53
|
-
*/
|
|
54
|
-
threads: number;
|
|
55
|
-
|
|
56
|
-
/**
|
|
57
|
-
* Number of simultaneous requests.
|
|
58
|
-
*
|
|
59
|
-
* The number of requests to be executed simultaneously.
|
|
60
|
-
*
|
|
61
|
-
* This property value would be divided by the {@link threads} in the
|
|
62
|
-
* servants.
|
|
63
|
-
*/
|
|
64
|
-
simultaneous: number;
|
|
65
|
-
|
|
66
|
-
/**
|
|
67
|
-
* Path of the servant program.
|
|
68
|
-
*
|
|
69
|
-
* The path of the servant program executing the
|
|
70
|
-
* {@link DynamicBenchmarker.servant} function.
|
|
71
|
-
*/
|
|
72
|
-
servant: string;
|
|
73
|
-
|
|
74
|
-
/**
|
|
75
|
-
* Filter function.
|
|
76
|
-
*
|
|
77
|
-
* The filter function to determine whether to execute the function in the
|
|
78
|
-
* servant or not.
|
|
79
|
-
*
|
|
80
|
-
* @param name Function name
|
|
81
|
-
* @returns Whether to execute the function or not.
|
|
82
|
-
*/
|
|
83
|
-
filter?: (name: string) => boolean;
|
|
84
|
-
|
|
85
|
-
/**
|
|
86
|
-
* Progress callback function.
|
|
87
|
-
*
|
|
88
|
-
* @param complete The number of completed requests.
|
|
89
|
-
*/
|
|
90
|
-
progress?: (complete: number) => void;
|
|
91
|
-
|
|
92
|
-
/**
|
|
93
|
-
* Get memory usage.
|
|
94
|
-
*
|
|
95
|
-
* Get the memory usage of the master program.
|
|
96
|
-
*
|
|
97
|
-
* Specify this property only when your backend server is running on a
|
|
98
|
-
* different process, so that need to measure the memory usage of the
|
|
99
|
-
* backend server from other process.
|
|
100
|
-
*/
|
|
101
|
-
memory?: () => Promise<NodeJS.MemoryUsage>;
|
|
102
|
-
|
|
103
|
-
/**
|
|
104
|
-
* Standard I/O option.
|
|
105
|
-
*
|
|
106
|
-
* The standard I/O option for the servant programs.
|
|
107
|
-
*/
|
|
108
|
-
stdio?: undefined | "overlapped" | "pipe" | "ignore" | "inherit";
|
|
109
|
-
}
|
|
110
|
-
|
|
111
|
-
/** Properties of the servant program. */
|
|
112
|
-
export interface IServantProps<Parameters extends any[]> {
|
|
113
|
-
/**
|
|
114
|
-
* Default connection.
|
|
115
|
-
*
|
|
116
|
-
* Default connection to be used in the servant.
|
|
117
|
-
*/
|
|
118
|
-
connection: IConnection;
|
|
119
|
-
|
|
120
|
-
/** Location of the benchmark functions. */
|
|
121
|
-
location: string;
|
|
122
|
-
|
|
123
|
-
/**
|
|
124
|
-
* Prefix of the benchmark functions.
|
|
125
|
-
*
|
|
126
|
-
* Every prefixed function will be executed in the servant.
|
|
127
|
-
*
|
|
128
|
-
* In other words, if a function name doesn't start with the prefix, then it
|
|
129
|
-
* would never be executed.
|
|
130
|
-
*/
|
|
131
|
-
prefix: string;
|
|
132
|
-
|
|
133
|
-
/**
|
|
134
|
-
* Get parameters of a function.
|
|
135
|
-
*
|
|
136
|
-
* When composing the parameters, never forget to copy the
|
|
137
|
-
* {@link IConnection.logger} property of default connection to the returning
|
|
138
|
-
* parameters.
|
|
139
|
-
*
|
|
140
|
-
* @param connection Default connection instance
|
|
141
|
-
* @param name Function name
|
|
142
|
-
*/
|
|
143
|
-
parameters: (connection: IConnection, name: string) => Parameters;
|
|
144
|
-
}
|
|
145
|
-
|
|
146
|
-
/** Benchmark report. */
|
|
147
|
-
export interface IReport {
|
|
148
|
-
count: number;
|
|
149
|
-
threads: number;
|
|
150
|
-
simultaneous: number;
|
|
151
|
-
started_at: string;
|
|
152
|
-
completed_at: string;
|
|
153
|
-
statistics: IReport.IStatistics;
|
|
154
|
-
endpoints: Array<IReport.IEndpoint & IReport.IStatistics>;
|
|
155
|
-
memories: IReport.IMemory[];
|
|
156
|
-
}
|
|
157
|
-
export namespace IReport {
|
|
158
|
-
export interface IEndpoint {
|
|
159
|
-
method: string;
|
|
160
|
-
path: string;
|
|
161
|
-
}
|
|
162
|
-
export interface IStatistics {
|
|
163
|
-
count: number;
|
|
164
|
-
success: number;
|
|
165
|
-
mean: number | null;
|
|
166
|
-
stdev: number | null;
|
|
167
|
-
minimum: number | null;
|
|
168
|
-
maximum: number | null;
|
|
169
|
-
}
|
|
170
|
-
export interface IMemory {
|
|
171
|
-
time: string;
|
|
172
|
-
usage: NodeJS.MemoryUsage;
|
|
173
|
-
}
|
|
174
|
-
}
|
|
175
|
-
|
|
176
|
-
/**
|
|
177
|
-
* Master program.
|
|
178
|
-
*
|
|
179
|
-
* Creates a master program that executing the servant programs in parallel.
|
|
180
|
-
*
|
|
181
|
-
* Note that, {@link IMasterProps.servant} property must be the path of the
|
|
182
|
-
* servant program executing the {@link servant} function.
|
|
183
|
-
*
|
|
184
|
-
* @param props Properties of the master program
|
|
185
|
-
* @returns Benchmark report
|
|
186
|
-
*/
|
|
187
|
-
export const master = async (props: IMasterProps): Promise<IReport> => {
|
|
188
|
-
const completes: number[] = new Array(props.threads).fill(0);
|
|
189
|
-
const servants: WorkerConnector<
|
|
190
|
-
null,
|
|
191
|
-
IBenchmarkMaster,
|
|
192
|
-
IBenchmarkServant
|
|
193
|
-
>[] = await Promise.all(
|
|
194
|
-
new Array(props.threads).fill(null).map(async (_, i) => {
|
|
195
|
-
const connector: WorkerConnector<
|
|
196
|
-
null,
|
|
197
|
-
IBenchmarkMaster,
|
|
198
|
-
IBenchmarkServant
|
|
199
|
-
> = new WorkerConnector(
|
|
200
|
-
null,
|
|
201
|
-
{
|
|
202
|
-
filter: props.filter ?? (() => true),
|
|
203
|
-
progress: (current) => {
|
|
204
|
-
completes[i] = current;
|
|
205
|
-
if (props.progress)
|
|
206
|
-
props.progress(completes.reduce((a, b) => a + b, 0));
|
|
207
|
-
},
|
|
208
|
-
},
|
|
209
|
-
"process",
|
|
210
|
-
);
|
|
211
|
-
await connector.connect(props.servant, { stdio: props.stdio });
|
|
212
|
-
return connector;
|
|
213
|
-
}),
|
|
214
|
-
);
|
|
215
|
-
|
|
216
|
-
const started_at: Date = new Date();
|
|
217
|
-
const memories: IReport.IMemory[] = [];
|
|
218
|
-
let completed_at: Date | null = null;
|
|
219
|
-
|
|
220
|
-
(async () => {
|
|
221
|
-
const getter = props.memory ?? (async () => process.memoryUsage());
|
|
222
|
-
while (completed_at === null) {
|
|
223
|
-
await sleep_for(1_000);
|
|
224
|
-
memories.push({
|
|
225
|
-
usage: await getter(),
|
|
226
|
-
time: new Date().toISOString(),
|
|
227
|
-
});
|
|
228
|
-
}
|
|
229
|
-
})().catch(() => {});
|
|
230
|
-
|
|
231
|
-
const events: IBenchmarkEvent[] = (
|
|
232
|
-
await Promise.all(
|
|
233
|
-
servants.map((connector) =>
|
|
234
|
-
connector.getDriver().execute({
|
|
235
|
-
count: Math.ceil(props.count / props.threads),
|
|
236
|
-
simultaneous: Math.ceil(props.simultaneous / props.threads),
|
|
237
|
-
}),
|
|
238
|
-
),
|
|
239
|
-
)
|
|
240
|
-
).flat();
|
|
241
|
-
|
|
242
|
-
completed_at = new Date();
|
|
243
|
-
await Promise.all(servants.map((connector) => connector.close()));
|
|
244
|
-
if (props.progress) props.progress(props.count);
|
|
245
|
-
|
|
246
|
-
const endpoints: HashMap<IReport.IEndpoint, IBenchmarkEvent[]> =
|
|
247
|
-
new HashMap(
|
|
248
|
-
(key) => hash(key.method, key.path),
|
|
249
|
-
(x, y) => x.method === y.method && x.path === y.path,
|
|
250
|
-
);
|
|
251
|
-
for (const e of events)
|
|
252
|
-
endpoints
|
|
253
|
-
.take(
|
|
254
|
-
{
|
|
255
|
-
method: e.metadata.method,
|
|
256
|
-
path: e.metadata.template ?? e.metadata.path,
|
|
257
|
-
},
|
|
258
|
-
() => [],
|
|
259
|
-
)
|
|
260
|
-
.push(e);
|
|
261
|
-
return {
|
|
262
|
-
count: props.count,
|
|
263
|
-
threads: props.threads,
|
|
264
|
-
simultaneous: props.simultaneous,
|
|
265
|
-
statistics: statistics(events),
|
|
266
|
-
endpoints: [...endpoints].map((it) => ({
|
|
267
|
-
...statistics(it.second),
|
|
268
|
-
...it.first,
|
|
269
|
-
})),
|
|
270
|
-
started_at: started_at.toISOString(),
|
|
271
|
-
completed_at: completed_at.toISOString(),
|
|
272
|
-
memories,
|
|
273
|
-
};
|
|
274
|
-
};
|
|
275
|
-
|
|
276
|
-
/**
|
|
277
|
-
* Create a servant program.
|
|
278
|
-
*
|
|
279
|
-
* Creates a servant program executing the prefixed functions in parallel.
|
|
280
|
-
*
|
|
281
|
-
* @param props Properties of the servant program
|
|
282
|
-
* @returns Servant program as a worker server
|
|
283
|
-
*/
|
|
284
|
-
export const servant = async <Parameters extends any[]>(
|
|
285
|
-
props: IServantProps<Parameters>,
|
|
286
|
-
): Promise<WorkerServer<null, IBenchmarkServant, IBenchmarkMaster>> => {
|
|
287
|
-
const server: WorkerServer<null, IBenchmarkServant, IBenchmarkMaster> =
|
|
288
|
-
new WorkerServer();
|
|
289
|
-
await server.open({
|
|
290
|
-
execute: execute({
|
|
291
|
-
driver: server.getDriver(),
|
|
292
|
-
props,
|
|
293
|
-
}),
|
|
294
|
-
});
|
|
295
|
-
return server;
|
|
296
|
-
};
|
|
297
|
-
|
|
298
|
-
/**
|
|
299
|
-
* Convert the benchmark report to markdown content.
|
|
300
|
-
*
|
|
301
|
-
* @param report Benchmark report
|
|
302
|
-
* @returns Markdown content
|
|
303
|
-
*/
|
|
304
|
-
export const markdown = (report: DynamicBenchmarker.IReport): string =>
|
|
305
|
-
DynamicBenchmarkReporter.markdown(report);
|
|
306
|
-
|
|
307
|
-
const execute =
|
|
308
|
-
<Parameters extends any[]>(ctx: {
|
|
309
|
-
driver: Driver<IBenchmarkMaster>;
|
|
310
|
-
props: IServantProps<Parameters>;
|
|
311
|
-
}) =>
|
|
312
|
-
async (mass: {
|
|
313
|
-
count: number;
|
|
314
|
-
simultaneous: number;
|
|
315
|
-
}): Promise<IBenchmarkEvent[]> => {
|
|
316
|
-
const functions: IFunction<Parameters>[] = [];
|
|
317
|
-
await iterate({
|
|
318
|
-
collection: functions,
|
|
319
|
-
driver: ctx.driver,
|
|
320
|
-
props: ctx.props,
|
|
321
|
-
})(ctx.props.location);
|
|
322
|
-
|
|
323
|
-
const entireEvents: IBenchmarkEvent[] = [];
|
|
324
|
-
await Promise.all(
|
|
325
|
-
new Array(mass.simultaneous)
|
|
326
|
-
.fill(null)
|
|
327
|
-
.map(() => 1)
|
|
328
|
-
.map(async () => {
|
|
329
|
-
while (entireEvents.length < mass.count) {
|
|
330
|
-
const localEvents: IBenchmarkEvent[] = [];
|
|
331
|
-
const func: IFunction<Parameters> =
|
|
332
|
-
functions[Math.floor(Math.random() * functions.length)]!;
|
|
333
|
-
const connection: IConnection = {
|
|
334
|
-
...ctx.props.connection,
|
|
335
|
-
logger: async (fe): Promise<void> => {
|
|
336
|
-
const be: IBenchmarkEvent = {
|
|
337
|
-
metadata: fe.route,
|
|
338
|
-
status: fe.status,
|
|
339
|
-
started_at: fe.started_at.toISOString(),
|
|
340
|
-
respond_at: fe.respond_at?.toISOString() ?? null,
|
|
341
|
-
completed_at: fe.completed_at.toISOString(),
|
|
342
|
-
success: true,
|
|
343
|
-
};
|
|
344
|
-
localEvents.push(be);
|
|
345
|
-
entireEvents.push(be);
|
|
346
|
-
},
|
|
347
|
-
};
|
|
348
|
-
try {
|
|
349
|
-
await func.value(...ctx.props.parameters(connection, func.key));
|
|
350
|
-
} catch (exp) {
|
|
351
|
-
for (const e of localEvents)
|
|
352
|
-
e.success = e.status === 200 || e.status === 201;
|
|
353
|
-
}
|
|
354
|
-
if (localEvents.length !== 0)
|
|
355
|
-
ctx.driver.progress(entireEvents.length).catch(() => {});
|
|
356
|
-
}
|
|
357
|
-
}),
|
|
358
|
-
);
|
|
359
|
-
await ctx.driver.progress(entireEvents.length);
|
|
360
|
-
return entireEvents;
|
|
361
|
-
};
|
|
362
|
-
}
|
|
363
|
-
|
|
364
|
-
interface IFunction<Parameters extends any[]> {
|
|
365
|
-
key: string;
|
|
366
|
-
value: (...args: Parameters) => Promise<void>;
|
|
367
|
-
}
|
|
368
|
-
|
|
369
|
-
const iterate =
|
|
370
|
-
<Parameters extends any[]>(ctx: {
|
|
371
|
-
collection: IFunction<Parameters>[];
|
|
372
|
-
driver: Driver<IBenchmarkMaster>;
|
|
373
|
-
props: DynamicBenchmarker.IServantProps<Parameters>;
|
|
374
|
-
}) =>
|
|
375
|
-
async (path: string): Promise<void> => {
|
|
376
|
-
const directory: string[] = await fs.promises.readdir(path);
|
|
377
|
-
for (const file of directory) {
|
|
378
|
-
const location: string = `${path}/${file}`;
|
|
379
|
-
const stat: fs.Stats = await fs.promises.stat(location);
|
|
380
|
-
if (stat.isDirectory() === true) await iterate(ctx)(location);
|
|
381
|
-
else if (file.endsWith(__filename.substr(-3)) === true) {
|
|
382
|
-
const modulo = await import(location);
|
|
383
|
-
for (const [key, value] of Object.entries(modulo)) {
|
|
384
|
-
if (typeof value !== "function") continue;
|
|
385
|
-
else if (key.startsWith(ctx.props.prefix) === false) continue;
|
|
386
|
-
else if ((await ctx.driver.filter(key)) === false) continue;
|
|
387
|
-
ctx.collection.push({
|
|
388
|
-
key,
|
|
389
|
-
value: value as (...args: Parameters) => Promise<any>,
|
|
390
|
-
});
|
|
391
|
-
}
|
|
392
|
-
}
|
|
393
|
-
}
|
|
394
|
-
};
|
|
395
|
-
|
|
396
|
-
const statistics = (
|
|
397
|
-
events: IBenchmarkEvent[],
|
|
398
|
-
): DynamicBenchmarker.IReport.IStatistics => {
|
|
399
|
-
const successes: IBenchmarkEvent[] = events.filter((event) => event.success);
|
|
400
|
-
return {
|
|
401
|
-
count: events.length,
|
|
402
|
-
success: successes.length,
|
|
403
|
-
...average(events),
|
|
404
|
-
};
|
|
405
|
-
};
|
|
406
|
-
|
|
407
|
-
const average = (
|
|
408
|
-
events: IBenchmarkEvent[],
|
|
409
|
-
): Pick<
|
|
410
|
-
DynamicBenchmarker.IReport.IStatistics,
|
|
411
|
-
"mean" | "stdev" | "minimum" | "maximum"
|
|
412
|
-
> => {
|
|
413
|
-
if (events.length === 0)
|
|
414
|
-
return {
|
|
415
|
-
mean: null,
|
|
416
|
-
stdev: null,
|
|
417
|
-
minimum: null,
|
|
418
|
-
maximum: null,
|
|
419
|
-
};
|
|
420
|
-
let mean: number = 0;
|
|
421
|
-
let stdev: number = 0;
|
|
422
|
-
let minimum: number = Number.MAX_SAFE_INTEGER;
|
|
423
|
-
let maximum: number = Number.MIN_SAFE_INTEGER;
|
|
424
|
-
for (const event of events) {
|
|
425
|
-
const elapsed: number =
|
|
426
|
-
new Date(event.completed_at).getTime() -
|
|
427
|
-
new Date(event.started_at).getTime();
|
|
428
|
-
mean += elapsed;
|
|
429
|
-
stdev += elapsed * elapsed;
|
|
430
|
-
minimum = Math.min(minimum, elapsed);
|
|
431
|
-
maximum = Math.max(maximum, elapsed);
|
|
432
|
-
}
|
|
433
|
-
mean /= events.length;
|
|
434
|
-
stdev = Math.sqrt(stdev / events.length - mean * mean);
|
|
435
|
-
return { mean, stdev, minimum, maximum };
|
|
436
|
-
};
|
|
1
|
+
import { IConnection } from "@nestia/fetcher";
|
|
2
|
+
import fs from "fs";
|
|
3
|
+
import { Driver, WorkerConnector, WorkerServer } from "tgrid";
|
|
4
|
+
import { HashMap, hash, sleep_for } from "tstl";
|
|
5
|
+
|
|
6
|
+
import { IBenchmarkEvent } from "./IBenchmarkEvent";
|
|
7
|
+
import { DynamicBenchmarkReporter } from "./internal/DynamicBenchmarkReporter";
|
|
8
|
+
import { IBenchmarkMaster } from "./internal/IBenchmarkMaster";
|
|
9
|
+
import { IBenchmarkServant } from "./internal/IBenchmarkServant";
|
|
10
|
+
|
|
11
|
+
/**
|
|
12
|
+
* Dynamic benchmark executor running prefixed functions.
|
|
13
|
+
*
|
|
14
|
+
* `DynamicBenchmarker` is composed with two programs,
|
|
15
|
+
* {@link DynamicBenchmarker.master} and
|
|
16
|
+
* {@link DynamicBenchmarker.servant servants}. The master program creates
|
|
17
|
+
* multiple servant programs, and the servant programs execute the prefixed
|
|
18
|
+
* functions in parallel. When the pre-congirued count of requests are all
|
|
19
|
+
* completed, the master program collects the results and returns them.
|
|
20
|
+
*
|
|
21
|
+
* Therefore, when you want to benchmark the performance of a backend server,
|
|
22
|
+
* you have to make two programs; one for calling the
|
|
23
|
+
* {@link DynamicBenchmarker.master} function, and the other for calling the
|
|
24
|
+
* {@link DynamicBenchmarker.servant} function. Also, never forget to write the
|
|
25
|
+
* path of the servant program to the
|
|
26
|
+
* {@link DynamicBenchmarker.IMasterProps.servant} property.
|
|
27
|
+
*
|
|
28
|
+
* Also, you when you complete the benchmark execution through the
|
|
29
|
+
* {@link DynamicBenchmarker.master} and {@link DynamicBenchmarker.servant}
|
|
30
|
+
* functions, you can convert the result to markdown content by using the
|
|
31
|
+
* {@link DynamicBenchmarker.markdown} function.
|
|
32
|
+
*
|
|
33
|
+
* Additionally, if you hope to see some utilization cases, see the below
|
|
34
|
+
* example tagged links.
|
|
35
|
+
*
|
|
36
|
+
* @author Jeongho Nam - https://github.com/samchon
|
|
37
|
+
* @example
|
|
38
|
+
* https://github.com/samchon/nestia-start/blob/master/test/benchmaark/index.ts
|
|
39
|
+
*
|
|
40
|
+
* @example
|
|
41
|
+
* https://github.com/samchon/backend/blob/master/test/benchmark/index.ts
|
|
42
|
+
*/
|
|
43
|
+
export namespace DynamicBenchmarker {
  /** Properties of the master program. */
  export interface IMasterProps {
    /** Total count of the requests. */
    count: number;

    /**
     * Number of threads.
     *
     * The number of threads to be executed as parallel servants.
     */
    threads: number;

    /**
     * Number of simultaneous requests.
     *
     * The number of requests to be executed simultaneously.
     *
     * This property value would be divided by the {@link threads} in the
     * servants.
     */
    simultaneous: number;

    /**
     * Path of the servant program.
     *
     * The path of the servant program executing the
     * {@link DynamicBenchmarker.servant} function.
     */
    servant: string;

    /**
     * Filter function.
     *
     * The filter function determining whether to execute the function in the
     * servant or not.
     *
     * @param name Function name
     * @returns Whether to execute the function or not
     */
    filter?: (name: string) => boolean;

    /**
     * Progress callback function.
     *
     * @param complete The number of completed requests
     */
    progress?: (complete: number) => void;

    /**
     * Get memory usage.
     *
     * Gets the memory usage of the master program.
     *
     * Specify this property only when your backend server is running on a
     * different process, so that you need to measure the memory usage of the
     * backend server from the other process.
     */
    memory?: () => Promise<NodeJS.MemoryUsage>;

    /**
     * Standard I/O option.
     *
     * The standard I/O option for the servant programs.
     */
    stdio?: undefined | "overlapped" | "pipe" | "ignore" | "inherit";
  }

  /** Properties of the servant program. */
  export interface IServantProps<Parameters extends any[]> {
    /**
     * Default connection.
     *
     * Default connection to be used in the servant.
     */
    connection: IConnection;

    /** Location of the benchmark functions. */
    location: string;

    /**
     * Prefix of the benchmark functions.
     *
     * Every prefixed function will be executed in the servant.
     *
     * In other words, if a function name doesn't start with the prefix, it
     * would never be executed.
     */
    prefix: string;

    /**
     * Get parameters of a function.
     *
     * When composing the parameters, never forget to copy the
     * {@link IConnection.logger} property of the default connection to the
     * returning parameters.
     *
     * @param connection Default connection instance
     * @param name Function name
     */
    parameters: (connection: IConnection, name: string) => Parameters;
  }

  /** Benchmark report. */
  export interface IReport {
    count: number;
    threads: number;
    simultaneous: number;
    started_at: string;
    completed_at: string;
    statistics: IReport.IStatistics;
    endpoints: Array<IReport.IEndpoint & IReport.IStatistics>;
    memories: IReport.IMemory[];
  }
  export namespace IReport {
    export interface IEndpoint {
      method: string;
      path: string;
    }
    export interface IStatistics {
      count: number;
      success: number;
      mean: number | null;
      stdev: number | null;
      minimum: number | null;
      maximum: number | null;
    }
    export interface IMemory {
      time: string;
      usage: NodeJS.MemoryUsage;
    }
  }

  /**
   * Master program.
   *
   * Creates a master program that executes the servant programs in parallel.
   *
   * Note that the {@link IMasterProps.servant} property must be the path of
   * the servant program executing the {@link servant} function.
   *
   * @param props Properties of the master program
   * @returns Benchmark report
   */
  export const master = async (props: IMasterProps): Promise<IReport> => {
    // Spawn one worker process per thread; each servant reports its own
    // progress, which is summed before being forwarded to the caller.
    const completes: number[] = new Array(props.threads).fill(0);
    const servants: WorkerConnector<
      null,
      IBenchmarkMaster,
      IBenchmarkServant
    >[] = await Promise.all(
      new Array(props.threads).fill(null).map(async (_, i) => {
        const connector: WorkerConnector<
          null,
          IBenchmarkMaster,
          IBenchmarkServant
        > = new WorkerConnector(
          null,
          {
            filter: props.filter ?? (() => true),
            progress: (current) => {
              completes[i] = current;
              if (props.progress)
                props.progress(completes.reduce((a, b) => a + b, 0));
            },
          },
          "process",
        );
        await connector.connect(props.servant, { stdio: props.stdio });
        return connector;
      }),
    );

    const started_at: Date = new Date();
    const memories: IReport.IMemory[] = [];
    let completed_at: Date | null = null;

    // Sample memory usage every second until the benchmark completes.
    (async () => {
      const getter = props.memory ?? (async () => process.memoryUsage());
      while (completed_at === null) {
        await sleep_for(1_000);
        memories.push({
          usage: await getter(),
          time: new Date().toISOString(),
        });
      }
    })().catch(() => {});

    const events: IBenchmarkEvent[] = (
      await Promise.all(
        servants.map((connector) =>
          connector.getDriver().execute({
            count: Math.ceil(props.count / props.threads),
            simultaneous: Math.ceil(props.simultaneous / props.threads),
          }),
        ),
      )
    ).flat();

    completed_at = new Date();
    await Promise.all(servants.map((connector) => connector.close()));
    if (props.progress) props.progress(props.count);

    // Group events by endpoint (method + path) to compute per-endpoint
    // statistics.
    const endpoints: HashMap<IReport.IEndpoint, IBenchmarkEvent[]> =
      new HashMap(
        (key) => hash(key.method, key.path),
        (x, y) => x.method === y.method && x.path === y.path,
      );
    for (const e of events)
      endpoints
        .take(
          {
            method: e.metadata.method,
            path: e.metadata.template ?? e.metadata.path,
          },
          () => [],
        )
        .push(e);
    return {
      count: props.count,
      threads: props.threads,
      simultaneous: props.simultaneous,
      statistics: statistics(events),
      endpoints: [...endpoints].map((it) => ({
        ...statistics(it.second),
        ...it.first,
      })),
      started_at: started_at.toISOString(),
      completed_at: completed_at.toISOString(),
      memories,
    };
  };

  /**
   * Create a servant program.
   *
   * Creates a servant program executing the prefixed functions in parallel.
   *
   * @param props Properties of the servant program
   * @returns Servant program as a worker server
   */
  export const servant = async <Parameters extends any[]>(
    props: IServantProps<Parameters>,
  ): Promise<WorkerServer<null, IBenchmarkServant, IBenchmarkMaster>> => {
    const server: WorkerServer<null, IBenchmarkServant, IBenchmarkMaster> =
      new WorkerServer();
    await server.open({
      execute: execute({
        driver: server.getDriver(),
        props,
      }),
    });
    return server;
  };

  /**
   * Convert the benchmark report to markdown content.
   *
   * @param report Benchmark report
   * @returns Markdown content
   */
  export const markdown = (report: DynamicBenchmarker.IReport): string =>
    DynamicBenchmarkReporter.markdown(report);

  const execute =
    <Parameters extends any[]>(ctx: {
      driver: Driver<IBenchmarkMaster>;
      props: IServantProps<Parameters>;
    }) =>
    async (mass: {
      count: number;
      simultaneous: number;
    }): Promise<IBenchmarkEvent[]> => {
      // Collect every prefixed benchmark function from the target location.
      const functions: IFunction<Parameters>[] = [];
      await iterate({
        collection: functions,
        driver: ctx.driver,
        props: ctx.props,
      })(ctx.props.location);

      // Run `mass.simultaneous` loops in parallel, each picking a random
      // function until the requested count of events has been recorded.
      const entireEvents: IBenchmarkEvent[] = [];
      await Promise.all(
        new Array(mass.simultaneous)
          .fill(null)
          .map(async () => {
            while (entireEvents.length < mass.count) {
              const localEvents: IBenchmarkEvent[] = [];
              const func: IFunction<Parameters> =
                functions[Math.floor(Math.random() * functions.length)]!;
              const connection: IConnection = {
                ...ctx.props.connection,
                logger: async (fe): Promise<void> => {
                  const be: IBenchmarkEvent = {
                    metadata: fe.route,
                    status: fe.status,
                    started_at: fe.started_at.toISOString(),
                    respond_at: fe.respond_at?.toISOString() ?? null,
                    completed_at: fe.completed_at.toISOString(),
                    success: true,
                  };
                  localEvents.push(be);
                  entireEvents.push(be);
                },
              };
              try {
                await func.value(...ctx.props.parameters(connection, func.key));
              } catch {
                for (const e of localEvents)
                  e.success = e.status === 200 || e.status === 201;
              }
              if (localEvents.length !== 0)
                ctx.driver.progress(entireEvents.length).catch(() => {});
            }
          }),
      );
      await ctx.driver.progress(entireEvents.length);
      return entireEvents;
    };
}

interface IFunction<Parameters extends any[]> {
  key: string;
  value: (...args: Parameters) => Promise<void>;
}

const iterate =
  <Parameters extends any[]>(ctx: {
    collection: IFunction<Parameters>[];
    driver: Driver<IBenchmarkMaster>;
    props: DynamicBenchmarker.IServantProps<Parameters>;
  }) =>
  async (path: string): Promise<void> => {
    const directory: string[] = await fs.promises.readdir(path);
    for (const file of directory) {
      const location: string = `${path}/${file}`;
      const stat: fs.Stats = await fs.promises.stat(location);
      if (stat.isDirectory() === true) await iterate(ctx)(location);
      else if (file.endsWith(__filename.substr(-3)) === true) {
        // Only load modules sharing the current file's extension, then
        // register every exported function whose name starts with the
        // configured prefix and passes the master's filter.
        const modulo = await import(location);
        for (const [key, value] of Object.entries(modulo)) {
          if (typeof value !== "function") continue;
          else if (key.startsWith(ctx.props.prefix) === false) continue;
          else if ((await ctx.driver.filter(key)) === false) continue;
          ctx.collection.push({
            key,
            value: value as (...args: Parameters) => Promise<any>,
          });
        }
      }
    }
  };

const statistics = (
  events: IBenchmarkEvent[],
): DynamicBenchmarker.IReport.IStatistics => {
  const successes: IBenchmarkEvent[] = events.filter((event) => event.success);
  return {
    count: events.length,
    success: successes.length,
    ...average(events),
  };
};

const average = (
  events: IBenchmarkEvent[],
): Pick<
  DynamicBenchmarker.IReport.IStatistics,
  "mean" | "stdev" | "minimum" | "maximum"
> => {
  if (events.length === 0)
    return {
      mean: null,
      stdev: null,
      minimum: null,
      maximum: null,
    };
  let mean: number = 0;
  let stdev: number = 0;
  let minimum: number = Number.MAX_SAFE_INTEGER;
  let maximum: number = Number.MIN_SAFE_INTEGER;
  for (const event of events) {
    const elapsed: number =
      new Date(event.completed_at).getTime() -
      new Date(event.started_at).getTime();
    mean += elapsed;
    stdev += elapsed * elapsed;
    minimum = Math.min(minimum, elapsed);
    maximum = Math.max(maximum, elapsed);
  }
  // Standard deviation via E[X^2] - (E[X])^2 over elapsed milliseconds.
  mean /= events.length;
  stdev = Math.sqrt(stdev / events.length - mean * mean);
  return { mean, stdev, minimum, maximum };
};
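The `master` function above splits both `count` and `simultaneous` across the servants with `Math.ceil`, so when the requested totals are not divisible by the thread count, the effective totals may slightly exceed what was requested. A minimal sketch of that division (the `divide` helper name is hypothetical, not part of the library):

```typescript
// Hypothetical helper mirroring the Math.ceil division seen in
// DynamicBenchmarker.master: each servant receives its share of the
// total request count and of the simultaneous-connection budget.
const divide = (props: {
  count: number;
  simultaneous: number;
  threads: number;
}) => ({
  count: Math.ceil(props.count / props.threads),
  simultaneous: Math.ceil(props.simultaneous / props.threads),
});

// 10,000 requests over 4 threads → 2,500 per servant; 32 simultaneous
// connections → 8 per servant. With 3 threads, 10,000 / 3 rounds up to
// 3,334, so the 3 servants may execute up to 10,002 requests in total.
const even = divide({ count: 10_000, simultaneous: 32, threads: 4 });
const uneven = divide({ count: 10_000, simultaneous: 32, threads: 3 });
```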
@@ -1,104 +1,104 @@
import os from "os";

import { DynamicBenchmarker } from "../DynamicBenchmarker";

export namespace DynamicBenchmarkReporter {
  export const markdown = (report: DynamicBenchmarker.IReport): string => {
    const format = (value: number | null) =>
      value === null ? "N/A" : (Math.floor(value * 100) / 100).toLocaleString();
    const head = () =>
      [
        "Type",
        "Count",
        "Success",
        "Mean.",
        "Stdev.",
        "Minimum",
        "Maximum",
      ].join(" | ") +
      "\n" +
      new Array(7).fill("----").join("|");
    const row = (title: string, s: DynamicBenchmarker.IReport.IStatistics) =>
      [
        title,
        s.count.toLocaleString(),
        s.success.toLocaleString(),
        format(s.mean),
        format(s.stdev),
        format(s.minimum),
        format(s.maximum),
      ].join(" | ");
    const line = (
      title: string,
      getter: (m: NodeJS.MemoryUsage) => number,
    ): string =>
      `line "${title}" [${report.memories.map((m) => Math.floor(getter(m.usage) / 1024 ** 2)).join(", ")}]`;

    return [
      `# Benchmark Report`,
      "> Generated by [`@nestia/benchmark`](https://github.com/samchon/nestia)",
      ``,
      ` - Specifications`,
      ` - CPU: ${os.cpus()[0]!.model}`,
      ` - RAM: ${Math.floor(os.totalmem() / 1024 / 1024 / 1024).toLocaleString()} GB`,
      ` - NodeJS Version: ${process.version}`,
      ` - Backend Server: 1 core / 1 thread`,
      ` - Arguments`,
      ` - Count: ${report.count.toLocaleString()}`,
      ` - Threads: ${report.threads.toLocaleString()}`,
      ` - Simultaneous: ${report.simultaneous.toLocaleString()}`,
      ` - Time`,
      ` - Start: ${report.started_at}`,
      ` - Complete: ${report.completed_at}`,
      ` - Elapsed: ${(new Date(report.completed_at).getTime() - new Date(report.started_at).getTime()).toLocaleString()} ms`,
      ``,
      head(),
      row("Total", report.statistics),
      "",
      "> Unit: milliseconds",
      "",
      "## Memory Consumptions",
      "```mermaid",
      "xychart-beta",
      ` x-axis "Time (second)"`,
      ` y-axis "Memory (MB)"`,
      ` ${line("Resident Set Size", (m) => m.rss)}`,
      ` ${line("Heap Total", (m) => m.heapTotal)}`,
      ` ${line("Heap Used + External", (m) => m.heapUsed + m.external)}`,
      ` ${line("Heap Used Only", (m) => m.heapUsed)}`,
      "```",
      "",
      `> - 🟦 Resident Set Size`,
      `> - 🟢 Heap Total`,
      `> - 🔴 Heap Used + External`,
      `> - 🟡 Heap Used Only`,
      "",
      "## Endpoints",
      head(),
      ...report.endpoints
        .slice()
        .sort((a, b) => (b.mean ?? 0) - (a.mean ?? 0))
        .map((endpoint) =>
          row(`${endpoint.method} ${endpoint.path}`, endpoint),
        ),
      "",
      "> Unit: milliseconds",
      "",
      "## Failures",
      "Method | Path | Count | Failures",
      "-------|------|-------|----------",
      ...report.endpoints
        .filter((e) => e.success !== e.count)
        .slice()
        .sort((a, b) => b.count - a.count)
        .map((e) =>
          [
            e.method,
            e.path,
            e.count.toLocaleString(),
            (e.count - e.success).toLocaleString(),
          ].join(" | "),
        ),
    ].join("\n");
  };
}
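The reporter emits its tables as plain pipe-separated markdown rows. The shape produced by its `head` helper can be sketched as follows (a standalone reproduction for illustration, not an import from the package):

```typescript
// Standalone reproduction of the reporter's head() helper: a pipe-separated
// header row followed by a 7-column markdown divider row.
const head: string =
  ["Type", "Count", "Success", "Mean.", "Stdev.", "Minimum", "Maximum"].join(
    " | ",
  ) +
  "\n" +
  new Array(7).fill("----").join("|");

const [title, divider] = head.split("\n");
```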