binary-packet 1.2.0 → 1.2.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE CHANGED
@@ -1,61 +1,61 @@
1
- Apache License, Version 2.0 Apache License Version 2.0, January 2004 http://www.apache.org/licenses/
2
-
3
- TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
4
-
5
- 1. Definitions.
6
-
7
- "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
8
-
9
- "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
10
-
11
- "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
12
-
13
- "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.
14
-
15
- "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
16
-
17
- "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
18
-
19
- "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
20
-
21
- "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
22
-
23
- "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."
24
-
25
- "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
26
-
27
- 2. Grant of Copyright License.
28
-
29
- Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
30
-
31
- 3. Grant of Patent License.
32
-
33
- Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
34
-
35
- 4. Redistribution.
36
-
37
- You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
38
-
39
- You must give any other recipients of the Work or Derivative Works a copy of this License; and You must cause any modified files to carry prominent notices stating that You changed the files; and You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
40
-
41
- 5. Submission of Contributions.
42
-
43
- Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
44
-
45
- 6. Trademarks.
46
-
47
- This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
48
-
49
- 7. Disclaimer of Warranty.
50
-
51
- Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
52
-
53
- 8. Limitation of Liability.
54
-
55
- In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
56
-
57
- 9. Accepting Warranty or Additional Liability.
58
-
59
- While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
60
-
61
- END OF TERMS AND CONDITIONS
1
+ Apache License, Version 2.0 Apache License Version 2.0, January 2004 http://www.apache.org/licenses/
2
+
3
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
4
+
5
+ 1. Definitions.
6
+
7
+ "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
8
+
9
+ "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
10
+
11
+ "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
12
+
13
+ "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.
14
+
15
+ "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
16
+
17
+ "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
18
+
19
+ "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
20
+
21
+ "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
22
+
23
+ "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."
24
+
25
+ "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
26
+
27
+ 2. Grant of Copyright License.
28
+
29
+ Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
30
+
31
+ 3. Grant of Patent License.
32
+
33
+ Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
34
+
35
+ 4. Redistribution.
36
+
37
+ You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
38
+
39
+ You must give any other recipients of the Work or Derivative Works a copy of this License; and You must cause any modified files to carry prominent notices stating that You changed the files; and You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
40
+
41
+ 5. Submission of Contributions.
42
+
43
+ Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
44
+
45
+ 6. Trademarks.
46
+
47
+ This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
48
+
49
+ 7. Disclaimer of Warranty.
50
+
51
+ Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
52
+
53
+ 8. Limitation of Liability.
54
+
55
+ In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
56
+
57
+ 9. Accepting Warranty or Additional Liability.
58
+
59
+ While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
60
+
61
+ END OF TERMS AND CONDITIONS
package/README.md CHANGED
@@ -1,180 +1,211 @@
1
- # binary-packet
2
-
3
- Lightweight, hyper-fast, zero-dependency, TypeScript-first, schema-based binary packet serialization and deserialization library. \
4
- Originally made for ultrafast WebSocket communication with user-defined, type-safe messages between client and server, using the smallest number of bytes possible.
5
-
6
- Supports serializing into and deserializing from [**DataView**](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/DataView)s, [**ArrayBuffer**](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer)s and [**Buffer**](https://nodejs.org/api/buffer.html#buffer)s (NodeJS/Bun only). \
7
- To achieve maximum performance, it is always advised to use Node Buffers when available.
8
-
9
- ## Installation
10
-
11
- Node: \
12
- `npm install binary-packet`
13
-
14
- Bun: \
15
- `bun add binary-packet`
16
-
17
- ## Features & Specification
18
-
19
- Define the structure of the packets through unique Packet IDs and "schema" objects. \
20
- This "schema" object is simply called `Definition` and defines the shape of a packet: specifically its `fields` and their `types`.
21
-
22
- ### Fields / Data types
23
-
24
- Currently, these kinds of `fields` are supported:
25
- | Type | Description | Values | Size (bytes) |
26
- |------|-------------|--------------|--------------|
27
- | `Field.UNSIGNED_INT_8` | 8-bit unsigned integer | 0 - 255 | 1 |
28
- | `Field.UNSIGNED_INT_16` | 16-bit unsigned integer | 0 - 65535 | 2 |
29
- | `Field.UNSIGNED_INT_32` | 32-bit unsigned integer | 0 - 4294967295 | 4 |
30
- | `Field.INT_8` | 8-bit signed integer | -128 - 127 | 1 |
31
- | `Field.INT_16` | 16-bit signed integer | -32768 - 32767 | 2 |
32
- | `Field.INT_32` | 32-bit signed integer | -2147483648 - 2147483647 | 4 |
33
- | `Field.FLOAT_32` | 32-bit IEEE 754 floating-point | | 4 |
34
- | `Field.FLOAT_64` | 64-bit IEEE 754 floating-point | | 8 |
35
- | `BinaryPacket` | BinaryPacket "subpacket" | BinaryPacket | size(BinaryPacket) |
36
- | `FieldString` | String of **ASCII** or **single-octet UTF-8** characters | Up to 65536 chars | 2 + length |
37
- | `FieldArray` | Dynamically-sized array of one of the types above | Up to 256 elements | 1 + length \* size(Element) |
38
- | `FieldFixedArray` | Statically-sized array of one of the types above | Any pre-defined number of elements | length \* size(Element) |
39
- | `FieldBitFlags` | Boolean flags packed into a single 8-bit integer | Up to 8 boolean flags | 1 |
40
- | `FieldOptional` | Optional BinaryPacket "subpacket" | BinaryPacket \| undefined | 1 + size(BinaryPacket) |
41
-
42
- As shown, both arrays and nested objects ("subpackets") are supported. \
43
- Note: `FieldFixedArray` is much more memory-efficient and performant than `FieldArray`, but requires a pre-defined length.
44
-
45
- ### Pattern matching
46
-
47
- The library exposes an easy way to "pattern match" packets of a **yet unknown type** in a type-safe manner through a `visitor` pattern. \
48
- For an example, search for "**pattern matching**" in the examples below.
49
-
50
- ## Usage Examples
51
-
52
- ### Example: (incomplete) definition of a simplistic board game
53
-
54
- ```typescript
55
- import { BinaryPacket, Field, FieldArray } from 'binary-packet'
56
-
57
- // Suppose we have a game board where each cell is a square and is one unit big.
58
- // A cell can then be defined by its X and Y coordinates.
59
- // For simplicity, let's say coordinates never exceed 255, so we can use 8 bits for each coordinate.
60
- const Cell = {
61
- x: Field.UNSIGNED_INT_8,
62
- y: Field.UNSIGNED_INT_8
63
- }
64
-
65
- // When done with the cell definition we can create its BinaryPacket writer/reader.
66
- // NOTE: each BinaryPacket needs a unique ID, for identification purposes and error checking.
67
- const CellPacket = BinaryPacket.define(0, Cell)
68
-
69
- // Let's now make the definition of the whole game board.
70
- // You can also specify arrays of both "primitive" fields and other BinaryPackets.
71
- const Board = {
72
- numPlayers: Field.UNSIGNED_INT_8,
73
- otherStuff: Field.INT_32,
74
- cells: FieldArray(CellPacket)
75
- }
76
-
77
- // When done with the board definition we can create its BinaryPacket writer/reader.
78
- // NOTE: each BinaryPacket needs a unique ID, for identification purposes and error checking.
79
- const BoardPacket = BinaryPacket.define(1, Board)
80
-
81
- //////////////////
82
- // WRITING SIDE //
83
- //////////////////
84
- const buffer = BoardPacket.writeNodeBuffer({
85
- numPlayers: 1,
86
- otherStuff: 69420,
87
- cells: [
88
- { x: 0, y: 0 },
89
- { x: 1, y: 1 }
90
- ]
91
- })
92
-
93
- // ...
94
- // sendTheBufferOverTheNetwork(buffer)
95
- // ...
96
-
97
- //////////////////
98
- // READING SIDE //
99
- //////////////////
100
- import assert from 'assert'
101
-
102
- // ...
103
- // const buffer = receiveTheBufferFromTheNetwork()
104
- // ...
105
-
106
- const board = BoardPacket.readNodeBuffer(buffer)
107
-
108
- assert(board.numPlayers === 1)
109
- assert(board.otherStuff === 69420)
110
- assert(board.cells.length === 2)
111
- assert(board.cells[0].x === 0)
112
- assert(board.cells[0].y === 0)
113
- assert(board.cells[1].x === 1)
114
- assert(board.cells[1].y === 1)
115
- ```
116
-
117
- ### Example: pattern matching
118
-
119
- ```typescript
120
- import assert from 'assert/strict'
121
- import { BinaryPacket, Field } from 'binary-packet'
122
-
123
- // Packet A definition
124
- const A = BinaryPacket.define(1)
125
-
126
- // Packet B definition: This is the kind of packet we care about in this example!
127
- const B = BinaryPacket.define(2, { data: Field.UNSIGNED_INT_8 })
128
-
129
- // Packet C definition
130
- const C = BinaryPacket.define(3)
131
-
132
- // Assume the following packet comes from the network or, for some other reason, is a buffer we do not know anything about.
133
- const buffer = B.writeNodeBuffer({ data: 255 })
134
-
135
- BinaryPacket.visitNodeBuffer(
136
- buffer,
137
-
138
- A.visitor(() => assert(false, 'Erroneously accepted visitor A')),
139
-
140
- B.visitor(packet => {
141
- // Do something with the packet
142
- assert.equal(packet.data, 255)
143
- console.log('Accepted visitor B:', packet)
144
- }),
145
-
146
- C.visitor(() => assert(false, 'Erroneously accepted visitor C'))
147
- )
148
- ```
149
-
150
- ## Benchmarks & Alternatives
151
-
152
- Benchmarks are not always meant to be taken seriously. \
153
- Most of the time, the results of a benchmark do not actually show the full capabilities of each library. \
154
- So, take these "performance" comparisons with a grain of salt; or, even better, do your own benchmarks with the actual data you need to serialize/deserialize.
155
-
156
- This library has been benchmarked against the following alternatives:
157
-
158
- [msgpackr](https://www.npmjs.com/package/msgpackr) - A very popular, fast and battle-tested library. It currently offers more features than binary-packet, but in these benchmarks it consistently appears slower, and it is also less type-safe.
159
- [restructure](https://www.npmjs.com/package/restructure) - An older, popular schema-based library with some extra features like LazyArrays, but it is **much slower** than both binary-packet and msgpackr and, sadly, it easily crashes with complex structures.
160
-
161
- The benchmarks are executed on three different kinds of packets:
162
-
163
- - EmptyPacket: basically an empty javascript object.
164
- - SimplePacket: objects with just primitive fields, statically-sized arrays and a string.
165
- - ComplexPacket: objects with primitives, statically-sized arrays, dynamically-sized arrays, bitflags, a string, an array of strings and other nested objects/arrays.
166
-
167
- You can see and run the benchmarks yourself if you:
168
-
169
- - Clone the [repository](https://github.com/silence-cloud-com/binary-packet).
170
- - Launch `npm run benchmark`.
171
-
172
- ## Disclaimer
173
-
174
- This library is still very new: it has not yet been thoroughly "battle-tested" in production and may still be missing important features. \
175
- If you plan on serializing highly sensitive data or need to guarantee no crashes, use an alternative like [msgpackr](https://www.npmjs.com/package/msgpackr) until this library becomes 100% production-ready.
176
-
177
- ## Contribute
178
-
179
- Would you like more complex, but still hyper-fast and memory-efficient, features? \
180
- [Contribute on GitHub](https://github.com/silence-cloud-com/binary-packet) yourself or, alternatively, [buy me a coffee](https://buymeacoffee.com/silence.cloud)!
1
+ # binary-packet
2
+
3
+ Lightweight, hyper-fast, zero-dependency, TypeScript-first, schema-based binary packet serialization and deserialization library. \
4
+ Originally made for ultrafast WebSocket communication with user-defined, type-safe messages between client and server, using the smallest number of bytes possible.
5
+
6
+ Supports serializing into and deserializing from [**DataView**](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/DataView)s, [**ArrayBuffer**](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer)s and [**Buffer**](https://nodejs.org/api/buffer.html#buffer)s (NodeJS/Bun only). \
7
+ To achieve maximum performance, it is always advised to use Node Buffers when available.
8
+
9
+ ## Installation
10
+
11
+ Node: \
12
+ `npm install binary-packet`
13
+
14
+ Bun: \
15
+ `bun add binary-packet`
16
+
17
+ ## Features & Specification
18
+
19
+ Define the structure of the packets through unique Packet IDs and "schema" objects. \
20
+ This "schema" object is simply called `Definition` and defines the shape of a packet: specifically its `fields` and their `types`.
21
+
22
+ ### Fields / Data types
23
+
24
+ Currently, these kinds of `fields` are supported:
25
+ | Type | Description | Values | Size (bytes) |
26
+ |------|-------------|--------------|--------------|
27
+ | `Field.UNSIGNED_INT_8` | 8-bit unsigned integer | 0 - 255 | 1 |
28
+ | `Field.UNSIGNED_INT_16` | 16-bit unsigned integer | 0 - 65535 | 2 |
29
+ | `Field.UNSIGNED_INT_32` | 32-bit unsigned integer | 0 - 4294967295 | 4 |
30
+ | `Field.INT_8` | 8-bit signed integer | -128 - 127 | 1 |
31
+ | `Field.INT_16` | 16-bit signed integer | -32768 - 32767 | 2 |
32
+ | `Field.INT_32` | 32-bit signed integer | -2147483648 - 2147483647 | 4 |
33
+ | `Field.FLOAT_32` | 32-bit IEEE 754 floating-point | | 4 |
34
+ | `Field.FLOAT_64` | 64-bit IEEE 754 floating-point | | 8 |
35
+ | `BinaryPacket` | BinaryPacket "subpacket" | BinaryPacket | size(BinaryPacket) |
36
+ | `FieldString` | String of **ASCII** or **single-octet UTF-8** characters | Up to 65536 chars | 2 + length |
37
+ | `FieldArray` | Dynamically-sized array of one of the types above | Up to 256 elements | 1 + length \* size(Element) |
38
+ | `FieldFixedArray` | Statically-sized array of one of the types above | Any pre-defined number of elements | length \* size(Element) |
39
+ | `FieldBitFlags` | Boolean flags packed into a single 8-bit integer | Up to 8 boolean flags | 1 |
40
+ | `FieldOptional` | Optional BinaryPacket "subpacket" | BinaryPacket \| undefined | 1 + size(BinaryPacket) |
41
+
42
+ As shown, both arrays and nested objects ("subpackets") are supported. \
43
+ Note: `FieldFixedArray` is much more memory-efficient and performant than `FieldArray`, but requires a pre-defined length.
44
+
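To make the table above concrete, here is a minimal sketch combining several of the field kinds (`FieldFixedArray`, `FieldBitFlags`, `FieldString`, `FieldOptional`). The packet names and IDs below are made up for illustration and are not part of the package's own examples:

```typescript
import {
  BinaryPacket,
  Field,
  FieldBitFlags,
  FieldFixedArray,
  FieldOptional,
  FieldString
} from 'binary-packet'

// Hypothetical "stats" subpacket: three fixed 32-bit counters (no length byte needed).
const StatsPacket = BinaryPacket.define(10, {
  counters: FieldFixedArray(Field.UNSIGNED_INT_32, 3)
})

// Hypothetical "player" packet: a string, up to 8 packed boolean flags, and an optional subpacket.
const PlayerPacket = BinaryPacket.define(11, {
  name: FieldString(),                      // 2 + length bytes
  flags: FieldBitFlags(['alive', 'admin']), // 1 byte total for all flags
  stats: FieldOptional(StatsPacket)         // 1 byte + size(StatsPacket) when present
})

const buffer = PlayerPacket.writeNodeBuffer({
  name: 'ada',
  flags: { alive: true, admin: false },
  stats: { counters: [1, 2, 3] }
})

const player = PlayerPacket.readNodeBuffer(buffer)
// player.name === 'ada', player.flags.alive === true, player.stats?.counters.length === 3
```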
45
+ ### Pattern matching
46
+
47
+ The library exposes an easy way to "pattern match" packets of a **yet unknown type** in a type-safe manner through a `visitor` pattern. \
48
+ For an example, search for "**pattern matching**" in the examples below.
49
+
50
+ ## Usage Examples
51
+
52
+ ### Example: (incomplete) definition of a simplistic board game
53
+
54
+ ```typescript
55
+ import { BinaryPacket, Field, FieldArray } from 'binary-packet'
56
+
57
+ // Suppose we have a game board where each cell is a square and is one unit big.
58
+ // A cell can then be defined by its X and Y coordinates.
59
+ // For simplicity, let's say coordinates never exceed 255, so we can use 8 bits for each coordinate.
60
+ const Cell = {
61
+ x: Field.UNSIGNED_INT_8,
62
+ y: Field.UNSIGNED_INT_8
63
+ }
64
+
65
+ // When done with the cell definition we can create its BinaryPacket writer/reader.
66
+ // NOTE: each BinaryPacket needs a unique ID, for identification purposes and error checking.
67
+ const CellPacket = BinaryPacket.define(0, Cell)
68
+
69
+ // Let's now make the definition of the whole game board.
70
+ // You can also specify arrays of both "primitive" fields and other BinaryPackets.
71
+ const Board = {
72
+ numPlayers: Field.UNSIGNED_INT_8,
73
+ otherStuff: Field.INT_32,
74
+ cells: FieldArray(CellPacket)
75
+ }
76
+
77
+ // When done with the board definition we can create its BinaryPacket writer/reader.
78
+ // NOTE: each BinaryPacket needs a unique ID, for identification purposes and error checking.
79
+ const BoardPacket = BinaryPacket.define(1, Board)
80
+
81
+ //////////////////
82
+ // WRITING SIDE //
83
+ //////////////////
84
+ const buffer = BoardPacket.writeNodeBuffer({
85
+ numPlayers: 1,
86
+ otherStuff: 69420,
87
+ cells: [
88
+ { x: 0, y: 0 },
89
+ { x: 1, y: 1 }
90
+ ]
91
+ })
92
+
93
+ // ...
94
+ // sendTheBufferOverTheNetwork(buffer)
95
+ // ...
96
+
97
+ //////////////////
98
+ // READING SIDE //
99
+ //////////////////
100
+ import assert from 'assert'
101
+
102
+ // ...
103
+ // const buffer = receiveTheBufferFromTheNetwork()
104
+ // ...
105
+
106
+ const board = BoardPacket.readNodeBuffer(buffer)
107
+
108
+ assert(board.numPlayers === 1)
109
+ assert(board.otherStuff === 69420)
110
+ assert(board.cells.length === 2)
111
+ assert(board.cells[0].x === 0)
112
+ assert(board.cells[0].y === 0)
113
+ assert(board.cells[1].x === 1)
114
+ assert(board.cells[1].y === 1)
115
+ ```
116
+
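The example above uses Node Buffers; the DataView path mentioned in the introduction works the same way. A minimal sketch, assuming a made-up `PingPacket` definition (the WebSocket call in the comment is only illustrative):

```typescript
import { BinaryPacket, Field } from 'binary-packet'

// Hypothetical packet used only to illustrate the DataView path.
const PingPacket = BinaryPacket.define(42, { timestamp: Field.FLOAT_64 })

// Serialize into a DataView instead of a Node Buffer (works in browsers too).
const view = PingPacket.writeDataView({ timestamp: Date.now() })

// e.g. webSocket.send(view) -- a DataView is a valid binary WebSocket payload.

// Deserialize on the receiving side.
const ping = PingPacket.readDataView(view)
console.log(ping.timestamp)
```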
117
+ ### Example: pattern matching
118
+
119
+ ```typescript
120
+ import assert from 'assert/strict'
121
+ import { BinaryPacket, Field } from 'binary-packet'
122
+
123
+ // Packet A definition
124
+ const A = BinaryPacket.define(1)
125
+
126
+ // Packet B definition: This is the kind of packet we care about in this example!
127
+ const B = BinaryPacket.define(2, { data: Field.UNSIGNED_INT_8 })
128
+
129
+ // Packet C definition
130
+ const C = BinaryPacket.define(3)
131
+
132
+ // Assume the following packet comes from the network or, for some other reason, is a buffer we do not know anything about.
133
+ const buffer = B.writeNodeBuffer({ data: 255 })
134
+
135
+ BinaryPacket.visitNodeBuffer(
136
+ buffer,
137
+
138
+ A.visitor(() => assert(false, 'Erroneously accepted visitor A')),
139
+
140
+ B.visitor(packet => {
141
+ // Do something with the packet
142
+ assert.equal(packet.data, 255)
143
+ console.log('Accepted visitor B:', packet)
144
+ }),
145
+
146
+ C.visitor(() => assert(false, 'Erroneously accepted visitor C'))
147
+ )
148
+ ```
149
+
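If you prefer to branch on the packet type yourself instead of using visitors, the static `readPacketIdNodeBuffer` helper peeks at the ID byte without deserializing the whole packet. A sketch with made-up `Ping`/`Pong` definitions:

```typescript
import { BinaryPacket, Field } from 'binary-packet'

// Made-up packet definitions; the case labels below must match the IDs passed to define().
const Ping = BinaryPacket.define(1, { time: Field.FLOAT_64 })
const Pong = BinaryPacket.define(2, { time: Field.FLOAT_64 })

const buffer = Pong.writeNodeBuffer({ time: Date.now() })

// Peek at the packet ID (the first byte) and dispatch manually.
switch (BinaryPacket.readPacketIdNodeBuffer(buffer)) {
  case 1:
    console.log('ping', Ping.readNodeBuffer(buffer))
    break
  case 2:
    console.log('pong', Pong.readNodeBuffer(buffer))
    break
}
```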
150
+ ## Sequential Serializer
151
+
152
+ This library also provides an opinionated way to serialize any JavaScript Iterable, as long as its size is known beforehand. \
153
+ This is especially convenient when dealing with such iterables because it allows serializing all the data without having to allocate temporary arrays, which was necessary in previous versions and still is in many other similar libraries.
154
+
155
+ Usage:
156
+
157
+ ```typescript
158
+ // JavaScript Maps are Iterable; this would also work with generators and custom classes that correctly implement the Iterable interface.
159
+ import { BinaryPacket, Field, FieldArray, SequentialSerializer } from 'binary-packet'
160
+
161
+ const Packet = BinaryPacket.define(0, {
162
+ numbers: FieldArray(Field.INT_32)
163
+ })
164
+
165
+ const map = new Map([
166
+ [1, 2],
167
+ [2, 4],
168
+ [3, 6]
169
+ ])
170
+
171
+ // Example serializer that serializes only the values of a map, without the overhead of intermediate arrays.
172
+ const serializer = new SequentialSerializer(map.values(), map.size)
173
+
174
+ const buffer = Packet.writeNodeBuffer({
175
+ numbers: serializer
176
+ })
177
+ ```
178
+
179
+ Note: if an array is already available, just use that instead. The SequentialSerializer is meant for more complex iterables.
180
+
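Since any Iterable with a known length works, a generator can feed the serializer as well. A sketch (the `squares` generator and packet definition are made up for illustration):

```typescript
import { BinaryPacket, Field, FieldArray, SequentialSerializer } from 'binary-packet'

const SquaresPacket = BinaryPacket.define(5, {
  squares: FieldArray(Field.UNSIGNED_INT_16)
})

// A generator producing the first `count` squares, without ever building an array.
function* squares(count: number) {
  for (let i = 0; i < count; ++i) yield i * i
}

const count = 10
const buffer = SquaresPacket.writeNodeBuffer({
  // The element count must be known up front and passed explicitly.
  squares: new SequentialSerializer(squares(count), count)
})

// Reading always yields a plain array, never a SequentialSerializer.
const decoded = SquaresPacket.readNodeBuffer(buffer)
console.log(decoded.squares) // [0, 1, 4, 9, ..., 81]
```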
181
+ ## Benchmarks & Alternatives
182
+
183
+ Benchmarks are not always meant to be taken seriously. \
184
+ Most of the time, the results of a benchmark do not actually show the full capabilities of each library. \
185
+ So, take these "performance" comparisons with a grain of salt; or, even better, do your own benchmarks with the actual data you need to serialize/deserialize.
186
+
187
+ This library has been benchmarked against the following alternatives:
188
+
189
+ - [msgpackr](https://www.npmjs.com/package/msgpackr) - A very popular, fast and battle-tested library. It currently offers more features than binary-packet, but in these benchmarks it consistently appears slower, and it is also less type-safe.
190
+ - [restructure](https://www.npmjs.com/package/restructure) - An older, popular schema-based library with some extra features like LazyArrays, but it is **much slower** than both binary-packet and msgpackr and, sadly, it easily crashes with complex structures.
191
+
192
+ The benchmarks are executed on three different kinds of packets:
193
+
194
+ - EmptyPacket: basically an empty javascript object.
195
+ - SimplePacket: objects with just primitive fields, statically-sized arrays and a string.
196
+ - ComplexPacket: objects with primitives, statically-sized arrays, dynamically-sized arrays, bitflags, a string, an array of strings and other nested objects/arrays.
197
+
198
+ You can see and run the benchmarks yourself if you:
199
+
200
+ - Clone the [repository](https://github.com/silence-cloud-com/binary-packet).
201
+ - Launch `npm run benchmark`.
202
+
203
+ ## Disclaimer
204
+
205
+ This library is still very new: it has not yet been thoroughly "battle-tested" in production and may still be missing important features. \
206
+ If you plan on serializing highly sensitive data or need to guarantee no crashes, use an alternative like [msgpackr](https://www.npmjs.com/package/msgpackr) until this library becomes 100% production-ready.
207
+
208
+ ## Contribute
209
+
210
+ Would you like more complex, but still hyper-fast and memory-efficient, features? \
211
+ [Contribute on GitHub](https://github.com/silence-cloud-com/binary-packet) yourself or, alternatively, [buy me a coffee](https://buymeacoffee.com/silence.cloud)!
package/dist/index.d.mts CHANGED
@@ -83,6 +83,12 @@ declare class SequentialSerializer<T> implements Iterable<T> {
83
83
  constructor(iterable: Iterable<T>, length: number);
84
84
  [Symbol.iterator](): Iterator<T, any, any>;
85
85
  }
86
+ /**
87
+ * Either an array or a SequentialSerializer<T>.
88
+ *
89
+ * Note: when a packet is **read**, it will **always** be a standard array: the SequentialSerializer \
90
+ * is just a utility to serialize iterators while avoiding data duplication and array-creation overheads.
91
+ */
86
92
  type SequentiallySerializable<T, IsRead extends boolean> = IsRead extends true ? T[] : T[] | SequentialSerializer<T>;
87
93
  type BitFlags = (string[] | ReadonlyArray<string>) & {
88
94
  length: 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8;
@@ -207,11 +213,11 @@ declare class BinaryPacket<T extends Definition> {
207
213
  *
208
214
  * If possible, always prefer writing using this method, as it is much faster than the other ones.
209
215
  */
210
- writeNodeBuffer(dataOut: ToJson<T>): Buffer;
216
+ writeNodeBuffer(dataOut: ToJson<T>): Buffer<ArrayBuffer>;
211
217
  /**
212
218
  * Writes/serializes the given object into a DataView. \
213
219
  */
214
- writeDataView(dataOut: ToJson<T>): DataView;
220
+ writeDataView(dataOut: ToJson<T>): DataView<ArrayBuffer>;
215
221
  /**
216
222
  * Writes/serializes the given object into an ArrayBuffer. \
217
223
  * This method is just a wrapper around either `writeNodeBuffer` or `writeDataView`. \
@@ -223,7 +229,7 @@ declare class BinaryPacket<T extends Definition> {
223
229
  * For more information read the `readArrayBuffer` documentation.
224
230
  */
225
231
  writeArrayBuffer(dataOut: ToJson<T>): {
226
- buffer: ArrayBufferLike;
232
+ buffer: ArrayBuffer;
227
233
  byteLength: number;
228
234
  byteOffset: number;
229
235
  };
package/dist/index.d.ts CHANGED
@@ -83,6 +83,12 @@ declare class SequentialSerializer<T> implements Iterable<T> {
83
83
  constructor(iterable: Iterable<T>, length: number);
84
84
  [Symbol.iterator](): Iterator<T, any, any>;
85
85
  }
86
+ /**
87
+ * Either an array or a SequentialSerializer<T>.
88
+ *
89
+ * Note: when a packet is **read**, it will **always** be a standard array: the SequentialSerializer \
90
+ * is just a utility to serialize iterators while avoiding data duplication and array-creation overheads.
91
+ */
86
92
  type SequentiallySerializable<T, IsRead extends boolean> = IsRead extends true ? T[] : T[] | SequentialSerializer<T>;
87
93
  type BitFlags = (string[] | ReadonlyArray<string>) & {
88
94
  length: 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8;
@@ -207,11 +213,11 @@ declare class BinaryPacket<T extends Definition> {
207
213
  *
208
214
  * If possible, always prefer writing using this method, as it is much faster than the other ones.
209
215
  */
210
- writeNodeBuffer(dataOut: ToJson<T>): Buffer;
216
+ writeNodeBuffer(dataOut: ToJson<T>): Buffer<ArrayBuffer>;
211
217
  /**
212
218
  * Writes/serializes the given object into a DataView. \
213
219
  */
214
- writeDataView(dataOut: ToJson<T>): DataView;
220
+ writeDataView(dataOut: ToJson<T>): DataView<ArrayBuffer>;
215
221
  /**
216
222
  * Writes/serializes the given object into an ArrayBuffer. \
217
223
  * This method is just a wrapper around either `writeNodeBuffer` or `writeDataView`. \
@@ -223,7 +229,7 @@ declare class BinaryPacket<T extends Definition> {
223
229
  * For more information read the `readArrayBuffer` documentation.
224
230
  */
225
231
  writeArrayBuffer(dataOut: ToJson<T>): {
226
- buffer: ArrayBufferLike;
232
+ buffer: ArrayBuffer;
227
233
  byteLength: number;
228
234
  byteOffset: number;
229
235
  };
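For context on the narrowed `ArrayBuffer` return type above: `writeArrayBuffer` hands back the raw buffer together with the slice it actually used, and `readArrayBuffer` requires that offset and length explicitly. A minimal round-trip sketch (the `Position` packet is made up for illustration):

```typescript
import { BinaryPacket, Field } from 'binary-packet'

const Position = BinaryPacket.define(7, { x: Field.FLOAT_32, y: Field.FLOAT_32 })

// The underlying buffer may be pooled and not start at offset 0,
// so the byteOffset/byteLength pair must travel along with it.
const { buffer, byteOffset, byteLength } = Position.writeArrayBuffer({ x: 1.5, y: -2.5 })

// readArrayBuffer therefore takes both values explicitly instead of defaulting them.
const position = Position.readArrayBuffer(buffer, byteOffset, byteLength)
console.log(position.x, position.y) // 1.5 -2.5
```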
package/dist/index.js.map CHANGED
@@ -1 +1 @@
1
- {"version":3,"sources":["../src/index.ts","../src/buffers.ts"],"sourcesContent":["import {\n decodeStringFromDataView,\n decodeStringFromNodeBuffer,\n encodeStringIntoDataView,\n encodeStringIntoNodeBuffer,\n growDataView,\n growNodeBuffer,\n hasNodeBuffers,\n type TrueArrayBuffer\n} from './buffers'\n\nexport const enum Field {\n /**\n * Defines a 1 byte (8 bits) unsigned integer field. \\\n * (Range: 0 - 255)\n */\n UNSIGNED_INT_8 = 0,\n\n /**\n * Defines a 2 bytes (16 bits) unsigned integer field. \\\n * (Range: 0 - 65535)\n */\n UNSIGNED_INT_16,\n\n /**\n * Defines a 4 bytes (32 bits) unsigned integer field. \\\n * (Range: 0 - 4294967295)\n */\n UNSIGNED_INT_32,\n\n /**\n * Defines a 1 byte (8 bits) signed integer field. \\\n * (Range: -128 - 127)\n */\n INT_8,\n\n /**\n * Defines a 2 bytes (16 bits) signed integer field. \\\n * (Range: -32768 - 32767)\n */\n INT_16,\n\n /**\n * Defines a 4 bytes (32 bits) signed integer field. \\\n * (Range: -2147483648 - 2147483647)\n */\n INT_32,\n\n /**\n * Defines a 4 bytes (32 bits) floating-point field. \\\n */\n FLOAT_32,\n\n /**\n * Defines a 8 bytes (64 bits) floating-point field. \\\n */\n FLOAT_64\n}\n\n/**\n * Defines a dynamically-sized array with elements of a certain type. \\\n * Dynamically-sized arrays are useful when a packet's field is an array of a non pre-defined length. \\\n * Although, this makes dynamically-sized arrays more memory expensive as the internal buffer needs to be grown accordingly.\n *\n * NOTE: If an array will ALWAYS have the same length, prefer using the `FieldFixedArray` type, for both better performance and memory efficiency. \\\n * NOTE: As of now, dynamic arrays can have at most 256 elements.\n */\nexport function FieldArray<T extends Field | BinaryPacket<Definition> | ''>(\n item: T\n): [itemType: T] {\n return [item]\n}\n\n/**\n * Defines a statically-sized array with elements of a certain type. \\\n * Fixed arrays are useful when a packet's field is an array of a pre-defined length. \\\n * Fixed arrays much more memory efficient and performant than non-fixed ones.\n *\n * NOTE: If an array will not always have the same length, use the `FieldArray` type.\n */\nexport function FieldFixedArray<\n T extends Field | BinaryPacket<Definition> | '',\n Length extends number\n>(item: T, length: Length): [itemType: T, length: Length] {\n if (length < 0 || !Number.isFinite(length)) {\n throw new RangeError('Length of a FixedArray must be a positive integer.')\n }\n\n return [item, length]\n}\n\n/**\n * Utility class that allows serializing arrays through any kind of iterable, as long as the number of elements is known beforehand. \\\n * Needed to skip the overhead of duplicating the data into an actual array just for it to be serialized straight away and trashed.\n */\nexport class SequentialSerializer<T> implements Iterable<T> {\n constructor(\n private readonly iterable: Iterable<T>,\n public readonly length: number\n ) {}\n\n [Symbol.iterator]() {\n return this.iterable[Symbol.iterator]()\n }\n}\n\ntype SequentiallySerializable<T, IsRead extends boolean> = IsRead extends true\n ? T[]\n : T[] | SequentialSerializer<T>\n\ntype BitFlags = (string[] | ReadonlyArray<string>) & {\n length: 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8\n}\n\n/**\n * Defines a sequence of up to 8 \"flags\" (basically single bits/booleans) that can be packed together into a single 8 bits value. 
\\\n * This is useful for minimizing bytes usage when there are lots of boolean fields/flags, instead of saving each flag separately as its own 8 bits value.\n *\n * The input should be an array of strings (with at most 8 elements) where each string defines the name of a flag. \\\n * This is just for definition purposes, then when actually writing or reading packets it'll just be a record-object with those names as keys and boolean values.\n */\nexport function FieldBitFlags<const FlagsArray extends BitFlags>(flags: FlagsArray) {\n if (flags.length > 8) {\n throw new Error(\n `Invalid BinaryPacket definition: a BitFlags field can have only up to 8 flags, given: ${flags.join(', ')}`\n )\n }\n\n return { flags }\n}\n\n/**\n * Defines a string field. \\\n * Strings cannot be more than 65536 characters long.\n *\n * NOTE: Only strings containing just ASCII and/or single-octet UTF-8 characters are supported.\n */\nexport function FieldString() {\n return '' as const\n}\n\n/**\n * Defines an optional BinaryPacket \"subpacket\" field. \\\n * When writing and reading packets it'll be possible to provide and receive `undefined` instead of a valid object.\n */\nexport function FieldOptional<T extends BinaryPacket<Definition>>(packet: T) {\n return { optional: packet }\n}\n\n/**\n * Do not manually construct this type: an object of this kind is returned by a BinaryPacket `createVisitor` method. \\\n * Used in the `BinaryPacket::visit` static method to perform a sort of \"pattern matching\" on an incoming packet (of yet unknown type) buffer.\n */\ntype Visitor = [BinaryPacket<Definition>, (packet: any) => void]\n\nexport class BinaryPacket<T extends Definition> {\n /**\n * Defines a new binary packet. \\\n * Make sure that every `packetId` is unique.\n * @throws RangeError If packetId is negative, floating-point, or greater than 255.\n */\n static define<T extends Definition>(packetId: number, definition?: T) {\n if (packetId < 0 || !Number.isFinite(packetId)) {\n throw new RangeError('Packet IDs must be positive integers.')\n }\n\n if (packetId > 255) {\n throw new RangeError(\n 'Packet IDs greater than 255 are not supported. Do you REALLY need more than 255 different kinds of packets?'\n )\n }\n\n return new BinaryPacket(packetId, definition)\n }\n\n /**\n * Reads just the packetId from the given Buffer. \\\n * This method practically just reads the uint8 at offset `byteOffset` (default: 0). \\\n * Useful if the receiving side receives multiple types of packets.\n */\n static readPacketIdNodeBuffer(buffer: Buffer, byteOffset = 0) {\n return buffer.readUint8(byteOffset)\n }\n\n /**\n * Reads just the packetId from the given DataView. \\\n * This method practically just reads the uint8 at offset `byteOffset` (default: 0). \\\n * Useful if the receiving side receives multiple types of packets.\n */\n static readPacketIdDataView(dataview: DataView, byteOffset = 0) {\n return dataview.getUint8(byteOffset)\n }\n\n /**\n * Reads just the packetId from the given ArrayBuffer. \\\n * This method practically just reads the uint8 at offset `byteOffset`. \\\n * Useful if the receiving side receives multiple types of packets.\n *\n * NOTE: Due to security issues, the `byteOffset` argument cannot be defaulted and must be provided by the user. 
\\\n * NOTE: For more information read the `readArrayBuffer` method documentation.\n */\n static readPacketIdArrayBuffer(arraybuffer: TrueArrayBuffer, byteOffset: number) {\n return new Uint8Array(arraybuffer, byteOffset, 1)[0]\n }\n\n /**\n * Visits and \"pattern matches\" the given Buffer through the given visitors. \\\n * The Buffer is compared to the series of visitors through its Packet ID, and, if an appropriate visitor is found: its callback is called.\n *\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n static visitNodeBuffer(buffer: Buffer, ...visitors: Visitor[]) {\n return BinaryPacket.visit(buffer, GET_FUNCTION_BUF, decodeStringFromNodeBuffer, visitors)\n }\n\n /**\n * Visits and \"pattern matches\" the given DataView through the given visitors. \\\n * The DataView is compared to the series of visitors through its Packet ID, and, if an appropriate visitor is found: its callback is called.\n *\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n static visitDataView(dataview: DataView, ...visitors: Visitor[]) {\n return BinaryPacket.visit(dataview, GET_FUNCTION, decodeStringFromDataView, visitors)\n }\n\n /**\n * Visits and \"pattern matches\" the given ArrayBuffer through the given visitors. \\\n * The ArrayBuffer is compared to the series of visitors through its Packet ID, and, if an appropriate visitor is found: its callback is called.\n *\n * NOTE: Due to security issues, the `byteOffset` and `byteLength` arguments must be provided by the user. \\\n * NOTE: For more information read the `readArrayBuffer` method documentation. \\\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n static visitArrayBuffer(\n arraybuffer: TrueArrayBuffer,\n byteOffset: number,\n byteLength: number,\n ...visitors: Visitor[]\n ) {\n return BinaryPacket.visit(\n new DataView(arraybuffer, byteOffset, byteLength),\n GET_FUNCTION,\n decodeStringFromDataView,\n visitors\n )\n }\n\n /**\n * Reads/deserializes from the given Buffer. \\\n * Method available ONLY on NodeJS and Bun.\n *\n * If possible, always prefer reading using this method, as it is much faster than the other ones.\n *\n * NOTE: if you have an ArrayBuffer do not bother wrapping it into a node Buffer yourself. \\\n * NOTE: if you have an ArrayBuffer use the appropriate `readArrayBuffer`.\n */\n readNodeBuffer(\n dataIn: Buffer,\n offsetPointer = { offset: 0 },\n byteLength = dataIn.byteLength\n ): ToJson<T, true> {\n return this.read(\n dataIn,\n offsetPointer,\n byteLength,\n GET_FUNCTION_BUF,\n decodeStringFromNodeBuffer\n )\n }\n\n /**\n * Reads/deserializes from the given DataView.\n *\n * NOTE: if you have an ArrayBuffer do not bother wrapping it into a DataView yourself. \\\n * NOTE: if you have an ArrayBuffer use the appropriate `readArrayBuffer`.\n */\n readDataView(\n dataIn: DataView,\n offsetPointer = { offset: 0 },\n byteLength = dataIn.byteLength\n ): ToJson<T, true> {\n return this.read(dataIn, offsetPointer, byteLength, GET_FUNCTION, decodeStringFromDataView)\n }\n\n /**\n * Reads/deserializes from the given ArrayBuffer. 
\\\n * WARNING: this method is practically a HACK.\n *\n * When using this method both the `byteOffset` and `byteLength` are REQUIRED and cannot be defaulted. \\\n * This is to prevent serious bugs and security issues. \\\n * That is because often raw ArrayBuffers come from a pre-allocated buffer pool and do not start at byteOffset 0.\n *\n * NOTE: if you have a node Buffer do not bother wrapping it into an ArrayBuffer yourself. \\\n * NOTE: if you have a node Buffer use the appropriate `readNodeBuffer` as it is much faster and less error prone.\n */\n readArrayBuffer(dataIn: TrueArrayBuffer, byteOffset: number, byteLength: number) {\n return this.read(\n hasNodeBuffers\n ? Buffer.from(dataIn, byteOffset, byteLength)\n : (new DataView(dataIn, byteOffset, byteLength) as any),\n { offset: 0 }, // The underlying buffer has already been offsetted\n byteLength,\n hasNodeBuffers ? GET_FUNCTION_BUF : GET_FUNCTION,\n hasNodeBuffers ? decodeStringFromNodeBuffer : (decodeStringFromDataView as any)\n )\n }\n\n /**\n * Writes/serializes the given object into a Buffer. \\\n * Method available ONLY on NodeJS and Bun.\n *\n * If possible, always prefer writing using this method, as it is much faster than the other ones.\n */\n writeNodeBuffer(dataOut: ToJson<T>) {\n const byteLength = this.precalculateBufferLengthWithStrings(dataOut)\n const buffer = Buffer.allocUnsafe(byteLength)\n\n return this.write(\n buffer,\n dataOut,\n { offset: 0 },\n byteLength,\n byteLength,\n SET_FUNCTION_BUF,\n growNodeBuffer,\n encodeStringIntoNodeBuffer\n )\n }\n\n /**\n * Writes/serializes the given object into a DataView. \\\n */\n writeDataView(dataOut: ToJson<T>) {\n const byteLength = this.precalculateBufferLengthWithStrings(dataOut)\n const dataview = new DataView(new ArrayBuffer(byteLength))\n\n return this.write(\n dataview,\n dataOut,\n { offset: 0 },\n byteLength,\n byteLength,\n SET_FUNCTION,\n growDataView,\n encodeStringIntoDataView\n )\n }\n\n /**\n * Writes/serializes the given object into an ArrayBuffer. \\\n * This method is just a wrapper around either `writeNodeBuffer` or `writeDataView`. \\\n *\n * This method works with JavaScript standard raw ArrayBuffer(s) and, as such, is very error prone: \\\n * Make sure you're using the returned byteLength and byteOffset fields in the read counterpart. \\\n *\n * Always consider whether is possible to use directly `writeNodeBuffer` or `writeDataView` instead of `writeArrayBuffer`. \\\n * For more information read the `readArrayBuffer` documentation.\n */\n writeArrayBuffer(dataOut: ToJson<T>) {\n const buf = hasNodeBuffers ? this.writeNodeBuffer(dataOut) : this.writeDataView(dataOut)\n return { buffer: buf.buffer, byteLength: buf.byteLength, byteOffset: buf.byteOffset }\n }\n\n /**\n * Creates a \"visitor\" object for this BinaryPacket definition. \\\n * Used when visiting and \"pattern matching\" buffers with the `BinaryPacket::visit` static utility methods. \\\n *\n * For more information read the `BinaryPacket::visitNodeBuffer` documentation. 
\\\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n visitor(onVisit: (packet: ToJson<T>) => void): Visitor {\n return [this, onVisit]\n }\n\n sequentialSerializer(numElements: number, dataOut: Iterable<ToJson<T>>) {\n const byteLength = this.minimumByteLength * numElements\n const buffer = Buffer.allocUnsafe(byteLength)\n const offsetPointer = { offset: 0 }\n\n for (const element of dataOut) {\n this.write(\n buffer,\n element,\n offsetPointer,\n byteLength,\n byteLength,\n SET_FUNCTION_BUF,\n growNodeBuffer,\n encodeStringIntoNodeBuffer\n )\n }\n }\n\n /// PRIVATE\n\n private readonly entries: Entries\n readonly stringPositions: StringPositions\n readonly minimumByteLength: number\n\n private constructor(\n private readonly packetId: number,\n definition?: T\n ) {\n this.entries = definition ? sortEntries(definition) : []\n const inspection = inspectEntries(this.entries)\n this.minimumByteLength = inspection.minimumByteLength\n this.stringPositions = inspection.stringPositions\n }\n\n private static visit<Buf extends DataView | Buffer>(\n dataIn: Buf,\n readFunctions: typeof GET_FUNCTION | typeof GET_FUNCTION_BUF,\n decodeStringFunction: (dataIn: Buf, byteOffset: number, strlen: number) => string,\n visitors: Visitor[]\n ) {\n for (const [Packet, onVisit] of visitors) {\n if (Packet.packetId === readFunctions[Field.UNSIGNED_INT_8](dataIn as any, 0)) {\n return onVisit(\n Packet.read(dataIn, { offset: 0 }, dataIn.byteLength, readFunctions, decodeStringFunction)\n )\n }\n }\n }\n\n private read<Buf extends DataView | Buffer>(\n dataIn: Buf,\n offsetPointer: { offset: number },\n byteLength: number,\n readFunctions: typeof GET_FUNCTION | typeof GET_FUNCTION_BUF,\n decodeStringFunction: (dataIn: Buf, byteOffset: number, strlen: number) => string\n ): ToJson<T, true> {\n if (byteLength + offsetPointer.offset < this.minimumByteLength) {\n throw new Error(\n `There is no space available to fit a packet of type ${this.packetId} at offset ${offsetPointer.offset}`\n )\n }\n\n if (\n readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset) !== this.packetId\n ) {\n throw new Error(\n `Data at offset ${offsetPointer.offset} is not a packet of type ${this.packetId}`\n )\n }\n\n offsetPointer.offset += 1\n const result: any = {}\n\n for (const [name, def] of this.entries) {\n if (Array.isArray(def)) {\n const length =\n // def[1] is the length of a statically-sized array, if undefined: must read the length from the buffer as it means it's a dynamically-sized array\n def[1] ?? 
readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset++)\n\n const array = Array(length)\n\n const itemType = def[0]\n\n if (typeof itemType === 'object') {\n // Array of \"subpackets\"\n for (let i = 0; i < length; ++i) {\n array[i] = itemType.read(\n dataIn,\n offsetPointer,\n byteLength,\n readFunctions,\n decodeStringFunction\n )\n }\n } else if (itemType === '') {\n // Array of strings\n for (let i = 0; i < length; ++i) {\n const strlen = readFunctions[Field.UNSIGNED_INT_16](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += 2\n\n array[i] = decodeStringFunction(dataIn, offsetPointer.offset, strlen)\n offsetPointer.offset += strlen\n }\n } else {\n // Array of primitives (numbers)\n const itemSize = BYTE_SIZE[itemType]\n\n // It seems like looping over each element is actually much faster than using TypedArrays bulk copy.\n // TODO: properly benchmark with various array sizes to see if it's actually the case.\n for (let i = 0; i < length; ++i) {\n array[i] = readFunctions[itemType](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += itemSize\n }\n }\n\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = array\n } else if (typeof def === 'number') {\n // Single primitive (number)\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = readFunctions[def](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += BYTE_SIZE[def]\n } else if (def === '') {\n const strlen = readFunctions[Field.UNSIGNED_INT_16](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += 2\n\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = decodeStringFunction(dataIn, offsetPointer.offset, strlen)\n offsetPointer.offset += strlen\n } else if ('flags' in def) {\n // BitFlags\n const flags = readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += 1\n\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = {}\n\n for (let bit = 0; bit < def.flags.length; ++bit) {\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name][def.flags[bit]] = !!(flags & (1 << bit))\n }\n } else if ('optional' in def) {\n // Single optional \"subpacket\"\n const hasSubPacket =\n readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset) !== 0\n\n offsetPointer.offset += 1\n\n if (hasSubPacket) {\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = def.optional.read(\n dataIn,\n offsetPointer,\n byteLength,\n readFunctions,\n decodeStringFunction\n )\n }\n } else {\n // Single \"subpacket\"\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = def.read(\n dataIn,\n offsetPointer,\n byteLength,\n readFunctions,\n decodeStringFunction\n )\n }\n }\n\n return result as ToJson<T, true>\n }\n\n private write<Buf extends DataView | Buffer>(\n buffer: Buf,\n dataOut: ToJson<T>,\n offsetPointer: { offset: number },\n byteLength: number,\n maxByteLength: number,\n writeFunctions: typeof SET_FUNCTION | typeof SET_FUNCTION_BUF,\n growBufferFunction: (buffer: Buf, newByteLength: number) => Buf,\n encodeStringFunction: (buffer: Buf, byteOffset: number, string: string) => void\n ): Buf {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, this.packetId, offsetPointer.offset)\n offsetPointer.offset += 1\n\n for (const [name, def] of this.entries) {\n const data = dataOut[name]\n\n if 
(Array.isArray(def)) {\n // Could be both an array of just numbers or \"subpackets\"\n\n const length = (data as SequentiallySerializable<any, false>).length\n\n // Check if it is a dynamically-sized array, if it is, the length of the array must be serialized in the buffer before its elements\n // Explicitly check for undefined and not falsy values because it could be a statically-sized array of 0 elements.\n const isDynamicArray = def[1] === undefined\n\n if (isDynamicArray) {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, length, offsetPointer.offset)\n offsetPointer.offset += 1\n }\n\n if (length > 0) {\n const itemType = def[0]\n\n if (typeof itemType === 'object') {\n // Array of \"subpackets\"\n\n if (isDynamicArray) {\n const neededBytesForElements = length * itemType.minimumByteLength\n\n byteLength += neededBytesForElements\n maxByteLength += neededBytesForElements\n\n if (buffer.byteLength < maxByteLength) {\n buffer = growBufferFunction(buffer, maxByteLength)\n }\n }\n\n for (const object of data as unknown as ToJson<Definition>[]) {\n // Array of \"subpackets\"\n buffer = itemType.write(\n buffer,\n object,\n offsetPointer,\n byteLength,\n maxByteLength,\n writeFunctions,\n growBufferFunction,\n encodeStringFunction\n )\n\n byteLength = offsetPointer.offset\n maxByteLength = buffer.byteLength\n }\n } else if (itemType === '') {\n // Array of strings\n for (let i = 0; i < length; ++i) {\n const str = (data as unknown as string[])[i]\n const strlen = str.length\n\n writeFunctions[Field.UNSIGNED_INT_16](buffer as any, strlen, offsetPointer.offset)\n offsetPointer.offset += 2\n\n encodeStringFunction(buffer, offsetPointer.offset, str)\n offsetPointer.offset += strlen\n }\n } else {\n // Array of primitives (numbers)\n const itemSize = BYTE_SIZE[itemType]\n\n if (isDynamicArray) {\n const neededBytesForElements = length * itemSize\n\n byteLength += neededBytesForElements\n maxByteLength += neededBytesForElements\n\n if (buffer.byteLength < maxByteLength) {\n buffer = growBufferFunction(buffer, maxByteLength)\n }\n }\n\n // It seems like looping over each element is actually much faster than using TypedArrays bulk copy.\n // TODO: properly benchmark with various array sizes to see if it's actually the case.\n for (const number of data as SequentiallySerializable<number, false>) {\n writeFunctions[itemType](buffer as any, number, offsetPointer.offset)\n offsetPointer.offset += itemSize\n }\n }\n }\n } else if (typeof def === 'number') {\n // Single primitive (number)\n writeFunctions[def](buffer as any, data as number, offsetPointer.offset)\n offsetPointer.offset += BYTE_SIZE[def]\n } else if (def === '') {\n // String\n const strlen = (data as string).length\n\n writeFunctions[Field.UNSIGNED_INT_16](buffer as any, strlen, offsetPointer.offset)\n offsetPointer.offset += 2\n\n encodeStringFunction(buffer, offsetPointer.offset, data as string)\n offsetPointer.offset += strlen\n } else if ('flags' in def) {\n // BitFlags\n let flags = 0\n\n for (let bit = 0; bit < def.flags.length; ++bit) {\n if ((data as Record<string, boolean>)[def.flags[bit]]) {\n flags |= 1 << bit\n }\n }\n\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, flags, offsetPointer.offset)\n offsetPointer.offset += 1\n } else if ('optional' in def) {\n if (data) {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, 1, offsetPointer.offset)\n offsetPointer.offset += 1\n\n byteLength += def.optional.minimumByteLength\n maxByteLength += def.optional.minimumByteLength\n\n if (buffer.byteLength < maxByteLength) 
{\n buffer = growBufferFunction(buffer, maxByteLength)\n }\n\n buffer = def.optional.write(\n buffer,\n data as ToJson<Definition>,\n offsetPointer,\n byteLength,\n maxByteLength,\n writeFunctions,\n growBufferFunction,\n encodeStringFunction\n )\n\n byteLength = offsetPointer.offset\n maxByteLength = buffer.byteLength\n } else {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, 0, offsetPointer.offset)\n offsetPointer.offset += 1\n }\n } else {\n // Single \"subpacket\"\n buffer = def.write(\n buffer,\n data as ToJson<Definition>,\n offsetPointer,\n byteLength,\n maxByteLength,\n writeFunctions,\n growBufferFunction,\n encodeStringFunction\n )\n\n byteLength = offsetPointer.offset\n maxByteLength = buffer.byteLength\n }\n }\n\n return buffer\n }\n\n private precalculateBufferLengthWithStrings(dataOut: ToJson<T>) {\n let len = this.minimumByteLength\n\n for (const field of this.stringPositions[0]) {\n // String field\n len += (dataOut[field] as string).length\n }\n\n for (const field of this.stringPositions[1]) {\n // Array of strings field\n for (const string of dataOut[field] as unknown as string[]) {\n len += 2 + string.length\n }\n }\n\n for (const field in this.stringPositions[2]) {\n // Subpacket that has some string fields\n len += this.stringPositions[2][field].precalculateBufferLengthWithStrings(\n dataOut[field] as any\n )\n }\n\n return len\n }\n}\n\n/**\n * BinaryPacket definition: \\\n * Any packet can be defined through a \"schema\" object explaining its fields names and types.\n *\n * @example\n * // Imagine we have a game board where each cell is a square and is one unit big.\n * // A cell can be then defined by its X and Y coordinates.\n * // For simplicity, let's say there cannot be more than 256 cells, so we can use 8 bits for each coordinate.\n * const Cell = {\n * x: Field.UNSIGNED_INT_8,\n * y: Field.UNSIGNED_INT_8\n * }\n *\n * // When done with the cell definition we can create its BinaryPacket writer/reader.\n * // NOTE: each BinaryPacket needs an unique ID, for identification purposes and error checking.\n * const CellPacket = BinaryPacket.define(0, Cell)\n *\n * // Let's now make the definition of the whole game board.\n * // You can also specify arrays of both \"primitive\" fields and other BinaryPackets.\n * const Board = {\n * numPlayers: Field.UNSIGNED_INT_8,\n * cells: FieldArray(CellPacket)\n * }\n *\n * // When done with the board definition we can create its BinaryPacket writer/reader.\n * // NOTE: each BinaryPacket needs an unique ID, for identification purposes and error checking.\n * const BoardPacket = BinaryPacket.define(1, Board)\n *\n * // And use it.\n * const buffer = BoardPacket.writeNodeBuffer({\n * numPlayers: 1,\n * cells: [\n * { x: 0, y: 0 },\n * { x: 1, y: 1 }\n * ]\n * })\n *\n * // sendTheBufferOver(buffer)\n * // ...\n * // const buffer = receiveTheBuffer()\n * const board = BoardPacket.readNodeBuffer(buffer)\n * // ...\n */\nexport type Definition = {\n [fieldName: string]:\n | MaybeArray<Field>\n | MaybeArray<BinaryPacket<Definition>>\n | MaybeArray<''>\n | { flags: BitFlags }\n | { optional: BinaryPacket<Definition> }\n}\n\ntype MaybeArray<T> = T | [itemType: T] | [itemType: T, length: number]\n\ntype BitFlagsToJson<FlagsArray extends BitFlags> = {\n [key in FlagsArray[number]]: boolean\n}\n\n/**\n * Meta-type that converts a `Definition` schema to the type of the actual JavaScript object that will be written into a packet or read from. 
\\\n */\nexport type ToJson<T extends Definition, IsRead extends boolean = false> = {\n [K in keyof T]: T[K] extends [infer Item]\n ? Item extends BinaryPacket<infer BPDef>\n ? SequentiallySerializable<ToJson<BPDef, IsRead>, IsRead>\n : Item extends ''\n ? SequentiallySerializable<string, IsRead>\n : SequentiallySerializable<number, IsRead>\n : T[K] extends [infer Item, infer Length]\n ? Item extends BinaryPacket<infer BPDef>\n ? SequentiallySerializable<ToJson<BPDef, IsRead>, IsRead> & { length: Length }\n : Item extends ''\n ? string[] & { length: Length }\n : number[] & { length: Length }\n : T[K] extends BinaryPacket<infer BPDef>\n ? ToJson<BPDef, IsRead>\n : T[K] extends { flags: infer FlagsArray extends BitFlags }\n ? BitFlagsToJson<FlagsArray>\n : T[K] extends { optional: BinaryPacket<infer BPDef extends Definition> }\n ? ToJson<BPDef, IsRead> | undefined\n : T[K] extends ''\n ? string\n : number\n}\n\n/**\n * In a JavaScript object, the order of its keys is not strictly defined: sort them by field name. \\\n * Thus, we cannot trust iterating over an object keys: we MUST iterate over its entries array. \\\n * This is important to make sure that whoever shares BinaryPacket definitions can correctly write/read packets independently of their JS engines.\n */\nfunction sortEntries(definition: Definition) {\n return Object.entries(definition).sort(([fieldName1], [fieldName2]) =>\n fieldName1.localeCompare(fieldName2)\n )\n}\n\ntype Entries = ReturnType<typeof sortEntries>\n\ntype StringPositions = [\n string[],\n string[],\n {\n [field: string]: BinaryPacket<Definition>\n }\n]\n\n/**\n * Helper function that \"inspects\" the entries of a BinaryPacket definition\n * and returns useful \"stats\" needed for writing and reading buffers.\n *\n * This function is ever called only once per BinaryPacket definition.\n */\nfunction inspectEntries(entries: Entries) {\n // The PacketID is already 1 byte, that's why we aren't starting from 0.\n let minimumByteLength = 1\n\n const stringPositions: StringPositions = [[], [], {}]\n\n for (const [name, type] of entries) {\n if (Array.isArray(type)) {\n if (type.length === 2) {\n // Statically-sized array\n const isString = type[0] === ''\n\n const itemSize =\n typeof type[0] === 'object'\n ? type[0].minimumByteLength\n : isString\n ? 
2\n : BYTE_SIZE[type[0]]\n\n minimumByteLength += type[1] * itemSize\n\n if (isString) {\n stringPositions[1].push(name)\n }\n } else {\n // Dynamically-sized array\n // Adding 1 byte to serialize the array length\n minimumByteLength += 1\n\n if (type[0] === '') {\n stringPositions[1].push(name)\n }\n }\n } else if (type instanceof BinaryPacket) {\n minimumByteLength += type.minimumByteLength\n stringPositions[2][name] = type\n } else if (typeof type === 'object') {\n // BitFlags & Optionals\n // BitFlags are always 1 byte long, because they can hold up to 8 booleans\n // Optionals minimum is 1 byte long, because it holds whether the subpacket is present or not\n minimumByteLength += 1\n } else if (type === '') {\n // String\n // Adding 2 to serialize the string length\n minimumByteLength += 2\n stringPositions[0].push(name)\n } else {\n minimumByteLength += BYTE_SIZE[type]\n }\n }\n\n return { minimumByteLength, stringPositions }\n}\n\n//////////////////////////////////////////////\n// The logic here is practically over //\n// Here below there are needed constants //\n// that map a field-type to a functionality //\n//////////////////////////////////////////////\n\nconst BYTE_SIZE = Array(8) as number[]\n\nBYTE_SIZE[Field.UNSIGNED_INT_8] = 1\nBYTE_SIZE[Field.INT_8] = 1\n\nBYTE_SIZE[Field.UNSIGNED_INT_16] = 2\nBYTE_SIZE[Field.INT_16] = 2\n\nBYTE_SIZE[Field.UNSIGNED_INT_32] = 4\nBYTE_SIZE[Field.INT_32] = 4\nBYTE_SIZE[Field.FLOAT_32] = 4\n\nBYTE_SIZE[Field.FLOAT_64] = 8\n\nconst GET_FUNCTION = Array(8) as ((view: DataView, offset: number) => number)[]\n\nGET_FUNCTION[Field.UNSIGNED_INT_8] = (view, offset) => view.getUint8(offset)\nGET_FUNCTION[Field.INT_8] = (view, offset) => view.getInt8(offset)\n\nGET_FUNCTION[Field.UNSIGNED_INT_16] = (view, offset) => view.getUint16(offset)\nGET_FUNCTION[Field.INT_16] = (view, offset) => view.getInt16(offset)\n\nGET_FUNCTION[Field.UNSIGNED_INT_32] = (view, offset) => view.getUint32(offset)\nGET_FUNCTION[Field.INT_32] = (view, offset) => view.getInt32(offset)\nGET_FUNCTION[Field.FLOAT_32] = (view, offset) => view.getFloat32(offset)\n\nGET_FUNCTION[Field.FLOAT_64] = (view, offset) => view.getFloat64(offset)\n\nconst SET_FUNCTION = Array(8) as ((view: DataView, value: number, offset: number) => void)[]\n\nSET_FUNCTION[Field.UNSIGNED_INT_8] = (view, value, offset) => view.setUint8(offset, value)\nSET_FUNCTION[Field.INT_8] = (view, value, offset) => view.setInt8(offset, value)\n\nSET_FUNCTION[Field.UNSIGNED_INT_16] = (view, value, offset) => view.setUint16(offset, value)\nSET_FUNCTION[Field.INT_16] = (view, value, offset) => view.setInt16(offset, value)\n\nSET_FUNCTION[Field.UNSIGNED_INT_32] = (view, value, offset) => view.setUint32(offset, value)\nSET_FUNCTION[Field.INT_32] = (view, value, offset) => view.setInt32(offset, value)\nSET_FUNCTION[Field.FLOAT_32] = (view, value, offset) => view.setFloat32(offset, value)\n\nSET_FUNCTION[Field.FLOAT_64] = (view, value, offset) => view.setFloat64(offset, value)\n\nconst SET_FUNCTION_BUF = Array(8) as ((nodeBuffer: Buffer, value: number, offset: number) => void)[]\n\nif (hasNodeBuffers) {\n SET_FUNCTION_BUF[Field.UNSIGNED_INT_8] = (view, value, offset) => view.writeUint8(value, offset)\n SET_FUNCTION_BUF[Field.INT_8] = (view, value, offset) => view.writeInt8(value, offset)\n\n SET_FUNCTION_BUF[Field.UNSIGNED_INT_16] = (view, value, offset) =>\n view.writeUint16BE(value, offset)\n SET_FUNCTION_BUF[Field.INT_16] = (view, value, offset) => view.writeInt16BE(value, offset)\n\n SET_FUNCTION_BUF[Field.UNSIGNED_INT_32] = 
(view, value, offset) =>\n view.writeUint32BE(value, offset)\n SET_FUNCTION_BUF[Field.INT_32] = (view, value, offset) => view.writeInt32BE(value, offset)\n SET_FUNCTION_BUF[Field.FLOAT_32] = (view, value, offset) => view.writeFloatBE(value, offset)\n\n SET_FUNCTION_BUF[Field.FLOAT_64] = (view, value, offset) => view.writeDoubleBE(value, offset)\n}\n\nconst GET_FUNCTION_BUF = Array(8) as ((nodeBuffer: Buffer, offset: number) => number)[]\n\nif (hasNodeBuffers) {\n GET_FUNCTION_BUF[Field.UNSIGNED_INT_8] = (view, offset) => view.readUint8(offset)\n GET_FUNCTION_BUF[Field.INT_8] = (view, offset) => view.readInt8(offset)\n\n GET_FUNCTION_BUF[Field.UNSIGNED_INT_16] = (view, offset) => view.readUint16BE(offset)\n\n GET_FUNCTION_BUF[Field.INT_16] = (view, offset) => view.readInt16BE(offset)\n GET_FUNCTION_BUF[Field.UNSIGNED_INT_32] = (view, offset) => view.readUint32BE(offset)\n\n GET_FUNCTION_BUF[Field.INT_32] = (view, offset) => view.readInt32BE(offset)\n GET_FUNCTION_BUF[Field.FLOAT_32] = (view, offset) => view.readFloatBE(offset)\n GET_FUNCTION_BUF[Field.FLOAT_64] = (view, offset) => view.readDoubleBE(offset)\n}\n","/**\r\n * Exclusively matches objects of type `ArrayBuffer` and no other types that inherit from it. \\\r\n * This is needed because the `DataView` constructor explicitly requires a \"true\" ArrayBuffer, or else it throws.\r\n */\r\nexport type TrueArrayBuffer = ArrayBuffer & { buffer?: undefined }\r\n\r\nexport const hasNodeBuffers = typeof Buffer === 'function'\r\n\r\nexport function growDataView(dataview: DataView, newByteLength: number) {\r\n const resizedBuffer = new ArrayBuffer(newByteLength)\r\n const amountToCopy = Math.min(dataview.byteLength, resizedBuffer.byteLength)\r\n\r\n // Treat the buffer as if it was a Float64Array so we can copy 8 bytes at a time, to finish faster\r\n let length = Math.trunc(amountToCopy / 8)\r\n new Float64Array(resizedBuffer, 0, length).set(new Float64Array(dataview.buffer, 0, length))\r\n\r\n // Copy the remaining up to 7 bytes\r\n const offset = length * 8\r\n length = amountToCopy - offset\r\n new Uint8Array(resizedBuffer, offset, length).set(new Uint8Array(dataview.buffer, offset, length))\r\n\r\n return new DataView(resizedBuffer)\r\n}\r\n\r\nexport function growNodeBuffer(buffer: Buffer, newByteLength: number) {\r\n const newBuffer = Buffer.allocUnsafe(newByteLength)\r\n buffer.copy(newBuffer)\r\n return newBuffer\r\n}\r\n\r\nconst textEncoder = new TextEncoder()\r\nconst textDecoder = new TextDecoder()\r\n\r\nexport function encodeStringIntoDataView(dataview: DataView, byteOffset: number, string: string) {\r\n const strlen = string.length\r\n const u8Buffer = new Uint8Array(dataview.buffer, dataview.byteOffset + byteOffset, strlen)\r\n\r\n if (strlen <= 64) {\r\n encodeSmallString(u8Buffer, 0, string, strlen)\r\n } else {\r\n textEncoder.encodeInto(string, u8Buffer)\r\n }\r\n}\r\n\r\nexport function encodeStringIntoNodeBuffer(buffer: Buffer, byteOffset: number, string: string) {\r\n const strlen = string.length\r\n\r\n if (strlen <= 64) {\r\n encodeSmallString(buffer, byteOffset, string, strlen)\r\n } else {\r\n buffer.utf8Write(string, byteOffset, strlen)\r\n }\r\n}\r\n\r\nfunction encodeSmallString(buffer: Uint8Array, byteOffset: number, string: string, strlen: number) {\r\n for (let i = 0; i < strlen; ++i) {\r\n buffer[byteOffset + i] = string.charCodeAt(i) & 0xff\r\n }\r\n}\r\n\r\nexport function decodeStringFromNodeBuffer(buffer: Buffer, byteOffset: number, strlen: number) {\r\n return buffer.subarray(byteOffset, byteOffset + 
strlen).toString('utf8')\r\n}\r\n\r\nexport function decodeStringFromDataView(dataview: DataView, byteOffset: number, strlen: number) {\r\n return textDecoder.decode(new DataView(dataview.buffer, dataview.byteOffset + byteOffset, strlen))\r\n}\r\n\r\ndeclare global {\r\n interface Buffer {\r\n /**\r\n * Node buffer's internals function. \\\r\n * For some reason it is not exposed through TypeScript. \\\r\n * Fastest way to write utf8 strings into buffers.\r\n */\r\n utf8Write(string: string, byteOffset?: number, byteLength?: number): number\r\n }\r\n}\r\n"],"mappings":"yaAAA,IAAAA,EAAA,GAAAC,EAAAD,EAAA,kBAAAE,EAAA,UAAAC,EAAA,eAAAC,EAAA,kBAAAC,EAAA,oBAAAC,EAAA,kBAAAC,EAAA,gBAAAC,EAAA,yBAAAC,IAAA,eAAAC,EAAAV,GCMO,IAAMW,EAAiB,OAAO,QAAW,WAEzC,SAASC,EAAaC,EAAoBC,EAAuB,CACtE,IAAMC,EAAgB,IAAI,YAAYD,CAAa,EAC7CE,EAAe,KAAK,IAAIH,EAAS,WAAYE,EAAc,UAAU,EAGvEE,EAAS,KAAK,MAAMD,EAAe,CAAC,EACxC,IAAI,aAAaD,EAAe,EAAGE,CAAM,EAAE,IAAI,IAAI,aAAaJ,EAAS,OAAQ,EAAGI,CAAM,CAAC,EAG3F,IAAMC,EAASD,EAAS,EACxB,OAAAA,EAASD,EAAeE,EACxB,IAAI,WAAWH,EAAeG,EAAQD,CAAM,EAAE,IAAI,IAAI,WAAWJ,EAAS,OAAQK,EAAQD,CAAM,CAAC,EAE1F,IAAI,SAASF,CAAa,CACnC,CAEO,SAASI,EAAeC,EAAgBN,EAAuB,CACpE,IAAMO,EAAY,OAAO,YAAYP,CAAa,EAClD,OAAAM,EAAO,KAAKC,CAAS,EACdA,CACT,CAEA,IAAMC,EAAc,IAAI,YAClBC,EAAc,IAAI,YAEjB,SAASC,EAAyBX,EAAoBY,EAAoBC,EAAgB,CAC/F,IAAMC,EAASD,EAAO,OAChBE,EAAW,IAAI,WAAWf,EAAS,OAAQA,EAAS,WAAaY,EAAYE,CAAM,EAErFA,GAAU,GACZE,EAAkBD,EAAU,EAAGF,EAAQC,CAAM,EAE7CL,EAAY,WAAWI,EAAQE,CAAQ,CAE3C,CAEO,SAASE,EAA2BV,EAAgBK,EAAoBC,EAAgB,CAC7F,IAAMC,EAASD,EAAO,OAElBC,GAAU,GACZE,EAAkBT,EAAQK,EAAYC,EAAQC,CAAM,EAEpDP,EAAO,UAAUM,EAAQD,EAAYE,CAAM,CAE/C,CAEA,SAASE,EAAkBT,EAAoBK,EAAoBC,EAAgBC,EAAgB,CACjG,QAASI,EAAI,EAAGA,EAAIJ,EAAQ,EAAEI,EAC5BX,EAAOK,EAAaM,CAAC,EAAIL,EAAO,WAAWK,CAAC,EAAI,GAEpD,CAEO,SAASC,EAA2BZ,EAAgBK,EAAoBE,EAAgB,CAC7F,OAAOP,EAAO,SAASK,EAAYA,EAAaE,CAAM,EAAE,SAAS,MAAM,CACzE,CAEO,SAASM,EAAyBpB,EAAoBY,EAAoBE,EAAgB,CAC/F,OAAOJ,EAAY,OAAO,IAAI,SAASV,EAAS,OAAQA,EAAS,WAAaY,EAAYE,CAAM,CAAC,CACnG,CDvDO,IAAWO,OAKhBA,IAAA,eAAiB,GAAjB,iBAMAA,IAAA,qCAMAA,IAAA,qCAMAA,IAAA,iBAMAA,IAAA,mBAMAA,IAAA,mBAKAA,IAAA,uBAKAA,IAAA,uBA7CgBA,OAAA,IAwDX,SAASC,EACdC,EACe,CACf,MAAO,CAACA,CAAI,CACd,CASO,SAASC,EAGdD,EAASE,EAA+C,CACxD,GAAIA,EAAS,GAAK,CAAC,OAAO,SAASA,CAAM,EACvC,MAAM,IAAI,WAAW,oDAAoD,EAG3E,MAAO,CAACF,EAAME,CAAM,CACtB,CAMO,IAAMC,EAAN,KAAqD,CAC1D,YACmBC,EACDF,EAChB,CAFiB,cAAAE,EACD,YAAAF,CACf,CAEH,CAAC,OAAO,QAAQ,GAAI,CAClB,OAAO,KAAK,SAAS,OAAO,QAAQ,EAAE,CACxC,CACF,EAiBO,SAASG,EAAiDC,EAAmB,CAClF,GAAIA,EAAM,OAAS,EACjB,MAAM,IAAI,MACR,yFAAyFA,EAAM,KAAK,IAAI,CAAC,EAC3G,EAGF,MAAO,CAAE,MAAAA,CAAM,CACjB,CAQO,SAASC,GAAc,CAC5B,MAAO,EACT,CAMO,SAASC,EAAkDC,EAAW,CAC3E,MAAO,CAAE,SAAUA,CAAO,CAC5B,CAQO,IAAMC,EAAN,MAAMC,CAAmC,CAoPtC,YACWC,EACjBC,EACA,CAFiB,cAAAD,EAGjB,KAAK,QAAUC,EAAaC,EAAYD,CAAU,EAAI,CAAC,EACvD,IAAME,EAAaC,EAAe,KAAK,OAAO,EAC9C,KAAK,kBAAoBD,EAAW,kBACpC,KAAK,gBAAkBA,EAAW,eACpC,CAtPA,OAAO,OAA6BH,EAAkBC,EAAgB,CACpE,GAAID,EAAW,GAAK,CAAC,OAAO,SAASA,CAAQ,EAC3C,MAAM,IAAI,WAAW,uCAAuC,EAG9D,GAAIA,EAAW,IACb,MAAM,IAAI,WACR,6GACF,EAGF,OAAO,IAAID,EAAaC,EAAUC,CAAU,CAC9C,CAOA,OAAO,uBAAuBI,EAAgBC,EAAa,EAAG,CAC5D,OAAOD,EAAO,UAAUC,CAAU,CACpC,CAOA,OAAO,qBAAqBC,EAAoBD,EAAa,EAAG,CAC9D,OAAOC,EAAS,SAASD,CAAU,CACrC,CAUA,OAAO,wBAAwBE,EAA8BF,EAAoB,CAC/E,OAAO,IAAI,WAAWE,EAAaF,EAAY,CAAC,EAAE,CAAC,CACrD,CAQA,OAAO,gBAAgBD,KAAmBI,EAAqB,CAC7D,OAAOV,EAAa,MAAMM,EAAQK,EAAkBC,EAA4BF,CAAQ,CAC1F,CAQA,OAAO,cAAcF,KAAuBE,EAAqB,CAC/D,OAAOV,EAAa,MAAMQ,EAAUK,EAAcC,EAA0BJ,CAAQ,CACtF,CAUA,OAAO,iBACLD,EACAF,EACAQ,KACGL,EACH,CACA,OAAOV,EAAa,MAClB,IAAI,SAASS,EAAaF,EAAYQ,CAAU,EAChDF,EACAC,EACAJ,CACF,CACF,CAWA,eACEM,
EACAC,EAAgB,CAAE,OAAQ,CAAE,EAC5BF,EAAaC,EAAO,WACH,CACjB,OAAO,KAAK,KACVA,EACAC,EACAF,EACAJ,EACAC,CACF,CACF,CAQA,aACEI,EACAC,EAAgB,CAAE,OAAQ,CAAE,EAC5BF,EAAaC,EAAO,WACH,CACjB,OAAO,KAAK,KAAKA,EAAQC,EAAeF,EAAYF,EAAcC,CAAwB,CAC5F,CAaA,gBAAgBE,EAAyBT,EAAoBQ,EAAoB,CAC/E,OAAO,KAAK,KACVG,EACI,OAAO,KAAKF,EAAQT,EAAYQ,CAAU,EACzC,IAAI,SAASC,EAAQT,EAAYQ,CAAU,EAChD,CAAE,OAAQ,CAAE,EACZA,EACAG,EAAiBP,EAAmBE,EACpCK,EAAiBN,EAA8BE,CACjD,CACF,CAQA,gBAAgBK,EAAoB,CAClC,IAAMJ,EAAa,KAAK,oCAAoCI,CAAO,EAC7Db,EAAS,OAAO,YAAYS,CAAU,EAE5C,OAAO,KAAK,MACVT,EACAa,EACA,CAAE,OAAQ,CAAE,EACZJ,EACAA,EACAK,EACAC,EACAC,CACF,CACF,CAKA,cAAcH,EAAoB,CAChC,IAAMJ,EAAa,KAAK,oCAAoCI,CAAO,EAC7DX,EAAW,IAAI,SAAS,IAAI,YAAYO,CAAU,CAAC,EAEzD,OAAO,KAAK,MACVP,EACAW,EACA,CAAE,OAAQ,CAAE,EACZJ,EACAA,EACAQ,EACAC,EACAC,CACF,CACF,CAYA,iBAAiBN,EAAoB,CACnC,IAAMO,EAAMR,EAAiB,KAAK,gBAAgBC,CAAO,EAAI,KAAK,cAAcA,CAAO,EACvF,MAAO,CAAE,OAAQO,EAAI,OAAQ,WAAYA,EAAI,WAAY,WAAYA,EAAI,UAAW,CACtF,CASA,QAAQC,EAA+C,CACrD,MAAO,CAAC,KAAMA,CAAO,CACvB,CAEA,qBAAqBC,EAAqBT,EAA8B,CACtE,IAAMJ,EAAa,KAAK,kBAAoBa,EACtCtB,EAAS,OAAO,YAAYS,CAAU,EACtCE,EAAgB,CAAE,OAAQ,CAAE,EAElC,QAAWY,KAAWV,EACpB,KAAK,MACHb,EACAuB,EACAZ,EACAF,EACAA,EACAK,EACAC,EACAC,CACF,CAEJ,CAIiB,QACR,gBACA,kBAYT,OAAe,MACbN,EACAc,EACAC,EACArB,EACA,CACA,OAAW,CAACsB,EAAQL,CAAO,IAAKjB,EAC9B,GAAIsB,EAAO,WAAaF,EAAc,CAAoB,EAAEd,EAAe,CAAC,EAC1E,OAAOW,EACLK,EAAO,KAAKhB,EAAQ,CAAE,OAAQ,CAAE,EAAGA,EAAO,WAAYc,EAAeC,CAAoB,CAC3F,CAGN,CAEQ,KACNf,EACAC,EACAF,EACAe,EACAC,EACiB,CACjB,GAAIhB,EAAaE,EAAc,OAAS,KAAK,kBAC3C,MAAM,IAAI,MACR,uDAAuD,KAAK,QAAQ,cAAcA,EAAc,MAAM,EACxG,EAGF,GACEa,EAAc,CAAoB,EAAEd,EAAeC,EAAc,MAAM,IAAM,KAAK,SAElF,MAAM,IAAI,MACR,kBAAkBA,EAAc,MAAM,4BAA4B,KAAK,QAAQ,EACjF,EAGFA,EAAc,QAAU,EACxB,IAAMgB,EAAc,CAAC,EAErB,OAAW,CAACC,EAAMC,CAAG,IAAK,KAAK,QAC7B,GAAI,MAAM,QAAQA,CAAG,EAAG,CACtB,IAAM5C,EAEJ4C,EAAI,CAAC,GAAKL,EAAc,CAAoB,EAAEd,EAAeC,EAAc,QAAQ,EAE/EmB,EAAQ,MAAM7C,CAAM,EAEpB8C,EAAWF,EAAI,CAAC,EAEtB,GAAI,OAAOE,GAAa,SAEtB,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAC5BF,EAAME,CAAC,EAAID,EAAS,KAClBrB,EACAC,EACAF,EACAe,EACAC,CACF,UAEOM,IAAa,GAEtB,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAAG,CAC/B,IAAMC,EAAST,EAAc,CAAqB,EAAEd,EAAeC,EAAc,MAAM,EACvFA,EAAc,QAAU,EAExBmB,EAAME,CAAC,EAAIP,EAAqBf,EAAQC,EAAc,OAAQsB,CAAM,EACpEtB,EAAc,QAAUsB,CAC1B,KACK,CAEL,IAAMC,EAAWC,EAAUJ,CAAQ,EAInC,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAC5BF,EAAME,CAAC,EAAIR,EAAcO,CAAQ,EAAErB,EAAeC,EAAc,MAAM,EACtEA,EAAc,QAAUuB,CAE5B,CAGAP,EAAOC,CAAI,EAAIE,CACjB,SAAW,OAAOD,GAAQ,SAGxBF,EAAOC,CAAI,EAAIJ,EAAcK,CAAG,EAAEnB,EAAeC,EAAc,MAAM,EACrEA,EAAc,QAAUwB,EAAUN,CAAG,UAC5BA,IAAQ,GAAI,CACrB,IAAMI,EAAST,EAAc,CAAqB,EAAEd,EAAeC,EAAc,MAAM,EACvFA,EAAc,QAAU,EAGxBgB,EAAOC,CAAI,EAAIH,EAAqBf,EAAQC,EAAc,OAAQsB,CAAM,EACxEtB,EAAc,QAAUsB,CAC1B,SAAW,UAAWJ,EAAK,CAEzB,IAAMxC,EAAQmC,EAAc,CAAoB,EAAEd,EAAeC,EAAc,MAAM,EACrFA,EAAc,QAAU,EAGxBgB,EAAOC,CAAI,EAAI,CAAC,EAEhB,QAASQ,EAAM,EAAGA,EAAMP,EAAI,MAAM,OAAQ,EAAEO,EAE1CT,EAAOC,CAAI,EAAEC,EAAI,MAAMO,CAAG,CAAC,EAAI,CAAC,EAAE/C,EAAS,GAAK+C,EAEpD,SAAW,aAAcP,EAAK,CAE5B,IAAMQ,EACJb,EAAc,CAAoB,EAAEd,EAAeC,EAAc,MAAM,IAAM,EAE/EA,EAAc,QAAU,EAEpB0B,IAEFV,EAAOC,CAAI,EAAIC,EAAI,SAAS,KAC1BnB,EACAC,EACAF,EACAe,EACAC,CACF,EAEJ,MAGEE,EAAOC,CAAI,EAAIC,EAAI,KACjBnB,EACAC,EACAF,EACAe,EACAC,CACF,EAIJ,OAAOE,CACT,CAEQ,MACN3B,EACAa,EACAF,EACAF,EACA6B,EACAC,EACAC,EACAC,EACK,CACLF,EAAe,CAAoB,EAAEvC,EAAe,KAAK,SAAUW,EAAc,MAAM,EACvFA,EAAc,QAAU,EAExB,OAAW,CAACiB,EAAMC,CAAG,IAAK,KAAK,QAAS,CACtC,IAAMa,EAAO7B,EAAQe,CAAI,EAEzB,GAAI,MAAM,QAAQC,CAAG,EAAG,CAGtB,IAAM5C,EAAUyD,EAA8C,OAIxDC,EAAiBd,EAAI,CAAC,IAAM,OAOlC,GALIc,IACFJ,EAAe,CAAoB,EAAEvC,EAAef,EAAQ0B,EAAc,MAAM,EAChFA,EAAc,QAAU,GAGt
B1B,EAAS,EAAG,CACd,IAAM8C,EAAWF,EAAI,CAAC,EAEtB,GAAI,OAAOE,GAAa,SAAU,CAGhC,GAAIY,EAAgB,CAClB,IAAMC,EAAyB3D,EAAS8C,EAAS,kBAEjDtB,GAAcmC,EACdN,GAAiBM,EAEb5C,EAAO,WAAasC,IACtBtC,EAASwC,EAAmBxC,EAAQsC,CAAa,EAErD,CAEA,QAAWO,KAAUH,EAEnB1C,EAAS+B,EAAS,MAChB/B,EACA6C,EACAlC,EACAF,EACA6B,EACAC,EACAC,EACAC,CACF,EAEAhC,EAAaE,EAAc,OAC3B2B,EAAgBtC,EAAO,UAE3B,SAAW+B,IAAa,GAEtB,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAAG,CAC/B,IAAMc,EAAOJ,EAA6BV,CAAC,EACrCC,EAASa,EAAI,OAEnBP,EAAe,CAAqB,EAAEvC,EAAeiC,EAAQtB,EAAc,MAAM,EACjFA,EAAc,QAAU,EAExB8B,EAAqBzC,EAAQW,EAAc,OAAQmC,CAAG,EACtDnC,EAAc,QAAUsB,CAC1B,KACK,CAEL,IAAMC,EAAWC,EAAUJ,CAAQ,EAEnC,GAAIY,EAAgB,CAClB,IAAMC,EAAyB3D,EAASiD,EAExCzB,GAAcmC,EACdN,GAAiBM,EAEb5C,EAAO,WAAasC,IACtBtC,EAASwC,EAAmBxC,EAAQsC,CAAa,EAErD,CAIA,QAAWS,KAAUL,EACnBH,EAAeR,CAAQ,EAAE/B,EAAe+C,EAAQpC,EAAc,MAAM,EACpEA,EAAc,QAAUuB,CAE5B,CACF,CACF,SAAW,OAAOL,GAAQ,SAExBU,EAAeV,CAAG,EAAE7B,EAAe0C,EAAgB/B,EAAc,MAAM,EACvEA,EAAc,QAAUwB,EAAUN,CAAG,UAC5BA,IAAQ,GAAI,CAErB,IAAMI,EAAUS,EAAgB,OAEhCH,EAAe,CAAqB,EAAEvC,EAAeiC,EAAQtB,EAAc,MAAM,EACjFA,EAAc,QAAU,EAExB8B,EAAqBzC,EAAQW,EAAc,OAAQ+B,CAAc,EACjE/B,EAAc,QAAUsB,CAC1B,SAAW,UAAWJ,EAAK,CAEzB,IAAIxC,EAAQ,EAEZ,QAAS+C,EAAM,EAAGA,EAAMP,EAAI,MAAM,OAAQ,EAAEO,EACrCM,EAAiCb,EAAI,MAAMO,CAAG,CAAC,IAClD/C,GAAS,GAAK+C,GAIlBG,EAAe,CAAoB,EAAEvC,EAAeX,EAAOsB,EAAc,MAAM,EAC/EA,EAAc,QAAU,CAC1B,KAAW,aAAckB,EACnBa,GACFH,EAAe,CAAoB,EAAEvC,EAAe,EAAGW,EAAc,MAAM,EAC3EA,EAAc,QAAU,EAExBF,GAAcoB,EAAI,SAAS,kBAC3BS,GAAiBT,EAAI,SAAS,kBAE1B7B,EAAO,WAAasC,IACtBtC,EAASwC,EAAmBxC,EAAQsC,CAAa,GAGnDtC,EAAS6B,EAAI,SAAS,MACpB7B,EACA0C,EACA/B,EACAF,EACA6B,EACAC,EACAC,EACAC,CACF,EAEAhC,EAAaE,EAAc,OAC3B2B,EAAgBtC,EAAO,aAEvBuC,EAAe,CAAoB,EAAEvC,EAAe,EAAGW,EAAc,MAAM,EAC3EA,EAAc,QAAU,IAI1BX,EAAS6B,EAAI,MACX7B,EACA0C,EACA/B,EACAF,EACA6B,EACAC,EACAC,EACAC,CACF,EAEAhC,EAAaE,EAAc,OAC3B2B,EAAgBtC,EAAO,WAE3B,CAEA,OAAOA,CACT,CAEQ,oCAAoCa,EAAoB,CAC9D,IAAImC,EAAM,KAAK,kBAEf,QAAWC,KAAS,KAAK,gBAAgB,CAAC,EAExCD,GAAQnC,EAAQoC,CAAK,EAAa,OAGpC,QAAWA,KAAS,KAAK,gBAAgB,CAAC,EAExC,QAAWC,KAAUrC,EAAQoC,CAAK,EAChCD,GAAO,EAAIE,EAAO,OAItB,QAAWD,KAAS,KAAK,gBAAgB,CAAC,EAExCD,GAAO,KAAK,gBAAgB,CAAC,EAAEC,CAAK,EAAE,oCACpCpC,EAAQoC,CAAK,CACf,EAGF,OAAOD,CACT,CACF,EA4FA,SAASnD,EAAYD,EAAwB,CAC3C,OAAO,OAAO,QAAQA,CAAU,EAAE,KAAK,CAAC,CAACuD,CAAU,EAAG,CAACC,CAAU,IAC/DD,EAAW,cAAcC,CAAU,CACrC,CACF,CAkBA,SAASrD,EAAesD,EAAkB,CAExC,IAAIC,EAAoB,EAElBC,EAAmC,CAAC,CAAC,EAAG,CAAC,EAAG,CAAC,CAAC,EAEpD,OAAW,CAAC3B,EAAM4B,CAAI,IAAKH,EACzB,GAAI,MAAM,QAAQG,CAAI,EACpB,GAAIA,EAAK,SAAW,EAAG,CAErB,IAAMC,EAAWD,EAAK,CAAC,IAAM,GAEvBtB,EACJ,OAAOsB,EAAK,CAAC,GAAM,SACfA,EAAK,CAAC,EAAE,kBACRC,EACE,EACAtB,EAAUqB,EAAK,CAAC,CAAC,EAEzBF,GAAqBE,EAAK,CAAC,EAAItB,EAE3BuB,GACFF,EAAgB,CAAC,EAAE,KAAK3B,CAAI,CAEhC,MAGE0B,GAAqB,EAEjBE,EAAK,CAAC,IAAM,IACdD,EAAgB,CAAC,EAAE,KAAK3B,CAAI,OAGvB4B,aAAgB/D,GACzB6D,GAAqBE,EAAK,kBAC1BD,EAAgB,CAAC,EAAE3B,CAAI,EAAI4B,GAClB,OAAOA,GAAS,SAIzBF,GAAqB,EACZE,IAAS,IAGlBF,GAAqB,EACrBC,EAAgB,CAAC,EAAE,KAAK3B,CAAI,GAE5B0B,GAAqBnB,EAAUqB,CAAI,EAIvC,MAAO,CAAE,kBAAAF,EAAmB,gBAAAC,CAAgB,CAC9C,CAQA,IAAMpB,EAAY,MAAM,CAAC,EAEzBA,EAAU,CAAoB,EAAI,EAClCA,EAAU,CAAW,EAAI,EAEzBA,EAAU,CAAqB,EAAI,EACnCA,EAAU,CAAY,EAAI,EAE1BA,EAAU,CAAqB,EAAI,EACnCA,EAAU,CAAY,EAAI,EAC1BA,EAAU,CAAc,EAAI,EAE5BA,EAAU,CAAc,EAAI,EAE5B,IAAM5B,EAAe,MAAM,CAAC,EAE5BA,EAAa,CAAoB,EAAI,CAACmD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EAC3EpD,EAAa,CAAW,EAAI,CAACmD,EAAMC,IAAWD,EAAK,QAAQC,CAAM,EAEjEpD,EAAa,CAAqB,EAAI,CAACmD,EAAMC,IAAWD,EAAK,UAAUC,CAAM,EAC7EpD,EAAa,CAAY,EAAI,CAACmD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EAEnEpD,EAAa,CAAqB,EAAI,CAACmD,EAAMC,IAAWD,EAAK,UAAUC,CAAM,EAC7EpD,EAAa,CAAY,EAAI,CAACmD,EAAMC,IAAW
D,EAAK,SAASC,CAAM,EACnEpD,EAAa,CAAc,EAAI,CAACmD,EAAMC,IAAWD,EAAK,WAAWC,CAAM,EAEvEpD,EAAa,CAAc,EAAI,CAACmD,EAAMC,IAAWD,EAAK,WAAWC,CAAM,EAEvE,IAAM1C,EAAe,MAAM,CAAC,EAE5BA,EAAa,CAAoB,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,SAASC,EAAQC,CAAK,EACzF3C,EAAa,CAAW,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,QAAQC,EAAQC,CAAK,EAE/E3C,EAAa,CAAqB,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,UAAUC,EAAQC,CAAK,EAC3F3C,EAAa,CAAY,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,SAASC,EAAQC,CAAK,EAEjF3C,EAAa,CAAqB,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,UAAUC,EAAQC,CAAK,EAC3F3C,EAAa,CAAY,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,SAASC,EAAQC,CAAK,EACjF3C,EAAa,CAAc,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,WAAWC,EAAQC,CAAK,EAErF3C,EAAa,CAAc,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,WAAWC,EAAQC,CAAK,EAErF,IAAM9C,EAAmB,MAAM,CAAC,EAE5BF,IACFE,EAAiB,CAAoB,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,WAAWE,EAAOD,CAAM,EAC/F7C,EAAiB,CAAW,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,UAAUE,EAAOD,CAAM,EAErF7C,EAAiB,CAAqB,EAAI,CAAC4C,EAAME,EAAOD,IACtDD,EAAK,cAAcE,EAAOD,CAAM,EAClC7C,EAAiB,CAAY,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,aAAaE,EAAOD,CAAM,EAEzF7C,EAAiB,CAAqB,EAAI,CAAC4C,EAAME,EAAOD,IACtDD,EAAK,cAAcE,EAAOD,CAAM,EAClC7C,EAAiB,CAAY,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,aAAaE,EAAOD,CAAM,EACzF7C,EAAiB,CAAc,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,aAAaE,EAAOD,CAAM,EAE3F7C,EAAiB,CAAc,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,cAAcE,EAAOD,CAAM,GAG9F,IAAMtD,EAAmB,MAAM,CAAC,EAE5BO,IACFP,EAAiB,CAAoB,EAAI,CAACqD,EAAMC,IAAWD,EAAK,UAAUC,CAAM,EAChFtD,EAAiB,CAAW,EAAI,CAACqD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EAEtEtD,EAAiB,CAAqB,EAAI,CAACqD,EAAMC,IAAWD,EAAK,aAAaC,CAAM,EAEpFtD,EAAiB,CAAY,EAAI,CAACqD,EAAMC,IAAWD,EAAK,YAAYC,CAAM,EAC1EtD,EAAiB,CAAqB,EAAI,CAACqD,EAAMC,IAAWD,EAAK,aAAaC,CAAM,EAEpFtD,EAAiB,CAAY,EAAI,CAACqD,EAAMC,IAAWD,EAAK,YAAYC,CAAM,EAC1EtD,EAAiB,CAAc,EAAI,CAACqD,EAAMC,IAAWD,EAAK,YAAYC,CAAM,EAC5EtD,EAAiB,CAAc,EAAI,CAACqD,EAAMC,IAAWD,EAAK,aAAaC,CAAM","names":["src_exports","__export","BinaryPacket","Field","FieldArray","FieldBitFlags","FieldFixedArray","FieldOptional","FieldString","SequentialSerializer","__toCommonJS","hasNodeBuffers","growDataView","dataview","newByteLength","resizedBuffer","amountToCopy","length","offset","growNodeBuffer","buffer","newBuffer","textEncoder","textDecoder","encodeStringIntoDataView","byteOffset","string","strlen","u8Buffer","encodeSmallString","encodeStringIntoNodeBuffer","i","decodeStringFromNodeBuffer","decodeStringFromDataView","Field","FieldArray","item","FieldFixedArray","length","SequentialSerializer","iterable","FieldBitFlags","flags","FieldString","FieldOptional","packet","BinaryPacket","_BinaryPacket","packetId","definition","sortEntries","inspection","inspectEntries","buffer","byteOffset","dataview","arraybuffer","visitors","GET_FUNCTION_BUF","decodeStringFromNodeBuffer","GET_FUNCTION","decodeStringFromDataView","byteLength","dataIn","offsetPointer","hasNodeBuffers","dataOut","SET_FUNCTION_BUF","growNodeBuffer","encodeStringIntoNodeBuffer","SET_FUNCTION","growDataView","encodeStringIntoDataView","buf","onVisit","numElements","element","readFunctions","decodeStringFunction","Packet","result","name","def","array","itemType","i","strlen","itemSize","BYTE_SIZE","bit","hasSubPacket","maxByteLength","writeFunctions","growBufferFunction","encodeStringFunction","data","isDynamicArray","neededBytesForElements","object","str","number","len","field","string","fieldName1","fieldName2","entries","minimumByteLength","stringPositions","type","isString","view","offset","value"]}
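The `sourcesContent` embedded in the source map above documents the package's public API in its JSDoc comments (`BinaryPacket.define`, `writeNodeBuffer`/`readNodeBuffer`, the `Field*` helpers, and the visitor-based "pattern matching" via `visitor` and `visitNodeBuffer`). The sketch below is a minimal usage example of that visitor dispatch as described in those comments; it is not part of the diffed files. The names `PingPacket`, `ChatPacket`, and `handleIncoming` are illustrative, and `writeNodeBuffer`/`visitNodeBuffer` assume a Node.js or Bun runtime, per the JSDoc.

import { BinaryPacket, Field, FieldBitFlags, FieldString } from 'binary-packet'

// Two packet definitions with distinct packet IDs (IDs must be unique, 0-255).
const PingPacket = BinaryPacket.define(0, {
  timestamp: Field.FLOAT_64
})

const ChatPacket = BinaryPacket.define(1, {
  author: FieldString(),
  text: FieldString(),
  meta: FieldBitFlags(['urgent', 'system'] as const)
})

// Per the JSDoc, create visitors once (e.g. before a receive loop), not per packet.
const visitors = [
  PingPacket.visitor(ping => console.log('ping at', ping.timestamp)),
  ChatPacket.visitor(chat => console.log(`${chat.author}: ${chat.text}`))
]

function handleIncoming(buffer: Buffer) {
  // Dispatches on the leading packet-ID byte and invokes the matching callback.
  BinaryPacket.visitNodeBuffer(buffer, ...visitors)
}

// Serialize a chat packet and feed it back through the dispatcher.
handleIncoming(
  ChatPacket.writeNodeBuffer({
    author: 'alice',
    text: 'hello',
    meta: { urgent: false, system: false }
  })
)

On browsers (no node Buffer), the same definitions can be used with `writeDataView`/`readDataView` or, with explicit `byteOffset`/`byteLength`, the ArrayBuffer variants documented above.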
1
+ {"version":3,"sources":["../src/index.ts","../src/buffers.ts"],"sourcesContent":["import {\n decodeStringFromDataView,\n decodeStringFromNodeBuffer,\n encodeStringIntoDataView,\n encodeStringIntoNodeBuffer,\n growDataView,\n growNodeBuffer,\n hasNodeBuffers,\n type TrueArrayBuffer\n} from './buffers'\n\nexport const enum Field {\n /**\n * Defines a 1 byte (8 bits) unsigned integer field. \\\n * (Range: 0 - 255)\n */\n UNSIGNED_INT_8 = 0,\n\n /**\n * Defines a 2 bytes (16 bits) unsigned integer field. \\\n * (Range: 0 - 65535)\n */\n UNSIGNED_INT_16,\n\n /**\n * Defines a 4 bytes (32 bits) unsigned integer field. \\\n * (Range: 0 - 4294967295)\n */\n UNSIGNED_INT_32,\n\n /**\n * Defines a 1 byte (8 bits) signed integer field. \\\n * (Range: -128 - 127)\n */\n INT_8,\n\n /**\n * Defines a 2 bytes (16 bits) signed integer field. \\\n * (Range: -32768 - 32767)\n */\n INT_16,\n\n /**\n * Defines a 4 bytes (32 bits) signed integer field. \\\n * (Range: -2147483648 - 2147483647)\n */\n INT_32,\n\n /**\n * Defines a 4 bytes (32 bits) floating-point field. \\\n */\n FLOAT_32,\n\n /**\n * Defines a 8 bytes (64 bits) floating-point field. \\\n */\n FLOAT_64\n}\n\n/**\n * Defines a dynamically-sized array with elements of a certain type. \\\n * Dynamically-sized arrays are useful when a packet's field is an array of a non pre-defined length. \\\n * Although, this makes dynamically-sized arrays more memory expensive as the internal buffer needs to be grown accordingly.\n *\n * NOTE: If an array will ALWAYS have the same length, prefer using the `FieldFixedArray` type, for both better performance and memory efficiency. \\\n * NOTE: As of now, dynamic arrays can have at most 256 elements.\n */\nexport function FieldArray<T extends Field | BinaryPacket<Definition> | ''>(\n item: T\n): [itemType: T] {\n return [item]\n}\n\n/**\n * Defines a statically-sized array with elements of a certain type. \\\n * Fixed arrays are useful when a packet's field is an array of a pre-defined length. \\\n * Fixed arrays much more memory efficient and performant than non-fixed ones.\n *\n * NOTE: If an array will not always have the same length, use the `FieldArray` type.\n */\nexport function FieldFixedArray<\n T extends Field | BinaryPacket<Definition> | '',\n Length extends number\n>(item: T, length: Length): [itemType: T, length: Length] {\n if (length < 0 || !Number.isFinite(length)) {\n throw new RangeError('Length of a FixedArray must be a positive integer.')\n }\n\n return [item, length]\n}\n\n/**\n * Utility class that allows serializing arrays through any kind of iterable, as long as the number of elements is known beforehand. \\\n * Needed to skip the overhead of duplicating the data into an actual array just for it to be serialized straight away and trashed.\n */\nexport class SequentialSerializer<T> implements Iterable<T> {\n constructor(\n private readonly iterable: Iterable<T>,\n public readonly length: number\n ) {}\n\n [Symbol.iterator]() {\n return this.iterable[Symbol.iterator]()\n }\n}\n\n/**\n * Either an array or a SequentialSerializer<T>.\n *\n * Note: when a packet is **read**, it will **always** be a standard array: the SequentialSerializer \\\n * is just a utility to serialize iterators avoiding data duplication and array-creation overheads.\n */\ntype SequentiallySerializable<T, IsRead extends boolean> = IsRead extends true\n ? 
T[]\n : T[] | SequentialSerializer<T>\n\ntype BitFlags = (string[] | ReadonlyArray<string>) & {\n length: 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8\n}\n\n/**\n * Defines a sequence of up to 8 \"flags\" (basically single bits/booleans) that can be packed together into a single 8 bits value. \\\n * This is useful for minimizing bytes usage when there are lots of boolean fields/flags, instead of saving each flag separately as its own 8 bits value.\n *\n * The input should be an array of strings (with at most 8 elements) where each string defines the name of a flag. \\\n * This is just for definition purposes, then when actually writing or reading packets it'll just be a record-object with those names as keys and boolean values.\n */\nexport function FieldBitFlags<const FlagsArray extends BitFlags>(flags: FlagsArray) {\n if (flags.length > 8) {\n throw new Error(\n `Invalid BinaryPacket definition: a BitFlags field can have only up to 8 flags, given: ${flags.join(', ')}`\n )\n }\n\n return { flags }\n}\n\n/**\n * Defines a string field. \\\n * Strings cannot be more than 65536 characters long.\n *\n * NOTE: Only strings containing just ASCII and/or single-octet UTF-8 characters are supported.\n */\nexport function FieldString() {\n return '' as const\n}\n\n/**\n * Defines an optional BinaryPacket \"subpacket\" field. \\\n * When writing and reading packets it'll be possible to provide and receive `undefined` instead of a valid object.\n */\nexport function FieldOptional<T extends BinaryPacket<Definition>>(packet: T) {\n return { optional: packet }\n}\n\n/**\n * Do not manually construct this type: an object of this kind is returned by a BinaryPacket `createVisitor` method. \\\n * Used in the `BinaryPacket::visit` static method to perform a sort of \"pattern matching\" on an incoming packet (of yet unknown type) buffer.\n */\ntype Visitor = [BinaryPacket<Definition>, (packet: any) => void]\n\nexport class BinaryPacket<T extends Definition> {\n /**\n * Defines a new binary packet. \\\n * Make sure that every `packetId` is unique.\n * @throws RangeError If packetId is negative, floating-point, or greater than 255.\n */\n static define<T extends Definition>(packetId: number, definition?: T) {\n if (packetId < 0 || !Number.isFinite(packetId)) {\n throw new RangeError('Packet IDs must be positive integers.')\n }\n\n if (packetId > 255) {\n throw new RangeError(\n 'Packet IDs greater than 255 are not supported. Do you REALLY need more than 255 different kinds of packets?'\n )\n }\n\n return new BinaryPacket(packetId, definition)\n }\n\n /**\n * Reads just the packetId from the given Buffer. \\\n * This method practically just reads the uint8 at offset `byteOffset` (default: 0). \\\n * Useful if the receiving side receives multiple types of packets.\n */\n static readPacketIdNodeBuffer(buffer: Buffer, byteOffset = 0) {\n return buffer.readUint8(byteOffset)\n }\n\n /**\n * Reads just the packetId from the given DataView. \\\n * This method practically just reads the uint8 at offset `byteOffset` (default: 0). \\\n * Useful if the receiving side receives multiple types of packets.\n */\n static readPacketIdDataView(dataview: DataView, byteOffset = 0) {\n return dataview.getUint8(byteOffset)\n }\n\n /**\n * Reads just the packetId from the given ArrayBuffer. \\\n * This method practically just reads the uint8 at offset `byteOffset`. 
\\\n * Useful if the receiving side receives multiple types of packets.\n *\n * NOTE: Due to security issues, the `byteOffset` argument cannot be defaulted and must be provided by the user. \\\n * NOTE: For more information read the `readArrayBuffer` method documentation.\n */\n static readPacketIdArrayBuffer(arraybuffer: TrueArrayBuffer, byteOffset: number) {\n return new Uint8Array(arraybuffer, byteOffset, 1)[0]\n }\n\n /**\n * Visits and \"pattern matches\" the given Buffer through the given visitors. \\\n * The Buffer is compared to the series of visitors through its Packet ID, and, if an appropriate visitor is found: its callback is called.\n *\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n static visitNodeBuffer(buffer: Buffer, ...visitors: Visitor[]) {\n return BinaryPacket.visit(buffer, GET_FUNCTION_BUF, decodeStringFromNodeBuffer, visitors)\n }\n\n /**\n * Visits and \"pattern matches\" the given DataView through the given visitors. \\\n * The DataView is compared to the series of visitors through its Packet ID, and, if an appropriate visitor is found: its callback is called.\n *\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n static visitDataView(dataview: DataView, ...visitors: Visitor[]) {\n return BinaryPacket.visit(dataview, GET_FUNCTION, decodeStringFromDataView, visitors)\n }\n\n /**\n * Visits and \"pattern matches\" the given ArrayBuffer through the given visitors. \\\n * The ArrayBuffer is compared to the series of visitors through its Packet ID, and, if an appropriate visitor is found: its callback is called.\n *\n * NOTE: Due to security issues, the `byteOffset` and `byteLength` arguments must be provided by the user. \\\n * NOTE: For more information read the `readArrayBuffer` method documentation. \\\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n static visitArrayBuffer(\n arraybuffer: TrueArrayBuffer,\n byteOffset: number,\n byteLength: number,\n ...visitors: Visitor[]\n ) {\n return BinaryPacket.visit(\n new DataView(arraybuffer, byteOffset, byteLength),\n GET_FUNCTION,\n decodeStringFromDataView,\n visitors\n )\n }\n\n /**\n * Reads/deserializes from the given Buffer. \\\n * Method available ONLY on NodeJS and Bun.\n *\n * If possible, always prefer reading using this method, as it is much faster than the other ones.\n *\n * NOTE: if you have an ArrayBuffer do not bother wrapping it into a node Buffer yourself. \\\n * NOTE: if you have an ArrayBuffer use the appropriate `readArrayBuffer`.\n */\n readNodeBuffer(\n dataIn: Buffer,\n offsetPointer = { offset: 0 },\n byteLength = dataIn.byteLength\n ): ToJson<T, true> {\n return this.read(\n dataIn,\n offsetPointer,\n byteLength,\n GET_FUNCTION_BUF,\n decodeStringFromNodeBuffer\n )\n }\n\n /**\n * Reads/deserializes from the given DataView.\n *\n * NOTE: if you have an ArrayBuffer do not bother wrapping it into a DataView yourself. 
\\\n * NOTE: if you have an ArrayBuffer use the appropriate `readArrayBuffer`.\n */\n readDataView(\n dataIn: DataView,\n offsetPointer = { offset: 0 },\n byteLength = dataIn.byteLength\n ): ToJson<T, true> {\n return this.read(dataIn, offsetPointer, byteLength, GET_FUNCTION, decodeStringFromDataView)\n }\n\n /**\n * Reads/deserializes from the given ArrayBuffer. \\\n * WARNING: this method is practically a HACK.\n *\n * When using this method both the `byteOffset` and `byteLength` are REQUIRED and cannot be defaulted. \\\n * This is to prevent serious bugs and security issues. \\\n * That is because often raw ArrayBuffers come from a pre-allocated buffer pool and do not start at byteOffset 0.\n *\n * NOTE: if you have a node Buffer do not bother wrapping it into an ArrayBuffer yourself. \\\n * NOTE: if you have a node Buffer use the appropriate `readNodeBuffer` as it is much faster and less error prone.\n */\n readArrayBuffer(dataIn: TrueArrayBuffer, byteOffset: number, byteLength: number) {\n return this.read(\n hasNodeBuffers\n ? Buffer.from(dataIn, byteOffset, byteLength)\n : (new DataView(dataIn, byteOffset, byteLength) as any),\n { offset: 0 }, // The underlying buffer has already been offsetted\n byteLength,\n hasNodeBuffers ? GET_FUNCTION_BUF : GET_FUNCTION,\n hasNodeBuffers ? decodeStringFromNodeBuffer : (decodeStringFromDataView as any)\n )\n }\n\n /**\n * Writes/serializes the given object into a Buffer. \\\n * Method available ONLY on NodeJS and Bun.\n *\n * If possible, always prefer writing using this method, as it is much faster than the other ones.\n */\n writeNodeBuffer(dataOut: ToJson<T>) {\n const byteLength = this.precalculateBufferLengthWithStrings(dataOut)\n const buffer = Buffer.allocUnsafe(byteLength)\n\n return this.write(\n buffer,\n dataOut,\n { offset: 0 },\n byteLength,\n byteLength,\n SET_FUNCTION_BUF,\n growNodeBuffer,\n encodeStringIntoNodeBuffer\n )\n }\n\n /**\n * Writes/serializes the given object into a DataView. \\\n */\n writeDataView(dataOut: ToJson<T>) {\n const byteLength = this.precalculateBufferLengthWithStrings(dataOut)\n const dataview = new DataView(new ArrayBuffer(byteLength))\n\n return this.write(\n dataview,\n dataOut,\n { offset: 0 },\n byteLength,\n byteLength,\n SET_FUNCTION,\n growDataView,\n encodeStringIntoDataView\n )\n }\n\n /**\n * Writes/serializes the given object into an ArrayBuffer. \\\n * This method is just a wrapper around either `writeNodeBuffer` or `writeDataView`. \\\n *\n * This method works with JavaScript standard raw ArrayBuffer(s) and, as such, is very error prone: \\\n * Make sure you're using the returned byteLength and byteOffset fields in the read counterpart. \\\n *\n * Always consider whether is possible to use directly `writeNodeBuffer` or `writeDataView` instead of `writeArrayBuffer`. \\\n * For more information read the `readArrayBuffer` documentation.\n */\n writeArrayBuffer(dataOut: ToJson<T>) {\n const buf = hasNodeBuffers ? this.writeNodeBuffer(dataOut) : this.writeDataView(dataOut)\n return { buffer: buf.buffer, byteLength: buf.byteLength, byteOffset: buf.byteOffset }\n }\n\n /**\n * Creates a \"visitor\" object for this BinaryPacket definition. \\\n * Used when visiting and \"pattern matching\" buffers with the `BinaryPacket::visit` static utility methods. \\\n *\n * For more information read the `BinaryPacket::visitNodeBuffer` documentation. 
\\\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n visitor(onVisit: (packet: ToJson<T>) => void): Visitor {\n return [this, onVisit]\n }\n\n sequentialSerializer(numElements: number, dataOut: Iterable<ToJson<T>>) {\n const byteLength = this.minimumByteLength * numElements\n const buffer = Buffer.allocUnsafe(byteLength)\n const offsetPointer = { offset: 0 }\n\n for (const element of dataOut) {\n this.write(\n buffer,\n element,\n offsetPointer,\n byteLength,\n byteLength,\n SET_FUNCTION_BUF,\n growNodeBuffer,\n encodeStringIntoNodeBuffer\n )\n }\n }\n\n /// PRIVATE\n\n private readonly entries: Entries\n readonly stringPositions: StringPositions\n readonly minimumByteLength: number\n\n private constructor(\n private readonly packetId: number,\n definition?: T\n ) {\n this.entries = definition ? sortEntries(definition) : []\n const inspection = inspectEntries(this.entries)\n this.minimumByteLength = inspection.minimumByteLength\n this.stringPositions = inspection.stringPositions\n }\n\n private static visit<Buf extends DataView | Buffer>(\n dataIn: Buf,\n readFunctions: typeof GET_FUNCTION | typeof GET_FUNCTION_BUF,\n decodeStringFunction: (dataIn: Buf, byteOffset: number, strlen: number) => string,\n visitors: Visitor[]\n ) {\n for (const [Packet, onVisit] of visitors) {\n if (Packet.packetId === readFunctions[Field.UNSIGNED_INT_8](dataIn as any, 0)) {\n return onVisit(\n Packet.read(dataIn, { offset: 0 }, dataIn.byteLength, readFunctions, decodeStringFunction)\n )\n }\n }\n }\n\n private read<Buf extends DataView | Buffer>(\n dataIn: Buf,\n offsetPointer: { offset: number },\n byteLength: number,\n readFunctions: typeof GET_FUNCTION | typeof GET_FUNCTION_BUF,\n decodeStringFunction: (dataIn: Buf, byteOffset: number, strlen: number) => string\n ): ToJson<T, true> {\n if (byteLength + offsetPointer.offset < this.minimumByteLength) {\n throw new Error(\n `There is no space available to fit a packet of type ${this.packetId} at offset ${offsetPointer.offset}`\n )\n }\n\n if (\n readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset) !== this.packetId\n ) {\n throw new Error(\n `Data at offset ${offsetPointer.offset} is not a packet of type ${this.packetId}`\n )\n }\n\n offsetPointer.offset += 1\n const result: any = {}\n\n for (const [name, def] of this.entries) {\n if (Array.isArray(def)) {\n const length =\n // def[1] is the length of a statically-sized array, if undefined: must read the length from the buffer as it means it's a dynamically-sized array\n def[1] ?? 
readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset++)\n\n const array = Array(length)\n\n const itemType = def[0]\n\n if (typeof itemType === 'object') {\n // Array of \"subpackets\"\n for (let i = 0; i < length; ++i) {\n array[i] = itemType.read(\n dataIn,\n offsetPointer,\n byteLength,\n readFunctions,\n decodeStringFunction\n )\n }\n } else if (itemType === '') {\n // Array of strings\n for (let i = 0; i < length; ++i) {\n const strlen = readFunctions[Field.UNSIGNED_INT_16](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += 2\n\n array[i] = decodeStringFunction(dataIn, offsetPointer.offset, strlen)\n offsetPointer.offset += strlen\n }\n } else {\n // Array of primitives (numbers)\n const itemSize = BYTE_SIZE[itemType]\n\n // It seems like looping over each element is actually much faster than using TypedArrays bulk copy.\n // TODO: properly benchmark with various array sizes to see if it's actually the case.\n for (let i = 0; i < length; ++i) {\n array[i] = readFunctions[itemType](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += itemSize\n }\n }\n\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = array\n } else if (typeof def === 'number') {\n // Single primitive (number)\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = readFunctions[def](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += BYTE_SIZE[def]\n } else if (def === '') {\n const strlen = readFunctions[Field.UNSIGNED_INT_16](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += 2\n\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = decodeStringFunction(dataIn, offsetPointer.offset, strlen)\n offsetPointer.offset += strlen\n } else if ('flags' in def) {\n // BitFlags\n const flags = readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += 1\n\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = {}\n\n for (let bit = 0; bit < def.flags.length; ++bit) {\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name][def.flags[bit]] = !!(flags & (1 << bit))\n }\n } else if ('optional' in def) {\n // Single optional \"subpacket\"\n const hasSubPacket =\n readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset) !== 0\n\n offsetPointer.offset += 1\n\n if (hasSubPacket) {\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = def.optional.read(\n dataIn,\n offsetPointer,\n byteLength,\n readFunctions,\n decodeStringFunction\n )\n }\n } else {\n // Single \"subpacket\"\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = def.read(\n dataIn,\n offsetPointer,\n byteLength,\n readFunctions,\n decodeStringFunction\n )\n }\n }\n\n return result as ToJson<T, true>\n }\n\n private write<Buf extends DataView | Buffer>(\n buffer: Buf,\n dataOut: ToJson<T>,\n offsetPointer: { offset: number },\n byteLength: number,\n maxByteLength: number,\n writeFunctions: typeof SET_FUNCTION | typeof SET_FUNCTION_BUF,\n growBufferFunction: (buffer: Buf, newByteLength: number) => Buf,\n encodeStringFunction: (buffer: Buf, byteOffset: number, string: string) => void\n ): Buf {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, this.packetId, offsetPointer.offset)\n offsetPointer.offset += 1\n\n for (const [name, def] of this.entries) {\n const data = dataOut[name]\n\n if 
(Array.isArray(def)) {\n // Could be both an array of just numbers or \"subpackets\"\n\n const length = (data as SequentiallySerializable<any, false>).length\n\n // Check if it is a dynamically-sized array, if it is, the length of the array must be serialized in the buffer before its elements\n // Explicitly check for undefined and not falsy values because it could be a statically-sized array of 0 elements.\n const isDynamicArray = def[1] === undefined\n\n if (isDynamicArray) {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, length, offsetPointer.offset)\n offsetPointer.offset += 1\n }\n\n if (length > 0) {\n const itemType = def[0]\n\n if (typeof itemType === 'object') {\n // Array of \"subpackets\"\n\n if (isDynamicArray) {\n const neededBytesForElements = length * itemType.minimumByteLength\n\n byteLength += neededBytesForElements\n maxByteLength += neededBytesForElements\n\n if (buffer.byteLength < maxByteLength) {\n buffer = growBufferFunction(buffer, maxByteLength)\n }\n }\n\n for (const object of data as unknown as ToJson<Definition>[]) {\n // Array of \"subpackets\"\n buffer = itemType.write(\n buffer,\n object,\n offsetPointer,\n byteLength,\n maxByteLength,\n writeFunctions,\n growBufferFunction,\n encodeStringFunction\n )\n\n byteLength = offsetPointer.offset\n maxByteLength = buffer.byteLength\n }\n } else if (itemType === '') {\n // Array of strings\n for (let i = 0; i < length; ++i) {\n const str = (data as unknown as string[])[i]\n const strlen = str.length\n\n writeFunctions[Field.UNSIGNED_INT_16](buffer as any, strlen, offsetPointer.offset)\n offsetPointer.offset += 2\n\n encodeStringFunction(buffer, offsetPointer.offset, str)\n offsetPointer.offset += strlen\n }\n } else {\n // Array of primitives (numbers)\n const itemSize = BYTE_SIZE[itemType]\n\n if (isDynamicArray) {\n const neededBytesForElements = length * itemSize\n\n byteLength += neededBytesForElements\n maxByteLength += neededBytesForElements\n\n if (buffer.byteLength < maxByteLength) {\n buffer = growBufferFunction(buffer, maxByteLength)\n }\n }\n\n // It seems like looping over each element is actually much faster than using TypedArrays bulk copy.\n // TODO: properly benchmark with various array sizes to see if it's actually the case.\n for (const number of data as SequentiallySerializable<number, false>) {\n writeFunctions[itemType](buffer as any, number, offsetPointer.offset)\n offsetPointer.offset += itemSize\n }\n }\n }\n } else if (typeof def === 'number') {\n // Single primitive (number)\n writeFunctions[def](buffer as any, data as number, offsetPointer.offset)\n offsetPointer.offset += BYTE_SIZE[def]\n } else if (def === '') {\n // String\n const strlen = (data as string).length\n\n writeFunctions[Field.UNSIGNED_INT_16](buffer as any, strlen, offsetPointer.offset)\n offsetPointer.offset += 2\n\n encodeStringFunction(buffer, offsetPointer.offset, data as string)\n offsetPointer.offset += strlen\n } else if ('flags' in def) {\n // BitFlags\n let flags = 0\n\n for (let bit = 0; bit < def.flags.length; ++bit) {\n if ((data as Record<string, boolean>)[def.flags[bit]]) {\n flags |= 1 << bit\n }\n }\n\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, flags, offsetPointer.offset)\n offsetPointer.offset += 1\n } else if ('optional' in def) {\n if (data) {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, 1, offsetPointer.offset)\n offsetPointer.offset += 1\n\n byteLength += def.optional.minimumByteLength\n maxByteLength += def.optional.minimumByteLength\n\n if (buffer.byteLength < maxByteLength) 
{\n buffer = growBufferFunction(buffer, maxByteLength)\n }\n\n buffer = def.optional.write(\n buffer,\n data as ToJson<Definition>,\n offsetPointer,\n byteLength,\n maxByteLength,\n writeFunctions,\n growBufferFunction,\n encodeStringFunction\n )\n\n byteLength = offsetPointer.offset\n maxByteLength = buffer.byteLength\n } else {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, 0, offsetPointer.offset)\n offsetPointer.offset += 1\n }\n } else {\n // Single \"subpacket\"\n buffer = def.write(\n buffer,\n data as ToJson<Definition>,\n offsetPointer,\n byteLength,\n maxByteLength,\n writeFunctions,\n growBufferFunction,\n encodeStringFunction\n )\n\n byteLength = offsetPointer.offset\n maxByteLength = buffer.byteLength\n }\n }\n\n return buffer\n }\n\n private precalculateBufferLengthWithStrings(dataOut: ToJson<T>) {\n let len = this.minimumByteLength\n\n for (const field of this.stringPositions[0]) {\n // String field\n len += (dataOut[field] as string).length\n }\n\n for (const field of this.stringPositions[1]) {\n // Array of strings field\n for (const string of dataOut[field] as unknown as string[]) {\n len += 2 + string.length\n }\n }\n\n for (const field in this.stringPositions[2]) {\n // Subpacket that has some string fields\n len += this.stringPositions[2][field].precalculateBufferLengthWithStrings(\n dataOut[field] as any\n )\n }\n\n return len\n }\n}\n\n/**\n * BinaryPacket definition: \\\n * Any packet can be defined through a \"schema\" object explaining its fields names and types.\n *\n * @example\n * // Imagine we have a game board where each cell is a square and is one unit big.\n * // A cell can be then defined by its X and Y coordinates.\n * // For simplicity, let's say there cannot be more than 256 cells, so we can use 8 bits for each coordinate.\n * const Cell = {\n * x: Field.UNSIGNED_INT_8,\n * y: Field.UNSIGNED_INT_8\n * }\n *\n * // When done with the cell definition we can create its BinaryPacket writer/reader.\n * // NOTE: each BinaryPacket needs an unique ID, for identification purposes and error checking.\n * const CellPacket = BinaryPacket.define(0, Cell)\n *\n * // Let's now make the definition of the whole game board.\n * // You can also specify arrays of both \"primitive\" fields and other BinaryPackets.\n * const Board = {\n * numPlayers: Field.UNSIGNED_INT_8,\n * cells: FieldArray(CellPacket)\n * }\n *\n * // When done with the board definition we can create its BinaryPacket writer/reader.\n * // NOTE: each BinaryPacket needs an unique ID, for identification purposes and error checking.\n * const BoardPacket = BinaryPacket.define(1, Board)\n *\n * // And use it.\n * const buffer = BoardPacket.writeNodeBuffer({\n * numPlayers: 1,\n * cells: [\n * { x: 0, y: 0 },\n * { x: 1, y: 1 }\n * ]\n * })\n *\n * // sendTheBufferOver(buffer)\n * // ...\n * // const buffer = receiveTheBuffer()\n * const board = BoardPacket.readNodeBuffer(buffer)\n * // ...\n */\nexport type Definition = {\n [fieldName: string]:\n | MaybeArray<Field>\n | MaybeArray<BinaryPacket<Definition>>\n | MaybeArray<''>\n | { flags: BitFlags }\n | { optional: BinaryPacket<Definition> }\n}\n\ntype MaybeArray<T> = T | [itemType: T] | [itemType: T, length: number]\n\ntype BitFlagsToJson<FlagsArray extends BitFlags> = {\n [key in FlagsArray[number]]: boolean\n}\n\n/**\n * Meta-type that converts a `Definition` schema to the type of the actual JavaScript object that will be written into a packet or read from. 
\\\n */\nexport type ToJson<T extends Definition, IsRead extends boolean = false> = {\n [K in keyof T]: T[K] extends [infer Item]\n ? Item extends BinaryPacket<infer BPDef>\n ? SequentiallySerializable<ToJson<BPDef, IsRead>, IsRead>\n : Item extends ''\n ? SequentiallySerializable<string, IsRead>\n : SequentiallySerializable<number, IsRead>\n : T[K] extends [infer Item, infer Length]\n ? Item extends BinaryPacket<infer BPDef>\n ? SequentiallySerializable<ToJson<BPDef, IsRead>, IsRead> & { length: Length }\n : Item extends ''\n ? string[] & { length: Length }\n : number[] & { length: Length }\n : T[K] extends BinaryPacket<infer BPDef>\n ? ToJson<BPDef, IsRead>\n : T[K] extends { flags: infer FlagsArray extends BitFlags }\n ? BitFlagsToJson<FlagsArray>\n : T[K] extends { optional: BinaryPacket<infer BPDef extends Definition> }\n ? ToJson<BPDef, IsRead> | undefined\n : T[K] extends ''\n ? string\n : number\n}\n\n/**\n * In a JavaScript object, the order of its keys is not strictly defined: sort them by field name. \\\n * Thus, we cannot trust iterating over an object keys: we MUST iterate over its entries array. \\\n * This is important to make sure that whoever shares BinaryPacket definitions can correctly write/read packets independently of their JS engines.\n */\nfunction sortEntries(definition: Definition) {\n return Object.entries(definition).sort(([fieldName1], [fieldName2]) =>\n fieldName1.localeCompare(fieldName2)\n )\n}\n\ntype Entries = ReturnType<typeof sortEntries>\n\ntype StringPositions = [\n string[],\n string[],\n {\n [field: string]: BinaryPacket<Definition>\n }\n]\n\n/**\n * Helper function that \"inspects\" the entries of a BinaryPacket definition\n * and returns useful \"stats\" needed for writing and reading buffers.\n *\n * This function is ever called only once per BinaryPacket definition.\n */\nfunction inspectEntries(entries: Entries) {\n // The PacketID is already 1 byte, that's why we aren't starting from 0.\n let minimumByteLength = 1\n\n const stringPositions: StringPositions = [[], [], {}]\n\n for (const [name, type] of entries) {\n if (Array.isArray(type)) {\n if (type.length === 2) {\n // Statically-sized array\n const isString = type[0] === ''\n\n const itemSize =\n typeof type[0] === 'object'\n ? type[0].minimumByteLength\n : isString\n ? 
2\n : BYTE_SIZE[type[0]]\n\n minimumByteLength += type[1] * itemSize\n\n if (isString) {\n stringPositions[1].push(name)\n }\n } else {\n // Dynamically-sized array\n // Adding 1 byte to serialize the array length\n minimumByteLength += 1\n\n if (type[0] === '') {\n stringPositions[1].push(name)\n }\n }\n } else if (type instanceof BinaryPacket) {\n minimumByteLength += type.minimumByteLength\n stringPositions[2][name] = type\n } else if (typeof type === 'object') {\n // BitFlags & Optionals\n // BitFlags are always 1 byte long, because they can hold up to 8 booleans\n // Optionals minimum is 1 byte long, because it holds whether the subpacket is present or not\n minimumByteLength += 1\n } else if (type === '') {\n // String\n // Adding 2 to serialize the string length\n minimumByteLength += 2\n stringPositions[0].push(name)\n } else {\n minimumByteLength += BYTE_SIZE[type]\n }\n }\n\n return { minimumByteLength, stringPositions }\n}\n\n//////////////////////////////////////////////\n// The logic here is practically over //\n// Here below there are needed constants //\n// that map a field-type to a functionality //\n//////////////////////////////////////////////\n\nconst BYTE_SIZE = Array(8) as number[]\n\nBYTE_SIZE[Field.UNSIGNED_INT_8] = 1\nBYTE_SIZE[Field.INT_8] = 1\n\nBYTE_SIZE[Field.UNSIGNED_INT_16] = 2\nBYTE_SIZE[Field.INT_16] = 2\n\nBYTE_SIZE[Field.UNSIGNED_INT_32] = 4\nBYTE_SIZE[Field.INT_32] = 4\nBYTE_SIZE[Field.FLOAT_32] = 4\n\nBYTE_SIZE[Field.FLOAT_64] = 8\n\nconst GET_FUNCTION = Array(8) as ((view: DataView, offset: number) => number)[]\n\nGET_FUNCTION[Field.UNSIGNED_INT_8] = (view, offset) => view.getUint8(offset)\nGET_FUNCTION[Field.INT_8] = (view, offset) => view.getInt8(offset)\n\nGET_FUNCTION[Field.UNSIGNED_INT_16] = (view, offset) => view.getUint16(offset)\nGET_FUNCTION[Field.INT_16] = (view, offset) => view.getInt16(offset)\n\nGET_FUNCTION[Field.UNSIGNED_INT_32] = (view, offset) => view.getUint32(offset)\nGET_FUNCTION[Field.INT_32] = (view, offset) => view.getInt32(offset)\nGET_FUNCTION[Field.FLOAT_32] = (view, offset) => view.getFloat32(offset)\n\nGET_FUNCTION[Field.FLOAT_64] = (view, offset) => view.getFloat64(offset)\n\nconst SET_FUNCTION = Array(8) as ((view: DataView, value: number, offset: number) => void)[]\n\nSET_FUNCTION[Field.UNSIGNED_INT_8] = (view, value, offset) => view.setUint8(offset, value)\nSET_FUNCTION[Field.INT_8] = (view, value, offset) => view.setInt8(offset, value)\n\nSET_FUNCTION[Field.UNSIGNED_INT_16] = (view, value, offset) => view.setUint16(offset, value)\nSET_FUNCTION[Field.INT_16] = (view, value, offset) => view.setInt16(offset, value)\n\nSET_FUNCTION[Field.UNSIGNED_INT_32] = (view, value, offset) => view.setUint32(offset, value)\nSET_FUNCTION[Field.INT_32] = (view, value, offset) => view.setInt32(offset, value)\nSET_FUNCTION[Field.FLOAT_32] = (view, value, offset) => view.setFloat32(offset, value)\n\nSET_FUNCTION[Field.FLOAT_64] = (view, value, offset) => view.setFloat64(offset, value)\n\nconst SET_FUNCTION_BUF = Array(8) as ((nodeBuffer: Buffer, value: number, offset: number) => void)[]\n\nif (hasNodeBuffers) {\n SET_FUNCTION_BUF[Field.UNSIGNED_INT_8] = (view, value, offset) => view.writeUint8(value, offset)\n SET_FUNCTION_BUF[Field.INT_8] = (view, value, offset) => view.writeInt8(value, offset)\n\n SET_FUNCTION_BUF[Field.UNSIGNED_INT_16] = (view, value, offset) =>\n view.writeUint16BE(value, offset)\n SET_FUNCTION_BUF[Field.INT_16] = (view, value, offset) => view.writeInt16BE(value, offset)\n\n SET_FUNCTION_BUF[Field.UNSIGNED_INT_32] = 
(view, value, offset) =>\n view.writeUint32BE(value, offset)\n SET_FUNCTION_BUF[Field.INT_32] = (view, value, offset) => view.writeInt32BE(value, offset)\n SET_FUNCTION_BUF[Field.FLOAT_32] = (view, value, offset) => view.writeFloatBE(value, offset)\n\n SET_FUNCTION_BUF[Field.FLOAT_64] = (view, value, offset) => view.writeDoubleBE(value, offset)\n}\n\nconst GET_FUNCTION_BUF = Array(8) as ((nodeBuffer: Buffer, offset: number) => number)[]\n\nif (hasNodeBuffers) {\n GET_FUNCTION_BUF[Field.UNSIGNED_INT_8] = (view, offset) => view.readUint8(offset)\n GET_FUNCTION_BUF[Field.INT_8] = (view, offset) => view.readInt8(offset)\n\n GET_FUNCTION_BUF[Field.UNSIGNED_INT_16] = (view, offset) => view.readUint16BE(offset)\n\n GET_FUNCTION_BUF[Field.INT_16] = (view, offset) => view.readInt16BE(offset)\n GET_FUNCTION_BUF[Field.UNSIGNED_INT_32] = (view, offset) => view.readUint32BE(offset)\n\n GET_FUNCTION_BUF[Field.INT_32] = (view, offset) => view.readInt32BE(offset)\n GET_FUNCTION_BUF[Field.FLOAT_32] = (view, offset) => view.readFloatBE(offset)\n GET_FUNCTION_BUF[Field.FLOAT_64] = (view, offset) => view.readDoubleBE(offset)\n}\n","/**\n * Exclusively matches objects of type `ArrayBuffer` and no other types that inherit from it. \\\n * This is needed because the `DataView` constructor explicitly requires a \"true\" ArrayBuffer, or else it throws.\n */\nexport type TrueArrayBuffer = ArrayBuffer & { buffer?: undefined }\n\nexport const hasNodeBuffers = typeof Buffer === 'function'\n\nexport function growDataView(dataview: DataView, newByteLength: number) {\n const resizedBuffer = new ArrayBuffer(newByteLength)\n const amountToCopy = Math.min(dataview.byteLength, resizedBuffer.byteLength)\n\n // Treat the buffer as if it was a Float64Array so we can copy 8 bytes at a time, to finish faster\n let length = Math.trunc(amountToCopy / 8)\n new Float64Array(resizedBuffer, 0, length).set(new Float64Array(dataview.buffer, 0, length))\n\n // Copy the remaining up to 7 bytes\n const offset = length * 8\n length = amountToCopy - offset\n new Uint8Array(resizedBuffer, offset, length).set(new Uint8Array(dataview.buffer, offset, length))\n\n return new DataView(resizedBuffer)\n}\n\nexport function growNodeBuffer(buffer: Buffer, newByteLength: number) {\n const newBuffer = Buffer.allocUnsafe(newByteLength)\n buffer.copy(newBuffer)\n return newBuffer\n}\n\nconst textEncoder = new TextEncoder()\nconst textDecoder = new TextDecoder()\n\nexport function encodeStringIntoDataView(dataview: DataView, byteOffset: number, string: string) {\n const strlen = string.length\n const u8Buffer = new Uint8Array(dataview.buffer, dataview.byteOffset + byteOffset, strlen)\n\n if (strlen <= 64) {\n encodeSmallString(u8Buffer, 0, string, strlen)\n } else {\n textEncoder.encodeInto(string, u8Buffer)\n }\n}\n\nexport function encodeStringIntoNodeBuffer(buffer: Buffer, byteOffset: number, string: string) {\n const strlen = string.length\n\n if (strlen <= 64) {\n encodeSmallString(buffer, byteOffset, string, strlen)\n } else {\n buffer.utf8Write(string, byteOffset, strlen)\n }\n}\n\nfunction encodeSmallString(buffer: Uint8Array, byteOffset: number, string: string, strlen: number) {\n for (let i = 0; i < strlen; ++i) {\n buffer[byteOffset + i] = string.charCodeAt(i) & 0xff\n }\n}\n\nexport function decodeStringFromNodeBuffer(buffer: Buffer, byteOffset: number, strlen: number) {\n return buffer.subarray(byteOffset, byteOffset + strlen).toString('utf8')\n}\n\nexport function decodeStringFromDataView(dataview: DataView, byteOffset: number, strlen: number) 
{\n return textDecoder.decode(new DataView(dataview.buffer, dataview.byteOffset + byteOffset, strlen))\n}\n\ndeclare global {\n interface Buffer {\n /**\n * Node buffer's internals function. \\\n * For some reason it is not exposed through TypeScript. \\\n * Fastest way to write utf8 strings into buffers.\n */\n utf8Write(string: string, byteOffset?: number, byteLength?: number): number\n }\n}\n"],"mappings":"yaAAA,IAAAA,EAAA,GAAAC,EAAAD,EAAA,kBAAAE,EAAA,UAAAC,EAAA,eAAAC,EAAA,kBAAAC,EAAA,oBAAAC,EAAA,kBAAAC,EAAA,gBAAAC,EAAA,yBAAAC,IAAA,eAAAC,EAAAV,GCMO,IAAMW,EAAiB,OAAO,QAAW,WAEzC,SAASC,EAAaC,EAAoBC,EAAuB,CACtE,IAAMC,EAAgB,IAAI,YAAYD,CAAa,EAC7CE,EAAe,KAAK,IAAIH,EAAS,WAAYE,EAAc,UAAU,EAGvEE,EAAS,KAAK,MAAMD,EAAe,CAAC,EACxC,IAAI,aAAaD,EAAe,EAAGE,CAAM,EAAE,IAAI,IAAI,aAAaJ,EAAS,OAAQ,EAAGI,CAAM,CAAC,EAG3F,IAAMC,EAASD,EAAS,EACxB,OAAAA,EAASD,EAAeE,EACxB,IAAI,WAAWH,EAAeG,EAAQD,CAAM,EAAE,IAAI,IAAI,WAAWJ,EAAS,OAAQK,EAAQD,CAAM,CAAC,EAE1F,IAAI,SAASF,CAAa,CACnC,CAEO,SAASI,EAAeC,EAAgBN,EAAuB,CACpE,IAAMO,EAAY,OAAO,YAAYP,CAAa,EAClD,OAAAM,EAAO,KAAKC,CAAS,EACdA,CACT,CAEA,IAAMC,EAAc,IAAI,YAClBC,EAAc,IAAI,YAEjB,SAASC,EAAyBX,EAAoBY,EAAoBC,EAAgB,CAC/F,IAAMC,EAASD,EAAO,OAChBE,EAAW,IAAI,WAAWf,EAAS,OAAQA,EAAS,WAAaY,EAAYE,CAAM,EAErFA,GAAU,GACZE,EAAkBD,EAAU,EAAGF,EAAQC,CAAM,EAE7CL,EAAY,WAAWI,EAAQE,CAAQ,CAE3C,CAEO,SAASE,EAA2BV,EAAgBK,EAAoBC,EAAgB,CAC7F,IAAMC,EAASD,EAAO,OAElBC,GAAU,GACZE,EAAkBT,EAAQK,EAAYC,EAAQC,CAAM,EAEpDP,EAAO,UAAUM,EAAQD,EAAYE,CAAM,CAE/C,CAEA,SAASE,EAAkBT,EAAoBK,EAAoBC,EAAgBC,EAAgB,CACjG,QAASI,EAAI,EAAGA,EAAIJ,EAAQ,EAAEI,EAC5BX,EAAOK,EAAaM,CAAC,EAAIL,EAAO,WAAWK,CAAC,EAAI,GAEpD,CAEO,SAASC,EAA2BZ,EAAgBK,EAAoBE,EAAgB,CAC7F,OAAOP,EAAO,SAASK,EAAYA,EAAaE,CAAM,EAAE,SAAS,MAAM,CACzE,CAEO,SAASM,EAAyBpB,EAAoBY,EAAoBE,EAAgB,CAC/F,OAAOJ,EAAY,OAAO,IAAI,SAASV,EAAS,OAAQA,EAAS,WAAaY,EAAYE,CAAM,CAAC,CACnG,CDvDO,IAAWO,OAKhBA,IAAA,eAAiB,GAAjB,iBAMAA,IAAA,qCAMAA,IAAA,qCAMAA,IAAA,iBAMAA,IAAA,mBAMAA,IAAA,mBAKAA,IAAA,uBAKAA,IAAA,uBA7CgBA,OAAA,IAwDX,SAASC,EACdC,EACe,CACf,MAAO,CAACA,CAAI,CACd,CASO,SAASC,EAGdD,EAASE,EAA+C,CACxD,GAAIA,EAAS,GAAK,CAAC,OAAO,SAASA,CAAM,EACvC,MAAM,IAAI,WAAW,oDAAoD,EAG3E,MAAO,CAACF,EAAME,CAAM,CACtB,CAMO,IAAMC,EAAN,KAAqD,CAC1D,YACmBC,EACDF,EAChB,CAFiB,cAAAE,EACD,YAAAF,CACf,CAEH,CAAC,OAAO,QAAQ,GAAI,CAClB,OAAO,KAAK,SAAS,OAAO,QAAQ,EAAE,CACxC,CACF,EAuBO,SAASG,EAAiDC,EAAmB,CAClF,GAAIA,EAAM,OAAS,EACjB,MAAM,IAAI,MACR,yFAAyFA,EAAM,KAAK,IAAI,CAAC,EAC3G,EAGF,MAAO,CAAE,MAAAA,CAAM,CACjB,CAQO,SAASC,GAAc,CAC5B,MAAO,EACT,CAMO,SAASC,EAAkDC,EAAW,CAC3E,MAAO,CAAE,SAAUA,CAAO,CAC5B,CAQO,IAAMC,EAAN,MAAMC,CAAmC,CAoPtC,YACWC,EACjBC,EACA,CAFiB,cAAAD,EAGjB,KAAK,QAAUC,EAAaC,EAAYD,CAAU,EAAI,CAAC,EACvD,IAAME,EAAaC,EAAe,KAAK,OAAO,EAC9C,KAAK,kBAAoBD,EAAW,kBACpC,KAAK,gBAAkBA,EAAW,eACpC,CAtPA,OAAO,OAA6BH,EAAkBC,EAAgB,CACpE,GAAID,EAAW,GAAK,CAAC,OAAO,SAASA,CAAQ,EAC3C,MAAM,IAAI,WAAW,uCAAuC,EAG9D,GAAIA,EAAW,IACb,MAAM,IAAI,WACR,6GACF,EAGF,OAAO,IAAID,EAAaC,EAAUC,CAAU,CAC9C,CAOA,OAAO,uBAAuBI,EAAgBC,EAAa,EAAG,CAC5D,OAAOD,EAAO,UAAUC,CAAU,CACpC,CAOA,OAAO,qBAAqBC,EAAoBD,EAAa,EAAG,CAC9D,OAAOC,EAAS,SAASD,CAAU,CACrC,CAUA,OAAO,wBAAwBE,EAA8BF,EAAoB,CAC/E,OAAO,IAAI,WAAWE,EAAaF,EAAY,CAAC,EAAE,CAAC,CACrD,CAQA,OAAO,gBAAgBD,KAAmBI,EAAqB,CAC7D,OAAOV,EAAa,MAAMM,EAAQK,EAAkBC,EAA4BF,CAAQ,CAC1F,CAQA,OAAO,cAAcF,KAAuBE,EAAqB,CAC/D,OAAOV,EAAa,MAAMQ,EAAUK,EAAcC,EAA0BJ,CAAQ,CACtF,CAUA,OAAO,iBACLD,EACAF,EACAQ,KACGL,EACH,CACA,OAAOV,EAAa,MAClB,IAAI,SAASS,EAAaF,EAAYQ,CAAU,EAChDF,EACAC,EACAJ,CACF,CACF,CAWA,eACEM,EACAC,EAAgB,CAAE,OAAQ,CAAE,EAC5BF,EAAaC,EAAO,WACH,CACjB,OAAO,KAAK,KACVA,EACAC,EACAF,EACAJ,EACAC,CACF,CACF,CAQA,aACEI,EACAC,EAAgB,CAAE,OAAQ,CAAE,EAC5BF,EAAaC,EAAO,
WACH,CACjB,OAAO,KAAK,KAAKA,EAAQC,EAAeF,EAAYF,EAAcC,CAAwB,CAC5F,CAaA,gBAAgBE,EAAyBT,EAAoBQ,EAAoB,CAC/E,OAAO,KAAK,KACVG,EACI,OAAO,KAAKF,EAAQT,EAAYQ,CAAU,EACzC,IAAI,SAASC,EAAQT,EAAYQ,CAAU,EAChD,CAAE,OAAQ,CAAE,EACZA,EACAG,EAAiBP,EAAmBE,EACpCK,EAAiBN,EAA8BE,CACjD,CACF,CAQA,gBAAgBK,EAAoB,CAClC,IAAMJ,EAAa,KAAK,oCAAoCI,CAAO,EAC7Db,EAAS,OAAO,YAAYS,CAAU,EAE5C,OAAO,KAAK,MACVT,EACAa,EACA,CAAE,OAAQ,CAAE,EACZJ,EACAA,EACAK,EACAC,EACAC,CACF,CACF,CAKA,cAAcH,EAAoB,CAChC,IAAMJ,EAAa,KAAK,oCAAoCI,CAAO,EAC7DX,EAAW,IAAI,SAAS,IAAI,YAAYO,CAAU,CAAC,EAEzD,OAAO,KAAK,MACVP,EACAW,EACA,CAAE,OAAQ,CAAE,EACZJ,EACAA,EACAQ,EACAC,EACAC,CACF,CACF,CAYA,iBAAiBN,EAAoB,CACnC,IAAMO,EAAMR,EAAiB,KAAK,gBAAgBC,CAAO,EAAI,KAAK,cAAcA,CAAO,EACvF,MAAO,CAAE,OAAQO,EAAI,OAAQ,WAAYA,EAAI,WAAY,WAAYA,EAAI,UAAW,CACtF,CASA,QAAQC,EAA+C,CACrD,MAAO,CAAC,KAAMA,CAAO,CACvB,CAEA,qBAAqBC,EAAqBT,EAA8B,CACtE,IAAMJ,EAAa,KAAK,kBAAoBa,EACtCtB,EAAS,OAAO,YAAYS,CAAU,EACtCE,EAAgB,CAAE,OAAQ,CAAE,EAElC,QAAWY,KAAWV,EACpB,KAAK,MACHb,EACAuB,EACAZ,EACAF,EACAA,EACAK,EACAC,EACAC,CACF,CAEJ,CAIiB,QACR,gBACA,kBAYT,OAAe,MACbN,EACAc,EACAC,EACArB,EACA,CACA,OAAW,CAACsB,EAAQL,CAAO,IAAKjB,EAC9B,GAAIsB,EAAO,WAAaF,EAAc,CAAoB,EAAEd,EAAe,CAAC,EAC1E,OAAOW,EACLK,EAAO,KAAKhB,EAAQ,CAAE,OAAQ,CAAE,EAAGA,EAAO,WAAYc,EAAeC,CAAoB,CAC3F,CAGN,CAEQ,KACNf,EACAC,EACAF,EACAe,EACAC,EACiB,CACjB,GAAIhB,EAAaE,EAAc,OAAS,KAAK,kBAC3C,MAAM,IAAI,MACR,uDAAuD,KAAK,QAAQ,cAAcA,EAAc,MAAM,EACxG,EAGF,GACEa,EAAc,CAAoB,EAAEd,EAAeC,EAAc,MAAM,IAAM,KAAK,SAElF,MAAM,IAAI,MACR,kBAAkBA,EAAc,MAAM,4BAA4B,KAAK,QAAQ,EACjF,EAGFA,EAAc,QAAU,EACxB,IAAMgB,EAAc,CAAC,EAErB,OAAW,CAACC,EAAMC,CAAG,IAAK,KAAK,QAC7B,GAAI,MAAM,QAAQA,CAAG,EAAG,CACtB,IAAM5C,EAEJ4C,EAAI,CAAC,GAAKL,EAAc,CAAoB,EAAEd,EAAeC,EAAc,QAAQ,EAE/EmB,EAAQ,MAAM7C,CAAM,EAEpB8C,EAAWF,EAAI,CAAC,EAEtB,GAAI,OAAOE,GAAa,SAEtB,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAC5BF,EAAME,CAAC,EAAID,EAAS,KAClBrB,EACAC,EACAF,EACAe,EACAC,CACF,UAEOM,IAAa,GAEtB,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAAG,CAC/B,IAAMC,EAAST,EAAc,CAAqB,EAAEd,EAAeC,EAAc,MAAM,EACvFA,EAAc,QAAU,EAExBmB,EAAME,CAAC,EAAIP,EAAqBf,EAAQC,EAAc,OAAQsB,CAAM,EACpEtB,EAAc,QAAUsB,CAC1B,KACK,CAEL,IAAMC,EAAWC,EAAUJ,CAAQ,EAInC,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAC5BF,EAAME,CAAC,EAAIR,EAAcO,CAAQ,EAAErB,EAAeC,EAAc,MAAM,EACtEA,EAAc,QAAUuB,CAE5B,CAGAP,EAAOC,CAAI,EAAIE,CACjB,SAAW,OAAOD,GAAQ,SAGxBF,EAAOC,CAAI,EAAIJ,EAAcK,CAAG,EAAEnB,EAAeC,EAAc,MAAM,EACrEA,EAAc,QAAUwB,EAAUN,CAAG,UAC5BA,IAAQ,GAAI,CACrB,IAAMI,EAAST,EAAc,CAAqB,EAAEd,EAAeC,EAAc,MAAM,EACvFA,EAAc,QAAU,EAGxBgB,EAAOC,CAAI,EAAIH,EAAqBf,EAAQC,EAAc,OAAQsB,CAAM,EACxEtB,EAAc,QAAUsB,CAC1B,SAAW,UAAWJ,EAAK,CAEzB,IAAMxC,EAAQmC,EAAc,CAAoB,EAAEd,EAAeC,EAAc,MAAM,EACrFA,EAAc,QAAU,EAGxBgB,EAAOC,CAAI,EAAI,CAAC,EAEhB,QAASQ,EAAM,EAAGA,EAAMP,EAAI,MAAM,OAAQ,EAAEO,EAE1CT,EAAOC,CAAI,EAAEC,EAAI,MAAMO,CAAG,CAAC,EAAI,CAAC,EAAE/C,EAAS,GAAK+C,EAEpD,SAAW,aAAcP,EAAK,CAE5B,IAAMQ,EACJb,EAAc,CAAoB,EAAEd,EAAeC,EAAc,MAAM,IAAM,EAE/EA,EAAc,QAAU,EAEpB0B,IAEFV,EAAOC,CAAI,EAAIC,EAAI,SAAS,KAC1BnB,EACAC,EACAF,EACAe,EACAC,CACF,EAEJ,MAGEE,EAAOC,CAAI,EAAIC,EAAI,KACjBnB,EACAC,EACAF,EACAe,EACAC,CACF,EAIJ,OAAOE,CACT,CAEQ,MACN3B,EACAa,EACAF,EACAF,EACA6B,EACAC,EACAC,EACAC,EACK,CACLF,EAAe,CAAoB,EAAEvC,EAAe,KAAK,SAAUW,EAAc,MAAM,EACvFA,EAAc,QAAU,EAExB,OAAW,CAACiB,EAAMC,CAAG,IAAK,KAAK,QAAS,CACtC,IAAMa,EAAO7B,EAAQe,CAAI,EAEzB,GAAI,MAAM,QAAQC,CAAG,EAAG,CAGtB,IAAM5C,EAAUyD,EAA8C,OAIxDC,EAAiBd,EAAI,CAAC,IAAM,OAOlC,GALIc,IACFJ,EAAe,CAAoB,EAAEvC,EAAef,EAAQ0B,EAAc,MAAM,EAChFA,EAAc,QAAU,GAGtB1B,EAAS,EAAG,CACd,IAAM8C,EAAWF,EAAI,CAAC,EAEtB,GAAI,OAAOE,GAAa,SAAU,CAGhC,GAAIY,EAAgB,CAClB,IAAMC,EAAyB3D,EAAS8C,EAAS,kBAEjDtB,GAAcmC,EACdN,GAAiBM,EAEb5C,EAAO,WA
AasC,IACtBtC,EAASwC,EAAmBxC,EAAQsC,CAAa,EAErD,CAEA,QAAWO,KAAUH,EAEnB1C,EAAS+B,EAAS,MAChB/B,EACA6C,EACAlC,EACAF,EACA6B,EACAC,EACAC,EACAC,CACF,EAEAhC,EAAaE,EAAc,OAC3B2B,EAAgBtC,EAAO,UAE3B,SAAW+B,IAAa,GAEtB,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAAG,CAC/B,IAAMc,EAAOJ,EAA6BV,CAAC,EACrCC,EAASa,EAAI,OAEnBP,EAAe,CAAqB,EAAEvC,EAAeiC,EAAQtB,EAAc,MAAM,EACjFA,EAAc,QAAU,EAExB8B,EAAqBzC,EAAQW,EAAc,OAAQmC,CAAG,EACtDnC,EAAc,QAAUsB,CAC1B,KACK,CAEL,IAAMC,EAAWC,EAAUJ,CAAQ,EAEnC,GAAIY,EAAgB,CAClB,IAAMC,EAAyB3D,EAASiD,EAExCzB,GAAcmC,EACdN,GAAiBM,EAEb5C,EAAO,WAAasC,IACtBtC,EAASwC,EAAmBxC,EAAQsC,CAAa,EAErD,CAIA,QAAWS,KAAUL,EACnBH,EAAeR,CAAQ,EAAE/B,EAAe+C,EAAQpC,EAAc,MAAM,EACpEA,EAAc,QAAUuB,CAE5B,CACF,CACF,SAAW,OAAOL,GAAQ,SAExBU,EAAeV,CAAG,EAAE7B,EAAe0C,EAAgB/B,EAAc,MAAM,EACvEA,EAAc,QAAUwB,EAAUN,CAAG,UAC5BA,IAAQ,GAAI,CAErB,IAAMI,EAAUS,EAAgB,OAEhCH,EAAe,CAAqB,EAAEvC,EAAeiC,EAAQtB,EAAc,MAAM,EACjFA,EAAc,QAAU,EAExB8B,EAAqBzC,EAAQW,EAAc,OAAQ+B,CAAc,EACjE/B,EAAc,QAAUsB,CAC1B,SAAW,UAAWJ,EAAK,CAEzB,IAAIxC,EAAQ,EAEZ,QAAS+C,EAAM,EAAGA,EAAMP,EAAI,MAAM,OAAQ,EAAEO,EACrCM,EAAiCb,EAAI,MAAMO,CAAG,CAAC,IAClD/C,GAAS,GAAK+C,GAIlBG,EAAe,CAAoB,EAAEvC,EAAeX,EAAOsB,EAAc,MAAM,EAC/EA,EAAc,QAAU,CAC1B,KAAW,aAAckB,EACnBa,GACFH,EAAe,CAAoB,EAAEvC,EAAe,EAAGW,EAAc,MAAM,EAC3EA,EAAc,QAAU,EAExBF,GAAcoB,EAAI,SAAS,kBAC3BS,GAAiBT,EAAI,SAAS,kBAE1B7B,EAAO,WAAasC,IACtBtC,EAASwC,EAAmBxC,EAAQsC,CAAa,GAGnDtC,EAAS6B,EAAI,SAAS,MACpB7B,EACA0C,EACA/B,EACAF,EACA6B,EACAC,EACAC,EACAC,CACF,EAEAhC,EAAaE,EAAc,OAC3B2B,EAAgBtC,EAAO,aAEvBuC,EAAe,CAAoB,EAAEvC,EAAe,EAAGW,EAAc,MAAM,EAC3EA,EAAc,QAAU,IAI1BX,EAAS6B,EAAI,MACX7B,EACA0C,EACA/B,EACAF,EACA6B,EACAC,EACAC,EACAC,CACF,EAEAhC,EAAaE,EAAc,OAC3B2B,EAAgBtC,EAAO,WAE3B,CAEA,OAAOA,CACT,CAEQ,oCAAoCa,EAAoB,CAC9D,IAAImC,EAAM,KAAK,kBAEf,QAAWC,KAAS,KAAK,gBAAgB,CAAC,EAExCD,GAAQnC,EAAQoC,CAAK,EAAa,OAGpC,QAAWA,KAAS,KAAK,gBAAgB,CAAC,EAExC,QAAWC,KAAUrC,EAAQoC,CAAK,EAChCD,GAAO,EAAIE,EAAO,OAItB,QAAWD,KAAS,KAAK,gBAAgB,CAAC,EAExCD,GAAO,KAAK,gBAAgB,CAAC,EAAEC,CAAK,EAAE,oCACpCpC,EAAQoC,CAAK,CACf,EAGF,OAAOD,CACT,CACF,EA4FA,SAASnD,EAAYD,EAAwB,CAC3C,OAAO,OAAO,QAAQA,CAAU,EAAE,KAAK,CAAC,CAACuD,CAAU,EAAG,CAACC,CAAU,IAC/DD,EAAW,cAAcC,CAAU,CACrC,CACF,CAkBA,SAASrD,EAAesD,EAAkB,CAExC,IAAIC,EAAoB,EAElBC,EAAmC,CAAC,CAAC,EAAG,CAAC,EAAG,CAAC,CAAC,EAEpD,OAAW,CAAC3B,EAAM4B,CAAI,IAAKH,EACzB,GAAI,MAAM,QAAQG,CAAI,EACpB,GAAIA,EAAK,SAAW,EAAG,CAErB,IAAMC,EAAWD,EAAK,CAAC,IAAM,GAEvBtB,EACJ,OAAOsB,EAAK,CAAC,GAAM,SACfA,EAAK,CAAC,EAAE,kBACRC,EACE,EACAtB,EAAUqB,EAAK,CAAC,CAAC,EAEzBF,GAAqBE,EAAK,CAAC,EAAItB,EAE3BuB,GACFF,EAAgB,CAAC,EAAE,KAAK3B,CAAI,CAEhC,MAGE0B,GAAqB,EAEjBE,EAAK,CAAC,IAAM,IACdD,EAAgB,CAAC,EAAE,KAAK3B,CAAI,OAGvB4B,aAAgB/D,GACzB6D,GAAqBE,EAAK,kBAC1BD,EAAgB,CAAC,EAAE3B,CAAI,EAAI4B,GAClB,OAAOA,GAAS,SAIzBF,GAAqB,EACZE,IAAS,IAGlBF,GAAqB,EACrBC,EAAgB,CAAC,EAAE,KAAK3B,CAAI,GAE5B0B,GAAqBnB,EAAUqB,CAAI,EAIvC,MAAO,CAAE,kBAAAF,EAAmB,gBAAAC,CAAgB,CAC9C,CAQA,IAAMpB,EAAY,MAAM,CAAC,EAEzBA,EAAU,CAAoB,EAAI,EAClCA,EAAU,CAAW,EAAI,EAEzBA,EAAU,CAAqB,EAAI,EACnCA,EAAU,CAAY,EAAI,EAE1BA,EAAU,CAAqB,EAAI,EACnCA,EAAU,CAAY,EAAI,EAC1BA,EAAU,CAAc,EAAI,EAE5BA,EAAU,CAAc,EAAI,EAE5B,IAAM5B,EAAe,MAAM,CAAC,EAE5BA,EAAa,CAAoB,EAAI,CAACmD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EAC3EpD,EAAa,CAAW,EAAI,CAACmD,EAAMC,IAAWD,EAAK,QAAQC,CAAM,EAEjEpD,EAAa,CAAqB,EAAI,CAACmD,EAAMC,IAAWD,EAAK,UAAUC,CAAM,EAC7EpD,EAAa,CAAY,EAAI,CAACmD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EAEnEpD,EAAa,CAAqB,EAAI,CAACmD,EAAMC,IAAWD,EAAK,UAAUC,CAAM,EAC7EpD,EAAa,CAAY,EAAI,CAACmD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EACnEpD,EAAa,CAAc,EAAI,CAACmD,EAAMC,IAAWD,EAAK,WAAWC,CAAM,EAEvEpD,EAAa,CAAc,EAAI,CAACmD,EAAMC,IAAWD,EAAK,WAAWC,CAAM,EAEvE,IAAM1C,EAAe,MAAM,CAAC,
EAE5BA,EAAa,CAAoB,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,SAASC,EAAQC,CAAK,EACzF3C,EAAa,CAAW,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,QAAQC,EAAQC,CAAK,EAE/E3C,EAAa,CAAqB,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,UAAUC,EAAQC,CAAK,EAC3F3C,EAAa,CAAY,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,SAASC,EAAQC,CAAK,EAEjF3C,EAAa,CAAqB,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,UAAUC,EAAQC,CAAK,EAC3F3C,EAAa,CAAY,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,SAASC,EAAQC,CAAK,EACjF3C,EAAa,CAAc,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,WAAWC,EAAQC,CAAK,EAErF3C,EAAa,CAAc,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,WAAWC,EAAQC,CAAK,EAErF,IAAM9C,EAAmB,MAAM,CAAC,EAE5BF,IACFE,EAAiB,CAAoB,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,WAAWE,EAAOD,CAAM,EAC/F7C,EAAiB,CAAW,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,UAAUE,EAAOD,CAAM,EAErF7C,EAAiB,CAAqB,EAAI,CAAC4C,EAAME,EAAOD,IACtDD,EAAK,cAAcE,EAAOD,CAAM,EAClC7C,EAAiB,CAAY,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,aAAaE,EAAOD,CAAM,EAEzF7C,EAAiB,CAAqB,EAAI,CAAC4C,EAAME,EAAOD,IACtDD,EAAK,cAAcE,EAAOD,CAAM,EAClC7C,EAAiB,CAAY,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,aAAaE,EAAOD,CAAM,EACzF7C,EAAiB,CAAc,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,aAAaE,EAAOD,CAAM,EAE3F7C,EAAiB,CAAc,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,cAAcE,EAAOD,CAAM,GAG9F,IAAMtD,EAAmB,MAAM,CAAC,EAE5BO,IACFP,EAAiB,CAAoB,EAAI,CAACqD,EAAMC,IAAWD,EAAK,UAAUC,CAAM,EAChFtD,EAAiB,CAAW,EAAI,CAACqD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EAEtEtD,EAAiB,CAAqB,EAAI,CAACqD,EAAMC,IAAWD,EAAK,aAAaC,CAAM,EAEpFtD,EAAiB,CAAY,EAAI,CAACqD,EAAMC,IAAWD,EAAK,YAAYC,CAAM,EAC1EtD,EAAiB,CAAqB,EAAI,CAACqD,EAAMC,IAAWD,EAAK,aAAaC,CAAM,EAEpFtD,EAAiB,CAAY,EAAI,CAACqD,EAAMC,IAAWD,EAAK,YAAYC,CAAM,EAC1EtD,EAAiB,CAAc,EAAI,CAACqD,EAAMC,IAAWD,EAAK,YAAYC,CAAM,EAC5EtD,EAAiB,CAAc,EAAI,CAACqD,EAAMC,IAAWD,EAAK,aAAaC,CAAM","names":["index_exports","__export","BinaryPacket","Field","FieldArray","FieldBitFlags","FieldFixedArray","FieldOptional","FieldString","SequentialSerializer","__toCommonJS","hasNodeBuffers","growDataView","dataview","newByteLength","resizedBuffer","amountToCopy","length","offset","growNodeBuffer","buffer","newBuffer","textEncoder","textDecoder","encodeStringIntoDataView","byteOffset","string","strlen","u8Buffer","encodeSmallString","encodeStringIntoNodeBuffer","i","decodeStringFromNodeBuffer","decodeStringFromDataView","Field","FieldArray","item","FieldFixedArray","length","SequentialSerializer","iterable","FieldBitFlags","flags","FieldString","FieldOptional","packet","BinaryPacket","_BinaryPacket","packetId","definition","sortEntries","inspection","inspectEntries","buffer","byteOffset","dataview","arraybuffer","visitors","GET_FUNCTION_BUF","decodeStringFromNodeBuffer","GET_FUNCTION","decodeStringFromDataView","byteLength","dataIn","offsetPointer","hasNodeBuffers","dataOut","SET_FUNCTION_BUF","growNodeBuffer","encodeStringIntoNodeBuffer","SET_FUNCTION","growDataView","encodeStringIntoDataView","buf","onVisit","numElements","element","readFunctions","decodeStringFunction","Packet","result","name","def","array","itemType","i","strlen","itemSize","BYTE_SIZE","bit","hasSubPacket","maxByteLength","writeFunctions","growBufferFunction","encodeStringFunction","data","isDynamicArray","neededBytesForElements","object","str","number","len","field","string","fieldName1","fieldName2","entries","minimumByteLength","stringPositions","type","isString","view","offset","value"]}
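The `sourcesContent` embedded in the updated sourcemap above documents the package's public API (`BinaryPacket.define`, `FieldArray`, `FieldFixedArray`, `FieldString`, `FieldBitFlags`, `FieldOptional`, `writeNodeBuffer`/`readNodeBuffer`, and visitor-based dispatch). As an orientation aid for readers of this diff, here is a minimal usage sketch assembled from those docstrings; the packet shapes, the sample values, and the bare `binary-packet` import specifier are illustrative assumptions and are not part of the published files.

```ts
import {
  BinaryPacket,
  Field,
  FieldArray,
  FieldBitFlags,
  FieldOptional,
  FieldString
} from 'binary-packet'

// A small "subpacket": two unsigned 8-bit coordinates.
const PointPacket = BinaryPacket.define(0, {
  x: Field.UNSIGNED_INT_8,
  y: Field.UNSIGNED_INT_8
})

// A packet exercising the field kinds described in the docstrings above:
// a primitive, a length-prefixed string, a dynamically-sized array of
// subpackets, up to 8 booleans packed into one byte, and an optional subpacket.
const PlayerPacket = BinaryPacket.define(1, {
  id: Field.UNSIGNED_INT_16,
  name: FieldString(),
  trail: FieldArray(PointPacket),
  state: FieldBitFlags(['alive', 'ready']),
  target: FieldOptional(PointPacket)
})

// Serialize into a Node Buffer (the docstrings call this the fastest path)...
const buffer = PlayerPacket.writeNodeBuffer({
  id: 7,
  name: 'ada',
  trail: [
    { x: 1, y: 2 },
    { x: 3, y: 4 }
  ],
  state: { alive: true, ready: false },
  target: undefined
})

// ...and read it back into a plain object.
const player = PlayerPacket.readNodeBuffer(buffer)
console.log(player.name, player.trail.length)

// When one channel carries several packet types, dispatch on the packet ID.
BinaryPacket.visitNodeBuffer(
  buffer,
  PointPacket.visitor(point => console.log('point', point.x, point.y)),
  PlayerPacket.visitor(p => console.log('player', p.id))
)
```

Per the embedded source, fields are written in alphabetical field-name order, multi-byte numbers are big-endian, dynamic arrays are prefixed with a uint8 length, and strings with a uint16 length, so sender and receiver only need to share the same definition objects and packet IDs.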
@@ -1 +1 @@
1
- {"version":3,"sources":["../src/buffers.ts","../src/index.ts"],"sourcesContent":["/**\r\n * Exclusively matches objects of type `ArrayBuffer` and no other types that inherit from it. \\\r\n * This is needed because the `DataView` constructor explicitly requires a \"true\" ArrayBuffer, or else it throws.\r\n */\r\nexport type TrueArrayBuffer = ArrayBuffer & { buffer?: undefined }\r\n\r\nexport const hasNodeBuffers = typeof Buffer === 'function'\r\n\r\nexport function growDataView(dataview: DataView, newByteLength: number) {\r\n const resizedBuffer = new ArrayBuffer(newByteLength)\r\n const amountToCopy = Math.min(dataview.byteLength, resizedBuffer.byteLength)\r\n\r\n // Treat the buffer as if it was a Float64Array so we can copy 8 bytes at a time, to finish faster\r\n let length = Math.trunc(amountToCopy / 8)\r\n new Float64Array(resizedBuffer, 0, length).set(new Float64Array(dataview.buffer, 0, length))\r\n\r\n // Copy the remaining up to 7 bytes\r\n const offset = length * 8\r\n length = amountToCopy - offset\r\n new Uint8Array(resizedBuffer, offset, length).set(new Uint8Array(dataview.buffer, offset, length))\r\n\r\n return new DataView(resizedBuffer)\r\n}\r\n\r\nexport function growNodeBuffer(buffer: Buffer, newByteLength: number) {\r\n const newBuffer = Buffer.allocUnsafe(newByteLength)\r\n buffer.copy(newBuffer)\r\n return newBuffer\r\n}\r\n\r\nconst textEncoder = new TextEncoder()\r\nconst textDecoder = new TextDecoder()\r\n\r\nexport function encodeStringIntoDataView(dataview: DataView, byteOffset: number, string: string) {\r\n const strlen = string.length\r\n const u8Buffer = new Uint8Array(dataview.buffer, dataview.byteOffset + byteOffset, strlen)\r\n\r\n if (strlen <= 64) {\r\n encodeSmallString(u8Buffer, 0, string, strlen)\r\n } else {\r\n textEncoder.encodeInto(string, u8Buffer)\r\n }\r\n}\r\n\r\nexport function encodeStringIntoNodeBuffer(buffer: Buffer, byteOffset: number, string: string) {\r\n const strlen = string.length\r\n\r\n if (strlen <= 64) {\r\n encodeSmallString(buffer, byteOffset, string, strlen)\r\n } else {\r\n buffer.utf8Write(string, byteOffset, strlen)\r\n }\r\n}\r\n\r\nfunction encodeSmallString(buffer: Uint8Array, byteOffset: number, string: string, strlen: number) {\r\n for (let i = 0; i < strlen; ++i) {\r\n buffer[byteOffset + i] = string.charCodeAt(i) & 0xff\r\n }\r\n}\r\n\r\nexport function decodeStringFromNodeBuffer(buffer: Buffer, byteOffset: number, strlen: number) {\r\n return buffer.subarray(byteOffset, byteOffset + strlen).toString('utf8')\r\n}\r\n\r\nexport function decodeStringFromDataView(dataview: DataView, byteOffset: number, strlen: number) {\r\n return textDecoder.decode(new DataView(dataview.buffer, dataview.byteOffset + byteOffset, strlen))\r\n}\r\n\r\ndeclare global {\r\n interface Buffer {\r\n /**\r\n * Node buffer's internals function. \\\r\n * For some reason it is not exposed through TypeScript. \\\r\n * Fastest way to write utf8 strings into buffers.\r\n */\r\n utf8Write(string: string, byteOffset?: number, byteLength?: number): number\r\n }\r\n}\r\n","import {\n decodeStringFromDataView,\n decodeStringFromNodeBuffer,\n encodeStringIntoDataView,\n encodeStringIntoNodeBuffer,\n growDataView,\n growNodeBuffer,\n hasNodeBuffers,\n type TrueArrayBuffer\n} from './buffers'\n\nexport const enum Field {\n /**\n * Defines a 1 byte (8 bits) unsigned integer field. \\\n * (Range: 0 - 255)\n */\n UNSIGNED_INT_8 = 0,\n\n /**\n * Defines a 2 bytes (16 bits) unsigned integer field. 
\\\n * (Range: 0 - 65535)\n */\n UNSIGNED_INT_16,\n\n /**\n * Defines a 4 bytes (32 bits) unsigned integer field. \\\n * (Range: 0 - 4294967295)\n */\n UNSIGNED_INT_32,\n\n /**\n * Defines a 1 byte (8 bits) signed integer field. \\\n * (Range: -128 - 127)\n */\n INT_8,\n\n /**\n * Defines a 2 bytes (16 bits) signed integer field. \\\n * (Range: -32768 - 32767)\n */\n INT_16,\n\n /**\n * Defines a 4 bytes (32 bits) signed integer field. \\\n * (Range: -2147483648 - 2147483647)\n */\n INT_32,\n\n /**\n * Defines a 4 bytes (32 bits) floating-point field. \\\n */\n FLOAT_32,\n\n /**\n * Defines a 8 bytes (64 bits) floating-point field. \\\n */\n FLOAT_64\n}\n\n/**\n * Defines a dynamically-sized array with elements of a certain type. \\\n * Dynamically-sized arrays are useful when a packet's field is an array of a non pre-defined length. \\\n * Although, this makes dynamically-sized arrays more memory expensive as the internal buffer needs to be grown accordingly.\n *\n * NOTE: If an array will ALWAYS have the same length, prefer using the `FieldFixedArray` type, for both better performance and memory efficiency. \\\n * NOTE: As of now, dynamic arrays can have at most 256 elements.\n */\nexport function FieldArray<T extends Field | BinaryPacket<Definition> | ''>(\n item: T\n): [itemType: T] {\n return [item]\n}\n\n/**\n * Defines a statically-sized array with elements of a certain type. \\\n * Fixed arrays are useful when a packet's field is an array of a pre-defined length. \\\n * Fixed arrays much more memory efficient and performant than non-fixed ones.\n *\n * NOTE: If an array will not always have the same length, use the `FieldArray` type.\n */\nexport function FieldFixedArray<\n T extends Field | BinaryPacket<Definition> | '',\n Length extends number\n>(item: T, length: Length): [itemType: T, length: Length] {\n if (length < 0 || !Number.isFinite(length)) {\n throw new RangeError('Length of a FixedArray must be a positive integer.')\n }\n\n return [item, length]\n}\n\n/**\n * Utility class that allows serializing arrays through any kind of iterable, as long as the number of elements is known beforehand. \\\n * Needed to skip the overhead of duplicating the data into an actual array just for it to be serialized straight away and trashed.\n */\nexport class SequentialSerializer<T> implements Iterable<T> {\n constructor(\n private readonly iterable: Iterable<T>,\n public readonly length: number\n ) {}\n\n [Symbol.iterator]() {\n return this.iterable[Symbol.iterator]()\n }\n}\n\ntype SequentiallySerializable<T, IsRead extends boolean> = IsRead extends true\n ? T[]\n : T[] | SequentialSerializer<T>\n\ntype BitFlags = (string[] | ReadonlyArray<string>) & {\n length: 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8\n}\n\n/**\n * Defines a sequence of up to 8 \"flags\" (basically single bits/booleans) that can be packed together into a single 8 bits value. \\\n * This is useful for minimizing bytes usage when there are lots of boolean fields/flags, instead of saving each flag separately as its own 8 bits value.\n *\n * The input should be an array of strings (with at most 8 elements) where each string defines the name of a flag. 
\\\n * This is just for definition purposes, then when actually writing or reading packets it'll just be a record-object with those names as keys and boolean values.\n */\nexport function FieldBitFlags<const FlagsArray extends BitFlags>(flags: FlagsArray) {\n if (flags.length > 8) {\n throw new Error(\n `Invalid BinaryPacket definition: a BitFlags field can have only up to 8 flags, given: ${flags.join(', ')}`\n )\n }\n\n return { flags }\n}\n\n/**\n * Defines a string field. \\\n * Strings cannot be more than 65536 characters long.\n *\n * NOTE: Only strings containing just ASCII and/or single-octet UTF-8 characters are supported.\n */\nexport function FieldString() {\n return '' as const\n}\n\n/**\n * Defines an optional BinaryPacket \"subpacket\" field. \\\n * When writing and reading packets it'll be possible to provide and receive `undefined` instead of a valid object.\n */\nexport function FieldOptional<T extends BinaryPacket<Definition>>(packet: T) {\n return { optional: packet }\n}\n\n/**\n * Do not manually construct this type: an object of this kind is returned by a BinaryPacket `createVisitor` method. \\\n * Used in the `BinaryPacket::visit` static method to perform a sort of \"pattern matching\" on an incoming packet (of yet unknown type) buffer.\n */\ntype Visitor = [BinaryPacket<Definition>, (packet: any) => void]\n\nexport class BinaryPacket<T extends Definition> {\n /**\n * Defines a new binary packet. \\\n * Make sure that every `packetId` is unique.\n * @throws RangeError If packetId is negative, floating-point, or greater than 255.\n */\n static define<T extends Definition>(packetId: number, definition?: T) {\n if (packetId < 0 || !Number.isFinite(packetId)) {\n throw new RangeError('Packet IDs must be positive integers.')\n }\n\n if (packetId > 255) {\n throw new RangeError(\n 'Packet IDs greater than 255 are not supported. Do you REALLY need more than 255 different kinds of packets?'\n )\n }\n\n return new BinaryPacket(packetId, definition)\n }\n\n /**\n * Reads just the packetId from the given Buffer. \\\n * This method practically just reads the uint8 at offset `byteOffset` (default: 0). \\\n * Useful if the receiving side receives multiple types of packets.\n */\n static readPacketIdNodeBuffer(buffer: Buffer, byteOffset = 0) {\n return buffer.readUint8(byteOffset)\n }\n\n /**\n * Reads just the packetId from the given DataView. \\\n * This method practically just reads the uint8 at offset `byteOffset` (default: 0). \\\n * Useful if the receiving side receives multiple types of packets.\n */\n static readPacketIdDataView(dataview: DataView, byteOffset = 0) {\n return dataview.getUint8(byteOffset)\n }\n\n /**\n * Reads just the packetId from the given ArrayBuffer. \\\n * This method practically just reads the uint8 at offset `byteOffset`. \\\n * Useful if the receiving side receives multiple types of packets.\n *\n * NOTE: Due to security issues, the `byteOffset` argument cannot be defaulted and must be provided by the user. \\\n * NOTE: For more information read the `readArrayBuffer` method documentation.\n */\n static readPacketIdArrayBuffer(arraybuffer: TrueArrayBuffer, byteOffset: number) {\n return new Uint8Array(arraybuffer, byteOffset, 1)[0]\n }\n\n /**\n * Visits and \"pattern matches\" the given Buffer through the given visitors. 
\\\n * The Buffer is compared to the series of visitors through its Packet ID, and, if an appropriate visitor is found: its callback is called.\n *\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n static visitNodeBuffer(buffer: Buffer, ...visitors: Visitor[]) {\n return BinaryPacket.visit(buffer, GET_FUNCTION_BUF, decodeStringFromNodeBuffer, visitors)\n }\n\n /**\n * Visits and \"pattern matches\" the given DataView through the given visitors. \\\n * The DataView is compared to the series of visitors through its Packet ID, and, if an appropriate visitor is found: its callback is called.\n *\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n static visitDataView(dataview: DataView, ...visitors: Visitor[]) {\n return BinaryPacket.visit(dataview, GET_FUNCTION, decodeStringFromDataView, visitors)\n }\n\n /**\n * Visits and \"pattern matches\" the given ArrayBuffer through the given visitors. \\\n * The ArrayBuffer is compared to the series of visitors through its Packet ID, and, if an appropriate visitor is found: its callback is called.\n *\n * NOTE: Due to security issues, the `byteOffset` and `byteLength` arguments must be provided by the user. \\\n * NOTE: For more information read the `readArrayBuffer` method documentation. \\\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n static visitArrayBuffer(\n arraybuffer: TrueArrayBuffer,\n byteOffset: number,\n byteLength: number,\n ...visitors: Visitor[]\n ) {\n return BinaryPacket.visit(\n new DataView(arraybuffer, byteOffset, byteLength),\n GET_FUNCTION,\n decodeStringFromDataView,\n visitors\n )\n }\n\n /**\n * Reads/deserializes from the given Buffer. \\\n * Method available ONLY on NodeJS and Bun.\n *\n * If possible, always prefer reading using this method, as it is much faster than the other ones.\n *\n * NOTE: if you have an ArrayBuffer do not bother wrapping it into a node Buffer yourself. \\\n * NOTE: if you have an ArrayBuffer use the appropriate `readArrayBuffer`.\n */\n readNodeBuffer(\n dataIn: Buffer,\n offsetPointer = { offset: 0 },\n byteLength = dataIn.byteLength\n ): ToJson<T, true> {\n return this.read(\n dataIn,\n offsetPointer,\n byteLength,\n GET_FUNCTION_BUF,\n decodeStringFromNodeBuffer\n )\n }\n\n /**\n * Reads/deserializes from the given DataView.\n *\n * NOTE: if you have an ArrayBuffer do not bother wrapping it into a DataView yourself. \\\n * NOTE: if you have an ArrayBuffer use the appropriate `readArrayBuffer`.\n */\n readDataView(\n dataIn: DataView,\n offsetPointer = { offset: 0 },\n byteLength = dataIn.byteLength\n ): ToJson<T, true> {\n return this.read(dataIn, offsetPointer, byteLength, GET_FUNCTION, decodeStringFromDataView)\n }\n\n /**\n * Reads/deserializes from the given ArrayBuffer. \\\n * WARNING: this method is practically a HACK.\n *\n * When using this method both the `byteOffset` and `byteLength` are REQUIRED and cannot be defaulted. \\\n * This is to prevent serious bugs and security issues. 
\\\n * That is because often raw ArrayBuffers come from a pre-allocated buffer pool and do not start at byteOffset 0.\n *\n * NOTE: if you have a node Buffer do not bother wrapping it into an ArrayBuffer yourself. \\\n * NOTE: if you have a node Buffer use the appropriate `readNodeBuffer` as it is much faster and less error prone.\n */\n readArrayBuffer(dataIn: TrueArrayBuffer, byteOffset: number, byteLength: number) {\n return this.read(\n hasNodeBuffers\n ? Buffer.from(dataIn, byteOffset, byteLength)\n : (new DataView(dataIn, byteOffset, byteLength) as any),\n { offset: 0 }, // The underlying buffer has already been offsetted\n byteLength,\n hasNodeBuffers ? GET_FUNCTION_BUF : GET_FUNCTION,\n hasNodeBuffers ? decodeStringFromNodeBuffer : (decodeStringFromDataView as any)\n )\n }\n\n /**\n * Writes/serializes the given object into a Buffer. \\\n * Method available ONLY on NodeJS and Bun.\n *\n * If possible, always prefer writing using this method, as it is much faster than the other ones.\n */\n writeNodeBuffer(dataOut: ToJson<T>) {\n const byteLength = this.precalculateBufferLengthWithStrings(dataOut)\n const buffer = Buffer.allocUnsafe(byteLength)\n\n return this.write(\n buffer,\n dataOut,\n { offset: 0 },\n byteLength,\n byteLength,\n SET_FUNCTION_BUF,\n growNodeBuffer,\n encodeStringIntoNodeBuffer\n )\n }\n\n /**\n * Writes/serializes the given object into a DataView. \\\n */\n writeDataView(dataOut: ToJson<T>) {\n const byteLength = this.precalculateBufferLengthWithStrings(dataOut)\n const dataview = new DataView(new ArrayBuffer(byteLength))\n\n return this.write(\n dataview,\n dataOut,\n { offset: 0 },\n byteLength,\n byteLength,\n SET_FUNCTION,\n growDataView,\n encodeStringIntoDataView\n )\n }\n\n /**\n * Writes/serializes the given object into an ArrayBuffer. \\\n * This method is just a wrapper around either `writeNodeBuffer` or `writeDataView`. \\\n *\n * This method works with JavaScript standard raw ArrayBuffer(s) and, as such, is very error prone: \\\n * Make sure you're using the returned byteLength and byteOffset fields in the read counterpart. \\\n *\n * Always consider whether is possible to use directly `writeNodeBuffer` or `writeDataView` instead of `writeArrayBuffer`. \\\n * For more information read the `readArrayBuffer` documentation.\n */\n writeArrayBuffer(dataOut: ToJson<T>) {\n const buf = hasNodeBuffers ? this.writeNodeBuffer(dataOut) : this.writeDataView(dataOut)\n return { buffer: buf.buffer, byteLength: buf.byteLength, byteOffset: buf.byteOffset }\n }\n\n /**\n * Creates a \"visitor\" object for this BinaryPacket definition. \\\n * Used when visiting and \"pattern matching\" buffers with the `BinaryPacket::visit` static utility methods. \\\n *\n * For more information read the `BinaryPacket::visitNodeBuffer` documentation. 
\\\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n visitor(onVisit: (packet: ToJson<T>) => void): Visitor {\n return [this, onVisit]\n }\n\n sequentialSerializer(numElements: number, dataOut: Iterable<ToJson<T>>) {\n const byteLength = this.minimumByteLength * numElements\n const buffer = Buffer.allocUnsafe(byteLength)\n const offsetPointer = { offset: 0 }\n\n for (const element of dataOut) {\n this.write(\n buffer,\n element,\n offsetPointer,\n byteLength,\n byteLength,\n SET_FUNCTION_BUF,\n growNodeBuffer,\n encodeStringIntoNodeBuffer\n )\n }\n }\n\n /// PRIVATE\n\n private readonly entries: Entries\n readonly stringPositions: StringPositions\n readonly minimumByteLength: number\n\n private constructor(\n private readonly packetId: number,\n definition?: T\n ) {\n this.entries = definition ? sortEntries(definition) : []\n const inspection = inspectEntries(this.entries)\n this.minimumByteLength = inspection.minimumByteLength\n this.stringPositions = inspection.stringPositions\n }\n\n private static visit<Buf extends DataView | Buffer>(\n dataIn: Buf,\n readFunctions: typeof GET_FUNCTION | typeof GET_FUNCTION_BUF,\n decodeStringFunction: (dataIn: Buf, byteOffset: number, strlen: number) => string,\n visitors: Visitor[]\n ) {\n for (const [Packet, onVisit] of visitors) {\n if (Packet.packetId === readFunctions[Field.UNSIGNED_INT_8](dataIn as any, 0)) {\n return onVisit(\n Packet.read(dataIn, { offset: 0 }, dataIn.byteLength, readFunctions, decodeStringFunction)\n )\n }\n }\n }\n\n private read<Buf extends DataView | Buffer>(\n dataIn: Buf,\n offsetPointer: { offset: number },\n byteLength: number,\n readFunctions: typeof GET_FUNCTION | typeof GET_FUNCTION_BUF,\n decodeStringFunction: (dataIn: Buf, byteOffset: number, strlen: number) => string\n ): ToJson<T, true> {\n if (byteLength + offsetPointer.offset < this.minimumByteLength) {\n throw new Error(\n `There is no space available to fit a packet of type ${this.packetId} at offset ${offsetPointer.offset}`\n )\n }\n\n if (\n readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset) !== this.packetId\n ) {\n throw new Error(\n `Data at offset ${offsetPointer.offset} is not a packet of type ${this.packetId}`\n )\n }\n\n offsetPointer.offset += 1\n const result: any = {}\n\n for (const [name, def] of this.entries) {\n if (Array.isArray(def)) {\n const length =\n // def[1] is the length of a statically-sized array, if undefined: must read the length from the buffer as it means it's a dynamically-sized array\n def[1] ?? 
readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset++)\n\n const array = Array(length)\n\n const itemType = def[0]\n\n if (typeof itemType === 'object') {\n // Array of \"subpackets\"\n for (let i = 0; i < length; ++i) {\n array[i] = itemType.read(\n dataIn,\n offsetPointer,\n byteLength,\n readFunctions,\n decodeStringFunction\n )\n }\n } else if (itemType === '') {\n // Array of strings\n for (let i = 0; i < length; ++i) {\n const strlen = readFunctions[Field.UNSIGNED_INT_16](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += 2\n\n array[i] = decodeStringFunction(dataIn, offsetPointer.offset, strlen)\n offsetPointer.offset += strlen\n }\n } else {\n // Array of primitives (numbers)\n const itemSize = BYTE_SIZE[itemType]\n\n // It seems like looping over each element is actually much faster than using TypedArrays bulk copy.\n // TODO: properly benchmark with various array sizes to see if it's actually the case.\n for (let i = 0; i < length; ++i) {\n array[i] = readFunctions[itemType](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += itemSize\n }\n }\n\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = array\n } else if (typeof def === 'number') {\n // Single primitive (number)\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = readFunctions[def](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += BYTE_SIZE[def]\n } else if (def === '') {\n const strlen = readFunctions[Field.UNSIGNED_INT_16](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += 2\n\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = decodeStringFunction(dataIn, offsetPointer.offset, strlen)\n offsetPointer.offset += strlen\n } else if ('flags' in def) {\n // BitFlags\n const flags = readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += 1\n\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = {}\n\n for (let bit = 0; bit < def.flags.length; ++bit) {\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name][def.flags[bit]] = !!(flags & (1 << bit))\n }\n } else if ('optional' in def) {\n // Single optional \"subpacket\"\n const hasSubPacket =\n readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset) !== 0\n\n offsetPointer.offset += 1\n\n if (hasSubPacket) {\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = def.optional.read(\n dataIn,\n offsetPointer,\n byteLength,\n readFunctions,\n decodeStringFunction\n )\n }\n } else {\n // Single \"subpacket\"\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = def.read(\n dataIn,\n offsetPointer,\n byteLength,\n readFunctions,\n decodeStringFunction\n )\n }\n }\n\n return result as ToJson<T, true>\n }\n\n private write<Buf extends DataView | Buffer>(\n buffer: Buf,\n dataOut: ToJson<T>,\n offsetPointer: { offset: number },\n byteLength: number,\n maxByteLength: number,\n writeFunctions: typeof SET_FUNCTION | typeof SET_FUNCTION_BUF,\n growBufferFunction: (buffer: Buf, newByteLength: number) => Buf,\n encodeStringFunction: (buffer: Buf, byteOffset: number, string: string) => void\n ): Buf {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, this.packetId, offsetPointer.offset)\n offsetPointer.offset += 1\n\n for (const [name, def] of this.entries) {\n const data = dataOut[name]\n\n if 
(Array.isArray(def)) {\n // Could be both an array of just numbers or \"subpackets\"\n\n const length = (data as SequentiallySerializable<any, false>).length\n\n // Check if it is a dynamically-sized array, if it is, the length of the array must be serialized in the buffer before its elements\n // Explicitly check for undefined and not falsy values because it could be a statically-sized array of 0 elements.\n const isDynamicArray = def[1] === undefined\n\n if (isDynamicArray) {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, length, offsetPointer.offset)\n offsetPointer.offset += 1\n }\n\n if (length > 0) {\n const itemType = def[0]\n\n if (typeof itemType === 'object') {\n // Array of \"subpackets\"\n\n if (isDynamicArray) {\n const neededBytesForElements = length * itemType.minimumByteLength\n\n byteLength += neededBytesForElements\n maxByteLength += neededBytesForElements\n\n if (buffer.byteLength < maxByteLength) {\n buffer = growBufferFunction(buffer, maxByteLength)\n }\n }\n\n for (const object of data as unknown as ToJson<Definition>[]) {\n // Array of \"subpackets\"\n buffer = itemType.write(\n buffer,\n object,\n offsetPointer,\n byteLength,\n maxByteLength,\n writeFunctions,\n growBufferFunction,\n encodeStringFunction\n )\n\n byteLength = offsetPointer.offset\n maxByteLength = buffer.byteLength\n }\n } else if (itemType === '') {\n // Array of strings\n for (let i = 0; i < length; ++i) {\n const str = (data as unknown as string[])[i]\n const strlen = str.length\n\n writeFunctions[Field.UNSIGNED_INT_16](buffer as any, strlen, offsetPointer.offset)\n offsetPointer.offset += 2\n\n encodeStringFunction(buffer, offsetPointer.offset, str)\n offsetPointer.offset += strlen\n }\n } else {\n // Array of primitives (numbers)\n const itemSize = BYTE_SIZE[itemType]\n\n if (isDynamicArray) {\n const neededBytesForElements = length * itemSize\n\n byteLength += neededBytesForElements\n maxByteLength += neededBytesForElements\n\n if (buffer.byteLength < maxByteLength) {\n buffer = growBufferFunction(buffer, maxByteLength)\n }\n }\n\n // It seems like looping over each element is actually much faster than using TypedArrays bulk copy.\n // TODO: properly benchmark with various array sizes to see if it's actually the case.\n for (const number of data as SequentiallySerializable<number, false>) {\n writeFunctions[itemType](buffer as any, number, offsetPointer.offset)\n offsetPointer.offset += itemSize\n }\n }\n }\n } else if (typeof def === 'number') {\n // Single primitive (number)\n writeFunctions[def](buffer as any, data as number, offsetPointer.offset)\n offsetPointer.offset += BYTE_SIZE[def]\n } else if (def === '') {\n // String\n const strlen = (data as string).length\n\n writeFunctions[Field.UNSIGNED_INT_16](buffer as any, strlen, offsetPointer.offset)\n offsetPointer.offset += 2\n\n encodeStringFunction(buffer, offsetPointer.offset, data as string)\n offsetPointer.offset += strlen\n } else if ('flags' in def) {\n // BitFlags\n let flags = 0\n\n for (let bit = 0; bit < def.flags.length; ++bit) {\n if ((data as Record<string, boolean>)[def.flags[bit]]) {\n flags |= 1 << bit\n }\n }\n\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, flags, offsetPointer.offset)\n offsetPointer.offset += 1\n } else if ('optional' in def) {\n if (data) {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, 1, offsetPointer.offset)\n offsetPointer.offset += 1\n\n byteLength += def.optional.minimumByteLength\n maxByteLength += def.optional.minimumByteLength\n\n if (buffer.byteLength < maxByteLength) 
{\n buffer = growBufferFunction(buffer, maxByteLength)\n }\n\n buffer = def.optional.write(\n buffer,\n data as ToJson<Definition>,\n offsetPointer,\n byteLength,\n maxByteLength,\n writeFunctions,\n growBufferFunction,\n encodeStringFunction\n )\n\n byteLength = offsetPointer.offset\n maxByteLength = buffer.byteLength\n } else {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, 0, offsetPointer.offset)\n offsetPointer.offset += 1\n }\n } else {\n // Single \"subpacket\"\n buffer = def.write(\n buffer,\n data as ToJson<Definition>,\n offsetPointer,\n byteLength,\n maxByteLength,\n writeFunctions,\n growBufferFunction,\n encodeStringFunction\n )\n\n byteLength = offsetPointer.offset\n maxByteLength = buffer.byteLength\n }\n }\n\n return buffer\n }\n\n private precalculateBufferLengthWithStrings(dataOut: ToJson<T>) {\n let len = this.minimumByteLength\n\n for (const field of this.stringPositions[0]) {\n // String field\n len += (dataOut[field] as string).length\n }\n\n for (const field of this.stringPositions[1]) {\n // Array of strings field\n for (const string of dataOut[field] as unknown as string[]) {\n len += 2 + string.length\n }\n }\n\n for (const field in this.stringPositions[2]) {\n // Subpacket that has some string fields\n len += this.stringPositions[2][field].precalculateBufferLengthWithStrings(\n dataOut[field] as any\n )\n }\n\n return len\n }\n}\n\n/**\n * BinaryPacket definition: \\\n * Any packet can be defined through a \"schema\" object explaining its fields names and types.\n *\n * @example\n * // Imagine we have a game board where each cell is a square and is one unit big.\n * // A cell can be then defined by its X and Y coordinates.\n * // For simplicity, let's say there cannot be more than 256 cells, so we can use 8 bits for each coordinate.\n * const Cell = {\n * x: Field.UNSIGNED_INT_8,\n * y: Field.UNSIGNED_INT_8\n * }\n *\n * // When done with the cell definition we can create its BinaryPacket writer/reader.\n * // NOTE: each BinaryPacket needs an unique ID, for identification purposes and error checking.\n * const CellPacket = BinaryPacket.define(0, Cell)\n *\n * // Let's now make the definition of the whole game board.\n * // You can also specify arrays of both \"primitive\" fields and other BinaryPackets.\n * const Board = {\n * numPlayers: Field.UNSIGNED_INT_8,\n * cells: FieldArray(CellPacket)\n * }\n *\n * // When done with the board definition we can create its BinaryPacket writer/reader.\n * // NOTE: each BinaryPacket needs an unique ID, for identification purposes and error checking.\n * const BoardPacket = BinaryPacket.define(1, Board)\n *\n * // And use it.\n * const buffer = BoardPacket.writeNodeBuffer({\n * numPlayers: 1,\n * cells: [\n * { x: 0, y: 0 },\n * { x: 1, y: 1 }\n * ]\n * })\n *\n * // sendTheBufferOver(buffer)\n * // ...\n * // const buffer = receiveTheBuffer()\n * const board = BoardPacket.readNodeBuffer(buffer)\n * // ...\n */\nexport type Definition = {\n [fieldName: string]:\n | MaybeArray<Field>\n | MaybeArray<BinaryPacket<Definition>>\n | MaybeArray<''>\n | { flags: BitFlags }\n | { optional: BinaryPacket<Definition> }\n}\n\ntype MaybeArray<T> = T | [itemType: T] | [itemType: T, length: number]\n\ntype BitFlagsToJson<FlagsArray extends BitFlags> = {\n [key in FlagsArray[number]]: boolean\n}\n\n/**\n * Meta-type that converts a `Definition` schema to the type of the actual JavaScript object that will be written into a packet or read from. 
\\\n */\nexport type ToJson<T extends Definition, IsRead extends boolean = false> = {\n [K in keyof T]: T[K] extends [infer Item]\n ? Item extends BinaryPacket<infer BPDef>\n ? SequentiallySerializable<ToJson<BPDef, IsRead>, IsRead>\n : Item extends ''\n ? SequentiallySerializable<string, IsRead>\n : SequentiallySerializable<number, IsRead>\n : T[K] extends [infer Item, infer Length]\n ? Item extends BinaryPacket<infer BPDef>\n ? SequentiallySerializable<ToJson<BPDef, IsRead>, IsRead> & { length: Length }\n : Item extends ''\n ? string[] & { length: Length }\n : number[] & { length: Length }\n : T[K] extends BinaryPacket<infer BPDef>\n ? ToJson<BPDef, IsRead>\n : T[K] extends { flags: infer FlagsArray extends BitFlags }\n ? BitFlagsToJson<FlagsArray>\n : T[K] extends { optional: BinaryPacket<infer BPDef extends Definition> }\n ? ToJson<BPDef, IsRead> | undefined\n : T[K] extends ''\n ? string\n : number\n}\n\n/**\n * In a JavaScript object, the order of its keys is not strictly defined: sort them by field name. \\\n * Thus, we cannot trust iterating over an object keys: we MUST iterate over its entries array. \\\n * This is important to make sure that whoever shares BinaryPacket definitions can correctly write/read packets independently of their JS engines.\n */\nfunction sortEntries(definition: Definition) {\n return Object.entries(definition).sort(([fieldName1], [fieldName2]) =>\n fieldName1.localeCompare(fieldName2)\n )\n}\n\ntype Entries = ReturnType<typeof sortEntries>\n\ntype StringPositions = [\n string[],\n string[],\n {\n [field: string]: BinaryPacket<Definition>\n }\n]\n\n/**\n * Helper function that \"inspects\" the entries of a BinaryPacket definition\n * and returns useful \"stats\" needed for writing and reading buffers.\n *\n * This function is ever called only once per BinaryPacket definition.\n */\nfunction inspectEntries(entries: Entries) {\n // The PacketID is already 1 byte, that's why we aren't starting from 0.\n let minimumByteLength = 1\n\n const stringPositions: StringPositions = [[], [], {}]\n\n for (const [name, type] of entries) {\n if (Array.isArray(type)) {\n if (type.length === 2) {\n // Statically-sized array\n const isString = type[0] === ''\n\n const itemSize =\n typeof type[0] === 'object'\n ? type[0].minimumByteLength\n : isString\n ? 
2\n : BYTE_SIZE[type[0]]\n\n minimumByteLength += type[1] * itemSize\n\n if (isString) {\n stringPositions[1].push(name)\n }\n } else {\n // Dynamically-sized array\n // Adding 1 byte to serialize the array length\n minimumByteLength += 1\n\n if (type[0] === '') {\n stringPositions[1].push(name)\n }\n }\n } else if (type instanceof BinaryPacket) {\n minimumByteLength += type.minimumByteLength\n stringPositions[2][name] = type\n } else if (typeof type === 'object') {\n // BitFlags & Optionals\n // BitFlags are always 1 byte long, because they can hold up to 8 booleans\n // Optionals minimum is 1 byte long, because it holds whether the subpacket is present or not\n minimumByteLength += 1\n } else if (type === '') {\n // String\n // Adding 2 to serialize the string length\n minimumByteLength += 2\n stringPositions[0].push(name)\n } else {\n minimumByteLength += BYTE_SIZE[type]\n }\n }\n\n return { minimumByteLength, stringPositions }\n}\n\n//////////////////////////////////////////////\n// The logic here is practically over //\n// Here below there are needed constants //\n// that map a field-type to a functionality //\n//////////////////////////////////////////////\n\nconst BYTE_SIZE = Array(8) as number[]\n\nBYTE_SIZE[Field.UNSIGNED_INT_8] = 1\nBYTE_SIZE[Field.INT_8] = 1\n\nBYTE_SIZE[Field.UNSIGNED_INT_16] = 2\nBYTE_SIZE[Field.INT_16] = 2\n\nBYTE_SIZE[Field.UNSIGNED_INT_32] = 4\nBYTE_SIZE[Field.INT_32] = 4\nBYTE_SIZE[Field.FLOAT_32] = 4\n\nBYTE_SIZE[Field.FLOAT_64] = 8\n\nconst GET_FUNCTION = Array(8) as ((view: DataView, offset: number) => number)[]\n\nGET_FUNCTION[Field.UNSIGNED_INT_8] = (view, offset) => view.getUint8(offset)\nGET_FUNCTION[Field.INT_8] = (view, offset) => view.getInt8(offset)\n\nGET_FUNCTION[Field.UNSIGNED_INT_16] = (view, offset) => view.getUint16(offset)\nGET_FUNCTION[Field.INT_16] = (view, offset) => view.getInt16(offset)\n\nGET_FUNCTION[Field.UNSIGNED_INT_32] = (view, offset) => view.getUint32(offset)\nGET_FUNCTION[Field.INT_32] = (view, offset) => view.getInt32(offset)\nGET_FUNCTION[Field.FLOAT_32] = (view, offset) => view.getFloat32(offset)\n\nGET_FUNCTION[Field.FLOAT_64] = (view, offset) => view.getFloat64(offset)\n\nconst SET_FUNCTION = Array(8) as ((view: DataView, value: number, offset: number) => void)[]\n\nSET_FUNCTION[Field.UNSIGNED_INT_8] = (view, value, offset) => view.setUint8(offset, value)\nSET_FUNCTION[Field.INT_8] = (view, value, offset) => view.setInt8(offset, value)\n\nSET_FUNCTION[Field.UNSIGNED_INT_16] = (view, value, offset) => view.setUint16(offset, value)\nSET_FUNCTION[Field.INT_16] = (view, value, offset) => view.setInt16(offset, value)\n\nSET_FUNCTION[Field.UNSIGNED_INT_32] = (view, value, offset) => view.setUint32(offset, value)\nSET_FUNCTION[Field.INT_32] = (view, value, offset) => view.setInt32(offset, value)\nSET_FUNCTION[Field.FLOAT_32] = (view, value, offset) => view.setFloat32(offset, value)\n\nSET_FUNCTION[Field.FLOAT_64] = (view, value, offset) => view.setFloat64(offset, value)\n\nconst SET_FUNCTION_BUF = Array(8) as ((nodeBuffer: Buffer, value: number, offset: number) => void)[]\n\nif (hasNodeBuffers) {\n SET_FUNCTION_BUF[Field.UNSIGNED_INT_8] = (view, value, offset) => view.writeUint8(value, offset)\n SET_FUNCTION_BUF[Field.INT_8] = (view, value, offset) => view.writeInt8(value, offset)\n\n SET_FUNCTION_BUF[Field.UNSIGNED_INT_16] = (view, value, offset) =>\n view.writeUint16BE(value, offset)\n SET_FUNCTION_BUF[Field.INT_16] = (view, value, offset) => view.writeInt16BE(value, offset)\n\n SET_FUNCTION_BUF[Field.UNSIGNED_INT_32] = 
(view, value, offset) =>\n view.writeUint32BE(value, offset)\n SET_FUNCTION_BUF[Field.INT_32] = (view, value, offset) => view.writeInt32BE(value, offset)\n SET_FUNCTION_BUF[Field.FLOAT_32] = (view, value, offset) => view.writeFloatBE(value, offset)\n\n SET_FUNCTION_BUF[Field.FLOAT_64] = (view, value, offset) => view.writeDoubleBE(value, offset)\n}\n\nconst GET_FUNCTION_BUF = Array(8) as ((nodeBuffer: Buffer, offset: number) => number)[]\n\nif (hasNodeBuffers) {\n GET_FUNCTION_BUF[Field.UNSIGNED_INT_8] = (view, offset) => view.readUint8(offset)\n GET_FUNCTION_BUF[Field.INT_8] = (view, offset) => view.readInt8(offset)\n\n GET_FUNCTION_BUF[Field.UNSIGNED_INT_16] = (view, offset) => view.readUint16BE(offset)\n\n GET_FUNCTION_BUF[Field.INT_16] = (view, offset) => view.readInt16BE(offset)\n GET_FUNCTION_BUF[Field.UNSIGNED_INT_32] = (view, offset) => view.readUint32BE(offset)\n\n GET_FUNCTION_BUF[Field.INT_32] = (view, offset) => view.readInt32BE(offset)\n GET_FUNCTION_BUF[Field.FLOAT_32] = (view, offset) => view.readFloatBE(offset)\n GET_FUNCTION_BUF[Field.FLOAT_64] = (view, offset) => view.readDoubleBE(offset)\n}\n"],"mappings":"AAMO,IAAMA,EAAiB,OAAO,QAAW,WAEzC,SAASC,EAAaC,EAAoBC,EAAuB,CACtE,IAAMC,EAAgB,IAAI,YAAYD,CAAa,EAC7CE,EAAe,KAAK,IAAIH,EAAS,WAAYE,EAAc,UAAU,EAGvEE,EAAS,KAAK,MAAMD,EAAe,CAAC,EACxC,IAAI,aAAaD,EAAe,EAAGE,CAAM,EAAE,IAAI,IAAI,aAAaJ,EAAS,OAAQ,EAAGI,CAAM,CAAC,EAG3F,IAAMC,EAASD,EAAS,EACxB,OAAAA,EAASD,EAAeE,EACxB,IAAI,WAAWH,EAAeG,EAAQD,CAAM,EAAE,IAAI,IAAI,WAAWJ,EAAS,OAAQK,EAAQD,CAAM,CAAC,EAE1F,IAAI,SAASF,CAAa,CACnC,CAEO,SAASI,EAAeC,EAAgBN,EAAuB,CACpE,IAAMO,EAAY,OAAO,YAAYP,CAAa,EAClD,OAAAM,EAAO,KAAKC,CAAS,EACdA,CACT,CAEA,IAAMC,EAAc,IAAI,YAClBC,EAAc,IAAI,YAEjB,SAASC,EAAyBX,EAAoBY,EAAoBC,EAAgB,CAC/F,IAAMC,EAASD,EAAO,OAChBE,EAAW,IAAI,WAAWf,EAAS,OAAQA,EAAS,WAAaY,EAAYE,CAAM,EAErFA,GAAU,GACZE,EAAkBD,EAAU,EAAGF,EAAQC,CAAM,EAE7CL,EAAY,WAAWI,EAAQE,CAAQ,CAE3C,CAEO,SAASE,EAA2BV,EAAgBK,EAAoBC,EAAgB,CAC7F,IAAMC,EAASD,EAAO,OAElBC,GAAU,GACZE,EAAkBT,EAAQK,EAAYC,EAAQC,CAAM,EAEpDP,EAAO,UAAUM,EAAQD,EAAYE,CAAM,CAE/C,CAEA,SAASE,EAAkBT,EAAoBK,EAAoBC,EAAgBC,EAAgB,CACjG,QAASI,EAAI,EAAGA,EAAIJ,EAAQ,EAAEI,EAC5BX,EAAOK,EAAaM,CAAC,EAAIL,EAAO,WAAWK,CAAC,EAAI,GAEpD,CAEO,SAASC,EAA2BZ,EAAgBK,EAAoBE,EAAgB,CAC7F,OAAOP,EAAO,SAASK,EAAYA,EAAaE,CAAM,EAAE,SAAS,MAAM,CACzE,CAEO,SAASM,EAAyBpB,EAAoBY,EAAoBE,EAAgB,CAC/F,OAAOJ,EAAY,OAAO,IAAI,SAASV,EAAS,OAAQA,EAAS,WAAaY,EAAYE,CAAM,CAAC,CACnG,CCvDO,IAAWO,OAKhBA,IAAA,eAAiB,GAAjB,iBAMAA,IAAA,qCAMAA,IAAA,qCAMAA,IAAA,iBAMAA,IAAA,mBAMAA,IAAA,mBAKAA,IAAA,uBAKAA,IAAA,uBA7CgBA,OAAA,IAwDX,SAASC,EACdC,EACe,CACf,MAAO,CAACA,CAAI,CACd,CASO,SAASC,EAGdD,EAASE,EAA+C,CACxD,GAAIA,EAAS,GAAK,CAAC,OAAO,SAASA,CAAM,EACvC,MAAM,IAAI,WAAW,oDAAoD,EAG3E,MAAO,CAACF,EAAME,CAAM,CACtB,CAMO,IAAMC,EAAN,KAAqD,CAC1D,YACmBC,EACDF,EAChB,CAFiB,cAAAE,EACD,YAAAF,CACf,CAEH,CAAC,OAAO,QAAQ,GAAI,CAClB,OAAO,KAAK,SAAS,OAAO,QAAQ,EAAE,CACxC,CACF,EAiBO,SAASG,EAAiDC,EAAmB,CAClF,GAAIA,EAAM,OAAS,EACjB,MAAM,IAAI,MACR,yFAAyFA,EAAM,KAAK,IAAI,CAAC,EAC3G,EAGF,MAAO,CAAE,MAAAA,CAAM,CACjB,CAQO,SAASC,GAAc,CAC5B,MAAO,EACT,CAMO,SAASC,EAAkDC,EAAW,CAC3E,MAAO,CAAE,SAAUA,CAAO,CAC5B,CAQO,IAAMC,EAAN,MAAMC,CAAmC,CAoPtC,YACWC,EACjBC,EACA,CAFiB,cAAAD,EAGjB,KAAK,QAAUC,EAAaC,EAAYD,CAAU,EAAI,CAAC,EACvD,IAAME,EAAaC,EAAe,KAAK,OAAO,EAC9C,KAAK,kBAAoBD,EAAW,kBACpC,KAAK,gBAAkBA,EAAW,eACpC,CAtPA,OAAO,OAA6BH,EAAkBC,EAAgB,CACpE,GAAID,EAAW,GAAK,CAAC,OAAO,SAASA,CAAQ,EAC3C,MAAM,IAAI,WAAW,uCAAuC,EAG9D,GAAIA,EAAW,IACb,MAAM,IAAI,WACR,6GACF,EAGF,OAAO,IAAID,EAAaC,EAAUC,CAAU,CAC9C,CAOA,OAAO,uBAAuBI,EAAgBC,EAAa,EAAG,CAC5D,OAAOD,EAAO,UAAUC,CAAU,CACpC,CAOA,OAAO,qBAAqBC,EAAoBD
,EAAa,EAAG,CAC9D,OAAOC,EAAS,SAASD,CAAU,CACrC,CAUA,OAAO,wBAAwBE,EAA8BF,EAAoB,CAC/E,OAAO,IAAI,WAAWE,EAAaF,EAAY,CAAC,EAAE,CAAC,CACrD,CAQA,OAAO,gBAAgBD,KAAmBI,EAAqB,CAC7D,OAAOV,EAAa,MAAMM,EAAQK,EAAkBC,EAA4BF,CAAQ,CAC1F,CAQA,OAAO,cAAcF,KAAuBE,EAAqB,CAC/D,OAAOV,EAAa,MAAMQ,EAAUK,EAAcC,EAA0BJ,CAAQ,CACtF,CAUA,OAAO,iBACLD,EACAF,EACAQ,KACGL,EACH,CACA,OAAOV,EAAa,MAClB,IAAI,SAASS,EAAaF,EAAYQ,CAAU,EAChDF,EACAC,EACAJ,CACF,CACF,CAWA,eACEM,EACAC,EAAgB,CAAE,OAAQ,CAAE,EAC5BF,EAAaC,EAAO,WACH,CACjB,OAAO,KAAK,KACVA,EACAC,EACAF,EACAJ,EACAC,CACF,CACF,CAQA,aACEI,EACAC,EAAgB,CAAE,OAAQ,CAAE,EAC5BF,EAAaC,EAAO,WACH,CACjB,OAAO,KAAK,KAAKA,EAAQC,EAAeF,EAAYF,EAAcC,CAAwB,CAC5F,CAaA,gBAAgBE,EAAyBT,EAAoBQ,EAAoB,CAC/E,OAAO,KAAK,KACVG,EACI,OAAO,KAAKF,EAAQT,EAAYQ,CAAU,EACzC,IAAI,SAASC,EAAQT,EAAYQ,CAAU,EAChD,CAAE,OAAQ,CAAE,EACZA,EACAG,EAAiBP,EAAmBE,EACpCK,EAAiBN,EAA8BE,CACjD,CACF,CAQA,gBAAgBK,EAAoB,CAClC,IAAMJ,EAAa,KAAK,oCAAoCI,CAAO,EAC7Db,EAAS,OAAO,YAAYS,CAAU,EAE5C,OAAO,KAAK,MACVT,EACAa,EACA,CAAE,OAAQ,CAAE,EACZJ,EACAA,EACAK,EACAC,EACAC,CACF,CACF,CAKA,cAAcH,EAAoB,CAChC,IAAMJ,EAAa,KAAK,oCAAoCI,CAAO,EAC7DX,EAAW,IAAI,SAAS,IAAI,YAAYO,CAAU,CAAC,EAEzD,OAAO,KAAK,MACVP,EACAW,EACA,CAAE,OAAQ,CAAE,EACZJ,EACAA,EACAQ,EACAC,EACAC,CACF,CACF,CAYA,iBAAiBN,EAAoB,CACnC,IAAMO,EAAMR,EAAiB,KAAK,gBAAgBC,CAAO,EAAI,KAAK,cAAcA,CAAO,EACvF,MAAO,CAAE,OAAQO,EAAI,OAAQ,WAAYA,EAAI,WAAY,WAAYA,EAAI,UAAW,CACtF,CASA,QAAQC,EAA+C,CACrD,MAAO,CAAC,KAAMA,CAAO,CACvB,CAEA,qBAAqBC,EAAqBT,EAA8B,CACtE,IAAMJ,EAAa,KAAK,kBAAoBa,EACtCtB,EAAS,OAAO,YAAYS,CAAU,EACtCE,EAAgB,CAAE,OAAQ,CAAE,EAElC,QAAWY,KAAWV,EACpB,KAAK,MACHb,EACAuB,EACAZ,EACAF,EACAA,EACAK,EACAC,EACAC,CACF,CAEJ,CAIiB,QACR,gBACA,kBAYT,OAAe,MACbN,EACAc,EACAC,EACArB,EACA,CACA,OAAW,CAACsB,EAAQL,CAAO,IAAKjB,EAC9B,GAAIsB,EAAO,WAAaF,EAAc,CAAoB,EAAEd,EAAe,CAAC,EAC1E,OAAOW,EACLK,EAAO,KAAKhB,EAAQ,CAAE,OAAQ,CAAE,EAAGA,EAAO,WAAYc,EAAeC,CAAoB,CAC3F,CAGN,CAEQ,KACNf,EACAC,EACAF,EACAe,EACAC,EACiB,CACjB,GAAIhB,EAAaE,EAAc,OAAS,KAAK,kBAC3C,MAAM,IAAI,MACR,uDAAuD,KAAK,QAAQ,cAAcA,EAAc,MAAM,EACxG,EAGF,GACEa,EAAc,CAAoB,EAAEd,EAAeC,EAAc,MAAM,IAAM,KAAK,SAElF,MAAM,IAAI,MACR,kBAAkBA,EAAc,MAAM,4BAA4B,KAAK,QAAQ,EACjF,EAGFA,EAAc,QAAU,EACxB,IAAMgB,EAAc,CAAC,EAErB,OAAW,CAACC,EAAMC,CAAG,IAAK,KAAK,QAC7B,GAAI,MAAM,QAAQA,CAAG,EAAG,CACtB,IAAM5C,EAEJ4C,EAAI,CAAC,GAAKL,EAAc,CAAoB,EAAEd,EAAeC,EAAc,QAAQ,EAE/EmB,EAAQ,MAAM7C,CAAM,EAEpB8C,EAAWF,EAAI,CAAC,EAEtB,GAAI,OAAOE,GAAa,SAEtB,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAC5BF,EAAME,CAAC,EAAID,EAAS,KAClBrB,EACAC,EACAF,EACAe,EACAC,CACF,UAEOM,IAAa,GAEtB,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAAG,CAC/B,IAAMC,EAAST,EAAc,CAAqB,EAAEd,EAAeC,EAAc,MAAM,EACvFA,EAAc,QAAU,EAExBmB,EAAME,CAAC,EAAIP,EAAqBf,EAAQC,EAAc,OAAQsB,CAAM,EACpEtB,EAAc,QAAUsB,CAC1B,KACK,CAEL,IAAMC,EAAWC,EAAUJ,CAAQ,EAInC,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAC5BF,EAAME,CAAC,EAAIR,EAAcO,CAAQ,EAAErB,EAAeC,EAAc,MAAM,EACtEA,EAAc,QAAUuB,CAE5B,CAGAP,EAAOC,CAAI,EAAIE,CACjB,SAAW,OAAOD,GAAQ,SAGxBF,EAAOC,CAAI,EAAIJ,EAAcK,CAAG,EAAEnB,EAAeC,EAAc,MAAM,EACrEA,EAAc,QAAUwB,EAAUN,CAAG,UAC5BA,IAAQ,GAAI,CACrB,IAAMI,EAAST,EAAc,CAAqB,EAAEd,EAAeC,EAAc,MAAM,EACvFA,EAAc,QAAU,EAGxBgB,EAAOC,CAAI,EAAIH,EAAqBf,EAAQC,EAAc,OAAQsB,CAAM,EACxEtB,EAAc,QAAUsB,CAC1B,SAAW,UAAWJ,EAAK,CAEzB,IAAMxC,EAAQmC,EAAc,CAAoB,EAAEd,EAAeC,EAAc,MAAM,EACrFA,EAAc,QAAU,EAGxBgB,EAAOC,CAAI,EAAI,CAAC,EAEhB,QAASQ,EAAM,EAAGA,EAAMP,EAAI,MAAM,OAAQ,EAAEO,EAE1CT,EAAOC,CAAI,EAAEC,EAAI,MAAMO,CAAG,CAAC,EAAI,CAAC,EAAE/C,EAAS,GAAK+C,EAEpD,SAAW,aAAcP,EAAK,CAE5B,IAAMQ,EACJb,EAAc,CAAoB,EAAEd,EAAeC,EAAc,MAAM,IAAM,EAE/EA,EAAc,QAAU,EAEpB0B,IAEFV,EAAOC,CAAI,EAAIC,EAAI,SAAS,KAC1BnB,EACAC,EACAF,EACAe,EACAC,CACF,EAEJ,MAGEE,EAAOC,CAAI,EAAIC
,EAAI,KACjBnB,EACAC,EACAF,EACAe,EACAC,CACF,EAIJ,OAAOE,CACT,CAEQ,MACN3B,EACAa,EACAF,EACAF,EACA6B,EACAC,EACAC,EACAC,EACK,CACLF,EAAe,CAAoB,EAAEvC,EAAe,KAAK,SAAUW,EAAc,MAAM,EACvFA,EAAc,QAAU,EAExB,OAAW,CAACiB,EAAMC,CAAG,IAAK,KAAK,QAAS,CACtC,IAAMa,EAAO7B,EAAQe,CAAI,EAEzB,GAAI,MAAM,QAAQC,CAAG,EAAG,CAGtB,IAAM5C,EAAUyD,EAA8C,OAIxDC,EAAiBd,EAAI,CAAC,IAAM,OAOlC,GALIc,IACFJ,EAAe,CAAoB,EAAEvC,EAAef,EAAQ0B,EAAc,MAAM,EAChFA,EAAc,QAAU,GAGtB1B,EAAS,EAAG,CACd,IAAM8C,EAAWF,EAAI,CAAC,EAEtB,GAAI,OAAOE,GAAa,SAAU,CAGhC,GAAIY,EAAgB,CAClB,IAAMC,EAAyB3D,EAAS8C,EAAS,kBAEjDtB,GAAcmC,EACdN,GAAiBM,EAEb5C,EAAO,WAAasC,IACtBtC,EAASwC,EAAmBxC,EAAQsC,CAAa,EAErD,CAEA,QAAWO,KAAUH,EAEnB1C,EAAS+B,EAAS,MAChB/B,EACA6C,EACAlC,EACAF,EACA6B,EACAC,EACAC,EACAC,CACF,EAEAhC,EAAaE,EAAc,OAC3B2B,EAAgBtC,EAAO,UAE3B,SAAW+B,IAAa,GAEtB,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAAG,CAC/B,IAAMc,EAAOJ,EAA6BV,CAAC,EACrCC,EAASa,EAAI,OAEnBP,EAAe,CAAqB,EAAEvC,EAAeiC,EAAQtB,EAAc,MAAM,EACjFA,EAAc,QAAU,EAExB8B,EAAqBzC,EAAQW,EAAc,OAAQmC,CAAG,EACtDnC,EAAc,QAAUsB,CAC1B,KACK,CAEL,IAAMC,EAAWC,EAAUJ,CAAQ,EAEnC,GAAIY,EAAgB,CAClB,IAAMC,EAAyB3D,EAASiD,EAExCzB,GAAcmC,EACdN,GAAiBM,EAEb5C,EAAO,WAAasC,IACtBtC,EAASwC,EAAmBxC,EAAQsC,CAAa,EAErD,CAIA,QAAWS,KAAUL,EACnBH,EAAeR,CAAQ,EAAE/B,EAAe+C,EAAQpC,EAAc,MAAM,EACpEA,EAAc,QAAUuB,CAE5B,CACF,CACF,SAAW,OAAOL,GAAQ,SAExBU,EAAeV,CAAG,EAAE7B,EAAe0C,EAAgB/B,EAAc,MAAM,EACvEA,EAAc,QAAUwB,EAAUN,CAAG,UAC5BA,IAAQ,GAAI,CAErB,IAAMI,EAAUS,EAAgB,OAEhCH,EAAe,CAAqB,EAAEvC,EAAeiC,EAAQtB,EAAc,MAAM,EACjFA,EAAc,QAAU,EAExB8B,EAAqBzC,EAAQW,EAAc,OAAQ+B,CAAc,EACjE/B,EAAc,QAAUsB,CAC1B,SAAW,UAAWJ,EAAK,CAEzB,IAAIxC,EAAQ,EAEZ,QAAS+C,EAAM,EAAGA,EAAMP,EAAI,MAAM,OAAQ,EAAEO,EACrCM,EAAiCb,EAAI,MAAMO,CAAG,CAAC,IAClD/C,GAAS,GAAK+C,GAIlBG,EAAe,CAAoB,EAAEvC,EAAeX,EAAOsB,EAAc,MAAM,EAC/EA,EAAc,QAAU,CAC1B,KAAW,aAAckB,EACnBa,GACFH,EAAe,CAAoB,EAAEvC,EAAe,EAAGW,EAAc,MAAM,EAC3EA,EAAc,QAAU,EAExBF,GAAcoB,EAAI,SAAS,kBAC3BS,GAAiBT,EAAI,SAAS,kBAE1B7B,EAAO,WAAasC,IACtBtC,EAASwC,EAAmBxC,EAAQsC,CAAa,GAGnDtC,EAAS6B,EAAI,SAAS,MACpB7B,EACA0C,EACA/B,EACAF,EACA6B,EACAC,EACAC,EACAC,CACF,EAEAhC,EAAaE,EAAc,OAC3B2B,EAAgBtC,EAAO,aAEvBuC,EAAe,CAAoB,EAAEvC,EAAe,EAAGW,EAAc,MAAM,EAC3EA,EAAc,QAAU,IAI1BX,EAAS6B,EAAI,MACX7B,EACA0C,EACA/B,EACAF,EACA6B,EACAC,EACAC,EACAC,CACF,EAEAhC,EAAaE,EAAc,OAC3B2B,EAAgBtC,EAAO,WAE3B,CAEA,OAAOA,CACT,CAEQ,oCAAoCa,EAAoB,CAC9D,IAAImC,EAAM,KAAK,kBAEf,QAAWC,KAAS,KAAK,gBAAgB,CAAC,EAExCD,GAAQnC,EAAQoC,CAAK,EAAa,OAGpC,QAAWA,KAAS,KAAK,gBAAgB,CAAC,EAExC,QAAWC,KAAUrC,EAAQoC,CAAK,EAChCD,GAAO,EAAIE,EAAO,OAItB,QAAWD,KAAS,KAAK,gBAAgB,CAAC,EAExCD,GAAO,KAAK,gBAAgB,CAAC,EAAEC,CAAK,EAAE,oCACpCpC,EAAQoC,CAAK,CACf,EAGF,OAAOD,CACT,CACF,EA4FA,SAASnD,EAAYD,EAAwB,CAC3C,OAAO,OAAO,QAAQA,CAAU,EAAE,KAAK,CAAC,CAACuD,CAAU,EAAG,CAACC,CAAU,IAC/DD,EAAW,cAAcC,CAAU,CACrC,CACF,CAkBA,SAASrD,EAAesD,EAAkB,CAExC,IAAIC,EAAoB,EAElBC,EAAmC,CAAC,CAAC,EAAG,CAAC,EAAG,CAAC,CAAC,EAEpD,OAAW,CAAC3B,EAAM4B,CAAI,IAAKH,EACzB,GAAI,MAAM,QAAQG,CAAI,EACpB,GAAIA,EAAK,SAAW,EAAG,CAErB,IAAMC,EAAWD,EAAK,CAAC,IAAM,GAEvBtB,EACJ,OAAOsB,EAAK,CAAC,GAAM,SACfA,EAAK,CAAC,EAAE,kBACRC,EACE,EACAtB,EAAUqB,EAAK,CAAC,CAAC,EAEzBF,GAAqBE,EAAK,CAAC,EAAItB,EAE3BuB,GACFF,EAAgB,CAAC,EAAE,KAAK3B,CAAI,CAEhC,MAGE0B,GAAqB,EAEjBE,EAAK,CAAC,IAAM,IACdD,EAAgB,CAAC,EAAE,KAAK3B,CAAI,OAGvB4B,aAAgB/D,GACzB6D,GAAqBE,EAAK,kBAC1BD,EAAgB,CAAC,EAAE3B,CAAI,EAAI4B,GAClB,OAAOA,GAAS,SAIzBF,GAAqB,EACZE,IAAS,IAGlBF,GAAqB,EACrBC,EAAgB,CAAC,EAAE,KAAK3B,CAAI,GAE5B0B,GAAqBnB,EAAUqB,CAAI,EAIvC,MAAO,CAAE,kBAAAF,EAAmB,gBAAAC,CAAgB,CAC9C,CAQA,IAAMpB,EAAY,MAAM,CAAC,EAEzBA,EAAU,CAAoB,EAAI,EAClCA,EAAU,CAAW,EAAI,EAEzBA,EAAU,CAAqB,EAAI,EACnCA,EAAU,CAAY,EAAI,EAE1BA,EAAU,CAAqB,EAAI
,EACnCA,EAAU,CAAY,EAAI,EAC1BA,EAAU,CAAc,EAAI,EAE5BA,EAAU,CAAc,EAAI,EAE5B,IAAM5B,EAAe,MAAM,CAAC,EAE5BA,EAAa,CAAoB,EAAI,CAACmD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EAC3EpD,EAAa,CAAW,EAAI,CAACmD,EAAMC,IAAWD,EAAK,QAAQC,CAAM,EAEjEpD,EAAa,CAAqB,EAAI,CAACmD,EAAMC,IAAWD,EAAK,UAAUC,CAAM,EAC7EpD,EAAa,CAAY,EAAI,CAACmD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EAEnEpD,EAAa,CAAqB,EAAI,CAACmD,EAAMC,IAAWD,EAAK,UAAUC,CAAM,EAC7EpD,EAAa,CAAY,EAAI,CAACmD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EACnEpD,EAAa,CAAc,EAAI,CAACmD,EAAMC,IAAWD,EAAK,WAAWC,CAAM,EAEvEpD,EAAa,CAAc,EAAI,CAACmD,EAAMC,IAAWD,EAAK,WAAWC,CAAM,EAEvE,IAAM1C,EAAe,MAAM,CAAC,EAE5BA,EAAa,CAAoB,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,SAASC,EAAQC,CAAK,EACzF3C,EAAa,CAAW,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,QAAQC,EAAQC,CAAK,EAE/E3C,EAAa,CAAqB,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,UAAUC,EAAQC,CAAK,EAC3F3C,EAAa,CAAY,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,SAASC,EAAQC,CAAK,EAEjF3C,EAAa,CAAqB,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,UAAUC,EAAQC,CAAK,EAC3F3C,EAAa,CAAY,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,SAASC,EAAQC,CAAK,EACjF3C,EAAa,CAAc,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,WAAWC,EAAQC,CAAK,EAErF3C,EAAa,CAAc,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,WAAWC,EAAQC,CAAK,EAErF,IAAM9C,EAAmB,MAAM,CAAC,EAE5BF,IACFE,EAAiB,CAAoB,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,WAAWE,EAAOD,CAAM,EAC/F7C,EAAiB,CAAW,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,UAAUE,EAAOD,CAAM,EAErF7C,EAAiB,CAAqB,EAAI,CAAC4C,EAAME,EAAOD,IACtDD,EAAK,cAAcE,EAAOD,CAAM,EAClC7C,EAAiB,CAAY,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,aAAaE,EAAOD,CAAM,EAEzF7C,EAAiB,CAAqB,EAAI,CAAC4C,EAAME,EAAOD,IACtDD,EAAK,cAAcE,EAAOD,CAAM,EAClC7C,EAAiB,CAAY,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,aAAaE,EAAOD,CAAM,EACzF7C,EAAiB,CAAc,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,aAAaE,EAAOD,CAAM,EAE3F7C,EAAiB,CAAc,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,cAAcE,EAAOD,CAAM,GAG9F,IAAMtD,EAAmB,MAAM,CAAC,EAE5BO,IACFP,EAAiB,CAAoB,EAAI,CAACqD,EAAMC,IAAWD,EAAK,UAAUC,CAAM,EAChFtD,EAAiB,CAAW,EAAI,CAACqD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EAEtEtD,EAAiB,CAAqB,EAAI,CAACqD,EAAMC,IAAWD,EAAK,aAAaC,CAAM,EAEpFtD,EAAiB,CAAY,EAAI,CAACqD,EAAMC,IAAWD,EAAK,YAAYC,CAAM,EAC1EtD,EAAiB,CAAqB,EAAI,CAACqD,EAAMC,IAAWD,EAAK,aAAaC,CAAM,EAEpFtD,EAAiB,CAAY,EAAI,CAACqD,EAAMC,IAAWD,EAAK,YAAYC,CAAM,EAC1EtD,EAAiB,CAAc,EAAI,CAACqD,EAAMC,IAAWD,EAAK,YAAYC,CAAM,EAC5EtD,EAAiB,CAAc,EAAI,CAACqD,EAAMC,IAAWD,EAAK,aAAaC,CAAM","names":["hasNodeBuffers","growDataView","dataview","newByteLength","resizedBuffer","amountToCopy","length","offset","growNodeBuffer","buffer","newBuffer","textEncoder","textDecoder","encodeStringIntoDataView","byteOffset","string","strlen","u8Buffer","encodeSmallString","encodeStringIntoNodeBuffer","i","decodeStringFromNodeBuffer","decodeStringFromDataView","Field","FieldArray","item","FieldFixedArray","length","SequentialSerializer","iterable","FieldBitFlags","flags","FieldString","FieldOptional","packet","BinaryPacket","_BinaryPacket","packetId","definition","sortEntries","inspection","inspectEntries","buffer","byteOffset","dataview","arraybuffer","visitors","GET_FUNCTION_BUF","decodeStringFromNodeBuffer","GET_FUNCTION","decodeStringFromDataView","byteLength","dataIn","offsetPointer","hasNodeBuffers","dataOut","SET_FUNCTION_BUF","growNodeBuffer","encodeStringIntoNodeBuffer","SET_FUNCTION","growDataView","encodeStringIntoDataView","buf","onVisit","numElements","element","readFunctions","decodeStringFunction","Packet","result","name","def","array","itemType","i","strlen","itemSize","BYTE_SIZE","bit","hasSubPacket","maxByteLength","writeFunctions","growBufferFunction","encodeStringFunction","data","isDynamicArray","neededBytesForElements","object","str","number","len
","field","string","fieldName1","fieldName2","entries","minimumByteLength","stringPositions","type","isString","view","offset","value"]}
1
+ {"version":3,"sources":["../src/buffers.ts","../src/index.ts"],"sourcesContent":["/**\n * Exclusively matches objects of type `ArrayBuffer` and no other types that inherit from it. \\\n * This is needed because the `DataView` constructor explicitly requires a \"true\" ArrayBuffer, or else it throws.\n */\nexport type TrueArrayBuffer = ArrayBuffer & { buffer?: undefined }\n\nexport const hasNodeBuffers = typeof Buffer === 'function'\n\nexport function growDataView(dataview: DataView, newByteLength: number) {\n const resizedBuffer = new ArrayBuffer(newByteLength)\n const amountToCopy = Math.min(dataview.byteLength, resizedBuffer.byteLength)\n\n // Treat the buffer as if it was a Float64Array so we can copy 8 bytes at a time, to finish faster\n let length = Math.trunc(amountToCopy / 8)\n new Float64Array(resizedBuffer, 0, length).set(new Float64Array(dataview.buffer, 0, length))\n\n // Copy the remaining up to 7 bytes\n const offset = length * 8\n length = amountToCopy - offset\n new Uint8Array(resizedBuffer, offset, length).set(new Uint8Array(dataview.buffer, offset, length))\n\n return new DataView(resizedBuffer)\n}\n\nexport function growNodeBuffer(buffer: Buffer, newByteLength: number) {\n const newBuffer = Buffer.allocUnsafe(newByteLength)\n buffer.copy(newBuffer)\n return newBuffer\n}\n\nconst textEncoder = new TextEncoder()\nconst textDecoder = new TextDecoder()\n\nexport function encodeStringIntoDataView(dataview: DataView, byteOffset: number, string: string) {\n const strlen = string.length\n const u8Buffer = new Uint8Array(dataview.buffer, dataview.byteOffset + byteOffset, strlen)\n\n if (strlen <= 64) {\n encodeSmallString(u8Buffer, 0, string, strlen)\n } else {\n textEncoder.encodeInto(string, u8Buffer)\n }\n}\n\nexport function encodeStringIntoNodeBuffer(buffer: Buffer, byteOffset: number, string: string) {\n const strlen = string.length\n\n if (strlen <= 64) {\n encodeSmallString(buffer, byteOffset, string, strlen)\n } else {\n buffer.utf8Write(string, byteOffset, strlen)\n }\n}\n\nfunction encodeSmallString(buffer: Uint8Array, byteOffset: number, string: string, strlen: number) {\n for (let i = 0; i < strlen; ++i) {\n buffer[byteOffset + i] = string.charCodeAt(i) & 0xff\n }\n}\n\nexport function decodeStringFromNodeBuffer(buffer: Buffer, byteOffset: number, strlen: number) {\n return buffer.subarray(byteOffset, byteOffset + strlen).toString('utf8')\n}\n\nexport function decodeStringFromDataView(dataview: DataView, byteOffset: number, strlen: number) {\n return textDecoder.decode(new DataView(dataview.buffer, dataview.byteOffset + byteOffset, strlen))\n}\n\ndeclare global {\n interface Buffer {\n /**\n * Node buffer's internals function. \\\n * For some reason it is not exposed through TypeScript. \\\n * Fastest way to write utf8 strings into buffers.\n */\n utf8Write(string: string, byteOffset?: number, byteLength?: number): number\n }\n}\n","import {\n decodeStringFromDataView,\n decodeStringFromNodeBuffer,\n encodeStringIntoDataView,\n encodeStringIntoNodeBuffer,\n growDataView,\n growNodeBuffer,\n hasNodeBuffers,\n type TrueArrayBuffer\n} from './buffers'\n\nexport const enum Field {\n /**\n * Defines a 1 byte (8 bits) unsigned integer field. \\\n * (Range: 0 - 255)\n */\n UNSIGNED_INT_8 = 0,\n\n /**\n * Defines a 2 bytes (16 bits) unsigned integer field. \\\n * (Range: 0 - 65535)\n */\n UNSIGNED_INT_16,\n\n /**\n * Defines a 4 bytes (32 bits) unsigned integer field. 
\\\n * (Range: 0 - 4294967295)\n */\n UNSIGNED_INT_32,\n\n /**\n * Defines a 1 byte (8 bits) signed integer field. \\\n * (Range: -128 - 127)\n */\n INT_8,\n\n /**\n * Defines a 2 bytes (16 bits) signed integer field. \\\n * (Range: -32768 - 32767)\n */\n INT_16,\n\n /**\n * Defines a 4 bytes (32 bits) signed integer field. \\\n * (Range: -2147483648 - 2147483647)\n */\n INT_32,\n\n /**\n * Defines a 4 bytes (32 bits) floating-point field. \\\n */\n FLOAT_32,\n\n /**\n * Defines a 8 bytes (64 bits) floating-point field. \\\n */\n FLOAT_64\n}\n\n/**\n * Defines a dynamically-sized array with elements of a certain type. \\\n * Dynamically-sized arrays are useful when a packet's field is an array of a non pre-defined length. \\\n * Although, this makes dynamically-sized arrays more memory expensive as the internal buffer needs to be grown accordingly.\n *\n * NOTE: If an array will ALWAYS have the same length, prefer using the `FieldFixedArray` type, for both better performance and memory efficiency. \\\n * NOTE: As of now, dynamic arrays can have at most 256 elements.\n */\nexport function FieldArray<T extends Field | BinaryPacket<Definition> | ''>(\n item: T\n): [itemType: T] {\n return [item]\n}\n\n/**\n * Defines a statically-sized array with elements of a certain type. \\\n * Fixed arrays are useful when a packet's field is an array of a pre-defined length. \\\n * Fixed arrays much more memory efficient and performant than non-fixed ones.\n *\n * NOTE: If an array will not always have the same length, use the `FieldArray` type.\n */\nexport function FieldFixedArray<\n T extends Field | BinaryPacket<Definition> | '',\n Length extends number\n>(item: T, length: Length): [itemType: T, length: Length] {\n if (length < 0 || !Number.isFinite(length)) {\n throw new RangeError('Length of a FixedArray must be a positive integer.')\n }\n\n return [item, length]\n}\n\n/**\n * Utility class that allows serializing arrays through any kind of iterable, as long as the number of elements is known beforehand. \\\n * Needed to skip the overhead of duplicating the data into an actual array just for it to be serialized straight away and trashed.\n */\nexport class SequentialSerializer<T> implements Iterable<T> {\n constructor(\n private readonly iterable: Iterable<T>,\n public readonly length: number\n ) {}\n\n [Symbol.iterator]() {\n return this.iterable[Symbol.iterator]()\n }\n}\n\n/**\n * Either an array or a SequentialSerializer<T>.\n *\n * Note: when a packet is **read**, it will **always** be a standard array: the SequentialSerializer \\\n * is just a utility to serialize iterators avoiding data duplication and array-creation overheads.\n */\ntype SequentiallySerializable<T, IsRead extends boolean> = IsRead extends true\n ? T[]\n : T[] | SequentialSerializer<T>\n\ntype BitFlags = (string[] | ReadonlyArray<string>) & {\n length: 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8\n}\n\n/**\n * Defines a sequence of up to 8 \"flags\" (basically single bits/booleans) that can be packed together into a single 8 bits value. \\\n * This is useful for minimizing bytes usage when there are lots of boolean fields/flags, instead of saving each flag separately as its own 8 bits value.\n *\n * The input should be an array of strings (with at most 8 elements) where each string defines the name of a flag. 
\\\n * This is just for definition purposes, then when actually writing or reading packets it'll just be a record-object with those names as keys and boolean values.\n */\nexport function FieldBitFlags<const FlagsArray extends BitFlags>(flags: FlagsArray) {\n if (flags.length > 8) {\n throw new Error(\n `Invalid BinaryPacket definition: a BitFlags field can have only up to 8 flags, given: ${flags.join(', ')}`\n )\n }\n\n return { flags }\n}\n\n/**\n * Defines a string field. \\\n * Strings cannot be more than 65536 characters long.\n *\n * NOTE: Only strings containing just ASCII and/or single-octet UTF-8 characters are supported.\n */\nexport function FieldString() {\n return '' as const\n}\n\n/**\n * Defines an optional BinaryPacket \"subpacket\" field. \\\n * When writing and reading packets it'll be possible to provide and receive `undefined` instead of a valid object.\n */\nexport function FieldOptional<T extends BinaryPacket<Definition>>(packet: T) {\n return { optional: packet }\n}\n\n/**\n * Do not manually construct this type: an object of this kind is returned by a BinaryPacket `createVisitor` method. \\\n * Used in the `BinaryPacket::visit` static method to perform a sort of \"pattern matching\" on an incoming packet (of yet unknown type) buffer.\n */\ntype Visitor = [BinaryPacket<Definition>, (packet: any) => void]\n\nexport class BinaryPacket<T extends Definition> {\n /**\n * Defines a new binary packet. \\\n * Make sure that every `packetId` is unique.\n * @throws RangeError If packetId is negative, floating-point, or greater than 255.\n */\n static define<T extends Definition>(packetId: number, definition?: T) {\n if (packetId < 0 || !Number.isFinite(packetId)) {\n throw new RangeError('Packet IDs must be positive integers.')\n }\n\n if (packetId > 255) {\n throw new RangeError(\n 'Packet IDs greater than 255 are not supported. Do you REALLY need more than 255 different kinds of packets?'\n )\n }\n\n return new BinaryPacket(packetId, definition)\n }\n\n /**\n * Reads just the packetId from the given Buffer. \\\n * This method practically just reads the uint8 at offset `byteOffset` (default: 0). \\\n * Useful if the receiving side receives multiple types of packets.\n */\n static readPacketIdNodeBuffer(buffer: Buffer, byteOffset = 0) {\n return buffer.readUint8(byteOffset)\n }\n\n /**\n * Reads just the packetId from the given DataView. \\\n * This method practically just reads the uint8 at offset `byteOffset` (default: 0). \\\n * Useful if the receiving side receives multiple types of packets.\n */\n static readPacketIdDataView(dataview: DataView, byteOffset = 0) {\n return dataview.getUint8(byteOffset)\n }\n\n /**\n * Reads just the packetId from the given ArrayBuffer. \\\n * This method practically just reads the uint8 at offset `byteOffset`. \\\n * Useful if the receiving side receives multiple types of packets.\n *\n * NOTE: Due to security issues, the `byteOffset` argument cannot be defaulted and must be provided by the user. \\\n * NOTE: For more information read the `readArrayBuffer` method documentation.\n */\n static readPacketIdArrayBuffer(arraybuffer: TrueArrayBuffer, byteOffset: number) {\n return new Uint8Array(arraybuffer, byteOffset, 1)[0]\n }\n\n /**\n * Visits and \"pattern matches\" the given Buffer through the given visitors. 
\\\n * The Buffer is compared to the series of visitors through its Packet ID, and, if an appropriate visitor is found: its callback is called.\n *\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n static visitNodeBuffer(buffer: Buffer, ...visitors: Visitor[]) {\n return BinaryPacket.visit(buffer, GET_FUNCTION_BUF, decodeStringFromNodeBuffer, visitors)\n }\n\n /**\n * Visits and \"pattern matches\" the given DataView through the given visitors. \\\n * The DataView is compared to the series of visitors through its Packet ID, and, if an appropriate visitor is found: its callback is called.\n *\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n static visitDataView(dataview: DataView, ...visitors: Visitor[]) {\n return BinaryPacket.visit(dataview, GET_FUNCTION, decodeStringFromDataView, visitors)\n }\n\n /**\n * Visits and \"pattern matches\" the given ArrayBuffer through the given visitors. \\\n * The ArrayBuffer is compared to the series of visitors through its Packet ID, and, if an appropriate visitor is found: its callback is called.\n *\n * NOTE: Due to security issues, the `byteOffset` and `byteLength` arguments must be provided by the user. \\\n * NOTE: For more information read the `readArrayBuffer` method documentation. \\\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n static visitArrayBuffer(\n arraybuffer: TrueArrayBuffer,\n byteOffset: number,\n byteLength: number,\n ...visitors: Visitor[]\n ) {\n return BinaryPacket.visit(\n new DataView(arraybuffer, byteOffset, byteLength),\n GET_FUNCTION,\n decodeStringFromDataView,\n visitors\n )\n }\n\n /**\n * Reads/deserializes from the given Buffer. \\\n * Method available ONLY on NodeJS and Bun.\n *\n * If possible, always prefer reading using this method, as it is much faster than the other ones.\n *\n * NOTE: if you have an ArrayBuffer do not bother wrapping it into a node Buffer yourself. \\\n * NOTE: if you have an ArrayBuffer use the appropriate `readArrayBuffer`.\n */\n readNodeBuffer(\n dataIn: Buffer,\n offsetPointer = { offset: 0 },\n byteLength = dataIn.byteLength\n ): ToJson<T, true> {\n return this.read(\n dataIn,\n offsetPointer,\n byteLength,\n GET_FUNCTION_BUF,\n decodeStringFromNodeBuffer\n )\n }\n\n /**\n * Reads/deserializes from the given DataView.\n *\n * NOTE: if you have an ArrayBuffer do not bother wrapping it into a DataView yourself. \\\n * NOTE: if you have an ArrayBuffer use the appropriate `readArrayBuffer`.\n */\n readDataView(\n dataIn: DataView,\n offsetPointer = { offset: 0 },\n byteLength = dataIn.byteLength\n ): ToJson<T, true> {\n return this.read(dataIn, offsetPointer, byteLength, GET_FUNCTION, decodeStringFromDataView)\n }\n\n /**\n * Reads/deserializes from the given ArrayBuffer. \\\n * WARNING: this method is practically a HACK.\n *\n * When using this method both the `byteOffset` and `byteLength` are REQUIRED and cannot be defaulted. \\\n * This is to prevent serious bugs and security issues. 
\\\n * That is because often raw ArrayBuffers come from a pre-allocated buffer pool and do not start at byteOffset 0.\n *\n * NOTE: if you have a node Buffer do not bother wrapping it into an ArrayBuffer yourself. \\\n * NOTE: if you have a node Buffer use the appropriate `readNodeBuffer` as it is much faster and less error prone.\n */\n readArrayBuffer(dataIn: TrueArrayBuffer, byteOffset: number, byteLength: number) {\n return this.read(\n hasNodeBuffers\n ? Buffer.from(dataIn, byteOffset, byteLength)\n : (new DataView(dataIn, byteOffset, byteLength) as any),\n { offset: 0 }, // The underlying buffer has already been offsetted\n byteLength,\n hasNodeBuffers ? GET_FUNCTION_BUF : GET_FUNCTION,\n hasNodeBuffers ? decodeStringFromNodeBuffer : (decodeStringFromDataView as any)\n )\n }\n\n /**\n * Writes/serializes the given object into a Buffer. \\\n * Method available ONLY on NodeJS and Bun.\n *\n * If possible, always prefer writing using this method, as it is much faster than the other ones.\n */\n writeNodeBuffer(dataOut: ToJson<T>) {\n const byteLength = this.precalculateBufferLengthWithStrings(dataOut)\n const buffer = Buffer.allocUnsafe(byteLength)\n\n return this.write(\n buffer,\n dataOut,\n { offset: 0 },\n byteLength,\n byteLength,\n SET_FUNCTION_BUF,\n growNodeBuffer,\n encodeStringIntoNodeBuffer\n )\n }\n\n /**\n * Writes/serializes the given object into a DataView. \\\n */\n writeDataView(dataOut: ToJson<T>) {\n const byteLength = this.precalculateBufferLengthWithStrings(dataOut)\n const dataview = new DataView(new ArrayBuffer(byteLength))\n\n return this.write(\n dataview,\n dataOut,\n { offset: 0 },\n byteLength,\n byteLength,\n SET_FUNCTION,\n growDataView,\n encodeStringIntoDataView\n )\n }\n\n /**\n * Writes/serializes the given object into an ArrayBuffer. \\\n * This method is just a wrapper around either `writeNodeBuffer` or `writeDataView`. \\\n *\n * This method works with JavaScript standard raw ArrayBuffer(s) and, as such, is very error prone: \\\n * Make sure you're using the returned byteLength and byteOffset fields in the read counterpart. \\\n *\n * Always consider whether is possible to use directly `writeNodeBuffer` or `writeDataView` instead of `writeArrayBuffer`. \\\n * For more information read the `readArrayBuffer` documentation.\n */\n writeArrayBuffer(dataOut: ToJson<T>) {\n const buf = hasNodeBuffers ? this.writeNodeBuffer(dataOut) : this.writeDataView(dataOut)\n return { buffer: buf.buffer, byteLength: buf.byteLength, byteOffset: buf.byteOffset }\n }\n\n /**\n * Creates a \"visitor\" object for this BinaryPacket definition. \\\n * Used when visiting and \"pattern matching\" buffers with the `BinaryPacket::visit` static utility methods. \\\n *\n * For more information read the `BinaryPacket::visitNodeBuffer` documentation. 
\\\n * NOTE: If visiting packets in a loop, for both performance and memory efficiency reasons, it is much better to create each visitor only once before the loop starts and not every iteration.\n */\n visitor(onVisit: (packet: ToJson<T>) => void): Visitor {\n return [this, onVisit]\n }\n\n sequentialSerializer(numElements: number, dataOut: Iterable<ToJson<T>>) {\n const byteLength = this.minimumByteLength * numElements\n const buffer = Buffer.allocUnsafe(byteLength)\n const offsetPointer = { offset: 0 }\n\n for (const element of dataOut) {\n this.write(\n buffer,\n element,\n offsetPointer,\n byteLength,\n byteLength,\n SET_FUNCTION_BUF,\n growNodeBuffer,\n encodeStringIntoNodeBuffer\n )\n }\n }\n\n /// PRIVATE\n\n private readonly entries: Entries\n readonly stringPositions: StringPositions\n readonly minimumByteLength: number\n\n private constructor(\n private readonly packetId: number,\n definition?: T\n ) {\n this.entries = definition ? sortEntries(definition) : []\n const inspection = inspectEntries(this.entries)\n this.minimumByteLength = inspection.minimumByteLength\n this.stringPositions = inspection.stringPositions\n }\n\n private static visit<Buf extends DataView | Buffer>(\n dataIn: Buf,\n readFunctions: typeof GET_FUNCTION | typeof GET_FUNCTION_BUF,\n decodeStringFunction: (dataIn: Buf, byteOffset: number, strlen: number) => string,\n visitors: Visitor[]\n ) {\n for (const [Packet, onVisit] of visitors) {\n if (Packet.packetId === readFunctions[Field.UNSIGNED_INT_8](dataIn as any, 0)) {\n return onVisit(\n Packet.read(dataIn, { offset: 0 }, dataIn.byteLength, readFunctions, decodeStringFunction)\n )\n }\n }\n }\n\n private read<Buf extends DataView | Buffer>(\n dataIn: Buf,\n offsetPointer: { offset: number },\n byteLength: number,\n readFunctions: typeof GET_FUNCTION | typeof GET_FUNCTION_BUF,\n decodeStringFunction: (dataIn: Buf, byteOffset: number, strlen: number) => string\n ): ToJson<T, true> {\n if (byteLength + offsetPointer.offset < this.minimumByteLength) {\n throw new Error(\n `There is no space available to fit a packet of type ${this.packetId} at offset ${offsetPointer.offset}`\n )\n }\n\n if (\n readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset) !== this.packetId\n ) {\n throw new Error(\n `Data at offset ${offsetPointer.offset} is not a packet of type ${this.packetId}`\n )\n }\n\n offsetPointer.offset += 1\n const result: any = {}\n\n for (const [name, def] of this.entries) {\n if (Array.isArray(def)) {\n const length =\n // def[1] is the length of a statically-sized array, if undefined: must read the length from the buffer as it means it's a dynamically-sized array\n def[1] ?? 
readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset++)\n\n const array = Array(length)\n\n const itemType = def[0]\n\n if (typeof itemType === 'object') {\n // Array of \"subpackets\"\n for (let i = 0; i < length; ++i) {\n array[i] = itemType.read(\n dataIn,\n offsetPointer,\n byteLength,\n readFunctions,\n decodeStringFunction\n )\n }\n } else if (itemType === '') {\n // Array of strings\n for (let i = 0; i < length; ++i) {\n const strlen = readFunctions[Field.UNSIGNED_INT_16](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += 2\n\n array[i] = decodeStringFunction(dataIn, offsetPointer.offset, strlen)\n offsetPointer.offset += strlen\n }\n } else {\n // Array of primitives (numbers)\n const itemSize = BYTE_SIZE[itemType]\n\n // It seems like looping over each element is actually much faster than using TypedArrays bulk copy.\n // TODO: properly benchmark with various array sizes to see if it's actually the case.\n for (let i = 0; i < length; ++i) {\n array[i] = readFunctions[itemType](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += itemSize\n }\n }\n\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = array\n } else if (typeof def === 'number') {\n // Single primitive (number)\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = readFunctions[def](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += BYTE_SIZE[def]\n } else if (def === '') {\n const strlen = readFunctions[Field.UNSIGNED_INT_16](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += 2\n\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = decodeStringFunction(dataIn, offsetPointer.offset, strlen)\n offsetPointer.offset += strlen\n } else if ('flags' in def) {\n // BitFlags\n const flags = readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset)\n offsetPointer.offset += 1\n\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = {}\n\n for (let bit = 0; bit < def.flags.length; ++bit) {\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name][def.flags[bit]] = !!(flags & (1 << bit))\n }\n } else if ('optional' in def) {\n // Single optional \"subpacket\"\n const hasSubPacket =\n readFunctions[Field.UNSIGNED_INT_8](dataIn as any, offsetPointer.offset) !== 0\n\n offsetPointer.offset += 1\n\n if (hasSubPacket) {\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = def.optional.read(\n dataIn,\n offsetPointer,\n byteLength,\n readFunctions,\n decodeStringFunction\n )\n }\n } else {\n // Single \"subpacket\"\n // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access\n result[name] = def.read(\n dataIn,\n offsetPointer,\n byteLength,\n readFunctions,\n decodeStringFunction\n )\n }\n }\n\n return result as ToJson<T, true>\n }\n\n private write<Buf extends DataView | Buffer>(\n buffer: Buf,\n dataOut: ToJson<T>,\n offsetPointer: { offset: number },\n byteLength: number,\n maxByteLength: number,\n writeFunctions: typeof SET_FUNCTION | typeof SET_FUNCTION_BUF,\n growBufferFunction: (buffer: Buf, newByteLength: number) => Buf,\n encodeStringFunction: (buffer: Buf, byteOffset: number, string: string) => void\n ): Buf {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, this.packetId, offsetPointer.offset)\n offsetPointer.offset += 1\n\n for (const [name, def] of this.entries) {\n const data = dataOut[name]\n\n if 
(Array.isArray(def)) {\n // Could be both an array of just numbers or \"subpackets\"\n\n const length = (data as SequentiallySerializable<any, false>).length\n\n // Check if it is a dynamically-sized array, if it is, the length of the array must be serialized in the buffer before its elements\n // Explicitly check for undefined and not falsy values because it could be a statically-sized array of 0 elements.\n const isDynamicArray = def[1] === undefined\n\n if (isDynamicArray) {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, length, offsetPointer.offset)\n offsetPointer.offset += 1\n }\n\n if (length > 0) {\n const itemType = def[0]\n\n if (typeof itemType === 'object') {\n // Array of \"subpackets\"\n\n if (isDynamicArray) {\n const neededBytesForElements = length * itemType.minimumByteLength\n\n byteLength += neededBytesForElements\n maxByteLength += neededBytesForElements\n\n if (buffer.byteLength < maxByteLength) {\n buffer = growBufferFunction(buffer, maxByteLength)\n }\n }\n\n for (const object of data as unknown as ToJson<Definition>[]) {\n // Array of \"subpackets\"\n buffer = itemType.write(\n buffer,\n object,\n offsetPointer,\n byteLength,\n maxByteLength,\n writeFunctions,\n growBufferFunction,\n encodeStringFunction\n )\n\n byteLength = offsetPointer.offset\n maxByteLength = buffer.byteLength\n }\n } else if (itemType === '') {\n // Array of strings\n for (let i = 0; i < length; ++i) {\n const str = (data as unknown as string[])[i]\n const strlen = str.length\n\n writeFunctions[Field.UNSIGNED_INT_16](buffer as any, strlen, offsetPointer.offset)\n offsetPointer.offset += 2\n\n encodeStringFunction(buffer, offsetPointer.offset, str)\n offsetPointer.offset += strlen\n }\n } else {\n // Array of primitives (numbers)\n const itemSize = BYTE_SIZE[itemType]\n\n if (isDynamicArray) {\n const neededBytesForElements = length * itemSize\n\n byteLength += neededBytesForElements\n maxByteLength += neededBytesForElements\n\n if (buffer.byteLength < maxByteLength) {\n buffer = growBufferFunction(buffer, maxByteLength)\n }\n }\n\n // It seems like looping over each element is actually much faster than using TypedArrays bulk copy.\n // TODO: properly benchmark with various array sizes to see if it's actually the case.\n for (const number of data as SequentiallySerializable<number, false>) {\n writeFunctions[itemType](buffer as any, number, offsetPointer.offset)\n offsetPointer.offset += itemSize\n }\n }\n }\n } else if (typeof def === 'number') {\n // Single primitive (number)\n writeFunctions[def](buffer as any, data as number, offsetPointer.offset)\n offsetPointer.offset += BYTE_SIZE[def]\n } else if (def === '') {\n // String\n const strlen = (data as string).length\n\n writeFunctions[Field.UNSIGNED_INT_16](buffer as any, strlen, offsetPointer.offset)\n offsetPointer.offset += 2\n\n encodeStringFunction(buffer, offsetPointer.offset, data as string)\n offsetPointer.offset += strlen\n } else if ('flags' in def) {\n // BitFlags\n let flags = 0\n\n for (let bit = 0; bit < def.flags.length; ++bit) {\n if ((data as Record<string, boolean>)[def.flags[bit]]) {\n flags |= 1 << bit\n }\n }\n\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, flags, offsetPointer.offset)\n offsetPointer.offset += 1\n } else if ('optional' in def) {\n if (data) {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, 1, offsetPointer.offset)\n offsetPointer.offset += 1\n\n byteLength += def.optional.minimumByteLength\n maxByteLength += def.optional.minimumByteLength\n\n if (buffer.byteLength < maxByteLength) 
{\n buffer = growBufferFunction(buffer, maxByteLength)\n }\n\n buffer = def.optional.write(\n buffer,\n data as ToJson<Definition>,\n offsetPointer,\n byteLength,\n maxByteLength,\n writeFunctions,\n growBufferFunction,\n encodeStringFunction\n )\n\n byteLength = offsetPointer.offset\n maxByteLength = buffer.byteLength\n } else {\n writeFunctions[Field.UNSIGNED_INT_8](buffer as any, 0, offsetPointer.offset)\n offsetPointer.offset += 1\n }\n } else {\n // Single \"subpacket\"\n buffer = def.write(\n buffer,\n data as ToJson<Definition>,\n offsetPointer,\n byteLength,\n maxByteLength,\n writeFunctions,\n growBufferFunction,\n encodeStringFunction\n )\n\n byteLength = offsetPointer.offset\n maxByteLength = buffer.byteLength\n }\n }\n\n return buffer\n }\n\n private precalculateBufferLengthWithStrings(dataOut: ToJson<T>) {\n let len = this.minimumByteLength\n\n for (const field of this.stringPositions[0]) {\n // String field\n len += (dataOut[field] as string).length\n }\n\n for (const field of this.stringPositions[1]) {\n // Array of strings field\n for (const string of dataOut[field] as unknown as string[]) {\n len += 2 + string.length\n }\n }\n\n for (const field in this.stringPositions[2]) {\n // Subpacket that has some string fields\n len += this.stringPositions[2][field].precalculateBufferLengthWithStrings(\n dataOut[field] as any\n )\n }\n\n return len\n }\n}\n\n/**\n * BinaryPacket definition: \\\n * Any packet can be defined through a \"schema\" object explaining its fields names and types.\n *\n * @example\n * // Imagine we have a game board where each cell is a square and is one unit big.\n * // A cell can be then defined by its X and Y coordinates.\n * // For simplicity, let's say there cannot be more than 256 cells, so we can use 8 bits for each coordinate.\n * const Cell = {\n * x: Field.UNSIGNED_INT_8,\n * y: Field.UNSIGNED_INT_8\n * }\n *\n * // When done with the cell definition we can create its BinaryPacket writer/reader.\n * // NOTE: each BinaryPacket needs an unique ID, for identification purposes and error checking.\n * const CellPacket = BinaryPacket.define(0, Cell)\n *\n * // Let's now make the definition of the whole game board.\n * // You can also specify arrays of both \"primitive\" fields and other BinaryPackets.\n * const Board = {\n * numPlayers: Field.UNSIGNED_INT_8,\n * cells: FieldArray(CellPacket)\n * }\n *\n * // When done with the board definition we can create its BinaryPacket writer/reader.\n * // NOTE: each BinaryPacket needs an unique ID, for identification purposes and error checking.\n * const BoardPacket = BinaryPacket.define(1, Board)\n *\n * // And use it.\n * const buffer = BoardPacket.writeNodeBuffer({\n * numPlayers: 1,\n * cells: [\n * { x: 0, y: 0 },\n * { x: 1, y: 1 }\n * ]\n * })\n *\n * // sendTheBufferOver(buffer)\n * // ...\n * // const buffer = receiveTheBuffer()\n * const board = BoardPacket.readNodeBuffer(buffer)\n * // ...\n */\nexport type Definition = {\n [fieldName: string]:\n | MaybeArray<Field>\n | MaybeArray<BinaryPacket<Definition>>\n | MaybeArray<''>\n | { flags: BitFlags }\n | { optional: BinaryPacket<Definition> }\n}\n\ntype MaybeArray<T> = T | [itemType: T] | [itemType: T, length: number]\n\ntype BitFlagsToJson<FlagsArray extends BitFlags> = {\n [key in FlagsArray[number]]: boolean\n}\n\n/**\n * Meta-type that converts a `Definition` schema to the type of the actual JavaScript object that will be written into a packet or read from. 
\\\n */\nexport type ToJson<T extends Definition, IsRead extends boolean = false> = {\n [K in keyof T]: T[K] extends [infer Item]\n ? Item extends BinaryPacket<infer BPDef>\n ? SequentiallySerializable<ToJson<BPDef, IsRead>, IsRead>\n : Item extends ''\n ? SequentiallySerializable<string, IsRead>\n : SequentiallySerializable<number, IsRead>\n : T[K] extends [infer Item, infer Length]\n ? Item extends BinaryPacket<infer BPDef>\n ? SequentiallySerializable<ToJson<BPDef, IsRead>, IsRead> & { length: Length }\n : Item extends ''\n ? string[] & { length: Length }\n : number[] & { length: Length }\n : T[K] extends BinaryPacket<infer BPDef>\n ? ToJson<BPDef, IsRead>\n : T[K] extends { flags: infer FlagsArray extends BitFlags }\n ? BitFlagsToJson<FlagsArray>\n : T[K] extends { optional: BinaryPacket<infer BPDef extends Definition> }\n ? ToJson<BPDef, IsRead> | undefined\n : T[K] extends ''\n ? string\n : number\n}\n\n/**\n * In a JavaScript object, the order of its keys is not strictly defined: sort them by field name. \\\n * Thus, we cannot trust iterating over an object keys: we MUST iterate over its entries array. \\\n * This is important to make sure that whoever shares BinaryPacket definitions can correctly write/read packets independently of their JS engines.\n */\nfunction sortEntries(definition: Definition) {\n return Object.entries(definition).sort(([fieldName1], [fieldName2]) =>\n fieldName1.localeCompare(fieldName2)\n )\n}\n\ntype Entries = ReturnType<typeof sortEntries>\n\ntype StringPositions = [\n string[],\n string[],\n {\n [field: string]: BinaryPacket<Definition>\n }\n]\n\n/**\n * Helper function that \"inspects\" the entries of a BinaryPacket definition\n * and returns useful \"stats\" needed for writing and reading buffers.\n *\n * This function is ever called only once per BinaryPacket definition.\n */\nfunction inspectEntries(entries: Entries) {\n // The PacketID is already 1 byte, that's why we aren't starting from 0.\n let minimumByteLength = 1\n\n const stringPositions: StringPositions = [[], [], {}]\n\n for (const [name, type] of entries) {\n if (Array.isArray(type)) {\n if (type.length === 2) {\n // Statically-sized array\n const isString = type[0] === ''\n\n const itemSize =\n typeof type[0] === 'object'\n ? type[0].minimumByteLength\n : isString\n ? 
2\n : BYTE_SIZE[type[0]]\n\n minimumByteLength += type[1] * itemSize\n\n if (isString) {\n stringPositions[1].push(name)\n }\n } else {\n // Dynamically-sized array\n // Adding 1 byte to serialize the array length\n minimumByteLength += 1\n\n if (type[0] === '') {\n stringPositions[1].push(name)\n }\n }\n } else if (type instanceof BinaryPacket) {\n minimumByteLength += type.minimumByteLength\n stringPositions[2][name] = type\n } else if (typeof type === 'object') {\n // BitFlags & Optionals\n // BitFlags are always 1 byte long, because they can hold up to 8 booleans\n // Optionals minimum is 1 byte long, because it holds whether the subpacket is present or not\n minimumByteLength += 1\n } else if (type === '') {\n // String\n // Adding 2 to serialize the string length\n minimumByteLength += 2\n stringPositions[0].push(name)\n } else {\n minimumByteLength += BYTE_SIZE[type]\n }\n }\n\n return { minimumByteLength, stringPositions }\n}\n\n//////////////////////////////////////////////\n// The logic here is practically over //\n// Here below there are needed constants //\n// that map a field-type to a functionality //\n//////////////////////////////////////////////\n\nconst BYTE_SIZE = Array(8) as number[]\n\nBYTE_SIZE[Field.UNSIGNED_INT_8] = 1\nBYTE_SIZE[Field.INT_8] = 1\n\nBYTE_SIZE[Field.UNSIGNED_INT_16] = 2\nBYTE_SIZE[Field.INT_16] = 2\n\nBYTE_SIZE[Field.UNSIGNED_INT_32] = 4\nBYTE_SIZE[Field.INT_32] = 4\nBYTE_SIZE[Field.FLOAT_32] = 4\n\nBYTE_SIZE[Field.FLOAT_64] = 8\n\nconst GET_FUNCTION = Array(8) as ((view: DataView, offset: number) => number)[]\n\nGET_FUNCTION[Field.UNSIGNED_INT_8] = (view, offset) => view.getUint8(offset)\nGET_FUNCTION[Field.INT_8] = (view, offset) => view.getInt8(offset)\n\nGET_FUNCTION[Field.UNSIGNED_INT_16] = (view, offset) => view.getUint16(offset)\nGET_FUNCTION[Field.INT_16] = (view, offset) => view.getInt16(offset)\n\nGET_FUNCTION[Field.UNSIGNED_INT_32] = (view, offset) => view.getUint32(offset)\nGET_FUNCTION[Field.INT_32] = (view, offset) => view.getInt32(offset)\nGET_FUNCTION[Field.FLOAT_32] = (view, offset) => view.getFloat32(offset)\n\nGET_FUNCTION[Field.FLOAT_64] = (view, offset) => view.getFloat64(offset)\n\nconst SET_FUNCTION = Array(8) as ((view: DataView, value: number, offset: number) => void)[]\n\nSET_FUNCTION[Field.UNSIGNED_INT_8] = (view, value, offset) => view.setUint8(offset, value)\nSET_FUNCTION[Field.INT_8] = (view, value, offset) => view.setInt8(offset, value)\n\nSET_FUNCTION[Field.UNSIGNED_INT_16] = (view, value, offset) => view.setUint16(offset, value)\nSET_FUNCTION[Field.INT_16] = (view, value, offset) => view.setInt16(offset, value)\n\nSET_FUNCTION[Field.UNSIGNED_INT_32] = (view, value, offset) => view.setUint32(offset, value)\nSET_FUNCTION[Field.INT_32] = (view, value, offset) => view.setInt32(offset, value)\nSET_FUNCTION[Field.FLOAT_32] = (view, value, offset) => view.setFloat32(offset, value)\n\nSET_FUNCTION[Field.FLOAT_64] = (view, value, offset) => view.setFloat64(offset, value)\n\nconst SET_FUNCTION_BUF = Array(8) as ((nodeBuffer: Buffer, value: number, offset: number) => void)[]\n\nif (hasNodeBuffers) {\n SET_FUNCTION_BUF[Field.UNSIGNED_INT_8] = (view, value, offset) => view.writeUint8(value, offset)\n SET_FUNCTION_BUF[Field.INT_8] = (view, value, offset) => view.writeInt8(value, offset)\n\n SET_FUNCTION_BUF[Field.UNSIGNED_INT_16] = (view, value, offset) =>\n view.writeUint16BE(value, offset)\n SET_FUNCTION_BUF[Field.INT_16] = (view, value, offset) => view.writeInt16BE(value, offset)\n\n SET_FUNCTION_BUF[Field.UNSIGNED_INT_32] = 
(view, value, offset) =>\n view.writeUint32BE(value, offset)\n SET_FUNCTION_BUF[Field.INT_32] = (view, value, offset) => view.writeInt32BE(value, offset)\n SET_FUNCTION_BUF[Field.FLOAT_32] = (view, value, offset) => view.writeFloatBE(value, offset)\n\n SET_FUNCTION_BUF[Field.FLOAT_64] = (view, value, offset) => view.writeDoubleBE(value, offset)\n}\n\nconst GET_FUNCTION_BUF = Array(8) as ((nodeBuffer: Buffer, offset: number) => number)[]\n\nif (hasNodeBuffers) {\n GET_FUNCTION_BUF[Field.UNSIGNED_INT_8] = (view, offset) => view.readUint8(offset)\n GET_FUNCTION_BUF[Field.INT_8] = (view, offset) => view.readInt8(offset)\n\n GET_FUNCTION_BUF[Field.UNSIGNED_INT_16] = (view, offset) => view.readUint16BE(offset)\n\n GET_FUNCTION_BUF[Field.INT_16] = (view, offset) => view.readInt16BE(offset)\n GET_FUNCTION_BUF[Field.UNSIGNED_INT_32] = (view, offset) => view.readUint32BE(offset)\n\n GET_FUNCTION_BUF[Field.INT_32] = (view, offset) => view.readInt32BE(offset)\n GET_FUNCTION_BUF[Field.FLOAT_32] = (view, offset) => view.readFloatBE(offset)\n GET_FUNCTION_BUF[Field.FLOAT_64] = (view, offset) => view.readDoubleBE(offset)\n}\n"],"mappings":"AAMO,IAAMA,EAAiB,OAAO,QAAW,WAEzC,SAASC,EAAaC,EAAoBC,EAAuB,CACtE,IAAMC,EAAgB,IAAI,YAAYD,CAAa,EAC7CE,EAAe,KAAK,IAAIH,EAAS,WAAYE,EAAc,UAAU,EAGvEE,EAAS,KAAK,MAAMD,EAAe,CAAC,EACxC,IAAI,aAAaD,EAAe,EAAGE,CAAM,EAAE,IAAI,IAAI,aAAaJ,EAAS,OAAQ,EAAGI,CAAM,CAAC,EAG3F,IAAMC,EAASD,EAAS,EACxB,OAAAA,EAASD,EAAeE,EACxB,IAAI,WAAWH,EAAeG,EAAQD,CAAM,EAAE,IAAI,IAAI,WAAWJ,EAAS,OAAQK,EAAQD,CAAM,CAAC,EAE1F,IAAI,SAASF,CAAa,CACnC,CAEO,SAASI,EAAeC,EAAgBN,EAAuB,CACpE,IAAMO,EAAY,OAAO,YAAYP,CAAa,EAClD,OAAAM,EAAO,KAAKC,CAAS,EACdA,CACT,CAEA,IAAMC,EAAc,IAAI,YAClBC,EAAc,IAAI,YAEjB,SAASC,EAAyBX,EAAoBY,EAAoBC,EAAgB,CAC/F,IAAMC,EAASD,EAAO,OAChBE,EAAW,IAAI,WAAWf,EAAS,OAAQA,EAAS,WAAaY,EAAYE,CAAM,EAErFA,GAAU,GACZE,EAAkBD,EAAU,EAAGF,EAAQC,CAAM,EAE7CL,EAAY,WAAWI,EAAQE,CAAQ,CAE3C,CAEO,SAASE,EAA2BV,EAAgBK,EAAoBC,EAAgB,CAC7F,IAAMC,EAASD,EAAO,OAElBC,GAAU,GACZE,EAAkBT,EAAQK,EAAYC,EAAQC,CAAM,EAEpDP,EAAO,UAAUM,EAAQD,EAAYE,CAAM,CAE/C,CAEA,SAASE,EAAkBT,EAAoBK,EAAoBC,EAAgBC,EAAgB,CACjG,QAASI,EAAI,EAAGA,EAAIJ,EAAQ,EAAEI,EAC5BX,EAAOK,EAAaM,CAAC,EAAIL,EAAO,WAAWK,CAAC,EAAI,GAEpD,CAEO,SAASC,EAA2BZ,EAAgBK,EAAoBE,EAAgB,CAC7F,OAAOP,EAAO,SAASK,EAAYA,EAAaE,CAAM,EAAE,SAAS,MAAM,CACzE,CAEO,SAASM,EAAyBpB,EAAoBY,EAAoBE,EAAgB,CAC/F,OAAOJ,EAAY,OAAO,IAAI,SAASV,EAAS,OAAQA,EAAS,WAAaY,EAAYE,CAAM,CAAC,CACnG,CCvDO,IAAWO,OAKhBA,IAAA,eAAiB,GAAjB,iBAMAA,IAAA,qCAMAA,IAAA,qCAMAA,IAAA,iBAMAA,IAAA,mBAMAA,IAAA,mBAKAA,IAAA,uBAKAA,IAAA,uBA7CgBA,OAAA,IAwDX,SAASC,EACdC,EACe,CACf,MAAO,CAACA,CAAI,CACd,CASO,SAASC,EAGdD,EAASE,EAA+C,CACxD,GAAIA,EAAS,GAAK,CAAC,OAAO,SAASA,CAAM,EACvC,MAAM,IAAI,WAAW,oDAAoD,EAG3E,MAAO,CAACF,EAAME,CAAM,CACtB,CAMO,IAAMC,EAAN,KAAqD,CAC1D,YACmBC,EACDF,EAChB,CAFiB,cAAAE,EACD,YAAAF,CACf,CAEH,CAAC,OAAO,QAAQ,GAAI,CAClB,OAAO,KAAK,SAAS,OAAO,QAAQ,EAAE,CACxC,CACF,EAuBO,SAASG,EAAiDC,EAAmB,CAClF,GAAIA,EAAM,OAAS,EACjB,MAAM,IAAI,MACR,yFAAyFA,EAAM,KAAK,IAAI,CAAC,EAC3G,EAGF,MAAO,CAAE,MAAAA,CAAM,CACjB,CAQO,SAASC,GAAc,CAC5B,MAAO,EACT,CAMO,SAASC,EAAkDC,EAAW,CAC3E,MAAO,CAAE,SAAUA,CAAO,CAC5B,CAQO,IAAMC,EAAN,MAAMC,CAAmC,CAoPtC,YACWC,EACjBC,EACA,CAFiB,cAAAD,EAGjB,KAAK,QAAUC,EAAaC,EAAYD,CAAU,EAAI,CAAC,EACvD,IAAME,EAAaC,EAAe,KAAK,OAAO,EAC9C,KAAK,kBAAoBD,EAAW,kBACpC,KAAK,gBAAkBA,EAAW,eACpC,CAtPA,OAAO,OAA6BH,EAAkBC,EAAgB,CACpE,GAAID,EAAW,GAAK,CAAC,OAAO,SAASA,CAAQ,EAC3C,MAAM,IAAI,WAAW,uCAAuC,EAG9D,GAAIA,EAAW,IACb,MAAM,IAAI,WACR,6GACF,EAGF,OAAO,IAAID,EAAaC,EAAUC,CAAU,CAC9C,CAOA,OAAO,uBAAuBI,EAAgBC,EAAa,EAAG,CAC5D,OAAOD,EAAO,UAAUC,CAAU,CACpC,CAOA,OAAO,qBAAqBC,EAAoBD
,EAAa,EAAG,CAC9D,OAAOC,EAAS,SAASD,CAAU,CACrC,CAUA,OAAO,wBAAwBE,EAA8BF,EAAoB,CAC/E,OAAO,IAAI,WAAWE,EAAaF,EAAY,CAAC,EAAE,CAAC,CACrD,CAQA,OAAO,gBAAgBD,KAAmBI,EAAqB,CAC7D,OAAOV,EAAa,MAAMM,EAAQK,EAAkBC,EAA4BF,CAAQ,CAC1F,CAQA,OAAO,cAAcF,KAAuBE,EAAqB,CAC/D,OAAOV,EAAa,MAAMQ,EAAUK,EAAcC,EAA0BJ,CAAQ,CACtF,CAUA,OAAO,iBACLD,EACAF,EACAQ,KACGL,EACH,CACA,OAAOV,EAAa,MAClB,IAAI,SAASS,EAAaF,EAAYQ,CAAU,EAChDF,EACAC,EACAJ,CACF,CACF,CAWA,eACEM,EACAC,EAAgB,CAAE,OAAQ,CAAE,EAC5BF,EAAaC,EAAO,WACH,CACjB,OAAO,KAAK,KACVA,EACAC,EACAF,EACAJ,EACAC,CACF,CACF,CAQA,aACEI,EACAC,EAAgB,CAAE,OAAQ,CAAE,EAC5BF,EAAaC,EAAO,WACH,CACjB,OAAO,KAAK,KAAKA,EAAQC,EAAeF,EAAYF,EAAcC,CAAwB,CAC5F,CAaA,gBAAgBE,EAAyBT,EAAoBQ,EAAoB,CAC/E,OAAO,KAAK,KACVG,EACI,OAAO,KAAKF,EAAQT,EAAYQ,CAAU,EACzC,IAAI,SAASC,EAAQT,EAAYQ,CAAU,EAChD,CAAE,OAAQ,CAAE,EACZA,EACAG,EAAiBP,EAAmBE,EACpCK,EAAiBN,EAA8BE,CACjD,CACF,CAQA,gBAAgBK,EAAoB,CAClC,IAAMJ,EAAa,KAAK,oCAAoCI,CAAO,EAC7Db,EAAS,OAAO,YAAYS,CAAU,EAE5C,OAAO,KAAK,MACVT,EACAa,EACA,CAAE,OAAQ,CAAE,EACZJ,EACAA,EACAK,EACAC,EACAC,CACF,CACF,CAKA,cAAcH,EAAoB,CAChC,IAAMJ,EAAa,KAAK,oCAAoCI,CAAO,EAC7DX,EAAW,IAAI,SAAS,IAAI,YAAYO,CAAU,CAAC,EAEzD,OAAO,KAAK,MACVP,EACAW,EACA,CAAE,OAAQ,CAAE,EACZJ,EACAA,EACAQ,EACAC,EACAC,CACF,CACF,CAYA,iBAAiBN,EAAoB,CACnC,IAAMO,EAAMR,EAAiB,KAAK,gBAAgBC,CAAO,EAAI,KAAK,cAAcA,CAAO,EACvF,MAAO,CAAE,OAAQO,EAAI,OAAQ,WAAYA,EAAI,WAAY,WAAYA,EAAI,UAAW,CACtF,CASA,QAAQC,EAA+C,CACrD,MAAO,CAAC,KAAMA,CAAO,CACvB,CAEA,qBAAqBC,EAAqBT,EAA8B,CACtE,IAAMJ,EAAa,KAAK,kBAAoBa,EACtCtB,EAAS,OAAO,YAAYS,CAAU,EACtCE,EAAgB,CAAE,OAAQ,CAAE,EAElC,QAAWY,KAAWV,EACpB,KAAK,MACHb,EACAuB,EACAZ,EACAF,EACAA,EACAK,EACAC,EACAC,CACF,CAEJ,CAIiB,QACR,gBACA,kBAYT,OAAe,MACbN,EACAc,EACAC,EACArB,EACA,CACA,OAAW,CAACsB,EAAQL,CAAO,IAAKjB,EAC9B,GAAIsB,EAAO,WAAaF,EAAc,CAAoB,EAAEd,EAAe,CAAC,EAC1E,OAAOW,EACLK,EAAO,KAAKhB,EAAQ,CAAE,OAAQ,CAAE,EAAGA,EAAO,WAAYc,EAAeC,CAAoB,CAC3F,CAGN,CAEQ,KACNf,EACAC,EACAF,EACAe,EACAC,EACiB,CACjB,GAAIhB,EAAaE,EAAc,OAAS,KAAK,kBAC3C,MAAM,IAAI,MACR,uDAAuD,KAAK,QAAQ,cAAcA,EAAc,MAAM,EACxG,EAGF,GACEa,EAAc,CAAoB,EAAEd,EAAeC,EAAc,MAAM,IAAM,KAAK,SAElF,MAAM,IAAI,MACR,kBAAkBA,EAAc,MAAM,4BAA4B,KAAK,QAAQ,EACjF,EAGFA,EAAc,QAAU,EACxB,IAAMgB,EAAc,CAAC,EAErB,OAAW,CAACC,EAAMC,CAAG,IAAK,KAAK,QAC7B,GAAI,MAAM,QAAQA,CAAG,EAAG,CACtB,IAAM5C,EAEJ4C,EAAI,CAAC,GAAKL,EAAc,CAAoB,EAAEd,EAAeC,EAAc,QAAQ,EAE/EmB,EAAQ,MAAM7C,CAAM,EAEpB8C,EAAWF,EAAI,CAAC,EAEtB,GAAI,OAAOE,GAAa,SAEtB,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAC5BF,EAAME,CAAC,EAAID,EAAS,KAClBrB,EACAC,EACAF,EACAe,EACAC,CACF,UAEOM,IAAa,GAEtB,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAAG,CAC/B,IAAMC,EAAST,EAAc,CAAqB,EAAEd,EAAeC,EAAc,MAAM,EACvFA,EAAc,QAAU,EAExBmB,EAAME,CAAC,EAAIP,EAAqBf,EAAQC,EAAc,OAAQsB,CAAM,EACpEtB,EAAc,QAAUsB,CAC1B,KACK,CAEL,IAAMC,EAAWC,EAAUJ,CAAQ,EAInC,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAC5BF,EAAME,CAAC,EAAIR,EAAcO,CAAQ,EAAErB,EAAeC,EAAc,MAAM,EACtEA,EAAc,QAAUuB,CAE5B,CAGAP,EAAOC,CAAI,EAAIE,CACjB,SAAW,OAAOD,GAAQ,SAGxBF,EAAOC,CAAI,EAAIJ,EAAcK,CAAG,EAAEnB,EAAeC,EAAc,MAAM,EACrEA,EAAc,QAAUwB,EAAUN,CAAG,UAC5BA,IAAQ,GAAI,CACrB,IAAMI,EAAST,EAAc,CAAqB,EAAEd,EAAeC,EAAc,MAAM,EACvFA,EAAc,QAAU,EAGxBgB,EAAOC,CAAI,EAAIH,EAAqBf,EAAQC,EAAc,OAAQsB,CAAM,EACxEtB,EAAc,QAAUsB,CAC1B,SAAW,UAAWJ,EAAK,CAEzB,IAAMxC,EAAQmC,EAAc,CAAoB,EAAEd,EAAeC,EAAc,MAAM,EACrFA,EAAc,QAAU,EAGxBgB,EAAOC,CAAI,EAAI,CAAC,EAEhB,QAASQ,EAAM,EAAGA,EAAMP,EAAI,MAAM,OAAQ,EAAEO,EAE1CT,EAAOC,CAAI,EAAEC,EAAI,MAAMO,CAAG,CAAC,EAAI,CAAC,EAAE/C,EAAS,GAAK+C,EAEpD,SAAW,aAAcP,EAAK,CAE5B,IAAMQ,EACJb,EAAc,CAAoB,EAAEd,EAAeC,EAAc,MAAM,IAAM,EAE/EA,EAAc,QAAU,EAEpB0B,IAEFV,EAAOC,CAAI,EAAIC,EAAI,SAAS,KAC1BnB,EACAC,EACAF,EACAe,EACAC,CACF,EAEJ,MAGEE,EAAOC,CAAI,EAAIC
,EAAI,KACjBnB,EACAC,EACAF,EACAe,EACAC,CACF,EAIJ,OAAOE,CACT,CAEQ,MACN3B,EACAa,EACAF,EACAF,EACA6B,EACAC,EACAC,EACAC,EACK,CACLF,EAAe,CAAoB,EAAEvC,EAAe,KAAK,SAAUW,EAAc,MAAM,EACvFA,EAAc,QAAU,EAExB,OAAW,CAACiB,EAAMC,CAAG,IAAK,KAAK,QAAS,CACtC,IAAMa,EAAO7B,EAAQe,CAAI,EAEzB,GAAI,MAAM,QAAQC,CAAG,EAAG,CAGtB,IAAM5C,EAAUyD,EAA8C,OAIxDC,EAAiBd,EAAI,CAAC,IAAM,OAOlC,GALIc,IACFJ,EAAe,CAAoB,EAAEvC,EAAef,EAAQ0B,EAAc,MAAM,EAChFA,EAAc,QAAU,GAGtB1B,EAAS,EAAG,CACd,IAAM8C,EAAWF,EAAI,CAAC,EAEtB,GAAI,OAAOE,GAAa,SAAU,CAGhC,GAAIY,EAAgB,CAClB,IAAMC,EAAyB3D,EAAS8C,EAAS,kBAEjDtB,GAAcmC,EACdN,GAAiBM,EAEb5C,EAAO,WAAasC,IACtBtC,EAASwC,EAAmBxC,EAAQsC,CAAa,EAErD,CAEA,QAAWO,KAAUH,EAEnB1C,EAAS+B,EAAS,MAChB/B,EACA6C,EACAlC,EACAF,EACA6B,EACAC,EACAC,EACAC,CACF,EAEAhC,EAAaE,EAAc,OAC3B2B,EAAgBtC,EAAO,UAE3B,SAAW+B,IAAa,GAEtB,QAASC,EAAI,EAAGA,EAAI/C,EAAQ,EAAE+C,EAAG,CAC/B,IAAMc,EAAOJ,EAA6BV,CAAC,EACrCC,EAASa,EAAI,OAEnBP,EAAe,CAAqB,EAAEvC,EAAeiC,EAAQtB,EAAc,MAAM,EACjFA,EAAc,QAAU,EAExB8B,EAAqBzC,EAAQW,EAAc,OAAQmC,CAAG,EACtDnC,EAAc,QAAUsB,CAC1B,KACK,CAEL,IAAMC,EAAWC,EAAUJ,CAAQ,EAEnC,GAAIY,EAAgB,CAClB,IAAMC,EAAyB3D,EAASiD,EAExCzB,GAAcmC,EACdN,GAAiBM,EAEb5C,EAAO,WAAasC,IACtBtC,EAASwC,EAAmBxC,EAAQsC,CAAa,EAErD,CAIA,QAAWS,KAAUL,EACnBH,EAAeR,CAAQ,EAAE/B,EAAe+C,EAAQpC,EAAc,MAAM,EACpEA,EAAc,QAAUuB,CAE5B,CACF,CACF,SAAW,OAAOL,GAAQ,SAExBU,EAAeV,CAAG,EAAE7B,EAAe0C,EAAgB/B,EAAc,MAAM,EACvEA,EAAc,QAAUwB,EAAUN,CAAG,UAC5BA,IAAQ,GAAI,CAErB,IAAMI,EAAUS,EAAgB,OAEhCH,EAAe,CAAqB,EAAEvC,EAAeiC,EAAQtB,EAAc,MAAM,EACjFA,EAAc,QAAU,EAExB8B,EAAqBzC,EAAQW,EAAc,OAAQ+B,CAAc,EACjE/B,EAAc,QAAUsB,CAC1B,SAAW,UAAWJ,EAAK,CAEzB,IAAIxC,EAAQ,EAEZ,QAAS+C,EAAM,EAAGA,EAAMP,EAAI,MAAM,OAAQ,EAAEO,EACrCM,EAAiCb,EAAI,MAAMO,CAAG,CAAC,IAClD/C,GAAS,GAAK+C,GAIlBG,EAAe,CAAoB,EAAEvC,EAAeX,EAAOsB,EAAc,MAAM,EAC/EA,EAAc,QAAU,CAC1B,KAAW,aAAckB,EACnBa,GACFH,EAAe,CAAoB,EAAEvC,EAAe,EAAGW,EAAc,MAAM,EAC3EA,EAAc,QAAU,EAExBF,GAAcoB,EAAI,SAAS,kBAC3BS,GAAiBT,EAAI,SAAS,kBAE1B7B,EAAO,WAAasC,IACtBtC,EAASwC,EAAmBxC,EAAQsC,CAAa,GAGnDtC,EAAS6B,EAAI,SAAS,MACpB7B,EACA0C,EACA/B,EACAF,EACA6B,EACAC,EACAC,EACAC,CACF,EAEAhC,EAAaE,EAAc,OAC3B2B,EAAgBtC,EAAO,aAEvBuC,EAAe,CAAoB,EAAEvC,EAAe,EAAGW,EAAc,MAAM,EAC3EA,EAAc,QAAU,IAI1BX,EAAS6B,EAAI,MACX7B,EACA0C,EACA/B,EACAF,EACA6B,EACAC,EACAC,EACAC,CACF,EAEAhC,EAAaE,EAAc,OAC3B2B,EAAgBtC,EAAO,WAE3B,CAEA,OAAOA,CACT,CAEQ,oCAAoCa,EAAoB,CAC9D,IAAImC,EAAM,KAAK,kBAEf,QAAWC,KAAS,KAAK,gBAAgB,CAAC,EAExCD,GAAQnC,EAAQoC,CAAK,EAAa,OAGpC,QAAWA,KAAS,KAAK,gBAAgB,CAAC,EAExC,QAAWC,KAAUrC,EAAQoC,CAAK,EAChCD,GAAO,EAAIE,EAAO,OAItB,QAAWD,KAAS,KAAK,gBAAgB,CAAC,EAExCD,GAAO,KAAK,gBAAgB,CAAC,EAAEC,CAAK,EAAE,oCACpCpC,EAAQoC,CAAK,CACf,EAGF,OAAOD,CACT,CACF,EA4FA,SAASnD,EAAYD,EAAwB,CAC3C,OAAO,OAAO,QAAQA,CAAU,EAAE,KAAK,CAAC,CAACuD,CAAU,EAAG,CAACC,CAAU,IAC/DD,EAAW,cAAcC,CAAU,CACrC,CACF,CAkBA,SAASrD,EAAesD,EAAkB,CAExC,IAAIC,EAAoB,EAElBC,EAAmC,CAAC,CAAC,EAAG,CAAC,EAAG,CAAC,CAAC,EAEpD,OAAW,CAAC3B,EAAM4B,CAAI,IAAKH,EACzB,GAAI,MAAM,QAAQG,CAAI,EACpB,GAAIA,EAAK,SAAW,EAAG,CAErB,IAAMC,EAAWD,EAAK,CAAC,IAAM,GAEvBtB,EACJ,OAAOsB,EAAK,CAAC,GAAM,SACfA,EAAK,CAAC,EAAE,kBACRC,EACE,EACAtB,EAAUqB,EAAK,CAAC,CAAC,EAEzBF,GAAqBE,EAAK,CAAC,EAAItB,EAE3BuB,GACFF,EAAgB,CAAC,EAAE,KAAK3B,CAAI,CAEhC,MAGE0B,GAAqB,EAEjBE,EAAK,CAAC,IAAM,IACdD,EAAgB,CAAC,EAAE,KAAK3B,CAAI,OAGvB4B,aAAgB/D,GACzB6D,GAAqBE,EAAK,kBAC1BD,EAAgB,CAAC,EAAE3B,CAAI,EAAI4B,GAClB,OAAOA,GAAS,SAIzBF,GAAqB,EACZE,IAAS,IAGlBF,GAAqB,EACrBC,EAAgB,CAAC,EAAE,KAAK3B,CAAI,GAE5B0B,GAAqBnB,EAAUqB,CAAI,EAIvC,MAAO,CAAE,kBAAAF,EAAmB,gBAAAC,CAAgB,CAC9C,CAQA,IAAMpB,EAAY,MAAM,CAAC,EAEzBA,EAAU,CAAoB,EAAI,EAClCA,EAAU,CAAW,EAAI,EAEzBA,EAAU,CAAqB,EAAI,EACnCA,EAAU,CAAY,EAAI,EAE1BA,EAAU,CAAqB,EAAI
,EACnCA,EAAU,CAAY,EAAI,EAC1BA,EAAU,CAAc,EAAI,EAE5BA,EAAU,CAAc,EAAI,EAE5B,IAAM5B,EAAe,MAAM,CAAC,EAE5BA,EAAa,CAAoB,EAAI,CAACmD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EAC3EpD,EAAa,CAAW,EAAI,CAACmD,EAAMC,IAAWD,EAAK,QAAQC,CAAM,EAEjEpD,EAAa,CAAqB,EAAI,CAACmD,EAAMC,IAAWD,EAAK,UAAUC,CAAM,EAC7EpD,EAAa,CAAY,EAAI,CAACmD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EAEnEpD,EAAa,CAAqB,EAAI,CAACmD,EAAMC,IAAWD,EAAK,UAAUC,CAAM,EAC7EpD,EAAa,CAAY,EAAI,CAACmD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EACnEpD,EAAa,CAAc,EAAI,CAACmD,EAAMC,IAAWD,EAAK,WAAWC,CAAM,EAEvEpD,EAAa,CAAc,EAAI,CAACmD,EAAMC,IAAWD,EAAK,WAAWC,CAAM,EAEvE,IAAM1C,EAAe,MAAM,CAAC,EAE5BA,EAAa,CAAoB,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,SAASC,EAAQC,CAAK,EACzF3C,EAAa,CAAW,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,QAAQC,EAAQC,CAAK,EAE/E3C,EAAa,CAAqB,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,UAAUC,EAAQC,CAAK,EAC3F3C,EAAa,CAAY,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,SAASC,EAAQC,CAAK,EAEjF3C,EAAa,CAAqB,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,UAAUC,EAAQC,CAAK,EAC3F3C,EAAa,CAAY,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,SAASC,EAAQC,CAAK,EACjF3C,EAAa,CAAc,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,WAAWC,EAAQC,CAAK,EAErF3C,EAAa,CAAc,EAAI,CAACyC,EAAME,EAAOD,IAAWD,EAAK,WAAWC,EAAQC,CAAK,EAErF,IAAM9C,EAAmB,MAAM,CAAC,EAE5BF,IACFE,EAAiB,CAAoB,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,WAAWE,EAAOD,CAAM,EAC/F7C,EAAiB,CAAW,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,UAAUE,EAAOD,CAAM,EAErF7C,EAAiB,CAAqB,EAAI,CAAC4C,EAAME,EAAOD,IACtDD,EAAK,cAAcE,EAAOD,CAAM,EAClC7C,EAAiB,CAAY,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,aAAaE,EAAOD,CAAM,EAEzF7C,EAAiB,CAAqB,EAAI,CAAC4C,EAAME,EAAOD,IACtDD,EAAK,cAAcE,EAAOD,CAAM,EAClC7C,EAAiB,CAAY,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,aAAaE,EAAOD,CAAM,EACzF7C,EAAiB,CAAc,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,aAAaE,EAAOD,CAAM,EAE3F7C,EAAiB,CAAc,EAAI,CAAC4C,EAAME,EAAOD,IAAWD,EAAK,cAAcE,EAAOD,CAAM,GAG9F,IAAMtD,EAAmB,MAAM,CAAC,EAE5BO,IACFP,EAAiB,CAAoB,EAAI,CAACqD,EAAMC,IAAWD,EAAK,UAAUC,CAAM,EAChFtD,EAAiB,CAAW,EAAI,CAACqD,EAAMC,IAAWD,EAAK,SAASC,CAAM,EAEtEtD,EAAiB,CAAqB,EAAI,CAACqD,EAAMC,IAAWD,EAAK,aAAaC,CAAM,EAEpFtD,EAAiB,CAAY,EAAI,CAACqD,EAAMC,IAAWD,EAAK,YAAYC,CAAM,EAC1EtD,EAAiB,CAAqB,EAAI,CAACqD,EAAMC,IAAWD,EAAK,aAAaC,CAAM,EAEpFtD,EAAiB,CAAY,EAAI,CAACqD,EAAMC,IAAWD,EAAK,YAAYC,CAAM,EAC1EtD,EAAiB,CAAc,EAAI,CAACqD,EAAMC,IAAWD,EAAK,YAAYC,CAAM,EAC5EtD,EAAiB,CAAc,EAAI,CAACqD,EAAMC,IAAWD,EAAK,aAAaC,CAAM","names":["hasNodeBuffers","growDataView","dataview","newByteLength","resizedBuffer","amountToCopy","length","offset","growNodeBuffer","buffer","newBuffer","textEncoder","textDecoder","encodeStringIntoDataView","byteOffset","string","strlen","u8Buffer","encodeSmallString","encodeStringIntoNodeBuffer","i","decodeStringFromNodeBuffer","decodeStringFromDataView","Field","FieldArray","item","FieldFixedArray","length","SequentialSerializer","iterable","FieldBitFlags","flags","FieldString","FieldOptional","packet","BinaryPacket","_BinaryPacket","packetId","definition","sortEntries","inspection","inspectEntries","buffer","byteOffset","dataview","arraybuffer","visitors","GET_FUNCTION_BUF","decodeStringFromNodeBuffer","GET_FUNCTION","decodeStringFromDataView","byteLength","dataIn","offsetPointer","hasNodeBuffers","dataOut","SET_FUNCTION_BUF","growNodeBuffer","encodeStringIntoNodeBuffer","SET_FUNCTION","growDataView","encodeStringIntoDataView","buf","onVisit","numElements","element","readFunctions","decodeStringFunction","Packet","result","name","def","array","itemType","i","strlen","itemSize","BYTE_SIZE","bit","hasSubPacket","maxByteLength","writeFunctions","growBufferFunction","encodeStringFunction","data","isDynamicArray","neededBytesForElements","object","str","number","len
","field","string","fieldName1","fieldName2","entries","minimumByteLength","stringPositions","type","isString","view","offset","value"]}
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "binary-packet",
- "version": "1.2.0",
+ "version": "1.2.1",
  "description": "Lightweight and hyper-fast, zero-dependencies, TypeScript-first, schema-based binary packets serialization and deserialization library",
  "main": "./dist/index.js",
  "module": "./dist/index.mjs",
@@ -47,17 +47,17 @@
  "license": "Apache-2.0",
  "devDependencies": {
  "colors": "^1.4.0",
- "eslint": "^9.12.0",
- "msgpackr": "^1.11.0",
- "prettier": "^3.3.3",
- "prettier-plugin-organize-imports": "^4.1.0",
+ "eslint": "^10.0.0",
+ "msgpackr": "^1.11.8",
+ "prettier": "^3.8.1",
+ "prettier-plugin-organize-imports": "^4.3.0",
  "restructure": "^3.0.2",
  "ts-node": "^10.9.2",
- "tsup": "^8.3.0",
- "typescript": "^5.6.2",
- "typescript-eslint": "^8.8.1"
+ "tsup": "^8.5.1",
+ "typescript": "^5.9.3",
+ "typescript-eslint": "^8.56.0"
  },
  "engines": {
- "node": ">=16"
+ "node": ">=20"
  }
  }