@xyo-network/archivist-abstract 2.72.9
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/LICENSE +165 -0
- package/README.md +13 -0
- package/dist/AbstractArchivist.d.mts +54 -0
- package/dist/AbstractArchivist.d.ts +54 -0
- package/dist/AbstractArchivist.js +294 -0
- package/dist/AbstractArchivist.js.map +1 -0
- package/dist/AbstractArchivist.mjs +269 -0
- package/dist/AbstractArchivist.mjs.map +1 -0
- package/dist/docs.json +22991 -0
- package/dist/index.d.mts +7 -0
- package/dist/index.d.ts +7 -0
- package/dist/index.js +23 -0
- package/dist/index.js.map +1 -0
- package/dist/index.mjs +2 -0
- package/dist/index.mjs.map +1 -0
- package/package.json +76 -0
- package/src/AbstractArchivist.ts +350 -0
- package/src/index.ts +1 -0
- package/tsup.config.ts +14 -0
- package/typedoc.json +5 -0
package/LICENSE
ADDED
@@ -0,0 +1,165 @@
+                   GNU LESSER GENERAL PUBLIC LICENSE
+                       Version 3, 29 June 2007
+
+ Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+
+  This version of the GNU Lesser General Public License incorporates
+the terms and conditions of version 3 of the GNU General Public
+License, supplemented by the additional permissions listed below.
+
+  0. Additional Definitions.
+
+  As used herein, "this License" refers to version 3 of the GNU Lesser
+General Public License, and the "GNU GPL" refers to version 3 of the GNU
+General Public License.
+
+  "The Library" refers to a covered work governed by this License,
+other than an Application or a Combined Work as defined below.
+
+  An "Application" is any work that makes use of an interface provided
+by the Library, but which is not otherwise based on the Library.
+Defining a subclass of a class defined by the Library is deemed a mode
+of using an interface provided by the Library.
+
+  A "Combined Work" is a work produced by combining or linking an
+Application with the Library.  The particular version of the Library
+with which the Combined Work was made is also called the "Linked
+Version".
+
+  The "Minimal Corresponding Source" for a Combined Work means the
+Corresponding Source for the Combined Work, excluding any source code
+for portions of the Combined Work that, considered in isolation, are
+based on the Application, and not on the Linked Version.
+
+  The "Corresponding Application Code" for a Combined Work means the
+object code and/or source code for the Application, including any data
+and utility programs needed for reproducing the Combined Work from the
+Application, but excluding the System Libraries of the Combined Work.
+
+  1. Exception to Section 3 of the GNU GPL.
+
+  You may convey a covered work under sections 3 and 4 of this License
+without being bound by section 3 of the GNU GPL.
+
+  2. Conveying Modified Versions.
+
+  If you modify a copy of the Library, and, in your modifications, a
+facility refers to a function or data to be supplied by an Application
+that uses the facility (other than as an argument passed when the
+facility is invoked), then you may convey a copy of the modified
+version:
+
+   a) under this License, provided that you make a good faith effort to
+   ensure that, in the event an Application does not supply the
+   function or data, the facility still operates, and performs
+   whatever part of its purpose remains meaningful, or
+
+   b) under the GNU GPL, with none of the additional permissions of
+   this License applicable to that copy.
+
+  3. Object Code Incorporating Material from Library Header Files.
+
+  The object code form of an Application may incorporate material from
+a header file that is part of the Library.  You may convey such object
+code under terms of your choice, provided that, if the incorporated
+material is not limited to numerical parameters, data structure
+layouts and accessors, or small macros, inline functions and templates
+(ten or fewer lines in length), you do both of the following:
+
+   a) Give prominent notice with each copy of the object code that the
+   Library is used in it and that the Library and its use are
+   covered by this License.
+
+   b) Accompany the object code with a copy of the GNU GPL and this license
+   document.
+
+  4. Combined Works.
+
+  You may convey a Combined Work under terms of your choice that,
+taken together, effectively do not restrict modification of the
+portions of the Library contained in the Combined Work and reverse
+engineering for debugging such modifications, if you also do each of
+the following:
+
+   a) Give prominent notice with each copy of the Combined Work that
+   the Library is used in it and that the Library and its use are
+   covered by this License.
+
+   b) Accompany the Combined Work with a copy of the GNU GPL and this license
+   document.
+
+   c) For a Combined Work that displays copyright notices during
+   execution, include the copyright notice for the Library among
+   these notices, as well as a reference directing the user to the
+   copies of the GNU GPL and this license document.
+
+   d) Do one of the following:
+
+       0) Convey the Minimal Corresponding Source under the terms of this
+       License, and the Corresponding Application Code in a form
+       suitable for, and under terms that permit, the user to
+       recombine or relink the Application with a modified version of
+       the Linked Version to produce a modified Combined Work, in the
+       manner specified by section 6 of the GNU GPL for conveying
+       Corresponding Source.
+
+       1) Use a suitable shared library mechanism for linking with the
+       Library.  A suitable mechanism is one that (a) uses at run time
+       a copy of the Library already present on the user's computer
+       system, and (b) will operate properly with a modified version
+       of the Library that is interface-compatible with the Linked
+       Version.
+
+   e) Provide Installation Information, but only if you would otherwise
+   be required to provide such information under section 6 of the
+   GNU GPL, and only to the extent that such information is
+   necessary to install and execute a modified version of the
+   Combined Work produced by recombining or relinking the
+   Application with a modified version of the Linked Version.  (If
+   you use option 4d0, the Installation Information must accompany
+   the Minimal Corresponding Source and Corresponding Application
+   Code.  If you use option 4d1, you must provide the Installation
+   Information in the manner specified by section 6 of the GNU GPL
+   for conveying Corresponding Source.)
+
+  5. Combined Libraries.
+
+  You may place library facilities that are a work based on the
+Library side by side in a single library together with other library
+facilities that are not Applications and are not covered by this
+License, and convey such a combined library under terms of your
+choice, if you do both of the following:
+
+   a) Accompany the combined library with a copy of the same work based
+   on the Library, uncombined with any other library facilities,
+   conveyed under the terms of this License.
+
+   b) Give prominent notice with the combined library that part of it
+   is a work based on the Library, and explaining where to find the
+   accompanying uncombined form of the same work.
+
+  6. Revised Versions of the GNU Lesser General Public License.
+
+  The Free Software Foundation may publish revised and/or new versions
+of the GNU Lesser General Public License from time to time. Such new
+versions will be similar in spirit to the present version, but may
+differ in detail to address new problems or concerns.
+
+  Each version is given a distinguishing version number. If the
+Library as you received it specifies that a certain numbered version
+of the GNU Lesser General Public License "or any later version"
+applies to it, you have the option of following the terms and
+conditions either of that published version or of any later version
+published by the Free Software Foundation. If the Library as you
+received it does not specify a version number of the GNU Lesser
+General Public License, you may choose any version of the GNU Lesser
+General Public License ever published by the Free Software Foundation.
+
+  If the Library as you received it specifies that a proxy can decide
+whether future versions of the GNU Lesser General Public License shall
+apply, that proxy's public statement of acceptance of any version is
+permanent authorization for you to choose that version for the
+Library.
package/README.md
ADDED
@@ -0,0 +1,13 @@
+[![logo][]](https://xyo.network)
+
+Part of [sdk-xyo-client-js](https://www.npmjs.com/package/@xyo-network/sdk-xyo-client-js)
+
+## License
+
+> See the [LICENSE](LICENSE) file for license details
+
+## Credits
+
+[Made with 🔥 and ❄️ by XYO](https://xyo.network)
+
+[logo]: https://cdn.xy.company/img/brand/XYO_full_colored.png
package/dist/AbstractArchivist.d.mts
ADDED
@@ -0,0 +1,54 @@
+import * as _xyo_network_payload_model from '@xyo-network/payload-model';
+import { Payload } from '@xyo-network/payload-model';
+import { ArchivistInstance, ArchivistParams, ArchivistModuleEventData, ArchivistModule, ArchivistQueryBase } from '@xyo-network/archivist-model';
+import { QueryBoundWitness } from '@xyo-network/boundwitness-builder';
+import { BoundWitness } from '@xyo-network/boundwitness-model';
+import { AbstractModuleInstance, ModuleConfig, ModuleQueryHandlerResult } from '@xyo-network/module';
+import { PromisableArray, Promisable } from '@xyo-network/promise';
+
+interface ActionConfig {
+    emitEvents?: boolean;
+}
+interface InsertConfig extends ActionConfig {
+    writeToParents?: boolean;
+}
+interface ArchivistParentInstances {
+    commit?: Record<string, ArchivistInstance>;
+    read?: Record<string, ArchivistInstance>;
+    write?: Record<string, ArchivistInstance>;
+}
+declare abstract class AbstractArchivist<TParams extends ArchivistParams = ArchivistParams, TEventData extends ArchivistModuleEventData = ArchivistModuleEventData> extends AbstractModuleInstance<TParams, TEventData> implements ArchivistModule<TParams> {
+    private _lastInsertedPayload;
+    private _parents?;
+    get queries(): string[];
+    get requireAllParents(): boolean;
+    protected get _queryAccountPaths(): Record<ArchivistQueryBase['schema'], string>;
+    protected get storeParentReads(): boolean;
+    all(): PromisableArray<Payload>;
+    clear(): Promisable<void>;
+    commit(): Promisable<BoundWitness[]>;
+    delete(hashes: string[]): Promise<string[]>;
+    get(hashes: string[]): Promise<Payload[]>;
+    insert(payloads: Payload[]): Promise<Payload[]>;
+    protected allHandler(): PromisableArray<Payload>;
+    protected clearHandler(): Promisable<void>;
+    protected commitHandler(): Promisable<BoundWitness[]>;
+    protected deleteHandler(_hashes: string[]): PromisableArray<string>;
+    protected deleteWithConfig(hashes: string[], config?: ActionConfig): Promise<string[]>;
+    protected getFromParent(hashes: string[], archivist: ArchivistInstance): Promise<[Payload[], string[]]>;
+    protected getFromParents(hashes: string[]): Promise<[Payload[], string[]]>;
+    protected getHandler(_hashes: string[]): Promisable<Payload[]>;
+    protected getWithConfig(hashes: string[], config?: InsertConfig): Promise<Payload[]>;
+    protected head(): Promisable<Payload | undefined>;
+    protected insertHandler(_payloads: Payload[]): Promise<Payload[]>;
+    protected insertWithConfig(payloads: Payload[], config?: InsertConfig): Promise<Payload[]>;
+    protected parents(): Promise<ArchivistParentInstances>;
+    protected queryHandler<T extends QueryBoundWitness = QueryBoundWitness, TConfig extends ModuleConfig = ModuleConfig>(query: T, payloads?: Payload[], queryConfig?: TConfig): Promise<ModuleQueryHandlerResult>;
+    protected writeToParent(parent: ArchivistInstance, payloads: Payload[]): Promise<(_xyo_network_payload_model.SchemaFields & _xyo_network_payload_model.PayloadFields & {
+        schema: string;
+    })[]>;
+    protected writeToParents(payloads: Payload[]): Promise<Payload[]>;
+    private resolveArchivists;
+}
+
+export { AbstractArchivist, ActionConfig, ArchivistParentInstances, InsertConfig };
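The declaration above follows a template-method pattern: the public `insert`/`get` methods are final wrappers, and concrete archivists override only the protected `*Handler` methods. Below is a minimal, self-contained sketch of that pattern; the `SketchArchivist` and `MemoryArchivist` classes and the `hashPayload` helper are illustrative stand-ins written for this example, not part of the package (the real base class adds busy-state tracking, start checks, events, and parent propagation).

```typescript
import { createHash } from 'node:crypto'

// Simplified payload shape; the real Payload type comes from @xyo-network/payload-model.
type Payload = { schema: string; [key: string]: unknown }

// Hypothetical stand-in for payload hashing; the real code uses
// PayloadHasher from @xyo-network/core.
const hashPayload = (payload: Payload): string =>
  createHash('sha256').update(JSON.stringify(payload)).digest('hex')

// Public methods delegate to protected handlers that subclasses override.
abstract class SketchArchivist {
  async insert(payloads: Payload[]): Promise<Payload[]> {
    return await this.insertHandler(payloads)
  }
  async get(hashes: string[]): Promise<Payload[]> {
    return await this.getHandler(hashes)
  }
  protected abstract getHandler(hashes: string[]): Promise<Payload[]>
  protected abstract insertHandler(payloads: Payload[]): Promise<Payload[]>
}

// A concrete archivist only needs to supply storage-specific handlers.
class MemoryArchivist extends SketchArchivist {
  private store = new Map<string, Payload>()
  protected async getHandler(hashes: string[]): Promise<Payload[]> {
    return hashes.flatMap((hash) => {
      const found = this.store.get(hash)
      return found ? [found] : []
    })
  }
  protected async insertHandler(payloads: Payload[]): Promise<Payload[]> {
    for (const payload of payloads) this.store.set(hashPayload(payload), payload)
    return payloads
  }
}
```

The split matters because the wrappers are where cross-cutting behavior (events, config defaults, parent writes) lives in the real class, so every storage backend inherits it for free.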
package/dist/AbstractArchivist.d.ts
ADDED
@@ -0,0 +1,54 @@
+import * as _xyo_network_payload_model from '@xyo-network/payload-model';
+import { Payload } from '@xyo-network/payload-model';
+import { ArchivistInstance, ArchivistParams, ArchivistModuleEventData, ArchivistModule, ArchivistQueryBase } from '@xyo-network/archivist-model';
+import { QueryBoundWitness } from '@xyo-network/boundwitness-builder';
+import { BoundWitness } from '@xyo-network/boundwitness-model';
+import { AbstractModuleInstance, ModuleConfig, ModuleQueryHandlerResult } from '@xyo-network/module';
+import { PromisableArray, Promisable } from '@xyo-network/promise';
+
+interface ActionConfig {
+    emitEvents?: boolean;
+}
+interface InsertConfig extends ActionConfig {
+    writeToParents?: boolean;
+}
+interface ArchivistParentInstances {
+    commit?: Record<string, ArchivistInstance>;
+    read?: Record<string, ArchivistInstance>;
+    write?: Record<string, ArchivistInstance>;
+}
+declare abstract class AbstractArchivist<TParams extends ArchivistParams = ArchivistParams, TEventData extends ArchivistModuleEventData = ArchivistModuleEventData> extends AbstractModuleInstance<TParams, TEventData> implements ArchivistModule<TParams> {
+    private _lastInsertedPayload;
+    private _parents?;
+    get queries(): string[];
+    get requireAllParents(): boolean;
+    protected get _queryAccountPaths(): Record<ArchivistQueryBase['schema'], string>;
+    protected get storeParentReads(): boolean;
+    all(): PromisableArray<Payload>;
+    clear(): Promisable<void>;
+    commit(): Promisable<BoundWitness[]>;
+    delete(hashes: string[]): Promise<string[]>;
+    get(hashes: string[]): Promise<Payload[]>;
+    insert(payloads: Payload[]): Promise<Payload[]>;
+    protected allHandler(): PromisableArray<Payload>;
+    protected clearHandler(): Promisable<void>;
+    protected commitHandler(): Promisable<BoundWitness[]>;
+    protected deleteHandler(_hashes: string[]): PromisableArray<string>;
+    protected deleteWithConfig(hashes: string[], config?: ActionConfig): Promise<string[]>;
+    protected getFromParent(hashes: string[], archivist: ArchivistInstance): Promise<[Payload[], string[]]>;
+    protected getFromParents(hashes: string[]): Promise<[Payload[], string[]]>;
+    protected getHandler(_hashes: string[]): Promisable<Payload[]>;
+    protected getWithConfig(hashes: string[], config?: InsertConfig): Promise<Payload[]>;
+    protected head(): Promisable<Payload | undefined>;
+    protected insertHandler(_payloads: Payload[]): Promise<Payload[]>;
+    protected insertWithConfig(payloads: Payload[], config?: InsertConfig): Promise<Payload[]>;
+    protected parents(): Promise<ArchivistParentInstances>;
+    protected queryHandler<T extends QueryBoundWitness = QueryBoundWitness, TConfig extends ModuleConfig = ModuleConfig>(query: T, payloads?: Payload[], queryConfig?: TConfig): Promise<ModuleQueryHandlerResult>;
+    protected writeToParent(parent: ArchivistInstance, payloads: Payload[]): Promise<(_xyo_network_payload_model.SchemaFields & _xyo_network_payload_model.PayloadFields & {
+        schema: string;
+    })[]>;
+    protected writeToParents(payloads: Payload[]): Promise<Payload[]>;
+    private resolveArchivists;
+}
+
+export { AbstractArchivist, ActionConfig, ArchivistParentInstances, InsertConfig };
package/dist/AbstractArchivist.js
ADDED
@@ -0,0 +1,294 @@
+"use strict";
+var __create = Object.create;
+var __defProp = Object.defineProperty;
+var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
+var __getOwnPropNames = Object.getOwnPropertyNames;
+var __getProtoOf = Object.getPrototypeOf;
+var __hasOwnProp = Object.prototype.hasOwnProperty;
+var __export = (target, all) => {
+  for (var name in all)
+    __defProp(target, name, { get: all[name], enumerable: true });
+};
+var __copyProps = (to, from, except, desc) => {
+  if (from && typeof from === "object" || typeof from === "function") {
+    for (let key of __getOwnPropNames(from))
+      if (!__hasOwnProp.call(to, key) && key !== except)
+        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
+  }
+  return to;
+};
+var __toESM = (mod, isNodeMode, target) => (target = mod != null ? __create(__getProtoOf(mod)) : {}, __copyProps(
+  // If the importer is in node compatibility mode or this is not an ESM
+  // file that has been converted to a CommonJS file using a Babel-
+  // compatible transform (i.e. "__esModule" has not been set), then set
+  // "default" to the CommonJS "module.exports" for node compatibility.
+  isNodeMode || !mod || !mod.__esModule ? __defProp(target, "default", { value: mod, enumerable: true }) : target,
+  mod
+));
+var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
+var AbstractArchivist_exports = {};
+__export(AbstractArchivist_exports, {
+  AbstractArchivist: () => AbstractArchivist
+});
+module.exports = __toCommonJS(AbstractArchivist_exports);
+var import_assert = require("@xylabs/assert");
+var import_archivist_model = require("@xyo-network/archivist-model");
+var import_boundwitness_builder = require("@xyo-network/boundwitness-builder");
+var import_boundwitness_model = require("@xyo-network/boundwitness-model");
+var import_core = require("@xyo-network/core");
+var import_module = require("@xyo-network/module");
+var import_payload_wrapper = require("@xyo-network/payload-wrapper");
+var import_compact = __toESM(require("lodash/compact"));
+class AbstractArchivist extends import_module.AbstractModuleInstance {
+  _lastInsertedPayload;
+  _parents;
+  get queries() {
+    return [import_archivist_model.ArchivistGetQuerySchema, ...super.queries];
+  }
+  get requireAllParents() {
+    return this.config.requireAllParents ?? true;
+  }
+  get _queryAccountPaths() {
+    return {
+      "network.xyo.query.archivist.all": "1/1",
+      "network.xyo.query.archivist.clear": "1/2",
+      "network.xyo.query.archivist.commit": "1/3",
+      "network.xyo.query.archivist.delete": "1/4",
+      "network.xyo.query.archivist.get": "1/5",
+      "network.xyo.query.archivist.insert": "1/6"
+    };
+  }
+  get storeParentReads() {
+    return !!this.config?.storeParentReads;
+  }
+  all() {
+    this._noOverride("all");
+    return this.busy(async () => {
+      await this.started("throw");
+      return await this.allHandler();
+    });
+  }
+  clear() {
+    this._noOverride("clear");
+    return this.busy(async () => {
+      await this.started("throw");
+      return await this.clearHandler();
+    });
+  }
+  commit() {
+    this._noOverride("commit");
+    return this.busy(async () => {
+      await this.started("throw");
+      return await this.commitHandler();
+    });
+  }
+  async delete(hashes) {
+    this._noOverride("delete");
+    return await this.busy(async () => {
+      await this.started("throw");
+      return await this.deleteWithConfig(hashes);
+    });
+  }
+  async get(hashes) {
+    this._noOverride("get");
+    return await this.busy(async () => {
+      await this.started("throw");
+      return await this.getWithConfig(hashes);
+    });
+  }
+  async insert(payloads) {
+    this._noOverride("insert");
+    return await this.busy(async () => {
+      await this.started("throw");
+      return await this.insertWithConfig(payloads);
+    });
+  }
+  allHandler() {
+    throw Error("Not implemented");
+  }
+  clearHandler() {
+    throw Error("Not implemented");
+  }
+  commitHandler() {
+    throw Error("Not implemented");
+  }
+  deleteHandler(_hashes) {
+    throw Error("Not implemented");
+  }
+  async deleteWithConfig(hashes, config) {
+    const emitEvents = config?.emitEvents ?? true;
+    const deletedHashes = await this.deleteHandler(hashes);
+    if (emitEvents) {
+      await this.emit("deleted", { hashes: deletedHashes, module: this });
+    }
+    return deletedHashes;
+  }
+  async getFromParent(hashes, archivist) {
+    const foundPairs = (await Promise.all(
+      (await archivist.get(hashes)).map(async (payload) => [await import_core.PayloadHasher.hashAsync(payload), payload])
+    )).filter(([hash]) => {
+      const askedFor = hashes.includes(hash);
+      if (!askedFor) {
+        console.warn(`Parent returned payload with hash not asked for: ${hash}`);
+      }
+      return askedFor;
+    });
+    const foundHashes = foundPairs.map(([hash]) => hash);
+    const foundPayloads = foundPairs.map(([, payload]) => payload);
+    const notfound = hashes.filter((hash) => !foundHashes.includes(hash));
+    return [foundPayloads, notfound];
+  }
+  async getFromParents(hashes) {
+    const parents = Object.values((await this.parents())?.read ?? {});
+    let remainingHashes = [...hashes];
+    let parentIndex = 0;
+    let result = [];
+    while (parentIndex < parents.length && remainingHashes.length > 0) {
+      const [found, notfound] = await this.getFromParent(remainingHashes, parents[parentIndex]);
+      result = [...result, ...found];
+      remainingHashes = notfound;
+      parentIndex++;
+    }
+    return [result, remainingHashes];
+  }
+  getHandler(_hashes) {
+    throw Error("Not implemented");
+  }
+  async getWithConfig(hashes, config) {
+    const emitEvents = config?.emitEvents ?? true;
+    const map = await import_payload_wrapper.PayloadWrapper.toMap(await this.getHandler(hashes));
+    const { foundPayloads, notfoundHashes } = hashes.reduce(
+      (prev, hash) => {
+        const found = map[hash];
+        if (found) {
+          if (found.schema === import_boundwitness_model.BoundWitnessSchema) {
+            prev.foundPayloads.push({ ...import_core.PayloadHasher.hashFields(found), ...{ _signatures: found._signatures } });
+          } else {
+            prev.foundPayloads.push({ ...import_core.PayloadHasher.hashFields(found) });
+          }
+        } else {
+          prev.notfoundHashes.push(hash);
+        }
+        return prev;
+      },
+      { foundPayloads: [], notfoundHashes: [] }
+    );
+    const [parentFoundPayloads] = await this.getFromParents(notfoundHashes);
+    if (this.storeParentReads) {
+      await this.insertWithConfig(parentFoundPayloads);
+    }
+    return [...foundPayloads, ...parentFoundPayloads];
+  }
+  head() {
+    return this._lastInsertedPayload;
+  }
+  insertHandler(_payloads) {
+    throw Error("Not implemented");
+  }
+  async insertWithConfig(payloads, config) {
+    const emitEvents = config?.emitEvents ?? true;
+    const writeToParents = config?.writeToParents ?? true;
+    const insertedPayloads = await this.insertHandler(payloads);
+    if (writeToParents) {
+      await this.writeToParents(insertedPayloads);
+    }
+    if (emitEvents) {
+      await this.emit("inserted", { module: this, payloads: insertedPayloads });
+    }
+    return insertedPayloads;
+  }
+  async parents() {
+    this._parents = this._parents ?? {
+      commit: await this.resolveArchivists(this.config?.parents?.commit),
+      read: await this.resolveArchivists(this.config?.parents?.read),
+      write: await this.resolveArchivists(this.config?.parents?.write)
+    };
+    return (0, import_assert.assertEx)(this._parents);
+  }
+  async queryHandler(query, payloads, queryConfig) {
+    const wrappedQuery = import_boundwitness_builder.QueryBoundWitnessWrapper.parseQuery(query, payloads);
+    const queryPayload = await wrappedQuery.getQuery();
+    (0, import_assert.assertEx)(this.queryable(query, payloads, queryConfig));
+    const resultPayloads = [];
+    if (this.config.storeQueries) {
+      await this.insertHandler([query]);
+    }
+    switch (queryPayload.schema) {
+      case import_archivist_model.ArchivistAllQuerySchema:
+        resultPayloads.push(...await this.allHandler());
+        break;
+      case import_archivist_model.ArchivistClearQuerySchema:
+        await this.clearHandler();
+        break;
+      case import_archivist_model.ArchivistCommitQuerySchema:
+        resultPayloads.push(...await this.commitHandler());
+        break;
+      case import_archivist_model.ArchivistDeleteQuerySchema: {
+        const resultPayload = {
+          hashes: [...await this.deleteWithConfig(queryPayload.hashes)],
+          schema: import_archivist_model.ArchivistDeleteQuerySchema
+        };
+        resultPayloads.push(resultPayload);
+        break;
+      }
+      case import_archivist_model.ArchivistGetQuerySchema:
+        if (queryPayload.hashes?.length) {
+          resultPayloads.push(...await this.getWithConfig(queryPayload.hashes));
+        } else {
+          const head = await this.head();
+          if (head)
+            resultPayloads.push(head);
+        }
+        break;
+      case import_archivist_model.ArchivistInsertQuerySchema: {
+        const payloads2 = await wrappedQuery.getPayloads();
+        (0, import_assert.assertEx)(await wrappedQuery.getPayloads(), `Missing payloads: ${JSON.stringify(wrappedQuery.payload(), null, 2)}`);
+        const resolvedPayloads = await import_payload_wrapper.PayloadWrapper.filterExclude(payloads2, await wrappedQuery.hashAsync());
+        (0, import_assert.assertEx)(resolvedPayloads.length === payloads2.length, `Could not find some passed hashes [${resolvedPayloads.length} != ${payloads2.length}]`);
+        resultPayloads.push(...await this.insertWithConfig(payloads2));
+        this._lastInsertedPayload = resolvedPayloads[resolvedPayloads.length - 1];
+        break;
+      }
+      default:
+        return await super.queryHandler(query, payloads);
+    }
+    return resultPayloads;
+  }
+  async writeToParent(parent, payloads) {
+    return await parent.insert(payloads);
+  }
+  async writeToParents(payloads) {
+    const parents = await this.parents();
+    this.logger?.log(parents.write?.length ?? 0);
+    return (0, import_compact.default)(
+      await Promise.all(
+        Object.values(parents.write ?? {}).map(async (parent) => {
+          return parent ? await this.writeToParent(parent, payloads) : void 0;
+        })
+      )
+    ).flat();
+  }
+  async resolveArchivists(archivists = []) {
+    const archivistModules = [...await this.resolve({ address: archivists }), ...await this.resolve({ name: archivists })].filter(
+      import_module.duplicateModules
+    );
+    (0, import_assert.assertEx)(
+      !this.requireAllParents || archivistModules.length === archivists.length,
+      `Failed to find some archivists (set allRequired to false if ok): [${archivists.filter(
+        (archivist) => archivistModules.map((module2) => !(module2.address === archivist || module2.config.name === archivist))
+      )}]`
+    );
+    return archivistModules.reduce((prev, module2) => {
+      prev[module2.address] = (0, import_archivist_model.asArchivistInstance)(module2, () => {
+        (0, import_archivist_model.isArchivistInstance)(module2, { log: console });
+        return `Unable to cast resolved module to an archivist: [${module2.address}, ${module2.config.name}, ${module2.config.schema})}]`;
+      });
+      return prev;
+    }, {});
+  }
+}
+// Annotate the CommonJS export names for ESM import in node:
+0 && (module.exports = {
+  AbstractArchivist
+});
+//# sourceMappingURL=AbstractArchivist.js.map
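The `getFromParents` implementation above walks the configured read parents in order, asking each one only for the hashes still unresolved and stopping once everything is found. A standalone sketch of that fallback loop, under the assumption of a simplified `ParentStore` interface defined here for illustration (the real code queries resolved `ArchivistInstance` modules and re-hashes the returned payloads to discard any results it did not ask for):

```typescript
// Simplified payload shape; the real Payload type comes from @xyo-network/payload-model.
type Payload = { schema: string; [key: string]: unknown }

// Hypothetical minimal parent interface for this sketch: a lookup
// returning a map from hash to payload for whichever hashes it holds.
interface ParentStore {
  get(hashes: string[]): Promise<Map<string, Payload>>
}

// Sequential fallback: each parent sees only the still-missing hashes.
// Returns [found payloads, hashes no parent could supply].
async function getFromParents(
  hashes: string[],
  parents: ParentStore[],
): Promise<[Payload[], string[]]> {
  let remaining = [...hashes]
  const found: Payload[] = []
  for (const parent of parents) {
    if (remaining.length === 0) break // nothing left to look for
    const results = await parent.get(remaining)
    for (const [hash, payload] of results) {
      // Mirror the real code's guard against unrequested results.
      if (remaining.includes(hash)) found.push(payload)
    }
    remaining = remaining.filter((hash) => !results.has(hash))
  }
  return [found, remaining]
}
```

Narrowing the hash list on every iteration is what keeps later parents from being queried for payloads an earlier parent already supplied; the real method additionally re-inserts parent reads locally when `storeParentReads` is set.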
package/dist/AbstractArchivist.js.map
ADDED
@@ -0,0 +1 @@
{"version":3,"sources":["../src/AbstractArchivist.ts"],"sourcesContent":["import { assertEx } from '@xylabs/assert'\nimport {\n ArchivistAllQuerySchema,\n ArchivistClearQuerySchema,\n ArchivistCommitQuerySchema,\n ArchivistDeleteQuery,\n ArchivistDeleteQuerySchema,\n ArchivistGetQuerySchema,\n ArchivistInsertQuerySchema,\n ArchivistInstance,\n ArchivistModule,\n ArchivistModuleEventData,\n ArchivistParams,\n ArchivistQuery,\n ArchivistQueryBase,\n asArchivistInstance,\n isArchivistInstance,\n} from '@xyo-network/archivist-model'\nimport { QueryBoundWitness, QueryBoundWitnessWrapper } from '@xyo-network/boundwitness-builder'\nimport { BoundWitness, BoundWitnessSchema } from '@xyo-network/boundwitness-model'\nimport { PayloadHasher } from '@xyo-network/core'\nimport { AbstractModuleInstance, duplicateModules, ModuleConfig, ModuleQueryHandlerResult } from '@xyo-network/module'\nimport { Payload } from '@xyo-network/payload-model'\nimport { PayloadWrapper } from '@xyo-network/payload-wrapper'\nimport { Promisable, PromisableArray } from '@xyo-network/promise'\nimport compact from 'lodash/compact'\n\nexport interface ActionConfig {\n emitEvents?: boolean\n}\n\nexport interface InsertConfig extends ActionConfig {\n writeToParents?: boolean\n}\n\nexport interface ArchivistParentInstances {\n commit?: Record<string, ArchivistInstance>\n read?: Record<string, ArchivistInstance>\n write?: Record<string, ArchivistInstance>\n}\n\nexport abstract class AbstractArchivist<\n TParams extends ArchivistParams = ArchivistParams,\n TEventData extends ArchivistModuleEventData = ArchivistModuleEventData,\n >\n extends AbstractModuleInstance<TParams, TEventData>\n implements ArchivistModule<TParams>\n{\n private _lastInsertedPayload: Payload | undefined\n private _parents?: ArchivistParentInstances\n\n override get queries(): string[] {\n return [ArchivistGetQuerySchema, ...super.queries]\n }\n\n get requireAllParents() {\n return this.config.requireAllParents ?? 
true\n }\n\n protected override get _queryAccountPaths(): Record<ArchivistQueryBase['schema'], string> {\n return {\n 'network.xyo.query.archivist.all': '1/1',\n 'network.xyo.query.archivist.clear': '1/2',\n 'network.xyo.query.archivist.commit': '1/3',\n 'network.xyo.query.archivist.delete': '1/4',\n 'network.xyo.query.archivist.get': '1/5',\n 'network.xyo.query.archivist.insert': '1/6',\n }\n }\n\n protected get storeParentReads() {\n return !!this.config?.storeParentReads\n }\n\n all(): PromisableArray<Payload> {\n this._noOverride('all')\n return this.busy(async () => {\n await this.started('throw')\n return await this.allHandler()\n })\n }\n\n clear(): Promisable<void> {\n this._noOverride('clear')\n return this.busy(async () => {\n await this.started('throw')\n return await this.clearHandler()\n })\n }\n\n commit(): Promisable<BoundWitness[]> {\n this._noOverride('commit')\n return this.busy(async () => {\n await this.started('throw')\n return await this.commitHandler()\n })\n }\n\n async delete(hashes: string[]): Promise<string[]> {\n this._noOverride('delete')\n return await this.busy(async () => {\n await this.started('throw')\n return await this.deleteWithConfig(hashes)\n })\n }\n\n async get(hashes: string[]): Promise<Payload[]> {\n this._noOverride('get')\n return await this.busy(async () => {\n await this.started('throw')\n return await this.getWithConfig(hashes)\n })\n }\n\n async insert(payloads: Payload[]): Promise<Payload[]> {\n this._noOverride('insert')\n return await this.busy(async () => {\n await this.started('throw')\n return await this.insertWithConfig(payloads)\n })\n }\n\n protected allHandler(): PromisableArray<Payload> {\n throw Error('Not implemented')\n }\n\n protected clearHandler(): Promisable<void> {\n throw Error('Not implemented')\n }\n\n protected commitHandler(): Promisable<BoundWitness[]> {\n throw Error('Not implemented')\n }\n\n protected deleteHandler(_hashes: string[]): PromisableArray<string> {\n throw Error('Not 
implemented')\n }\n\n protected async deleteWithConfig(hashes: string[], config?: ActionConfig): Promise<string[]> {\n const emitEvents = config?.emitEvents ?? true\n\n const deletedHashes = await this.deleteHandler(hashes)\n\n if (emitEvents) {\n await this.emit('deleted', { hashes: deletedHashes, module: this })\n }\n\n return deletedHashes\n }\n\n protected async getFromParent(hashes: string[], archivist: ArchivistInstance): Promise<[Payload[], string[]]> {\n const foundPairs = (\n await Promise.all(\n (await archivist.get(hashes)).map<Promise<[string, Payload]>>(async (payload) => [await PayloadHasher.hashAsync(payload), payload]),\n )\n ).filter(([hash]) => {\n const askedFor = hashes.includes(hash)\n if (!askedFor) {\n console.warn(`Parent returned payload with hash not asked for: ${hash}`)\n //throw Error(`Parent returned payload with hash not asked for: ${hash}`)\n }\n return askedFor\n })\n\n const foundHashes = foundPairs.map(([hash]) => hash)\n const foundPayloads = foundPairs.map(([, payload]) => payload)\n\n const notfound = hashes.filter((hash) => !foundHashes.includes(hash))\n return [foundPayloads, notfound]\n }\n\n protected async getFromParents(hashes: string[]): Promise<[Payload[], string[]]> {\n const parents = Object.values((await this.parents())?.read ?? 
{})\n let remainingHashes = [...hashes]\n let parentIndex = 0\n let result: Payload[] = []\n\n //intentionally doing this serially\n while (parentIndex < parents.length && remainingHashes.length > 0) {\n const [found, notfound] = await this.getFromParent(remainingHashes, parents[parentIndex])\n result = [...result, ...found]\n remainingHashes = notfound\n parentIndex++\n }\n return [result, remainingHashes]\n }\n\n protected getHandler(_hashes: string[]): Promisable<Payload[]> {\n throw Error('Not implemented')\n }\n\n protected async getWithConfig(hashes: string[], config?: InsertConfig): Promise<Payload[]> {\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n const emitEvents = config?.emitEvents ?? true\n const map = await PayloadWrapper.toMap(await this.getHandler(hashes))\n\n const { foundPayloads, notfoundHashes } = hashes.reduce<{ foundPayloads: Payload[]; notfoundHashes: string[] }>(\n (prev, hash) => {\n const found = map[hash]\n if (found) {\n //TODO: Find a better way to scrub meta data without scrubbing _signatures\n if (found.schema === BoundWitnessSchema) {\n prev.foundPayloads.push({ ...PayloadHasher.hashFields(found), ...{ _signatures: (found as BoundWitness)._signatures } })\n } else {\n prev.foundPayloads.push({ ...PayloadHasher.hashFields(found) })\n }\n } else {\n prev.notfoundHashes.push(hash)\n }\n return prev\n },\n { foundPayloads: [], notfoundHashes: [] },\n )\n\n const [parentFoundPayloads] = await this.getFromParents(notfoundHashes)\n\n if (this.storeParentReads) {\n await this.insertWithConfig(parentFoundPayloads)\n }\n return [...foundPayloads, ...parentFoundPayloads]\n }\n\n protected head(): Promisable<Payload | undefined> {\n return this._lastInsertedPayload\n }\n\n protected insertHandler(_payloads: Payload[]): Promise<Payload[]> {\n throw Error('Not implemented')\n }\n\n protected async insertWithConfig(payloads: Payload[], config?: InsertConfig): Promise<Payload[]> {\n const emitEvents = config?.emitEvents ?? 
true\n const writeToParents = config?.writeToParents ?? true\n\n const insertedPayloads = await this.insertHandler(payloads)\n\n if (writeToParents) {\n await this.writeToParents(insertedPayloads)\n }\n if (emitEvents) {\n await this.emit('inserted', { module: this, payloads: insertedPayloads })\n }\n\n return insertedPayloads\n }\n\n protected async parents() {\n this._parents = this._parents ?? {\n commit: await this.resolveArchivists(this.config?.parents?.commit),\n read: await this.resolveArchivists(this.config?.parents?.read),\n write: await this.resolveArchivists(this.config?.parents?.write),\n }\n return assertEx(this._parents)\n }\n\n protected override async queryHandler<T extends QueryBoundWitness = QueryBoundWitness, TConfig extends ModuleConfig = ModuleConfig>(\n query: T,\n payloads?: Payload[],\n queryConfig?: TConfig,\n ): Promise<ModuleQueryHandlerResult> {\n const wrappedQuery = QueryBoundWitnessWrapper.parseQuery<ArchivistQuery>(query, payloads)\n const queryPayload = await wrappedQuery.getQuery()\n assertEx(this.queryable(query, payloads, queryConfig))\n const resultPayloads: Payload[] = []\n if (this.config.storeQueries) {\n await this.insertHandler([query])\n }\n\n switch (queryPayload.schema) {\n case ArchivistAllQuerySchema:\n resultPayloads.push(...(await this.allHandler()))\n break\n case ArchivistClearQuerySchema:\n await this.clearHandler()\n break\n case ArchivistCommitQuerySchema:\n resultPayloads.push(...(await this.commitHandler()))\n break\n case ArchivistDeleteQuerySchema: {\n const resultPayload: ArchivistDeleteQuery = {\n hashes: [...(await this.deleteWithConfig(queryPayload.hashes))],\n schema: ArchivistDeleteQuerySchema,\n }\n resultPayloads.push(resultPayload)\n break\n }\n case ArchivistGetQuerySchema:\n if (queryPayload.hashes?.length) {\n resultPayloads.push(...(await this.getWithConfig(queryPayload.hashes)))\n } else {\n const head = await this.head()\n if (head) resultPayloads.push(head)\n }\n break\n case 
ArchivistInsertQuerySchema: {\n const payloads = await wrappedQuery.getPayloads()\n assertEx(await wrappedQuery.getPayloads(), `Missing payloads: ${JSON.stringify(wrappedQuery.payload(), null, 2)}`)\n const resolvedPayloads = await PayloadWrapper.filterExclude(payloads, await wrappedQuery.hashAsync())\n assertEx(resolvedPayloads.length === payloads.length, `Could not find some passed hashes [${resolvedPayloads.length} != ${payloads.length}]`)\n resultPayloads.push(...(await this.insertWithConfig(payloads)))\n // NOTE: There isn't an exact equivalence between what we get and what we store. Once\n // we move to returning only inserted Payloads(/hash) instead of a BoundWitness, we\n // can grab the actual last one\n this._lastInsertedPayload = resolvedPayloads[resolvedPayloads.length - 1]\n break\n }\n default:\n return await super.queryHandler(query, payloads)\n }\n return resultPayloads\n }\n\n protected async writeToParent(parent: ArchivistInstance, payloads: Payload[]) {\n return await parent.insert(payloads)\n }\n\n protected async writeToParents(payloads: Payload[]): Promise<Payload[]> {\n const parents = await this.parents()\n this.logger?.log(parents.write?.length ?? 0)\n return compact(\n await Promise.all(\n Object.values(parents.write ?? {}).map(async (parent) => {\n return parent ? 
await this.writeToParent(parent, payloads) : undefined\n }),\n ),\n ).flat()\n }\n\n private async resolveArchivists(archivists: string[] = []) {\n const archivistModules = [...(await this.resolve({ address: archivists })), ...(await this.resolve({ name: archivists }))].filter(\n duplicateModules,\n )\n\n assertEx(\n !this.requireAllParents || archivistModules.length === archivists.length,\n `Failed to find some archivists (set allRequired to false if ok): [${archivists.filter((archivist) =>\n archivistModules.map((module) => !(module.address === archivist || module.config.name === archivist)),\n )}]`,\n )\n\n return archivistModules.reduce<Record<string, ArchivistInstance>>((prev, module) => {\n prev[module.address] = asArchivistInstance(module, () => {\n isArchivistInstance(module, { log: console })\n return `Unable to cast resolved module to an archivist: [${module.address}, ${module.config.name}, ${module.config.schema})}]`\n })\n\n return prev\n }, {})\n }\n}\n"],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,oBAAyB;AACzB,6BAgBO;AACP,kCAA4D;AAC5D,gCAAiD;AACjD,kBAA8B;AAC9B,oBAAiG;AAEjG,6BAA+B;AAE/B,qBAAoB;AAgBb,MAAe,0BAIZ,qCAEV;AAAA,EACU;AAAA,EACA;AAAA,EAER,IAAa,UAAoB;AAC/B,WAAO,CAAC,gDAAyB,GAAG,MAAM,OAAO;AAAA,EACnD;AAAA,EAEA,IAAI,oBAAoB;AACtB,WAAO,KAAK,OAAO,qBAAqB;AAAA,EAC1C;AAAA,EAEA,IAAuB,qBAAmE;AACxF,WAAO;AAAA,MACL,mCAAmC;AAAA,MACnC,qCAAqC;AAAA,MACrC,sCAAsC;AAAA,MACtC,sCAAsC;AAAA,MACtC,mCAAmC;AAAA,MACnC,sCAAsC;AAAA,IACxC;AAAA,EACF;AAAA,EAEA,IAAc,mBAAmB;AAC/B,WAAO,CAAC,CAAC,KAAK,QAAQ;AAAA,EACxB;AAAA,EAEA,MAAgC;AAC9B,SAAK,YAAY,KAAK;AACtB,WAAO,KAAK,KAAK,YAAY;AAC3B,YAAM,KAAK,QAAQ,OAAO;AAC1B,aAAO,MAAM,KAAK,WAAW;AAAA,IAC/B,CAAC;AAAA,EACH;AAAA,EAEA,QAA0B;AACxB,SAAK,YAAY,OAAO;AACxB,WAAO,KAAK,KAAK,YAAY;AAC3B,YAAM,KAAK,QAAQ,OAAO;AAC1B,aAAO,MAAM,KAAK,aAAa;AAAA,IACjC,CAAC;AAAA,EACH;AAAA,EAEA,SAAqC;AACnC,SAAK,YAAY,QAAQ;AACzB,WAAO,KAAK,KAAK,YAAY;AAC3B,YAAM,KAAK,QAAQ,OAAO;AAC1B,aAAO,MAAM,KAAK,cAAc;AAAA,IAClC,CAAC;AAAA,EACH;AAAA,EAEA,MAAM,OAAO,QAAqC;A
AChD,SAAK,YAAY,QAAQ;AACzB,WAAO,MAAM,KAAK,KAAK,YAAY;AACjC,YAAM,KAAK,QAAQ,OAAO;AAC1B,aAAO,MAAM,KAAK,iBAAiB,MAAM;AAAA,IAC3C,CAAC;AAAA,EACH;AAAA,EAEA,MAAM,IAAI,QAAsC;AAC9C,SAAK,YAAY,KAAK;AACtB,WAAO,MAAM,KAAK,KAAK,YAAY;AACjC,YAAM,KAAK,QAAQ,OAAO;AAC1B,aAAO,MAAM,KAAK,cAAc,MAAM;AAAA,IACxC,CAAC;AAAA,EACH;AAAA,EAEA,MAAM,OAAO,UAAyC;AACpD,SAAK,YAAY,QAAQ;AACzB,WAAO,MAAM,KAAK,KAAK,YAAY;AACjC,YAAM,KAAK,QAAQ,OAAO;AAC1B,aAAO,MAAM,KAAK,iBAAiB,QAAQ;AAAA,IAC7C,CAAC;AAAA,EACH;AAAA,EAEU,aAAuC;AAC/C,UAAM,MAAM,iBAAiB;AAAA,EAC/B;AAAA,EAEU,eAAiC;AACzC,UAAM,MAAM,iBAAiB;AAAA,EAC/B;AAAA,EAEU,gBAA4C;AACpD,UAAM,MAAM,iBAAiB;AAAA,EAC/B;AAAA,EAEU,cAAc,SAA4C;AAClE,UAAM,MAAM,iBAAiB;AAAA,EAC/B;AAAA,EAEA,MAAgB,iBAAiB,QAAkB,QAA0C;AAC3F,UAAM,aAAa,QAAQ,cAAc;AAEzC,UAAM,gBAAgB,MAAM,KAAK,cAAc,MAAM;AAErD,QAAI,YAAY;AACd,YAAM,KAAK,KAAK,WAAW,EAAE,QAAQ,eAAe,QAAQ,KAAK,CAAC;AAAA,IACpE;AAEA,WAAO;AAAA,EACT;AAAA,EAEA,MAAgB,cAAc,QAAkB,WAA8D;AAC5G,UAAM,cACJ,MAAM,QAAQ;AAAA,OACX,MAAM,UAAU,IAAI,MAAM,GAAG,IAAgC,OAAO,YAAY,CAAC,MAAM,0BAAc,UAAU,OAAO,GAAG,OAAO,CAAC;AAAA,IACpI,GACA,OAAO,CAAC,CAAC,IAAI,MAAM;AACnB,YAAM,WAAW,OAAO,SAAS,IAAI;AACrC,UAAI,CAAC,UAAU;AACb,gBAAQ,KAAK,oDAAoD,IAAI,EAAE;AAAA,MAEzE;AACA,aAAO;AAAA,IACT,CAAC;AAED,UAAM,cAAc,WAAW,IAAI,CAAC,CAAC,IAAI,MAAM,IAAI;AACnD,UAAM,gBAAgB,WAAW,IAAI,CAAC,CAAC,EAAE,OAAO,MAAM,OAAO;AAE7D,UAAM,WAAW,OAAO,OAAO,CAAC,SAAS,CAAC,YAAY,SAAS,IAAI,CAAC;AACpE,WAAO,CAAC,eAAe,QAAQ;AAAA,EACjC;AAAA,EAEA,MAAgB,eAAe,QAAkD;AAC/E,UAAM,UAAU,OAAO,QAAQ,MAAM,KAAK,QAAQ,IAAI,QAAQ,CAAC,CAAC;AAChE,QAAI,kBAAkB,CAAC,GAAG,MAAM;AAChC,QAAI,cAAc;AAClB,QAAI,SAAoB,CAAC;AAGzB,WAAO,cAAc,QAAQ,UAAU,gBAAgB,SAAS,GAAG;AACjE,YAAM,CAAC,OAAO,QAAQ,IAAI,MAAM,KAAK,cAAc,iBAAiB,QAAQ,WAAW,CAAC;AACxF,eAAS,CAAC,GAAG,QAAQ,GAAG,KAAK;AAC7B,wBAAkB;AAClB;AAAA,IACF;AACA,WAAO,CAAC,QAAQ,eAAe;AAAA,EACjC;AAAA,EAEU,WAAW,SAA0C;AAC7D,UAAM,MAAM,iBAAiB;AAAA,EAC/B;AAAA,EAEA,MAAgB,cAAc,QAAkB,QAA2C;AAEzF,UAAM,aAAa,QAAQ,cAAc;AACzC,UAAM,MAAM,MAAM,sCAAe,MAAM,MAAM,KAAK,WAAW,MAAM,CAAC;AAEpE,UAAM,EAAE,eAAe,eAAe,IAAI,OAAO;AAAA,MAC/C,CAAC,MAAM,SAAS;AACd,cAAM,
QAAQ,IAAI,IAAI;AACtB,YAAI,OAAO;AAET,cAAI,MAAM,WAAW,8CAAoB;AACvC,iBAAK,cAAc,KAAK,EAAE,GAAG,0BAAc,WAAW,KAAK,GAAG,GAAG,EAAE,aAAc,MAAuB,YAAY,EAAE,CAAC;AAAA,UACzH,OAAO;AACL,iBAAK,cAAc,KAAK,EAAE,GAAG,0BAAc,WAAW,KAAK,EAAE,CAAC;AAAA,UAChE;AAAA,QACF,OAAO;AACL,eAAK,eAAe,KAAK,IAAI;AAAA,QAC/B;AACA,eAAO;AAAA,MACT;AAAA,MACA,EAAE,eAAe,CAAC,GAAG,gBAAgB,CAAC,EAAE;AAAA,IAC1C;AAEA,UAAM,CAAC,mBAAmB,IAAI,MAAM,KAAK,eAAe,cAAc;AAEtE,QAAI,KAAK,kBAAkB;AACzB,YAAM,KAAK,iBAAiB,mBAAmB;AAAA,IACjD;AACA,WAAO,CAAC,GAAG,eAAe,GAAG,mBAAmB;AAAA,EAClD;AAAA,EAEU,OAAwC;AAChD,WAAO,KAAK;AAAA,EACd;AAAA,EAEU,cAAc,WAA0C;AAChE,UAAM,MAAM,iBAAiB;AAAA,EAC/B;AAAA,EAEA,MAAgB,iBAAiB,UAAqB,QAA2C;AAC/F,UAAM,aAAa,QAAQ,cAAc;AACzC,UAAM,iBAAiB,QAAQ,kBAAkB;AAEjD,UAAM,mBAAmB,MAAM,KAAK,cAAc,QAAQ;AAE1D,QAAI,gBAAgB;AAClB,YAAM,KAAK,eAAe,gBAAgB;AAAA,IAC5C;AACA,QAAI,YAAY;AACd,YAAM,KAAK,KAAK,YAAY,EAAE,QAAQ,MAAM,UAAU,iBAAiB,CAAC;AAAA,IAC1E;AAEA,WAAO;AAAA,EACT;AAAA,EAEA,MAAgB,UAAU;AACxB,SAAK,WAAW,KAAK,YAAY;AAAA,MAC/B,QAAQ,MAAM,KAAK,kBAAkB,KAAK,QAAQ,SAAS,MAAM;AAAA,MACjE,MAAM,MAAM,KAAK,kBAAkB,KAAK,QAAQ,SAAS,IAAI;AAAA,MAC7D,OAAO,MAAM,KAAK,kBAAkB,KAAK,QAAQ,SAAS,KAAK;AAAA,IACjE;AACA,eAAO,wBAAS,KAAK,QAAQ;AAAA,EAC/B;AAAA,EAEA,MAAyB,aACvB,OACA,UACA,aACmC;AACnC,UAAM,eAAe,qDAAyB,WAA2B,OAAO,QAAQ;AACxF,UAAM,eAAe,MAAM,aAAa,SAAS;AACjD,gCAAS,KAAK,UAAU,OAAO,UAAU,WAAW,CAAC;AACrD,UAAM,iBAA4B,CAAC;AACnC,QAAI,KAAK,OAAO,cAAc;AAC5B,YAAM,KAAK,cAAc,CAAC,KAAK,CAAC;AAAA,IAClC;AAEA,YAAQ,aAAa,QAAQ;AAAA,MAC3B,KAAK;AACH,uBAAe,KAAK,GAAI,MAAM,KAAK,WAAW,CAAE;AAChD;AAAA,MACF,KAAK;AACH,cAAM,KAAK,aAAa;AACxB;AAAA,MACF,KAAK;AACH,uBAAe,KAAK,GAAI,MAAM,KAAK,cAAc,CAAE;AACnD;AAAA,MACF,KAAK,mDAA4B;AAC/B,cAAM,gBAAsC;AAAA,UAC1C,QAAQ,CAAC,GAAI,MAAM,KAAK,iBAAiB,aAAa,MAAM,CAAE;AAAA,UAC9D,QAAQ;AAAA,QACV;AACA,uBAAe,KAAK,aAAa;AACjC;AAAA,MACF;AAAA,MACA,KAAK;AACH,YAAI,aAAa,QAAQ,QAAQ;AAC/B,yBAAe,KAAK,GAAI,MAAM,KAAK,cAAc,aAAa,MAAM,CAAE;AAAA,QACxE,OAAO;AACL,gBAAM,OAAO,MAAM,KAAK,KAAK;AAC7B,cAAI;AAAM,2BAAe,KAAK,IAAI;AAAA,QACpC;AACA;AAAA,MACF,KAAK,mDAA4B;AAC/B,cAAMA,YAAW,MAAM,aAAa,YAAY;AAChD,o
CAAS,MAAM,aAAa,YAAY,GAAG,qBAAqB,KAAK,UAAU,aAAa,QAAQ,GAAG,MAAM,CAAC,CAAC,EAAE;AACjH,cAAM,mBAAmB,MAAM,sCAAe,cAAcA,WAAU,MAAM,aAAa,UAAU,CAAC;AACpG,oCAAS,iBAAiB,WAAWA,UAAS,QAAQ,sCAAsC,iBAAiB,MAAM,OAAOA,UAAS,MAAM,GAAG;AAC5I,uBAAe,KAAK,GAAI,MAAM,KAAK,iBAAiBA,SAAQ,CAAE;AAI9D,aAAK,uBAAuB,iBAAiB,iBAAiB,SAAS,CAAC;AACxE;AAAA,MACF;AAAA,MACA;AACE,eAAO,MAAM,MAAM,aAAa,OAAO,QAAQ;AAAA,IACnD;AACA,WAAO;AAAA,EACT;AAAA,EAEA,MAAgB,cAAc,QAA2B,UAAqB;AAC5E,WAAO,MAAM,OAAO,OAAO,QAAQ;AAAA,EACrC;AAAA,EAEA,MAAgB,eAAe,UAAyC;AACtE,UAAM,UAAU,MAAM,KAAK,QAAQ;AACnC,SAAK,QAAQ,IAAI,QAAQ,OAAO,UAAU,CAAC;AAC3C,eAAO,eAAAC;AAAA,MACL,MAAM,QAAQ;AAAA,QACZ,OAAO,OAAO,QAAQ,SAAS,CAAC,CAAC,EAAE,IAAI,OAAO,WAAW;AACvD,iBAAO,SAAS,MAAM,KAAK,cAAc,QAAQ,QAAQ,IAAI;AAAA,QAC/D,CAAC;AAAA,MACH;AAAA,IACF,EAAE,KAAK;AAAA,EACT;AAAA,EAEA,MAAc,kBAAkB,aAAuB,CAAC,GAAG;AACzD,UAAM,mBAAmB,CAAC,GAAI,MAAM,KAAK,QAAQ,EAAE,SAAS,WAAW,CAAC,GAAI,GAAI,MAAM,KAAK,QAAQ,EAAE,MAAM,WAAW,CAAC,CAAE,EAAE;AAAA,MACzH;AAAA,IACF;AAEA;AAAA,MACE,CAAC,KAAK,qBAAqB,iBAAiB,WAAW,WAAW;AAAA,MAClE,qEAAqE,WAAW;AAAA,QAAO,CAAC,cACtF,iBAAiB,IAAI,CAACC,YAAW,EAAEA,QAAO,YAAY,aAAaA,QAAO,OAAO,SAAS,UAAU;AAAA,MACtG,CAAC;AAAA,IACH;AAEA,WAAO,iBAAiB,OAA0C,CAAC,MAAMA,YAAW;AAClF,WAAKA,QAAO,OAAO,QAAI,4CAAoBA,SAAQ,MAAM;AACvD,wDAAoBA,SAAQ,EAAE,KAAK,QAAQ,CAAC;AAC5C,eAAO,oDAAoDA,QAAO,OAAO,KAAKA,QAAO,OAAO,IAAI,KAAKA,QAAO,OAAO,MAAM;AAAA,MAC3H,CAAC;AAED,aAAO;AAAA,IACT,GAAG,CAAC,CAAC;AAAA,EACP;AACF;","names":["payloads","compact","module"]}