@fluid-experimental/tree 0.57.2 → 0.58.1000

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,195 @@
1
+ # Compression
2
+
3
+ This document covers reducing per node costs in common high node count use-cases using the approach prototyped in <https://github.com/CraigMacomber/sequence/tree/main/src>.
4
+
5
+ ## Motivation
6
+
7
+ We want users of shared tree to encode their data as nodes, all the way down to primitive leaf values, like characters and integers.
8
+ This means they can have a unified approach on all structured data, and not use a separate approach for blobs of data at leaves.
9
+ To make this practical, modeling large quantities of data as nodes needs to have very low overhead compared to using a custom format and putting the data in a blob (a single node's `value`), at least for usage patterns that work reasonably well with a single large value.
10
+
11
+ Currently (with no compression), these cases suffer from inefficiency due to per node overhead for several reasons:
12
+
13
+ - Each node is large when serialized. For example `{definition: "SomeUUID", identifier: "SomeOtherUUID", traits: {}, value: 5}` is much larger than just a byte storing the number 5. This is an issue for op bandwidth, summary upload and download bandwidth, and on-disk document size (typically on the server), and is also an issue due to processing costs (formatting, parsing, and escaping long strings costs more than shorter ones).
14
+ - Each node needs a separate entry in our structure to look up nodes by id. Currently this is an in-memory b-tree. Having lots of nodes here costs memory and reduces lookup performance (deeper trees, and less ideal cache use). Additionally, this separated storage causes common tree access patterns, like child lookup or sequence traversal, to be relatively expensive compared to accessing children and sequences in a plausible custom non-shared-tree blob format.
15
+ - Each node needs a separate entry in our structure to look up parents by id.
16
+ - A lot of redundant data is stored in memory on each node. This is essentially the in-memory version of the large serialized size issue above.
17
+ - Inserting and removing large numbers of nodes has high cost (compared to inserting one node with a blob value) due to accesses and changes throughout the structures keyed by node identifier.
18
+
19
+ ## Scoping a Solution
20
+
21
+ There are many different usage patterns for which different optimizations would be suitable.
22
+ Thus we can use the following approach:
23
+
24
+ - Introduce an abstraction over some part of the tree where we can implement optimizations for particular usage patterns.
25
+ - Provide a new optimized implementation of this abstraction which solves the above mentioned overheads for the known common large data cases.
26
+ - Much longer term, consider adding more optimized implementations for other use cases as they are discovered.
27
+
28
+ The current use-cases we will focus on fall into two categories:
29
+
30
+ - Leaf struct-like data: documents where many leaf subtrees have exactly the same `shape` (same number of nodes, same traits, same definitions, same presence or absence of values, but different value contents)
31
+ - Sequences that contain mostly contiguous runs of identically structured leaf struct like data.
32
+
33
+ We will consider cases where mutations that change this structure are possible, but most of the nodes don't get modified.
34
+ Ex: a sequence might get a chunk of it moved somewhere else, or deleted/inserted, but it's uncommon for a large percentage of the nodes in it to have their set of traits or their number of children changed.
35
+
36
+ Additionally we will allow for identities of new nodes in these cases to be allocated in a way to benefit the compression.
37
+ The scheme proposed here allocates ids sequentially in pre-order traversal order.
38
+ Justifications for this exact choice are below.
39
+
40
+ Additionally we will focus on the in-memory representation: as long as we are using an in-memory format that is copy-on-write (aka a persistent data structure), it has more constraints than the persisted format, since it needs to provide fast random access and editing.
41
+ From the in memory representation, we can then derive a suitable persisted format that is efficient to encode, decode and store.
42
+
43
+ ## Picking an Abstraction
44
+
45
+ There are many different places in the tree we could abstract to allow multiple data representations:
46
+
47
+ - Node
48
+ - Trait Map (the `traits` object on a Node)
49
+ - Trait
50
+ - Sequences (content of a trait)
51
+ - Sequence Chunks (aka Sub-sections of Sequences)
52
+
53
+ For the use-cases above, we pick Sequence Chunks (sub-sections of sequences).
54
+ This is because, like Node, they work well for individual struct like trees (chunk of length one),
55
+ but they also generalize well to sequences of mostly homogeneous data.
56
+ Sequence Chunks are a better choice than Sequences since they handle fully homogeneous sequences just about as well,
57
+ but regress much less when a large sequence is edited and is no longer fully homogeneous.
58
+
59
+ This abstraction can be implemented by changing the NodeIds in the b-tree (as well as in the child references in the nodes in the tree) into ChunkIds,
60
+ which identify a sequence chunk instead of a Node.
61
+
62
+ We then make the contents of the b-tree (and thus forest) into chunks, instead of Nodes.
63
+ Since we still need to be able to look up a node based on its identifier, we require that:
64
+
65
+ - the Id of a chunk be equal to or less than the Id of any node it contains
66
+ - the chunk only contain nodes with ids between that and some maximum value
67
+ - all chunks have disjoint id ranges
68
+ - a chunk can efficiently look-up information about a node within it, given the offset of that node into the chunk (`nodeId - chunkId`).
69
+ A rust version of this can be seen as [Chunk](https://github.com/CraigMacomber/sequence/blob/ada56998a853ea19b5a6536e7b79771400044bc0/src/forest.rs#L16)
70
+
71
+ Together this means you can lookup a node by finding the entry in the b-tree with the id closest to it (it or lower),
72
+ then asking the chunk you get about it. ([An implementation](https://github.com/CraigMacomber/sequence/blob/ada56998a853ea19b5a6536e7b79771400044bc0/src/forest.rs#L72))
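In TypeScript, that lookup could be sketched as follows, using a sorted array of entries as a stand-in for the b-tree; the `Chunk` interface and `lookupNode` helper here are simplified assumptions for illustration, not the actual implementation:

```typescript
type NodeId = number;
type ChunkId = number;

// Minimal stand-in for the chunk abstraction: it knows how many node ids
// it covers, and can answer questions given an offset into itself.
interface Chunk {
	length: number;
	getValue(offset: number): unknown;
}

// Entries are sorted by ChunkId; the real implementation would use the b-tree.
function lookupNode(entries: readonly (readonly [ChunkId, Chunk])[], id: NodeId): unknown {
	// Binary search for the entry with the greatest ChunkId <= id.
	let lo = 0;
	let hi = entries.length - 1;
	let found = -1;
	while (lo <= hi) {
		const mid = (lo + hi) >> 1;
		if (entries[mid][0] <= id) {
			found = mid;
			lo = mid + 1;
		} else {
			hi = mid - 1;
		}
	}
	if (found === -1) return undefined;
	const [chunkId, chunk] = entries[found];
	const offset = id - chunkId;
	// The chunk only answers for ids within its disjoint range.
	return offset < chunk.length ? chunk.getValue(offset) : undefined;
}
```

Because chunk id ranges are disjoint, the search is `O(log n)` in the number of chunks (not nodes), and the within-chunk step is constant time.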
73
+
74
+ We then provide two implementations of a sequence chunk:
75
+
76
+ - Our existing node implementation, which makes a chunk with a single node in it ([example of this in rust](https://github.com/CraigMacomber/sequence/blob/ada56998a853ea19b5a6536e7b79771400044bc0/src/basic_indirect.rs#L61))
77
+ - A structurally compressed uniform sequence chunk ([example of this in rust](https://github.com/CraigMacomber/sequence/blob/ada56998a853ea19b5a6536e7b79771400044bc0/src/chunk.rs#L139))
78
+
79
+ ## Structurally compressed sequence chunks
80
+
81
+ A structurally compressed sequence chunk, or just `UniformChunk` for short in this context, has 4 parts:
82
+
83
+ 1. `data`: a flat array of values from the trees in the chunk, in pre-order traversal order.
84
+ 2. `schema`: a description of the "shape" of a tree. (Ex: [RootChunkSchema](https://github.com/CraigMacomber/sequence/blob/ada56998a853ea19b5a6536e7b79771400044bc0/src/chunk.rs#L24))
85
+ 3. `UniformChunk` itself which implements the Chunk abstraction, and is a value in the b-tree. ([UniformChunk](https://github.com/CraigMacomber/sequence/blob/ada56998a853ea19b5a6536e7b79771400044bc0/src/chunk.rs#L13)). This works by having a reference to the data, and the schema, and using the schema to index into the data when actual values are required.
86
+ 4. `ChunkId`: the actual id. Since the b-tree stores the chunk under its ChunkId, it's not necessary to store the id in the `UniformChunk`.
87
+
88
+ The `data` is logically owned by the `UniformChunk`, though it may optionally be stored out of line.
89
+ The `schema` is referenced by the `UniformChunk`, but most of its data should be deduplicated: many `UniformChunk`s will have identical schema.
90
+ The `schema` information can be factored into two parts:
91
+
92
+ 1. Minimal description of the tree shape:
93
+ - definition
94
+ - does it have a value? (if using a byte array, how long is the value)
95
+ - list of traits (each with a trait id, a child count, and a reference to a `schema` for the child)
96
+ 2. Derived data used to accelerate lookups. This is omitted from the serialized version of UniformChunk, and is not detailed here. The short version: store whatever is needed to optimize the implementation of Chunk that UniformChunk provides. This is only needed on the root schema, and is why [RootChunkSchema](https://github.com/CraigMacomber/sequence/blob/main/src/chunk.rs#L24) is distinct from [ChunkSchema](https://github.com/CraigMacomber/sequence/blob/ada56998a853ea19b5a6536e7b79771400044bc0/src/chunk.rs#L106). Things like tables to look up schema or parent info from `idOffset % stride` (where stride is the number of nodes in the schema for a single tree in the top-level sequence) belong there.
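As a sketch of that derived data, a flat per-tree table indexed by position within one tree supports constant-time lookups; the `NodeInfo` shape and names here are illustrative assumptions, not the actual structures:

```typescript
// Hypothetical per-tree lookup table, built once per root schema.
// Index i holds info for the node at position i within one tree
// of the uniform sequence (in pre-order).
interface NodeInfo {
	definition: string;
	hasValue: boolean;
	// Position of the parent within the same tree, or undefined for the tree's root.
	parentPosition: number | undefined;
}

// `table.length` is the stride: the node count of one tree in the sequence.
function lookupNodeInfo(table: readonly NodeInfo[], idOffset: number): NodeInfo {
	return table[idOffset % table.length];
}
```

For a sequence of `Point { x, y }` trees (stride 3, pre-order positions: `Point`, `x`, `y`), an `idOffset` of 4 resolves to the `x` node of the second tree.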
97
+
98
+ Another way to think about this is that there are 3 classes of data for nodes in a uniform chunk:
99
+
100
+ 1. values: stored in the data array as part of the uniform chunk.
101
+ 2. shape/schema: includes definition, traits, parentage etc. Stored in the `schema` and deduplicated, and referenced from the chunk.
102
+ 3. identifiers: inferred based off the chunk's id.
103
+
104
+ This format can only encode sequence chunks which have the following properties:
105
+
106
+ 1. All ids are in pre-order traversal order.
107
+ 2. All trees in the top level sequence have identical shape/schema. This includes:
108
+ - definitions
109
+ - number of children and their shape (recursively) in each trait.
110
+ - presence or absence of a value (and size if using an encoding like a byte array where size impacts the access to other values).
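Property 2 can be checked with a simple recursive shape comparison. This sketch uses an illustrative `SimpleTree` stand-in for the real node type and compares traits positionally, both of which are assumptions for brevity:

```typescript
// Illustrative stand-in for the real node type.
interface SimpleTree {
	definition: string;
	value?: unknown;
	traits: { label: string; children: SimpleTree[] }[];
}

// Two trees have the same shape if definitions, value presence, trait labels,
// and child counts all match, recursively. Value *contents* may differ.
function sameShape(a: SimpleTree, b: SimpleTree): boolean {
	if (a.definition !== b.definition) return false;
	if ((a.value !== undefined) !== (b.value !== undefined)) return false;
	if (a.traits.length !== b.traits.length) return false;
	return a.traits.every((trait, i) => {
		const other = b.traits[i];
		return (
			trait.label === other.label &&
			trait.children.length === other.children.length &&
			trait.children.every((child, j) => sameShape(child, other.children[j]))
		);
	});
}
```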
111
+
112
+ ### Serialized Format
113
+
114
+ Assuming JavaScript/JSON here, with a JavaScript array of values rather than some packed byte array.
115
+
116
+ For nodes we currently have `TreeNode` as defined by: (see actual source for documentation)
117
+
118
+ ```typescript
119
+ export interface TraitMap<TChild> {
120
+ readonly [key: string]: TreeNodeSequence<TChild>;
121
+ }
122
+
123
+ export type TreeNodeSequence<TChild> = readonly TChild[];
124
+
125
+ export interface NodeData {
126
+ readonly payload?: Payload;
127
+ readonly definition: Definition;
128
+ readonly identifier: NodeId;
129
+ }
130
+
131
+ export interface TreeNode<TChild> extends NodeData {
132
+ readonly traits: TraitMap<TChild>;
133
+ }
134
+ ```
135
+
136
+ In memory (in the b-tree), `TChild` is a `NodeId`; when persisted, it's a recursive Node type:
137
+
138
+ ```typescript
139
+ export type ChangeNode = TreeNode<ChangeNode>;
140
+ ```
141
+
142
+ Adding support for UniformChunks means that `TChild` will become a `ChunkId` in memory/b-tree, and when persisted will become a slightly different recursive type:
143
+
144
+ ```typescript
145
+ // ChunkSchemaId is used as index into interner/dedup table of ChunkSchema<ChunkSchemaId>
146
+ export type ChangeNode = TreeNode<ChangeNode | UniformChunk<ChunkSchemaId>>;
147
+
148
+ export interface UniformChunk<TSchema> {
149
+ readonly data: Payload[];
150
+ // Schema. Runtime version would have some extra cached info compared to persisted.
151
+ readonly schema: TSchema;
152
+ // Id of this chunk, and also NodeId of first node in this chunk. Only needed here in persisted case.
153
+ readonly identifier: ChunkId;
154
+ }
155
+
156
+ export interface ChunkSchema<TChild> {
157
+ readonly hasValue: boolean;
158
+ readonly definition: Definition;
159
+ // `traits` has to be a list not an object/map because order matters and objects might not preserve order through json.
160
+ readonly traits: readonly { readonly label: string; readonly schema: TChild; readonly count: number }[];
161
+ }
162
+ ```
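As a concrete illustration of the types above, the following sketch builds a dedup table of `ChunkSchema` entries and a `UniformChunk` holding two `Point` trees. The concrete aliases (`Definition`, `Payload`, the id types) and the `Point`/`Number` schemas are illustrative assumptions:

```typescript
// Simplified stand-ins for the real types; shapes mirror the interfaces above.
type Definition = string;
type Payload = number;
type ChunkId = number;
type ChunkSchemaId = number;

interface ChunkSchema<TChild> {
	readonly hasValue: boolean;
	readonly definition: Definition;
	readonly traits: readonly { readonly label: string; readonly schema: TChild; readonly count: number }[];
}

interface UniformChunk<TSchema> {
	readonly data: Payload[];
	readonly schema: TSchema;
	readonly identifier: ChunkId;
}

// Interner/dedup table: an entry's index is its ChunkSchemaId.
const schemaTable: ChunkSchema<ChunkSchemaId>[] = [];
const numberLeaf: ChunkSchemaId =
	schemaTable.push({ hasValue: true, definition: "Number", traits: [] }) - 1;
const point: ChunkSchemaId =
	schemaTable.push({
		hasValue: false,
		definition: "Point",
		traits: [
			{ label: "x", schema: numberLeaf, count: 1 },
			{ label: "y", schema: numberLeaf, count: 1 },
		],
	}) - 1;

// Two Point trees; each tree is 3 nodes (Point, x, y), so the chunk covers
// node ids 100..105. Only the leaf values land in `data`, in pre-order:
// x0, y0, x1, y1.
const chunk: UniformChunk<ChunkSchemaId> = {
	data: [1, 2, 3, 4],
	schema: point,
	identifier: 100,
};
```

Note how the six nodes cost only four payload entries, one shared schema id, and one chunk id: everything else is inferred.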
163
+
164
+ ### Finding when data should be chunked
165
+
166
+ Eventually the cost of deciding when to chunk data can mostly be paid by the client inserting the data, and it can stay chunked through the whole system.
167
+ This would make the decision relatively easy and efficient, since the client doing the insert most likely has schema for the data, and when appropriate, translating it into shared-tree chunks instead of nodes should be straightforward.
168
+
169
+ That said, an algorithm to determine when to chunk data that does not need help from the client is practical.
170
+ We can use a recursive algorithm, which returns either the shape of the subtree (indicating it should be part of a uniform chunk) or an actual encoded version of the tree (which may or may not be compressed).
171
+
172
+ - As the base case, all leaves can be chunked: return their trivial shape.
173
+ - For sequences (in traits):
174
+ - if length 1: chunk it: return the shape indicating to make it part of a larger UniformChunk if possible
175
+ - if length > 1: as a heuristic, break it into a minimal number of UniformChunks, and do not chunk anything above this in the tree.
176
+ - For non-leaf nodes:
177
+ - If any child sequence/trait returns "do not chunk", make a normal node.
178
+ - If its ids are not sequential in pre-order traversal order, make a normal node (though sections of its children might be chunks)
179
+ - Otherwise, return a shape, so this node can be part of a larger chunk.
180
+
181
+ This should be possible in `O(size of tree)` time and `O(size of tree)` space, and thus not change the asymptotic complexity of processing if used as part of serialization or change processing.
182
+
183
+ Note that the one place we sometimes decide to not chunk a tree where it might be possible to chunk it is parents of uniform sequences of length greater than 1.
184
+ This heuristic helps prevent very large numbers of schemas being generated in cases where the number of children is variable (eventually we plan to store and reuse the schemas long term with stable ids, so accumulating lots of them would be bad).
185
+ Depending on context a different heuristic might be desirable, but this one is simple and should work pretty well in practice.
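The recursive pass above can be sketched as follows. The `Tree` type, the precomputed `idsSequentialPreOrder` flag, and the use of a string as the canonical shape key are all illustrative assumptions; a real implementation would return actual schema objects and emit the chunks:

```typescript
// Illustrative input type; id ordering is assumed precomputed per subtree.
interface Tree {
	definition: string;
	value?: unknown;
	traits: { label: string; children: Tree[] }[];
	idsSequentialPreOrder: boolean;
}

// Returns a canonical shape key if the subtree can join a larger
// UniformChunk, or undefined for "do not chunk" (the parent stays a
// normal node). Leaves hit the base case: no traits, so a trivial shape.
function shapeOf(tree: Tree): string | undefined {
	if (!tree.idsSequentialPreOrder) return undefined;
	const traitShapes: string[] = [];
	for (const trait of tree.traits) {
		if (trait.children.length !== 1) {
			// Heuristic: longer sequences become their own chunks, so this
			// parent is not folded into a larger chunk.
			return undefined;
		}
		const childShape = shapeOf(trait.children[0]);
		if (childShape === undefined) return undefined;
		traitShapes.push(`${trait.label}:${childShape}`);
	}
	return `${tree.definition}/${tree.value !== undefined}<${traitShapes.join(",")}>`;
}
```

Two subtrees produce the same key exactly when they satisfy the shape-equality properties listed earlier, so the keys can serve as dedup-table lookups.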
186
+
187
+ ## Future Options
188
+
189
+ Since the ChunkSchema are deduplicated, their size should not matter much, so it should be fine to use the somewhat verbose types above.
190
+ In the future we might introduce an alternative encoding of them that is more compressed, if they end up being a large portion of documents:
191
+ note that this makes the data non-human readable and makes the serialized types less in line with the in memory ones, so it is a tradeoff.
192
+ We will likely want to keep support for the human readable format long term, at least for debug-ability.
193
+
194
+ There are lots of possible generalizations of the UniformChunk format (ex: allowing references to subtrees by chunk id in the value array, or allowing variable sized traits).
195
+ These can be added as additional Chunk formats later, or as generalizations to UniformChunk.
@@ -134,6 +134,7 @@ describe('SharedTree history virtualization', () => {
134
134
  expect(editChunks[1].chunk.length).to.equal(10);
135
135
  });
136
136
  it('correctly saves handles and their corresponding starting revisions to the summary', async () => {
137
+ testObjectProvider.logger.registerExpectedEvent({ eventName: 'fluid:telemetry:Batching:LengthTooBig' }, { eventName: 'fluid:telemetry:Batching:LengthTooBig' });
137
138
  await processNewEditChunks(4);
138
139
  const { editHistory } = fullHistorySummarizer_0_1_0(sharedTree.edits, sharedTree.currentView);
139
140
  const { editChunks } = assertNotUndefined(editHistory);
1
+ {"version":3,"file":"Virtualization.tests.js","sourceRoot":"","sources":["../../src/test/Virtualization.tests.ts"],"names":[],"mappings":"AAAA;;;GAGG;AAEH,OAAO,EAAE,MAAM,EAAE,MAAM,MAAM,CAAC;AAG9B,OAAO,EAAuB,OAAO,EAAE,2BAA2B,EAAE,MAAM,YAAY,CAAC;AACvF,OAAO,EAAc,QAAQ,EAAU,MAAM,kBAAkB,CAAC;AAChE,OAAO,EAAE,kBAAkB,EAAE,MAAM,WAAW,CAAC;AAE/C,OAAO,EAAE,WAAW,EAAE,MAAM,gBAAgB,CAAC;AAC7C,OAAO,EAAE,yBAAyB,EAAE,MAAM,8BAA8B,CAAC;AACzE,OAAO,EAAE,iBAAiB,EAAE,YAAY,EAAE,8BAA8B,EAAE,SAAS,EAAE,MAAM,2BAA2B,CAAC;AAEvH,QAAQ,CAAC,mCAAmC,EAAE,GAAG,EAAE;IAClD,IAAI,UAAsB,CAAC;IAC3B,IAAI,kBAAsC,CAAC;IAE3C,iDAAiD;IACjD,MAAM,gBAAgB,GAAoC;QACzD,WAAW,EAAE,WAAW;QACxB,OAAO,EAAE,OAAO;QAChB,cAAc,EAAE,iBAAiB,CAAC,GAAG,CAAC;KACtC,CAAC;IAEF,UAAU,CAAC,KAAK,IAAI,EAAE;QACrB,MAAM,iBAAiB,GAAG,MAAM,8BAA8B,CAAC;YAC9D,gBAAgB,EAAE,IAAI;SACtB,CAAC,CAAC;QACH,UAAU,GAAG,iBAAiB,CAAC,IAAI,CAAC;QACpC,kBAAkB,GAAG,iBAAiB,CAAC,kBAAkB,CAAC;IAC3D,CAAC,CAAC,CAAC;IAEH,SAAS,CAAC,KAAK,IAAI,EAAE;QACpB,kBAAkB,CAAC,KAAK,EAAE,CAAC;IAC5B,CAAC,CAAC,CAAC;IAEH,uEAAuE;IACvE,MAAM,oBAAoB,GAAG,KAAK,EAAE,cAAc,GAAG,CAAC,EAAE,EAAE;QACzD,MAAM,aAAa,GAAmB,EAAE,CAAC;QAEzC,yCAAyC;QACzC,OAAO,aAAa,CAAC,MAAM,GAAI,UAAU,CAAC,KAAyB,CAAC,aAAa,GAAG,cAAc,EAAE;YACnG,MAAM,IAAI,GAAG,OAAO,CAAC,QAAQ,CAAC,SAAS,EAAE,CAAC,YAAY,EAAE,CAAC,CAAC,CAAC,CAAC;YAC5D,aAAa,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;YACzB,UAAU,CAAC,gBAAgB,CAAC,IAAI,CAAC,CAAC;SAClC;QAED,2EAA2E;QAC3E,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;QAE9C,OAAO,aAAa,CAAC;IACtB,CAAC,CAAC;IAEF,EAAE,CAAC,qDAAqD,EAAE,KAAK,IAAI,EAAE;QACpE,MAAM,aAAa,GAAmB,MAAM,oBAAoB,EAAE,CAAC;QAEnE,MAAM,OAAO,GAAG,2BAA2B,CAAC,UAAU,CAAC,KAAK,EAAE,UAAU,CAAC,WAAW,CAAC,CAAC;QAEtF,MAAM,EAAE,WAAW,EAAE,GAAG,OAAO,CAAC;QAChC,MAAM,EAAE,UAAU,EAAE,GAAG,kBAAkB,CAAC,WAAW,CAAC,CAAC;QACvD,MAAM,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;QACtC,MAAM,CAAC,OAAQ,UAAU,CAAC,CAAC,CAAC,CAAC,KAAoB,CAAC,GAAG,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,UAAU,CAAC,CAAC;QAE5E,uCAAuC;QACvC,MAAM,EAAE,IAAI,EAAE,WAAW,EAAE,GAAG,MAAM,8BAA8B,CAAC,EAAE,kBAAkB,EAAE,CAAC,CAAC;QAE
3F,WAAW,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC;QAEjC,uCAAuC;QACvC,MAAM,CAAC,CAAC,MAAM,WAAW,CAAC,KAAK,CAAC,cAAc,CAAC,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,aAAa,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC;IACtF,CAAC,CAAC,CAAC;IAEH,EAAE,CAAC,0BAA0B,EAAE,KAAK,IAAI,EAAE;QACzC,IAAI,oBAAoB,GAAG,CAAC,CAAC;QAC7B,UAAU,CAAC,EAAE,CAAC,yBAAyB,CAAC,mBAAmB,EAAE,GAAG,EAAE;YACjE,oBAAoB,EAAE,CAAC;QACxB,CAAC,CAAC,CAAC;QAEH,0EAA0E;QAC1E,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;QAE9C,UAAU,CAAC,WAAW,CAAC,gBAAgB,CAAC,CAAC;QAEzC,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;QAC9C,MAAM,CAAC,oBAAoB,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;QAEzC,MAAM,EAAE,WAAW,EAAE,GAAG,2BAA2B,CAAC,UAAU,CAAC,KAAK,EAAE,UAAU,CAAC,WAAW,CAAC,CAAC;QAC9F,MAAM,EAAE,UAAU,EAAE,GAAG,kBAAkB,CAAC,WAAW,CAAC,CAAC;QACvD,MAAM,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;QACtC,MAAM,CAAC,OAAQ,UAAU,CAAC,CAAC,CAAC,CAAC,KAAoB,CAAC,GAAG,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,UAAU,CAAC,CAAC;IAC7E,CAAC,CAAC,CAAC;IAEH,EAAE,CAAC,4CAA4C,EAAE,KAAK,IAAI,EAAE;QAC3D,8BAA8B;QAC9B,MAAM,EAAE,IAAI,EAAE,WAAW,EAAE,GAAG,MAAM,8BAA8B,CAAC;YAClE,kBAAkB;YAClB,gBAAgB,EAAE,IAAI;SACtB,CAAC,CAAC;QACH,MAAM,EAAE,IAAI,EAAE,WAAW,EAAE,GAAG,MAAM,8BAA8B,CAAC;YAClE,kBAAkB;YAClB,gBAAgB,EAAE,IAAI;SACtB,CAAC,CAAC;QAEH,IAAI,oBAAoB,GAAG,CAAC,CAAC;QAC7B,UAAU,CAAC,EAAE,CAAC,yBAAyB,CAAC,mBAAmB,EAAE,GAAG,EAAE;YACjE,oBAAoB,EAAE,CAAC;QACxB,CAAC,CAAC,CAAC;QACH,WAAW,CAAC,EAAE,CAAC,yBAAyB,CAAC,mBAAmB,EAAE,GAAG,EAAE;YAClE,oBAAoB,EAAE,CAAC;QACxB,CAAC,CAAC,CAAC;QACH,WAAW,CAAC,EAAE,CAAC,yBAAyB,CAAC,mBAAmB,EAAE,GAAG,EAAE;YAClE,oBAAoB,EAAE,CAAC;QACxB,CAAC,CAAC,CAAC;QAEH,0DAA0D;QAC1D,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;QAE9C,yCAAyC;QACzC,UAAU,CAAC,WAAW,CAAC,gBAAgB,CAAC,CAAC;QACzC,WAAW,CAAC,WAAW,CAAC,gBAAgB,CAAC,CAAC;QAC1C,WAAW,CAAC,WAAW,CAAC,gBAAgB,CAAC,CAAC;QAE1C,sDAAsD;QACtD,MAAM,IAAI,OAAO,CAAC,CAAC,OAAO,EAAE,EAAE,CAAC,YAAY,CAAC,OAAO,CAAC,CAAC,CAAC;QACtD,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;QAC9C,MAAM,CAAC,oBAAoB,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;QAEzC,yCAAyC;QACzC,MAAM,CAAC,UAAU
,CAAC,MAAM,CAAC,WAAW,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC;QAClD,MAAM,CAAC,UAAU,CAAC,MAAM,CAAC,WAAW,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC;IACnD,CAAC,CAAC,CAAC;IAEH,EAAE,CAAC,kCAAkC,EAAE,KAAK,IAAI,EAAE;QACjD,MAAM,IAAI,GAAG,OAAO,CAAC,QAAQ,CAAC,SAAS,EAAE,CAAC,YAAY,EAAE,CAAC,CAAC,CAAC,CAAC;QAC5D,UAAU,CAAC,gBAAgB,CAAC,IAAI,CAAC,CAAC;QAElC,0EAA0E;QAC1E,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;QAE9C,MAAM,EAAE,WAAW,EAAE,GAAG,2BAA2B,CAAC,UAAU,CAAC,KAAK,EAAE,UAAU,CAAC,WAAW,CAAC,CAAC;QAC9F,MAAM,EAAE,UAAU,EAAE,GAAG,kBAAkB,CAAC,WAAW,CAAC,CAAC;QACvD,MAAM,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;QAEtC,iEAAiE;QACjE,MAAM,EAAE,KAAK,EAAE,GAAG,UAAU,CAAC,CAAC,CAAC,CAAC;QAChC,MAAM,CAAC,KAAK,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC;QACxC,MAAM,CAAE,KAAiC,CAAC,MAAM,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;IAC/D,CAAC,CAAC,CAAC;IAEH,EAAE,CAAC,+DAA+D,EAAE,KAAK,IAAI,EAAE;QAC9E,MAAM,aAAa,GAAmB,EAAE,CAAC;QAEzC,yCAAyC;QACzC,OAAO,aAAa,CAAC,MAAM,GAAI,UAAU,CAAC,KAAyB,CAAC,aAAa,GAAG,EAAE,EAAE;YACvF,MAAM,IAAI,GAAG,OAAO,CAAC,QAAQ,CAAC,SAAS,EAAE,CAAC,YAAY,EAAE,CAAC,CAAC,CAAC,CAAC;YAC5D,aAAa,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;YACzB,UAAU,CAAC,gBAAgB,CAAC,IAAI,CAAC,CAAC;SAClC;QAED,2EAA2E;QAC3E,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;QAE9C,MAAM,EAAE,WAAW,EAAE,GAAG,2BAA2B,CAAC,UAAU,CAAC,KAAK,EAAE,UAAU,CAAC,WAAW,CAAC,CAAC;QAC9F,MAAM,EAAE,UAAU,EAAE,GAAG,kBAAkB,CAAC,WAAW,CAAC,CAAC;QACvD,MAAM,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;QACtC,MAAM,CAAC,OAAQ,UAAU,CAAC,CAAC,CAAC,CAAC,KAAoB,CAAC,GAAG,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,UAAU,CAAC,CAAC;QAC5E,MAAM,CAAC,KAAK,CAAC,OAAO,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC;QACtD,MAAM,CAAE,UAAU,CAAC,CAAC,CAAC,CAAC,KAAiC,CAAC,MAAM,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,EAAE,CAAC,CAAC;IAC9E,CAAC,CAAC,CAAC;IAEH,EAAE,CAAC,mFAAmF,EAAE,KAAK,IAAI,EAAE;QAClG,kBAAkB,CAAC,MAAM,CAAC,qBAAqB,CAC9C,EAAE,SAAS,EAAE,uCAAuC,EAAE,EACtD,EAAE,SAAS,EAAE,uCAAuC,EAAE,CACtD,CAAC;QACF,MAAM,oBAAoB,CAAC,CA
AC,CAAC,CAAC;QAE9B,MAAM,EAAE,WAAW,EAAE,GAAG,2BAA2B,CAAC,UAAU,CAAC,KAAK,EAAE,UAAU,CAAC,WAAW,CAAC,CAAC;QAC9F,MAAM,EAAE,UAAU,EAAE,GAAG,kBAAkB,CAAC,WAAW,CAAC,CAAC;QACvD,MAAM,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;QAEtC,wFAAwF;QACxF,UAAU,CAAC,OAAO,CAAC,CAAC,EAAE,aAAa,EAAE,KAAK,EAAE,EAAE,KAAK,EAAE,EAAE;YACtD,MAAM,CAAC,aAAa,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,KAAK,GAAI,UAAU,CAAC,KAAyB,CAAC,aAAa,CAAC,CAAC;YAC5F,MAAM,CAAC,OAAQ,KAAoB,CAAC,GAAG,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,UAAU,CAAC,CAAC;QAC/D,CAAC,CAAC,CAAC;IACJ,CAAC,CAAC,CAAC;IAEH,EAAE,CAAC,gEAAgE,EAAE,KAAK,IAAI,EAAE;;QAC/E,MAAM,EAAE,IAAI,EAAE,WAAW,EAAE,GAAG,MAAM,8BAA8B,CAAC;YAClE,kBAAkB;YAClB,gBAAgB,EAAE,IAAI;SACtB,CAAC,CAAC;QACH,MAAM,EAAE,IAAI,EAAE,WAAW,EAAE,GAAG,MAAM,8BAA8B,CAAC;YAClE,kBAAkB;YAClB,gBAAgB,EAAE,IAAI;SACtB,CAAC,CAAC;QAEH,kDAAkD;QAClD,MAAM,OACL,2BAA2B,CAAC,UAAU,CAAC,KAAK,EAAE,UAAU,CAAC,WAAW,CAAC,CAAC,WAAW,0CAAE,UAAU,CAAC,MAAM,CACpG,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;QACd,MAAM,OACL,2BAA2B,CAAC,WAAW,CAAC,KAAK,EAAE,WAAW,CAAC,WAAW,CAAC,CAAC,WAAW,0CAAE,UAAU,CAAC,MAAM,CACtG,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;QACd,MAAM,OACL,2BAA2B,CAAC,WAAW,CAAC,KAAK,EAAE,WAAW,CAAC,WAAW,CAAC,CAAC,WAAW,0CAAE,UAAU,CAAC,MAAM,CACtG,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;QAEd,MAAM,oBAAoB,EAAE,CAAC;QAE7B,8CAA8C;QAC9C,MAAM,iBAAiB,GAAG,2BAA2B,CAAC,UAAU,CAAC,KAAK,EAAE,UAAU,CAAC,WAAW,CAAC,CAAC;QAChG,MAAM,kBAAkB,GAAG,2BAA2B,CAAC,WAAW,CAAC,KAAK,EAAE,WAAW,CAAC,WAAW,CAAC,CAAC;QACnG,MAAM,kBAAkB,GAAG,2BAA2B,CAAC,WAAW,CAAC,KAAK,EAAE,WAAW,CAAC,WAAW,CAAC,CAAC;QACnG,MAAM,eAAe,GAAG,kBAAkB,CAAC,iBAAiB,CAAC,WAAW,CAAC,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC;QAC9F,MAAM,gBAAgB,GAAG,kBAAkB,CAAC,kBAAkB,CAAC,WAAW,CAAC,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC;QAChG,MAAM,gBAAgB,GAAG,kBAAkB,CAAC,kBAAkB,CAAC,WAAW,CAAC,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC;QAEhG,2DAA2D;QAC3D,MAAM,CAAC,OAAQ,eAA8B,CAAC,GAAG,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,UAAU,CAAC,CAAC;QAExE,MAAM,qBAAqB,GAAI,eAAuB,CAAC,YAAY,CAAC;QACpE,MAAM,sBAAsB,GAAI,gBAAwB,CAAC,YAAY,C
AAC;QACtE,MAAM,sBAAsB,GAAI,gBAAwB,CAAC,YAAY,CAAC;QAEtE,kEAAkE;QAClE,MAAM,CAAC,OAAO,qBAAqB,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,QAAQ,CAAC,CAAC;QAExD,MAAM,CAAC,qBAAqB,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,sBAAsB,CAAC,CAAC;QAC/D,MAAM,CAAC,sBAAsB,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,sBAAsB,CAAC,CAAC;IACjE,CAAC,CAAC,CAAC;AACJ,CAAC,CAAC,CAAC","sourcesContent":["/*!\n * Copyright (c) Microsoft Corporation and contributors. All rights reserved.\n * Licensed under the MIT License.\n */\n\nimport { expect } from 'chai';\nimport { TestObjectProvider } from '@fluidframework/test-utils';\nimport { EditHandle, EditLog } from '../EditLog';\nimport { Edit, EditWithoutId, newEdit, fullHistorySummarizer_0_1_0 } from '../generic';\nimport { SharedTree, setTrait, Change } from '../default-edits';\nimport { assertNotUndefined } from '../Common';\nimport { SharedTreeSummary_0_0_2 } from '../SummaryBackCompatibility';\nimport { initialTree } from '../InitialTree';\nimport { SharedTreeDiagnosticEvent } from '../generic/GenericSharedTree';\nimport { createStableEdits, makeTestNode, setUpLocalServerTestSharedTree, testTrait } from './utilities/TestUtilities';\n\ndescribe('SharedTree history virtualization', () => {\n\tlet sharedTree: SharedTree;\n\tlet testObjectProvider: TestObjectProvider;\n\n\t// Create a summary used to test catchup blobbing\n\tconst summaryToCatchUp: SharedTreeSummary_0_0_2<Change> = {\n\t\tcurrentTree: initialTree,\n\t\tversion: '0.0.2',\n\t\tsequencedEdits: createStableEdits(250),\n\t};\n\n\tbeforeEach(async () => {\n\t\tconst testingComponents = await setUpLocalServerTestSharedTree({\n\t\t\tsummarizeHistory: true,\n\t\t});\n\t\tsharedTree = testingComponents.tree;\n\t\ttestObjectProvider = testingComponents.testObjectProvider;\n\t});\n\n\tafterEach(async () => {\n\t\ttestObjectProvider.reset();\n\t});\n\n\t// Adds edits to sharedTree1 to make up the specified number of chunks.\n\tconst processNewEditChunks = async (numberOfChunks = 1) => {\n\t\tconst expectedEdits: Edit<Change>[] = 
[];\n\n\t\t// Add some edits to create a chunk with.\n\t\twhile (expectedEdits.length < (sharedTree.edits as EditLog<Change>).editsPerChunk * numberOfChunks) {\n\t\t\tconst edit = newEdit(setTrait(testTrait, [makeTestNode()]));\n\t\t\texpectedEdits.push(edit);\n\t\t\tsharedTree.processLocalEdit(edit);\n\t\t}\n\n\t\t// Wait for the ops to to be submitted and processed across the containers.\n\t\tawait testObjectProvider.ensureSynchronized();\n\n\t\treturn expectedEdits;\n\t};\n\n\tit('can upload edit chunks and load chunks from handles', async () => {\n\t\tconst expectedEdits: Edit<Change>[] = await processNewEditChunks();\n\n\t\tconst summary = fullHistorySummarizer_0_1_0(sharedTree.edits, sharedTree.currentView);\n\n\t\tconst { editHistory } = summary;\n\t\tconst { editChunks } = assertNotUndefined(editHistory);\n\t\texpect(editChunks.length).to.equal(1);\n\t\texpect(typeof (editChunks[0].chunk as EditHandle).get).to.equal('function');\n\n\t\t// Load a second tree using the summary\n\t\tconst { tree: sharedTree2 } = await setUpLocalServerTestSharedTree({ testObjectProvider });\n\n\t\tsharedTree2.loadSummary(summary);\n\n\t\t// Ensure chunked edit can be retrieved\n\t\texpect((await sharedTree2.edits.getEditAtIndex(2)).id).to.equal(expectedEdits[2].id);\n\t});\n\n\tit('can upload catchup blobs', async () => {\n\t\tlet catchUpBlobsUploaded = 0;\n\t\tsharedTree.on(SharedTreeDiagnosticEvent.CatchUpBlobUploaded, () => {\n\t\t\tcatchUpBlobsUploaded++;\n\t\t});\n\n\t\t// Wait for the op to to be submitted and processed across the containers.\n\t\tawait testObjectProvider.ensureSynchronized();\n\n\t\tsharedTree.loadSummary(summaryToCatchUp);\n\n\t\tawait testObjectProvider.ensureSynchronized();\n\t\texpect(catchUpBlobsUploaded).to.equal(1);\n\n\t\tconst { editHistory } = fullHistorySummarizer_0_1_0(sharedTree.edits, sharedTree.currentView);\n\t\tconst { editChunks } = assertNotUndefined(editHistory);\n\t\texpect(editChunks.length).to.equal(1);\n\t\texpect(typeof 
(editChunks[0].chunk as EditHandle).get).to.equal('function');\n\t});\n\n\tit('only uploads catchup blobs from one client', async () => {\n\t\t// Create more connected trees\n\t\tconst { tree: sharedTree2 } = await setUpLocalServerTestSharedTree({\n\t\t\ttestObjectProvider,\n\t\t\tsummarizeHistory: true,\n\t\t});\n\t\tconst { tree: sharedTree3 } = await setUpLocalServerTestSharedTree({\n\t\t\ttestObjectProvider,\n\t\t\tsummarizeHistory: true,\n\t\t});\n\n\t\tlet catchUpBlobsUploaded = 0;\n\t\tsharedTree.on(SharedTreeDiagnosticEvent.CatchUpBlobUploaded, () => {\n\t\t\tcatchUpBlobsUploaded++;\n\t\t});\n\t\tsharedTree2.on(SharedTreeDiagnosticEvent.CatchUpBlobUploaded, () => {\n\t\t\tcatchUpBlobsUploaded++;\n\t\t});\n\t\tsharedTree3.on(SharedTreeDiagnosticEvent.CatchUpBlobUploaded, () => {\n\t\t\tcatchUpBlobsUploaded++;\n\t\t});\n\n\t\t// Wait for processing again in case there are more no ops\n\t\tawait testObjectProvider.ensureSynchronized();\n\n\t\t// Try to load summaries on all the trees\n\t\tsharedTree.loadSummary(summaryToCatchUp);\n\t\tsharedTree2.loadSummary(summaryToCatchUp);\n\t\tsharedTree3.loadSummary(summaryToCatchUp);\n\n\t\t// `ensureSynchronized` does not guarantee blob upload\n\t\tawait new Promise((resolve) => setImmediate(resolve));\n\t\tawait testObjectProvider.ensureSynchronized();\n\t\texpect(catchUpBlobsUploaded).to.equal(1);\n\n\t\t// Make sure the trees are still the same\n\t\texpect(sharedTree.equals(sharedTree2)).to.be.true;\n\t\texpect(sharedTree.equals(sharedTree3)).to.be.true;\n\t});\n\n\tit(\"doesn't upload incomplete chunks\", async () => {\n\t\tconst edit = newEdit(setTrait(testTrait, [makeTestNode()]));\n\t\tsharedTree.processLocalEdit(edit);\n\n\t\t// Wait for the op to to be submitted and processed across the containers.\n\t\tawait testObjectProvider.ensureSynchronized();\n\n\t\tconst { editHistory } = fullHistorySummarizer_0_1_0(sharedTree.edits, sharedTree.currentView);\n\t\tconst { editChunks } = 
assertNotUndefined(editHistory);\n\t\texpect(editChunks.length).to.equal(1);\n\n\t\t// The chunk given by the summary should be an array of length 1.\n\t\tconst { chunk } = editChunks[0];\n\t\texpect(Array.isArray(chunk)).to.be.true;\n\t\texpect((chunk as EditWithoutId<Change>[]).length).to.equal(1);\n\t});\n\n\tit('can upload full chunks with incomplete chunks in the edit log', async () => {\n\t\tconst expectedEdits: Edit<Change>[] = [];\n\n\t\t// Add some edits to create a chunk with.\n\t\twhile (expectedEdits.length < (sharedTree.edits as EditLog<Change>).editsPerChunk + 10) {\n\t\t\tconst edit = newEdit(setTrait(testTrait, [makeTestNode()]));\n\t\t\texpectedEdits.push(edit);\n\t\t\tsharedTree.processLocalEdit(edit);\n\t\t}\n\n\t\t// Wait for the ops to to be submitted and processed across the containers.\n\t\tawait testObjectProvider.ensureSynchronized();\n\n\t\tconst { editHistory } = fullHistorySummarizer_0_1_0(sharedTree.edits, sharedTree.currentView);\n\t\tconst { editChunks } = assertNotUndefined(editHistory);\n\t\texpect(editChunks.length).to.equal(2);\n\t\texpect(typeof (editChunks[0].chunk as EditHandle).get).to.equal('function');\n\t\texpect(Array.isArray(editChunks[1].chunk)).to.be.true;\n\t\texpect((editChunks[1].chunk as EditWithoutId<Change>[]).length).to.equal(10);\n\t});\n\n\tit('correctly saves handles and their corresponding starting revisions to the summary', async () => {\n\t\ttestObjectProvider.logger.registerExpectedEvent(\n\t\t\t{ eventName: 'fluid:telemetry:Batching:LengthTooBig' },\n\t\t\t{ eventName: 'fluid:telemetry:Batching:LengthTooBig' }\n\t\t);\n\t\tawait processNewEditChunks(4);\n\n\t\tconst { editHistory } = fullHistorySummarizer_0_1_0(sharedTree.edits, sharedTree.currentView);\n\t\tconst { editChunks } = assertNotUndefined(editHistory);\n\t\texpect(editChunks.length).to.equal(4);\n\n\t\t// Make sure each starting revision is correct and each chunk in the summary is a handle\n\t\teditChunks.forEach(({ startRevision, chunk }, 
index) => {\n\t\t\texpect(startRevision).to.equal(index * (sharedTree.edits as EditLog<Change>).editsPerChunk);\n\t\t\texpect(typeof (chunk as EditHandle).get).to.equal('function');\n\t\t});\n\t});\n\n\tit('sends handle ops to connected clients when chunks are uploaded', async () => {\n\t\tconst { tree: sharedTree2 } = await setUpLocalServerTestSharedTree({\n\t\t\ttestObjectProvider,\n\t\t\tsummarizeHistory: true,\n\t\t});\n\t\tconst { tree: sharedTree3 } = await setUpLocalServerTestSharedTree({\n\t\t\ttestObjectProvider,\n\t\t\tsummarizeHistory: true,\n\t\t});\n\n\t\t// All shared trees should have no edits or chunks\n\t\texpect(\n\t\t\tfullHistorySummarizer_0_1_0(sharedTree.edits, sharedTree.currentView).editHistory?.editChunks.length\n\t\t).to.equal(0);\n\t\texpect(\n\t\t\tfullHistorySummarizer_0_1_0(sharedTree2.edits, sharedTree2.currentView).editHistory?.editChunks.length\n\t\t).to.equal(0);\n\t\texpect(\n\t\t\tfullHistorySummarizer_0_1_0(sharedTree3.edits, sharedTree3.currentView).editHistory?.editChunks.length\n\t\t).to.equal(0);\n\n\t\tawait processNewEditChunks();\n\n\t\t// All shared trees should have the new handle\n\t\tconst sharedTreeSummary = fullHistorySummarizer_0_1_0(sharedTree.edits, sharedTree.currentView);\n\t\tconst sharedTree2Summary = fullHistorySummarizer_0_1_0(sharedTree2.edits, sharedTree2.currentView);\n\t\tconst sharedTree3Summary = fullHistorySummarizer_0_1_0(sharedTree3.edits, sharedTree3.currentView);\n\t\tconst sharedTreeChunk = assertNotUndefined(sharedTreeSummary.editHistory).editChunks[0].chunk;\n\t\tconst sharedTree2Chunk = assertNotUndefined(sharedTree2Summary.editHistory).editChunks[0].chunk;\n\t\tconst sharedTree3Chunk = assertNotUndefined(sharedTree3Summary.editHistory).editChunks[0].chunk;\n\n\t\t// Make sure the chunk of the first shared tree is a handle\n\t\texpect(typeof (sharedTreeChunk as EditHandle).get).to.equal('function');\n\n\t\tconst sharedTreeHandleRoute = (sharedTreeChunk as any).absolutePath;\n\t\tconst 
sharedTree2HandleRoute = (sharedTree2Chunk as any).absolutePath;\n\t\tconst sharedTree3HandleRoute = (sharedTree3Chunk as any).absolutePath;\n\n\t\t// Make sure the handle route of the first shared tree is a string\n\t\texpect(typeof sharedTreeHandleRoute).to.equal('string');\n\n\t\texpect(sharedTreeHandleRoute).to.equal(sharedTree2HandleRoute);\n\t\texpect(sharedTree2HandleRoute).to.equal(sharedTree3HandleRoute);\n\t});\n});\n"]}
@@ -1 +1 @@
1
- {"version":3,"file":"SummaryFormatCompatibilityTests.d.ts","sourceRoot":"","sources":["../../../src/test/utilities/SummaryFormatCompatibilityTests.ts"],"names":[],"mappings":"AAAA;;;GAGG;AAOH,OAAO,EAAE,UAAU,EAAE,MAAM,qBAAqB,CAAC;AASjD,OAAO,EAAE,qBAAqB,EAAE,MAAM,sBAAsB,CAAC;AAC7D,OAAO,EAEN,sCAAsC,EACtC,mCAAmC,EACnC,2BAA2B,EAC3B,wBAAwB,EACxB,MAAM,iBAAiB,CAAC;AAoBzB;;;GAGG;AACH,wBAAgB,eAAe,CAAC,WAAW,SAAS,UAAU,GAAG,qBAAqB,EACrF,KAAK,EAAE,MAAM,EACb,mBAAmB,EAAE,CAAC,OAAO,CAAC,EAAE,wBAAwB,KAAK,2BAA2B,CAAC,WAAW,CAAC,EACrG,8BAA8B,EAAE,CAC/B,OAAO,EAAE,mCAAmC,KACxC,OAAO,CAAC,sCAAsC,CAAC,WAAW,CAAC,CAAC,QA0JjE"}
1
+ {"version":3,"file":"SummaryFormatCompatibilityTests.d.ts","sourceRoot":"","sources":["../../../src/test/utilities/SummaryFormatCompatibilityTests.ts"],"names":[],"mappings":"AAAA;;;GAGG;AAOH,OAAO,EAAE,UAAU,EAAE,MAAM,qBAAqB,CAAC;AASjD,OAAO,EAAE,qBAAqB,EAAE,MAAM,sBAAsB,CAAC;AAC7D,OAAO,EAEN,sCAAsC,EACtC,mCAAmC,EACnC,2BAA2B,EAC3B,wBAAwB,EACxB,MAAM,iBAAiB,CAAC;AAoBzB;;;GAGG;AACH,wBAAgB,eAAe,CAAC,WAAW,SAAS,UAAU,GAAG,qBAAqB,EACrF,KAAK,EAAE,MAAM,EACb,mBAAmB,EAAE,CAAC,OAAO,CAAC,EAAE,wBAAwB,KAAK,2BAA2B,CAAC,WAAW,CAAC,EACrG,8BAA8B,EAAE,CAC/B,OAAO,EAAE,mCAAmC,KACxC,OAAO,CAAC,sCAAsC,CAAC,WAAW,CAAC,CAAC,QA8JjE"}
@@ -138,6 +138,7 @@ export function runSummaryTests(title, setUpTestSharedTree, setUpLocalServerTest
138
138
  validateSummaryWrite(fullHistorySummarizer_0_1_0);
139
139
  });
140
140
  it('can be read and written with large history', async () => {
141
+ testObjectProvider.logger.registerExpectedEvent({ eventName: 'fluid:telemetry:Batching:LengthTooBig' }, { eventName: 'fluid:telemetry:Batching:LengthTooBig' });
141
142
  // Process an arbitrarily large number of stable edits
142
143
  createStableEdits(251).forEach((edit) => {
143
144
  expectedTree.processLocalEdit(edit);
@@ -1 +1 @@
1
- {"version":3,"file":"SummaryFormatCompatibilityTests.js","sourceRoot":"","sources":["../../../src/test/utilities/SummaryFormatCompatibilityTests.ts"],"names":[],"mappings":"AAAA;;;GAGG;AAEH,OAAO,KAAK,EAAE,MAAM,IAAI,CAAC;AACzB,OAAO,EAAE,OAAO,EAAE,IAAI,EAAE,MAAM,MAAM,CAAC;AACrC,OAAO,EAAE,MAAM,EAAE,MAAM,EAAE,MAAM,MAAM,CAAC;AAEtC,OAAO,EAAE,IAAI,EAAE,MAAM,cAAc,CAAC;AAGpC,OAAO,EAAE,WAAW,EAAE,MAAM,gCAAgC,CAAC;AAC7D,OAAO,EACN,qBAAqB,EACrB,2BAA2B,GAG3B,MAAM,eAAe,CAAC;AAEvB,OAAO,EACN,iBAAiB,GAKjB,MAAM,iBAAiB,CAAC;AACzB,OAAO,EAAE,mBAAmB,EAAE,MAAM,kBAAkB,CAAC;AAEvD,0HAA0H;AAC1H,yCAAyC;AACzC,MAAM,QAAQ,GAAG,OAAO,CAAC,SAAS,EAAE,kCAAkC,CAAC,CAAC;AAExE,SAAS,eAAe,CAAC,WAAmB;IAC3C,OAAO,IAAI,CAAC,QAAQ,EAAE,GAAG,WAAW,OAAO,CAAC,CAAC;AAC9C,CAAC;AAED;;;GAGG;AACH,MAAM,oBAAoB,GAAqE;IAC9F,EAAE,OAAO,EAAE,OAAO,EAAE,UAAU,EAAE,qBAAqB,EAAE;IACvD,EAAE,OAAO,EAAE,OAAO,EAAE,UAAU,EAAE,2BAA2B,EAAE;CAC7D,CAAC;AAEF;;;GAGG;AACH,MAAM,UAAU,eAAe,CAC9B,KAAa,EACb,mBAAqG,EACrG,8BAEiE;IAEjE,QAAQ,CAAC,KAAK,EAAE,GAAG,EAAE;;QACpB,MAAM,WAAW,GAAG,sCAAgD,CAAC;QAErE,MAAM,cAAc,GAAG,IAAI,mBAAmB,EAAE,CAAC;QAEjD,IAAI,YAAyB,CAAC;QAC9B,IAAI,kBAAsC,CAAC;QAE3C,MAAM,gBAAgB,GAAG,EAAE,CAAC,WAAW,CAAC,QAAQ,CAAC,CAAC;QAElD,iFAAiF;QACjF,MAAM,YAAY,GAAG,IAAI,GAAG,EAAoB,CAAC;QACjD,KAAK,IAAI,QAAQ,IAAI,gBAAgB,EAAE;YACtC,0FAA0F;YAC1F,MAAM,yBAAyB,GAAG,yDAAyD,CAAC;YAC5F,MAAM,KAAK,GAAG,yBAAyB,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;YAEvD,MAAM,WAAW,SAAG,KAAK,aAAL,KAAK,uBAAL,KAAK,CAAE,MAAM,mCAAI,IAAI,CAAC,oBAAoB,QAAQ,EAAE,CAAC,CAAC;YAC1E,MAAM,WAAW,GAAG,WAAW,CAAC,WAAW,CAAC;YAC5C,QAAQ,GAAG,GAAG,WAAW,CAAC,WAAW,IAAI,WAAW,CAAC,OAAO,EAAE,CAAC;YAE/D,IAAI,UAAU,GAAG,YAAY,CAAC,GAAG,CAAC,WAAW,CAAC,CAAC;YAC/C,IAAI,UAAU,KAAK,SAAS,EAAE;gBAC7B,UAAU,GAAG,EAAE,CAAC;gBAChB,YAAY,CAAC,GAAG,CAAC,WAAW,EAAE,UAAU,CAAC,CAAC;aAC1C;YACD,UAAU,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;SAC1B;QAED,mCAAmC;QACnC,UAAU,CAAC,KAAK,IAAI,EAAE;YACrB,MAAM,iBAAiB,GAAG,MAAM,8BAA8B,CAAC;gBAC9D,WAAW;aACX,CAAC,CAAC;YACH,YAAY,GAAG,iBAAiB,CAAC,IAAI,CAAC;YACtC,kBAAkB,GAAG,iBAAiB,CAAC,kBAAkB,CAAC;QAC3D,CAAC,CAAC,C
AAC;QAEH,SAAS,CAAC,KAAK,IAAI,EAAE;YACpB,kBAAkB,CAAC,KAAK,EAAE,CAAC;QAC5B,CAAC,CAAC,CAAC;QAEH,MAAM,mBAAmB,GAAG,CAAC,QAAgB,EAAQ,EAAE;YACtD,MAAM,iBAAiB,GAAG,EAAE,CAAC,YAAY,CAAC,eAAe,CAAC,QAAQ,CAAC,EAAE,MAAM,CAAC,CAAC;YAC7E,MAAM,OAAO,GAAG,WAAW,CAAC,iBAAiB,EAAE,cAAc,CAAC,CAAC;YAE/D,MAAM,EAAE,IAAI,EAAE,GAAG,mBAAmB,EAAE,CAAC;YACvC,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,QAAQ,CAAC,CAAC;YACjC,IAAI,CAAC,WAAW,CAAC,OAAgC,CAAC,CAAC;YAEnD,MAAM,CAAC,IAAI,CAAC,MAAM,CAAC,YAAY,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC;QAC9C,CAAC,CAAC;QAEF,MAAM,oBAAoB,GAAG,CAAC,UAAyC,EAAQ,EAAE;YAChF,gFAAgF;YAChF,MAAM,UAAU,GAAG,UAAU,CAAC,YAAY,CAAC,KAAK,EAAE,YAAY,CAAC,WAAW,CAAC,CAAC;YAC5E,MAAM,EAAE,IAAI,EAAE,KAAK,EAAE,GAAG,mBAAmB,EAAE,CAAC;YAC9C,KAAK,CAAC,WAAW,CAAC,UAAU,CAAC,CAAC;YAE9B,gFAAgF;YAChF,4CAA4C;YAC5C,MAAM,CAAC,KAAK,CAAC,MAAM,CAAC,YAAY,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC;QAC/C,CAAC,CAAC;QAEF,KAAK,MAAM,CAAC,WAAW,EAAE,KAAK,CAAC,IAAI,YAAY,CAAC,OAAO,EAAE,EAAE;YAC1D,EAAE,CAAC,kBAAkB,WAAW,0DAA0D,EAAE,GAAG,EAAE;gBAChG,8BAA8B;gBAC9B,MAAM,iBAAiB,GAAG,EAAE,CAAC,YAAY,CAAC,eAAe,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,EAAE,MAAM,CAAC,CAAC;gBAC7E,MAAM,OAAO,GAAG,WAAW,CAAC,iBAAiB,EAAE,cAAc,CAAC,CAAC;gBAC/D,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,QAAQ,CAAC,CAAC;gBACjC,YAAY,CAAC,WAAW,CAAC,OAAgC,CAAC,CAAC;gBAE3D,iEAAiE;gBACjE,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,KAAK,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;oBACtC,MAAM,EAAE,IAAI,EAAE,GAAG,mBAAmB,EAAE,CAAC;oBAEvC,MAAM,iBAAiB,GAAG,EAAE,CAAC,YAAY,CAAC,eAAe,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,EAAE,MAAM,CAAC,CAAC;oBAC7E,MAAM,OAAO,GAAG,WAAW,CAAC,iBAAiB,EAAE,cAAc,CAAC,CAAC;oBAC/D,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,QAAQ,CAAC,CAAC;oBACjC,IAAI,CAAC,WAAW,CAAC,OAAgC,CAAC,CAAC;oBAEnD,MAAM,CAAC,IAAI,CAAC,MAAM,CAAC,YAAY,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC;iBAC7C;YACF,CAAC,CAAC,CAAC;YAEH,KAAK,MAAM,EAAE,OAAO,EAAE,UAAU,EAAE,IAAI,oBAAoB,EAAE;gBAC3D,EAAE,CAAC,kBAAkB,OAAO,uBAAuB,WAAW,eAAe,EAAE,KAAK,IAAI,EAAE;oBACzF,gEAAgE;oBAChE,MAAM,iBAAiB,GAAG,EAAE,CAAC,YAAY,CAAC,eAAe,CAAC,KAAK,CAAC,IAAI,EAAE,CAAC,CAAC,CAAC,C
AAC,EAAE,MAAM,CAAC,CAAC;oBACpF,MAAM,OAAO,GAAG,WAAW,CAAC,iBAAiB,EAAE,cAAc,CAAC,CAAC;oBAC/D,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,QAAQ,CAAC,CAAC;oBAEjC,2EAA2E;oBAC3E,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;oBAC9C,YAAY,CAAC,WAAW,CAAC,OAAgC,CAAC,CAAC;oBAE3D,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;oBAE9C,iDAAiD;oBACjD,MAAM,UAAU,GAAG,YAAY,CAAC,qBAAqB,CAAC,EAAE,UAAU,EAAE,CAAC,CAAC;oBAEtE,uFAAuF;oBACvF,MAAM,QAAQ,GAAG,GAAG,WAAW,IAAI,OAAO,EAAE,CAAC;oBAC7C,8DAA8D;oBAC9D,MAAM,eAAe,GAAG,IAAI,CAAC,SAAS,CACrC,IAAI,CAAC,KAAK,CAAC,EAAE,CAAC,YAAY,CAAC,eAAe,CAAC,QAAQ,CAAC,EAAE,MAAM,CAAC,CAAC,CAC9D,CAAC;oBAEF,MAAM,CAAC,UAAU,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,eAAe,CAAC,CAAC;gBAC9C,CAAC,CAAC,CAAC;aACH;SACD;QAED,QAAQ,CAAC,eAAe,EAAE,GAAG,EAAE;YAC9B,EAAE,CAAC,yCAAyC,EAAE,KAAK,IAAI,EAAE;gBACxD,mBAAmB,CAAC,kBAAkB,CAAC,CAAC;gBACxC,oBAAoB,CAAC,qBAAqB,CAAC,CAAC;YAC7C,CAAC,CAAC,CAAC;YAEH,EAAE,CAAC,4CAA4C,EAAE,KAAK,IAAI,EAAE;gBAC3D,iBAAiB,CAAC,EAAE,CAAC,CAAC,OAAO,CAAC,CAAC,IAAI,EAAE,EAAE;oBACtC,YAAY,CAAC,gBAAgB,CAAC,IAAI,CAAC,CAAC;gBACrC,CAAC,CAAC,CAAC;gBAEH,2EAA2E;gBAC3E,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;gBAE9C,mBAAmB,CAAC,qBAAqB,CAAC,CAAC;gBAC3C,oBAAoB,CAAC,qBAAqB,CAAC,CAAC;YAC7C,CAAC,CAAC,CAAC;QACJ,CAAC,CAAC,CAAC;QAEH,QAAQ,CAAC,eAAe,EAAE,GAAG,EAAE;YAC9B,EAAE,CAAC,yCAAyC,EAAE,KAAK,IAAI,EAAE;gBACxD,mBAAmB,CAAC,kBAAkB,CAAC,CAAC;gBACxC,oBAAoB,CAAC,2BAA2B,CAAC,CAAC;YACnD,CAAC,CAAC,CAAC;YAEH,EAAE,CAAC,4CAA4C,EAAE,KAAK,IAAI,EAAE;gBAC3D,sDAAsD;gBACtD,iBAAiB,CAAC,GAAG,CAAC,CAAC,OAAO,CAAC,CAAC,IAAI,EAAE,EAAE;oBACvC,YAAY,CAAC,gBAAgB,CAAC,IAAI,CAAC,CAAC;gBACrC,CAAC,CAAC,CAAC;gBAEH,2EAA2E;gBAC3E,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;gBAE9C,mBAAmB,CAAC,qBAAqB,CAAC,CAAC;gBAC3C,oBAAoB,CAAC,2BAA2B,CAAC,CAAC;YACnD,CAAC,CAAC,CAAC;QACJ,CAAC,CAAC,CAAC;IACJ,CAAC,CAAC,CAAC;AACJ,CAAC","sourcesContent":["/*!\n * Copyright (c) Microsoft Corporation and contributors. 
All rights reserved.\n * Licensed under the MIT License.\n */\n\nimport * as fs from 'fs';\nimport { resolve, join } from 'path';\nimport { assert, expect } from 'chai';\nimport { TestObjectProvider } from '@fluidframework/test-utils';\nimport { fail } from '../../Common';\nimport { SharedTree } from '../../default-edits';\nimport { EditId } from '../../Identifiers';\nimport { deserialize } from '../../SummaryBackCompatibility';\nimport {\n\tfullHistorySummarizer,\n\tfullHistorySummarizer_0_1_0,\n\tSharedTreeSummarizer,\n\tSharedTreeSummaryBase,\n} from '../../generic';\nimport { SharedTreeWithAnchors } from '../../anchored-edits';\nimport {\n\tcreateStableEdits,\n\tLocalServerSharedTreeTestingComponents,\n\tLocalServerSharedTreeTestingOptions,\n\tSharedTreeTestingComponents,\n\tSharedTreeTestingOptions,\n} from './TestUtilities';\nimport { TestFluidSerializer } from './TestSerializer';\n\n// This accounts for this file being executed after compilation. If many tests want to leverage resources, we should unify\n// resource path logic to a single place.\nconst pathBase = resolve(__dirname, '../../../src/test/summary-files/');\n\nfunction summaryFilePath(summaryName: string): string {\n\treturn join(pathBase, `${summaryName}.json`);\n}\n\n/**\n * A version/summarizer pair must be specified for a write test to be generated.\n * Versions that can no longer be written should be removed from this list.\n */\nconst supportedSummarizers: { version: string; summarizer: SharedTreeSummarizer<unknown> }[] = [\n\t{ version: '0.0.2', summarizer: fullHistorySummarizer },\n\t{ version: '0.1.0', summarizer: fullHistorySummarizer_0_1_0 },\n];\n\n/**\n * Runs a test suite for summaries on `SharedTree`.\n * This suite can be used to test other implementations that aim to fulfill `SharedTree`'s contract.\n */\nexport function runSummaryTests<TSharedTree extends SharedTree | SharedTreeWithAnchors>(\n\ttitle: string,\n\tsetUpTestSharedTree: (options?: SharedTreeTestingOptions) => 
SharedTreeTestingComponents<TSharedTree>,\n\tsetUpLocalServerTestSharedTree: (\n\t\toptions: LocalServerSharedTreeTestingOptions\n\t) => Promise<LocalServerSharedTreeTestingComponents<TSharedTree>>\n) {\n\tdescribe(title, () => {\n\t\tconst setupEditId = '9406d301-7449-48a5-b2ea-9be637b0c6e4' as EditId;\n\n\t\tconst testSerializer = new TestFluidSerializer();\n\n\t\tlet expectedTree: TSharedTree;\n\t\tlet testObjectProvider: TestObjectProvider;\n\n\t\tconst testSummaryFiles = fs.readdirSync(pathBase);\n\n\t\t// Create and populate a map of the file names associated with their summary type\n\t\tconst summaryTypes = new Map<string, string[]>();\n\t\tfor (let fileName of testSummaryFiles) {\n\t\t\t// Summary files should be named in the following format: `${summaryType}-${version}.json`\n\t\t\tconst fileNameRegularExpression = /(?<summaryType>[\\w+-]*\\w+)-(?<version>\\d+\\.\\d\\.\\d).json/;\n\t\t\tconst match = fileNameRegularExpression.exec(fileName);\n\n\t\t\tconst matchGroups = match?.groups ?? 
fail(`invalid filename ${fileName}`);\n\t\t\tconst summaryType = matchGroups.summaryType;\n\t\t\tfileName = `${matchGroups.summaryType}-${matchGroups.version}`;\n\n\t\t\tlet collection = summaryTypes.get(summaryType);\n\t\t\tif (collection === undefined) {\n\t\t\t\tcollection = [];\n\t\t\t\tsummaryTypes.set(summaryType, collection);\n\t\t\t}\n\t\t\tcollection.push(fileName);\n\t\t}\n\n\t\t// Resets the tree before each test\n\t\tbeforeEach(async () => {\n\t\t\tconst testingComponents = await setUpLocalServerTestSharedTree({\n\t\t\t\tsetupEditId,\n\t\t\t});\n\t\t\texpectedTree = testingComponents.tree;\n\t\t\ttestObjectProvider = testingComponents.testObjectProvider;\n\t\t});\n\n\t\tafterEach(async () => {\n\t\t\ttestObjectProvider.reset();\n\t\t});\n\n\t\tconst validateSummaryRead = (fileName: string): void => {\n\t\t\tconst serializedSummary = fs.readFileSync(summaryFilePath(fileName), 'utf8');\n\t\t\tconst summary = deserialize(serializedSummary, testSerializer);\n\n\t\t\tconst { tree } = setUpTestSharedTree();\n\t\t\tassert.typeOf(summary, 'object');\n\t\t\ttree.loadSummary(summary as SharedTreeSummaryBase);\n\n\t\t\texpect(tree.equals(expectedTree)).to.be.true;\n\t\t};\n\n\t\tconst validateSummaryWrite = (summarizer: SharedTreeSummarizer<unknown>): void => {\n\t\t\t// Save a new summary with the expected tree and use it to load a new SharedTree\n\t\t\tconst newSummary = summarizer(expectedTree.edits, expectedTree.currentView);\n\t\t\tconst { tree: tree2 } = setUpTestSharedTree();\n\t\t\ttree2.loadSummary(newSummary);\n\n\t\t\t// The expected tree, tree loaded with the existing summary, and the tree loaded\n\t\t\t// with the new summary should all be equal.\n\t\t\texpect(tree2.equals(expectedTree)).to.be.true;\n\t\t};\n\n\t\tfor (const [summaryType, files] of summaryTypes.entries()) {\n\t\t\tit(`files of type '${summaryType}' with different format versions produce identical trees`, () => {\n\t\t\t\t// Load the first summary file\n\t\t\t\tconst serializedSummary 
= fs.readFileSync(summaryFilePath(files[0]), 'utf8');\n\t\t\t\tconst summary = deserialize(serializedSummary, testSerializer);\n\t\t\t\tassert.typeOf(summary, 'object');\n\t\t\t\texpectedTree.loadSummary(summary as SharedTreeSummaryBase);\n\n\t\t\t\t// Check every other summary file results in the same loaded tree\n\t\t\t\tfor (let i = 1; i < files.length; i++) {\n\t\t\t\t\tconst { tree } = setUpTestSharedTree();\n\n\t\t\t\t\tconst serializedSummary = fs.readFileSync(summaryFilePath(files[i]), 'utf8');\n\t\t\t\t\tconst summary = deserialize(serializedSummary, testSerializer);\n\t\t\t\t\tassert.typeOf(summary, 'object');\n\t\t\t\t\ttree.loadSummary(summary as SharedTreeSummaryBase);\n\n\t\t\t\t\texpect(tree.equals(expectedTree)).to.be.true;\n\t\t\t\t}\n\t\t\t});\n\n\t\t\tfor (const { version, summarizer } of supportedSummarizers) {\n\t\t\t\tit(`format version ${version} can be written for ${summaryType} summary type`, async () => {\n\t\t\t\t\t// Load the first summary file (the one with the oldest version)\n\t\t\t\t\tconst serializedSummary = fs.readFileSync(summaryFilePath(files.sort()[0]), 'utf8');\n\t\t\t\t\tconst summary = deserialize(serializedSummary, testSerializer);\n\t\t\t\t\tassert.typeOf(summary, 'object');\n\n\t\t\t\t\t// Wait for the ops to to be submitted and processed across the containers.\n\t\t\t\t\tawait testObjectProvider.ensureSynchronized();\n\t\t\t\t\texpectedTree.loadSummary(summary as SharedTreeSummaryBase);\n\n\t\t\t\t\tawait testObjectProvider.ensureSynchronized();\n\n\t\t\t\t\t// Write a new summary with the specified version\n\t\t\t\t\tconst newSummary = expectedTree.saveSerializedSummary({ summarizer });\n\n\t\t\t\t\t// Check the newly written summary is equivalent to its corresponding test summary file\n\t\t\t\t\tconst fileName = `${summaryType}-${version}`;\n\t\t\t\t\t// Re-stringify the the JSON file to remove escaped characters\n\t\t\t\t\tconst expectedSummary = 
JSON.stringify(\n\t\t\t\t\t\tJSON.parse(fs.readFileSync(summaryFilePath(fileName), 'utf8'))\n\t\t\t\t\t);\n\n\t\t\t\t\texpect(newSummary).to.equal(expectedSummary);\n\t\t\t\t});\n\t\t\t}\n\t\t}\n\n\t\tdescribe('version 0.0.2', () => {\n\t\t\tit('can be read and written with no history', async () => {\n\t\t\t\tvalidateSummaryRead('no-history-0.0.2');\n\t\t\t\tvalidateSummaryWrite(fullHistorySummarizer);\n\t\t\t});\n\n\t\t\tit('can be read and written with small history', async () => {\n\t\t\t\tcreateStableEdits(11).forEach((edit) => {\n\t\t\t\t\texpectedTree.processLocalEdit(edit);\n\t\t\t\t});\n\n\t\t\t\t// Wait for the ops to to be submitted and processed across the containers.\n\t\t\t\tawait testObjectProvider.ensureSynchronized();\n\n\t\t\t\tvalidateSummaryRead('small-history-0.0.2');\n\t\t\t\tvalidateSummaryWrite(fullHistorySummarizer);\n\t\t\t});\n\t\t});\n\n\t\tdescribe('version 0.1.0', () => {\n\t\t\tit('can be read and written with no history', async () => {\n\t\t\t\tvalidateSummaryRead('no-history-0.1.0');\n\t\t\t\tvalidateSummaryWrite(fullHistorySummarizer_0_1_0);\n\t\t\t});\n\n\t\t\tit('can be read and written with large history', async () => {\n\t\t\t\t// Process an arbitrarily large number of stable edits\n\t\t\t\tcreateStableEdits(251).forEach((edit) => {\n\t\t\t\t\texpectedTree.processLocalEdit(edit);\n\t\t\t\t});\n\n\t\t\t\t// Wait for the ops to to be submitted and processed across the containers.\n\t\t\t\tawait testObjectProvider.ensureSynchronized();\n\n\t\t\t\tvalidateSummaryRead('large-history-0.1.0');\n\t\t\t\tvalidateSummaryWrite(fullHistorySummarizer_0_1_0);\n\t\t\t});\n\t\t});\n\t});\n}\n"]}
1
+ {"version":3,"file":"SummaryFormatCompatibilityTests.js","sourceRoot":"","sources":["../../../src/test/utilities/SummaryFormatCompatibilityTests.ts"],"names":[],"mappings":"AAAA;;;GAGG;AAEH,OAAO,KAAK,EAAE,MAAM,IAAI,CAAC;AACzB,OAAO,EAAE,OAAO,EAAE,IAAI,EAAE,MAAM,MAAM,CAAC;AACrC,OAAO,EAAE,MAAM,EAAE,MAAM,EAAE,MAAM,MAAM,CAAC;AAEtC,OAAO,EAAE,IAAI,EAAE,MAAM,cAAc,CAAC;AAGpC,OAAO,EAAE,WAAW,EAAE,MAAM,gCAAgC,CAAC;AAC7D,OAAO,EACN,qBAAqB,EACrB,2BAA2B,GAG3B,MAAM,eAAe,CAAC;AAEvB,OAAO,EACN,iBAAiB,GAKjB,MAAM,iBAAiB,CAAC;AACzB,OAAO,EAAE,mBAAmB,EAAE,MAAM,kBAAkB,CAAC;AAEvD,0HAA0H;AAC1H,yCAAyC;AACzC,MAAM,QAAQ,GAAG,OAAO,CAAC,SAAS,EAAE,kCAAkC,CAAC,CAAC;AAExE,SAAS,eAAe,CAAC,WAAmB;IAC3C,OAAO,IAAI,CAAC,QAAQ,EAAE,GAAG,WAAW,OAAO,CAAC,CAAC;AAC9C,CAAC;AAED;;;GAGG;AACH,MAAM,oBAAoB,GAAqE;IAC9F,EAAE,OAAO,EAAE,OAAO,EAAE,UAAU,EAAE,qBAAqB,EAAE;IACvD,EAAE,OAAO,EAAE,OAAO,EAAE,UAAU,EAAE,2BAA2B,EAAE;CAC7D,CAAC;AAEF;;;GAGG;AACH,MAAM,UAAU,eAAe,CAC9B,KAAa,EACb,mBAAqG,EACrG,8BAEiE;IAEjE,QAAQ,CAAC,KAAK,EAAE,GAAG,EAAE;;QACpB,MAAM,WAAW,GAAG,sCAAgD,CAAC;QAErE,MAAM,cAAc,GAAG,IAAI,mBAAmB,EAAE,CAAC;QAEjD,IAAI,YAAyB,CAAC;QAC9B,IAAI,kBAAsC,CAAC;QAE3C,MAAM,gBAAgB,GAAG,EAAE,CAAC,WAAW,CAAC,QAAQ,CAAC,CAAC;QAElD,iFAAiF;QACjF,MAAM,YAAY,GAAG,IAAI,GAAG,EAAoB,CAAC;QACjD,KAAK,IAAI,QAAQ,IAAI,gBAAgB,EAAE;YACtC,0FAA0F;YAC1F,MAAM,yBAAyB,GAAG,yDAAyD,CAAC;YAC5F,MAAM,KAAK,GAAG,yBAAyB,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;YAEvD,MAAM,WAAW,SAAG,KAAK,aAAL,KAAK,uBAAL,KAAK,CAAE,MAAM,mCAAI,IAAI,CAAC,oBAAoB,QAAQ,EAAE,CAAC,CAAC;YAC1E,MAAM,WAAW,GAAG,WAAW,CAAC,WAAW,CAAC;YAC5C,QAAQ,GAAG,GAAG,WAAW,CAAC,WAAW,IAAI,WAAW,CAAC,OAAO,EAAE,CAAC;YAE/D,IAAI,UAAU,GAAG,YAAY,CAAC,GAAG,CAAC,WAAW,CAAC,CAAC;YAC/C,IAAI,UAAU,KAAK,SAAS,EAAE;gBAC7B,UAAU,GAAG,EAAE,CAAC;gBAChB,YAAY,CAAC,GAAG,CAAC,WAAW,EAAE,UAAU,CAAC,CAAC;aAC1C;YACD,UAAU,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;SAC1B;QAED,mCAAmC;QACnC,UAAU,CAAC,KAAK,IAAI,EAAE;YACrB,MAAM,iBAAiB,GAAG,MAAM,8BAA8B,CAAC;gBAC9D,WAAW;aACX,CAAC,CAAC;YACH,YAAY,GAAG,iBAAiB,CAAC,IAAI,CAAC;YACtC,kBAAkB,GAAG,iBAAiB,CAAC,kBAAkB,CAAC;QAC3D,CAAC,CAAC,C
AAC;QAEH,SAAS,CAAC,KAAK,IAAI,EAAE;YACpB,kBAAkB,CAAC,KAAK,EAAE,CAAC;QAC5B,CAAC,CAAC,CAAC;QAEH,MAAM,mBAAmB,GAAG,CAAC,QAAgB,EAAQ,EAAE;YACtD,MAAM,iBAAiB,GAAG,EAAE,CAAC,YAAY,CAAC,eAAe,CAAC,QAAQ,CAAC,EAAE,MAAM,CAAC,CAAC;YAC7E,MAAM,OAAO,GAAG,WAAW,CAAC,iBAAiB,EAAE,cAAc,CAAC,CAAC;YAE/D,MAAM,EAAE,IAAI,EAAE,GAAG,mBAAmB,EAAE,CAAC;YACvC,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,QAAQ,CAAC,CAAC;YACjC,IAAI,CAAC,WAAW,CAAC,OAAgC,CAAC,CAAC;YAEnD,MAAM,CAAC,IAAI,CAAC,MAAM,CAAC,YAAY,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC;QAC9C,CAAC,CAAC;QAEF,MAAM,oBAAoB,GAAG,CAAC,UAAyC,EAAQ,EAAE;YAChF,gFAAgF;YAChF,MAAM,UAAU,GAAG,UAAU,CAAC,YAAY,CAAC,KAAK,EAAE,YAAY,CAAC,WAAW,CAAC,CAAC;YAC5E,MAAM,EAAE,IAAI,EAAE,KAAK,EAAE,GAAG,mBAAmB,EAAE,CAAC;YAC9C,KAAK,CAAC,WAAW,CAAC,UAAU,CAAC,CAAC;YAE9B,gFAAgF;YAChF,4CAA4C;YAC5C,MAAM,CAAC,KAAK,CAAC,MAAM,CAAC,YAAY,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC;QAC/C,CAAC,CAAC;QAEF,KAAK,MAAM,CAAC,WAAW,EAAE,KAAK,CAAC,IAAI,YAAY,CAAC,OAAO,EAAE,EAAE;YAC1D,EAAE,CAAC,kBAAkB,WAAW,0DAA0D,EAAE,GAAG,EAAE;gBAChG,8BAA8B;gBAC9B,MAAM,iBAAiB,GAAG,EAAE,CAAC,YAAY,CAAC,eAAe,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,EAAE,MAAM,CAAC,CAAC;gBAC7E,MAAM,OAAO,GAAG,WAAW,CAAC,iBAAiB,EAAE,cAAc,CAAC,CAAC;gBAC/D,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,QAAQ,CAAC,CAAC;gBACjC,YAAY,CAAC,WAAW,CAAC,OAAgC,CAAC,CAAC;gBAE3D,iEAAiE;gBACjE,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,KAAK,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;oBACtC,MAAM,EAAE,IAAI,EAAE,GAAG,mBAAmB,EAAE,CAAC;oBAEvC,MAAM,iBAAiB,GAAG,EAAE,CAAC,YAAY,CAAC,eAAe,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,EAAE,MAAM,CAAC,CAAC;oBAC7E,MAAM,OAAO,GAAG,WAAW,CAAC,iBAAiB,EAAE,cAAc,CAAC,CAAC;oBAC/D,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,QAAQ,CAAC,CAAC;oBACjC,IAAI,CAAC,WAAW,CAAC,OAAgC,CAAC,CAAC;oBAEnD,MAAM,CAAC,IAAI,CAAC,MAAM,CAAC,YAAY,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC;iBAC7C;YACF,CAAC,CAAC,CAAC;YAEH,KAAK,MAAM,EAAE,OAAO,EAAE,UAAU,EAAE,IAAI,oBAAoB,EAAE;gBAC3D,EAAE,CAAC,kBAAkB,OAAO,uBAAuB,WAAW,eAAe,EAAE,KAAK,IAAI,EAAE;oBACzF,gEAAgE;oBAChE,MAAM,iBAAiB,GAAG,EAAE,CAAC,YAAY,CAAC,eAAe,CAAC,KAAK,CAAC,IAAI,EAAE,CAAC,CAAC,CAAC,C
AAC,EAAE,MAAM,CAAC,CAAC;oBACpF,MAAM,OAAO,GAAG,WAAW,CAAC,iBAAiB,EAAE,cAAc,CAAC,CAAC;oBAC/D,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,QAAQ,CAAC,CAAC;oBAEjC,2EAA2E;oBAC3E,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;oBAC9C,YAAY,CAAC,WAAW,CAAC,OAAgC,CAAC,CAAC;oBAE3D,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;oBAE9C,iDAAiD;oBACjD,MAAM,UAAU,GAAG,YAAY,CAAC,qBAAqB,CAAC,EAAE,UAAU,EAAE,CAAC,CAAC;oBAEtE,uFAAuF;oBACvF,MAAM,QAAQ,GAAG,GAAG,WAAW,IAAI,OAAO,EAAE,CAAC;oBAC7C,8DAA8D;oBAC9D,MAAM,eAAe,GAAG,IAAI,CAAC,SAAS,CACrC,IAAI,CAAC,KAAK,CAAC,EAAE,CAAC,YAAY,CAAC,eAAe,CAAC,QAAQ,CAAC,EAAE,MAAM,CAAC,CAAC,CAC9D,CAAC;oBAEF,MAAM,CAAC,UAAU,CAAC,CAAC,EAAE,CAAC,KAAK,CAAC,eAAe,CAAC,CAAC;gBAC9C,CAAC,CAAC,CAAC;aACH;SACD;QAED,QAAQ,CAAC,eAAe,EAAE,GAAG,EAAE;YAC9B,EAAE,CAAC,yCAAyC,EAAE,KAAK,IAAI,EAAE;gBACxD,mBAAmB,CAAC,kBAAkB,CAAC,CAAC;gBACxC,oBAAoB,CAAC,qBAAqB,CAAC,CAAC;YAC7C,CAAC,CAAC,CAAC;YAEH,EAAE,CAAC,4CAA4C,EAAE,KAAK,IAAI,EAAE;gBAC3D,iBAAiB,CAAC,EAAE,CAAC,CAAC,OAAO,CAAC,CAAC,IAAI,EAAE,EAAE;oBACtC,YAAY,CAAC,gBAAgB,CAAC,IAAI,CAAC,CAAC;gBACrC,CAAC,CAAC,CAAC;gBAEH,2EAA2E;gBAC3E,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;gBAE9C,mBAAmB,CAAC,qBAAqB,CAAC,CAAC;gBAC3C,oBAAoB,CAAC,qBAAqB,CAAC,CAAC;YAC7C,CAAC,CAAC,CAAC;QACJ,CAAC,CAAC,CAAC;QAEH,QAAQ,CAAC,eAAe,EAAE,GAAG,EAAE;YAC9B,EAAE,CAAC,yCAAyC,EAAE,KAAK,IAAI,EAAE;gBACxD,mBAAmB,CAAC,kBAAkB,CAAC,CAAC;gBACxC,oBAAoB,CAAC,2BAA2B,CAAC,CAAC;YACnD,CAAC,CAAC,CAAC;YAEH,EAAE,CAAC,4CAA4C,EAAE,KAAK,IAAI,EAAE;gBAC3D,kBAAkB,CAAC,MAAM,CAAC,qBAAqB,CAC9C,EAAE,SAAS,EAAE,uCAAuC,EAAE,EACtD,EAAE,SAAS,EAAE,uCAAuC,EAAE,CACtD,CAAC;gBACF,sDAAsD;gBACtD,iBAAiB,CAAC,GAAG,CAAC,CAAC,OAAO,CAAC,CAAC,IAAI,EAAE,EAAE;oBACvC,YAAY,CAAC,gBAAgB,CAAC,IAAI,CAAC,CAAC;gBACrC,CAAC,CAAC,CAAC;gBAEH,2EAA2E;gBAC3E,MAAM,kBAAkB,CAAC,kBAAkB,EAAE,CAAC;gBAE9C,mBAAmB,CAAC,qBAAqB,CAAC,CAAC;gBAC3C,oBAAoB,CAAC,2BAA2B,CAAC,CAAC;YACnD,CAAC,CAAC,CAAC;QACJ,CAAC,CAAC,CAAC;IACJ,CAAC,CAAC,CAAC;AACJ,CAAC","sourcesContent":["/*!\n * Copyright (c) Microsoft Corporation and contributors. 
All rights reserved.\n * Licensed under the MIT License.\n */\n\nimport * as fs from 'fs';\nimport { resolve, join } from 'path';\nimport { assert, expect } from 'chai';\nimport { TestObjectProvider } from '@fluidframework/test-utils';\nimport { fail } from '../../Common';\nimport { SharedTree } from '../../default-edits';\nimport { EditId } from '../../Identifiers';\nimport { deserialize } from '../../SummaryBackCompatibility';\nimport {\n\tfullHistorySummarizer,\n\tfullHistorySummarizer_0_1_0,\n\tSharedTreeSummarizer,\n\tSharedTreeSummaryBase,\n} from '../../generic';\nimport { SharedTreeWithAnchors } from '../../anchored-edits';\nimport {\n\tcreateStableEdits,\n\tLocalServerSharedTreeTestingComponents,\n\tLocalServerSharedTreeTestingOptions,\n\tSharedTreeTestingComponents,\n\tSharedTreeTestingOptions,\n} from './TestUtilities';\nimport { TestFluidSerializer } from './TestSerializer';\n\n// This accounts for this file being executed after compilation. If many tests want to leverage resources, we should unify\n// resource path logic to a single place.\nconst pathBase = resolve(__dirname, '../../../src/test/summary-files/');\n\nfunction summaryFilePath(summaryName: string): string {\n\treturn join(pathBase, `${summaryName}.json`);\n}\n\n/**\n * A version/summarizer pair must be specified for a write test to be generated.\n * Versions that can no longer be written should be removed from this list.\n */\nconst supportedSummarizers: { version: string; summarizer: SharedTreeSummarizer<unknown> }[] = [\n\t{ version: '0.0.2', summarizer: fullHistorySummarizer },\n\t{ version: '0.1.0', summarizer: fullHistorySummarizer_0_1_0 },\n];\n\n/**\n * Runs a test suite for summaries on `SharedTree`.\n * This suite can be used to test other implementations that aim to fulfill `SharedTree`'s contract.\n */\nexport function runSummaryTests<TSharedTree extends SharedTree | SharedTreeWithAnchors>(\n\ttitle: string,\n\tsetUpTestSharedTree: (options?: SharedTreeTestingOptions) => 
SharedTreeTestingComponents<TSharedTree>,\n\tsetUpLocalServerTestSharedTree: (\n\t\toptions: LocalServerSharedTreeTestingOptions\n\t) => Promise<LocalServerSharedTreeTestingComponents<TSharedTree>>\n) {\n\tdescribe(title, () => {\n\t\tconst setupEditId = '9406d301-7449-48a5-b2ea-9be637b0c6e4' as EditId;\n\n\t\tconst testSerializer = new TestFluidSerializer();\n\n\t\tlet expectedTree: TSharedTree;\n\t\tlet testObjectProvider: TestObjectProvider;\n\n\t\tconst testSummaryFiles = fs.readdirSync(pathBase);\n\n\t\t// Create and populate a map of the file names associated with their summary type\n\t\tconst summaryTypes = new Map<string, string[]>();\n\t\tfor (let fileName of testSummaryFiles) {\n\t\t\t// Summary files should be named in the following format: `${summaryType}-${version}.json`\n\t\t\tconst fileNameRegularExpression = /(?<summaryType>[\\w+-]*\\w+)-(?<version>\\d+\\.\\d\\.\\d).json/;\n\t\t\tconst match = fileNameRegularExpression.exec(fileName);\n\n\t\t\tconst matchGroups = match?.groups ?? 
fail(`invalid filename ${fileName}`);\n\t\t\tconst summaryType = matchGroups.summaryType;\n\t\t\tfileName = `${matchGroups.summaryType}-${matchGroups.version}`;\n\n\t\t\tlet collection = summaryTypes.get(summaryType);\n\t\t\tif (collection === undefined) {\n\t\t\t\tcollection = [];\n\t\t\t\tsummaryTypes.set(summaryType, collection);\n\t\t\t}\n\t\t\tcollection.push(fileName);\n\t\t}\n\n\t\t// Resets the tree before each test\n\t\tbeforeEach(async () => {\n\t\t\tconst testingComponents = await setUpLocalServerTestSharedTree({\n\t\t\t\tsetupEditId,\n\t\t\t});\n\t\t\texpectedTree = testingComponents.tree;\n\t\t\ttestObjectProvider = testingComponents.testObjectProvider;\n\t\t});\n\n\t\tafterEach(async () => {\n\t\t\ttestObjectProvider.reset();\n\t\t});\n\n\t\tconst validateSummaryRead = (fileName: string): void => {\n\t\t\tconst serializedSummary = fs.readFileSync(summaryFilePath(fileName), 'utf8');\n\t\t\tconst summary = deserialize(serializedSummary, testSerializer);\n\n\t\t\tconst { tree } = setUpTestSharedTree();\n\t\t\tassert.typeOf(summary, 'object');\n\t\t\ttree.loadSummary(summary as SharedTreeSummaryBase);\n\n\t\t\texpect(tree.equals(expectedTree)).to.be.true;\n\t\t};\n\n\t\tconst validateSummaryWrite = (summarizer: SharedTreeSummarizer<unknown>): void => {\n\t\t\t// Save a new summary with the expected tree and use it to load a new SharedTree\n\t\t\tconst newSummary = summarizer(expectedTree.edits, expectedTree.currentView);\n\t\t\tconst { tree: tree2 } = setUpTestSharedTree();\n\t\t\ttree2.loadSummary(newSummary);\n\n\t\t\t// The expected tree, tree loaded with the existing summary, and the tree loaded\n\t\t\t// with the new summary should all be equal.\n\t\t\texpect(tree2.equals(expectedTree)).to.be.true;\n\t\t};\n\n\t\tfor (const [summaryType, files] of summaryTypes.entries()) {\n\t\t\tit(`files of type '${summaryType}' with different format versions produce identical trees`, () => {\n\t\t\t\t// Load the first summary file\n\t\t\t\tconst serializedSummary 
= fs.readFileSync(summaryFilePath(files[0]), 'utf8');\n\t\t\t\tconst summary = deserialize(serializedSummary, testSerializer);\n\t\t\t\tassert.typeOf(summary, 'object');\n\t\t\t\texpectedTree.loadSummary(summary as SharedTreeSummaryBase);\n\n\t\t\t\t// Check every other summary file results in the same loaded tree\n\t\t\t\tfor (let i = 1; i < files.length; i++) {\n\t\t\t\t\tconst { tree } = setUpTestSharedTree();\n\n\t\t\t\t\tconst serializedSummary = fs.readFileSync(summaryFilePath(files[i]), 'utf8');\n\t\t\t\t\tconst summary = deserialize(serializedSummary, testSerializer);\n\t\t\t\t\tassert.typeOf(summary, 'object');\n\t\t\t\t\ttree.loadSummary(summary as SharedTreeSummaryBase);\n\n\t\t\t\t\texpect(tree.equals(expectedTree)).to.be.true;\n\t\t\t\t}\n\t\t\t});\n\n\t\t\tfor (const { version, summarizer } of supportedSummarizers) {\n\t\t\t\tit(`format version ${version} can be written for ${summaryType} summary type`, async () => {\n\t\t\t\t\t// Load the first summary file (the one with the oldest version)\n\t\t\t\t\tconst serializedSummary = fs.readFileSync(summaryFilePath(files.sort()[0]), 'utf8');\n\t\t\t\t\tconst summary = deserialize(serializedSummary, testSerializer);\n\t\t\t\t\tassert.typeOf(summary, 'object');\n\n\t\t\t\t\t// Wait for the ops to to be submitted and processed across the containers.\n\t\t\t\t\tawait testObjectProvider.ensureSynchronized();\n\t\t\t\t\texpectedTree.loadSummary(summary as SharedTreeSummaryBase);\n\n\t\t\t\t\tawait testObjectProvider.ensureSynchronized();\n\n\t\t\t\t\t// Write a new summary with the specified version\n\t\t\t\t\tconst newSummary = expectedTree.saveSerializedSummary({ summarizer });\n\n\t\t\t\t\t// Check the newly written summary is equivalent to its corresponding test summary file\n\t\t\t\t\tconst fileName = `${summaryType}-${version}`;\n\t\t\t\t\t// Re-stringify the the JSON file to remove escaped characters\n\t\t\t\t\tconst expectedSummary = 
JSON.stringify(\n\t\t\t\t\t\tJSON.parse(fs.readFileSync(summaryFilePath(fileName), 'utf8'))\n\t\t\t\t\t);\n\n\t\t\t\t\texpect(newSummary).to.equal(expectedSummary);\n\t\t\t\t});\n\t\t\t}\n\t\t}\n\n\t\tdescribe('version 0.0.2', () => {\n\t\t\tit('can be read and written with no history', async () => {\n\t\t\t\tvalidateSummaryRead('no-history-0.0.2');\n\t\t\t\tvalidateSummaryWrite(fullHistorySummarizer);\n\t\t\t});\n\n\t\t\tit('can be read and written with small history', async () => {\n\t\t\t\tcreateStableEdits(11).forEach((edit) => {\n\t\t\t\t\texpectedTree.processLocalEdit(edit);\n\t\t\t\t});\n\n\t\t\t\t// Wait for the ops to to be submitted and processed across the containers.\n\t\t\t\tawait testObjectProvider.ensureSynchronized();\n\n\t\t\t\tvalidateSummaryRead('small-history-0.0.2');\n\t\t\t\tvalidateSummaryWrite(fullHistorySummarizer);\n\t\t\t});\n\t\t});\n\n\t\tdescribe('version 0.1.0', () => {\n\t\t\tit('can be read and written with no history', async () => {\n\t\t\t\tvalidateSummaryRead('no-history-0.1.0');\n\t\t\t\tvalidateSummaryWrite(fullHistorySummarizer_0_1_0);\n\t\t\t});\n\n\t\t\tit('can be read and written with large history', async () => {\n\t\t\t\ttestObjectProvider.logger.registerExpectedEvent(\n\t\t\t\t\t{ eventName: 'fluid:telemetry:Batching:LengthTooBig' },\n\t\t\t\t\t{ eventName: 'fluid:telemetry:Batching:LengthTooBig' }\n\t\t\t\t);\n\t\t\t\t// Process an arbitrarily large number of stable edits\n\t\t\t\tcreateStableEdits(251).forEach((edit) => {\n\t\t\t\t\texpectedTree.processLocalEdit(edit);\n\t\t\t\t});\n\n\t\t\t\t// Wait for the ops to to be submitted and processed across the containers.\n\t\t\t\tawait testObjectProvider.ensureSynchronized();\n\n\t\t\t\tvalidateSummaryRead('large-history-0.1.0');\n\t\t\t\tvalidateSummaryWrite(fullHistorySummarizer_0_1_0);\n\t\t\t});\n\t\t});\n\t});\n}\n"]}
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "@fluid-experimental/tree",
3
- "version": "0.57.2",
3
+ "version": "0.58.1000",
4
4
  "description": "Distributed tree",
5
5
  "homepage": "https://fluidframework.com",
6
6
  "repository": "https://github.com/microsoft/FluidFramework",
@@ -32,13 +32,13 @@
32
32
  "dependencies": {
33
33
  "@fluidframework/common-definitions": "^0.20.1",
34
34
  "@fluidframework/common-utils": "^0.32.1",
35
- "@fluidframework/container-definitions": "^0.46.0",
35
+ "@fluidframework/container-definitions": "^0.47.1000",
36
36
  "@fluidframework/core-interfaces": "^0.42.0",
37
- "@fluidframework/datastore-definitions": "^0.57.2",
38
- "@fluidframework/protocol-definitions": "^0.1026.0",
39
- "@fluidframework/runtime-definitions": "^0.57.2",
40
- "@fluidframework/shared-object-base": "^0.57.2",
41
- "@fluidframework/telemetry-utils": "^0.57.2",
37
+ "@fluidframework/datastore-definitions": "^0.58.1000",
38
+ "@fluidframework/protocol-definitions": "^0.1027.1000",
39
+ "@fluidframework/runtime-definitions": "^0.58.1000",
40
+ "@fluidframework/shared-object-base": "^0.58.1000",
41
+ "@fluidframework/telemetry-utils": "^0.58.1000",
42
42
  "buffer": "^6.0.3",
43
43
  "denque": "^1.5.0",
44
44
  "lru-cache": "^6.0.0",
@@ -47,13 +47,13 @@
47
47
  },
48
48
  "devDependencies": {
49
49
  "@fluidframework/build-common": "^0.23.0",
50
- "@fluidframework/container-loader": "^0.57.2",
50
+ "@fluidframework/container-loader": "^0.58.1000",
51
51
  "@fluidframework/eslint-config-fluid": "^0.26.0",
52
- "@fluidframework/mocha-test-setup": "^0.57.2",
53
- "@fluidframework/runtime-utils": "^0.57.2",
54
- "@fluidframework/test-drivers": "^0.57.2",
55
- "@fluidframework/test-runtime-utils": "^0.57.2",
56
- "@fluidframework/test-utils": "^0.57.2",
52
+ "@fluidframework/mocha-test-setup": "^0.58.1000",
53
+ "@fluidframework/runtime-utils": "^0.58.1000",
54
+ "@fluidframework/test-drivers": "^0.58.1000",
55
+ "@fluidframework/test-runtime-utils": "^0.58.1000",
56
+ "@fluidframework/test-utils": "^0.58.1000",
57
57
  "@microsoft/api-extractor": "^7.16.1",
58
58
  "@rushstack/eslint-config": "^2.5.1",
59
59
  "@types/lru-cache": "^5.1.0",