@powerhousedao/academy 3.2.0-dev.2 → 3.2.0-dev.3
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +10 -0
- package/docs/academy/01-GetStarted/00-ExploreDemoPackage.mdx +188 -0
- package/docs/academy/01-GetStarted/01-CreateNewPowerhouseProject.md +10 -12
- package/docs/academy/01-GetStarted/02-DefineToDoListDocumentModel.md +8 -8
- package/docs/academy/01-GetStarted/03-ImplementOperationReducers.md +11 -11
- package/docs/academy/01-GetStarted/04-BuildToDoListEditor.md +13 -13
- package/docs/academy/01-GetStarted/_04-BuildToDoListEditor +12 -12
- package/docs/academy/01-GetStarted/home.mdx +50 -51
- package/docs/academy/01-GetStarted/images/Connect.png +0 -0
- package/docs/academy/01-GetStarted/images/Packagemanager.png +0 -0
- package/docs/academy/01-GetStarted/images/TodoDriveApp.png +0 -0
- package/docs/academy/01-GetStarted/styles.module.css +7 -14
- package/docs/academy/02-MasteryTrack/01-BuilderEnvironment/02-StandardDocumentModelWorkflow.md +22 -22
- package/docs/academy/02-MasteryTrack/01-BuilderEnvironment/03-BuilderTools.md +17 -17
- package/docs/academy/02-MasteryTrack/01-BuilderEnvironment/_category_.json +1 -1
- package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/01-WhatIsADocumentModel.md +15 -15
- package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/02-SpecifyTheStateSchema.md +11 -9
- package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/03-SpecifyDocumentOperations.md +15 -15
- package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/04-UseTheDocumentModelGenerator.md +9 -9
- package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/05-ImplementDocumentReducers.md +15 -15
- package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/06-ImplementDocumentModelTests.md +14 -14
- package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/07-ExampleToDoListRepository.md +4 -4
- package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/01-BuildingDocumentEditors.md +28 -30
- package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/02-ConfiguringDrives.md +7 -7
- package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/03-BuildingADriveExplorer.md +9 -10
- package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/07-DocumentTools/01-OperationHistory.md +11 -11
- package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/07-DocumentTools/02-RevisionHistoryTimeline.md +6 -6
- package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/08-Authorization/01-RenownAuthenticationFlow.md +8 -8
- package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/08-Authorization/02-Authorization.md +8 -8
- package/docs/academy/02-MasteryTrack/04-WorkWithData/01-ReadingAndWritingThroughTheAPI.mdx +5 -5
- package/docs/academy/02-MasteryTrack/04-WorkWithData/02-GraphQLAtPowerhouse.md +3 -3
- package/docs/academy/02-MasteryTrack/04-WorkWithData/03-WorkingWithSubgraphs/02-GraphQLAndSubgraphs.mdx +8 -8
- package/docs/academy/02-MasteryTrack/04-WorkWithData/03-WorkingWithSubgraphs/03-WorkingWithSubgraphs.md +28 -28
- package/docs/academy/02-MasteryTrack/04-WorkWithData/04-analytics-processor.md +4 -4
- package/docs/academy/02-MasteryTrack/04-WorkWithData/05-AnalyticsProcessorTutorial/01-SetupBuilderEnvironment.md +14 -14
- package/docs/academy/02-MasteryTrack/04-WorkWithData/05-AnalyticsProcessorTutorial/02-CreateNewPowerhouseProject.md +2 -2
- package/docs/academy/02-MasteryTrack/04-WorkWithData/05-AnalyticsProcessorTutorial/03-GenerateAnAnalyticsProcessor.md +6 -6
- package/docs/academy/02-MasteryTrack/04-WorkWithData/05-AnalyticsProcessorTutorial/04-UpdateAnalyticsProcessor.md +1 -1
- package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/GraphQL References/QueryingADocumentWithGraphQL.md +2 -2
- package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/best-practices.md +4 -4
- package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/graphql/index.md +7 -7
- package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/graphql/integration.md +1 -1
- package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/intro.md +6 -6
- package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/browser.md +1 -1
- package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/index.md +5 -5
- package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/memory.md +1 -1
- package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/pg.md +2 -2
- package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/schema.md +1 -1
- package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/utilities.md +1 -1
- package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/use-cases/index.md +1 -1
- package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/use-cases/maker.md +12 -12
- package/docs/academy/02-MasteryTrack/05-Launch/01-IntroductionToPackages.md +9 -9
- package/docs/academy/02-MasteryTrack/05-Launch/02-PublishYourProject.md +8 -8
- package/docs/academy/02-MasteryTrack/05-Launch/03-SetupEnvironment.md +35 -35
- package/docs/academy/02-MasteryTrack/05-Launch/04-ConfigureEnvironment.md +8 -8
- package/docs/academy/02-MasteryTrack/_category_.json +1 -1
- package/docs/academy/03-ExampleUsecases/Chatroom/03-DefineChatroomDocumentModel.md +5 -5
- package/docs/academy/07-Cookbook.md +105 -105
- package/package.json +1 -1
- package/sidebars.ts +9 -10
- package/src/css/custom.css +18 -0
- package/docs/academy/01-GetStarted/00-ExploreDemoPackage.md +0 -88
@@ -1,9 +1,9 @@
-# Working with
+# Working with subgraphs
 
-This tutorial will demonstrate how to create and customize a subgraph using our
+This tutorial will demonstrate how to create and customize a subgraph using our To-do List project as an example.
 Let's start with the basics and gradually add more complex features and functionality.
 
-## What is a
+## What is a subgraph?
 
 A subgraph in Powerhouse is a **GraphQL-based modular data component** that extends the functionality of your document models. While document models handle the core state and operations, subgraphs can:
 1. Connect to external APIs or databases
@@ -11,19 +11,19 @@ A subgraph in Powerhouse is a **GraphQL-based modular data component** that exte
 3. Automate interactions between different document models
 4. Provide additional backend functionality
 
-### Subgraphs can retrieve data from
+### Subgraphs can retrieve data from
 
 - **The Reactor** – The core Powerhouse data system or network node.
 - **Operational Data Stores** – Structured data storage for operational processes, offering real-time updates, for querying structured data.
 - **Analytics Stores** – Aggregated historical data, useful for insights, reporting and business intelligence.
 
-### Subgraphs consist of
+### Subgraphs consist of
 
 - **A schema** – Which defines the GraphQL Queries and Mutations.
 - **Resolvers** – Which handle data fetching and logic.
 - **Context Fields** – Additional metadata that helps in resolving data efficiently.
 
-####
+#### Additionally, context fields allow resolvers to access extra information, such as:
 - **User authentication** (e.g., checking if a user is an admin).
 - **External data sources** (e.g., analytics).
 
@@ -38,9 +38,9 @@ context: {
 }
 ```
 
-## 1. How to generate a subgraph
+## 1. How to generate a subgraph
 
-Lets start by generating a new subgraph. For our tutorial we will create a new subgraph within our
+Lets start by generating a new subgraph. For our tutorial we will create a new subgraph within our To-do List project.
 Open your project and start your terminal.
 The Powerhouse toolkit provides a command-line utility to create new subgraphs easily.
 
@@ -55,7 +55,7 @@ Loaded templates: node_modules/@powerhousedao/codegen/dist/codegen/.hygen/templa
 inject: ./subgraphs/index.ts
 ```
 
-### What
+### What happened?
 1. A new subgraph was created in `./subgraphs/to-do-list-subgraph/`
 2. The subgraph was automatically registered in your project's registry
 3. Basic boilerplate code was generated with an example query
@@ -67,7 +67,7 @@ If we now run 'phreactor' we will see the new subgraph being registered during t
 
 Now that we've generated our subgraph, let's open it and define the schema inside the `index.ts` file.
 
-### 2.1 Define the
+### 2.1 Define the schema
 
 Here we define the schema (typeDefs) which defines the structure of your queries and mutations.
 For educational purposes we will define a simple query that mimics the functionality of the todoList interface (or editor):
@@ -83,7 +83,7 @@ type Query {
 }
 ```
 
-### What
+### What happened?
 
 - Added two queries: todoList and todoItems
 - Created an operational table todo_items to store the todo items
@@ -93,7 +93,7 @@ type Query {
 - The todoList query returns the full list with its statistics
 
 
-### 2.2 Implement the
+### 2.2 Implement the resolver for the subgraph's schema
 Resolvers define how data is retrieved or modified.
 If you query for a specific value you can retrieve the value from either the reactor itself or an operational datastore.
 We'll look into this in more detail in the next section.
@@ -108,7 +108,7 @@ resolvers: {
 }
 ```
 
-### 2.3 Add
+### 2.3 Add operational data storage (optional)
 If you need to persist data, initialize an operational datastore inside onSetup():
 
 ```typescript title="Adding an operational datastore"
@@ -126,7 +126,7 @@ async createOperationalTables() {
 }
 ```
 
-### 2.4 Fetching from an
+### 2.4 Fetching from an operational store
 If your subgraph interacts with an Operational Data Store, modify the resolver:
 
 ```typescript title="Example of a resolver that fetches data from an operational store"
@@ -139,9 +139,9 @@ resolvers: {
 }
 ```
 
-### 2.5 Connecting to a
+### 2.5 Connecting to a processor (optional, but recommended)
 
-#### Why
+#### Why connect a processor?
 Subgraphs alone are limited. A subgraph only queries data, but doesn't generate or store it.
 To make subgraphs useful, connect them with processors that update the data dynamically.
 **A processor listens to system events and updates the operational store in real-time.**
@@ -173,9 +173,9 @@ resolvers: {
 }
 ```
 
-## 3. Testing the
+## 3. Testing the subgraph
 
-### 3.1. Start the
+### 3.1. Start the reactor
 To activate the subgraph, run:
 
 ```bash
@@ -187,7 +187,7 @@ Or, for full system startup:
 ph dev
 ```
 
-### 3.2. Access GraphQL
+### 3.2. Access GraphQL playground
 Open your browser and go to:
 
 ```bash
@@ -199,7 +199,7 @@ Example:
 http://localhost:4001/test-subgraph
 ```
 
-### 3.3. Run a
+### 3.3. Run a query
 
 ```graphql
 query {
@@ -207,7 +207,7 @@ query {
 }
 ```
 
-### 3.4. Expected
+### 3.4. Expected response
 If everything works, you should see:
 
 ```json
@@ -218,16 +218,16 @@ If everything works, you should see:
 }
 ```
 
-## 4. Working with the
+## 4. Working with the supergraph or gateway
 
 A supergraph is a GraphQL schema that combines multiple underlying GraphQL APIs, known as subgraphs, into a single, unified graph. This architecture allows different teams to work independently on their respective services (subgraphs) while providing a single entry point for clients or users to query all available data
 
-### 4.1 Key
+### 4.1 Key concepts
 
 * **Subgraph:** An independent GraphQL service with its own schema. Each subgraph typically represents a specific domain or microservice within a larger system.
 * **Gateway/Router:** A server that sits in front of the subgraphs. It receives client queries, consults the supergraph schema, and routes parts of the query to the relevant subgraphs. It then stitches the results back together before sending the final response to the client.
 
-### 4.2 Benefits of
+### 4.2 Benefits of using a supergraph
 
 * **Federated Architecture:** Enables a microservices-based approach where different teams can own and operate their services independently.
 * **Scalability:** Individual subgraphs can be scaled independently based on their specific needs.
@@ -236,7 +236,7 @@ A supergraph is a GraphQL schema that combines multiple underlying GraphQL APIs,
 * **Clear Separation of Concerns:** Each subgraph focuses on a specific domain, leading to more maintainable and understandable codebases.
 
 
-### 4.3 Use the Powerhouse
+### 4.3 Use the Powerhouse supergraph
 
 The Powerhouse supergraph for any given remote drive or reactor can be found under `http://localhost:4001/graphql`. The gateway / supergraph available on `/graphql` combines all the subgraphs, except for the drive subgraph (which is accessible via `/d/:driveId`). To get to the endpoint open your localhost by starting the reactor and adding `graphql` to the end of the url. The following commands explain how you can test & try the supergraph.
 
@@ -279,9 +279,9 @@ The supergraph allows to both query & mutate data from the same endpoint.
 This is a quick example of how the supegraph can be used.
 
 
-## Subgraphs are particularly useful for
+## Subgraphs are particularly useful for
 
-1. **Cross-Document Interactions**: For example, connecting a
+1. **Cross-Document Interactions**: For example, connecting a To-do List with an Invoice document model:
 - When an invoice-related task is marked complete, update the invoice status
 - When an invoice is paid, automatically check off related tasks
 
@@ -300,7 +300,7 @@ This is a quick example of how the supegraph can be used.
 Some subgraphs (e.g., System Subgraph, Drive Subgraph) already exist.
 To integrate with them, register them via the Reactor API.
 
-### Future
+### Future enhancements
 
 Bridge Processors and Subgraphs – Currently, there's a gap in how processors and subgraphs interact. Powerhouse might improve this in future updates.
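The subgraph anatomy described in the diff above — a schema (`typeDefs`), resolvers, and an operational store the resolvers read from — can be sketched in plain TypeScript. This is an illustrative sketch only: the `TodoItem` shape and the in-memory `operationalStore` are stand-ins, not the actual output of the Powerhouse codegen templates.

```typescript
// Hypothetical sketch of a subgraph's three parts: schema, resolvers,
// and an operational store. Names here are illustrative, not the real API.

type TodoItem = { id: string; text: string; checked: boolean };

// Stand-in for an operational data store (a real subgraph would query a database).
const operationalStore: TodoItem[] = [
  { id: "1", text: "Write schema", checked: true },
  { id: "2", text: "Wire up resolver", checked: false },
];

// The schema defines the structure of queries (and mutations).
const typeDefs = /* GraphQL */ `
  type TodoItem { id: ID! text: String! checked: Boolean! }
  type Query { todoItems: [TodoItem!]! }
`;

// Resolvers define how data is retrieved: here, straight from the store.
const resolvers = {
  Query: {
    todoItems: async (): Promise<TodoItem[]> => operationalStore,
  },
};

resolvers.Query.todoItems().then((items) => {
  console.log(items.length, "items,", items.filter((i) => i.checked).length, "checked");
});
```

A real subgraph module would hand `typeDefs` and `resolvers` to the reactor's GraphQL server; the pattern of resolving a query by reading the operational store is the same.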
@@ -1,8 +1,8 @@
-# Analytics
+# Analytics processors
 
 An Analytics Processor is an object that can track analytics for operations and state changes on a set of document models. These analytics can be used to generate bespoke dashboards and reports, specific to the type or implementation of the document model.
 
-## Generating an
+## Generating an analytics processor with the CLI
 
 The `ph-cli` utility can be used to generate the scaffolding for an Analytics Processor.
 
@@ -109,7 +109,7 @@ Model-specific code will go where the `console.log` statement currently resides.
 
 > It is best practice to batch insert all updates to the analytics system. In this example, we add all updates to an array of inputs, then insert them all at once. This is optimal over `await`-ing each separate value.
 
-## Learn
+## Learn by example: RwaAnalyticsProcessor
 
 In the `reactor-local` package, we have implemented a processor for the `makerdao/rwa-portfolio` document type. This is a document model that tracks MakerDAO's Real World Asset (RWA) portfolio. It was initially generated using the `ph-cli` utility.
 
@@ -216,7 +216,7 @@ With variables:
 }
 ```
 
-## Learn
+## Learn by example: Document operations
 
 The RWA processor example pulls information from operation _inputs_ to insert analytics data. Another use case might be to capture meta-analytics from the states themselves.
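The batching advice quoted in this diff ("add all updates to an array of inputs, then insert them all at once") can be illustrated with a small TypeScript sketch. The `AnalyticsInput` shape, `FakeAnalyticsStore`, and `addSeriesValues` name below are assumptions for illustration, not the real analytics API.

```typescript
// Hypothetical batching sketch: collect all inputs, then insert once,
// instead of awaiting one insert per value.
type AnalyticsInput = { metric: string; value: number };

class FakeAnalyticsStore {
  // Each element is one insert call, so we can count round-trips.
  public batches: AnalyticsInput[][] = [];
  async addSeriesValues(inputs: AnalyticsInput[]): Promise<void> {
    this.batches.push(inputs);
  }
}

async function onOperations(store: FakeAnalyticsStore, values: number[]) {
  const inputs: AnalyticsInput[] = [];
  for (const value of values) {
    // Accumulate instead of `await store.addSeriesValues([input])` per value.
    inputs.push({ metric: "TransactionValue", value });
  }
  await store.addSeriesValues(inputs); // single round-trip for the whole batch
}

const store = new FakeAnalyticsStore();
onOperations(store, [100, 250, 50]).then(() => {
  console.log(store.batches.length, "insert call(s),", store.batches[0].length, "inputs");
});
```

The point is the shape of the loop: one insert call per batch of operations, not one per value.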
@@ -1,4 +1,4 @@
-# Setup
+# Setup builder environment
 
 Let's set up your computer to start with any of the Powerhouse Connect tutorials. Don't worry if this is your first time setting up a development environment - we'll guide you through each step!
 
@@ -6,7 +6,7 @@ Let's set up your computer to start with any of the Powerhouse Connect tutorials
 If you've already setup git, node, and npm, your most important step is to install the Powerhouse CLI with the command `npm install ph-cmd`. A global install is recommended if you want to use the command from any directory as a power user. The Powerhouse CLI is used to create, build, and run your Document Models and give you direct access to a series of Powerhouse ecosystem tools.
 :::
 
-## Table of
+## Table of contents
 - [Prerequisites](#prerequisites)
 - [Installing node.js 22](#installing-nodejs)
 - [For Windows](#for-windows)
@@ -27,11 +27,11 @@ If you've already setup git, node, and npm, your most important step is to insta
 
 Before we begin with any of the Powerhouse Connect tutorials, we need to install some software on your computer. We'll need three main tools: node.js 22, which helps us run our code, Visual Studio Code (VS Code), which is where we'll write our code, and Git, which helps us manage our code. Follow the steps below based on your computer's operating system.
 
-### Installing
+### Installing Node.js 22
 
 node.js 22 is a tool that lets us run our application. Let's install it step by step.
 
-#### For Windows
+#### For Windows
 1. **Set up PowerShell for running commands:**
 - Press the Windows key
 - Type "PowerShell"
@@ -60,7 +60,7 @@ node.js 22 is a tool that lets us run our application. Let's install it step by
 
 > **Note**: If node.js 22 commands don't work in VS Code, restart VS Code to refresh environment variables.
 
-#### For macOS
+#### For macOS
 1. **Install Homebrew:**
 - Open Terminal (press Command + Space and type "Terminal")
 - Copy and paste this command into Terminal and press Enter:
@@ -83,7 +83,7 @@ node.js 22 is a tool that lets us run our application. Let's install it step by
 ```
 - If you see version numbers, you've successfully installed node.js 22!
 
-#### For Linux (Ubuntu/Debian)
+#### For Linux (Ubuntu/Debian)
 1. **Open Terminal:**
 - Press Ctrl + Alt + T on your keyboard, or
 - Click the Activities button and type "Terminal"
@@ -110,7 +110,7 @@ node.js 22 is a tool that lets us run our application. Let's install it step by
 
 VS Code is the editor we'll use to write our code. Here's how to install it:
 
-#### For Windows
+#### For Windows
 1. Visit the [Visual Studio Code website](https://code.visualstudio.com/)
 2. Click the blue "Download for Windows" button
 3. Once the installer downloads, double-click it
@@ -120,7 +120,7 @@ VS Code is the editor we'll use to write our code. Here's how to install it:
 7. Click "Next" and then "Install"
 8. When installation is complete, click "Finish"
 
-#### For macOS
+#### For macOS
 1. Visit the [Visual Studio Code website](https://code.visualstudio.com/)
 2. Click the blue "Download for Mac" button
 3. Once the .zip file downloads, double-click it to extract
@@ -131,7 +131,7 @@ VS Code is the editor we'll use to write our code. Here's how to install it:
 - Press Command + Shift + P
 - Type "shell command" and select "Install 'code' command in PATH"
 
-#### For Linux (Ubuntu/Debian)
+#### For Linux (Ubuntu/Debian)
 1. Open Terminal (Ctrl + Alt + T)
 2. First, update the packages list:
 ```bash
@@ -159,7 +159,7 @@ VS Code is the editor we'll use to write our code. Here's how to install it:
 
 ### Install Git
 
-#### Windows
+#### For Windows
 1. Open PowerShell (press Windows key, type "PowerShell", and press Enter)
 2. Visit the [Git website](https://git-scm.com/)
 3. Download the latest version for Windows
@@ -169,7 +169,7 @@ VS Code is the editor we'll use to write our code. Here's how to install it:
 git --version
 ```
 
-#### macOS
+#### For macOS
 1. Install using Homebrew:
 ```bash
 brew install git
@@ -179,7 +179,7 @@ VS Code is the editor we'll use to write our code. Here's how to install it:
 git --version
 ```
 
-#### Linux (Ubuntu/Debian)
+#### For Linux (Ubuntu/Debian)
 1. Update package list:
 ```bash
 sudo apt update
@@ -193,7 +193,7 @@ VS Code is the editor we'll use to write our code. Here's how to install it:
 git --version
 ```
 
-### Configure Git (
+### Configure Git (all systems)
 
 Open your terminal (command prompt) and run the following commands to set up Git:
 
@@ -203,7 +203,7 @@ git config --global user.name "Your Name"
 git config --global user.email "your.email@example.com"
 ```
 
-### Verify
+### Verify installation
 
 Open your terminal (command prompt) and run the following commands to verify your setup:
 ```bash
@@ -1,4 +1,4 @@
-# Create
+# Create new Powerhouse project
 
 Before we start diving into the tutorial, lets have a quick look at all the steps we'll go through.
 
@@ -16,7 +16,7 @@ Let's start with step 1 & 2 in the next section of the tutorial!
 
 To create a new Powerhouse Document Model Library project, you can use the `ph init` command in your terminal. This command will create a new project in the current directory.
 
-## Create
+## Create new Powerhouse document model library project
 
 :::info
 This command will create a new project in the current directory.
@@ -1,6 +1,6 @@
-# Generate the
+# Generate the reactor and analytics processor
 
-### Launch the
+### Launch the reactor
 
 **First**, navigate to the project directory with `cd rwa-analytics` to start the reactor from the project directory.
 Now open the directory with your code editor with the command `code . `
@@ -24,7 +24,7 @@ Next, launch the reactor by running the following command in your terminal.
 
 As you can see the reactor registered the default system subgraph but also the `d/:drive` subgraph for fetching documents and pushing updates. It also registered the analytics subgraph, which we'll use later for running analytics queries on the document model in the next steps.
 
-### Generate the
+### Generate the analytics processor
 **Next** we'll open up a second terminal window and generate the analytics processor.
 Use the following command to generate the analytics processor in a seperate terminal window.
 ```bash
@@ -34,7 +34,7 @@ In your project directory you'll now see the folder 'processors' being created.
 Later we'll see how this processor is actively listening for the operations on a document model.
 Eventually we'll replace the default code of the generated processor with our own custom analytics processor code to listen to specific operations, dimensions and metrics.
 
-### Add data to the document model in the
+### Add data to the document model in the reactor
 
 Now that we've generated the analytics processor we need to give it some data to listen to.
 For this we've created the local reactor instance, which will represent a node in the Powerhouse network.
@@ -111,7 +111,7 @@ Click on the 'Save' button to save the transaction.
 
 You'll now see the transaction appear in the transaction table.
 
-### Let's buy
+### Let's buy and sell some T-bills!
 
 Now that we've created the T-bill asset and took out a loan from Sky to purchase it, we can start buying some T-bills.
 Let's buy 2,500,000 T-bills for $2,490,000 on May 15.
@@ -157,7 +157,7 @@ Your table of transactions should now look like this:
 
 ![RWA Transactions](./images/transactiontable.png)
 
-### Verify that the document data is correctly entered
+### Verify that the document data is correctly entered and inspect the document history
 
 Connect offers you with a few ways to inspect the document data and history. Click on the 'revision history' button in the top right corner of your document model in Connect to see the history of your operations on the document model.
@@ -1,4 +1,4 @@
-# Update the
+# Update the analytics processor
 
 Now let's implement our custom analytics processor code to calculate the total value of the asset class on a monthly basis.
 You will likely already be in the desired directory, but if not, navigate to the project directory and open it.
@@ -16,7 +16,7 @@ We will show how you can receive:
 > #### 2. Only the latest changes and updates:
 > Or specific operations of a document by registering a listener with a specific filter.
 
-### Adding the specific
+### Adding the specific document drive to Connect with a URL
 
 The specific drive where your RWA-report or any other document is being stored can be accessed through Connect. Connect functions as your decentralised document drive to access and edit document models that capture the data present in your organisation.
 
@@ -44,7 +44,7 @@ It extracts common fields such as id, name, documentType, revision, created, and
 
 Additionally, it retrieves specific data related to the 'Real World Assets' document model, including accounts, SPVs, fixed income types, fee types, portfolio, and transactions. The RWA section of the query is designed to pull in detailed information about the financial structure and transactions of real-world assets managed within the document model.
 
-#### Common
+#### Common document fields
 
 - `name`: The name of the document.
 - `documentType`: The type classification of the document.
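The processor updated in this diff calculates the total value of an asset class on a monthly basis. The grouping step that this implies can be sketched generically in TypeScript — the `Txn` shape and field names below are illustrative, not the RWA document model's actual fields.

```typescript
// Hypothetical monthly aggregation, mirroring the processor's goal of
// totaling value per month. Field names are illustrative only.
type Txn = { timestamp: string; amount: number };

function monthlyTotals(txns: Txn[]): Map<string, number> {
  const totals = new Map<string, number>();
  for (const t of txns) {
    const month = t.timestamp.slice(0, 7); // ISO timestamps -> "YYYY-MM" bucket
    totals.set(month, (totals.get(month) ?? 0) + t.amount);
  }
  return totals;
}

const totals = monthlyTotals([
  { timestamp: "2024-05-15T00:00:00Z", amount: 2_490_000 },
  { timestamp: "2024-05-20T00:00:00Z", amount: -1_000_000 },
  { timestamp: "2024-06-01T00:00:00Z", amount: 500_000 },
]);
console.log(totals.get("2024-05"), totals.get("2024-06"));
// → 1490000 500000
```

In the real processor this aggregation would run over operations received from the listener and the results would be written to the analytics store.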
@@ -2,13 +2,13 @@
 sidebar_position: 1000
 ---
 
-# Best
+# Best practices
 
 ### GraphQL
 
 **Start general, end with detailed queries.** In the case you want to find a specific number for a specific metric and dimension, the rule of thumb is to start general in the filter definitions and slowly start cutting down the paths to a particular team or budget period.
 
-For example: Looking at the dimension section in the filter options: To see available sub paths for the budget, leave the select: "atlas" and lod as 5 . With lod:5 you
+For example: Looking at the dimension section in the filter options: To see available sub paths for the budget, leave the select: "atlas" and lod as 5 . With lod:5 you'll be able to see the different paths available under your defined filter. Then, later you can apply a more detailed path in the select: "atlas/..." path that will suit your needs.
 
 ```graphql
 {
@@ -30,7 +30,7 @@ For example: Looking at the dimension section in the filter options: To see avai
 
 **Fetch per specified path** if performance is an issue.
 
-### Guidelines for
+### Guidelines for selecting and combining dimensions
 
 1. **Understand the Purpose of Analysis**
 Before selecting dimensions, clarify the objective of your analysis. Are you looking to track expenses for a specific project, analyze budget utilization, or examine transaction patterns? Your objective will guide which dimensions are most relevant.
@@ -54,7 +54,7 @@ While combining dimensions can provide depth, avoid overly complex combinations
 Where icons are available, they can be used as a quick visual reference to identify different dimensions or categories, particularly in user interfaces where rapid identification is beneficial.
 
 8. **Experiment and Iterate**
-Don
+Don't hesitate to experiment with different combinations of dimensions to see which provide the most meaningful insights. The flexibility of the dimensions allows for various permutations and combinations to suit diverse analytical needs.
 
 9. **Stay Updated**
 Keep abreast of any changes or additions to the dimensions within the analytics engine, as this can impact ongoing and future analyses.
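The "start general, then narrow" advice in the best-practices diff above can be made concrete with a toy sketch. The dimension paths and the way `select`/`lod` are applied here are purely illustrative — this is not the analytics engine's real filter implementation.

```typescript
// Toy model of dimension-path filtering: `select` is a path prefix,
// `lod` caps how many path segments are returned. Illustrative only.
function subPaths(paths: string[], select: string, lod: number): string[] {
  return paths
    .filter((p) => p === select || p.startsWith(select + "/"))
    .map((p) => p.split("/").slice(0, lod).join("/"));
}

const known = [
  "atlas/legacy/core-units/SES-001",
  "atlas/legacy/core-units/PE-001",
  "atlas/scopes/SUP/incubation",
];

// General first: a broad select with a high lod reveals the available sub paths…
const broad = new Set(subPaths(known, "atlas", 5));
// …then narrow the select once you know which sub path you actually need.
const narrow = subPaths(known, "atlas/legacy/core-units", 4);
console.log(broad.size, narrow.length);
```

The workflow matches the prose: one exploratory query to discover paths, then a targeted query with a longer `select` for the specific team or budget period.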
|
@@ -6,7 +6,7 @@ sidebar_position: 3
|
|
|
6
6
|
|
|
7
7
|
A basic understanding of GraphQL is helpful to understand this document, as we use concepts and vocabulary from the broader ecosystem. For a good primer on the subject, see the [Introduction to GraphQL](https://graphql.org/learn/).
## Basic GraphQL structure
- **Query**: At its core, a GraphQL query is about asking or 'querying' for specific fields on objects. The query is structured like a JSON object, but without the values. This structure lets you specify exactly which data you want to fetch from the server.
In this example, `query` is the operation type, `user` is a field on the root type, and `name`, `email`, and `friends` are fields on the user type. The `id: "1"` argument is a parameter passed to the query, specifying which user to fetch.
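As a sketch, a query matching this description would look as follows (the exact fields selected under `friends` are an assumption):

```graphql
query {
  user(id: "1") {
    name
    email
    friends {
      name
    }
  }
}
```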
You can find more examples of GraphQL core concepts in the section [GraphQL at Powerhouse](/academy/MasteryTrack/WorkWithData/GraphQLAtPowerhouse).
## Analytics query structure
In the GraphQL query for the Analytics Engine below, each element requests specific pieces of data from the analytics API. Here is a breakdown of the fields and elements, with their descriptions in the dropdowns below.
In the analytics engine, the filter object in a GraphQL query is crucial for tailoring the data retrieval to specific requirements. For the query to function correctly, a complete and fully structured filter object, containing all necessary parameters, must be used. Here's a guide on how to construct these filter objects and examples of different filter configurations.
### Constructing a complete filter object
A filter object must include all the following parameters:
**The level of detail (`lod`) parameter**
- **Impact on Analysis**: The level of detail can significantly affect the analysis. For instance, a high lod can provide in-depth insights into a specific area, useful for detailed audits or close examination of a particular segment. Conversely, a low lod is better for broader overviews or when comparing larger categories.
**Importance of Correct Configuration**
- **Accuracy of Results**: Setting the correct path in the select field and aligning it with an appropriate lod ensures the accuracy and relevance of the query results. Incorrect or mismatched configurations might lead to misleading insights or data that doesn't serve the intended analytical purpose.
- **Customized Analysis**: Together, the select field and lod allow for a high degree of customization in data queries. Users can tailor their data requests precisely to their specific requirements, whether they need a broad overview or a detailed breakdown.
- Follow the correct upper- or lowercase style for metrics, granularity, and dimensions.
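Putting these points together, a complete filter object might look like the sketch below. The field names and values here are assumptions pieced together from the metrics list, the granularity options, and the `select`/`lod` examples in this documentation; check them against your schema before use:

```graphql
filter: {
  start: "2023-01-01"
  end: "2023-12-31"
  granularity: monthly
  metrics: [Budget, Actuals]
  dimensions: [{ name: "budget", select: "atlas", lod: 3 }]
}
```

Note how the casing differs between the `metrics` values (capitalized) and the `granularity` value (lowercase); this is exactly the kind of detail the guidance above is about.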
## Troubleshooting
### Common issues
**Mistyping the filter options or not using proper upper or lower case when necessary.** Make sure to check this first when running into problems.
### Error messages
Error messages are usually explicit, making it easy to understand what went wrong. Below are some examples.
- `"message": "Cannot read properties of undefined (reading 'filter')"`, → There is something wrong with the filter options, make sure you've added all fields in the filter options.
- `"message": "No valid metrics provided, make sure to use metrics from this list: Actuals, Budget, Forecast, FTEs, PaymentsOffChainIncluded, PaymentsOnChain, ProtocolNetOutflow"` → The metrics were provided in the wrong format; use the exact names from the list.
package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/graphql/integration.md
The `analytics-engine-graphql` module provides types and resolvers needed for a fully-functional GraphQL Server API. This library has no dependencies on any particular server but has been tested using [Apollo Server 3 and 4](https://www.apollographql.com/docs/apollo-server).
## Resolvers and types
The graphql package contains typical resolvers and typedefs. The resolvers, however, are dependent on another object: `AnalyticsModel`. This object is responsible for connecting GQL types with an `IAnalyticsStore` implementation.
# Getting started
## Introduction
This system can be broken up into several major systems: **queries**, **engine components** (including filters and aggregation), and **storage**. Each of these systems has detailed documentation, but it is most helpful to start with a holistic understanding of the major pieces of data: series and dimensions.
### Series and dimensions
All metrics are collected and stored using only two objects: **AnalyticsSeries** and **AnalyticsDimension**. To be a successful consumer of this system, it is vital to understand the intent of these two pieces of data.
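To build intuition, here is an illustrative TypeScript sketch of the kind of data a series entry carries. These are simplified shapes invented for explanation, not the library's actual type definitions:

```typescript
// Simplified, illustrative shapes -- not the library's actual types.

// A series entry holds a metric value over a time span, attributed to a
// source path, and tagged with one or more dimensions.
interface SeriesValueSketch {
  start: Date;          // beginning of the period the value applies to
  end: Date | null;     // end of the period (null for an instantaneous value)
  source: string;       // an analytics path, e.g. "atlas/..."
  metric: string;       // e.g. "Budget" or "Actuals"
  value: number;        // the measured quantity
  unit: string | null;  // e.g. a currency code
  dimensions: Record<string, string>; // dimension name -> dimension path
}

const example: SeriesValueSketch = {
  start: new Date("2023-01-01"),
  end: null,
  source: "atlas/example",
  metric: "Budget",
  value: 1000,
  unit: "DAI",
  dimensions: { budget: "atlas/example" },
};
```

The key idea is the split: the series carries the values over time, while the dimensions attached to each entry describe what those values relate to.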

### Structure vs. reporting
The Series and Dimensions objects give control over how metric data is defined and related. That is, they define the **structure** of data. Three new concepts (paths, LODs, and granularity) are parameters used to define how the system should _report_ the data.
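To build intuition for how an LOD interacts with a path, an LOD of `n` effectively groups results by the first `n` segments of a dimension path. The sketch below is an illustrative re-implementation of that idea, not the engine's code, and the example paths are hypothetical:

```typescript
// Illustrative only: group a path by its first `lod` segments.
function truncateToLod(path: string, lod: number): string {
  return path.split("/").slice(0, lod).join("/");
}

// With lod 2, detailed paths collapse into broader buckets:
truncateToLod("atlas/legacy/core-units/SES-001", 2); // "atlas/legacy"
truncateToLod("atlas/legacy/core-units/PE-001", 2);  // "atlas/legacy"

// A higher lod keeps more detail:
truncateToLod("atlas/legacy/core-units/SES-001", 3); // "atlas/legacy/core-units"
```

Series values whose dimension paths collapse into the same bucket are aggregated together in the reported results.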
While LODs allow you to aggregate series data across dimensions, _granularity_ refers to how data is aggregated over time. It determines the time span each data point or record covers. Choosing the right granularity is crucial for meaningful analysis, as it affects the interpretation and insights that can be drawn from the data.
#### Available options
The analytics engine supports various granularity options, each suitable for different types of analysis:
6. `daily`: Offers a day-to-day breakdown, ideal for detailed analysis of daily operations.
7. `hourly`: The most granular level, providing insights into hourly fluctuations. Useful in scenarios where short-term data spikes or dips are significant.
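As a rough illustration of how the choice of granularity changes data volume (the period counts below are approximate and this is not engine code):

```typescript
// Approximate number of data points produced per year of data
// for each granularity option (illustrative only).
const periodsPerYear: Record<string, number> = {
  annual: 1,
  quarterly: 4,
  monthly: 12,
  weekly: 52,
  daily: 365,
  hourly: 8760,
};

// hourly produces 24x the points of daily:
periodsPerYear.hourly / periodsPerYear.daily; // 24
```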
#### How granularity affects query results
- **Data Volume**: Higher granularity (like hourly or daily) results in a larger volume of data points, providing more detailed insights but potentially leading to more complex analysis. Lower granularity (like annual or total) simplifies the data into fewer, broader data points.
These are high-level definitions of major terms you will need to know.
* **Granularity** - Determines how to aggregate values over time, like "monthly" or "yearly".
## Next steps
Next steps depend on your use case.
package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/index.md
The `core` library contains common data types and abstractions used throughout.
The `knex`, `pg`, and `browser` libraries contain various storage implementations. Finally, the `graphql` library contains types, resolvers, and data types for a GraphQL API on top.
## Querying data
The entry point for data queries in Typescript is the `AnalyticsQueryEngine`. This wraps an `IAnalyticsStore` implementation, which will be discussed in detail later.
- `sum`: A total or aggregated value of the metric over the entire period, providing a summarized figure.
### A note about times
Time is a bit of a sore subject in JavaScript, and we wrestled a bit with how to attack this problem.
As this is a time-series analytics system, times are rather important to get right.
This is why our analytics API uses these luxon types as inputs and outputs to the system: to make it hard to screw up.
## Inserting data
The `IAnalyticsStore` interface is the primary entry point for inserting and deleting data. Multiple storage implementations are provided, but for simplicity we can get up and running quickly with the [`MemoryAnalyticsStore`](#memory).
## Subscribing to data changes
The `IAnalyticsStore` also provides an API for subscribing to data changes. This is achieved by subscribing to an `AnalyticsPath`.
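To illustrate the idea of subscribing to a path, here is a simplified sketch of prefix-style matching. This is invented for explanation and is not the engine's actual matching logic:

```typescript
// Illustrative: a subscription to "atlas/foo" would be notified for any
// source path that falls under it, compared segment by segment.
function isUnderPath(subscription: string, source: string): boolean {
  const sub = subscription.split("/");
  const src = source.split("/");
  return sub.every((segment, i) => src[i] === segment);
}

isUnderPath("atlas/foo", "atlas/foo/test/a/b"); // true  -> subscriber notified
isUnderPath("atlas/foo", "rwa/foo/test");       // false -> no notification
```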
```typescript
await store.addSeriesValue({ source: AnalyticsPath.fromString("rwa/foo/test"), ... });
```
## Store implementations
Multiple storage implementations are provided, each with comprehensive documentation. See the corresponding docs for:
package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/memory.md
# Memory store
The `MemoryAnalyticsStore` is an `IAnalyticsStore` implementation that uses an in-memory database as its storage mechanism. Under the hood, we load a WASM build of Postgres, called [PGlite](https://pglite.dev/).
|
|
8
8
|
|