@powerhousedao/academy 3.2.0-dev.7 → 3.2.0-dev.9
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +21 -0
- package/docs/academy/01-GetStarted/00-ExploreDemoPackage.mdx +60 -51
- package/docs/academy/01-GetStarted/01-CreateNewPowerhouseProject.md +8 -28
- package/docs/academy/01-GetStarted/02-DefineToDoListDocumentModel.md +1 -1
- package/docs/academy/01-GetStarted/images/OperationsHistory.png +0 -0
- package/docs/academy/01-GetStarted/images/OperationsHistoryButton.png +0 -0
- package/docs/academy/01-GetStarted/images/OperationsHistorySignature.png +0 -0
- package/docs/academy/02-MasteryTrack/04-WorkWithData/04-analytics-processor.md +196 -4
- package/docs/academy/02-MasteryTrack/04-WorkWithData/05-AnalyticsProcessorTutorial/02-CreateNewPowerhouseProject.md +4 -2
- package/docs/academy/02-MasteryTrack/04-WorkWithData/07-drive-analytics.md +467 -0
- package/docs/academy/03-ExampleUsecases/Chatroom/02-CreateNewPowerhouseProject.md +2 -1
- package/docs/academy/06-ComponentLibrary/00-DocumentEngineering.md +52 -0
- package/docs/academy/07-Cookbook.md +2 -2
- package/package.json +1 -1
- package/src/css/custom.css +15 -0
package/CHANGELOG.md
CHANGED
@@ -1,3 +1,24 @@
+## 3.2.0-dev.9 (2025-07-02)
+
+### 🩹 Fixes
+
+- updated processor generator and added codegen test for it ([6af3bbcf7](https://github.com/powerhouse-inc/powerhouse/commit/6af3bbcf7))
+- added test to generate and compile a generated document-model ([17bbca3bb](https://github.com/powerhouse-inc/powerhouse/commit/17bbca3bb))
+
+### ❤️ Thank You
+
+- Benjamin Jordan (@thegoldenmule)
+
+## 3.2.0-dev.8 (2025-07-01)
+
+### 🚀 Features
+
+- **academy:** add Drive Analytics documentation and examples ([daedc28a3](https://github.com/powerhouse-inc/powerhouse/commit/daedc28a3))
+
+### ❤️ Thank You
+
+- Guillermo Puente @gpuente
+
 ## 3.2.0-dev.7 (2025-06-28)
 
 ### 🚀 Features
package/docs/academy/01-GetStarted/00-ExploreDemoPackage.mdx
CHANGED

@@ -1,18 +1,5 @@
 # Explore the demo package
 
-<details>
-<summary>How long will this tutorial take?</summary>
-
-We've designed this "Get Started" track to be as smooth as possible. The time it takes will vary based on your familiarity with modern web development tools.
-
-- **For Experienced Developers** (familiar with TypeScript, React, and CLIs), you can expect to complete the entire four-part tutorial in approximately **1 to 1.5 hours**.
-- **For Developers New to This Stack**, we recommend setting aside **3.5 to 4.5 hours**. This allows for time to understand not just the steps, but the core concepts behind them.
-
-This is just a guideline. The goal is to learn comfortably and build a solid foundation with Powerhouse!
-
-A more theoretical and advanced version of this tutorial can also be found in [Mastery Track - Document Model Creation](../MasteryTrack/DocumentModelCreation/WhatIsADocumentModel).
-</details>
-
 ## Let's get started
 
 To give you a quick idea of how the Powerhouse ecosystem operates on document models and packages, why don't you try installing a package?
@@ -109,7 +96,55 @@ Here, you'll see that you've installed the `@powerhousedao/todo-demo-package`, w
 <figcaption>The Package Manager showing the installed todo-demo-package.</figcaption>
 </figure>
 
-## Step 4:
+## Step 4: Create a todo list document
+
+:::tip What is a 'drive' at Powerhouse?
+A **drive** is a folder to store and organize your documents in. Powerhouse offers the ability to build customized 'Drive Apps' for your documents. Think of a Drive App as a specialized lens—it offers **different ways to visualize, organize, and interact with** the data stored within a drive, making it more intuitive and efficient for specific use cases. To learn more, visit [Building A Drive App](/academy/MasteryTrack/BuildingUserExperiences/BuildingADriveExplorer)
+:::
+
+### 4.1 Create a local todolist app drive
+
+First, let's create a dedicated drive for your to-do lists:
+
+- Click the new drive icon in the interface
+- In the **Drive App** field, select 'To-do Drive App'
+- This creates a specialized drive that's optimized for to-do list documents
+
+### 4.2 Create a todolist document
+
+Now move into the drive you've just created:
+
+- Click the button at the bottom of the page to create a new to-do list document
+- This opens the to-do list editor where you can start managing your tasks
+
+### 4.3 Add a few todos and inspect the document history
+
+- Add a few to-dos that are on your mind
+- You'll see a statistics widget that counts the open to-dos
+- After closing the document, look at the To-do Drive App interface—you'll see that it tracks your tasks and displays a progress bar
+
+<figure className="image-container">
+<img src={require("./images/TodoDriveApp.png").default} alt="Todo Drive App" />
+<figcaption>A list of todos in the custom todo drive app.</figcaption>
+</figure>
+
+A key feature you get with Connect is the **Operations History**. Every change to a document is stored as an individual operation, creating an immutable and replayable history. This provides complete auditability and transparency, as you can inspect each revision, its details, and any associated signatures. For example, you can see a chronological list of all modifications, along with who made them and when.
+
+<figure className="image-container">
+<img src={require("./images/OperationsHistoryButton.png").default} alt="Operations History Button" />
+<figcaption>You can find the button to visit the operations history in the document model toolbar.</figcaption>
+</figure>
+
+<figure className="image-container">
+<img src={require("./images/OperationsHistory.png").default} alt="Operations History" />
+<figcaption>Example of the operations history for a document, showing all modifications made to it in a list.</figcaption>
+</figure>
+
+Learn more about the [Operations History](../MasteryTrack/BuildingUserExperiences/DocumentTools/OperationHistory) and other document tools you get for free.
+
+This is the power of Drive Apps. They offer a customized interface that works well with the different documents inside your drive. Read more about drive apps in the Mastery Track: [Drive Apps and Drive Explorers](/academy/MasteryTrack/BuildingUserExperiences/BuildingADriveExplorer).
+
 ## Step 5: Enable operation signing and verification through Renown
 
 Renown is Powerhouse's **decentralized identity and reputation system** designed to address the challenge of trust within open organizations, where contributors often operate under pseudonyms. In traditional organizations, personal identity and reputation are key to establishing trust and accountability. Renown replicates this dynamic in the digital space, allowing contributors to earn experience and build reputation without revealing their real-world identities.
 
@@ -117,16 +152,18 @@ Renown is Powerhouse's **decentralized identity and reputation system** designed
 When signing in with Renown, use an Ethereum or blockchain address that can function as your 'identity', as this address will accrue more experience and history over time.
 :::
 
-###
-
+### 5.1 Click the renown icon and connect your eth identity
+
+"**Log in with Renown**" is a decentralized authentication flow that enables you to log into applications by signing a credential with your Ethereum wallet. Upon signing in, a Decentralized Identifier (DID) is created based on your Ethereum key.
 
 <figure className="image-container">
 <img src={require("./images/RenownLogin.png").default} alt="Renown Login" />
 <figcaption>The Renown login screen, prompting for a signature from a wallet.</figcaption>
 </figure>
 
-###
-
+### 5.2 Authorize Connect to sign document edits on your behalf
+
+This DID is then associated with a credential that authorizes a specific Connect instance to act on your behalf. That credential is stored securely on Ceramic, a decentralized data network. When you perform actions through the Powerhouse Connect interface, those operations are signed with the DID and transmitted to Switchboard, which serves as the verifier.
 
 <figure className="image-container">
 <img src={require("./images/ConnectAddress.png").default} alt="Connect Address for DID" />
@@ -138,45 +175,17 @@ This DID is then associated with a credential that authorizes a specific Connect
 <figcaption>Confirmation of a successful login with Renown.</figcaption>
 </figure>
 
-###
-**Switchboard**, our (remote & local) data processing engine, acts as a verifier in the Renown authentication flow. Switchboard checks the validity of the DID and credential, ensuring the operation request is legitimate. This flow is designed to offer a verifiable, cryptographically secure login system that replaces traditional password-based authentication with decentralized identity and signature-based trust.
+### 5.3 Verify the signatures of new operations in the todo list
 
-
-
-<figure className="image-container">
-<img src={require("./images/OperationsHistory.png").default} alt="Operations History" />
-<figcaption>Example of the operations history for a document, showing all modifications made to it in a list. </figcaption>
-</figure>
+By leveraging this system, every operation or modification made to a document is cryptographically signed by the contributor's Renown identity. This ensures that each change is verifiable, traceable, and attributable to a specific pseudonymous user, providing a robust audit trail for all document activity.
 
-
-
-A key feature you get with Connect is the **Operations History**. Every change to a document is stored as an individual operation, creating an immutable and replayable history. This provides complete auditability and transparency, as you can inspect each revision, its details, and any associated signatures. For example, you can see a chronological list of all modifications, along with who made them and when. This ensures every action is traceable and verifiable.
-Learn more about the [Operations History](../MasteryTrack/BuildingUserExperiences/DocumentTools/OperationHistory) and other document tools you get for free.
-
-
-### 5.1 Create a to-do list
-
-Now, move back to your local drive and create a to-do list. Add a few to-dos that are on your mind. You'll see a statistics widget that counts the open to-dos.
-
-### 5.2 Create a specific to-do drive app
-
-:::tip What is a 'drive' at Powerhouse?
-A **drive** is a folder to store and organize your documents in. Powerhouse offers the ability to build customized 'Drive Apps' for your documents. Think of a Drive App as a specialized lens—it offers **different ways to visualize, organize, and interact with** the data stored within a drive, making it more intuitive and efficient for specific use cases. To learn more, visit [Building A Drive App](/academy/MasteryTrack/BuildingUserExperiences/BuildingADriveExplorer)
-:::
-
-Since your previous to-do list was created inside a local drive with a general-purpose drive explorer, it didn't look particularly special.
-
-- Now let's create a new drive by clicking the new drive icon. In the **Drive App** field, select 'To-do Drive App'.
-- Now move into the drive you've just created and create a new to-do list by clicking the button at the bottom of the page.
-- Add a new set of random to-dos
-- After closing the document, look at the To-do Drive App interface. You'll see that it tracks your tasks and displays a progress bar.
+Now, return to your to-do list and make some additional changes. You'll notice that these operations are now signed with your Renown identity, making every action traceable and verifiable in the operations history.
 
 <figure className="image-container">
-<img src={require("./images/
-<figcaption>
+<img src={require("./images/OperationsHistorySignature.png").default} alt="Operation History Signature" />
+<figcaption>Your DID is now signing the operations that are being added to the history.</figcaption>
 </figure>
 
-This is the power of Drive Apps. They offer a customized interface that works well with the different documents inside your drive. Read more about drive apps in the Mastery Track: [Drive Apps and Drive Explorers](/academy/MasteryTrack/BuildingUserExperiences/BuildingADriveExplorer).
 ## Step 6: Export a document
 
 Export the document as a `.phd` (Powerhouse Document) file using the export button in the document toolbar at the top. In this toolbar, you will find all available functionality for your documents. The `.phd` file can be sent through any of your preferred channels to other users on your network.
package/docs/academy/01-GetStarted/01-CreateNewPowerhouseProject.md
CHANGED

@@ -1,4 +1,4 @@
-# Create a to-do list document
+# Create a new to-do list document
 
 ## Overview
 This tutorial guides you through creating a simplified version of a 'Powerhouse project' for a **To-do List**.
@@ -18,52 +18,32 @@ Create a new Powerhouse project with a single command:
 ```bash
 ph init
 ```
-<details>
-<summary>How to use different branches</summary>
-
-When installing or using the Powerhouse CLI commands you are able to make use of the dev & staging branches.
-These branches contain more experimental features then the latest stable release the PH CLI uses by default.
-They can be used to get access to a bugfix or features under development.
-
-| Command | Description |
-|---------|-------------|
-| **pnpm install -g ph-cmd** | Install latest stable version |
-| **pnpm install -g ph-cmd@dev** | Install development version |
-| **pnpm install -g ph-cmd@staging** | Install staging version |
-| **ph init** | Use latest stable version of the boilerplate |
-| **ph init --dev** | Use development version of the boilerplate |
-| **ph init --staging** | Use staging version of the boilerplate |
-| **ph use** | Switch all dependencies to latest production versions |
-| **ph use dev** | Switch all dependencies to development versions |
-| **ph use prod** | Switch all dependencies to production versions |
-
-Please be aware that these versions can contain bugs and experimental features that aren't fully tested.
-</details>
 
 ## Before you begin
 1. Open your terminal (either your system terminal or IDE's integrated terminal)
-2.
+2. Optionally, create a folder first to keep your Powerhouse projects:
 
 ```bash
-
+mkdir ph-projects
+cd ph-projects
 ```
 3. Ensure you're in the correct directory before running the `ph init` command.
 In the terminal, you will be asked to enter the project name. Fill in the project name and press Enter.
 ```bash
-you@yourmachine:~/
+you@yourmachine:~/ph-projects % ph init
 
-? What is the project name? ‣
+? What is the project name? ‣ getting-started
 ```
 
 Once the project is created, you will see the following output:
 ```bash
-Initialized empty Git repository in /Users/
+Initialized empty Git repository in /Users/you/ph-projects/getting-started/.git/
 The installation is done!
 ```
 
 Navigate to the newly created project directory:
 ```bash
-cd
+cd getting-started
 ```
 Once in the project directory, run the `ph connect` command to start a local instance of the Connect application. This allows you to start your document model specification document.
 Run the following command to start the Connect application:
package/docs/academy/01-GetStarted/02-DefineToDoListDocumentModel.md
CHANGED

@@ -1,4 +1,4 @@
-#
+# Write the document specification
 
 In this tutorial, you will learn how to define the specifications for a **To-do List** document model within the Connect application using its GraphQL schema, and then export the resulting document model specification document for your Powerhouse project.
 If you don't have a document specification file created yet, have a look at the previous step of this tutorial to create a new document specification.
package/docs/academy/01-GetStarted/images/OperationsHistory.png
Binary file

package/docs/academy/01-GetStarted/images/OperationsHistoryButton.png
Binary file

package/docs/academy/01-GetStarted/images/OperationsHistorySignature.png
Binary file
package/docs/academy/02-MasteryTrack/04-WorkWithData/04-analytics-processor.md
CHANGED

@@ -401,7 +401,7 @@ export const processorFactory =
 };
 ```
 
-This is described in more detail in the [ProcessorFactory](#processorfactory) section, but for our purposes, we only want our processor to run
+This is described in more detail in the [ProcessorFactory](#processorfactory) section, but for our purposes, we only want our processor to run on our document type, so we should update the filter accordingly.
 
 ```ts
 filter: {
@@ -412,15 +412,207 @@ filter: {
 },
 ```
 
-###
+### Dimension Design
 
-Before we get into the meat of the processor, we
+Before we get into the meat of the processor, we should be sure to spend some upfront time designing the data we want to query. One way to do this is to start at the end: what do we want to see? In the case of billing statements, we will want to be able to generate reports that show:
 
+- Total spent on headcount vs non-headcount
+- Total spent across all budgets
+- Stacked bar chart of total spent each month, grouped by budget
+- Stacked bar chart of total spent each month, grouped by expense category
+- Total spent each year across all budgets
+- Total spent each year, grouped by budget
+- Total spent per month, grouped by budget
+- Total spent last 30 days, grouped by budget
+
+From here, we can deconstruct the different criteria we would like to group data across:
+
+- time period
+- budget
+- category
+- contributor
+
+The analytics engine gives us a good way to bucket based on time period, so we can focus on the other criteria, which we will specify as _dimensions_. Let's use these dimensions to stub out some of the queries we would want to run, using the `useAnalyticsQuery` hook.
+
+### Query Design
+
+Let's start with "Total spent on headcount vs non-headcount". First, we need to define the time-based criteria.
+
+> Time and Dates can be very confusing. This is why we use the `DateTime` class from `luxon` -- see the [luxon docs](https://moment.github.io/luxon/#/math) for a quickstart.
+
+```ts
+// easy way to get the start and end of the current year
+const start = DateTime.now().startOf("year");
+const end = DateTime.now().endOf("year");
+
+// this means we'll aggregate results across the entire time period
+const granularity = "total";
+```
+
+Next, we want to define the metrics we want to analyze. These are the numerical values we will be aggregating over.
+
+```ts
+// the two numerical values we want to analyze are cash and powt, which are declared separately in the document model
+const metrics = ["Cash", "Powt"];
+```
+
+Now, we can define the dimensions we want to group by. We can imagine that we will have a `contributor` dimension, which will tell us whether or not the contributor is headcount: `/billing-statement/contributor/headcount` or `/billing-statement/contributor/non-headcount`.
+
+> It's best practice to namespace dimensions so that we are sure our data is not colliding with other processors. In this case, we will prepend the `billing-statement` namespace, which is simply a prefix we made up.
+
+```ts
+const totalSpendOnHeadcount = useAnalyticsQuery({
+  start, end, granularity, metrics,
+  select: {
+    contributor: "/billing-statement/contributor"
+  },
+  lod: {
+    contributor: 3,
+  },
+});
+```
+
+It is very important to note that the `lod` parameter is used to specify the level of detail we want to see. In this case, we want to see results grouped by contributor, so we set `lod` to `3`. This means we will get separate metric results for `/billing-statement/contributor/headcount` and `/billing-statement/contributor/non-headcount`.
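To make the `lod` behavior concrete, here is a small self-contained sketch (not the analytics engine's actual implementation) of how a level-of-detail setting truncates dimension paths into grouping keys:

```typescript
// Illustration only: group dimension paths by keeping the first `lod`
// path segments, the way the lod parameter collapses or separates results.
function groupByLod(paths: string[], lod: number): Map<string, number> {
  const counts = new Map<string, number>();
  for (const p of paths) {
    // keep the first `lod` segments of the dimension path
    const key = p.split("/").filter(Boolean).slice(0, lod).join("/");
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return counts;
}

const series = [
  "/billing-statement/contributor/headcount",
  "/billing-statement/contributor/headcount",
  "/billing-statement/contributor/non-headcount",
];

// lod 3 keeps headcount and non-headcount separate;
// lod 2 lumps everything under "billing-statement/contributor"
const byLod3 = groupByLod(series, 3);
const byLod2 = groupByLod(series, 2);
```

In this sketch, `lod: 3` yields separate buckets for headcount and non-headcount, while a lower `lod` lumps results together into a single bucket.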
+
+We can use these same strategies to create queries for the other criteria we want to group by.
+
+```ts
+const totalSpend = useAnalyticsQuery({
+  start,
+  end,
+  granularity: "total", // <--- this means we'll get results for the entire time period
+  metrics: ["Cash", "Powt"],
+  select: {
+    budget: "/billing-statement"
+  },
+  lod: {
+    budget: 0, // <--- this means we'll get all results lumped together
+  },
+});
+
+const monthlySpendByBudget = useAnalyticsQuery({
+  start,
+  end,
+  granularity: "monthly", // <--- this means we'll get results grouped by month
+  metrics: ["Cash", "Powt"],
+  select: {
+    budget: "/billing-statement/budget"
+  },
+  lod: {
+    budget: 3, // <--- this means we'll get results grouped by "/billing-statement/budget/budget1", "/billing-statement/budget/budget2", etc.
+  },
+});
+
+const monthlySpendByCategory = useAnalyticsQuery({
+  start,
+  end,
+  granularity: "monthly", // <--- this means we'll get results grouped by month
+  metrics: ["Cash", "Powt"],
+  select: {
+    category: "/billing-statement/category"
+  },
+  lod: {
+    category: 3, // <--- this means we'll get results grouped by "/billing-statement/category/category1", "/billing-statement/category/category2", etc.
+  },
+});
+
+const yearlySpendByBudget = useAnalyticsQuery({
+  start: DateTime.fromObject({ year: 2022 }),
+  end: DateTime.now().endOf("year"),
+  granularity: "yearly", // <--- this means we'll get results grouped by year
+  metrics: ["Cash", "Powt"],
+  select: {
+    budget: "/billing-statement/budget"
+  },
+  lod: {
+    budget: 3, // <--- this means we'll get results grouped by "/billing-statement/budget/budget1", "/billing-statement/budget/budget2", etc.
+  },
+});
+
+const last30DaysSpendByBudget = useAnalyticsQuery({
+  start: DateTime.now().minus({ days: 30 }),
+  end: DateTime.now(),
+  granularity: "day", // <--- this means we'll get results grouped by day
+  metrics: ["Cash", "Powt"],
+  select: {
+    budget: "/billing-statement/budget"
+  },
+  lod: {
+    budget: 3, // <--- this means we'll get results grouped by "/billing-statement/budget/budget1", "/billing-statement/budget/budget2", etc.
+  },
+});
+```
+
+### Source Design
+
+The final consideration is the source design. While dimensions and sources both use path syntax, _the paths are unrelated_. That is, a path used in an AnalyticsSeries `source` does not affect a path used in a `dimension`, and vice versa. The `source` attribute of an analytics series is a composable mechanism to track down _where the data came from_.
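Since sources compose by path, a brief sketch may help. This prefix-matching function is a simplified stand-in written only to illustrate the idea, not the engine's actual matching logic:

```typescript
// Simplified illustration: a subscription to a broader source path also
// matches narrower sources beneath it, because matching works by path prefix.
function sourceMatches(subscription: string, source: string): boolean {
  const segments = (s: string) => s.split("/").filter(Boolean);
  const sub = segments(subscription);
  const src = segments(source);
  return sub.every((seg, i) => src[i] === seg);
}

// subscribing at the drive level matches updates from documents in that drive
const matchesSameDrive = sourceMatches("billing-statement/drive-1", "billing-statement/drive-1/doc-42");
const matchesOtherDrive = sourceMatches("billing-statement/drive-1", "billing-statement/drive-2/doc-7");
```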
+
+This turns out to be an important consideration, as when we query data, we will likely also want to subscribe to a set of sources to later update the data.
+
+For instance, say we take our monthly spend by category query:
+
+```ts
+const monthlySpendByCategory = useAnalyticsQuery({
+  start,
+  end,
+  granularity: "monthly",
+  metrics: ["Cash", "Powt"],
+  select: {
+    category: "/billing-statement/category"
+  },
+  lod: {
+    category: 3,
+  },
+});
+```
+
+This gives us the results we're looking for but, by design, there may be many different `AnalyticsSeries` objects that affect this query. Thus, the hook does not know what to listen to. This is where our `source` design comes in. Generally, we will want to relate analytics by drive and/or document.
+
+```ts
+// this source will match all analytics updates from any document in the drive
+const driveSource = AnalyticsPath.fromString(`billing-statement/${drive.header.id}`);
+
+// this source will match all analytics updates from a specific document in a drive
+const documentSource = AnalyticsPath.fromString(`billing-statement/${drive.header.id}/${document.header.id}`);
+```
+
+```ts
+const { state, data: drive } = useSelectedDrive();
+
+const results = useAnalyticsQuery({
+  start, end,
+  granularity: "monthly",
+  metrics: ["Cash", "Powt"],
+  select: {
+    category: "/billing-statement/category"
+  },
+  lod: {
+    category: 3,
+  },
+}, {
+  sources: [
+    `/billing-statement/${drive.header.id}/`
+  ],
+});
+```
 
 ### `IProcessor`
 
-Now we can open up `line-item-processor/index.ts` to add the custom logic we're looking for. This will be in the `onStrands` function.
+Now that we have designed out our data, we can open up `line-item-processor/index.ts` to add the custom logic we're looking for. This will be in the `onStrands` function.
 
 ```ts
package/docs/academy/02-MasteryTrack/04-WorkWithData/05-AnalyticsProcessorTutorial/02-CreateNewPowerhouseProject.md
CHANGED

@@ -16,7 +16,7 @@ Let's start with step 1 & 2 in the next section of the tutorial!
 
 To create a new Powerhouse Document Model Library project, you can use the `ph init` command in your terminal. This command will create a new project in the current directory.
 
-## Create new Powerhouse document model
+## Create new Powerhouse document model project
 
 :::info
 This command will create a new project in the current directory.
@@ -24,8 +24,10 @@ You can run the command in the terminal window of your OS or you open the newly
 You will need VSCode later in the tutorial once you have generated the document model.
 Make sure the terminal reflects the directory where you want to create the new project.
 To open a directory in a terminal, you use the cd command to change your current directory. The cd command takes an argument, usually the name of the folder you want to move to, so the full command is
+
 ```bash
-
+mkdir ph-projects
+cd ph-projects
 ```
 This essentially opens that folder and places you in it.
 :::
package/docs/academy/02-MasteryTrack/04-WorkWithData/07-drive-analytics.md
ADDED

@@ -0,0 +1,467 @@
+# Drive Analytics
+
+Drive Analytics provides automated monitoring and insights into document drive operations within Powerhouse applications. This system tracks user interactions, document modifications, and drive activity to help developers understand usage patterns and system performance.
+
+## Overview
+
+The Drive Analytics system consists of two specialized processors that automatically collect metrics from document drives:
+
+1. **Drive Analytics Processor**: Tracks file and folder operations (creation, deletion, moves, etc.)
+2. **Document Analytics Processor**: Tracks document content changes and state modifications
+
+These processors run in the background, converting operations into structured time-series data that can be queried and visualized in real-time.
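As a rough mental model of the time-series data these processors emit, the following sketch shows one plausible record shape; the field names are illustrative and not the actual Powerhouse schema:

```typescript
// Hypothetical record shape for a single analytics data point; the real
// schema may differ -- this only mirrors the metrics described in this page.
interface AnalyticsPoint {
  metric: "DriveOperations" | "DocumentOperations";
  value: number;     // counter metrics always record 1 per operation
  source: string;    // e.g. "ph/drive/abc123/main/global"
  timestamp: string; // ISO-8601 time of the operation
}

const point: AnalyticsPoint = {
  metric: "DriveOperations",
  value: 1,
  source: "ph/drive/abc123/main/global",
  timestamp: new Date().toISOString(),
};
```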
|
|
13
|
+
|
|
14
|
+
## Available Metrics in Connect
|
|
15
|
+
|
|
16
|
+
Connect applications have Drive Analytics enabled by default through the `ReactorAnalyticsProvider`. When enabled, the system automatically tracks:
|
|
17
|
+
|
|
18
|
+
### Drive Operations Metrics
|
|
19
|
+
- **File Creation**: New documents added to drives
|
|
20
|
+
- **Folder Creation**: New directories created
|
|
21
|
+
- **File Updates**: Document content modifications
|
|
22
|
+
- **Node Updates**: Metadata changes
|
|
23
|
+
- **File Moves**: Documents relocated between folders
|
|
24
|
+
- **File Copies**: Document duplication
|
|
25
|
+
- **File Deletions**: Documents removed from drives
|
|
26
|
+
|
|
27
|
+
### Document Operations Metrics
|
|
28
|
+
- **State Changes**: Document model state modifications
|
|
29
|
+
|
|
30
|
+
## Data Sources and Structure
|
|
31
|
+
|
|
32
|
+
Drive Analytics organizes data using hierarchical source paths that allow precise querying of different analytics contexts:
|
|
33
|
+
|
|
34
|
+
### Drive Analytics Sources
|
|
35
|
+
Pattern: `ph/drive/{driveId}/{branch}/{scope}`
|
|
36
|
+
- **driveId**: Unique identifier for the document drive
|
|
37
|
+
- **branch**: Branch name (e.g., "main", "dev")
|
|
38
|
+
- **scope**: Operation scope ("global" for shared operations, "local" for device-specific)
|
|
39
|
+
|
|
40
|
+
Example: `ph/drive/abc123/main/global`
|
|
41
|
+
|
|
42
|
+
### Document Analytics Sources
|
|
43
|
+
Pattern: `ph/doc/{driveId}/{documentId}/{branch}/{scope}`
|
|
44
|
+
- **driveId**: Drive containing the document
|
|
45
|
+
- **documentId**: Specific document identifier
|
|
46
|
+
- **branch**: Branch name
|
|
47
|
+
- **scope**: Operation scope
|
|
48
|
+
|
|
49
|
+
Example: `ph/doc/abc123/doc456/main/global`
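Since source paths are plain strings, building them from identifiers is straightforward. As a sketch (the `driveSource` and `documentSource` helpers below are hypothetical, not part of the Powerhouse API):

```typescript
// Hypothetical helpers that assemble source path strings following the
// patterns above; they are not part of the Powerhouse API.
function driveSource(driveId: string, branch = "main", scope = "global"): string {
  return `ph/drive/${driveId}/${branch}/${scope}`;
}

function documentSource(
  driveId: string,
  documentId: string,
  branch = "main",
  scope = "global"
): string {
  return `ph/doc/${driveId}/${documentId}/${branch}/${scope}`;
}

console.log(driveSource("abc123"));              // ph/drive/abc123/main/global
console.log(documentSource("abc123", "doc456")); // ph/doc/abc123/doc456/main/global
```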
## Available Metrics

### DriveOperations
Tracks file system operations within drives:
- **Value**: Always 1 (counter metric)
- **Purpose**: Count drive-level operations such as file creation, deletion, and moves
- **Source Pattern**: `ph/drive/*`

### DocumentOperations
Tracks document content and state changes:
- **Value**: Always 1 (counter metric)
- **Purpose**: Count document-specific operations such as state changes
- **Source Pattern**: `ph/doc/*`

## Complete Dimensions Reference

### Drive Analytics Dimensions

#### 1. Drive Dimension
**Pattern**: `ph/drive/{driveId}/{branch}/{scope}/{revision}`
**Purpose**: Identifies the drive context with revision information
```tsx
// Examples
"ph/drive/abc123/main/global/42"
"ph/drive/my-drive/feature-branch/local/15"
```

#### 2. Operation Dimension
**Pattern**: `ph/drive/operation/{operationType}/{operationIndex}`
**Purpose**: Identifies specific operation types and their sequence

**Available Operation Types**:
- **ADD_FILE**: Create a new file
- **ADD_FOLDER**: Create a new folder
- **UPDATE_FILE**: Modify file content
- **UPDATE_NODE**: Modify node metadata
- **MOVE_NODE**: Move a file/folder to a different location
- **COPY_NODE**: Duplicate an existing file/folder
- **DELETE_NODE**: Remove a file/folder

```tsx
// Examples
"ph/drive/operation/ADD_FILE/5"
"ph/drive/operation/DELETE_NODE/23"
"ph/drive/operation/MOVE_NODE/12"
```

#### 3. Target Dimension
**Pattern**: `ph/drive/target/{targetType}/{targetId}`
**Purpose**: Identifies what was targeted by the operation

**Target Types**:
- **DRIVE**: Operation affects the drive itself
- **NODE**: Operation affects a specific file/folder

```tsx
// Examples
"ph/drive/target/DRIVE/abc123"
"ph/drive/target/NODE/file456"
"ph/drive/target/NODE/folder789"
```

#### 4. Action Type Dimension
**Pattern**: `ph/drive/actionType/{actionType}/{targetId}`
**Purpose**: Categorizes operations by their effect

**Action Types**:
- **CREATED**: New items added (ADD_FILE, ADD_FOLDER)
- **DUPLICATED**: Items copied (COPY_NODE)
- **UPDATED**: Existing items modified (UPDATE_FILE, UPDATE_NODE)
- **MOVED**: Items relocated (MOVE_NODE)
- **REMOVED**: Items deleted (DELETE_NODE)

```tsx
// Examples
"ph/drive/actionType/CREATED/file123"
"ph/drive/actionType/MOVED/folder456"
"ph/drive/actionType/REMOVED/doc789"
```
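The operation-to-action-type mapping described above can be summarized as a lookup table. This is a sketch of the documented mapping only; the `ACTION_TYPE` table is illustrative, not the actual processor code:

```typescript
// Sketch of the documented operation-to-action-type mapping; this table
// is illustrative, not the actual Drive Analytics processor code.
const ACTION_TYPE: Record<string, "CREATED" | "DUPLICATED" | "UPDATED" | "MOVED" | "REMOVED"> = {
  ADD_FILE: "CREATED",
  ADD_FOLDER: "CREATED",
  COPY_NODE: "DUPLICATED",
  UPDATE_FILE: "UPDATED",
  UPDATE_NODE: "UPDATED",
  MOVE_NODE: "MOVED",
  DELETE_NODE: "REMOVED",
};

console.log(ACTION_TYPE["ADD_FILE"]);    // CREATED
console.log(ACTION_TYPE["DELETE_NODE"]); // REMOVED
```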
### Document Analytics Dimensions

#### 1. Drive Dimension
**Pattern**: `ph/doc/drive/{driveId}/{branch}/{scope}/{revision}`
**Purpose**: Drive context for document operations
```tsx
// Examples
"ph/doc/drive/abc123/main/global/42"
```

#### 2. Operation Dimension
**Pattern**: `ph/doc/operation/{operationType}/{operationIndex}`
**Purpose**: Document-specific operation identification
```tsx
// Examples (document model operations vary by document type)
"ph/doc/operation/SET_STATE/15"
"ph/doc/operation/ADD_ITEM/8"
"ph/doc/operation/UPDATE_PROPERTY/22"
```

#### 3. Target Dimension
**Pattern**: `ph/doc/target/{driveId}/{targetType}/{documentId}`
**Purpose**: Document target identification

**Target Types**:
- **DRIVE**: The document is the drive document itself (driveId === documentId)
- **NODE**: The document is a regular document within the drive

```tsx
// Examples
"ph/doc/target/abc123/DRIVE/abc123" // Drive document
"ph/doc/target/abc123/NODE/doc456"  // Regular document
```
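Since the DRIVE target type applies exactly when `driveId === documentId`, the dimension value can be derived mechanically. A minimal sketch of that rule (hypothetical helper, not the actual library code):

```typescript
// Derive the document target dimension from the rule above:
// DRIVE when the document is the drive document itself, otherwise NODE.
// Hypothetical helper, not part of the Powerhouse API.
function docTargetDimension(driveId: string, documentId: string): string {
  const targetType = driveId === documentId ? "DRIVE" : "NODE";
  return `ph/doc/target/${driveId}/${targetType}/${documentId}`;
}

console.log(docTargetDimension("abc123", "abc123")); // ph/doc/target/abc123/DRIVE/abc123
console.log(docTargetDimension("abc123", "doc456")); // ph/doc/target/abc123/NODE/doc456
```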
## Query Parameters

### Time Range
- **start**: DateTime object for the query start time
- **end**: DateTime object for the query end time
- **granularity**: Time bucketing (Total, Hourly, Daily, Weekly, Monthly)

### Filtering with Select

Use the `select` parameter to filter by specific dimension values:

```tsx
select: {
  // Filter by specific drives
  drive: [
    AnalyticsPath.fromString("ph/drive/abc123"),
    AnalyticsPath.fromString("ph/drive/xyz789")
  ],

  // Filter by operation types
  operation: [
    AnalyticsPath.fromString("ph/drive/operation/ADD_FILE"),
    AnalyticsPath.fromString("ph/drive/operation/UPDATE_FILE")
  ],

  // Filter by action types
  actionType: [
    AnalyticsPath.fromString("ph/drive/actionType/CREATED"),
    AnalyticsPath.fromString("ph/drive/actionType/UPDATED")
  ],

  // Filter by targets
  target: [
    AnalyticsPath.fromString("ph/drive/target/NODE")
  ]
}
```

### Level of Detail (LOD)

Control how deeply dimensions are grouped:

```tsx
lod: {
  drive: 1,      // Group by drive only (ignore branch/scope/revision)
  operation: 1,  // Group by operation type only (ignore index)
  actionType: 1, // Group by action type only (ignore target ID)
  target: 1      // Group by target type only (ignore target ID)
}
```
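One way to picture LOD: a level of `N` keeps only the first `N` segments after a dimension's fixed prefix, so rows that differ only in deeper segments are grouped together. The sketch below illustrates that grouping rule under this assumption; it is not the analytics engine's actual implementation:

```typescript
// Illustrative grouping rule (an assumption, not the engine's code):
// keep `prefixSegments + lod` path segments and group rows by the result.
function lodKey(path: string, prefixSegments: number, lod: number): string {
  return path.split("/").slice(0, prefixSegments + lod).join("/");
}

// Drive dimension "ph/drive/{driveId}/{branch}/{scope}/{revision}" with
// prefix "ph/drive" (2 segments) and lod: 1 groups by drive only:
console.log(lodKey("ph/drive/abc123/main/global/42", 2, 1)); // ph/drive/abc123
```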
## Querying Analytics Data

### Using the useAnalyticsQuery Hook

The primary way to access drive analytics is through the `useAnalyticsQuery` hook:

```tsx
import { useAnalyticsQuery, AnalyticsGranularity, AnalyticsPath, DateTime } from '@powerhousedao/reactor-browser/analytics';

function DriveUsageChart({ driveId }: { driveId: string }) {
  const { data, isLoading } = useAnalyticsQuery({
    start: DateTime.now().minus({ days: 7 }),
    end: DateTime.now(),
    granularity: AnalyticsGranularity.Daily,
    metrics: ["DriveOperations"],
    select: {
      drive: [AnalyticsPath.fromString(`ph/drive/${driveId}`)],
      actionType: [
        AnalyticsPath.fromString("ph/drive/actionType/CREATED"),
        AnalyticsPath.fromString("ph/drive/actionType/UPDATED"),
        AnalyticsPath.fromString("ph/drive/actionType/REMOVED")
      ]
    },
    lod: {
      drive: 1,
      actionType: 1
    }
  });

  if (isLoading) return <div>Loading analytics...</div>;

  return (
    <div>
      {/* Render your chart using the analytics data */}
      {data?.rows.map(row => (
        <div key={row.metric}>
          {row.metric}: {row.value}
        </div>
      ))}
    </div>
  );
}
```

### Using the useDriveAnalytics Hook

For common drive analytics queries, use the specialized `useDriveAnalytics` hook:

```tsx
import { useDriveAnalytics } from '@powerhousedao/common/drive-analytics';
import { AnalyticsGranularity } from '@powerhousedao/reactor-browser/analytics';

function DriveInsights({ driveIds }: { driveIds: string[] }) {
  const analytics = useDriveAnalytics({
    filters: {
      driveId: driveIds,
      operation: ["ADD_FILE", "UPDATE_FILE", "DELETE_NODE"],
      actionType: ["CREATED", "UPDATED", "REMOVED"]
    },
    from: new Date(Date.now() - 7 * 24 * 60 * 60 * 1000).toISOString(), // 7 days ago
    to: new Date().toISOString(),
    granularity: AnalyticsGranularity.Daily,
    levelOfDetail: { drive: 1, operation: 1 }
  });

  if (analytics.isLoading) return <div>Loading...</div>;

  return (
    <div>
      <h3>Drive Activity Summary</h3>
      {analytics.data?.rows.map((row, index) => (
        <div key={index}>
          <strong>{row.dimensions.find(d => d.name === 'actionType')?.path}</strong>: {row.value}
        </div>
      ))}
    </div>
  );
}
```

### Using the useDocumentAnalytics Hook

For document-specific analytics queries, use the `useDocumentAnalytics` hook:

```tsx
import { useDocumentAnalytics } from '@powerhousedao/common/drive-analytics';
import { AnalyticsGranularity } from '@powerhousedao/reactor-browser/analytics';

function DocumentInsights({ driveId, documentIds }: { driveId: string, documentIds: string[] }) {
  const analytics = useDocumentAnalytics({
    filters: {
      driveId: [driveId],
      documentId: documentIds,
      target: ["NODE"], // Focus on document nodes vs drive documents
      branch: ["main"],
      scope: ["global"]
    },
    from: new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString(), // 24 hours ago
    to: new Date().toISOString(),
    granularity: AnalyticsGranularity.Hourly,
    levelOfDetail: {
      drive: 1,
      operation: 1,
      target: 1
    }
  });

  if (analytics.isLoading) return <div>Loading...</div>;

  return (
    <div>
      <h3>Document Activity Summary</h3>
      {analytics.data?.rows.map((row, index) => (
        <div key={index}>
          Document Operations: {row.value}
        </div>
      ))}
    </div>
  );
}
```
## Advanced Query Examples

### Filter by Multiple Criteria

```tsx
// Get file creations and updates for specific drives in the last 24 hours
const { data } = useAnalyticsQuery({
  start: DateTime.now().minus({ hours: 24 }),
  end: DateTime.now(),
  granularity: AnalyticsGranularity.Hourly,
  metrics: ["DriveOperations"],
  select: {
    drive: [
      AnalyticsPath.fromString("ph/drive/project-a"),
      AnalyticsPath.fromString("ph/drive/project-b")
    ],
    operation: [
      AnalyticsPath.fromString("ph/drive/operation/ADD_FILE"),
      AnalyticsPath.fromString("ph/drive/operation/UPDATE_FILE")
    ],
    target: [
      AnalyticsPath.fromString("ph/drive/target/NODE")
    ]
  },
  lod: {
    drive: 1,
    operation: 1
  }
});
```

### Compare Document vs Drive Operations

```tsx
// Using the specialized hooks for easier comparison
const driveOps = useDriveAnalytics({
  filters: { driveId: [driveId] },
  from: DateTime.now().minus({ days: 1 }).toISO(),
  to: DateTime.now().toISO(),
  granularity: AnalyticsGranularity.Total
});

const docOps = useDocumentAnalytics({
  filters: { driveId: [driveId] },
  from: DateTime.now().minus({ days: 1 }).toISO(),
  to: DateTime.now().toISO(),
  granularity: AnalyticsGranularity.Total
});

// Or using useAnalyticsQuery directly
const driveOpsQuery = useAnalyticsQuery({
  start: DateTime.now().minus({ days: 1 }),
  end: DateTime.now(),
  granularity: AnalyticsGranularity.Total,
  metrics: ["DriveOperations"],
  select: {
    drive: [AnalyticsPath.fromString(`ph/drive/${driveId}`)]
  }
});

const docOpsQuery = useAnalyticsQuery({
  start: DateTime.now().minus({ days: 1 }),
  end: DateTime.now(),
  granularity: AnalyticsGranularity.Total,
  metrics: ["DocumentOperations"],
  select: {
    drive: [AnalyticsPath.fromString(`ph/doc/drive/${driveId}`)]
  }
});
```

### Real-time Activity Monitoring

```tsx
// Monitor a specific drive for real-time updates
const { data } = useAnalyticsQuery(
  {
    start: DateTime.now().minus({ minutes: 10 }),
    end: DateTime.now(),
    granularity: AnalyticsGranularity.Total,
    metrics: ["DriveOperations"],
    select: {
      drive: [AnalyticsPath.fromString(`ph/drive/${driveId}`)]
    }
  },
  {
    sources: [AnalyticsPath.fromString(`ph/drive/${driveId}`)],
    refetchInterval: 5000 // Poll every 5 seconds
  }
);
```
## Real-time Updates

Analytics queries can automatically update when new data is available by specifying sources:

```tsx
const { data } = useAnalyticsQuery(
  {
    start: DateTime.now().minus({ hours: 1 }),
    end: DateTime.now(),
    granularity: AnalyticsGranularity.Total,
    metrics: ["DriveOperations"]
  },
  {
    sources: [AnalyticsPath.fromString(`ph/drive/${driveId}`)]
  }
);

// This query will automatically refetch when new operations occur in the specified drive
```

## Configuration in Connect

Drive Analytics is automatically enabled in Connect applications through feature flags:

```tsx
// In apps/connect/src/context/reactor-analytics.tsx
export function ReactorAnalyticsProvider({ children }: PropsWithChildren) {
  return (
    <AnalyticsProvider options={{ databaseName: "connect:analytics" }}>
      {connectConfig.analytics.driveAnalyticsEnabled && (
        <DriveAnalyticsProcessor />
      )}
      {children}
    </AnalyticsProvider>
  );
}
```
@@ -13,7 +13,8 @@ To create a new Powerhouse Document Model Library project, you can use the `ph i
This command will create a new project in the current directory. You can run the command in the terminal window of your OS, or open the newly installed VS Code and run it in the VS Code terminal. Make sure the terminal reflects the directory where you want to create the new project.

```bash
mkdir ph-projects
cd ph-projects
```
This essentially opens that folder and places you in it.
@@ -15,6 +15,22 @@ Here are the key points to understand:
- **Custom Scalars:** Besides the built-in scalars, you can define custom scalars (e.g., a Date type) if you need to handle more specific formats or validations. Powerhouse does this specifically for the web3 ecosystem.
:::

## What are Components?

In the context of the Powerhouse Builder platform, components can be thought of as reusable elements, or ready-to-use building blocks, that help builders implement **document editors & viewers** with little to no effort. An important utility aspect of a component is that it serves its users as a **data input field**, providing structured ways to enter and manipulate information within your document models.

## Document Editors vs Document Viewers

Understanding the relationship between document editors and viewers is crucial for component usage:

**Document Editor**: A specific document type that is used by one or more users to make data entries and update its state. Its key utility is the ability to enter data in a structured format, making it a great tool for collaboration within a group of authorized users.

**Document Viewer**: Does not allow modifications. It's a great way to inform about the state of a document type, making it a great tool for providing a broader group or the public with transparent insights. Document viewers do not have to match the view of the editor one-to-one: the data presented could be framed as a specific selection, or filtered to provide the desired insights.

:::tip Component Behavior in Different Contexts
When the same component is used in a document viewer, it is rendered in a **disabled state** (editing the document is not allowed). Document editors precede document viewers: you would start by creating a document editor and then, if needed, decide which viewer format is useful.
:::
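A minimal sketch of this idea (the shapes below are hypothetical, not the actual component library API): a single field definition can serve both contexts by deriving a `readOnly` flag from where it is rendered:

```typescript
// Hypothetical sketch: one field definition serving both an editor
// (editable) and a viewer (disabled) context. Not the library's API.
type RenderContext = "editor" | "viewer";

interface FieldProps {
  value: string;
  readOnly: boolean;                 // true when rendered in a viewer
  onChange?: (next: string) => void; // only wired up in an editor
}

function resolveFieldProps(
  context: RenderContext,
  value: string,
  onChange: (next: string) => void
): FieldProps {
  const readOnly = context === "viewer";
  return { value, readOnly, onChange: readOnly ? undefined : onChange };
}

const inEditor = resolveFieldProps("editor", "hello", () => {});
const inViewer = resolveFieldProps("viewer", "hello", () => {});
console.log(inEditor.readOnly, inViewer.readOnly); // false true
```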

## Scalars vs. General UI Components

### Scalar Components
@@ -39,6 +55,42 @@ This category includes a broader range of UI elements such as simplified version
**Location:** @powerhousedao/document-engineering/ui
https://github.com/powerhouse-inc/document-engineering

## Component Types Classification

Inspired by the atomic design methodology, Powerhouse classifies components into the following categories:

### Fragment
The smallest element; combined together, fragments make up a scalar or other simple component.
**Examples:** Character counter, Checkbox field, Label

### Scalar (Simple Component)
The simplest component, containing the basic input field for a one-dimensional data type (a single value).
**Examples:** Integer, Boolean, String, Powerhouse ID (PHID)

### Complex Component
A compound component that has an object/array value. It's made up of multiple scalars combined to serve a specific function.
**Examples:** Sidebar (a tree-structure navigation component with content-style navigation for hierarchical data)

### Layout Component
A purpose-specific container for other components, such as lists of other components, color layouts, sections, etc.
**Examples:** Homepage section layout

:::info Component Library Philosophy
The Powerhouse team is building a component library with a wide range of components embedding best UX practices & key functionality. This library establishes standards and best practices for building documents while fast-tracking the building process by providing the most basic & useful component types.
:::

## Component Behavior & UX Principles

Besides the ability to input data, components have another crucial utility: they describe the mechanism of user interaction by implementing a defined set of behavior rules.

**Best Practices for Component Behavior:**
- Implementing behaviors at the component level is much more efficient than at the document level
- Good component behavior feels natural to the user and is easily understood
- Components should be intuitive and not require additional tutorials or explanations
- Start with the simplest, most basic behaviors first, then layer additional behaviors on top
- Keep behaviors as simple as needed - less is more

## Exploring Components with Storybook

We use Storybook as an interactive catalog for our design system components. It allows you to visually explore each component, interact with different states, and understand how to integrate them into your projects. [https://storybook.powerhouse.academy](https://storybook.powerhouse.academy)
@@ -14,7 +14,7 @@ You need to install the Powerhouse CLI (`ph-cmd`) to create and manage Powerhous
  ## Prerequisites
  - node.js 22 installed
- - pnpm package manager installed
+ - pnpm package manager 10 installed
  - Terminal or command prompt access

  ## Solution
@@ -169,7 +169,7 @@ You need to access experimental features, bugfixes, or development versions of P
  ## Prerequisites
  - Terminal or command prompt access
- - pnpm package manager installed
+ - pnpm package manager 10 installed
  - Node.js 22 installed

  ## Solution
package/package.json
CHANGED
package/src/css/custom.css
CHANGED
@@ -157,6 +157,21 @@ a {
  color: #003d7a; /* darker blue on hover */
}

/* Dark mode specific styles for better readability */
[data-theme='dark'] .docs-doc-page .theme-doc-markdown a,
[data-theme='dark'] .markdown a,
[data-theme='dark'] details a,
[data-theme='dark'] .theme-doc-markdown details a {
  color: #66b3ff !important; /* lighter blue for dark mode */
}

[data-theme='dark'] .docs-doc-page .theme-doc-markdown a:hover,
[data-theme='dark'] .markdown a:hover,
[data-theme='dark'] details a:hover,
[data-theme='dark'] .theme-doc-markdown details a:hover {
  color: #99ccff !important; /* even lighter blue on hover for dark mode */
}

/* Reset styling for navigation elements, category pages, and utility links */
.breadcrumbs a,
.pagination-nav a,