@powerhousedao/academy 3.3.0-dev.14 → 3.3.0-dev.16

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (20)
  1. package/CHANGELOG.md +16 -0
  2. package/docs/academy/02-MasteryTrack/04-WorkWithData/03-UsingSubgraphs.md +145 -86
  3. package/docs/academy/02-MasteryTrack/04-WorkWithData/07-OperationalDbProcessorTutorial/01-TodoList-example.md +242 -58
  4. package/docs/academy/02-MasteryTrack/04-WorkWithData/07-OperationalDbProcessorTutorial/_category_.json +8 -0
  5. package/docs/academy/04-APIReferences/01-ReactHooks.md +36 -0
  6. package/docs/academy/04-APIReferences/04-RelationalDatabase.md +46 -52
  7. package/docs/academy/04-APIReferences/05-PHDocumentMigrationGuide.md +8 -9
  8. package/package.json +1 -1
  9. package/docs/academy/02-MasteryTrack/04-WorkWithData/05-AnalyticsProcessorTutorial/05-AnalyticsTutorial +0 -4
  10. /package/docs/academy/02-MasteryTrack/04-WorkWithData/{05-AnalyticsProcessorTutorial/01-SetupBuilderEnvironment.md → _05-AnalyticsProcessorTutorial/_01-SetupBuilderEnvironment.md} +0 -0
  11. /package/docs/academy/02-MasteryTrack/04-WorkWithData/{05-AnalyticsProcessorTutorial/02-CreateNewPowerhouseProject.md → _05-AnalyticsProcessorTutorial/_02-CreateNewPowerhouseProject.md} +0 -0
  12. /package/docs/academy/02-MasteryTrack/04-WorkWithData/{05-AnalyticsProcessorTutorial/03-GenerateAnAnalyticsProcessor.md → _05-AnalyticsProcessorTutorial/_03-GenerateAnAnalyticsProcessor.md} +0 -0
  13. /package/docs/academy/02-MasteryTrack/04-WorkWithData/{05-AnalyticsProcessorTutorial/04-UpdateAnalyticsProcessor.md → _05-AnalyticsProcessorTutorial/_04-UpdateAnalyticsProcessor.md} +0 -0
  14. /package/docs/academy/02-MasteryTrack/04-WorkWithData/{05-AnalyticsProcessorTutorial → _05-AnalyticsProcessorTutorial}/_category_.json +0 -0
  15. /package/docs/academy/02-MasteryTrack/04-WorkWithData/{05-AnalyticsProcessorTutorial → _05-AnalyticsProcessorTutorial}/images/Create-SPV.gif +0 -0
  16. /package/docs/academy/02-MasteryTrack/04-WorkWithData/{05-AnalyticsProcessorTutorial → _05-AnalyticsProcessorTutorial}/images/Create-a-new-asset.png +0 -0
  17. /package/docs/academy/02-MasteryTrack/04-WorkWithData/{05-AnalyticsProcessorTutorial → _05-AnalyticsProcessorTutorial}/images/Create-a-transaction.gif +0 -0
  18. /package/docs/academy/02-MasteryTrack/04-WorkWithData/{05-AnalyticsProcessorTutorial → _05-AnalyticsProcessorTutorial}/images/Transaction-table.png +0 -0
  19. /package/docs/academy/02-MasteryTrack/04-WorkWithData/{05-AnalyticsProcessorTutorial → _05-AnalyticsProcessorTutorial}/images/create-a-new-RWA-document.gif +0 -0
  20. /package/docs/academy/02-MasteryTrack/04-WorkWithData/{05-AnalyticsProcessorTutorial → _05-AnalyticsProcessorTutorial}/images/granularity.png +0 -0
package/CHANGELOG.md CHANGED
@@ -1,3 +1,19 @@
1
+ ## 3.3.0-dev.16 (2025-07-22)
2
+
3
+ This was a version bump only for @powerhousedao/academy to align it with other projects, there were no code changes.
4
+
5
+ ## 3.3.0-dev.15 (2025-07-17)
6
+
7
+ ### 🩹 Fixes
8
+
9
+ - **codegen:** updated subgraph template to deal with undefined return on getDocument ([7b2862a91](https://github.com/powerhouse-inc/powerhouse/commit/7b2862a91))
10
+ - **academy:** update broken links ([cbbfe9b30](https://github.com/powerhouse-inc/powerhouse/commit/cbbfe9b30))
11
+
12
+ ### ❤️ Thank You
13
+
14
+ - acaldas
15
+ - Callme-T
16
+
1
17
  ## 3.3.0-dev.14 (2025-07-17)
2
18
 
3
19
  This was a version bump only for @powerhousedao/academy to align it with other projects, there were no code changes.
@@ -19,9 +19,9 @@ A subgraph in Powerhouse is a **GraphQL-based modular data component** that exte
19
19
 
20
20
  ### Subgraphs consist of
21
21
 
22
- - **A schema** Which defines the GraphQL Queries and Mutations.
23
- - **Resolvers** Which handle data fetching and logic.
24
- - **Context Fields** Additional metadata that helps in resolving data efficiently.
22
+ - **A schema**, which defines the GraphQL Queries and Mutations.
23
+ - **Resolvers**, which handle data fetching and logic.
24
+ - **Context Fields**: additional metadata that helps in resolving data efficiently.
25
25
 
26
26
  #### Additionally, context fields allow resolvers to access extra information, such as:
27
27
  - **User authentication** (e.g., checking if a user is an admin).
@@ -40,30 +40,30 @@ context: {
40
40
 
41
41
  ## 1. How to generate a subgraph
42
42
 
43
- Lets start by generating a new subgraph. For our tutorial we will create a new subgraph within our To-do List project.
43
+ Let's start by generating a new subgraph. For our tutorial we will create a new subgraph within our To-do List project.
44
44
  Open your project and start your terminal.
45
45
  The Powerhouse toolkit provides a command-line utility to create new subgraphs easily.
46
46
 
47
47
  ```bash title="Run the following command to generate a new subgraph"
48
- ph generate --subgraph <to-do-list-subgraph>
48
+ ph generate --subgraph to-do-list
49
49
  ```
50
50
 
51
51
  ```bash title="Expected Output"
52
52
  Loaded templates: node_modules/@powerhousedao/codegen/dist/codegen/.hygen/templates
53
- FORCED: ./subgraphs/to-do-list-subgraph/index.ts
53
+ FORCED: ./subgraphs/to-do-list/index.ts
54
54
  skipped: ./subgraphs/index.ts
55
55
  inject: ./subgraphs/index.ts
56
56
  ```
57
57
 
58
58
  ### What happened?
59
- 1. A new subgraph was created in `./subgraphs/to-do-list-subgraph/`
59
+ 1. A new subgraph was created in `./subgraphs/to-do-list/`
60
60
  2. The subgraph was automatically registered in your project's registry
61
61
  3. Basic boilerplate code was generated with an example query
62
62
 
63
63
  If we now run `ph reactor` we will see the new subgraph being registered during the startup of the Reactor.
64
64
  > Registered /todolist subgraph.
65
65
 
66
- Alternatively, when you are running a local reactor with `ph reactor` a series of subgraphs will automatically get registered, amongst those one for the available document models in your Powerhouse project.
66
+ Alternatively, when you are running a local reactor with `ph reactor`, a series of subgraphs will automatically get registered, among those, one for the available document models in your Powerhouse project.
67
67
 
68
68
  ```
69
69
  Initializing Subgraph Manager...
@@ -120,30 +120,77 @@ Now let's create a subgraph that provides enhanced querying capabilities for our
120
120
  ```typescript
121
121
  export const typeDefs = `
122
122
  type Query {
123
+ # Dashboard-style summary query - returns high-level metrics
124
+ # Similar to ToDoListStats from document model but optimized for quick queries
123
125
  todoList: TodoListSummary
126
+
127
+ # Filtered list query - lets you get items by completion status
128
+ # More flexible than the basic document model - can filter checked/unchecked
124
129
  todoItems(checked: Boolean): [TodoItem!]!
130
+
131
+ # Count-only query - when you just need numbers, not full data
132
+ # Faster than getting full list when you only need totals for dashboards
125
133
  todoItemsCount(checked: Boolean): Int!
126
134
  }
127
135
 
136
+ # This mirrors ToDoListStats from the document model
137
+ # But it's a "view" optimized for summary reports and dashboards
128
138
  type TodoListSummary {
129
- total: Int!
130
- checked: Int!
131
- unchecked: Int!
139
+ total: Int! # Total number of items
140
+ checked: Int! # Number of completed items
141
+ unchecked: Int! # Number of pending items
132
142
  }
133
143
 
144
+ # This matches the ToDoItem from the document model
145
+ # Same data structure, but accessed through subgraph queries for filtering
134
146
  type TodoItem {
135
- id: ID!
136
- text: String!
137
- checked: Boolean!
147
+ id: ID! # Unique identifier
148
+ text: String! # The task description
149
+ checked: Boolean! # Completion status
138
150
  }
139
151
  `;
140
152
  ```
141
153
 
154
+
155
+
156
+ #### Understanding Resolvers
157
+
158
+ Before diving into the technical implementation, let's understand why these three different query types matter for your product.
159
+ Think of resolvers as custom API endpoints that are automatically created based on what your users actually need to know about your data.
160
+
161
+ When someone asks your system a question through GraphQL, the resolver:
162
+
163
+ 1. **Understands the request** - "The customer wants unchecked items"
164
+ 2. **Knows where to get the data** - "I need to check the todo_items database table"
165
+ 3. **Applies the right filters** - "Only get items where checked = false"
166
+ 4. **Returns the answer** - "Here are the 5 unchecked items"
167
+
168
+ **The three resolvers serve different business needs:**
169
+
170
+ - **`todoList` Resolver - The Dashboard**
171
+ - **Business value**: Perfect for executive dashboards or KPI displays
172
+ - **Use case**: "We have 150 total tasks, 89 completed, 61 pending"
173
+ - **Users**: Executives, managers, anyone needing high-level metrics
174
+
175
+ - **`todoItems` Resolver - The Detailed List**
176
+ - **Business value**: Great for operational views where people need to see actual tasks
177
+ - **Use case**: "Show me all pending tasks" or "Show me everything"
178
+ - **Users**: Workers, operators, anyone who needs to act on specific items
179
+
180
+ - **`todoItemsCount` Resolver - The Counter**
181
+ - **Business value**: Super fast for analytics or when you only need numbers
182
+ - **Use case**: "How many completed tasks do we have?" → "47"
183
+ - **Users**: Analysts, automated systems, performance dashboards
184
+
185
+ **Why this architecture matters:**
186
+ - **Performance**: Count queries are much faster than getting full lists when you only need numbers
187
+ - **User Experience**: Different resolvers serve different user needs efficiently
188
+ - **Flexibility**: Users can ask for exactly what they need, nothing more, nothing less
189
+
142
190
  **Step 2: Create resolvers in `subgraphs/to-do-list/resolvers.ts`:**
143
191
 
144
192
  ```typescript
145
193
  // subgraphs/to-do-list/resolvers.ts
146
- // subgraphs/to-do-list/resolvers.ts
147
194
  interface SubgraphInstance {
148
195
  operationalStore: any;
149
196
  }
@@ -196,35 +243,48 @@ import { typeDefs } from './schema.js';
196
243
  import { createResolvers } from './resolvers.js';
197
244
 
198
245
  export default class ToDoListSubgraph {
246
+ // Define the API endpoint where this subgraph will be accessible
247
+ // Users can query this at: http://localhost:4001/graphql/to-do-list
199
248
  path = '/to-do-list';
200
249
 
250
+ // GraphQL schema definition (what queries are available)
201
251
  typeDefs = typeDefs;
252
+
253
+ // Query handlers (how to fetch the data)
202
254
  resolvers: any;
255
+
256
+ // Database interface (injected by Powerhouse framework)
203
257
  operationalStore: any;
204
258
 
205
259
  constructor() {
260
+ // Connect the resolvers to this subgraph instance
261
+ // This gives resolvers access to the database through this.operationalStore
206
262
  this.resolvers = createResolvers(this);
207
263
  }
208
264
 
265
+ // Called once when the subgraph starts up
209
266
  async onSetup() {
210
267
  await this.createOperationalTables();
211
268
  }
212
269
 
270
+ // Create the database tables we need for storing todo items
213
271
  async createOperationalTables() {
214
272
  await this.operationalStore.schema.createTableIfNotExists(
215
- "todo_items",
273
+ "todo_items", // Table name
216
274
  (table: any) => {
217
- table.string("id").primary();
218
- table.string("text").notNullable();
219
- table.boolean("checked").defaultTo(false);
220
- table.timestamp("created_at").defaultTo(this.operationalStore.fn.now());
221
- table.timestamp("updated_at").defaultTo(this.operationalStore.fn.now());
275
+ table.string("id").primary(); // Unique identifier for each todo item
276
+ table.string("text").notNullable(); // The actual todo task text
277
+ table.boolean("checked").defaultTo(false); // Completion status (unchecked by default)
278
+ table.timestamp("created_at").defaultTo(this.operationalStore.fn.now()); // When item was created
279
+ table.timestamp("updated_at").defaultTo(this.operationalStore.fn.now()); // When item was last modified
222
280
  }
223
281
  );
224
282
  }
225
283
 
284
+ // Event processor: Keeps subgraph data synchronized with document model changes
285
+ // When users add/update/delete todos in Connect, this method handles the updates
226
286
  async process(event: any) {
227
- // Handle To-do List document operations
287
+ // Handle new todo item creation
228
288
  if (event.type === "ADD_TODO_ITEM") {
229
289
  await this.operationalStore.insert("todo_items", {
230
290
  id: event.input.id,
@@ -237,12 +297,13 @@ export default class ToDoListSubgraph {
237
297
  console.log(`Added todo item: ${event.input.text}`);
238
298
  }
239
299
 
300
+ // Handle todo item updates (text changes, checking/unchecking)
240
301
  if (event.type === "UPDATE_TODO_ITEM") {
241
302
  const updateData: any = {
242
- updated_at: new Date()
303
+ updated_at: new Date() // Always update the timestamp
243
304
  };
244
305
 
245
- // Only update fields that were provided
306
+ // Only update fields that were actually changed
246
307
  if (event.input.text !== undefined) {
247
308
  updateData.text = event.input.text;
248
309
  }
@@ -257,6 +318,7 @@ export default class ToDoListSubgraph {
257
318
  console.log(`Updated todo item: ${event.input.id}`);
258
319
  }
259
320
 
321
+ // Handle todo item deletion
260
322
  if (event.type === "DELETE_TODO_ITEM") {
261
323
  await this.operationalStore.delete("todo_items")
262
324
  .where("id", event.input.id);
@@ -267,11 +329,6 @@ export default class ToDoListSubgraph {
267
329
  }
268
330
  ```
269
331
 
270
- **What this schema provides:**
271
- - `todoList`: Returns statistics about all to-do items (total, checked, unchecked counts)
272
- - `todoItems`: Returns a list of to-do items, optionally filtered by checked status
273
- - `todoItemsCount`: Returns just the count of items, optionally filtered by checked status
274
-
275
332
  ### 2.3 Understanding the Implementation
276
333
 
277
334
  **What this multi-file approach provides:**
@@ -287,70 +344,72 @@ export default class ToDoListSubgraph {
287
344
  - Resolvers that fetch and filter todo items from the operational store
288
345
  - Event processing to keep the subgraph data synchronized with document model changes
289
346
 
290
- ### 2.4 Connect to Document Model Events (Processor Integration)
347
+ ### 2.4 Understanding the Document Model Event Integration
291
348
 
292
- To make our subgraph truly useful, we need to connect it to the actual To-do List document model events. This ensures that when users interact with To-do List documents through Connect, the subgraph data stays synchronized.
349
+ Notice that our `index.ts` file already includes a `process` method - this is the **processor integration** that keeps our subgraph synchronized with To-do List document model events. When users interact with To-do List documents through Connect, this method automatically handles the updates.
293
350
 
294
- Add this processor integration to your subgraph:
351
+ **How the existing processor integration works:**
295
352
 
353
+ The `process` method in our `index.ts` file handles three types of document model events:
354
+
355
+ **1. Adding new todo items:**
296
356
  ```typescript
297
- async process(event) {
298
- // Handle To-do List document operations
299
- if (event.type === "ADD_TODO_ITEM") {
300
- await this.operationalStore.insert("todo_items", {
301
- id: event.input.id,
302
- text: event.input.text,
303
- checked: false,
304
- created_at: new Date(),
305
- updated_at: new Date()
306
- });
307
-
308
- console.log(`Added todo item: ${event.input.text}`);
309
- }
310
-
311
- if (event.type === "UPDATE_TODO_ITEM") {
312
- const updateData = {
313
- updated_at: new Date()
314
- };
315
-
316
- // Only update fields that were provided
317
- if (event.input.text !== undefined) {
318
- updateData.text = event.input.text;
319
- }
320
- if (event.input.checked !== undefined) {
321
- updateData.checked = event.input.checked;
322
- }
323
-
324
- await this.operationalStore.update("todo_items")
325
- .where("id", event.input.id)
326
- .update(updateData);
327
-
328
- console.log(`Updated todo item: ${event.input.id}`);
329
- }
357
+ if (event.type === "ADD_TODO_ITEM") {
358
+ await this.operationalStore.insert("todo_items", {
359
+ id: event.input.id,
360
+ text: event.input.text,
361
+ checked: false,
362
+ created_at: new Date(),
363
+ updated_at: new Date()
364
+ });
365
+ }
366
+ ```
367
+
368
+ **2. Updating existing items:**
369
+ ```typescript
370
+ if (event.type === "UPDATE_TODO_ITEM") {
371
+ // Only update fields that were actually changed
372
+ const updateData = { updated_at: new Date() };
373
+ if (event.input.text !== undefined) updateData.text = event.input.text;
374
+ if (event.input.checked !== undefined) updateData.checked = event.input.checked;
330
375
 
331
- if (event.type === "DELETE_TODO_ITEM") {
332
- await this.operationalStore.delete("todo_items")
333
- .where("id", event.input.id);
334
-
335
- console.log(`Deleted todo item: ${event.input.id}`);
336
- }
376
+ await this.operationalStore.update("todo_items")
377
+ .where("id", event.input.id)
378
+ .update(updateData);
379
+ }
380
+ ```
381
+
382
+ **3. Deleting items:**
383
+ ```typescript
384
+ if (event.type === "DELETE_TODO_ITEM") {
385
+ await this.operationalStore.delete("todo_items")
386
+ .where("id", event.input.id);
337
387
  }
338
388
  ```
339
389
 
340
- **What this processor does:**
341
- - Listens for document model operations (`ADD_TODO_ITEM`, `UPDATE_TODO_ITEM`, `DELETE_TODO_ITEM`)
342
- - Updates the operational store in real-time when these operations occur
343
- - Provides console logging for debugging
344
- - Maintains data consistency between the document model and the subgraph
390
+ **The integration happens automatically:**
391
+ 1. **User action**: Someone adds a todo item in Connect
392
+ 2. **Document model**: Processes the `ADD_TODO_ITEM` operation
393
+ 3. **Framework routing**: Powerhouse automatically calls your subgraph's `process` method
394
+ 4. **Subgraph response**: Your `process` method updates the operational store
395
+ 5. **Query availability**: Users can now query the updated data via GraphQL
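The round trip above can be simulated end to end with plain objects. This sketch is an illustration only (an in-memory map stands in for the operational store, and `applyEvent` mirrors the `process` method's branching), not framework code:

```typescript
// Illustration: how the three document events mutate the synchronized data.
type TodoEvent =
  | { type: "ADD_TODO_ITEM"; input: { id: string; text: string } }
  | { type: "UPDATE_TODO_ITEM"; input: { id: string; text?: string; checked?: boolean } }
  | { type: "DELETE_TODO_ITEM"; input: { id: string } };

const store = new Map<string, { text: string; checked: boolean }>();

function applyEvent(event: TodoEvent): void {
  if (event.type === "ADD_TODO_ITEM") {
    // New items start unchecked, matching the processor's insert
    store.set(event.input.id, { text: event.input.text, checked: false });
  }
  if (event.type === "UPDATE_TODO_ITEM") {
    const row = store.get(event.input.id);
    if (!row) return;
    // Only update fields that were actually provided
    if (event.input.text !== undefined) row.text = event.input.text;
    if (event.input.checked !== undefined) row.checked = event.input.checked;
  }
  if (event.type === "DELETE_TODO_ITEM") {
    store.delete(event.input.id);
  }
}

applyEvent({ type: "ADD_TODO_ITEM", input: { id: "1", text: "Write docs" } });
applyEvent({ type: "UPDATE_TODO_ITEM", input: { id: "1", checked: true } });
console.log(store.get("1")); // { text: "Write docs", checked: true }
```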
345
396
 
346
397
  ### 2.5 Summary of What We've Built
347
398
 
348
- - **Added two main queries**: `todoList` for statistics and `todoItems` for item lists
349
- - **Created an operational table** `todo_items` to store the todo items with proper schema
350
- - **Added resolvers** to fetch and filter todo items from the operational store
351
- - **Implemented event processing** to keep the subgraph data synchronized with document model changes
352
- - **The todoItems query accepts an optional checked parameter** to filter items by their completion status
353
- - **The todoList query returns the full statistics** including total, checked, and unchecked counts
399
+ Our complete To-do List subgraph includes:
400
+
401
+ - **GraphQL schema** (`schema.ts`): Defines `todoList`, `todoItems`, and `todoItemsCount` queries
402
+ - **Resolvers** (`resolvers.ts`): Handle data fetching and filtering from the operational store
403
+ - **Main subgraph class** (`index.ts`): Coordinates everything and includes:
404
+ - **Operational table creation**: Sets up the `todo_items` table with proper schema
405
+ - **Event processing**: The `process` method keeps subgraph data synchronized with document model changes
406
+ - **Real-time updates**: Automatically handles `ADD_TODO_ITEM`, `UPDATE_TODO_ITEM`, and `DELETE_TODO_ITEM` events
407
+
408
+ **Key features:**
409
+ - **Filtering capability**: The `todoItems` query accepts an optional `checked` parameter
410
+ - **Performance optimization**: The `todoItemsCount` query returns just numbers when you don't need full data
411
+ - **Real-time synchronization**: Changes in Connect immediately appear in subgraph queries
412
+ - **Complete statistics**: The `todoList` query returns total, checked, and unchecked counts
354
413
 
355
414
  ## 3. Testing the To-do List Subgraph
356
415
 
@@ -485,7 +544,7 @@ This demonstrates the real-time synchronization between the document model and t
485
544
 
486
545
  ## 4. Working with the supergraph or gateway
487
546
 
488
- A supergraph is a GraphQL schema that combines multiple underlying GraphQL APIs, known as subgraphs, into a single, unified graph. This architecture allows different teams to work independently on their respective services (subgraphs) while providing a single entry point for clients or users to query all available data
547
+ A supergraph is a GraphQL schema that combines multiple underlying GraphQL APIs, known as subgraphs, into a single, unified graph. This architecture allows different teams to work independently on their respective services (subgraphs) while providing a single entry point for clients or users to query all available data.
489
548
 
490
549
  ### 4.1 Key concepts
491
550
 
@@ -503,7 +562,7 @@ A supergraph is a GraphQL schema that combines multiple underlying GraphQL APIs,
503
562
 
504
563
  ### 4.3 Use the Powerhouse supergraph
505
564
 
506
- The Powerhouse supergraph for any given remote drive or reactor can be found under `http://localhost:4001/graphql`. The gateway / supergraph available on `/graphql` combines all the subgraphs, except for the drive subgraph (which is accessible via `/d/:driveId`). To get to the endpoint open your localhost by starting the reactor and adding `graphql` to the end of the url. The following commands explain how you can test & try the supergraph.
565
+ The Powerhouse supergraph for any given remote drive or reactor can be found under `http://localhost:4001/graphql`. The gateway / supergraph available on `/graphql` combines all the subgraphs, except for the drive subgraph (which is accessible via `/d/:driveId`). To access the endpoint, start the reactor and navigate to the URL with `graphql` appended. The following commands explain how you can test & try the supergraph.
507
566
 
508
567
  - Start the reactor:
509
568
 
@@ -517,7 +576,7 @@ The Powerhouse supergraph for any given remote drive or reactor can be found und
517
576
  http://localhost:4001/graphql
518
577
  ```
519
578
 
520
- The supergraph allows to both query & mutate data from the same endpoint.
579
+ The supergraph allows you to both query & mutate data from the same endpoint.
521
580
 
522
581
  **Example: Using the supergraph with To-do List documents**
523
582
 
@@ -639,7 +698,7 @@ To integrate with them, register them via the Reactor API.
639
698
 
640
699
  ### Future enhancements
641
700
 
642
- Bridge Processors and Subgraphs Currently, there's a gap in how processors and subgraphs interact. Powerhouse might improve this in future updates.
701
+ **Bridge processors and subgraphs**: Currently, there's a gap in how processors and subgraphs interact. Powerhouse might improve this in future updates.
643
702
 
644
703
 
645
704
 
@@ -1,28 +1,62 @@
1
- # Build a Todo-List processor
1
+ # Build a Todo-List Processor
2
2
 
3
- 1. Generate the processor
4
- 2. Define your database schema
5
- 3. Customize the processor to your needs
6
- 4. Test your processor
7
- 5. Use the relational database in Frontend and Subgraph
3
+ ## What You'll Learn
8
4
 
5
+ In this tutorial, you'll learn how to build a **relational database processor** that listens to changes in Powerhouse TodoList documents and automatically maintains a synchronized relational database. This is useful for creating queryable data stores, generating reports, or integrating with existing database-driven applications.
9
6
 
10
- ## Generate the Processor
7
+ ## What is a Processor?
8
+
9
+ A **processor** in Powerhouse is a background service that automatically responds to document changes. Think of it as a "listener" that watches for specific document operations (like creating, updating, or deleting todos) and then performs custom logic - in this case, updating a relational database.
10
+
11
+ **Key Benefits:**
12
+ - **Real-time synchronization**: Your database stays automatically up-to-date with document changes
13
+ - **Query performance**: Relational databases excel at complex queries and joins
14
+
15
+ ## Tutorial Steps
16
+
17
+ 1. **Generate the processor** - Create the basic processor structure
18
+ 2. **Define your database schema** - Design the tables to store your data
19
+ 3. **Generate TypeScript types** - Get type safety for database operations
20
+ 4. **Configure the filter** - Specify which documents to listen to
21
+ 5. **Customize the processor logic** - Implement how document changes update the database
22
+ 6. **Use the data via Subgraph** - Query your processed data through GraphQL
23
+
24
+ ---
25
+
26
+ ## Step 1: Generate the Processor
27
+
28
+ First, we'll create the processor using the Powerhouse CLI. This command scaffolds all the necessary files and configuration.
11
29
 
12
- In order to generate the processor you need to run the following command:
13
30
  ```bash
14
31
  ph generate --processor todo-processor --processor-type relational-db --document-types powerhouse/todolist
15
32
  ```
16
33
 
17
- With that command you create a processor named todo-processor which is of type relational db and listens on changes from documents of type powerhouse/todolist.
34
+ **Breaking down this command:**
35
+ - `--processor todo-processor`: Names your processor "todo-processor"
36
+ - `--processor-type relational-db`: Creates a processor that works with SQL databases
37
+ - `--document-types powerhouse/todolist`: Tells the processor to listen for changes in TodoList documents
38
+
39
+ **What gets created:**
40
+ - `processors/todo-processor/` directory with all necessary files
41
+ - Migration files for database schema management
42
+ - Factory function for processor instantiation
43
+ - Base processor class ready for customization
44
+
45
+ ---
18
46
 
19
- ## Define your database schema
47
+ ## Step 2: Define Your Database Schema
20
48
 
21
- As next step we need to define the db schema in the `processors/todo-processor/migration.ts` file.
49
+ Next, we need to define what our database tables will look like. This happens in the **migration file**, which contains instructions for creating (and optionally destroying) database tables.
22
50
 
23
- The migration file has a up and a down function which gets called when either the processor was added or when the processor was removed.
51
+ **File location:** `processors/todo-processor/migration.ts`
24
52
 
25
- Below you can find the example of a todo table.
53
+ ### Understanding Migrations
54
+
55
+ Migrations are scripts that modify your database structure. They have two functions:
56
+ - **`up()`**: Runs when the processor is added - creates tables and indexes
57
+ - **`down()`**: Runs when the processor is removed - cleans up by dropping tables
58
+
59
+ Here's our TodoList migration:
26
60
 
27
61
  ```ts
28
62
  import { type IBaseRelationalDb } from "document-drive/processors/types"
@@ -30,41 +64,61 @@ import { type IBaseRelationalDb } from "document-drive/processors/types"
30
64
  export async function up(db: IBaseRelationalDb): Promise<void> {
31
65
  // Create table
32
66
  await db.schema
33
- .createTable("todo")
34
- .addColumn("name", "varchar(255)")
35
- .addColumn("completed", "boolean")
36
- .addPrimaryKeyConstraint("todo_pkey", ["name"])
37
- .ifNotExists()
38
- .execute();
39
-
67
+ .createTable("todo") // Table name: "todo"
68
+ .addColumn("name", "varchar(255)") // Todo item text (up to 255 characters)
69
+ .addColumn("completed", "boolean") // Completion status (true/false)
70
+ .addPrimaryKeyConstraint("todo_pkey", ["name"]) // Primary key on 'name' column
71
+ .ifNotExists() // Only create if table doesn't exist
72
+ .execute(); // Execute the SQL command
73
+
74
+ // Optional: Log all tables for debugging
40
75
  const tables = await db.introspection.getTables();
41
76
  console.log(tables);
42
77
  }
43
78
 
44
79
  export async function down(db: IBaseRelationalDb): Promise<void> {
45
- // drop table
80
+ // Clean up: drop the table when processor is removed
46
81
  await db.schema.dropTable("todo").execute();
47
82
  }
48
83
  ```
49
84
 
50
- ## Generate Types
85
+ **Design decisions explained:**
86
+ - **`name` as primary key**: Assumes todo names are unique (you might want to use an auto-incrementing ID instead)
87
+ - **Simple boolean for completion**: Easy to query for completed vs. incomplete todos
88
+ - **`ifNotExists()`**: Prevents errors if the processor restarts
89
+
90
+ ---
51
91
 
52
- After defining your db schema its important to generate the types for typescript. This allows to create type safety queries and make use of code completion in your IDE when writing database queries.
92
+ ## Step 3: Generate TypeScript Types
53
93
 
54
- Simply execute the following command.
94
+ After defining your database schema, generate TypeScript types for type-safe database operations. This provides IDE autocomplete and catches errors at compile time.
55
95
 
56
96
  ```bash
57
97
  ph generate --migration-file processors/todo-processor/migrations.js --schema-file processors/todo-processor/schema.ts
58
98
  ```
59
99
 
60
- Afterwards check your `processors/todo-processor/schema.ts` file.
61
- It will contain the types of your database.
100
+ **What this does:**
101
+ - Analyzes your migration file
102
+ - Generates TypeScript interfaces matching your database tables
103
+ - Creates a `schema.ts` file with type definitions
104
+
105
+ **Result:** You'll get types like:
106
+ ```ts
107
+ interface Todo {
108
+ name: string;
109
+ completed: boolean;
110
+ }
111
+ ```
62
112
 
63
- ## Define the Filter
113
+ These types will be available in `processors/todo-processor/schema.ts` and ensure your database queries are type-safe.
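As a quick sketch of what that type safety buys you, the `Todo` interface below mirrors the generated one shown above; a plain array stands in for the query builder so the example stays self-contained:

```typescript
// "Todo" mirrors the generated interface from schema.ts;
// a plain array stands in for the database rows here.
interface Todo {
  name: string;
  completed: boolean;
}

function pendingTodos(rows: Todo[]): Todo[] {
  // The compiler rejects typos like `row.complete` thanks to the types.
  return rows.filter((row) => !row.completed);
}

const rows: Todo[] = [
  { name: "buy milk", completed: true },
  { name: "write report", completed: false },
];
console.log(pendingTodos(rows)); // [{ name: "write report", completed: false }]
```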
64
114
 
65
- Checkout the `processors/todo-processor/factory.ts`.
115
+ ---
66
116
 
67
- Here you can define how the processor is being instantiated. In thise case it listens on powerhouse/todo-list document changes in the main branch and the global scope.
117
+ ## Step 4: Configure the Filter
118
+
119
+ The **filter** determines which document changes your processor should respond to. This is configured in the factory function.
120
+
121
+ **File location:** `processors/todo-processor/factory.ts`
68
122
 
69
123
  ```ts
70
124
  export const todoProcessorProcessorFactory =
@@ -75,10 +129,10 @@ export const todoProcessorProcessorFactory =
75
129
 
76
130
  // Create a filter for the processor
77
131
  const filter: RelationalDbProcessorFilter = {
78
- branch: ["main"],
79
- documentId: ["*"],
80
- documentType: ["powerhouse/todo-list"],
81
- scope: ["global"],
132
+ branch: ["main"], // Only listen to main branch changes
133
+ documentId: ["*"], // Listen to ALL documents (wildcard)
134
+ documentType: ["powerhouse/todo-list"], // Only TodoList document types
135
+ scope: ["global"], // Global scope (vs. user-specific)
82
136
  };
83
137
 
84
138
  // Create a namespaced store for the processor
@@ -96,14 +150,23 @@ export const todoProcessorProcessorFactory =
96
150
  },
97
151
  ];
98
152
  };
99
-
100
153
  ```
101
154
 
102
- ## Customize the logic of the processor
155
+ **Filter options explained:**
156
+ - **`branch`**: Which document branches to monitor (usually "main" for production data)
157
+ - **`documentId`**: Specific document IDs or "*" for all documents
158
+ - **`documentType`**: The document model type - must match exactly
159
+ - **`scope`**: "global" for shared data, or specific scopes for user/organization data
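One plausible reading of these filter semantics (an illustration, not the framework's actual matching code) is that an operation is delivered only when every field matches, with `"*"` acting as a wildcard:

```typescript
// Illustration of filter matching: every field must match,
// and "*" acts as a wildcard ("match anything").
interface Filter {
  branch: string[];
  documentId: string[];
  documentType: string[];
  scope: string[];
}

function matches(allowed: string[], value: string): boolean {
  return allowed.includes("*") || allowed.includes(value);
}

function filterAccepts(
  f: Filter,
  op: { branch: string; documentId: string; documentType: string; scope: string },
): boolean {
  return (
    matches(f.branch, op.branch) &&
    matches(f.documentId, op.documentId) &&
    matches(f.documentType, op.documentType) &&
    matches(f.scope, op.scope)
  );
}

const todoFilter: Filter = {
  branch: ["main"],
  documentId: ["*"],
  documentType: ["powerhouse/todo-list"],
  scope: ["global"],
};

console.log(filterAccepts(todoFilter, {
  branch: "main", documentId: "abc", documentType: "powerhouse/todo-list", scope: "global",
})); // true
console.log(filterAccepts(todoFilter, {
  branch: "feature", documentId: "abc", documentType: "powerhouse/todo-list", scope: "global",
})); // false
```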
160
+
161
+ **Namespace concept**: Each processor gets its own database namespace to avoid conflicts when multiple processors or drives exist.
103
162
 
104
- When you defined your db schema and the filter when your processor should receive processed operations its time to implement the actual logic.
163
+ ---
105
164
 
106
- In the following you'll find an example where we store all the created and udpated todos in a table.
165
+ ## Step 5: Implement the Processor Logic
166
+
167
+ Now for the core functionality - how your processor responds to document changes. This is where you define what happens when TodoList documents are created, updated, or deleted.
168
+
169
+ **File location:** `processors/todo-processor/index.ts`
107
170
 
108
171
  ```ts
109
172
  type DocumentType = ToDoListDocument;
@@ -112,88 +175,209 @@ export class TodoIndexerProcessor extends RelationalDbProcessor<DB> {
112
175
 
113
176
  static override getNamespace(driveId: string): string {
114
177
  // Default namespace: `${this.name}_${driveId.replaceAll("-", "_")}`
178
+ // Each drive gets its own database tables to prevent data mixing
115
179
  return super.getNamespace(driveId);
116
180
  }
117
181
 
118
182
  override async initAndUpgrade(): Promise<void> {
183
+ // Run database migrations when processor starts
184
+ // This creates your tables if they don't exist
119
185
  await up(this.relationalDb as IBaseRelationalDb);
120
186
  }
121
187
 
122
188
  override async onStrands(
123
189
  strands: InternalTransmitterUpdate<DocumentType>[],
124
190
  ): Promise<void> {
191
+ // Early exit if no data to process
125
192
  if (strands.length === 0) {
126
193
  return;
127
194
  }
128
195
 
196
+ // Process each strand (a strand represents changes to one document)
129
197
  for (const strand of strands) {
130
198
  if (strand.operations.length === 0) {
131
199
  continue;
132
200
  }
133
201
 
202
+ // Process each operation in the strand
134
203
  for (const operation of strand.operations) {
204
+ // Simple example: Insert a new todo for every operation
205
+ // In a real implementation, you'd check the operation type and data
135
206
  await this.relationalDb
136
207
  .insertInto("todo")
137
208
  .values({
138
- task: strand.documentId,
139
- status: true,
209
+ task: strand.documentId, // Use document ID as task name
210
+ status: true, // Default to completed
140
211
  })
141
212
  .execute();
142
213
  }
143
214
  }
144
215
  }
145
216
 
146
- async onDisconnect() {}
217
+ async onDisconnect() {
218
+ // Cleanup logic when processor shuts down
219
+ // Could include closing connections, saving state, etc.
220
+ }
147
221
  }
148
-
149
222
  ```
150
223
 
151
- ## Fetch Data through a Subgraph
224
+ ### Understanding Strands and Operations
152
225
 
153
- ### Generate Subgraph
226
+ **Strands** represent a sequence of changes to a single document. Each strand contains:
227
+ - `documentId`: Which document changed
228
+ - `operations`: Array of operations (add todo, complete todo, etc.)
229
+ - `state`: The current document state
154
230
 
155
- Simply generate a new subgraph with:
156
- ```bash
157
- ph generate --subgraph <subgraph-name>
231
+ **Operations** are the actual changes made to the document:
232
+ - `ADD_TODO`: New todo item created
233
+ - `TOGGLE_TODO`: Todo completion status changed
234
+ - `DELETE_TODO`: Todo item removed
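This structure can be illustrated with a minimal, self-contained sketch. The field names follow the description above, but the types here are simplified stand-ins; the real `InternalTransmitterUpdate` type comes from the Powerhouse SDK:

```typescript
// Simplified stand-ins for the strand/operation shapes described above
type Operation = { type: string; input?: unknown };
type Strand = { documentId: string; operations: Operation[]; state?: unknown };

// Tally a strand's operations by type, e.g. to decide which handlers to run
function countByType(strand: Strand): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const op of strand.operations) {
    counts[op.type] = (counts[op.type] ?? 0) + 1;
  }
  return counts;
}

const strand: Strand = {
  documentId: "doc-1",
  operations: [{ type: "ADD_TODO" }, { type: "ADD_TODO" }, { type: "TOGGLE_TODO" }],
};
console.log(countByType(strand)); // → { ADD_TODO: 2, TOGGLE_TODO: 1 }
```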
235
+
236
+ ### Improving the Example
237
+
238
+ The provided example is simplified. In production, you'd want to:
239
+
240
+ 1. **Parse operation types:**
241
+ ```ts
242
+ switch (operation.type) {
243
+ case 'ADD_TODO':
244
+ // Insert new todo
245
+ break;
246
+ case 'TOGGLE_TODO':
247
+ // Update completion status
248
+ break;
249
+ case 'DELETE_TODO':
250
+ // Remove todo from database
251
+ break;
252
+ }
253
+ ```
254
+
255
+ 2. **Handle errors gracefully:**
256
+ ```ts
257
+ try {
258
+ await this.relationalDb.insertInto("todo").values(values).execute();
259
+ } catch (error) {
260
+ console.error('Failed to insert todo:', error);
261
+ // Could implement retry logic, dead letter queue, etc.
262
+ }
158
263
  ```
159
264
 
160
- ### Fetch Data from Processor
265
+ 3. **Use transactions for consistency:**
266
+ ```ts
267
+ await this.relationalDb.transaction().execute(async (trx) => {
268
+ // Multiple operations that should all succeed or all fail
269
+ });
270
+ ```
161
271
 
162
- open ```./subgraphs/<subgraph-name>/index.ts```
272
+ ---
163
273
 
274
+ ## Step 6: Query Data Through a Subgraph
164
275
 
276
+ Once your processor is storing data in the database, you can expose it via GraphQL using a **subgraph**. This creates a clean API for frontend applications to query the processed data.
165
277
 
166
- define the following:
278
+ ### Generate a Subgraph
167
279
 
280
+ Create a new GraphQL subgraph that can query your processor's database:
281
+
282
+ ```bash
283
+ ph generate --subgraph <subgraph-name>
284
+ ```
285
+
286
+ **What this creates:**
287
+ - GraphQL schema definitions
288
+ - Resolver functions that fetch data
289
+ - Integration with your processor's database
290
+
291
+ ### Configure the Subgraph
292
+
293
+ **File location:** `./subgraphs/<subgraph-name>/index.ts`
168
294
 
169
295
  ```ts
170
296
  resolvers = {
171
297
  Query: {
172
298
  todoList: {
173
299
  resolve: async (parent, args, context, info) => {
300
+ // Query the processor's database using the generated types
174
301
  const todoList = await TodoProcessor.query(
175
- args.driveId ?? "powerhouse",
176
- this.relationalDb
302
+ args.driveId ?? "powerhouse", // Default drive if none specified
303
+ this.relationalDb // Database connection from processor
177
304
  )
178
- .selectFrom("todo")
179
- .selectAll()
180
- .execute();
305
+ .selectFrom("todo") // FROM todo table
306
+ .selectAll() // SELECT * (all columns)
307
+ .execute(); // Execute and return results
181
308
  return todoList;
182
309
  },
183
310
  },
184
311
  },
185
312
  };
186
313
 
314
+ // GraphQL schema definition
187
315
 typeDefs = gql`
 type Todo {
-  name: String!
-  completed: Boolean!
+  name: String! # Todo text (required)
+  completed: Boolean! # Completion status (required)
 }

 type Query {
-  todoList(driveId: String): [Todo!]!
+  todoList(driveId: String): [Todo!]! # Query to get all todos for a drive
 }
 `;
197
- ```
325
+ ```
326
+
327
+ ### Understanding the GraphQL Integration
328
+
329
+ **Resolvers** are functions that fetch data for each GraphQL field:
330
+ - `parent`: Data from parent resolver (unused here)
331
+ - `args`: Arguments passed to the query (like `driveId`)
332
+ - `context`: Shared context (database connections, user info, etc.)
333
+ - `info`: Metadata about the GraphQL query
334
+
335
+ **Type Definitions** describe your GraphQL schema:
336
+ - `type Todo`: Defines the structure of a todo item
337
+ - `todoList(driveId: String): [Todo!]!`: A query that returns an array of todos
338
+ - `!` means the field is required/non-null
339
+
340
+ ### Querying Your Data
341
+
342
+ Once deployed, frontend applications can query your data like this:
343
+
344
+ ```graphql
345
+ query GetTodos($driveId: String) {
346
+ todoList(driveId: $driveId) {
347
+ name
348
+ completed
349
+ }
350
+ }
351
+ ```
352
+
353
+ This would return:
354
+ ```json
355
+ {
356
+ "data": {
357
+ "todoList": [
358
+ {"name": "Buy groceries", "completed": false},
359
+ {"name": "Write tutorial", "completed": true}
360
+ ]
361
+ }
362
+ }
363
+ ```
364
+
365
+ ---
366
+
367
+ ## Next Steps and Best Practices
368
+
369
+ ### Testing Your Processor
370
+
371
+ 1. **Unit tests**: Test individual functions with mock data
372
+ 2. **Integration tests**: Test the full processor with real document operations
373
+
374
+ ### Production Considerations
375
+
376
+ 1. **Error handling**: Implement robust error handling and logging
377
+ 2. **Monitoring**: Add metrics to track processor performance
378
+ 3. **Scaling**: Consider database indexing and query optimization
379
+ 4. **Security**: Validate input data and implement proper access controls
380
+
381
+ This processor tutorial demonstrates the power of Powerhouse's event-driven architecture, where document changes automatically flow through to specialized data stores optimized for different use cases.
198
382
 
199
383
 
@@ -0,0 +1,8 @@
1
+ {
2
+ "label": "RelationalDb Processor",
3
+ "position": 6,
4
+ "link": {
5
+ "type": "generated-index",
6
+ "description": "Learn how to make use of a RelationalDb Processor in this tutorial!"
7
+ }
8
+ }
@@ -50,6 +50,42 @@ const updateInvoiceName = useUpdateDocumentField('docId', 'name')
50
50
  // Combined read + write (like useState)
51
51
  const [invoiceName, updateInvoiceName] = useDocumentField('docId', 'name')
52
52
  ```
53
+ Initial documentation for the hooks can be found [here](https://github.com/powerhouse-inc/powerhouse/blob/main/packages/common/state/README.md).
54
+
55
+ ### Temporary use of the new hooks in custom drive editors
56
+
57
+
58
+ To use the new hooks in custom components or editors today, developers must:
+
+ 1. Pass in the existing Reactor instance as a prop (`createReactor`).
+ 2. Call `useInitializeReactor(props.createReactor, false)` in the custom component.
+ 3. Optionally wrap `dispatch()` to trigger a refresh manually, using `const refresh = useSyncDrivesAndDocumentsWithReactor();`.
+
+ For maximum compatibility, consider passing data and setter functions (drives, documents, etc.) directly as props instead of using the hooks inside the custom editor.
67
+
68
+ Example:
69
+
70
+ ```diff
71
+ +import { AtomStoreProvider } from "@powerhousedao/common";
72
+
73
+ export default function Editor(props: IProps) {
74
+ return (
75
+ + <AtomStoreProvider reactor={props.context.reactor}>
76
+ <DriveContextProvider value={props.context}>
77
+ <WagmiContext>
78
+ <BaseEditor {...props} />
79
+ </WagmiContext>
80
+ </DriveContextProvider>
81
+ + </AtomStoreProvider>
82
+ );
83
+ }
84
+ ```
85
+
86
+ Until the hooks are fully integrated and event handling is granular, the team will need to handle some of this manually or limit their use to experimental side projects and internal demos.
87
+
88
+
53
89
 
54
90
 
55
91
  ## An overview of currently available hooks
@@ -7,11 +7,11 @@ This page covers the relational database tools available in Powerhouse applicati
7
7
  The relational database layer gives you powerful tools to work with data in your Powerhouse applications. You get type-safe queries, real-time updates, and a simple API that feels familiar to React developers.
8
8
 
9
9
  **Key Benefits:**
10
- - 🔒 **Type-safe queries** with full TypeScript support
11
- - 🔄 **Live query capabilities** with real-time updates
12
- - **Automatic optimization** to prevent infinite re-renders
13
- - 🎯 **Simple API** that abstracts away complexity
14
- - 🧠 **Smart memoization** for parameters and queries
10
+ - **Type-safe queries** with full TypeScript support
11
+ - **Live query capabilities** with real-time updates
12
+ - **Automatic optimization** to prevent infinite re-renders
13
+ - **Simple API** that abstracts away complexity
14
+ - **Smart memoization** for parameters and queries
15
15
 
16
16
  ## Quick Start
17
17
 
@@ -40,23 +40,26 @@ type MyDatabase = {
40
40
 
41
41
  ```typescript
42
42
  import { createProcessorQuery } from '@powerhousedao/reactor-browser/relational';
43
+ import { MyProcessor } from './processors/my-processor';
43
44
 
44
- const useTypedQuery = createProcessorQuery<MyDatabase>();
45
+ // Create a typed query hook for your processor
46
+ const useTypedQuery = createProcessorQuery(MyProcessor);
45
47
  ```
46
48
 
47
49
  ### Step 3: Use it in your component
48
50
 
49
51
  ```typescript
50
52
  // Simple query - no parameters needed
51
- export function useUserList() {
52
- return useTypedQuery(db => {
53
+ export function useUserList(driveId: string) {
54
+ return useTypedQuery(driveId, db => {
53
55
  return db.selectFrom('users').selectAll().compile();
54
56
  });
55
57
  }
56
58
 
57
59
  // Query with parameters
58
- export function useUserById(userId: number) {
60
+ export function useUserById(driveId: string, userId: number) {
59
61
  return useTypedQuery(
62
+ driveId,
60
63
  (db, params) => {
61
64
  return db
62
65
  .selectFrom('users')
@@ -72,8 +75,8 @@ export function useUserById(userId: number) {
72
75
  ### Step 4: Use in your React component
73
76
 
74
77
  ```typescript
75
- function UserList() {
76
- const { isLoading, error, result } = useUserList();
78
+ function UserList({ driveId }: { driveId: string }) {
79
+ const { isLoading, error, result } = useUserList(driveId);
77
80
 
78
81
  if (isLoading) return <div>Loading...</div>;
79
82
  if (error) return <div>Error: {error.message}</div>;
@@ -98,40 +101,40 @@ function UserList() {
98
101
  ### 1. createProcessorQuery()
99
102
 
100
103
  <details>
101
- <summary>`createProcessorQuery<Schema>()`: Creates a typed query hook for your database schema</summary>
104
+ <summary>`createProcessorQuery(ProcessorClass)`: Creates a typed query hook factory for your processor</summary>
102
105
 
103
- ### Hook Name and Signature
106
+ ### Function Name and Signature
104
107
 
105
108
  ```typescript
106
- function createProcessorQuery<Schema>(): TypedQueryHook<Schema>
109
+ function createProcessorQuery<Schema>(
110
+ ProcessorClass: RelationalDbProcessorClass<Schema>
111
+ ): TypedQueryHook<Schema>
107
112
  ```
108
113
 
109
114
  ### Description
110
115
 
111
- Creates a typed query hook that provides type-safe database operations with live query capabilities. This is the main hook you'll use for most relational database operations in your components.
116
+ Creates a typed query hook factory for a specific processor class. This is the main function you'll use to create hooks for querying your relational database.
112
117
 
113
118
  ### Usage Example
114
119
 
115
120
  ```typescript
116
121
  import { createProcessorQuery } from '@powerhousedao/reactor-browser/relational';
122
+ import { MyProcessor } from './processors/my-processor';
117
123
 
118
- type AppDatabase = {
119
- users: { id: number; name: string; email: string };
120
- posts: { id: number; title: string; author_id: number };
121
- };
122
-
123
- const useTypedQuery = createProcessorQuery<AppDatabase>();
124
+ // Create a typed query hook for your processor
125
+ const useTypedQuery = createProcessorQuery(MyProcessor);
124
126
 
125
- // Static query (no parameters)
126
- function useAllUsers() {
127
- return useTypedQuery(db => {
127
+ // Use it to create specific query hooks
128
+ export const useUsers = (driveId: string) => {
129
+ return useTypedQuery(driveId, (db) => {
128
130
  return db.selectFrom('users').selectAll().compile();
129
131
  });
130
- }
132
+ };
131
133
 
132
- // Dynamic query with parameters
133
- function useUsersByStatus(status: string) {
134
+ // With parameters
135
+ export const useUsersByStatus = (driveId: string, status: string) => {
134
136
  return useTypedQuery(
137
+ driveId,
135
138
  (db, params) => {
136
139
  return db
137
140
  .selectFrom('users')
@@ -141,41 +144,32 @@ function useUsersByStatus(status: string) {
141
144
  },
142
145
  { status }
143
146
  );
144
- }
147
+ };
145
148
  ```
146
149
 
147
150
  ### Parameters
148
151
 
149
- The returned hook has two overloads:
150
-
151
- **Static queries (no parameters):**
152
- - `queryCallback: (db: EnhancedKysely<Schema>) => QueryCallbackReturnType` - Function that receives the database instance and returns a query
153
-
154
- **Parameterized queries:**
155
- - `queryCallback: (db: EnhancedKysely<Schema>, parameters: TParams) => QueryCallbackReturnType` - Function that receives the database instance and parameters
156
- - `parameters: TParams` - Parameters for the query (automatically memoized)
152
+ The returned hook accepts:
153
+ - `driveId`: The ID of the drive
154
+ - `queryCallback`: Function that receives the database instance and optional parameters
155
+ - `parameters`: Optional parameters for the query
157
156
 
158
157
  ### Return Value
159
158
 
160
159
  ```typescript
161
160
  {
162
- isLoading: boolean; // True while query is loading
163
- error: Error | null; // Any error that occurred
164
- result: LiveQueryResults<T> | null; // Query results with real-time updates
161
+ isLoading: boolean; // True while loading or retrying
162
+ error: Error | null; // Any error that occurred
163
+ result: LiveQueryResults<T> | null; // Query results with live updates
165
164
  }
166
165
  ```
167
166
 
168
167
  ### Notes / Caveats
169
168
 
170
- - Parameters are automatically memoized using deep comparison
171
- - Queries update in real-time when the database changes
172
- - The callback must return an object with `sql` and optional `parameters` properties
173
- - Use `.compile()` on Kysely queries to get the required format
174
-
175
- ### Related Hooks
176
-
177
- - [`useOperationalStore`](#useoperationalstore) - For direct database access
178
- - [`useOperationalQuery`](#useoperationalquery) - Lower-level query hook
169
+ - Create one `useTypedQuery` hook per processor
170
+ - The hook includes automatic retry logic for common errors
171
+ - Parameters are automatically memoized
172
+ - Queries are live and will update automatically when data changes
179
173
 
180
174
  </details>
181
175
 
@@ -246,8 +240,8 @@ function DatabaseOperations() {
246
240
 
247
241
  ### Related Hooks
248
242
 
249
- - [`createProcessorQuery`](#createProcessorQuery) - For optimized queries
250
- - [`useOperationalQuery`](#useoperationalquery) - For manual query control
243
+ - [`createProcessorQuery`](#1-createprocessorquery) - For optimized queries
244
+ - [`useOperationalQuery`](#3-useoperationalquery) - For manual query control
251
245
 
252
246
  </details>
253
247
 
@@ -314,8 +308,8 @@ function UserCount() {
314
308
 
315
309
  ### Related Hooks
316
310
 
317
- - [`createProcessorQuery`](#createProcessorQuery) - Recommended higher-level API
318
- - [`useOperationalStore`](#useoperationalstore) - For direct database access
311
+ - [`createProcessorQuery`](#1-createprocessorquery) - Recommended higher-level API
312
+ - [`useOperationalStore`](#2-useoperationalstore) - For direct database access
319
313
 
320
314
  </details>
321
315
 
@@ -1,12 +1,12 @@
1
- # PHDocument Migration Guide (v3.3.0)
1
+ # PHDocument Migration Guide
2
2
 
3
3
  :::tip
4
- This guide covers the **breaking changes** introduced in Powerhouse v3.3.0 related to PHDocument structure changes. If you're upgrading from v3.2.0 or earlier, **this migration is required** and document models must be regenerated.
4
+ This guide covers the **breaking changes** introduced in Powerhouse v4.0.0 related to PHDocument structure changes. If you're upgrading from v3.2.0 or earlier, **this migration is required** and document models must be regenerated.
5
5
  :::
6
6
 
7
7
  ## Overview
8
8
 
9
- Version 3.3.0 introduced a significant refactor of the `PHDocument` structure that consolidates document metadata into a `header` field. This change enables signed and unsigned documents with cryptographic verification capabilities, but requires updating all code that accesses document properties.
9
+ Version 4.0.0 introduced a significant refactor of the `PHDocument` structure that consolidates document metadata into a `header` field. This change enables signed and unsigned documents with cryptographic verification capabilities, but requires updating all code that accesses document properties.
10
10
 
11
11
  ## What Changed
12
12
 
@@ -28,7 +28,7 @@ const document = {
28
28
  }
29
29
  ```
30
30
 
31
- **After (v3.3.0):**
31
+ **After (v4.0.0):**
32
32
  ```javascript
33
33
  const document = {
34
34
  header: {
@@ -328,11 +328,10 @@ describe('Document Migration', () => {
328
328
 
329
329
  ## Related Documentation
330
330
 
331
- - [PHDocument Architecture](../05-Architecture/PHDocument.md)
332
- - [Document Model Creation](../02-MasteryTrack/DocumentModelCreation/WhatIsADocumentModel.md)
333
- - [GraphQL API Reference](./02-ReactorAPI.md)
334
- - [React Hooks](./01-ReactHooks.md)
331
+ - [PHDocument Architecture](/academy/Architecture/PowerhouseArchitecture)
332
+ - [Document Model Creation](/academy/MasteryTrack/DocumentModelCreation/WhatIsADocumentModel)
333
+ - [React Hooks](/academy/APIReferences/ReactHooks)
335
334
 
336
335
  ---
337
336
 
338
- *This migration guide covers the major changes in v3.3.0. For additional technical details, refer to the [RELEASE-NOTES.md](https://github.com/powerhouse-dao/powerhouse/blob/main/RELEASE-NOTES.md) in the main repository.*
337
+ *This migration guide covers the major changes in v4.0.0. For additional technical details, refer to the [RELEASE-NOTES.md](https://github.com/powerhouse-dao/powerhouse/blob/main/RELEASE-NOTES.md) in the main repository.*
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "@powerhousedao/academy",
3
- "version": "3.3.0-dev.14",
3
+ "version": "3.3.0-dev.16",
4
4
  "homepage": "https://powerhouse.academy",
5
5
  "repository": {
6
6
  "type": "git",
@@ -1,4 +0,0 @@
1
- ## Start the Reactor & Connect
2
-
3
- ## Use the analytics modal / Explain the modal
4
-