@powerhousedao/academy 3.2.0-dev.2 → 3.2.0-dev.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (62)
  1. package/CHANGELOG.md +10 -0
  2. package/docs/academy/01-GetStarted/00-ExploreDemoPackage.mdx +188 -0
  3. package/docs/academy/01-GetStarted/01-CreateNewPowerhouseProject.md +10 -12
  4. package/docs/academy/01-GetStarted/02-DefineToDoListDocumentModel.md +8 -8
  5. package/docs/academy/01-GetStarted/03-ImplementOperationReducers.md +11 -11
  6. package/docs/academy/01-GetStarted/04-BuildToDoListEditor.md +13 -13
  7. package/docs/academy/01-GetStarted/_04-BuildToDoListEditor +12 -12
  8. package/docs/academy/01-GetStarted/home.mdx +50 -51
  9. package/docs/academy/01-GetStarted/images/Connect.png +0 -0
  10. package/docs/academy/01-GetStarted/images/Packagemanager.png +0 -0
  11. package/docs/academy/01-GetStarted/images/TodoDriveApp.png +0 -0
  12. package/docs/academy/01-GetStarted/styles.module.css +7 -14
  13. package/docs/academy/02-MasteryTrack/01-BuilderEnvironment/02-StandardDocumentModelWorkflow.md +22 -22
  14. package/docs/academy/02-MasteryTrack/01-BuilderEnvironment/03-BuilderTools.md +17 -17
  15. package/docs/academy/02-MasteryTrack/01-BuilderEnvironment/_category_.json +1 -1
  16. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/01-WhatIsADocumentModel.md +15 -15
  17. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/02-SpecifyTheStateSchema.md +11 -9
  18. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/03-SpecifyDocumentOperations.md +15 -15
  19. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/04-UseTheDocumentModelGenerator.md +9 -9
  20. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/05-ImplementDocumentReducers.md +15 -15
  21. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/06-ImplementDocumentModelTests.md +14 -14
  22. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/07-ExampleToDoListRepository.md +4 -4
  23. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/01-BuildingDocumentEditors.md +28 -30
  24. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/02-ConfiguringDrives.md +7 -7
  25. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/03-BuildingADriveExplorer.md +9 -10
  26. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/07-DocumentTools/01-OperationHistory.md +11 -11
  27. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/07-DocumentTools/02-RevisionHistoryTimeline.md +6 -6
  28. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/08-Authorization/01-RenownAuthenticationFlow.md +8 -8
  29. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/08-Authorization/02-Authorization.md +8 -8
  30. package/docs/academy/02-MasteryTrack/04-WorkWithData/01-ReadingAndWritingThroughTheAPI.mdx +5 -5
  31. package/docs/academy/02-MasteryTrack/04-WorkWithData/02-GraphQLAtPowerhouse.md +3 -3
  32. package/docs/academy/02-MasteryTrack/04-WorkWithData/03-WorkingWithSubgraphs/02-GraphQLAndSubgraphs.mdx +8 -8
  33. package/docs/academy/02-MasteryTrack/04-WorkWithData/03-WorkingWithSubgraphs/03-WorkingWithSubgraphs.md +28 -28
  34. package/docs/academy/02-MasteryTrack/04-WorkWithData/04-analytics-processor.md +4 -4
  35. package/docs/academy/02-MasteryTrack/04-WorkWithData/05-AnalyticsProcessorTutorial/01-SetupBuilderEnvironment.md +14 -14
  36. package/docs/academy/02-MasteryTrack/04-WorkWithData/05-AnalyticsProcessorTutorial/02-CreateNewPowerhouseProject.md +2 -2
  37. package/docs/academy/02-MasteryTrack/04-WorkWithData/05-AnalyticsProcessorTutorial/03-GenerateAnAnalyticsProcessor.md +6 -6
  38. package/docs/academy/02-MasteryTrack/04-WorkWithData/05-AnalyticsProcessorTutorial/04-UpdateAnalyticsProcessor.md +1 -1
  39. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/GraphQL References/QueryingADocumentWithGraphQL.md +2 -2
  40. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/best-practices.md +4 -4
  41. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/graphql/index.md +7 -7
  42. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/graphql/integration.md +1 -1
  43. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/intro.md +6 -6
  44. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/browser.md +1 -1
  45. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/index.md +5 -5
  46. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/memory.md +1 -1
  47. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/pg.md +2 -2
  48. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/schema.md +1 -1
  49. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/utilities.md +1 -1
  50. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/use-cases/index.md +1 -1
  51. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/use-cases/maker.md +12 -12
  52. package/docs/academy/02-MasteryTrack/05-Launch/01-IntroductionToPackages.md +9 -9
  53. package/docs/academy/02-MasteryTrack/05-Launch/02-PublishYourProject.md +8 -8
  54. package/docs/academy/02-MasteryTrack/05-Launch/03-SetupEnvironment.md +35 -35
  55. package/docs/academy/02-MasteryTrack/05-Launch/04-ConfigureEnvironment.md +8 -8
  56. package/docs/academy/02-MasteryTrack/_category_.json +1 -1
  57. package/docs/academy/03-ExampleUsecases/Chatroom/03-DefineChatroomDocumentModel.md +5 -5
  58. package/docs/academy/07-Cookbook.md +105 -105
  59. package/package.json +1 -1
  60. package/sidebars.ts +9 -10
  61. package/src/css/custom.css +18 -0
  62. package/docs/academy/01-GetStarted/00-ExploreDemoPackage.md +0 -88
@@ -2,7 +2,7 @@
  sidebar_position: 3
  ---
 
- # Postgres Store
+ # Postgres store
 
  The `PostgresAnalyticsStore` is an `IAnalyticsStore` implementation that leverages a Postgres database. It requires some APIs that do not run in a browser, and is intended for server-side applications.
 
@@ -54,7 +54,7 @@ const store = new PostgresAnalyticsStore({
 
  For more details on these optional constructor parameters, see the [Utilities](#utilities) section.
 
- ## Raw Queries
+ ## Raw queries
 
  Though there is no method on `IAnalyticsStore` for running arbitrary queries, the `PostgresAnalyticsStore` implementation provides a `raw(sql: string)` method. This is used only in development, testing, and [benchmarking](https://github.com/powerhouse-inc/analytics-engine/blob/main/benchmarks/src/wasm.ts) situations and is not intended for production use cases.
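To keep such an escape hatch out of production code paths, a caller-side guard along these lines can help. The `RawCapableStore` interface and `devOnlyRaw` helper below are illustrative assumptions, not the package's actual API:

```typescript
// Hypothetical guard around a raw(sql) escape hatch. Interface and helper
// names are assumptions for illustration, not the real analytics-engine API.
interface RawCapableStore {
  raw(sql: string): Promise<unknown[]>;
}

// In real code, pass something like process.env.NODE_ENV as `env`.
function devOnlyRaw(store: RawCapableStore, sql: string, env: string): Promise<unknown[]> {
  if (env === "production") {
    // Fail loudly rather than let ad-hoc SQL reach a production database.
    throw new Error("raw() is restricted to development, testing, and benchmarking");
  }
  return store.raw(sql);
}
```

The guard is deliberately strict: it throws instead of silently skipping the query, so misuse surfaces immediately in tests.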
@@ -1,4 +1,4 @@
- # Database Schema
+ # Database schema
 
  ![untitled](../images/dbs.png)
  *Database table structures for the analytics engine.*
@@ -78,7 +78,7 @@ const result = profiler.recordSync("computeSync", () => {
  });
  ```
 
- ### Name Stack
+ ### Name stack
 
  Often, it is useful to group metrics together. This is accomplished through a metric naming stack, which takes the form: `[Namespace].[... Stack Values].[Metric Name]`. This allows systems to pass down a profiler instance, and compose results.
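The composition rule above can be sketched in a few lines; the class name and methods here are assumptions for illustration, not the profiler's actual interface:

```typescript
// Minimal sketch of the described naming stack:
// names compose as [Namespace].[...Stack Values].[Metric Name].
class MetricNameStack {
  private stack: string[] = [];

  constructor(private readonly namespace: string) {}

  push(value: string): this {
    this.stack.push(value);
    return this;
  }

  pop(): this {
    this.stack.pop();
    return this;
  }

  // Compose the fully qualified metric name from namespace, stack, and metric.
  name(metric: string): string {
    return [this.namespace, ...this.stack, metric].join(".");
  }
}
```

A subsystem that receives such an object can push its own segment before recording, so nested components produce names under a shared prefix without coordinating explicitly.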
@@ -2,6 +2,6 @@
  sidebar_position: 4
  ---
 
- # Use Cases
+ # Use cases
 
  This section covers various real-world use cases.
@@ -1,4 +1,4 @@
- # Sky Dashboard
+ # Sky dashboard
 
  All consumable data for the Sky ecosystem is available through GraphQL API, which allows you to only fetch the data you require.
 
@@ -6,7 +6,7 @@ The API is available in two environments, a developer environment (DEV) and a pr
 
  Both environments are available through Apollo Studio Explorer where the user can find the analytics query and start fetching information from the analytics engine.
 
- ### Available Metrics
+ ### Available metrics
 
  To see which metrics are available to fetch analytics data, use the below query:
 
@@ -87,7 +87,7 @@ The above query will output the following list, which is described in the dropdo
 
  </details>
 
- ### Available Dimensions
+ ### Available dimensions
 
  To see the available dimensions use the query below:
 
@@ -292,7 +292,7 @@ This query will return budget data in MKR for the specified time frame and dimen
 
  By appropriately specifying the currency in your queries, you can tailor the financial analysis to suit the specific needs of your investigation, whether it be transactional stability with DAI or governance dynamics with MKR.
 
- ### Examples of Filter Configurations
+ ### Examples of filter configurations
 
  1. Analyzing MKR Budgets for a Specific Period:
 
@@ -341,7 +341,7 @@ This filter setup is designed to provide an annual overview of actuals and forec
 
  These examples demonstrate how to construct a complete filter object for different types of financial analysis. By properly defining each parameter, users can ensure accurate and relevant data retrieval from the analytics engine.
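As a rough illustration of assembling such a filter object before passing it as a GraphQL variable — the field names mirror the examples in this guide, but the exact schema is an assumption:

```typescript
// Illustrative AnalyticsFilter shape; field names follow the guide's examples
// and are assumptions about the exact schema, not a definitive contract.
interface AnalyticsFilterSketch {
  start: string;
  end: string;
  granularity: string;
  metrics: string[];
  currency: string;
  dimensions: { name: string; select: string; lod: number }[];
}

// Build an annual MKR budget filter for a given year (hypothetical helper).
function annualMkrBudgetFilter(year: number): AnalyticsFilterSketch {
  return {
    start: `${year}-01-01`,
    end: `${year + 1}-01-01`,
    granularity: "annual",
    metrics: ["Actuals", "Budget", "Forecast"],
    currency: "MKR",
    dimensions: [{ name: "budget", select: "atlas", lod: 2 }],
  };
}
```

Centralizing filter construction in a small helper like this keeps date arithmetic and defaults in one place rather than scattered across query call sites.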
 
- ### Multicurrency Series
+ ### Multicurrency series
 
  #### Overview
 
@@ -394,7 +394,7 @@ In the future, there could become more metrics available for different purposes,
 
  4. Data Retrieval: The converted data is then retrieved as part of the analytics query, allowing users to analyze financials directly in the currency of their choice.
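Reduced to its essence, the conversion step in this flow is a per-value scaling by an exchange rate. The sketch below is an assumption about the mechanics, not the engine's actual implementation:

```typescript
// Assumed mechanics of the conversion step: each value in a series is
// scaled by an exchange rate into the target currency.
function convertSeries(values: number[], rate: number): number[] {
  return values.map((v) => v * rate);
}
```

In practice the engine would look the rate up per period rather than take a single constant, but the shape of the transformation is the same.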
- #### Query Structure
+ #### Query structure
 
  The query structure for fetching multicurrency data is similar to the standard analytics query, with the addition of the `multicurrencySeries` field and relevant subfields for specifying the conversion criteria.
 
@@ -447,13 +447,13 @@ query MultiCurrencySeries($filter: MultiCurrencyConversions) {
 
  ```
 
- **Example Use Case**
+ #### Example use case
 
  - **Budget Analysis in Different Currencies**: A user can analyze the MakerDAO budget in both MKR and DAI, viewing how the budget figures translate between the governance token and the stablecoin. This can be particularly useful for reports that need to present financial data in a currency that is more widely used or understood by the audience.
 
- #### Writing Queries
+ #### Writing queries
 
- Writing queries in the analytics engine involves constructing structured requests to retrieve specific data. Heres a step-by-step guide to help you write a basic query.
+ Writing queries in the analytics engine involves constructing structured requests to retrieve specific data. Here's a step-by-step guide to help you write a basic query.
 
  **Step 1: Understand Your Data Requirements**
 
@@ -465,7 +465,7 @@ Access the analytics engine through a GraphQL interface, like Apollo Studio or a
 
  **Step 3: Start with the Query Structure**
 
- Every query begins with the keyword query, followed by an optional name, and the main query field. In our case, its `Analytics`:
+ Every query begins with the keyword query, followed by an optional name, and the main query field. In our case, it's `Analytics`:
 
  ```graphql
  query MyAnalyticsQuery {
@@ -535,7 +535,7 @@ query MyAnalyticsQuery($filter: AnalyticsFilter) {
 
  **Step 6: Execute the Query**
 
- Once you have constructed the query, execute it in your GraphQL interface. Ensure that the filter object is passed correctly as a variable. In most interfaces, theres a separate section or panel where you can define these variables.
+ Once you have constructed the query, execute it in your GraphQL interface. Ensure that the filter object is passed correctly as a variable. In most interfaces, there's a separate section or panel where you can define these variables.
 
  **Step 7: Review the Results**
 
@@ -547,7 +547,7 @@ Query writing is often an iterative process. Based on the initial results, you m
 
  By following these steps, you can effectively write and execute queries in the analytics engine, tailoring the data retrieval to your specific analytical needs.
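Outside an interactive IDE, "execute the query" amounts to an HTTP POST with a JSON body carrying the query string and its variables. This sketch shows the request shape; the endpoint and exact schema are placeholders:

```typescript
// Hedged sketch of a GraphQL request body; pass the result to
// fetch(endpoint, request) in any environment with fetch available.
function buildGraphQLRequest(query: string, variables: Record<string, unknown>) {
  return {
    method: "POST" as const,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, variables }),
  };
}

const request = buildGraphQLRequest(
  "query MyAnalyticsQuery($filter: AnalyticsFilter) { analytics { series(filter: $filter) { start end } } }",
  { filter: { granularity: "total" } }
);
```

Passing the filter as a variable (rather than interpolating it into the query string) matches how GraphQL IDEs handle the separate variables panel described above.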
- ### Query Examples
+ ### Query examples
 
  Example 1: Fetching Total MakerDAO Actual Expenses and Budget for 2023
 
@@ -1,10 +1,10 @@
- # Intro to Packages
+ # Introduction to packages
 
  Packages of document models are a core structuring mechanism in the Powerhouse framework, allowing developers to group and manage related document models efficiently. These packages serve as modular collections of document definitions, ensuring consistency, scalability, and reusability across different applications.
 
  By packaging document models together, developers can create well-organized, interoperable sets of data structures that capture the specific needs of various operational processes. Whether used for financial operations, governance, or contributor management, these packages streamline development and integration within Powerhouse's decentralized framework.
 
- ### Key Features of Packages
+ ### Key features of packages
 
  - **Modular Structure** – Packages encapsulate related document models, making it easier to manage and deploy them as a cohesive unit.
  - **Standardized Definitions** – Each document model within a package follows Powerhouse's structured schema approach, ensuring consistency in data representation.
@@ -12,7 +12,7 @@ By packaging document models together, developers can create well-organized, int
  - **Dependency Management** – Developers can define dependencies between document models, ensuring proper relationships and data flows between interconnected components.
  - **Automated Schema Evolution** – Versioning mechanisms allow document models within a package to evolve over time without breaking existing functionality.
 
- ### Example Use Cases of Packages
+ ### Example use cases of packages
 
  - **Finance Package** – A set of document models handling invoices, payments, budgets, and financial reporting. link
  - **Contributor Billing Package** – Defines document models for tracking work, invoicing, and facilitating payments (in both fiat and crypto) for contributors in decentralized organizations. link
@@ -33,9 +33,9 @@ Within a package, you'll find **several key modules**, each serving a distinct r
 
  ---
 
- ## **Modules of a Powerhouse Package**
+ ## Modules of a Powerhouse package
 
- ### **1. Document Model**
+ ### 1. Document model
 
  The **core component** of any package, defining how data is structured and manipulated.
 
@@ -43,14 +43,14 @@ The **core component** of any package, defining how data is structured and manip
  - Serves as the foundation for decentralized workflows.
  - **Example:** An **Invoice** document model might define fields like `issuer`, `recipient`, `amount`, and operations like `ADD_LINE_ITEM` or `MARK_AS_PAID`.
 
- ### **2. Document Model Editor**
+ ### 2. Document model editor
 
  A **UI component** that allows users to **interact with document models visually**.
 
  - Enables users to **create, modify, and manage** documents without coding.
  - Supports multiple editors for a single document model, offering different **UI experiences** tailored to specific roles.
 
- ### **3. Scripts**
+ ### 3. Scripts
 
  **Automated actions** that run on demand or on a schedule.
 
@@ -58,7 +58,7 @@ A **UI component** that allows users to **interact with document models visually
  - Not continuously running—activated manually or at predefined intervals.
  - **Example:** A script that **generates a weekly financial report** from invoice data.
 
- ### **4. Processors**
+ ### 4. Processors
 
  **Event-driven background tasks** that react to document changes.
 
@@ -66,7 +66,7 @@ A **UI component** that allows users to **interact with document models visually
  - Supports real-time **data updates, analytics, and workflow automation**.
  - **Example:** A processor that **monitors new invoices** and updates an **analytics dashboard** in real-time.
 
- ### **5. Drive-Apps**
+ ### 5. Drive-apps
 
  **Custom interfaces** that provide enhanced ways to interact with a package's document models.
 
@@ -1,4 +1,4 @@
- # Publish Your Package
+ # Publish your package
 
  This tutorial is a step by step guide tackling the following topics:
  1. the process of **building a powerhouse project**
@@ -116,7 +116,7 @@ These include:
  - `scripts`: A folder containing the scripts you might use.
  - `tests`: A folder containing your unit tests.
 
- ### 1.2. Adding Document Models, editors and unit tests
+ ### 1.2. Adding document models, editors and unit tests
 
  Now that you've set up your directory.
  Go ahead and add the document models you'd like to add by going through the standard document model building flow:
@@ -203,11 +203,11 @@ If you're publishing a package under a scope (like @your-org/my-package), you mi
  }
  ```
 
- ### 2.1 Versioning, Tagging, and Publishing Your Package
+ ### 2.1 Versioning, tagging, and publishing your package
 
  Before publishing, it's crucial to version your package correctly and tag the release in your Git repository. This helps track changes and allows users to depend on specific versions.
 
- **1. Versioning with PNPM**
+ #### 1. Versioning with pnpm
 
  Use the `pnpm version` command to update your package version according to semantic versioning rules (`patch` for bugfixes, `minor` for new features, `major` for breaking changes). This command will:
  - Update the `version` in your `package.json`.
@@ -226,7 +226,7 @@ pnpm version major
  ```
  Take note of the new version tag created (e.g., `v1.0.1`), as you'll need it in the next step.
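The effect of `pnpm version <type>` on the version field can be modeled in a few lines. This is a simplified sketch: real semver also handles pre-release identifiers and build metadata, which this does not.

```typescript
// Simplified model of the semver bump that `pnpm version <type>` performs
// on package.json's "version" field (pre-release tags not handled).
function bump(version: string, type: "patch" | "minor" | "major"): string {
  const [major, minor, patch] = version.split(".").map(Number);
  if (type === "major") return `${major + 1}.0.0`;
  if (type === "minor") return `${major}.${minor + 1}.0`;
  return `${major}.${minor}.${patch + 1}`;
}
```

Note how lower components reset to zero on a higher-level bump; this is why consumers can rely on caret ranges like `^1.2.0` to exclude breaking releases.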
- **2. Pushing Changes to Git**
+ #### 2. Pushing changes to Git
 
  Next, push your commits and the new version tag to your remote Git repository:
 
@@ -246,7 +246,7 @@ Alternatively, to push all new local tags (use with caution):
  # git push --tags
  ```
 
- **3. Understanding Git Tags vs. NPM Distributor Tags**
+ #### 3. Understanding Git tags vs. npm distributor tags
 
  It's important to distinguish between Git tags and NPM distributor tags (dist-tags):
 
@@ -256,7 +256,7 @@ It's important to distinguish between Git tags and NPM distributor tags (dist-ta
  - `beta`, `next`, `alpha`: Often used for pre-release versions.
  When you publish a package without specifying an NPM tag, it usually gets the `latest` tag by default.
 
- **4. Publishing to NPM**
+ #### 4. Publishing to npm
 
  Now you are ready to publish your package to the NPM registry. Ensure you are logged into NPM (the `npm login` command shown in previous steps should be used, or `pnpm login` which is an alias).
 
@@ -279,7 +279,7 @@ This is useful for testing releases before making them `latest`.
 
  Now let's verify that the package(s) get published in the package repository, next to pre-existing packages that you might have been publishing before.
 
- ## 3. Deploying the host apps & project.
+ ## 3. Deploying the host apps and project
  Now that we've installed all the necessary services on our server instance, we can start deploying the host apps & our packaged project from npm.
 
  Install your project package we've published earlier on your local connect (`ph connect`) instance by running the following command:
@@ -1,4 +1,4 @@
- # Environment Setup Guide
+ # Environment setup guide
 
  ## Introduction
  Powerhouse is a powerful platform that helps you manage and deploy your applications efficiently.
@@ -19,16 +19,16 @@ This tutorial will guide you through the process of creating a new virtual priva
 
  **Current Date:** May 15, 2024
 
- ## Part 1: Setting Up Your DigitalOcean Droplet
+ ## Part 1: Setting up your DigitalOcean droplet
 
  A Droplet is a scalable virtual machine that you can configure to host your websites, applications, or other services.
 
- ### Step 1: Sign Up or Log In to DigitalOcean
+ ### Step 1: Sign up or log in to DigitalOcean
 
  - If you don't have an account, go to [digitalocean.com](https://digitalocean.com) and sign up. You'll likely need to provide payment information.
  - If you already have an account, log in.
 
- ### Step 2: Create a New Droplet
+ ### Step 2: Create a new droplet
 
  1. From your DigitalOcean dashboard, click the green "Create" button in the top right corner and select "Droplets".
 
@@ -62,7 +62,7 @@ A Droplet is a scalable virtual machine that you can configure to host your webs
  - Select Project: Assign the Droplet to a project.
  - Review your selections and click the "Create Droplet" button at the bottom.
 
- ### Step 3: Access Your Droplet
+ ### Step 3: Access your droplet
 
  It will take a minute or two for your Droplet to be provisioned. Once it's ready, its IP address will be displayed in your Droplets list.
 
@@ -86,9 +86,9 @@ To log in via SSH:
 
  Now your Droplet is running! Now you can continue with the Powerhouse tutorial or any next steps.
 
- ### DNS Configuration
+ ### DNS configuration
 
- #### Option A: Using DigitalOcean's Nameservers (Recommended)
+ #### Option A: Using DigitalOcean's nameservers (recommended)
 
  1. **Add Your Domain to DigitalOcean:**
  - Go to "Networking" → "Domains"
@@ -129,7 +129,7 @@ Now your Droplet is running! Now you can continue with the Powerhouse tutorial o
  - **WILL DIRECT TO:** Your Droplet's IP
  - **TTL:** 3600
 
- #### Option B: Using Your Existing Nameservers (NS locked)
+ #### Option B: Using your existing nameservers (NS locked)
 
  1. **Just Create DNS Records at Your Registrar:**
  - **Root Domain (A Record):**
@@ -158,7 +158,7 @@ Now your Droplet is running! Now you can continue with the Powerhouse tutorial o
 
  **Note:** DNS changes may take up to 48 hours to propagate globally.
 
- ### Verify Configuration
+ ### Verify configuration
 
  1. Use DNS lookup tools to verify your records:
  ```bash
@@ -202,9 +202,9 @@ This tutorial will guide you through the process of assigning a static IP (Elast
  :::
 
 
- ## Part 1: Assigning a Static IP to EC2 Instance
+ ## Part 1: Assigning a static IP to EC2 instance
 
- ### Step 1: Allocate Elastic IP
+ ### Step 1: Allocate elastic IP
 
  1. Navigate to the EC2 service in the AWS console
  2. Choose "Elastic IPs" from the navigation pane
@@ -212,7 +212,7 @@ This tutorial will guide you through the process of assigning a static IP (Elast
  4. Select the VPC where your EC2 instance is located
  5. Click "Allocate"
 
- ### Step 2: Associate Elastic IP
+ ### Step 2: Associate elastic IP
 
  1. Go back to the EC2 console and select your instance
  2. From the "Networking" tab, expand "Network interfaces"
@@ -221,9 +221,9 @@ This tutorial will guide you through the process of assigning a static IP (Elast
  5. Choose "Actions", then "Manage IP Addresses"
  6. Find the Elastic IP you allocated and click "Associate"
 
- ## Part 2: DNS Configuration
+ ## Part 2: DNS configuration
 
- ### Option A: Using AWS Route 53 (Recommended)
+ ### Option A: Using AWS Route 53 (recommended)
 
  1. **Add Your Domain to Route 53:**
  - Go to Route 53 → "Hosted zones"
@@ -267,7 +267,7 @@ This tutorial will guide you through the process of assigning a static IP (Elast
  - **VALUE:** Your Elastic IP
  - **TTL:** 3600
 
- ### Option B: Using Your Existing Nameservers
+ ### Option B: Using your existing nameservers
 
  1. **Create DNS Records at Your Registrar:**
  - **Root Domain (A Record):**
@@ -299,7 +299,7 @@ This tutorial will guide you through the process of assigning a static IP (Elast
  - Point them to your EC2 instance's public IP address
  - Wait for DNS propagation before requesting SSL certificates
 
- ### Verify Configuration
+ ### Verify configuration
 
  1. Use DNS lookup tools to verify your records:
  ```bash
@@ -343,7 +343,7 @@ The `install` script provides a streamlined way to install the Powerhouse CLI to
 
  Up next is the configuration of your services.
 
- ### Service Configuration
+ ### Service configuration
 
  Next, run
  ```bash
@@ -362,26 +362,26 @@ PM2 is configured to automatically restart services if they crash and to start t
 
  The setup command will prompt you for the following information:
 
- #### Package Installation
+ #### Package installation
  During this phase, you can enter package names that you want to install. For example, you might want to `ph install @powerhousedao/todo-demo-package` or other Powerhouse packages. This step is crucial for adding the specific functionality you need. You can also press Enter to skip this step and install packages later using the `ph install` command.
 
- #### Database Configuration
+ #### Database configuration
  The script offers two options for database configuration:
  * **Option 1: Local Database** Sets up a local PostgreSQL database, which is ideal for development or small deployments. It automatically creates a database user with a secure random password and configures the database to accept local connections. This option is perfect for getting started quickly.
  * **Option 2: Remote Database** Allows you to connect to a remote PostgreSQL database by providing a connection URL in the format `postgres://user:password@host:port/db`. This is recommended for production environments.
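That connection-string shape decomposes cleanly with the standard WHATWG URL API, which can be handy when validating a URL before handing it to the setup script. The credentials and host below are placeholders:

```typescript
// Decompose the documented postgres:// connection-string shape using the
// standard URL API (placeholder credentials, not a real database).
const conn = new URL("postgres://user:secret@db.example.com:5432/powerhouse");

const parts = {
  user: conn.username,
  host: conn.hostname,
  port: Number(conn.port),
  database: conn.pathname.slice(1), // strip the leading "/"
};
```

Checking each component up front gives clearer errors than letting a malformed URL fail deep inside the database driver.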
- #### SSL Configuration
+ #### SSL configuration
  For SSL configuration, you have two choices:
  * **Option 1: Let's Encrypt (Recommended for Production)** This option requires you to provide a base domain (e.g., `powerhouse.xyz`) and subdomains for your services. The script will automatically obtain and configure SSL certificates for your domains.
  * **Option 2: Self-signed Certificate** This is suitable for development or testing. It uses your machine's hostname and generates a self-signed certificate. Browsers will show security warnings with this option.
 
- #### Domain Setup
+ #### Domain setup
  You will be asked to enter your `connect` and `switchboard` subdomains to complete the setup. If you need more information, revisit the cloud provider setup sections at the beginning of this guide.
 
- #### Security Features
+ #### Security features
  Security is a top priority. The script implements automatic SSL certificate management, generates secure database passwords, configures security headers in Nginx, and sets up proper proxy settings to support WebSocket connections securely.
 
- ## 2. Verifying the Setup
+ ## 2. Verifying the setup
 
  After the installation is complete, it's important to verify that everything is working correctly. You can check the status of your services using PM2, verify the Nginx configuration, and ensure your SSL certificates are properly installed. This step is crucial for identifying any potential issues before they affect your users.
 
@@ -406,16 +406,16 @@ sudo nginx -t
  sudo certbot certificates # if using Let's Encrypt
  ```
 
- ## 3. Accessing the Services
+ ## 3. Accessing the services
 
  Once everything is set up, you can access your services through the configured domains.
  If you chose Let's Encrypt, your services will be available at their respective subdomains. With a self-signed certificate, you'll access the services through your machine's hostname with the appropriate base paths. The services are configured to use HTTPS by default, ensuring secure communication.
 
- ### With Let's Encrypt:
+ ### With Let's Encrypt
  - Connect: `https://connect.yourdomain.com`
  - Switchboard: `https://switchboard.yourdomain.com`
 
- ### With Self-signed Certificate:
+ ### With self-signed certificate
  - Connect: `https://your-hostname/connect`
  - Switchboard: `https://your-hostname/switchboard`
 
@@ -426,7 +426,7 @@ When issues arise, there are several common problems you might encounter.
  - Nginx configuration errors can be investigated through the error logs, and service issues can be diagnosed using PM2 logs.
  - SSL certificate problems often relate to DNS settings or certificate paths. Understanding these common issues and their solutions will help you maintain a stable Powerhouse installation.
 
- ### Common Issues:
+ ### Common issues
  1. **"`ph`: command not found"**
  - Run `source ~/.bashrc` or restart your terminal
  - Verify that the `PNPM_HOME` environment variable is set correctly
@@ -451,22 +451,22 @@ When issues arise, there are several common problems you might encounter.
 
  Regular maintenance is crucial for keeping your Powerhouse installation running smoothly. You can update services using the Powerhouse CLI, restart services through PM2, and monitor logs to ensure everything is functioning correctly. Regular maintenance helps prevent issues and ensures that your services are running with the latest security patches and features.
 
- ### Updating Services:
+ ### Updating services
  ```bash
  ph update <package-name>
  ```
 
- ### Restarting Services:
+ ### Restarting services
  ```bash
  ph service restart
  ```
 
- ### Checking Service Status and Logs:
+ ### Checking service status and logs
  ```bash
  ph service status
  ```
 
- ## 6. Security Notes
+ ## 6. Security notes
 
  Maintaining security is an ongoing process. It's essential to keep your database credentials secure and regularly update your SSL certificates. Regular monitoring of system logs helps identify potential security issues, and keeping your system and packages updated ensures you have the latest security patches. Consider implementing additional security measures such as firewall rules, intrusion detection systems, and regular security audits.
 
@@ -474,17 +474,17 @@ Maintaining security is an ongoing process. It's essential to keep your database

  Regular backups are crucial for data safety. The database can be backed up using pg_dump, and your configuration files can be archived using tar. These backups should be stored securely and tested regularly to ensure they can be restored if needed. Consider implementing an automated backup schedule and storing backups in multiple locations for redundancy.

- ### Database Backup:
+ ### Database backup
  ```bash
  pg_dump -U powerhouse -d powerhouse > backup.sql
  ```
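
The automated backup schedule mentioned above could be sketched as crontab entries. This is only an illustration: the backup directory and times are hypothetical placeholders, not part of the official guide.

```bash
# Hypothetical crontab entries (edit with `crontab -e`).
# /var/backups/ and the schedule times are assumptions -- adjust for your server.
# Note: % must be escaped as \% inside a crontab line.

# Nightly database dump at 02:00
0 2 * * * pg_dump -U powerhouse -d powerhouse > /var/backups/powerhouse-$(date +\%F).sql
```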

- ### Configuration Backup:
+ ### Configuration backup
  ```bash
  sudo tar -czf powerhouse-config.tar.gz /etc/powerhouse/
  ```
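
The text above stresses that backups should be tested regularly to ensure they can be restored. A minimal, self-contained sketch of such a restore check, using scratch directories as stand-ins for the real `/etc/powerhouse/` path so it can run anywhere (all paths here are placeholders):

```bash
# Round-trip test for a config archive: back up a scratch dir, restore it,
# and compare. Temp dirs stand in for the real /etc/powerhouse/.
set -e
SRC=$(mktemp -d)
DEST=$(mktemp -d)
RESTORE=$(mktemp -d)

echo 'EXAMPLE_SETTING=true' > "$SRC/example.conf"        # stand-in config file

tar -czf "$DEST/powerhouse-config.tar.gz" -C "$SRC" .    # "backup"
tar -xzf "$DEST/powerhouse-config.tar.gz" -C "$RESTORE"  # "restore"

cmp "$SRC/example.conf" "$RESTORE/example.conf" && echo "backup verified"
```

The same pattern applies to the database dump: restore it into a throwaway database and compare row counts before trusting the backup.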

- ## 8. Best Practices
+ ## 8. Best practices

  To get the most out of your Powerhouse installation, follow these best practices:

@@ -494,7 +494,7 @@ To get the most out of your Powerhouse installation, follow these best practices
  4. **Testing**: Test your backup and restore procedures regularly.
  5. **Security**: Regularly review and update your security measures.

- ## 9. Getting Help
+ ## 9. Getting help

  If you encounter issues or need assistance, there are several resources available:

@@ -1,4 +1,4 @@
- # Configure Your Environment
+ # Configure your environment

  After successfully setting up your server and installing the Powerhouse services using the `ph service setup` command as described in the [Setup Environment](./03-SetupEnvironment.md) guide, the next crucial step is to configure your environment. Proper configuration ensures that your Powerhouse Connect and Switchboard instances behave exactly as you need them to for your specific application.

@@ -13,11 +13,11 @@ A key principle to remember is that **environment variables will always override

  This guide will walk you through both methods and provide details on common configuration options, including setting up authorization.

- ## Using Environment Variables
+ ## Using environment variables

  The most common way to configure Powerhouse services is through environment variables. You can place these variables in a `.env` file at the root of your project directory. When you run `ph service start` or `ph service restart`, these variables are loaded into the environment of your running services.

- ### How to Create and Edit Your `.env` File
+ ### How to create and edit your .env file

  If you're on your cloud server, you can create and edit the `.env` file directly:

@@ -38,7 +38,7 @@ If you're on your cloud server, you can create and edit the `.env` file directly
  ph service restart
  ```

- ### Common Environment Variables for Connect
+ ### Common environment variables for Connect

  The Powerhouse Connect application has a wide range of available environment variables to toggle features and change its behavior. Below is a list of some variables you can configure.

@@ -102,17 +102,17 @@ SENTRY_RELEASE=""
  ```
  You can find the most up-to-date list of variables in the source repository: [https://github.com/powerhouse-inc/powerhouse/blob/main/apps/connect/.env](https://github.com/powerhouse-inc/powerhouse/blob/main/apps/connect/.env)

- ## Using a Configuration File
+ ## Using a configuration file

  For services like the Switchboard, you can also use a `powerhouse.config.json` file for more structured configuration, especially for features like authorization.

- ### Configuring Authorization
+ ### Configuring authorization

  A critical aspect of your environment configuration is setting up authorization to control who can access your services and what they can do. As detailed in our dedicated [Switchboard Authorization](/academy/MasteryTrack/BuildingUserExperiences/Authorization/Authorization) guide, you can manage access using a role-based system.

  Here's a quick overview of how you can configure authorization:

- #### Via Environment Variables
+ #### Via environment variables

  You can set the roles directly in your `.env` file. This is quick and easy for simple allowlists.

@@ -151,7 +151,7 @@ Remember, if you define `AUTH_ENABLED=false` as an environment variable, it will

  For a complete understanding of how roles (Guest, User, Admin) work and the permissions they have, please refer to the full [Authorization guide](/academy/MasteryTrack/BuildingUserExperiences/Authorization/Authorization).

- ## Applying Your Changes
+ ## Applying your changes

  Regardless of which method you use to update your configuration, the changes will not be applied until you restart your services.

@@ -1,5 +1,5 @@
  {
- "label": "Mastery Track",
+ "label": "Mastery track",
  "link": {
  "type": "generated-index",
  "description": "A set of tutorials that support the creation of more complex Powerhouse documents, drive apps and packages."
@@ -1,11 +1,11 @@
- # Define the Chatroom Document Model
+ # Define the chatroom document model

  In this tutorial, you will learn how to design your document model and export it to be used later in your Powerhouse project.
  If you don't have a document model created yet, have a look at the previous steps of this tutorial to create a new document model.

  Before you start, make sure you have the Connect application running.

- ## Chatroom Document Model Schema
+ ## Chatroom document model schema

  We use GraphQL Schema Definition Language (SDL) to define the document model schema. Below, you can see the SDL for the `ChatRoom` document model.

@@ -13,7 +13,7 @@ We use GraphQL Schema Definition Language (SDL) to define the document model sch
  This schema contains the data structure of the document model and the basic operations that can be performed on it. For more in-depth information, please visit [State Schema](/academy/MasteryTrack/DocumentModelCreation/SpecifyTheStateSchema).
  :::

- ## State Schema (See next steps)
+ ## State schema (See next steps)

  ```graphql
  # Defines a GraphQL type for the state of the chatroom document
@@ -58,7 +58,7 @@ enum ReactionType {
  }
  ```

- ## Operations Schema (See next steps)
+ ## Operations schema (See next steps)

  ```graphql
  # add_message
@@ -91,7 +91,7 @@ input EditChatDescriptionInput {
  }
  ```

- ## Define the Document Model
+ ## Define the document model

  To be able to define the document model, you need to open the Chatroom document model editor in Connect.