@powerhousedao/academy 5.0.0-staging.8 → 5.0.1-staging.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (108)
  1. package/.vscode/settings.json +1 -1
  2. package/CHANGELOG.md +404 -0
  3. package/README.md +3 -3
  4. package/babel.config.js +1 -1
  5. package/blog/BeyondCommunication-ABlueprintForDevelopment.md +25 -24
  6. package/blog/TheChallengeOfChange.md +21 -21
  7. package/docs/academy/01-GetStarted/00-ExploreDemoPackage.mdx +67 -30
  8. package/docs/academy/01-GetStarted/01-CreateNewPowerhouseProject.md +38 -21
  9. package/docs/academy/01-GetStarted/02-DefineToDoListDocumentModel.md +24 -19
  10. package/docs/academy/01-GetStarted/03-ImplementOperationReducers.md +44 -41
  11. package/docs/academy/01-GetStarted/04-BuildToDoListEditor.md +10 -10
  12. package/docs/academy/01-GetStarted/05-VetraStudio.md +164 -0
  13. package/docs/academy/01-GetStarted/06-ReactorMCP.md +58 -0
  14. package/docs/academy/01-GetStarted/home.mdx +185 -90
  15. package/docs/academy/01-GetStarted/images/Modules.png +0 -0
  16. package/docs/academy/01-GetStarted/images/VetraStudioDrive.png +0 -0
  17. package/docs/academy/01-GetStarted/styles.module.css +5 -5
  18. package/docs/academy/02-MasteryTrack/01-BuilderEnvironment/01-Prerequisites.md +46 -18
  19. package/docs/academy/02-MasteryTrack/01-BuilderEnvironment/02-StandardDocumentModelWorkflow.md +118 -68
  20. package/docs/academy/02-MasteryTrack/01-BuilderEnvironment/03-BuilderTools.md +75 -33
  21. package/docs/academy/02-MasteryTrack/01-BuilderEnvironment/_category_.json +6 -6
  22. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/01-WhatIsADocumentModel.md +30 -21
  23. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/02-SpecifyTheStateSchema.md +41 -37
  24. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/03-SpecifyDocumentOperations.md +29 -25
  25. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/04-UseTheDocumentModelGenerator.md +36 -37
  26. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/05-ImplementDocumentReducers.md +128 -109
  27. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/06-ImplementDocumentModelTests.md +95 -86
  28. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/07-ExampleToDoListRepository.md +7 -9
  29. package/docs/academy/02-MasteryTrack/02-DocumentModelCreation/_category_.json +6 -6
  30. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/01-BuildingDocumentEditors.md +65 -47
  31. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/02-ConfiguringDrives.md +77 -62
  32. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/03-BuildingADriveExplorer.md +360 -349
  33. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/06-DocumentTools/00-DocumentToolbar.mdx +16 -10
  34. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/06-DocumentTools/01-OperationHistory.md +10 -7
  35. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/06-DocumentTools/02-RevisionHistoryTimeline.md +25 -17
  36. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/06-DocumentTools/_category_.json +6 -6
  37. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/07-Authorization/01-RenownAuthenticationFlow.md +14 -7
  38. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/07-Authorization/02-Authorization.md +0 -1
  39. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/07-Authorization/_category_.json +5 -5
  40. package/docs/academy/02-MasteryTrack/03-BuildingUserExperiences/_category_.json +1 -1
  41. package/docs/academy/02-MasteryTrack/04-WorkWithData/01-GraphQLAtPowerhouse.md +45 -33
  42. package/docs/academy/02-MasteryTrack/04-WorkWithData/02-UsingTheAPI.mdx +61 -18
  43. package/docs/academy/02-MasteryTrack/04-WorkWithData/03-UsingSubgraphs.md +50 -54
  44. package/docs/academy/02-MasteryTrack/04-WorkWithData/04-analytics-processor.md +126 -110
  45. package/docs/academy/02-MasteryTrack/04-WorkWithData/05-RelationalDbProcessor.md +75 -45
  46. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/GraphQL References/QueryingADocumentWithGraphQL.md +23 -21
  47. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/best-practices.md +9 -9
  48. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/graphql/index.md +11 -23
  49. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/graphql/integration.md +25 -9
  50. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/intro.md +10 -10
  51. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/benchmarks.md +1 -1
  52. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/index.md +16 -11
  53. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/memory.md +6 -5
  54. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/schema.md +2 -2
  55. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/typescript/utilities.md +7 -5
  56. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/use-cases/maker.md +32 -58
  57. package/docs/academy/02-MasteryTrack/04-WorkWithData/06-Analytics Engine/use-cases/processors.md +1 -1
  58. package/docs/academy/02-MasteryTrack/04-WorkWithData/07-drive-analytics.md +105 -71
  59. package/docs/academy/02-MasteryTrack/04-WorkWithData/_ARCHIVE-AnalyticsProcessorTutorial/_01-SetupBuilderEnvironment.md +22 -0
  60. package/docs/academy/02-MasteryTrack/04-WorkWithData/_ARCHIVE-AnalyticsProcessorTutorial/_02-CreateNewPowerhouseProject.md +9 -8
  61. package/docs/academy/02-MasteryTrack/04-WorkWithData/_ARCHIVE-AnalyticsProcessorTutorial/_03-GenerateAnAnalyticsProcessor.md +28 -32
  62. package/docs/academy/02-MasteryTrack/04-WorkWithData/_ARCHIVE-AnalyticsProcessorTutorial/_04-UpdateAnalyticsProcessor.md +25 -26
  63. package/docs/academy/02-MasteryTrack/04-WorkWithData/_ARCHIVE-AnalyticsProcessorTutorial/_category_.json +1 -1
  64. package/docs/academy/02-MasteryTrack/04-WorkWithData/_category_.json +7 -7
  65. package/docs/academy/02-MasteryTrack/05-Launch/01-IntroductionToPackages.md +3 -4
  66. package/docs/academy/02-MasteryTrack/05-Launch/02-PublishYourProject.md +69 -45
  67. package/docs/academy/02-MasteryTrack/05-Launch/03-SetupEnvironment.md +70 -40
  68. package/docs/academy/02-MasteryTrack/05-Launch/04-ConfigureEnvironment.md +1 -0
  69. package/docs/academy/02-MasteryTrack/05-Launch/_category_.json +7 -7
  70. package/docs/academy/02-MasteryTrack/_category_.json +6 -6
  71. package/docs/academy/03-ExampleUsecases/Chatroom/02-CreateNewPowerhouseProject.md +5 -3
  72. package/docs/academy/03-ExampleUsecases/Chatroom/03-DefineChatroomDocumentModel.md +38 -37
  73. package/docs/academy/03-ExampleUsecases/Chatroom/04-ImplementOperationReducers.md +45 -41
  74. package/docs/academy/03-ExampleUsecases/Chatroom/05-ImplementChatroomEditor.md +14 -14
  75. package/docs/academy/03-ExampleUsecases/Chatroom/06-LaunchALocalReactor.md +6 -6
  76. package/docs/academy/03-ExampleUsecases/Chatroom/_category_.json +1 -1
  77. package/docs/academy/04-APIReferences/00-PowerhouseCLI.md +104 -43
  78. package/docs/academy/04-APIReferences/01-ReactHooks.md +177 -129
  79. package/docs/academy/04-APIReferences/04-RelationalDatabase.md +121 -113
  80. package/docs/academy/04-APIReferences/05-PHDocumentMigrationGuide.md +48 -41
  81. package/docs/academy/04-APIReferences/_category_.json +6 -6
  82. package/docs/academy/05-Architecture/00-PowerhouseArchitecture.md +1 -2
  83. package/docs/academy/05-Architecture/01-WorkingWithTheReactor.md +11 -8
  84. package/docs/academy/05-Architecture/05-DocumentModelTheory/_category_.json +1 -1
  85. package/docs/academy/05-Architecture/_category_.json +6 -6
  86. package/docs/academy/06-ComponentLibrary/00-DocumentEngineering.md +25 -23
  87. package/docs/academy/06-ComponentLibrary/02-CreateCustomScalars.md +105 -93
  88. package/docs/academy/06-ComponentLibrary/03-IntegrateIntoAReactComponent.md +1 -0
  89. package/docs/academy/06-ComponentLibrary/_category_.json +7 -7
  90. package/docs/academy/07-Cookbook.md +268 -35
  91. package/docs/academy/08-Glossary.md +7 -1
  92. package/docs/bookofpowerhouse/01-Overview.md +2 -2
  93. package/docs/bookofpowerhouse/02-GeneralFrameworkAndPhilosophy.md +1 -7
  94. package/docs/bookofpowerhouse/03-PowerhouseSoftwareArchitecture.md +10 -7
  95. package/docs/bookofpowerhouse/04-DevelopmentApproaches.md +10 -4
  96. package/docs/bookofpowerhouse/05-SNOsandANewModelForOSSandPublicGoods.md +23 -30
  97. package/docs/bookofpowerhouse/06-SNOsInActionAndPlatformEconomies.md +0 -7
  98. package/docusaurus.config.ts +64 -66
  99. package/package.json +9 -7
  100. package/scripts/generate-combined-cli-docs.ts +43 -13
  101. package/sidebars.ts +2 -0
  102. package/src/components/HomepageFeatures/index.tsx +171 -78
  103. package/src/components/HomepageFeatures/styles.module.css +1 -2
  104. package/src/css/custom.css +89 -89
  105. package/src/pages/_archive-homepage.tsx +17 -16
  106. package/src/theme/DocCardList/index.tsx +9 -8
  107. package/static.json +6 -6
  108. package/tsconfig.tsbuildinfo +1 -0
@@ -5,7 +5,7 @@
  **First**, navigate to the project directory with `cd rwa-analytics` to start the reactor from the project directory.
  Now open the directory with your code editor with the command `code . `

- Next, launch the reactor by running the following command in your terminal.
+ Next, launch the reactor by running the following command in your terminal.
  `ph reactor` . A Reactor instance is a local (or remote) node in the Powerhouse network that can hold & sychronize data. You'll see the following output in your terminal while the reactor is starting up and a new browser window opens with the Connect application. The reactor you've created will become available as a public drive in Connect's left hand side bar, named `Powerhouse`.

  ```bash
@@ -22,59 +22,55 @@ Next, launch the reactor by running the following command in your terminal.
  [Connect]: ➜ Local: http://localhost:3000/
  ```

- As you can see the reactor registered the default system subgraph but also the `d/:drive` subgraph for fetching documents and pushing updates. It also registered the analytics subgraph, which we'll use later for running analytics queries on the document model in the next steps.
+ As you can see the reactor registered the default system subgraph but also the `d/:drive` subgraph for fetching documents and pushing updates. It also registered the analytics subgraph, which we'll use later for running analytics queries on the document model in the next steps.

  ### Generate the analytics processor
+
  **Next** we'll open up a second terminal window and generate the analytics processor.
  Use the following command to generate the analytics processor in a seperate terminal window.
+
  ```bash
  npm run generate -- --processor rwa-analytics --document-types makerdao/rwa-portfolio
  ```
- In your project directory you'll now see the folder 'processors' being created.
+
+ In your project directory you'll now see the folder 'processors' being created.
  Later we'll see how this processor is actively listening for the operations on a document model.
  Eventually we'll replace the default code of the generated processor with our own custom analytics processor code to listen to specific operations, dimensions and metrics.

  ### Add data to the document model in the reactor

  Now that we've generated the analytics processor we need to give it some data to listen to.
- For this we've created the local reactor instance, which will represent a node in the Powerhouse network.
- It represents a node that is usually hosted as a remote server, but in our case we run it locally on our machine.
- The document model data will be stored in the local reactor instance and be analyzed by the analytics processor.
+ For this we've created the local reactor instance, which will represent a node in the Powerhouse network.
+ It represents a node that is usually hosted as a remote server, but in our case we run it locally on our machine.
+ The document model data will be stored in the local reactor instance and be analyzed by the analytics processor.

  Let's move on to the next step where we'll create a new RWA document model and add data to it.

  ### A brief introduction to the RWA document model

  :::info
- An RWA document model helps an arranger for Sky (previously MakerDAO) to accurately track the value of a Real World Asset (RWA) such as US treasury bills by tracking the purchase, sale and other transactions on the asset.
- In the case of Sky these real world assets are used to back the Sky/MakerDAO stablecoin.
- Let's pretend we're an arranger and we want to track the value of a new RWA asset and analyze the total value of the assetclass on a monthly basis.
- Aside from setting up an analytics processor you'll now also become familiar with how the RWA reporting workflow is captured in a document model.
+ An RWA document model helps an arranger for Sky (previously MakerDAO) to accurately track the value of a Real World Asset (RWA) such as US treasury bills by tracking the purchase, sale and other transactions on the asset.
+ In the case of Sky these real world assets are used to back the Sky/MakerDAO stablecoin.
+ Let's pretend we're an arranger and we want to track the value of a new RWA asset and analyze the total value of the assetclass on a monthly basis.
+ Aside from setting up an analytics processor you'll now also become familiar with how the RWA reporting workflow is captured in a document model.
  :::

- We'll run through the following scenario:
- 1. Add the Asset "T-Bill 4565446" with CUSIP : TB4565446, maturity 01/12/2024. So we have a new asset to track and work with in the document model.
- 2. Make a "principal draw" of $10,000,000 on May 1. We're simulating an arranger borrowing $10,000,000 from Sky to purchase the asset.
- 3. Make a purchase of 2,500,000 TB4565446 bills for $2,490,000 on May 15.
- 4. Make a purchase of 2,000,000 TB4565446 bills for $1,995,000 on June 15.
- 5. Make a sale of 4,000,000 TB4565446 bills for $4,000,000 on July 1.
- 6. Verify that the document data is correctly entered.
- 7. Inspect the document history.
+ We'll run through the following scenario: 1. Add the Asset "T-Bill 4565446" with CUSIP : TB4565446, maturity 01/12/2024. So we have a new asset to track and work with in the document model. 2. Make a "principal draw" of $10,000,000 on May 1. We're simulating an arranger borrowing $10,000,000 from Sky to purchase the asset. 3. Make a purchase of 2,500,000 TB4565446 bills for $2,490,000 on May 15. 4. Make a purchase of 2,000,000 TB4565446 bills for $1,995,000 on June 15. 5. Make a sale of 4,000,000 TB4565446 bills for $4,000,000 on July 1. 6. Verify that the document data is correctly entered. 7. Inspect the document history.

  ### Create a new RWA document model in Connect

- Under the 'New Document' section in Connect, click on the 'RWA Portfolio' document.
+ Under the 'New Document' section in Connect, click on the 'RWA Portfolio' document.
  Name your document model `Sky Rwa's` and click on 'Create'.

  ![Create a new RWA document](./images/create-a-new-RWA-document.gif)

- You'll immmediaty see that you've arrived in the portfolio editor with the option to add a new asset.
- Since this document model is a portfolio for tracking the value of different assets, we'll first need to create the asset to be able to add any transaction history to them and track the value of the assets.
+ You'll immmediaty see that you've arrived in the portfolio editor with the option to add a new asset.
+ Since this document model is a portfolio for tracking the value of different assets, we'll first need to create the asset to be able to add any transaction history to them and track the value of the assets.

- Hit the 'create asset' button. Which will offer you an overview of all the fields you need to fill in.
- Before we can add an asset, we need to create a new "Special purpose vehicle" (SPV) for the asset.
- A SPV is a legal entity that is created to hold the real world asset such as a US treasury bill.
- For the sake of easily remembering the SPV let's name it 'SPV A'.
+ Hit the 'create asset' button. Which will offer you an overview of all the fields you need to fill in.
+ Before we can add an asset, we need to create a new "Special purpose vehicle" (SPV) for the asset.
+ A SPV is a legal entity that is created to hold the real world asset such as a US treasury bill.
+ For the sake of easily remembering the SPV let's name it 'SPV A'.

  ![Create a new SPV](./images/Create-SPV.gif)

@@ -94,9 +90,9 @@ Your new asset should look like this:

  ### Add a transaction to the asset

- We're now going to add a transaction to the asset and therefor simulate an arranger borrowing $10,000,000 through a loan from Sky to purchase the asset.
+ We're now going to add a transaction to the asset and therefor simulate an arranger borrowing $10,000,000 through a loan from Sky to purchase the asset.

- To do this, move to the transaction tab and click on the 'create transaction' button below the table.
+ To do this, move to the transaction tab and click on the 'create transaction' button below the table.

  Fill in the following details to complete the transaction of $10,000,000 from Sky to the SPV, known as a principal draw.

@@ -113,7 +109,7 @@ You'll now see the transaction appear in the transaction table.

  ### Let's buy and sell some T-bills!

- Now that we've created the T-bill asset and took out a loan from Sky to purchase it, we can start buying some T-bills.
+ Now that we've created the T-bill asset and took out a loan from Sky to purchase it, we can start buying some T-bills.
  Let's buy 2,500,000 T-bills for $2,490,000 on May 15.

  Create a new transaction and fill in the following details:
@@ -136,7 +132,7 @@ Create a new transaction and fill in the following details:
  - Asset name: T-Bill 4565446 - US0045654460 - TB4565446
  - Quantity: 2,000,000
  - Asset proceeds: $1,995,000
- - Transaction reference: 0x123456789
+ - Transaction reference: 0x123456789

  Click on the 'Save' button to save the transaction.

@@ -157,7 +153,7 @@ Your table of transactions should now look like this:

  ![RWA Portfolio](./images/Transaction-table.png)

- ### Verify that the document data is correctly entered and inspect the document history
+ ### Verify that the document data is correctly entered and inspect the document history

  Connect offers you with a few ways to inspect the document data and history. Click on the 'revision history' button in the top right corner of your document model in Connect to see the history of your operations on the document model.

@@ -168,6 +164,6 @@ This allows you to inspect the document data and history in a structured way. Cu

  To inspect the document data, click on the 'Switchboard' icon in the top right corner of your document model in Connect.

- This will open a new tab with the graphql interface to the document model. If you click 'run' the base query will be executed and you'll see the document data and history. You'll find all the transactions we've added to the asset in the document model previously.
+ This will open a new tab with the graphql interface to the document model. If you click 'run' the base query will be executed and you'll see the document data and history. You'll find all the transactions we've added to the asset in the document model previously.

- Now let's move on to the next step where we'll update the analytics processor code to listen to the document model and calculate the total value of the asset class on a monthly basis.
+ Now let's move on to the next step where we'll update the analytics processor code to listen to the document model and calculate the total value of the asset class on a monthly basis.
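
For orientation before the processor diff that follows: each transaction in the scenario above ends up in the analytics store as one or more series values. A minimal sketch of what the value for the May 1 principal draw could look like, assuming `AnalyticsPath` is exported alongside `AnalyticsSeriesInput` from `@powerhousedao/reactor-api` (as the truncated import block in the processor suggests) and using placeholders for the drive, document, branch and scope segments:

```typescript
import { AnalyticsPath, type AnalyticsSeriesInput } from "@powerhousedao/reactor-api";
import { DateTime } from "luxon";

// Hypothetical series value for the $10,000,000 principal draw on May 1 (year assumed).
// The source path mirrors the `ph/<driveId>/<documentId>/<branch>/<scope>` pattern
// the processor builds; every segment below is a placeholder.
const principalDraw: AnalyticsSeriesInput = {
  dimensions: {
    portfolio: AnalyticsPath.fromString("sky/rwas/portfolios/<documentId>"),
  },
  metric: "AssetBalance",
  source: AnalyticsPath.fromString("ph/<driveId>/<documentId>/<branch>/<scope>"),
  start: DateTime.fromISO("2024-05-01"),
  value: 10_000_000, // cash drawn from Sky into the SPV
};

console.log(principalDraw.metric, principalDraw.value);
```

The processor diff below shows how such values are derived from incoming operations and collected into a `values` array.
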
@@ -7,7 +7,7 @@ You will likely already be in the desired directory, but if not, navigate to the
  cd rwa-analytics
  ```

- Now open the directory with your code editor.
+ Now open the directory with your code editor.

  ```bash
  code .
@@ -23,9 +23,7 @@ import {
  type IAnalyticsStore,
  AnalyticsSeriesInput,
  } from "@powerhousedao/reactor-api";
- import {
- CreateGroupTransactionInput,
- } from "document-model-libs/real-world-assets";
+ import { CreateGroupTransactionInput } from "document-model-libs/real-world-assets";
  import { DateTime } from "luxon";
  import { IProcessor } from "document-drive/processors/types";
  import { InternalTransmitterUpdate } from "document-drive/server/listener/transmitter/internal";
@@ -35,10 +33,10 @@ export class RwaAnalyticsProcessor implements IProcessor {
  constructor(private readonly analyticsStore: IAnalyticsStore) {
  //
  }
-
+
  // This is the function that is called when the processor receives a new strand.
  async onStrands<TDocument extends PHDocument>(
- strands: InternalTransmitterUpdate<TDocument>[]
+ strands: InternalTransmitterUpdate<TDocument>[],
  ): Promise<void> {
  if (strands.length === 0) {
  return;
@@ -60,12 +58,13 @@ export class RwaAnalyticsProcessor implements IProcessor {
  `ph/${strand.driveId}/${documentId}/${strand.branch}/${strand.scope}`,
  );

- if (firstOp.index === 0) { // This is the index of the operation in the strand.
+ if (firstOp.index === 0) {
+ // This is the index of the operation in the strand.
  // This is the function that clears the source of the operation since there could be data existing from previous runs for the same document to avoid double counting.
  await this.analyticsStore.clearSeriesBySource(source, true);
  }

- for (const operation of strand.operations) {
+ for (const operation of strand.operations) {
  // This is the type of the operation that is being displayed in the terminal.
  console.log(">>> ", operation.type);

@@ -87,16 +86,15 @@ export class RwaAnalyticsProcessor implements IProcessor {
  }

  // Up next we'll go through the crucial design decision of defining the different dimensions or categories that are relevant to our analytics and queries. In this case we'll create 2 series values for each transaction type. Cash and Fixed Income transactions.
- const { fixedIncomeTransaction, cashTransaction } =
- groupTransaction;
+ const { fixedIncomeTransaction, cashTransaction } = groupTransaction;

  if (fixedIncomeTransaction) {
  const dimensions = {
  // We're selecting T-bills as one of the dimensions.
  asset: AnalyticsPath.fromString(
- `sky/rwas/assets/t-bills/${fixedIncomeTransaction.assetId}`
+ `sky/rwas/assets/t-bills/${fixedIncomeTransaction.assetId}`,
  ),
- // We're selecting the portfolio as another dimension.
+ // We're selecting the portfolio as another dimension.
  portfolio: AnalyticsPath.fromString(
  `sky/rwas/portfolios/${documentId}`,
  ),
@@ -104,9 +102,9 @@ export class RwaAnalyticsProcessor implements IProcessor {

  values.push({
  dimensions,
- metric: "AssetBalance", // We're selecting the AssetBalance metric.
+ metric: "AssetBalance", // We're selecting the AssetBalance metric.
  source,
- start: DateTime.fromISO(fixedIncomeTransaction.entryTime), // We're selecting the entryTime as the start of the series value.
+ start: DateTime.fromISO(fixedIncomeTransaction.entryTime), // We're selecting the entryTime as the start of the series value.
  value:
  groupTransaction.type === "AssetPurchase"
  ? fixedIncomeTransaction.amount
@@ -125,14 +123,14 @@ export class RwaAnalyticsProcessor implements IProcessor {

  values.push({
  dimensions,
- metric: "AssetBalance", // We're selecting the AssetBalance metric for our tutorial scenario
+ metric: "AssetBalance", // We're selecting the AssetBalance metric for our tutorial scenario
  source,
- start: DateTime.fromISO(cashTransaction.entryTime), // We're selecting the entryTime as the start of the series value.
+ start: DateTime.fromISO(cashTransaction.entryTime), // We're selecting the entryTime as the start of the series value.
  value:
  groupTransaction.type === "AssetPurchase" ||
  groupTransaction.type === "PrincipalReturn"
- ? -cashTransaction.amount
- : cashTransaction.amount,
+ ? -cashTransaction.amount
+ : cashTransaction.amount,
  });
  }
  }
@@ -157,7 +155,7 @@ You'll need to use this specific endpoint to access the graphql playground `http
  Then, use this query to get the data from the analytics store, but don't forget to also add the variables below.

  ```graphql
- query analytics ($filter: AnalyticsFilter) {
+ query analytics($filter: AnalyticsFilter) {
  analytics {
  series(filter: $filter) {
  start
@@ -174,7 +172,8 @@ query analytics ($filter: AnalyticsFilter) {
  }
  }
  ```
- With the following variables:
+
+ With the following variables:

  ```graphql
  {
@@ -190,11 +189,11 @@ With the following variables:
  "select": "sky"
  }
  ]
- }
+ }
  }
  ```

- You'll see that the data is now being displayed per month for each of the asset classes.
+ You'll see that the data is now being displayed per month for each of the asset classes.

  `sum`: Displays the total cumulative sum of the asset class for the month.

@@ -208,10 +207,10 @@ When you play around with the variables of our query you can see the different g

  If you would take 'total' you would see the total value left in the portfolio.

- To see more changes become present in the analytics store, you could add more transactions to the portfolio. We could do this by adding a rwa portfolio with a different documentId.
- Imagine that this is similar to real time updates that might happen to a live environment when new transactions are added to a porftolio.
+ To see more changes become present in the analytics store, you could add more transactions to the portfolio. We could do this by adding a rwa portfolio with a different documentId.
+ Imagine that this is similar to real time updates that might happen to a live environment when new transactions are added to a porftolio.

- Now you know how to implement a custom analytics processor and how to query the data from the analytics store!
+ Now you know how to implement a custom analytics processor and how to query the data from the analytics store!
  In case you'd want to try it out on another document model follow the steps below.

  1. Generate the processor
@@ -220,4 +219,4 @@ In case you'd want to try it out on another document model follow the steps belo
  4. Launch the reactor (again) and add your data
  5. Query the data from the analytics store

- Enjoy!
+ Enjoy!
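
The query and variables above are meant for the GraphQL playground, but the same request can be issued from code. A rough sketch using plain `fetch`, with the endpoint URL, query document and filter variables left as placeholders for the values shown in the tutorial:

```typescript
// Sketch only: substitute the analytics subgraph endpoint, the
// `query analytics($filter: AnalyticsFilter)` document and the filter
// variables shown above; none of these values are hard-coded here.
const endpoint = "<analytics subgraph endpoint>";
const query = "<the analytics($filter: AnalyticsFilter) query from above>";
const variables = { filter: {} }; // the AnalyticsFilter object from the tutorial

async function runAnalyticsQuery(): Promise<unknown> {
  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, variables }),
  });
  if (!response.ok) {
    throw new Error(`Analytics query failed: ${response.status}`);
  }
  // The payload mirrors the playground result: one series entry per period,
  // carrying its start date and the aggregated value.
  return (await response.json()).data;
}

runAnalyticsQuery().then((data) => console.log(JSON.stringify(data, null, 2)));
```
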
@@ -5,4 +5,4 @@
  "type": "generated-index",
  "description": "Learn how to make use of an Analytics Processor in this tutorial!"
  }
- }
+ }
@@ -1,8 +1,8 @@
  {
- "label": "Work with Data",
- "position": 5,
- "link": {
- "type": "generated-index",
- "description": "Learn how to work with data in Powerhouse."
- }
- }
+ "label": "Work with Data",
+ "position": 5,
+ "link": {
+ "type": "generated-index",
+ "description": "Learn how to work with data in Powerhouse."
+ }
+ }
@@ -18,10 +18,9 @@ By packaging document models together, developers can create well-organized, int
  - **Contributor Billing Package** – Defines document models for tracking work, invoicing, and facilitating payments (in both fiat and crypto) for contributors in decentralized organizations. link
  - **Governance Package** – Models for proposals, voting, contributor agreements, and decision-making processes. link
  - **People Ops Package** – Documents managing contributor profiles, roles, task assignments, and reputation tracking. link
- - **Project Management Package** – Models for task tracking, milestones, resource allocation, and deliverables. link
+ - **Project Management Package** – Models for task tracking, milestones, resource allocation, and deliverables. link

-
- A **Powerhouse Package** is a **modular unit** that defines and automates **data structures and workflows** within the Powerhouse ecosystem. Each package includes several components that work together to ensure **seamless interaction, data processing, and automation** for different areas of decentralized operations.
+ A **Powerhouse Package** is a **modular unit** that defines and automates **data structures and workflows** within the Powerhouse ecosystem. Each package includes several components that work together to ensure **seamless interaction, data processing, and automation** for different areas of decentralized operations.

  Packages follow a **scoped naming convention** based on the organization that owns them or created the package:

@@ -76,4 +75,4 @@ A **UI component** that allows users to **interact with document models visually

  ---

- Each of these modules plays a crucial role in making **Powerhouse Packages extensible, reusable, and efficient**, allowing developers and organizations to **streamline operations and automate workflows** within decentralized environments.
+ Each of these modules plays a crucial role in making **Powerhouse Packages extensible, reusable, and efficient**, allowing developers and organizations to **streamline operations and automate workflows** within decentralized environments.
@@ -1,40 +1,42 @@
  # Publish your package

- This tutorial is a step by step guide tackling the following topics:
- 1. the process of **building a powerhouse project**
+ This tutorial is a step by step guide tackling the following topics:
+
+ 1. the process of **building a powerhouse project**
  2. the process of **publishing it as a package**

  :::info
  Let's start with some **key concepts** that will help you understand the process we're going to go through in this tutorial.

  - **Powerhouse Project**: The construction site of your package: A project is built with document models and editors which you will publish to NPM as a package with modules.
- - **Powerhouse Modules**: The modules that are part of your project, such as the document models, editors, processors or scripts.
- - **Powerhouse Drive Apps**: Customized drive interfaces that function as a drive add and enhance or augment the functionality of your documents and workflows within the drive.
+ - **Powerhouse Modules**: The modules that are part of your project, such as the document models, editors, processors or scripts.
+ - **Powerhouse Drive Apps**: Customized drive interfaces that function as a drive add and enhance or augment the functionality of your documents and workflows within the drive.
  - **Powerhouse Package**: A package is a collection of modules that are published to NPM and can be installed on a server instance or locally on your machine with help of the host apps such as Connect, Switchboard & Fusion. Organizations build packages for specific purposes or workflows.

  ![Key Concepts](images/keyconcepts.png)
  :::

-
- ## 1. Building your project
+ ## 1. Building your project

  To start building your project with it's dedicated document models and editors we'll run the following command:

- ```bash
- ph init
- ```
+ ```bash
+ ph init
+ ```

  <details>
  <summary> Command not working? Did you install `ph-cmd`? </summary>

  The Powerhouse CLI (`ph-cmd`) is a command-line interface tool that provides essential commands for managing Powerhouse projects. You can get access to the Powerhouse Ecosystem tools by installing them globally using:
+
  ```bash
  pnpm install -g ph-cmd
- ```
+ ```

- For experimental features, use --version [version] which allows selecting a specific branch of our document-model-boilerplate. There are --dev and --staging options. Select `ph init --dev` to use the latest development version. Please be aware that this version can contain bugs and experimental features that aren't fully tested.
+ For experimental features, use --version [version] which allows selecting a specific branch of our document-model-boilerplate. There are --dev and --staging options. Select `ph init --dev` to use the latest development version. Please be aware that this version can contain bugs and experimental features that aren't fully tested.

  Key commands include:
+
  - `ph connect` for running the Connect application locally
  - `ph switchboard` or `ph reactor` for starting the API service
  - `ph init` to start a new project and build a document model
@@ -64,6 +66,7 @@ yarn cache list
  yarn cache clean --force
  yarn cache list
  ```
+
  </details>

  <details>
@@ -71,35 +74,36 @@ yarn cache list

  When installing or using the Powerhouse CLI commands you are able to make use of the dev & staging branches. These branches contain more experimental features then the latest stable release the PH CLI uses by default. They can be used to get access to a bugfix or features under development.

- | Command | Description |
- |---------|-------------|
- | **pnpm install -g ph-cmd** | Install latest stable version |
- | **pnpm install -g ph-cmd@dev** | Install development version |
- | **pnpm install -g ph-cmd@staging** | Install staging version |
- | **ph init** | Use latest stable version of the boilerplate |
- | **ph init --dev** | Use development version of the boilerplate |
- | **ph init --staging** | Use staging version of the boilerplate |
- | **ph use** | Switch all dependencies to latest production versions |
- | **ph use dev** | Switch all dependencies to development versions |
- | **ph use prod** | Switch all dependencies to production versions |
+ | Command | Description |
+ | ---------------------------------- | ----------------------------------------------------- |
+ | **pnpm install -g ph-cmd** | Install latest stable version |
+ | **pnpm install -g ph-cmd@dev** | Install development version |
+ | **pnpm install -g ph-cmd@staging** | Install staging version |
+ | **ph init** | Use latest stable version of the boilerplate |
+ | **ph init --dev** | Use development version of the boilerplate |
+ | **ph init --staging** | Use staging version of the boilerplate |
+ | **ph use** | Switch all dependencies to latest production versions |
+ | **ph use dev** | Switch all dependencies to development versions |
+ | **ph use prod** | Switch all dependencies to production versions |

  Please be aware that these versions can contain bugs and experimental features that aren't fully tested.
+
  </details>

  ### 1.1. Specifying your project details

- When you are creating your own project, you will be asked to name your project.
+ When you are creating your own project, you will be asked to name your project.
  Which will also become the package name when someone else wants to install it in a cloud environment via npm in the future.

  Please feel free to navigate to the package.json file and fill in all the other available fields such as `name`, `version`, `author`, `license` and `main`.

- ```bash
- {
- "name": "@your-org-ph/package-name", #Your organization name with the -ph suffix to indicate it's a powerhouse related organization & package.
- "version": "1.0.0",
+ ```bash
+ {
+ "name": "@your-org-ph/package-name", #Your organization name with the -ph suffix to indicate it's a powerhouse related organization & package.
+ "version": "1.0.0",
  "author": "Your Name",
  "license": "AGPL-3.0-only",
- "main": "index.js"
+ "main": "index.js"
  }
  ```

@@ -108,17 +112,19 @@ Now that you've created your powerhouse project you are ready to generate the ne
  ```bash
  ph generate
  ```
- The **generate** command will start the configuration of your powerhouse project and generates a directory with the necessary files and folders to build your project.
+
+ The **generate** command will start the configuration of your powerhouse project and generates a directory with the necessary files and folders to build your project.
  These include:
+
  - `document-models`: A folder containing the document models schemas you've defined in Connect Studio Mode.
- - `editors`: A folder containing the editors you've defined in react, potentially making use of the reusable components.
- - `processors`: A folder containing the processors you might be running on your document models later on.
+ - `editors`: A folder containing the editors you've defined in react, potentially making use of the reusable components.
+ - `processors`: A folder containing the processors you might be running on your document models later on.
  - `scripts`: A folder containing the scripts you might use.
  - `tests`: A folder containing your unit tests.

  ### 1.2. Adding document models, editors and unit tests

- Now that you've set up your directory.
+ Now that you've set up your directory.
  Go ahead and add the document models you'd like to add by going through the standard document model building flow:

  :::info
@@ -134,9 +140,11 @@ These steps are explained more in depth in any of our tutorials. Follow along wi
  7. Add a **manifest file** to your project and updating your index.js file to export your modules.

  ### 1.3. Verifying your project
- Now that we've completed our directory with the reducers, tests and editors, and your project is populated with modules we'll verify the build output and see if everything is working correctly.
+
+ Now that we've completed our directory with the reducers, tests and editors, and your project is populated with modules we'll verify the build output and see if everything is working correctly.

  Let's **verify the package build output** with the following command:
+
  ```bash
  pnpm build
  ```
@@ -145,7 +153,7 @@ This command will **build** the project and create a build directory with the ou

  This command will **start a local server** and serve the build output.
  Inspect the build output and verify that the document models are working correctly.
- Instead of `pnpm serve`, we'll be using:
+ Instead of `pnpm serve`, we'll be using:

  ```bash
  ph connect
@@ -154,7 +162,8 @@ ph connect
  ### 1.4 Storing your project in a git repository

  Now that you've verified your project is working correctly, you can store your project in a git repository.
- Why?
+ Why?
+
  - So you can track the changes of your project in a remote repository and benefit from the collaboration features of git.
  - So you can publish your project to the npm registry and install it on a server instance or locally on your machine.

@@ -172,24 +181,29 @@ This will initialize a git repository and add all the files to the repository.

  For this step you'll need to register your organization on npm.
  If you haven't already registered your organization on npm, you can do so by running the following command:
+
  ```bash
  pnpm adduser
  ```
+
  Create an organization on [NPM](https://www.npmjs.com/) using the naming convention: `@yourorganization-ph`
- - The `-ph` suffix indicates its a Powerhouse ecosystem package to help you and others identify it from regular NPM packages.
- - Example: `@acme-ph`
+
+ - The `-ph` suffix indicates its a Powerhouse ecosystem package to help you and others identify it from regular NPM packages.
+ - Example: `@acme-ph`

  To make sure you can differentiate between Powerhouse ecosystem packages and other packages we recommend setting up a separate npm account for your organization with the -ph suffix. **example: @yourorg-ph**
  We advise you to **use a dedicated npm account for your organization and not your personal account**.

- Once you've registered your organization on npm, you can now publish your project to the npm registry.
+ Once you've registered your organization on npm, you can now publish your project to the npm registry.
  Log in via the command line:
+
  ```bash
  npm login
  ```
- You'll be prompted for your username, password, and email in a separate browser window.

- Once you've logged in, you can configure your package.json for npm before publishing.
+ You'll be prompted for your username, password, and email in a separate browser window.
+
+ Once you've logged in, you can configure your package.json for npm before publishing.
  If you're publishing a package under a scope (like @your-org/my-package), you might need to add the `publishConfig` to ensure it's public, otherwise scoped packages default to private:

  ```json
@@ -210,6 +224,7 @@ Before publishing, it's crucial to version your package correctly and tag the re
  #### 1. Versioning with pnpm

  Use the `pnpm version` command to update your package version according to semantic versioning rules (`patch` for bugfixes, `minor` for new features, `major` for breaking changes). This command will:
+
  - Update the `version` in your `package.json`.
  - Create a Git commit for the version change.
  - Create a Git tag for the new version (e.g., `v1.0.1`).
@@ -224,6 +239,7 @@ pnpm version minor
  # For a major release (e.g., from 1.1.0 to 2.0.0)
  pnpm version major
  ```
+
  Take note of the new version tag created (e.g., `v1.0.1`), as you'll need it in the next step.

  #### 2. Pushing changes to Git
@@ -239,9 +255,11 @@ git push origin main
  # Replace vX.Y.Z with the actual tag name (e.g., v1.0.1)
  git push origin vX.Y.Z
  ```
+
  The specific tag name (e.g., `v1.0.1`) is usually output by the `pnpm version` command. Pushing the specific tag is recommended to avoid unintentionally pushing other local tags.

  Alternatively, to push all new local tags (use with caution):
+
  ```bash
  # git push --tags
  ```
@@ -250,10 +268,10 @@ Alternatively, to push all new local tags (use with caution):

  It's important to distinguish between Git tags and NPM distributor tags (dist-tags):

- - **Git Tags**: These are markers in your Git repository's history. They are primarily for developers to pinpoint specific release versions in the codebase (e.g., `v1.0.0`, `v1.0.1`). The `pnpm version` command creates these.
- - **NPM Distributor Tags (dist-tags)**: These are labels used by the NPM registry to point to specific published versions of your package. Common NPM tags include:
- - `latest`: This is the default tag. When someone runs `pnpm install my-package`, NPM installs the version tagged as `latest`.
- - `beta`, `next`, `alpha`: Often used for pre-release versions.
+ - **Git Tags**: These are markers in your Git repository's history. They are primarily for developers to pinpoint specific release versions in the codebase (e.g., `v1.0.0`, `v1.0.1`). The `pnpm version` command creates these.
+ - **NPM Distributor Tags (dist-tags)**: These are labels used by the NPM registry to point to specific published versions of your package. Common NPM tags include:
+ - `latest`: This is the default tag. When someone runs `pnpm install my-package`, NPM installs the version tagged as `latest`.
+ - `beta`, `next`, `alpha`: Often used for pre-release versions.
  When you publish a package without specifying an NPM tag, it usually gets the `latest` tag by default.

  #### 4. Publishing to npm
@@ -263,23 +281,28 @@ Now you are ready to publish your package to the NPM registry. Ensure you are lo
  ```bash
  pnpm publish
  ```
+
  This command will publish the version of your package that is currently specified in your `package.json`. By default, this will also set the `latest` NPM dist-tag for this version.

  If your package is scoped (e.g., `@your-org/my-package`) and intended to be public, ensure your `package.json` includes the `publishConfig` shown earlier. If this is not set in `package.json` (and your package is scoped), you might need to use:
+
  ```bash
  pnpm publish --access public
  ```

  You can also publish a version to a specific NPM dist-tag. For example, to publish a beta version:
+
  ```bash
  # Ensure your package.json version reflects the beta (e.g., 1.1.0-beta.0)
  pnpm publish --tag beta
  ```
+
  This is useful for testing releases before making them `latest`.

  Now let's verify that the package(s) get published in the package repository, next to pre-existing packages that you might have been publishing before.

  ## 3. Deploying the host apps and project
+
  Now that we've installed all the necessary services on our server instance, we can start deploying the host apps & our packaged project from npm.

  Install your project package we've published earlier on your local connect (`ph connect`) instance by running the following command:
@@ -287,9 +310,10 @@ Install your project package we've published earlier on your local connect (`ph
  ```bash
  ph install @<your-org/package-name>
  ```
+
  Alternatively you can also install the package in the settings of Connect in the 'package manager' section. (Not available yet)
  Where you'll be able to use the same package name as you've used in the `package.json` file and install it at the click of a button.

  ![package manager](images/homedesign.png)

- Got this far? Congratulations on publishing your first package!
+ Got this far? Congratulations on publishing your first package!