@bonnard/cli 0.2.1 → 0.2.2

@@ -0,0 +1,256 @@
+ ---
+ name: bonnard-metabase-migrate
+ description: Guide migration from an existing Metabase instance to a Bonnard semantic layer. Use when user says "migrate from metabase", "import metabase", "metabase to semantic layer", or has Metabase data they want to model.
+ allowed-tools: Bash(bon *)
+ ---
+
+ # Migrate from Metabase to Bonnard
+
+ This skill guides you through analyzing an existing Metabase instance and
+ building a semantic layer that replicates its most important metrics.
+ Walk through each phase in order, confirming progress before moving on.
+
+ ## Phase 1: Connect to Metabase
+
+ Set up a connection to the Metabase instance:
+
+ ```bash
+ bon metabase connect
+ ```
+
+ This prompts for the Metabase URL and API key. The API key should be created
+ in Metabase under Admin > Settings > Authentication > API Keys.
+ An admin-level key gives the richest analysis (permissions, schema access).
+
+ ## Phase 2: Analyze the Instance
+
+ Generate an intelligence report that maps the entire Metabase instance:
+
+ ```bash
+ bon metabase analyze
+ ```
+
+ This writes a report to `.bon/metabase-analysis.md`. Read it carefully — it
+ drives every decision in the remaining phases.
+
+ ### How to interpret each section
+
+ | Report Section | What It Tells You | Action |
+ |----------------|-------------------|--------|
+ | **Most Referenced Tables** | Tables used most in SQL queries | Create cubes for these first — they are the core of the data model |
+ | **Top Cards by Activity** | Most-viewed questions/models | `analytical` cards (GROUP BY + aggregation) map to measures; `lookup` cards indicate key filter dimensions; `display` cards can be skipped |
+ | **Common Filter Variables** | Template vars (`{{var}}`) used across 3+ cards | These must be dimensions on relevant cubes |
+ | **Foreign Key Relationships** | FK links between tables | Define `joins` between cubes using these relationships |
+ | **Collection Structure** | How users organize content by business area | Map each top-level collection to a view (one view per business domain) |
+ | **Dashboard Parameters** | Shared filters across dashboards | The most important shared dimensions — ensure they exist on relevant cubes |
+ | **Table Inventory** | Field counts and classification per table | Field classification (dims/measures/time) guides each cube definition; tables with 0 refs can be deprioritized |
+ | **Schema Access** | Which schemas non-admin groups can query | Focus on user-facing schemas — skip admin-only/staging schemas |
+
+ ## Phase 3: Connect the Data Warehouse
+
+ Add a datasource pointing to the same database that Metabase queries:
+
+ ```bash
+ # Interactive setup
+ bon datasource add
+
+ # Or import from dbt if available
+ bon datasource add --from-dbt
+ ```
+
+ Then verify the connection:
+
+ ```bash
+ bon datasource test <name>
+ ```
+
+ The database connection details can often be found in Metabase under
+ Admin > Databases, or in the analysis report header.
+
+ ## Phase 4: Explore Key Tables
+
+ Before writing cubes, drill into the most important tables and cards
+ identified in Phase 2. Use the explore commands to understand field types
+ and existing SQL patterns:
+
+ ```bash
+ # View table fields with type classification
+ bon metabase explore table <id>
+
+ # View card SQL and columns
+ bon metabase explore card <id>
+
+ # View schemas and tables in a database
+ bon metabase explore database <id>
+
+ # View cards in a collection
+ bon metabase explore collection <id>
+ ```
+
+ ### How explore output maps to cube definitions
+
+ | Explore Field | Cube Mapping |
+ |---------------|-------------|
+ | Field class `pk` | Set `primary_key: true` on dimension |
+ | Field class `fk` | Join candidate — note the target table |
+ | Field class `time` | Dimension with `type: time` |
+ | Field class `measure` | Measure candidate — check card SQL for aggregation type |
+ | Field class `dim` | Dimension with `type: string` or `type: number` |
+
+ ### How card SQL maps to measures
+
+ Look at the SQL in `analytical` cards to determine measure types:
+
+ | Card SQL Pattern | Cube Measure |
+ |-----------------|-------------|
+ | `SUM(amount)` | `type: sum`, `sql: amount` |
+ | `COUNT(*)` | `type: count` |
+ | `COUNT(DISTINCT user_id)` | `type: count_distinct`, `sql: user_id` |
+ | `AVG(price)` | `type: avg`, `sql: price` |
+ | `MIN(date)` / `MAX(date)` | `type: min` / `type: max`, `sql: date` |
+
+ Use `bon docs cubes.measures.types` for all 12 measure types.
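+
+ For example, a card whose SQL computes `COUNT(DISTINCT user_id)` maps to a
+ measure like this sketch (the measure name and description are illustrative):
+
+ ```yaml
+ measures:
+   - name: unique_users
+     type: count_distinct
+     sql: user_id
+     description: Number of distinct users
+ ```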
+
+ ## Phase 5: Build Cubes
+
+ Create cubes for the most-referenced tables (from Phase 2). Start with the
+ highest-referenced table and work down. Create one file per cube in
+ `bonnard/cubes/`.
+
+ For each cube:
+ 1. Set `sql_table` to the full `schema.table` path
+ 2. Set `data_source` to the datasource name from Phase 3
+ 3. Add a `primary_key` dimension
+ 4. Add time dimensions for date/datetime columns
+ 5. Add measures based on card SQL patterns (Phase 4)
+ 6. Add dimensions for columns used as filters (template vars from Phase 2)
+ 7. Add `description` to every measure and dimension
+
+ Example — `bonnard/cubes/orders.yaml`:
+
+ ```yaml
+ cubes:
+   - name: orders
+     sql_table: public.orders
+     data_source: my_warehouse
+     description: Order transactions
+
+     measures:
+       - name: count
+         type: count
+         description: Total number of orders
+
+       - name: total_revenue
+         type: sum
+         sql: amount
+         description: Sum of order amounts
+
+     dimensions:
+       - name: id
+         type: number
+         sql: id
+         primary_key: true
+
+       - name: created_at
+         type: time
+         sql: created_at
+         description: Order creation timestamp
+
+       - name: status
+         type: string
+         sql: status
+         description: Order status (pending, completed, cancelled)
+ ```
+
+ ### Adding joins
+
+ Use FK relationships from the analysis report to define joins between cubes:
+
+ ```yaml
+ joins:
+   - name: customers
+     sql: "{CUBE}.customer_id = {customers.id}"
+     relationship: many_to_one
+ ```
+
+ Use `bon docs cubes.joins` for the full reference.
+
+ ## Phase 6: Build Views
+
+ Map Metabase collections to views. Each top-level collection (business domain)
+ from the analysis report becomes a view that composes the relevant cubes.
+
+ Create one file per view in `bonnard/views/`.
+
+ Example — `bonnard/views/sales_analytics.yaml`:
+
+ ```yaml
+ views:
+   - name: sales_analytics
+     description: Sales metrics and dimensions for the sales team
+     cubes:
+       - join_path: orders
+         includes:
+           - count
+           - total_revenue
+           - created_at
+           - status
+
+       - join_path: orders.customers
+         prefix: true
+         includes:
+           - name
+           - region
+ ```
+
+ Use `bon docs views` for the full reference.
+
+ ## Phase 7: Validate and Deploy
+
+ Validate the semantic layer:
+
+ ```bash
+ bon validate
+ ```
+
+ Fix any errors. Common issues:
+ - Missing `primary_key` dimension
+ - Unknown measure/dimension types
+ - Undefined cube referenced in a view join path
+ - Missing `data_source`
+
+ Then deploy:
+
+ ```bash
+ bon login
+ bon deploy -m "Migrate semantic layer from Metabase"
+ ```
+
+ ## Phase 8: Verify
+
+ Compare results from the semantic layer against Metabase card outputs.
+ Pick 3-5 important `analytical` cards from the analysis report and run
+ equivalent queries:
+
+ ```bash
+ # Run a semantic layer query
+ bon query '{"measures": ["orders.total_revenue"], "dimensions": ["orders.status"]}'
+
+ # SQL format
+ bon query --sql "SELECT status, MEASURE(total_revenue) FROM orders GROUP BY 1"
+ ```
+
+ Compare the numbers with the corresponding Metabase card. If they don't match:
+ - Check the SQL in the card (`bon metabase explore card <id>`) for filters or transformations
+ - Ensure the measure type matches the aggregation (SUM vs COUNT vs AVG)
+ - Check for WHERE clauses that should be segments or pre-filters
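+
+ If a card's WHERE clause encodes a fixed business filter (e.g. `status = 'completed'`),
+ it can be modeled as a segment on the cube. A sketch, assuming the segment
+ syntax covered by `bon docs cubes.segments` (the segment name is illustrative):
+
+ ```yaml
+ segments:
+   - name: completed_orders
+     sql: "{CUBE}.status = 'completed'"
+ ```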
+
+ ## Next Steps
+
+ After the core migration is working:
+
+ - Add remaining tables as cubes (work down the reference count list)
+ - Add calculated measures for complex card SQL (`bon docs cubes.measures.calculated`)
+ - Add segments for common WHERE clauses (`bon docs cubes.segments`)
+ - Set up MCP for AI agent access (`bon mcp`)
+ - Review and iterate with `bon deployments` and `bon diff <id>`
@@ -0,0 +1,255 @@
+ ---
+ description: "Guide migration from an existing Metabase instance to a Bonnard semantic layer. Use when user says 'migrate from metabase', 'import metabase', 'metabase to semantic layer', or has Metabase data they want to model."
+ alwaysApply: false
+ ---
+
+ # Migrate from Metabase to Bonnard
+
+ This skill guides you through analyzing an existing Metabase instance and
+ building a semantic layer that replicates its most important metrics.
+ Walk through each phase in order, confirming progress before moving on.
+
+ ## Phase 1: Connect to Metabase
+
+ Set up a connection to the Metabase instance:
+
+ ```bash
+ bon metabase connect
+ ```
+
+ This prompts for the Metabase URL and API key. The API key should be created
+ in Metabase under Admin > Settings > Authentication > API Keys.
+ An admin-level key gives the richest analysis (permissions, schema access).
+
+ ## Phase 2: Analyze the Instance
+
+ Generate an intelligence report that maps the entire Metabase instance:
+
+ ```bash
+ bon metabase analyze
+ ```
+
+ This writes a report to `.bon/metabase-analysis.md`. Read it carefully — it
+ drives every decision in the remaining phases.
+
+ ### How to interpret each section
+
+ | Report Section | What It Tells You | Action |
+ |----------------|-------------------|--------|
+ | **Most Referenced Tables** | Tables used most in SQL queries | Create cubes for these first — they are the core of the data model |
+ | **Top Cards by Activity** | Most-viewed questions/models | `analytical` cards (GROUP BY + aggregation) map to measures; `lookup` cards indicate key filter dimensions; `display` cards can be skipped |
+ | **Common Filter Variables** | Template vars (`{{var}}`) used across 3+ cards | These must be dimensions on relevant cubes |
+ | **Foreign Key Relationships** | FK links between tables | Define `joins` between cubes using these relationships |
+ | **Collection Structure** | How users organize content by business area | Map each top-level collection to a view (one view per business domain) |
+ | **Dashboard Parameters** | Shared filters across dashboards | The most important shared dimensions — ensure they exist on relevant cubes |
+ | **Table Inventory** | Field counts and classification per table | Field classification (dims/measures/time) guides each cube definition; tables with 0 refs can be deprioritized |
+ | **Schema Access** | Which schemas non-admin groups can query | Focus on user-facing schemas — skip admin-only/staging schemas |
+
+ ## Phase 3: Connect the Data Warehouse
+
+ Add a datasource pointing to the same database that Metabase queries:
+
+ ```bash
+ # Interactive setup
+ bon datasource add
+
+ # Or import from dbt if available
+ bon datasource add --from-dbt
+ ```
+
+ Then verify the connection:
+
+ ```bash
+ bon datasource test <name>
+ ```
+
+ The database connection details can often be found in Metabase under
+ Admin > Databases, or in the analysis report header.
+
+ ## Phase 4: Explore Key Tables
+
+ Before writing cubes, drill into the most important tables and cards
+ identified in Phase 2. Use the explore commands to understand field types
+ and existing SQL patterns:
+
+ ```bash
+ # View table fields with type classification
+ bon metabase explore table <id>
+
+ # View card SQL and columns
+ bon metabase explore card <id>
+
+ # View schemas and tables in a database
+ bon metabase explore database <id>
+
+ # View cards in a collection
+ bon metabase explore collection <id>
+ ```
+
+ ### How explore output maps to cube definitions
+
+ | Explore Field | Cube Mapping |
+ |---------------|-------------|
+ | Field class `pk` | Set `primary_key: true` on dimension |
+ | Field class `fk` | Join candidate — note the target table |
+ | Field class `time` | Dimension with `type: time` |
+ | Field class `measure` | Measure candidate — check card SQL for aggregation type |
+ | Field class `dim` | Dimension with `type: string` or `type: number` |
+
+ ### How card SQL maps to measures
+
+ Look at the SQL in `analytical` cards to determine measure types:
+
+ | Card SQL Pattern | Cube Measure |
+ |-----------------|-------------|
+ | `SUM(amount)` | `type: sum`, `sql: amount` |
+ | `COUNT(*)` | `type: count` |
+ | `COUNT(DISTINCT user_id)` | `type: count_distinct`, `sql: user_id` |
+ | `AVG(price)` | `type: avg`, `sql: price` |
+ | `MIN(date)` / `MAX(date)` | `type: min` / `type: max`, `sql: date` |
+
+ Use `bon docs cubes.measures.types` for all 12 measure types.
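+
+ For example, a card whose SQL computes `COUNT(DISTINCT user_id)` maps to a
+ measure like this sketch (the measure name and description are illustrative):
+
+ ```yaml
+ measures:
+   - name: unique_users
+     type: count_distinct
+     sql: user_id
+     description: Number of distinct users
+ ```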
+
+ ## Phase 5: Build Cubes
+
+ Create cubes for the most-referenced tables (from Phase 2). Start with the
+ highest-referenced table and work down. Create one file per cube in
+ `bonnard/cubes/`.
+
+ For each cube:
+ 1. Set `sql_table` to the full `schema.table` path
+ 2. Set `data_source` to the datasource name from Phase 3
+ 3. Add a `primary_key` dimension
+ 4. Add time dimensions for date/datetime columns
+ 5. Add measures based on card SQL patterns (Phase 4)
+ 6. Add dimensions for columns used as filters (template vars from Phase 2)
+ 7. Add `description` to every measure and dimension
+
+ Example — `bonnard/cubes/orders.yaml`:
+
+ ```yaml
+ cubes:
+   - name: orders
+     sql_table: public.orders
+     data_source: my_warehouse
+     description: Order transactions
+
+     measures:
+       - name: count
+         type: count
+         description: Total number of orders
+
+       - name: total_revenue
+         type: sum
+         sql: amount
+         description: Sum of order amounts
+
+     dimensions:
+       - name: id
+         type: number
+         sql: id
+         primary_key: true
+
+       - name: created_at
+         type: time
+         sql: created_at
+         description: Order creation timestamp
+
+       - name: status
+         type: string
+         sql: status
+         description: Order status (pending, completed, cancelled)
+ ```
+
+ ### Adding joins
+
+ Use FK relationships from the analysis report to define joins between cubes:
+
+ ```yaml
+ joins:
+   - name: customers
+     sql: "{CUBE}.customer_id = {customers.id}"
+     relationship: many_to_one
+ ```
+
+ Use `bon docs cubes.joins` for the full reference.
+
+ ## Phase 6: Build Views
+
+ Map Metabase collections to views. Each top-level collection (business domain)
+ from the analysis report becomes a view that composes the relevant cubes.
+
+ Create one file per view in `bonnard/views/`.
+
+ Example — `bonnard/views/sales_analytics.yaml`:
+
+ ```yaml
+ views:
+   - name: sales_analytics
+     description: Sales metrics and dimensions for the sales team
+     cubes:
+       - join_path: orders
+         includes:
+           - count
+           - total_revenue
+           - created_at
+           - status
+
+       - join_path: orders.customers
+         prefix: true
+         includes:
+           - name
+           - region
+ ```
+
+ Use `bon docs views` for the full reference.
+
+ ## Phase 7: Validate and Deploy
+
+ Validate the semantic layer:
+
+ ```bash
+ bon validate
+ ```
+
+ Fix any errors. Common issues:
+ - Missing `primary_key` dimension
+ - Unknown measure/dimension types
+ - Undefined cube referenced in a view join path
+ - Missing `data_source`
+
+ Then deploy:
+
+ ```bash
+ bon login
+ bon deploy -m "Migrate semantic layer from Metabase"
+ ```
+
+ ## Phase 8: Verify
+
+ Compare results from the semantic layer against Metabase card outputs.
+ Pick 3-5 important `analytical` cards from the analysis report and run
+ equivalent queries:
+
+ ```bash
+ # Run a semantic layer query
+ bon query '{"measures": ["orders.total_revenue"], "dimensions": ["orders.status"]}'
+
+ # SQL format
+ bon query --sql "SELECT status, MEASURE(total_revenue) FROM orders GROUP BY 1"
+ ```
+
+ Compare the numbers with the corresponding Metabase card. If they don't match:
+ - Check the SQL in the card (`bon metabase explore card <id>`) for filters or transformations
+ - Ensure the measure type matches the aggregation (SUM vs COUNT vs AVG)
+ - Check for WHERE clauses that should be segments or pre-filters
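+
+ If a card's WHERE clause encodes a fixed business filter (e.g. `status = 'completed'`),
+ it can be modeled as a segment on the cube. A sketch, assuming the segment
+ syntax covered by `bon docs cubes.segments` (the segment name is illustrative):
+
+ ```yaml
+ segments:
+   - name: completed_orders
+     sql: "{CUBE}.status = 'completed'"
+ ```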
+
+ ## Next Steps
+
+ After the core migration is working:
+
+ - Add remaining tables as cubes (work down the reference count list)
+ - Add calculated measures for complex card SQL (`bon docs cubes.measures.calculated`)
+ - Add segments for common WHERE clauses (`bon docs cubes.segments`)
+ - Set up MCP for AI agent access (`bon mcp`)
+ - Review and iterate with `bon deployments` and `bon diff <id>`
@@ -73,6 +73,9 @@ All tables are in the `contoso` schema. The datasource is named `contoso_demo`.
  | `bon query '{...}'` | Execute a semantic layer query (JSON or `--sql` format) |
  | `bon mcp` | Show MCP setup instructions for AI agents |
  | `bon docs` | Browse documentation |
+ | `bon metabase connect` | Connect to a Metabase instance (API key) |
+ | `bon metabase analyze` | Generate analysis report for semantic layer planning |
+ | `bon metabase explore` | Browse Metabase databases, collections, cards, dashboards |
 
  ## Learning YAML Syntax
 
@@ -102,6 +105,7 @@ Topics follow dot notation (e.g., `cubes.dimensions.time`). Use `--recursive` to
  6. **Review** — `bon deployments` to list, `bon diff <id>` to inspect changes
 
  For a guided walkthrough: `/bonnard-get-started`
+ For projects migrating from Metabase: `/bonnard-metabase-migrate`
 
  ## Deployment & Change Tracking
 
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@bonnard/cli",
-   "version": "0.2.1",
+   "version": "0.2.2",
    "type": "module",
    "bin": {
      "bon": "./dist/bin/bon.mjs"