@bonnard/cli 0.1.13 → 0.2.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (55)
  1. package/dist/bin/bon.mjs +305 -620
  2. package/dist/bin/validate-DEh1XQnH.mjs +365 -0
  3. package/dist/docs/_index.md +1 -1
  4. package/dist/docs/topics/cubes.data-source.md +2 -2
  5. package/dist/docs/topics/cubes.dimensions.format.md +2 -2
  6. package/dist/docs/topics/cubes.dimensions.md +2 -2
  7. package/dist/docs/topics/cubes.dimensions.primary-key.md +2 -2
  8. package/dist/docs/topics/cubes.dimensions.sub-query.md +2 -2
  9. package/dist/docs/topics/cubes.dimensions.time.md +2 -2
  10. package/dist/docs/topics/cubes.dimensions.types.md +2 -2
  11. package/dist/docs/topics/cubes.extends.md +2 -2
  12. package/dist/docs/topics/cubes.hierarchies.md +2 -2
  13. package/dist/docs/topics/cubes.joins.md +2 -2
  14. package/dist/docs/topics/cubes.md +2 -2
  15. package/dist/docs/topics/cubes.measures.calculated.md +2 -2
  16. package/dist/docs/topics/cubes.measures.drill-members.md +2 -2
  17. package/dist/docs/topics/cubes.measures.filters.md +2 -2
  18. package/dist/docs/topics/cubes.measures.format.md +21 -2
  19. package/dist/docs/topics/cubes.measures.md +2 -2
  20. package/dist/docs/topics/cubes.measures.rolling.md +2 -2
  21. package/dist/docs/topics/cubes.measures.types.md +2 -2
  22. package/dist/docs/topics/cubes.public.md +2 -2
  23. package/dist/docs/topics/cubes.refresh-key.md +2 -2
  24. package/dist/docs/topics/cubes.segments.md +2 -2
  25. package/dist/docs/topics/cubes.sql.md +2 -2
  26. package/dist/docs/topics/features.catalog.md +31 -0
  27. package/dist/docs/topics/features.cli.md +59 -0
  28. package/dist/docs/topics/features.context-graph.md +18 -0
  29. package/dist/docs/topics/features.governance.md +84 -0
  30. package/dist/docs/topics/features.mcp.md +48 -0
  31. package/dist/docs/topics/features.md +15 -0
  32. package/dist/docs/topics/features.sdk.md +53 -0
  33. package/dist/docs/topics/features.semantic-layer.md +50 -0
  34. package/dist/docs/topics/features.slack-teams.md +18 -0
  35. package/dist/docs/topics/getting-started.md +2 -143
  36. package/dist/docs/topics/pre-aggregations.md +2 -2
  37. package/dist/docs/topics/pre-aggregations.rollup.md +2 -2
  38. package/dist/docs/topics/syntax.context-variables.md +2 -2
  39. package/dist/docs/topics/syntax.md +2 -2
  40. package/dist/docs/topics/syntax.references.md +2 -2
  41. package/dist/docs/topics/views.cubes.md +2 -2
  42. package/dist/docs/topics/views.folders.md +2 -2
  43. package/dist/docs/topics/views.includes.md +2 -2
  44. package/dist/docs/topics/views.md +2 -2
  45. package/dist/docs/topics/workflow.deploy.md +79 -14
  46. package/dist/docs/topics/workflow.mcp.md +19 -13
  47. package/dist/docs/topics/workflow.md +25 -8
  48. package/dist/docs/topics/workflow.query.md +2 -2
  49. package/dist/docs/topics/workflow.validate.md +4 -31
  50. package/dist/templates/claude/skills/bonnard-get-started/SKILL.md +16 -26
  51. package/dist/templates/cursor/rules/bonnard-get-started.mdc +16 -26
  52. package/dist/templates/shared/bonnard.md +31 -6
  53. package/package.json +4 -8
  54. package/dist/bin/validate-DiN3DaTl.mjs +0 -110
  55. package/dist/bin/{cubes-De1_2_YJ.mjs → cubes-Bf0IPYd7.mjs} +0 -0
package/dist/docs/topics/workflow.deploy.md CHANGED
@@ -1,15 +1,33 @@
- # workflow.deploy
+ # Deploy

- > Push cubes and views to Bonnard for querying.
+ > Deploy your cubes and views to the Bonnard platform using the CLI. Once deployed, your semantic layer is queryable via the REST API, MCP for AI agents, and connected BI tools.

  ## Overview

- The `bon deploy` command uploads your cubes and views to Bonnard, making them available for querying via the API. It validates and tests connections before deploying.
+ The `bon deploy` command uploads your cubes and views to Bonnard, making them available for querying via the API. It validates and tests connections before deploying, and creates a versioned deployment with change detection.

  ## Usage

  ```bash
- bon deploy
+ bon deploy -m "description of changes"
+ ```
+
+ A `-m` message is **required** — it describes what changed in this deployment.
+
+ ### Flags
+
+ | Flag | Description |
+ |------|-------------|
+ | `-m "message"` | **Required.** Deployment description |
+ | `--ci` | Non-interactive mode (fails on missing datasources) |
+ | `--push-datasources` | Auto-push missing datasources to Bonnard |
+
+ ### CI/CD
+
+ For automated pipelines, combine `--ci` with `--push-datasources`:
+
+ ```bash
+ bon deploy --ci --push-datasources -m "CI deploy"
  ```

  ## Prerequisites
@@ -23,12 +41,13 @@ bon deploy
  1. **Validates** — checks cubes and views for errors
  2. **Tests connections** — verifies data source access
  3. **Uploads** — sends cubes and views to Bonnard
- 4. **Activates** — makes cubes and views available for queries
+ 4. **Detects changes** — compares against the previous deployment
+ 5. **Activates** — makes cubes and views available for queries

  ## Example Output

  ```
- bon deploy
+ bon deploy -m "Add revenue metrics"

  ✓ Validating...
  ✓ bonnard/cubes/orders.yaml
@@ -43,14 +62,55 @@ bon deploy

  ✓ Deploy complete!

- Your semantic layer is now available at:
- https://api.bonnard.dev/v1/your-org
+ Changes:
+ + orders.total_revenue (measure)
+ + orders.avg_order_value (measure)
+ ~ orders.count (measure) — type changed
+
+ ⚠ 1 breaking change detected
  ```

+ ## Change Detection
+
+ Every deployment is versioned. Bonnard automatically detects:
+
+ - **Added** — new cubes, views, measures, dimensions
+ - **Modified** — changes to type, SQL, format, description
+ - **Removed** — deleted fields (flagged as breaking)
+ - **Breaking changes** — removed measures/dimensions, type changes
+
+ ## Reviewing Deployments
+
+ After deploying, use these commands to review history and changes:
+
+ ### List deployments
+
+ ```bash
+ bon deployments # Recent deployments
+ bon deployments --all # Full history
+ ```
+
+ ### View changes in a deployment
+
+ ```bash
+ bon diff <deployment-id> # All changes
+ bon diff <deployment-id> --breaking # Breaking changes only
+ ```
+
+ ### Annotate changes
+
+ Add reasoning or context to deployment changes:
+
+ ```bash
+ bon annotate <deployment-id> --data '{"object": "note about why this changed"}'
+ ```
+
+ Annotations are visible in the schema catalog and help teammates understand why changes were made.
+
  ## Deploy Flow

  ```
- bon deploy
+ bon deploy -m "message"

  ├── 1. bon validate (must pass)

@@ -61,7 +121,9 @@ bon deploy
  │ - views from bonnard/views/
  │ - datasource configs

- └── 4. Activate deployment
+ ├── 4. Detect changes vs. previous deployment
+
+ └── 5. Activate deployment
  ```

  ## Error Handling
@@ -111,13 +173,16 @@ Run: bon login
  - **Replaces** previous deployment (not additive)
  - **All or nothing** — partial deploys don't happen
  - **Instant** — changes take effect immediately
+ - **Versioned** — every deployment is tracked with changes

  ## Best Practices

- 1. **Validate first** — run `bon validate` before deploy
- 2. **Test locally** — verify queries work before deploying
- 3. **Use version control** — commit cubes and views before deploying
- 4. **Monitor after deploy** — check for query errors
+ 1. **Always include a meaningful message** — helps teammates understand what changed
+ 2. **Validate first** — run `bon validate` before deploy
+ 3. **Test locally** — verify queries work before deploying
+ 4. **Use version control** — commit cubes and views before deploying
+ 5. **Review after deploy** — use `bon diff` to check for unintended breaking changes
+ 6. **Annotate breaking changes** — add context so consumers know what to update

  ## See Also

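Reviewer's note: the `bon annotate <id> --data '{...}'` form added above takes raw JSON on the command line, which is easy to mis-quote from a script. A small helper can serialize and shell-quote the payload in one step. The helper itself is hypothetical (only the CLI shape comes from the docs in this diff), and `dep_123` is a placeholder id:

```python
import json
import shlex


def annotate_command(deployment_id: str, notes: dict) -> str:
    """Build a `bon annotate` invocation with a safely quoted --data payload.

    Hypothetical helper: the CLI shape (bon annotate <id> --data '{...}')
    is taken from the docs above; the function and its name are not part
    of the package. json.dumps handles quotes/unicode inside the notes.
    """
    payload = json.dumps(notes)
    return f"bon annotate {shlex.quote(deployment_id)} --data {shlex.quote(payload)}"


cmd = annotate_command(
    "dep_123",  # placeholder deployment id
    {"orders.count": "type changed from count to sum to include quantities"},
)
print(cmd)
```

Splitting the result back with `shlex.split` recovers the original JSON intact, which is the property hand-built quoting tends to lose.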
package/dist/docs/topics/workflow.mcp.md CHANGED
@@ -1,6 +1,6 @@
- # workflow.mcp
+ # MCP

- > Connect AI agents to your semantic layer via MCP.
+ > Connect AI agents like Claude, ChatGPT, and Cursor to your semantic layer using the Model Context Protocol. MCP gives agents governed access to your metrics and dimensions.

  ## Overview

@@ -18,37 +18,39 @@ https://mcp.bonnard.dev/mcp

  ### Claude Desktop

- 1. Open **Settings > Connectors**
+ 1. Click the **+** button in the chat input, then select **Connectors > Manage connectors**
+
+ ![Claude Desktop — Connectors menu in chat](/images/claude-chat-connectors.png)
+
  2. Click **Add custom connector**
  3. Enter a name (e.g. "Bonnard MCP") and the MCP URL: `https://mcp.bonnard.dev/mcp`
  4. Click **Add**

- ![Claude Desktop — Settings > Connectors](/images/claude-connectors.png)
-
  ![Claude Desktop — Add custom connector dialog](/images/claude-add-connector.png)

- Once added, enable the Bonnard connector in any chat via the **Connectors** menu:
-
- ![Claude Desktop — Bonnard MCP enabled in chat](/images/claude-chat-connectors.png)
+ Once added, enable the Bonnard connector in any chat via the **Connectors** menu.

  Remote MCP servers in Claude Desktop must be added through the Connectors UI, not the JSON config file.

  ### ChatGPT

- 1. Open **Settings > Apps**
- 2. Click **Advanced settings** and enable **Developer mode**
+ Custom MCP servers must be added in the browser at [chatgpt.com](https://chatgpt.com); the desktop app does not support this.
+
+ 1. Go to [chatgpt.com](https://chatgpt.com) in your browser
+ 2. Open **Settings > Apps**

  ![ChatGPT — Settings > Apps](/images/chatgpt-apps.png)

- ![ChatGPT Enable Developer mode](/images/chatgpt-developer-mode.png)
+ 3. Click **Advanced settings**, enable **Developer mode**, then click **Create app**
+
+ ![ChatGPT — Advanced settings with Developer mode and Create app](/images/advanced-create-app-chatgpt.png)

- 3. Click **Create app**
  4. Enter a name (e.g. "Bonnard MCP"), the MCP URL `https://mcp.bonnard.dev/mcp`, and select **OAuth** for authentication
  5. Check the acknowledgement box and click **Create**

  ![ChatGPT — Create new app with MCP URL](/images/chatgpt-new-app.png)

- Once created, the Bonnard connector appears in the **More** menu in any chat:
+ Once created, the Bonnard connector appears under **Enabled apps**:

  ![ChatGPT — Bonnard MCP available in chat](/images/chatgpt-chat-apps.png)

@@ -68,6 +70,8 @@ Open **Settings > MCP** and add the server URL, or add to `.cursor/mcp.json` in
  }
  ```

+ On first use, your browser will open to sign in and authorize the connection.
+
  ### VS Code / GitHub Copilot

  Open the Command Palette and run **MCP: Add Server**, or add to `.vscode/mcp.json` in your project:
@@ -83,6 +87,8 @@ Open the Command Palette and run **MCP: Add Server**, or add to `.vscode/mcp.jso
  }
  ```

+ On first use, your browser will open to sign in and authorize the connection.
+
  ### Claude Code

  Run in your terminal:
package/dist/docs/topics/workflow.md CHANGED
@@ -1,6 +1,6 @@
- # workflow
+ # Workflow

- > Development workflow for building and deploying cubes and views.
+ > The end-to-end workflow for building a semantic layer with Bonnard: validate your YAML definitions locally, deploy to the platform, and query your metrics via API or MCP.

  ## Overview

@@ -21,7 +21,7 @@ bon datasource add
  bon validate

  # 5. Deploy to Bonnard
- bon deploy
+ bon deploy -m "Initial semantic layer"
  ```

  ## Project Structure
@@ -81,9 +81,8 @@ views:

  Check for syntax errors and test connections:

- ```yaml
+ ```bash
  bon validate
- bon validate --test-connection
  ```

  ### 4. Deploy
@@ -91,7 +90,17 @@ bon validate --test-connection
  Push cubes and views to Bonnard:

  ```bash
- bon deploy
+ bon deploy -m "Add orders cube and overview view"
+ ```
+
+ ### 5. Review
+
+ Check what changed in your deployment:
+
+ ```bash
+ bon deployments # List recent deployments
+ bon diff <deployment-id> # View all changes
+ bon diff <deployment-id> --breaking # Breaking changes only
  ```

  ## File Organization
@@ -134,10 +143,18 @@ bonnard/cubes/
  |---------|-------------|
  | `bon init` | Create project structure |
  | `bon datasource add` | Add a data source |
+ | `bon datasource add --demo` | Add demo dataset (no warehouse needed) |
+ | `bon datasource add --from-dbt` | Import from dbt profiles |
  | `bon datasource list` | List configured sources |
- | `bon datasource test <name>` | Test connection |
+ | `bon datasource test <name>` | Test connection (requires login) |
  | `bon validate` | Check cube and view syntax |
- | `bon deploy` | Deploy to Bonnard |
+ | `bon deploy -m "message"` | Deploy to Bonnard (message required) |
+ | `bon deploy --ci` | Non-interactive deploy |
+ | `bon deployments` | List deployment history |
+ | `bon diff <id>` | View changes in a deployment |
+ | `bon annotate <id>` | Add context to deployment changes |
+ | `bon query '{...}'` | Query the semantic layer |
+ | `bon mcp` | MCP setup instructions for AI agents |
  | `bon docs` | Browse documentation |

  ## See Also
package/dist/docs/topics/workflow.query.md CHANGED
@@ -1,6 +1,6 @@
- # workflow.query
+ # Query

- > Query the deployed semantic layer using JSON or SQL.
+ > Query your deployed semantic layer using the Bonnard REST API. Send JSON query objects or SQL strings to retrieve measures and dimensions with filtering, grouping, and time ranges.

  ## Overview

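Reviewer's note: the command tables in this diff add `bon query '{...}'`, which takes a JSON query object on the command line. When scripting it, serializing with a JSON library and shell-quoting once avoids the usual escaping mistakes. In this sketch the `measures`/`dimensions` field names are an assumption drawn from the vocabulary these docs use, not a confirmed request schema, and the member names are placeholders:

```python
import json
import shlex

# Hypothetical query body: field and member names are assumptions from
# the docs' cube/measure vocabulary, not a documented request schema.
query = {
    "measures": ["orders.total_revenue"],
    "dimensions": ["orders.status"],
}

# Serialize once, quote once: the result is safe to paste into a shell.
invocation = f"bon query {shlex.quote(json.dumps(query))}"
print(invocation)
```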
package/dist/docs/topics/workflow.validate.md CHANGED
@@ -1,19 +1,15 @@
- # workflow.validate
+ # Validate

- > Check cubes and views for errors before deploying.
+ > Run validation checks on your cubes and views before deploying to catch YAML syntax errors, missing references, circular joins, and other issues. Use `bon validate` from the CLI.

  ## Overview

- The `bon validate` command checks your YAML cubes and views for syntax errors, schema violations, and optionally tests data source connections. Run this before deploying to catch issues early.
+ The `bon validate` command checks your YAML cubes and views for syntax errors and schema violations. Run this before deploying to catch issues early.

  ## Usage

  ```bash
- # Basic validation
  bon validate
-
- # Also test data source connections
- bon validate --test-connection
  ```

  ## What Gets Validated
@@ -36,12 +32,6 @@ bon validate --test-connection
  - Referenced members exist
  - Join relationships are valid

- ### Connection Testing (--test-connection)
-
- - Data source credentials work
- - Database is accessible
- - Tables/schemas exist
-
  ## Example Output

  ### Success
@@ -73,17 +63,6 @@ bonnard/cubes/orders.yaml:15:5
  1 error found.
  ```

- ### Connection Warnings
-
- ```
- ✓ Validating YAML syntax...
- ✓ All cubes and views valid.
-
- ⚠ Testing connections...
- ⚠ datasource "analytics": Connection timed out
- (This won't block deploy, but queries may fail)
- ```
-
  ## Common Errors

  ### Missing Required Field
@@ -125,12 +104,6 @@ measures:
  type: count
  ```

- ## Options
-
- | Option | Description |
- |--------|-------------|
- | `--test-connection` | Also test datasource connections |
-
  ## Exit Codes

  | Code | Meaning |
@@ -143,7 +116,7 @@ measures:
  1. **Run before every deploy** — `bon validate && bon deploy`
  2. **Add to CI/CD** — validate on pull requests
  3. **Fix errors first** — don't deploy with validation errors
- 4. **Test connections periodically** — catch credential issues early
+ 4. **Test connections** — use `bon datasource test <name>` to check connectivity

  ## See Also

package/dist/templates/claude/skills/bonnard-get-started/SKILL.md CHANGED
@@ -41,25 +41,12 @@ If the test fails, common issues:

  ## Phase 2: Explore the Data

- Use `bon preview` to understand what tables and columns are available.
- **Always run this before creating cubes** — use the results to decide which
- tables to model and what columns to expose.
+ Before creating cubes, understand what tables and columns are available in your warehouse.

- ```bash
- # List tables use the schema from the datasource config
- # For demo data (contoso schema):
- bon preview contoso_demo "SELECT table_name FROM information_schema.tables WHERE table_schema = 'contoso'"
-
- # For user's own data (typically public schema):
- bon preview <datasource> "SELECT table_name FROM information_schema.tables WHERE table_schema = 'public'"
-
- # Snowflake:
- bon preview <datasource> "SHOW TABLES"
-
- # Then sample the key tables to see columns and data:
- bon preview contoso_demo "SELECT * FROM contoso.fact_sales" --limit 10
- bon preview contoso_demo "SELECT * FROM contoso.dim_product" --limit 10
- ```
+ **Options for exploring your data:**
+ - Use your database IDE or CLI (e.g., `psql`, Snowflake web UI, BigQuery console) to browse tables
+ - Check your dbt docs or existing documentation for table schemas
+ - For the demo dataset, the tables are: `contoso.fact_sales`, `contoso.dim_product`, `contoso.dim_store`, `contoso.dim_customer`

  Note the table names, column names, and data types — you'll use these in Phase 3.

@@ -157,11 +144,9 @@ Fix any errors before proceeding. Common issues:
  - Unknown measure/dimension types (e.g., `text` should be `string`)
  - Bad YAML indentation

- Optionally test the datasource connection too:
-
- ```bash
- bon validate --test-connection
- ```
+ Validate also warns about:
+ - **Missing descriptions** — descriptions help AI agents and analysts discover metrics
+ - **Missing `data_source`** — cubes without an explicit `data_source` use the default warehouse, which can cause issues when multiple warehouses are configured

  ## Phase 6: Deploy

@@ -169,11 +154,16 @@ Log in (if not already) and deploy:

  ```bash
  bon login
- bon deploy
+ bon deploy -m "Initial semantic layer with sales cube and overview view"
  ```

- Deploy validates, tests connections, uploads cubes/views, and syncs datasource
- credentials (encrypted) to Bonnard.
+ Deploy requires a `-m` message describing the changes. It validates, tests
+ connections, uploads cubes/views, and syncs datasource credentials (encrypted)
+ to Bonnard.
+
+ After deploying, the output shows what changed (added/modified/removed) and
+ flags any breaking changes. Use `bon deployments` to see history and
+ `bon diff <id>` to review changes from any deployment.

  ## Phase 7: Test with a Query

package/dist/templates/cursor/rules/bonnard-get-started.mdc CHANGED
@@ -40,25 +40,12 @@ If the test fails, common issues:

  ## Phase 2: Explore the Data

- Use `bon preview` to understand what tables and columns are available.
- **Always run this before creating cubes** — use the results to decide which
- tables to model and what columns to expose.
+ Before creating cubes, understand what tables and columns are available in your warehouse.

- ```bash
- # List tables use the schema from the datasource config
- # For demo data (contoso schema):
- bon preview contoso_demo "SELECT table_name FROM information_schema.tables WHERE table_schema = 'contoso'"
-
- # For user's own data (typically public schema):
- bon preview <datasource> "SELECT table_name FROM information_schema.tables WHERE table_schema = 'public'"
-
- # Snowflake:
- bon preview <datasource> "SHOW TABLES"
-
- # Then sample the key tables to see columns and data:
- bon preview contoso_demo "SELECT * FROM contoso.fact_sales" --limit 10
- bon preview contoso_demo "SELECT * FROM contoso.dim_product" --limit 10
- ```
+ **Options for exploring your data:**
+ - Use your database IDE or CLI (e.g., `psql`, Snowflake web UI, BigQuery console) to browse tables
+ - Check your dbt docs or existing documentation for table schemas
+ - For the demo dataset, the tables are: `contoso.fact_sales`, `contoso.dim_product`, `contoso.dim_store`, `contoso.dim_customer`

  Note the table names, column names, and data types — you'll use these in Phase 3.

@@ -156,11 +143,9 @@ Fix any errors before proceeding. Common issues:
  - Unknown measure/dimension types (e.g., `text` should be `string`)
  - Bad YAML indentation

- Optionally test the datasource connection too:
-
- ```bash
- bon validate --test-connection
- ```
+ Validate also warns about:
+ - **Missing descriptions** — descriptions help AI agents and analysts discover metrics
+ - **Missing `data_source`** — cubes without an explicit `data_source` use the default warehouse, which can cause issues when multiple warehouses are configured

  ## Phase 6: Deploy

@@ -168,11 +153,16 @@ Log in (if not already) and deploy:

  ```bash
  bon login
- bon deploy
+ bon deploy -m "Initial semantic layer with sales cube and overview view"
  ```

- Deploy validates, tests connections, uploads cubes/views, and syncs datasource
- credentials (encrypted) to Bonnard.
+ Deploy requires a `-m` message describing the changes. It validates, tests
+ connections, uploads cubes/views, and syncs datasource credentials (encrypted)
+ to Bonnard.
+
+ After deploying, the output shows what changed (added/modified/removed) and
+ flags any breaking changes. Use `bon deployments` to see history and
+ `bon diff <id>` to review changes from any deployment.

  ## Phase 7: Test with a Query

package/dist/templates/shared/bonnard.md CHANGED
@@ -63,10 +63,15 @@ All tables are in the `contoso` schema. The datasource is named `contoso_demo`.
  | `bon datasource add` | Add warehouse connection |
  | `bon datasource add --demo` | Add demo dataset (no warehouse needed) |
  | `bon datasource add --from-dbt` | Import from dbt profiles |
- | `bon datasource test <name>` | Test connection |
- | `bon validate` | Validate YAML syntax |
- | `bon validate --test-connection` | Validate + test connections |
- | `bon deploy` | Deploy to Bonnard (requires login) |
+ | `bon datasource test <name>` | Test connection (requires login) |
+ | `bon validate` | Validate YAML syntax, warn on missing descriptions and `data_source` |
+ | `bon deploy -m "message"` | Deploy to Bonnard (requires login, message required) |
+ | `bon deploy --ci` | Non-interactive deploy (fails on missing datasources) |
+ | `bon deployments` | List recent deployments (add `--all` for full history) |
+ | `bon diff <deployment-id>` | Show changes in a deployment (`--breaking` for breaking only) |
+ | `bon annotate <deployment-id>` | Add reasoning/context to deployment changes |
+ | `bon query '{...}'` | Execute a semantic layer query (JSON or `--sql` format) |
+ | `bon mcp` | Show MCP setup instructions for AI agents |
  | `bon docs` | Browse documentation |

  ## Learning YAML Syntax
@@ -92,7 +97,27 @@ Topics follow dot notation (e.g., `cubes.dimensions.time`). Use `--recursive` to
  1. **Setup datasource** — `bon datasource add --from-dbt` or manual
  2. **Create cubes** — Define measures/dimensions in `bonnard/cubes/*.yaml`
  3. **Create views** — Compose cubes in `bonnard/views/*.yaml`
- 4. **Validate** — `bon validate --test-connection`
- 5. **Deploy** — `bon login` then `bon deploy`
+ 4. **Validate** — `bon validate`
+ 5. **Deploy** — `bon login` then `bon deploy -m "description of changes"`
+ 6. **Review** — `bon deployments` to list, `bon diff <id>` to inspect changes

  For a guided walkthrough: `/bonnard-get-started`
+
+ ## Deployment & Change Tracking
+
+ Every deploy creates a versioned deployment with change detection:
+
+ - **Deploy** requires a message: `bon deploy -m "Add revenue metrics"`
+ - **Changes** are detected automatically: added, modified, removed fields
+ - **Breaking changes** (removed measures/dimensions, type changes) are flagged
+ - **Deployment history**: `bon deployments` lists recent deploys with IDs
+ - **Diff**: `bon diff <id>` shows all changes; `bon diff <id> --breaking` filters to breaking only
+ - **Annotate**: `bon annotate <id> --data '{"object": "note"}'` adds context to changes
+
+ For CI/CD pipelines, use `bon deploy --ci -m "message"` (non-interactive, fails on issues) or `bon deploy --push-datasources -m "message"` to auto-push missing datasources.
+
+ ## Best Practices
+
+ - **Always set `data_source`** on cubes — without it, cubes silently use the default warehouse, which breaks when multiple warehouses are added later. `bon validate` warns about this.
+ - **Add descriptions** to all cubes, views, measures, and dimensions — these power AI agent discovery and the schema catalog.
+ - **Use `sql_table` with full schema path** (e.g., `schema.table_name`) for clarity.
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@bonnard/cli",
- "version": "0.1.13",
+ "version": "0.2.1",
  "type": "module",
  "bin": {
  "bon": "./dist/bin/bon.mjs"
@@ -18,16 +18,12 @@
  "@toon-format/toon": "^2.1.0",
  "commander": "^12.0.0",
  "open": "^11.0.0",
- "@cubejs-backend/schema-compiler": "^1.6.7",
- "@cubejs-backend/shared": "^1.6.7",
  "picocolors": "^1.0.0",
- "yaml": "^2.8.0"
- },
- "optionalDependencies": {
- "snowflake-sdk": "^2.3.3",
- "pg": "^8.18.0"
+ "yaml": "^2.8.0",
+ "zod": "^4.0.0"
  },
  "devDependencies": {
+ "@types/node": "^20.0.0",
  "tsdown": "^0.20.1",
  "vitest": "^2.0.0"
  },