@bonnard/cli 0.2.11 → 0.2.13

package/README.md CHANGED
@@ -1,87 +1,202 @@
1
- # @bonnard/cli
1
+ <p align="center">
2
+ <a href="https://www.bonnard.dev">
3
+ <picture>
4
+ <source media="(prefers-color-scheme: dark)" srcset="./assets/banner-dark.png" />
5
+ <source media="(prefers-color-scheme: light)" srcset="./assets/banner-light.png" />
6
+ <img alt="Bonnard -the semantic engine for MCP clients, AI agents, and data teams" src="./assets/banner-light.png" width="100%" />
7
+ </picture>
8
+ </a>
9
+ </p>
2
10
 
3
- The Bonnard CLI (`bon`) takes you from zero to a deployed semantic layer in minutes. Define metrics in YAML, validate locally, deploy, and query — from your terminal or AI coding agent.
11
+ <p align="center">
12
+ <strong>The semantic engine for MCP clients. Define metrics once, query from anywhere.</strong>
13
+ </p>
4
14
 
5
- **Open source** — [view source on GitHub](https://github.com/meal-inc/bonnard-cli)
15
+ <p align="center">
16
+ <a href="https://www.npmjs.com/package/@bonnard/cli"><img src="https://img.shields.io/npm/v/@bonnard/cli?style=flat-square&color=0891b2" alt="npm version" /></a>
17
+ <a href="https://github.com/meal-inc/bonnard-cli/blob/main/LICENSE"><img src="https://img.shields.io/github/license/meal-inc/bonnard-cli?style=flat-square" alt="MIT License" /></a>
18
+ <a href="https://discord.com/invite/RQuvjGRz"><img src="https://img.shields.io/badge/Discord-Join%20us-5865F2?style=flat-square&logo=discord&logoColor=white" alt="Discord" /></a>
19
+ </p>
6
20
 
7
- ## Quick start
21
+ <p align="center">
22
+ <a href="https://docs.bonnard.dev/docs/">Docs</a> &middot;
23
+ <a href="https://docs.bonnard.dev/docs/getting-started">Getting Started</a> &middot;
24
+ <a href="https://docs.bonnard.dev/docs/changelog">Changelog</a> &middot;
25
+ <a href="https://discord.com/invite/RQuvjGRz">Discord</a> &middot;
26
+ <a href="https://www.bonnard.dev">Website</a>
27
+ </p>
28
+
29
+ ---
30
+
31
+ Bonnard is an agent-native semantic layer CLI. Deploy an MCP server and governed analytics API in minutes – for AI agents, BI tools, and data teams. Define metrics and dimensions in YAML, validate locally, and ship to production. Works with Snowflake, BigQuery, Databricks, and PostgreSQL. Ships with native integrations for Claude Code, Cursor, and Codex. Built with TypeScript.
32
+
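For orientation, a cube definition under `bonnard/cubes/` might look roughly like the sketch below. It uses Cube-style YAML with made-up table and column names; run `bon docs schema cube` for the authoritative schema.

```yaml
# bonnard/cubes/orders.yaml (illustrative sketch; table and column names are assumptions)
cubes:
  - name: orders
    sql_table: public.orders        # assumed source table
    measures:
      - name: count
        type: count
      - name: total_revenue
        sql: amount                  # assumed column
        type: sum
    dimensions:
      - name: status
        sql: status
        type: string
      - name: created_at
        sql: created_at
        type: time
```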
33
+ ## Why Bonnard?
34
+
35
+ Most semantic layers were built for dashboards and retrofitted for AI. Bonnard was built the other way around – agent-native from day one with Model Context Protocol (MCP) as a core feature, not a plugin. One CLI takes you from an empty directory to a production semantic layer serving AI agents, BI tools, and human analysts through a single governed API.
36
+
37
+ <p align="center">
38
+ <img src="./assets/architecture.png" alt="Bonnard architecture -data sources flow through the semantic layer to AI agents, BI tools, and MCP clients" width="100%" />
39
+ </p>
40
+
41
+ ## Quick Start
42
+
43
+ No install required. Run directly with npx:
8
44
 
9
45
  ```bash
10
- npx @bonnard/cli init # Create project structure + agent templates
11
- bon datasource add --demo # Add demo dataset (no warehouse needed)
12
- bon validate # Check syntax
13
- bon login # Authenticate with Bonnard
14
- bon deploy -m "Initial deploy" # Deploy to Bonnard
46
+ npx @bonnard/cli init
15
47
  ```
16
48
 
17
- No install needed — `npx` runs the CLI directly. Or install globally for shorter commands:
49
+ Or install globally:
18
50
 
19
51
  ```bash
20
52
  npm install -g @bonnard/cli
21
53
  ```
22
54
 
55
+ Then follow the setup flow:
56
+
57
+ ```bash
58
+ bon init # Scaffold project + agent configs
59
+ bon datasource add # Connect your warehouse
60
+ bon validate # Check your models locally
61
+ bon login # Authenticate
62
+ bon deploy # Ship it
63
+ ```
64
+
65
+ No warehouse yet? Start exploring with a full retail demo dataset:
66
+
67
+ ```bash
68
+ bon datasource add --demo
69
+ ```
70
+
23
71
  Requires Node.js 20+.
24
72
 
25
- ## Commands
73
+ ## Agent-Native from Day One
26
74
 
27
- | Command | Description |
28
- |---------|-------------|
29
- | `bon init` | Create project structure and AI agent templates |
30
- | `bon login` | Authenticate with Bonnard |
31
- | `bon logout` | Remove stored credentials |
32
- | `bon whoami` | Show current login status |
33
- | `bon datasource add` | Add a data source (interactive) |
34
- | `bon datasource add --demo` | Add read-only demo dataset |
35
- | `bon datasource add --from-dbt` | Import from dbt profiles |
36
- | `bon datasource list` | List configured data sources |
37
- | `bon datasource remove <name>` | Remove a data source |
38
- | `bon validate` | Validate cube and view YAML |
39
- | `bon deploy -m "message"` | Deploy to Bonnard |
40
- | `bon deployments` | List deployment history |
41
- | `bon diff <id>` | View changes in a deployment |
42
- | `bon annotate <id>` | Add context to deployment changes |
43
- | `bon query '{"measures":["orders.count"]}'` | Query the semantic layer (JSON) |
44
- | `bon query "SELECT ..." --sql` | Query the semantic layer (SQL) |
45
- | `bon mcp` | MCP setup instructions for AI agents |
46
- | `bon mcp test` | Test MCP server connectivity |
47
- | `bon docs [topic]` | Browse modeling documentation |
48
- | `bon docs --search "joins"` | Search documentation |
75
+ When you run `bon init`, Bonnard generates context files so AI coding agents understand your semantic layer from the first prompt:
49
76
 
50
- ## Agent-ready from the start
77
+ ```
78
+ you@work my-project % bon init
79
+
80
+ Initialised Bonnard project
81
+ Core files:
82
+ bon.yaml
83
+ bonnard/cubes/
84
+ bonnard/views/
85
+ Agent support:
86
+ .claude/rules/bonnard.md
87
+ .claude/skills/bonnard-get-started/
88
+ .cursor/rules/bonnard.mdc
89
+ AGENTS.md
90
+ ```
51
91
 
52
- `bon init` generates context files for your AI coding tools:
92
+ | Agent | What gets generated |
93
+ | --- | --- |
94
+ | **Claude Code** | `.claude/rules/bonnard.md` + skill templates in `.claude/skills/` |
95
+ | **Cursor** | `.cursor/rules/bonnard.mdc` with frontmatter configuration |
96
+ | **Codex** | `AGENTS.md` + skills directory |
53
97
 
54
- - **Claude Code** `.claude/rules/` + get-started skill
55
- - **Cursor** — `.cursor/rules/` with auto-apply frontmatter
56
- - **Codex** — `AGENTS.md` + skills folder
98
+ Set up your MCP server so agents can query your semantic layer directly:
57
99
 
58
- Your agent understands Bonnard's modeling language from the first prompt.
100
+ ```bash
101
+ bon mcp          # Show MCP connection info and setup instructions
102
+ bon mcp test # Verify the connection
103
+ ```
59
104
 
60
- ## Project structure
105
+ ## Auto-Detected from Your Project
61
106
 
62
- After `bon init`:
107
+ <p align="center">
108
+ <img src="./assets/datasources.png" alt="Auto-detected warehouses and data tools -Snowflake, BigQuery, PostgreSQL, Databricks, DuckDB, dbt, Dagster, Prefect, Airflow, Looker, Cube, Evidence, SQLMesh, Soda, Great Expectations" width="100%" />
109
+ </p>
110
+
111
+ Bonnard automatically detects your warehouses and data tools. Point it at your project and it discovers schemas, tables, and relationships.
112
+
113
+ - **Snowflake** – full support including Snowpark
114
+ - **Google BigQuery** – native integration
115
+ - **Databricks** – SQL warehouses and Unity Catalog
116
+ - **PostgreSQL** – including cloud-hosted variants (Supabase, Neon, RDS)
117
+ - **DuckDB** – local development and testing
118
+ - **dbt** – model and profile import (see the profiles.yml sketch below)
119
+ - **Dagster, Prefect, Airflow** – orchestration tools
120
+ - **Looker, Cube, Evidence** – existing BI layers
121
+ - **SQLMesh, Soda, Great Expectations** – data quality and transformation
122
+
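For the dbt path specifically, `bon datasource add --from-dbt` imports connections from a standard dbt `profiles.yml`. A hypothetical DuckDB target it could pick up (profile name and paths are made up; DuckDB support in the dbt import is new in this release):

```yaml
# ~/.dbt/profiles.yml (hypothetical entry)
analytics:
  target: dev
  outputs:
    dev:
      type: duckdb
      path: ./data/dev.duckdb      # or md:my_db for MotherDuck
      schema: main
```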
123
+ ## Querying
124
+
125
+ Query your semantic layer from the terminal using JSON or SQL syntax:
126
+
127
+ ```bash
128
+ # JSON query
129
+ bon query '{"measures":["orders.total_revenue"],"dimensions":["orders.category"]}'
130
+
131
+ # SQL query
132
+ bon query --sql "SELECT product_category, MEASURE(revenue) FROM orders GROUP BY 1"
133
+ ```
134
+
135
+ Agents connected via MCP can run the same queries programmatically, with full access to your governed metric definitions.
136
+
137
+ ## Project Structure
63
138
 
64
139
  ```
65
140
  my-project/
66
141
  ├── bon.yaml # Project configuration
67
142
  ├── bonnard/
68
- │ ├── cubes/ # Cube definitions (measures, dimensions, joins)
69
- │ └── views/ # View definitions (curated query interfaces)
70
- └── .bon/ # Local config (gitignored)
71
- └── datasources.yaml # Data source credentials
143
+ │ ├── cubes/ # Metric and dimension definitions
144
+ │ └── views/ # Curated query interfaces
145
+ ├── .bon/ # Local credentials (gitignored)
146
+ ├── .claude/ # Claude Code agent context
147
+ ├── .cursor/ # Cursor agent context
148
+ └── AGENTS.md # Codex agent context
72
149
  ```
73
150
 
74
151
  ## CI/CD
75
152
 
153
+ Deploy from your pipeline with the `--ci` flag for non-interactive mode:
154
+
76
155
  ```bash
77
- bon deploy --ci -m "CI deploy"
156
+ bon deploy --ci
78
157
  ```
79
158
 
80
- Non-interactive mode for pipelines. Datasources are synced automatically.
159
+ The `--ci` flag syncs datasources automatically and skips interactive prompts, so it fits into GitHub Actions, GitLab CI, or any pipeline that runs Node.js.
160
+
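For GitHub Actions specifically, a minimal workflow sketch (workflow file name, trigger, and step layout are illustrative; how you provide warehouse credentials and Bonnard authentication in CI is not shown here):

```yaml
# .github/workflows/deploy-semantic-layer.yml (illustrative sketch)
name: Deploy semantic layer
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20                   # CLI requires Node.js 20+
      - run: npx @bonnard/cli validate       # check models before deploying
      - run: npx @bonnard/cli deploy --ci    # non-interactive deploy
```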
161
+ ## Commands
162
+
163
+ | Command | Description |
164
+ | --- | --- |
165
+ | `bon init` | Scaffold a new project with agent configs |
166
+ | `bon datasource add` | Connect a data source (or `--demo` for sample data) |
167
+ | `bon datasource add --from-dbt` | Import from dbt profiles |
168
+ | `bon datasource list` | List connected data sources |
169
+ | `bon validate` | Validate models locally before deploying |
170
+ | `bon deploy` | Deploy semantic layer to production |
171
+ | `bon deployments` | List deployment history |
172
+ | `bon diff` | View the changes in a deployment |
173
+ | `bon annotate` | Annotate deployment changes with context |
174
+ | `bon query` | Run queries from the terminal (JSON or SQL) |
175
+ | `bon mcp` | Show MCP connection info and setup instructions |
176
+ | `bon mcp test` | Test MCP connection |
177
+ | `bon docs` | Browse or search documentation from the CLI |
178
+ | `bon login` / `bon logout` | Manage authentication |
179
+ | `bon whoami` | Check current session |
180
+
181
+ For the full CLI reference, see the [documentation](https://docs.bonnard.dev/docs/cli-reference).
81
182
 
82
183
  ## Documentation
83
184
 
84
- - [Getting Started](https://docs.bonnard.dev/docs/getting-started)
85
- - [CLI Reference](https://docs.bonnard.dev/docs/cli)
86
- - [Modeling Guide](https://docs.bonnard.dev/docs/modeling/cubes)
87
- - [Querying](https://docs.bonnard.dev/docs/querying)
185
+ - [Getting Started](https://docs.bonnard.dev/docs/getting-started) – from zero to deployed in minutes
186
+ - [CLI Reference](https://docs.bonnard.dev/docs/cli-reference) – every command, flag, and option
187
+ - [Modeling Guide](https://docs.bonnard.dev/docs/modeling) – cubes, views, metrics, and dimensions
188
+ - [Querying](https://docs.bonnard.dev/docs/querying) – JSON and SQL query syntax
189
+ - [Changelog](https://docs.bonnard.dev/docs/changelog) – what shipped and when
190
+
191
+ ## Community
192
+
193
+ - [Discord](https://discord.com/invite/RQuvjGRz) – ask questions, share feedback, connect with the team
194
+ - [GitHub Issues](https://github.com/meal-inc/bonnard-cli/issues) – bug reports and feature requests
195
+ - [LinkedIn](https://www.linkedin.com/company/bonnarddev/) – follow for updates
196
+ - [Website](https://www.bonnard.dev) – learn more about Bonnard
197
+
198
+ Contributions are welcome. If you find a bug or have an idea, open an issue or submit a pull request.
199
+
200
+ ## License
201
+
202
+ [MIT](./LICENSE)
package/dist/bin/bon.mjs CHANGED
@@ -1,6 +1,6 @@
1
1
  #!/usr/bin/env node
2
2
  import { n as getProjectPaths, t as BONNARD_DIR } from "./project-Dj085D_B.mjs";
3
- import { a as loadCredentials, i as clearCredentials, n as get, o as saveCredentials, r as post } from "./api-DqgY-30K.mjs";
3
+ import { a as loadCredentials, i as clearCredentials, n as get, o as saveCredentials, r as post, t as del } from "./api-DqgY-30K.mjs";
4
4
  import { i as ensureBonDir, n as addLocalDatasource, o as loadLocalDatasources, r as datasourceExists, s as removeLocalDatasource, t as isDatasourcesTrackedByGit } from "./local-ByvuW3eV.mjs";
5
5
  import { createRequire } from "node:module";
6
6
  import { program } from "commander";
@@ -117,7 +117,8 @@ function mapDbtType(dbtType) {
117
117
  postgresql: "postgres",
118
118
  redshift: "redshift",
119
119
  bigquery: "bigquery",
120
- databricks: "databricks"
120
+ databricks: "databricks",
121
+ duckdb: "duckdb"
121
122
  }[dbtType.toLowerCase()] ?? null;
122
123
  }
123
124
  /**
@@ -190,6 +191,8 @@ const PYTHON_PACKAGES = {
190
191
  "dbt-postgres": "dbt",
191
192
  "dbt-bigquery": "dbt",
192
193
  "dbt-databricks": "dbt",
194
+ "dbt-duckdb": "dbt",
195
+ duckdb: "duckdb",
193
196
  dagster: "dagster",
194
197
  sqlmesh: "sqlmesh",
195
198
  "apache-airflow": "airflow",
@@ -374,7 +377,8 @@ function extractWarehouseFromEnv(cwd) {
374
377
  postgres: "postgres",
375
378
  redshift: "redshift",
376
379
  bigquery: "bigquery",
377
- databricks: "databricks"
380
+ databricks: "databricks",
381
+ duckdb: "duckdb"
378
382
  }[cubeDbType[1].trim().toLowerCase()];
379
383
  if (type) return {
380
384
  type,
@@ -395,6 +399,11 @@ function extractWarehouseFromEnv(cwd) {
395
399
  source: "env",
396
400
  config: {}
397
401
  };
402
+ if (content.match(/^MOTHERDUCK_TOKEN=/m) || content.match(/^CUBEJS_DB_DUCKDB_DATABASE_PATH=/m)) return {
403
+ type: "duckdb",
404
+ source: "env",
405
+ config: {}
406
+ };
398
407
  } catch {}
399
408
  return null;
400
409
  }
@@ -907,6 +916,19 @@ function mapDatabricks(config) {
907
916
  };
908
917
  }
909
918
  /**
919
+ * Map DuckDB dbt config to Bonnard format
920
+ */
921
+ function mapDuckDB(config) {
922
+ const dbPath = getString(config, "path") || getString(config, "database");
923
+ return {
924
+ config: {
925
+ ...dbPath && { database_path: dbPath },
926
+ ...getString(config, "schema") && { schema: getString(config, "schema") }
927
+ },
928
+ credentials: { ...getString(config, "motherduck_token") && { motherduck_token: getString(config, "motherduck_token") } }
929
+ };
930
+ }
931
+ /**
910
932
  * Map a parsed dbt connection to Bonnard format
911
933
  * Values are copied as-is, including {{ env_var(...) }} patterns
912
934
  */
@@ -926,6 +948,9 @@ function mapDbtConnection(connection) {
926
948
  case "databricks":
927
949
  mapped = mapDatabricks(config);
928
950
  break;
951
+ case "duckdb":
952
+ mapped = mapDuckDB(config);
953
+ break;
929
954
  default: throw new Error(`Unsupported warehouse type: ${type}`);
930
955
  }
931
956
  return { datasource: {
@@ -1120,6 +1145,26 @@ const WAREHOUSE_CONFIGS = [
1120
1145
  secret: true,
1121
1146
  required: true
1122
1147
  }]
1148
+ },
1149
+ {
1150
+ value: "duckdb",
1151
+ label: "DuckDB",
1152
+ configFields: [{
1153
+ name: "database_path",
1154
+ flag: "databasePath",
1155
+ message: "Database path (file path, :memory:, or md:db_name for MotherDuck)",
1156
+ required: true
1157
+ }, {
1158
+ name: "schema",
1159
+ message: "Schema name",
1160
+ default: "main"
1161
+ }],
1162
+ credentialFields: [{
1163
+ name: "motherduck_token",
1164
+ flag: "motherduckToken",
1165
+ message: "MotherDuck token (required for md: paths)",
1166
+ secret: true
1167
+ }]
1123
1168
  }
1124
1169
  ];
1125
1170
  /**
@@ -1135,8 +1180,10 @@ function formatType$1(type) {
1135
1180
  return {
1136
1181
  snowflake: "Snowflake",
1137
1182
  postgres: "Postgres",
1183
+ redshift: "Redshift",
1138
1184
  bigquery: "BigQuery",
1139
- databricks: "Databricks"
1185
+ databricks: "Databricks",
1186
+ duckdb: "DuckDB"
1140
1187
  }[type] || type;
1141
1188
  }
1142
1189
  /**
@@ -1170,7 +1217,7 @@ async function importFromDbt(options) {
1170
1217
  }
1171
1218
  if (connections.length === 0) {
1172
1219
  console.log(pc.yellow("No supported connections found in dbt profiles."));
1173
- console.log(pc.dim("Supported types: snowflake, postgres, bigquery, databricks"));
1220
+ console.log(pc.dim("Supported types: snowflake, postgres, redshift, bigquery, databricks, duckdb"));
1174
1221
  process.exit(0);
1175
1222
  }
1176
1223
  if (typeof options.fromDbt === "string") {
@@ -1286,7 +1333,7 @@ async function addManual(options) {
1286
1333
  const warehouseConfig = WAREHOUSE_CONFIGS.find((w) => w.value === warehouseType);
1287
1334
  if (!warehouseConfig) {
1288
1335
  console.error(pc.red(`Invalid warehouse type: ${warehouseType}`));
1289
- console.log(pc.dim("Valid types: snowflake, postgres, bigquery, databricks"));
1336
+ console.log(pc.dim("Valid types: snowflake, postgres, redshift, bigquery, databricks, duckdb"));
1290
1337
  process.exit(1);
1291
1338
  }
1292
1339
  const config = {};
@@ -1310,6 +1357,7 @@ async function addManual(options) {
1310
1357
  let value;
1311
1358
  if (field.name === "password" && options.passwordEnv) value = envVarRef(options.passwordEnv);
1312
1359
  else if (field.name === "token" && options.tokenEnv) value = envVarRef(options.tokenEnv);
1360
+ else if (field.name === "motherduck_token" && options.motherduckTokenEnv) value = envVarRef(options.motherduckTokenEnv);
1313
1361
  else value = getOptionValue(options, field);
1314
1362
  if (!value && !nonInteractive) if (field.secret) value = await password({ message: field.message + ":" });
1315
1363
  else value = await input({ message: field.message + ":" });
@@ -3694,6 +3742,110 @@ async function metabaseAnalyzeCommand(options) {
3694
3742
  console.log(pc.dim(`Full report: ${outputPath}`));
3695
3743
  }
3696
3744
 
3745
+ //#endregion
3746
+ //#region src/commands/keys/list.ts
3747
+ function formatDate(dateStr) {
3748
+ if (!dateStr) return "—";
3749
+ return new Date(dateStr).toLocaleDateString("en-US", {
3750
+ month: "short",
3751
+ day: "numeric",
3752
+ year: "numeric"
3753
+ });
3754
+ }
3755
+ function printKeyTable(keys, title, description) {
3756
+ console.log(pc.bold(title));
3757
+ console.log(pc.dim(description));
3758
+ console.log();
3759
+ if (keys.length === 0) {
3760
+ console.log(pc.dim(" No keys."));
3761
+ return;
3762
+ }
3763
+ const maxNameLen = Math.max(...keys.map((k) => k.name.length), 4);
3764
+ const maxPrefixLen = Math.max(...keys.map((k) => k.key_prefix.length + 3), 3);
3765
+ const header = ` ${"NAME".padEnd(maxNameLen)} ${"KEY".padEnd(maxPrefixLen)} ${"CREATED".padEnd(14)} LAST USED`;
3766
+ console.log(pc.dim(header));
3767
+ console.log(pc.dim(" " + "─".repeat(header.length - 2)));
3768
+ for (const k of keys) {
3769
+ const name = k.name.padEnd(maxNameLen);
3770
+ const prefix = (k.key_prefix + "...").padEnd(maxPrefixLen);
3771
+ const created = formatDate(k.created_at).padEnd(14);
3772
+ const lastUsed = formatDate(k.last_used_at);
3773
+ console.log(` ${pc.bold(name)} ${pc.dim(prefix)} ${created} ${lastUsed}`);
3774
+ }
3775
+ }
3776
+ async function keysListCommand() {
3777
+ try {
3778
+ const result = await get("/api/web/keys");
3779
+ printKeyTable(result.publishableKeys, "Publishable Keys", "Client-side, read-only access");
3780
+ console.log();
3781
+ printKeyTable(result.secretKeys, "Secret Keys", "Server-side, full access");
3782
+ const total = result.publishableKeys.length + result.secretKeys.length;
3783
+ console.log();
3784
+ console.log(pc.dim(`${total} key${total !== 1 ? "s" : ""} total`));
3785
+ } catch (err) {
3786
+ console.error(pc.red(`Error: ${err.message}`));
3787
+ process.exit(1);
3788
+ }
3789
+ }
3790
+
3791
+ //#endregion
3792
+ //#region src/commands/keys/create.ts
3793
+ async function keysCreateCommand(options) {
3794
+ const { name, type } = options;
3795
+ if (type !== "publishable" && type !== "secret") {
3796
+ console.error(pc.red("Error: --type must be 'publishable' or 'secret'"));
3797
+ process.exit(1);
3798
+ }
3799
+ try {
3800
+ const result = await post("/api/web/keys", {
3801
+ name,
3802
+ type
3803
+ });
3804
+ console.log(pc.green("Key created successfully."));
3805
+ console.log();
3806
+ console.log(pc.bold(" Name: ") + result.name);
3807
+ console.log(pc.bold(" Type: ") + type);
3808
+ console.log(pc.bold(" Key: ") + result.key);
3809
+ console.log();
3810
+ console.log(pc.yellow("⚠ Save this key now — it won't be shown again."));
3811
+ } catch (err) {
3812
+ console.error(pc.red(`Error: ${err.message}`));
3813
+ process.exit(1);
3814
+ }
3815
+ }
3816
+
3817
+ //#endregion
3818
+ //#region src/commands/keys/revoke.ts
3819
+ async function keysRevokeCommand(nameOrPrefix) {
3820
+ try {
3821
+ const result = await get("/api/web/keys");
3822
+ let match;
3823
+ let type;
3824
+ for (const k of result.publishableKeys) if (k.name === nameOrPrefix || k.key_prefix.startsWith(nameOrPrefix)) {
3825
+ match = k;
3826
+ type = "publishable";
3827
+ break;
3828
+ }
3829
+ if (!match) {
3830
+ for (const k of result.secretKeys) if (k.name === nameOrPrefix || k.key_prefix.startsWith(nameOrPrefix)) {
3831
+ match = k;
3832
+ type = "secret";
3833
+ break;
3834
+ }
3835
+ }
3836
+ if (!match || !type) {
3837
+ console.error(pc.red(`No key found matching "${nameOrPrefix}".`));
3838
+ console.error(pc.dim("Use `bon keys list` to see available keys."));
3839
+ process.exit(1);
3840
+ }
3841
+ await del(`/api/web/keys/${match.id}?type=${type}`);
3842
+ console.log(pc.green(`Revoked ${type} key "${match.name}" (${match.key_prefix}...).`));
3843
+ } catch (err) {
3844
+ console.error(pc.red(`Error: ${err.message}`));
3845
+ process.exit(1);
3846
+ }
3847
+ }
3848
+
3697
3849
  //#endregion
3698
3850
  //#region src/bin/bon.ts
3699
3851
  const { version } = createRequire(import.meta.url)("../../package.json");
@@ -3703,7 +3855,7 @@ program.command("login").description("Authenticate with Bonnard via your browser
3703
3855
  program.command("logout").description("Remove stored credentials").action(logoutCommand);
3704
3856
  program.command("whoami").description("Show current login status").option("--verify", "Verify session is still valid with the server").action(whoamiCommand);
3705
3857
  const datasource = program.command("datasource").description("Manage warehouse data source connections");
3706
- datasource.command("add").description("Add a data source to .bon/datasources.yaml. Use --name and --type together for non-interactive mode").option("--demo", "Add a read-only demo datasource (Contoso retail dataset) for testing").option("--from-dbt [profile]", "Import from dbt profiles.yml (optionally specify profile/target)").option("--target <target>", "Target name when using --from-dbt").option("--all", "Import all connections from dbt profiles").option("--default-targets", "Import only default targets from dbt profiles (non-interactive)").option("--name <name>", "Datasource name (required for non-interactive mode)").option("--type <type>", "Warehouse type: snowflake, postgres, bigquery, databricks (required for non-interactive mode)").option("--account <account>", "Snowflake account identifier").option("--database <database>", "Database name").option("--schema <schema>", "Schema name").option("--warehouse <warehouse>", "Warehouse name (Snowflake)").option("--role <role>", "Role (Snowflake)").option("--host <host>", "Host (Postgres)").option("--port <port>", "Port (Postgres, default: 5432)").option("--project-id <projectId>", "GCP Project ID (BigQuery)").option("--dataset <dataset>", "Dataset name (BigQuery)").option("--location <location>", "Location (BigQuery)").option("--hostname <hostname>", "Server hostname (Databricks)").option("--http-path <httpPath>", "HTTP path (Databricks)").option("--catalog <catalog>", "Catalog name (Databricks)").option("--user <user>", "Username").option("--password <password>", "Password (use --password-env for env var reference)").option("--token <token>", "Access token (use --token-env for env var reference)").option("--service-account-json <json>", "Service account JSON (BigQuery)").option("--keyfile <path>", "Path to service account key file (BigQuery)").option("--password-env <varName>", "Env var name for password, stores as {{ env_var('NAME') }}").option("--token-env <varName>", "Env var name for token, stores as {{ env_var('NAME') }}").option("--force", "Overwrite existing datasource without prompting").action(datasourceAddCommand);
3858
+ datasource.command("add").description("Add a data source to .bon/datasources.yaml. Use --name and --type together for non-interactive mode").option("--demo", "Add a read-only demo datasource (Contoso retail dataset) for testing").option("--from-dbt [profile]", "Import from dbt profiles.yml (optionally specify profile/target)").option("--target <target>", "Target name when using --from-dbt").option("--all", "Import all connections from dbt profiles").option("--default-targets", "Import only default targets from dbt profiles (non-interactive)").option("--name <name>", "Datasource name (required for non-interactive mode)").option("--type <type>", "Warehouse type: snowflake, postgres, redshift, bigquery, databricks, duckdb (required for non-interactive mode)").option("--account <account>", "Snowflake account identifier").option("--database <database>", "Database name").option("--schema <schema>", "Schema name").option("--warehouse <warehouse>", "Warehouse name (Snowflake)").option("--role <role>", "Role (Snowflake)").option("--host <host>", "Host (Postgres)").option("--port <port>", "Port (Postgres, default: 5432)").option("--project-id <projectId>", "GCP Project ID (BigQuery)").option("--dataset <dataset>", "Dataset name (BigQuery)").option("--location <location>", "Location (BigQuery)").option("--hostname <hostname>", "Server hostname (Databricks)").option("--http-path <httpPath>", "HTTP path (Databricks)").option("--catalog <catalog>", "Catalog name (Databricks)").option("--database-path <databasePath>", "Database path: file path, :memory:, or md:db_name for MotherDuck (DuckDB)").option("--motherduck-token <token>", "MotherDuck token (DuckDB, for md: paths)").option("--motherduck-token-env <varName>", "Env var name for MotherDuck token, stores as {{ env_var('NAME') }}").option("--user <user>", "Username").option("--password <password>", "Password (use --password-env for env var reference)").option("--token <token>", "Access token (use --token-env for env var reference)").option("--service-account-json <json>", "Service account JSON (BigQuery)").option("--keyfile <path>", "Path to service account key file (BigQuery)").option("--password-env <varName>", "Env var name for password, stores as {{ env_var('NAME') }}").option("--token-env <varName>", "Env var name for token, stores as {{ env_var('NAME') }}").option("--force", "Overwrite existing datasource without prompting").action(datasourceAddCommand);
3707
3859
  datasource.command("list").description("List data sources (shows both local and remote by default)").option("--local", "Show only local data sources from .bon/datasources.yaml").option("--remote", "Show only remote data sources from Bonnard server (requires login)").action(datasourceListCommand);
3708
3860
  datasource.command("remove").description("Remove a data source from .bon/datasources.yaml (local by default)").argument("<name>", "Data source name").option("--remote", "Remove from Bonnard server instead of local (requires login)").action(datasourceRemoveCommand);
3709
3861
  program.command("validate").description("Validate YAML syntax in bonnard/cubes/ and bonnard/views/").action(validateCommand);
@@ -3714,6 +3866,10 @@ program.command("annotate").description("Annotate deployment changes with reason
3714
3866
  program.command("mcp").description("MCP connection info and setup instructions").action(mcpCommand).command("test").description("Test MCP server connectivity").action(mcpTestCommand);
3715
3867
  program.command("query").description("Execute a query against the deployed semantic layer").argument("<query>", "JSON query or SQL (with --sql flag)").option("--sql", "Use SQL API instead of JSON format").option("--limit <limit>", "Max rows to return").option("--format <format>", "Output format: toon or json", "toon").action(cubeQueryCommand);
3716
3868
  program.command("docs").description("Browse documentation for building cubes and views").argument("[topic]", "Topic to display (e.g., cubes, cubes.measures)").option("-r, --recursive", "Show topic and all child topics").option("-s, --search <query>", "Search topics for a keyword").option("-f, --format <format>", "Output format: markdown or json", "markdown").action(docsCommand).command("schema").description("Show JSON schema for a type (cube, view, measure, etc.)").argument("<type>", "Schema type to display").action(docsSchemaCommand);
3869
+ const keys = program.command("keys").description("Manage API keys for the Bonnard SDK");
3870
+ keys.command("list").description("List all API keys for your organization").action(keysListCommand);
3871
+ keys.command("create").description("Create a new API key").requiredOption("--name <name>", "Key name (e.g. 'Production SDK')").requiredOption("--type <type>", "Key type: publishable or secret").action(keysCreateCommand);
3872
+ keys.command("revoke").description("Revoke an API key by name or prefix").argument("<name-or-prefix>", "Key name or key prefix to revoke").action(keysRevokeCommand);
3717
3873
  const metabase = program.command("metabase").description("Connect to and explore Metabase content");
3718
3874
  metabase.command("connect").description("Configure Metabase API connection").option("--url <url>", "Metabase instance URL").option("--api-key <key>", "Metabase API key").option("--force", "Overwrite existing configuration").action(metabaseConnectCommand);
3719
3875
  metabase.command("explore").description("Browse Metabase databases, collections, cards, and dashboards").argument("[resource]", "databases, collections, cards, dashboards, card, dashboard, database, table, collection").argument("[id]", "Resource ID (e.g. card <id>, dashboard <id>, database <id>, table <id>, collection <id>)").action(metabaseExploreCommand);
@@ -23,7 +23,7 @@ Choose the component that best fits your data:
23
23
 
24
24
  - Components are self-closing (`/>`)
25
25
  - `data` uses curly braces: `data={query_name}`
26
- - Other props use quotes: `x="field_name"`
26
+ - Other props use quotes: `x="orders.city"`
27
27
  - Boolean props can be shorthand: `horizontal`
28
28
 
29
29
  ## Component Reference
@@ -33,13 +33,13 @@ Choose the component that best fits your data:
33
33
  Displays a single KPI metric as a large number.
34
34
 
35
35
  ```markdown
36
- <BigValue data={total_revenue} value="total_revenue" title="Revenue" />
36
+ <BigValue data={total_revenue} value="orders.total_revenue" title="Revenue" />
37
37
  ```
38
38
 
39
39
  | Prop | Type | Required | Description |
40
40
  |------|------|----------|-------------|
41
41
  | `data` | query ref | Yes | Query name (should return a single row) |
42
- | `value` | string | Yes | Measure field name to display |
42
+ | `value` | string | Yes | Fully qualified measure field name to display |
43
43
  | `title` | string | No | Label above the value |
44
44
  | `fmt` | string | No | Format preset or Excel code (e.g. `fmt="eur2"`, `fmt="$#,##0.00"`) |
45
45
 
@@ -48,16 +48,16 @@ Displays a single KPI metric as a large number.
48
48
  Renders a line chart, typically for time series. Supports multiple y columns and series splitting.
49
49
 
50
50
  ```markdown
51
- <LineChart data={monthly_revenue} x="created_at" y="total_revenue" title="Revenue Trend" />
52
- <LineChart data={trend} x="date" y="revenue,cases" />
53
- <LineChart data={revenue_by_type} x="created_at" y="total_revenue" series="type" />
51
+ <LineChart data={monthly_revenue} x="orders.created_at" y="orders.total_revenue" title="Revenue Trend" />
52
+ <LineChart data={trend} x="orders.created_at" y="orders.total_revenue,orders.count" />
53
+ <LineChart data={revenue_by_type} x="orders.created_at" y="orders.total_revenue" series="orders.type" />
54
54
  ```
55
55
 
56
56
  | Prop | Type | Required | Description |
57
57
  |------|------|----------|-------------|
58
58
  | `data` | query ref | Yes | Query name |
59
59
  | `x` | string | Yes | Field for x-axis (typically a time dimension) |
60
- | `y` | string | Yes | Field(s) for y-axis. Comma-separated for multiple (e.g. `y="revenue,cases"`) |
60
+ | `y` | string | Yes | Field(s) for y-axis. Comma-separated for multiple (e.g. `y="orders.total_revenue,orders.count"`) |
61
61
  | `title` | string | No | Chart title |
62
62
  | `series` | string | No | Column to split data into separate colored lines |
63
63
  | `type` | string | No | `"stacked"` for stacked lines (default: no stacking) |
@@ -68,17 +68,17 @@ Renders a line chart, typically for time series. Supports multiple y columns and
68
68
  Renders a vertical bar chart. Add `horizontal` for horizontal bars. Supports multi-series with stacked or grouped display.
69
69
 
70
70
  ```markdown
71
- <BarChart data={revenue_by_city} x="city" y="total_revenue" />
72
- <BarChart data={revenue_by_city} x="city" y="total_revenue" horizontal />
73
- <BarChart data={revenue_by_type} x="month" y="total_revenue" series="type" />
74
- <BarChart data={revenue_by_type} x="month" y="total_revenue" series="type" type="grouped" />
71
+ <BarChart data={revenue_by_city} x="orders.city" y="orders.total_revenue" />
72
+ <BarChart data={revenue_by_city} x="orders.city" y="orders.total_revenue" horizontal />
73
+ <BarChart data={revenue_by_type} x="orders.created_at" y="orders.total_revenue" series="orders.type" />
74
+ <BarChart data={revenue_by_type} x="orders.created_at" y="orders.total_revenue" series="orders.type" type="grouped" />
75
75
  ```
76
76
 
77
77
  | Prop | Type | Required | Description |
78
78
  |------|------|----------|-------------|
79
79
  | `data` | query ref | Yes | Query name |
80
80
  | `x` | string | Yes | Field for category axis |
81
- | `y` | string | Yes | Field(s) for value axis. Comma-separated for multiple (e.g. `y="revenue,cases"`) |
81
+ | `y` | string | Yes | Field(s) for value axis. Comma-separated for multiple (e.g. `y="orders.total_revenue,orders.count"`) |
82
82
  | `title` | string | No | Chart title |
83
83
  | `horizontal` | boolean | No | Render as horizontal bar chart |
84
84
  | `series` | string | No | Column to split data into separate colored bars |
@@ -90,15 +90,15 @@ Renders a vertical bar chart. Add `horizontal` for horizontal bars. Supports mul
90
90
  Renders a filled area chart. Supports series splitting and stacked areas.
91
91
 
92
92
  ```markdown
93
- <AreaChart data={monthly_revenue} x="created_at" y="total_revenue" />
94
- <AreaChart data={revenue_by_source} x="created_at" y="total_revenue" series="source" type="stacked" />
93
+ <AreaChart data={monthly_revenue} x="orders.created_at" y="orders.total_revenue" />
94
+ <AreaChart data={revenue_by_source} x="orders.created_at" y="orders.total_revenue" series="orders.source" type="stacked" />
95
95
  ```
96
96
 
97
97
  | Prop | Type | Required | Description |
98
98
  |------|------|----------|-------------|
99
99
  | `data` | query ref | Yes | Query name |
100
100
  | `x` | string | Yes | Field for x-axis |
101
- | `y` | string | Yes | Field(s) for y-axis. Comma-separated for multiple (e.g. `y="revenue,cases"`) |
101
+ | `y` | string | Yes | Field(s) for y-axis. Comma-separated for multiple (e.g. `y="orders.total_revenue,orders.count"`) |
102
102
  | `title` | string | No | Chart title |
103
103
  | `series` | string | No | Column to split data into separate colored areas |
104
104
  | `type` | string | No | `"stacked"` for stacked areas (default: no stacking) |
@@ -109,7 +109,7 @@ Renders a filled area chart. Supports series splitting and stacked areas.
109
109
  Renders a pie/donut chart.
110
110
 
111
111
  ```markdown
112
- <PieChart data={by_status} name="status" value="count" title="Order Status" />
112
+ <PieChart data={by_status} name="orders.status" value="orders.count" title="Order Status" />
113
113
  ```
114
114
 
115
115
  | Prop | Type | Required | Description |
@@ -125,7 +125,7 @@ Renders query results as a sortable, paginated table. Click any column header to
125
125
 
126
126
  ```markdown
127
127
  <DataTable data={top_products} />
128
- <DataTable data={top_products} columns="name,revenue,count" />
128
+ <DataTable data={top_products} columns="orders.category,orders.total_revenue,orders.count" />
129
129
  <DataTable data={top_products} rows="25" />
130
130
  <DataTable data={top_products} rows="all" />
131
131
  ```
@@ -135,7 +135,7 @@ Renders query results as a sortable, paginated table. Click any column header to
135
135
  | `data` | query ref | Yes | Query name |
136
136
  | `columns` | string | No | Comma-separated list of columns to show (default: all) |
137
137
  | `title` | string | No | Table title |
138
- | `fmt` | string | No | Column format map: `fmt="revenue:eur2,date:shortdate"` |
138
+ | `fmt` | string | No | Column format map: `fmt="orders.total_revenue:eur2,orders.created_at:shortdate"` |
139
139
  | `rows` | string | No | Rows per page. Default `10`. Use `rows="all"` to disable pagination. |
140
140
 
141
141
  **Sorting:** Click a column header to sort ascending. Click again to sort descending. Null values always sort to the end. Numbers sort numerically, strings sort case-insensitively.
@@ -149,9 +149,9 @@ Renders query results as a sortable, paginated table. Click any column header to
149
149
  Consecutive `<BigValue>` components are automatically wrapped in a responsive grid — no `<Grid>` tag needed:
150
150
 
151
151
  ```markdown
152
- <BigValue data={total_revenue} value="total_revenue" title="Revenue" />
153
- <BigValue data={order_count} value="count" title="Orders" />
154
- <BigValue data={avg_order} value="avg_order_value" title="Avg Order" />
152
+ <BigValue data={total_revenue} value="orders.total_revenue" title="Revenue" />
153
+ <BigValue data={order_count} value="orders.count" title="Orders" />
154
+ <BigValue data={avg_order} value="orders.avg_order_value" title="Avg Order" />
155
155
  ```
156
156
 
157
157
  This renders as a 3-column row. The grid auto-sizes up to 4 columns based on the number of consecutive BigValues. For more control, use an explicit `<Grid>` tag.
@@ -162,9 +162,9 @@ Wrap components in a `<Grid>` tag to arrange them in columns:
162
162
 
163
163
  ```markdown
164
164
  <Grid cols="3">
165
- <BigValue data={total_orders} value="count" title="Orders" />
166
- <BigValue data={total_revenue} value="total_revenue" title="Revenue" />
167
- <BigValue data={avg_order} value="avg_order_value" title="Avg Order" />
165
+ <BigValue data={total_orders} value="orders.count" title="Orders" />
166
+ <BigValue data={total_revenue} value="orders.total_revenue" title="Revenue" />
167
+ <BigValue data={avg_order} value="orders.avg_order_value" title="Avg Order" />
168
168
  </Grid>
169
169
  ```
170
170
 
@@ -198,7 +198,7 @@ Values are auto-formatted by default — numbers get locale grouping (1,234.56),
198
198
  | `longdate` | `d mmmm yyyy` | 13 January 2025 |
199
199
  | `monthyear` | `mmm yyyy` | Jan 2025 |
200
200
 
201
- Any string that isn't a preset name is treated as a raw Excel format code (ECMA-376). For example: `fmt="revenue:$#,##0.00"`.
201
+ Any string that isn't a preset name is treated as a raw Excel format code (ECMA-376). For example: `fmt="orders.total_revenue:$#,##0.00"`.
202
202
 
203
203
  Note: Percentage presets (`pct`, `pct1`, `pct2`) multiply by 100 per Excel convention — 0.45 displays as "45%".
204
204
 
@@ -206,19 +206,19 @@ Note: Percentage presets (`pct`, `pct1`, `pct2`) multiply by 100 per Excel conve
206
206
 
207
207
  ```markdown
208
208
  <!-- BigValue with currency -->
209
- <BigValue data={total_revenue} value="total_revenue" title="Revenue" fmt="eur2" />
209
+ <BigValue data={total_revenue} value="orders.total_revenue" title="Revenue" fmt="eur2" />
210
210
 
211
211
  <!-- DataTable with per-column formatting -->
212
- <DataTable data={sales} fmt="total_revenue:usd2,created_at:shortdate,margin:pct1" />
212
+ <DataTable data={sales} fmt="orders.total_revenue:usd2,orders.created_at:shortdate,orders.margin:pct1" />
213
213
 
214
214
  <!-- Chart with formatted tooltips -->
215
- <BarChart data={monthly} x="month" y="revenue" yFmt="usd" />
216
- <LineChart data={trend} x="date" y="growth" yFmt="pct1" />
215
+ <BarChart data={monthly} x="orders.created_at" y="orders.total_revenue" yFmt="usd" />
216
+ <LineChart data={trend} x="orders.created_at" y="orders.growth" yFmt="pct1" />
217
217
  ```
218
218
 
219
219
  ## Field Names
220
220
 
221
- Component field names (e.g. `x="city"`, `value="total_revenue"`) use the **unqualified** measure or dimension name — the same names defined in your cube. For example, if your cube has `measures: [{ name: total_revenue, ... }]`, use `value="total_revenue"`.
221
+ All field names in component props must be **fully qualified** with the view or cube name — the same format used in query blocks. For example, use `value="orders.total_revenue"` not `value="total_revenue"`.
222
222
 
223
223
  ## See Also
224
224
 
@@ -15,50 +15,45 @@ description: Monthly revenue trends and breakdowns
15
15
  # Revenue Overview
16
16
 
17
17
  ` ``query total_revenue
18
- cube: orders
19
- measures: [total_revenue]
18
+ measures: [orders.total_revenue]
20
19
  ` ``
21
20
 
22
21
  ` ``query order_count
23
- cube: orders
24
- measures: [count]
22
+ measures: [orders.count]
25
23
  ` ``
26
24
 
27
25
  ` ``query avg_order
28
- cube: orders
29
- measures: [avg_order_value]
26
+ measures: [orders.avg_order_value]
30
27
  ` ``
31
28
 
32
29
  <Grid cols="3">
33
- <BigValue data={total_revenue} value="total_revenue" title="Total Revenue" />
34
- <BigValue data={order_count} value="count" title="Orders" />
35
- <BigValue data={avg_order} value="avg_order_value" title="Avg Order" />
30
+ <BigValue data={total_revenue} value="orders.total_revenue" title="Total Revenue" />
31
+ <BigValue data={order_count} value="orders.count" title="Orders" />
32
+ <BigValue data={avg_order} value="orders.avg_order_value" title="Avg Order" />
36
33
  </Grid>
37
34
 
38
35
  ## Monthly Trend
39
36
 
40
37
  ` ``query monthly_revenue
41
- cube: orders
42
- measures: [total_revenue]
38
+ measures: [orders.total_revenue]
43
39
  timeDimension:
44
- dimension: created_at
40
+ dimension: orders.created_at
45
41
  granularity: month
46
42
  dateRange: [2025-01-01, 2025-12-31]
47
43
  ` ``
48
44
 
49
- <LineChart data={monthly_revenue} x="created_at" y="total_revenue" title="Monthly Revenue" />
45
+ <LineChart data={monthly_revenue} x="orders.created_at" y="orders.total_revenue" title="Monthly Revenue" />
50
46
 
51
47
  ## By Category
52
48
 
53
49
  ` ``query by_category
54
- cube: orders
55
- measures: [total_revenue, count]
56
- dimensions: [category]
50
+ measures: [orders.total_revenue, orders.count]
51
+ dimensions: [orders.category]
57
52
  orderBy:
58
- total_revenue: desc
53
+ orders.total_revenue: desc
59
54
  ` ``
60
55
 
61
- <BarChart data={by_category} x="category" y="total_revenue" title="Revenue by Category" />
56
+ <BarChart data={by_category} x="orders.category" y="orders.total_revenue" title="Revenue by Category" />
62
57
  <DataTable data={by_category} />
63
58
  ```
64
59
 
@@ -75,43 +70,40 @@ description: Order status breakdown and city analysis
75
70
  # Sales Pipeline
76
71
 
77
72
  ` ``query by_status
78
- cube: orders
79
- measures: [count]
80
- dimensions: [status]
73
+ measures: [orders.count]
74
+ dimensions: [orders.status]
81
75
  ` ``
82
76
 
83
- <PieChart data={by_status} name="status" value="count" title="Order Status" />
77
+ <PieChart data={by_status} name="orders.status" value="orders.count" title="Order Status" />
84
78
 
85
79
  ## Top Cities
86
80
 
87
81
  ` ``query top_cities
88
- cube: orders
89
- measures: [total_revenue, count]
90
- dimensions: [city]
82
+ measures: [orders.total_revenue, orders.count]
83
+ dimensions: [orders.city]
91
84
  orderBy:
92
- total_revenue: desc
85
+ orders.total_revenue: desc
93
86
  limit: 10
94
87
  ` ``
95
88
 
96
- <BarChart data={top_cities} x="city" y="total_revenue" horizontal />
89
+ <BarChart data={top_cities} x="orders.city" y="orders.total_revenue" horizontal />
97
90
  <DataTable data={top_cities} />
98
91
 
99
92
  ## Completed Orders Over Time
100
93
 
101
94
  ` ``query completed_trend
102
- cube: orders
103
- measures: [total_revenue]
95
+ measures: [orders.total_revenue]
104
96
  timeDimension:
105
- dimension: created_at
97
+ dimension: orders.created_at
106
98
  granularity: week
107
99
  dateRange: [2025-01-01, 2025-06-30]
108
100
  filters:
109
- - dimension: status
101
+ - dimension: orders.status
110
102
  operator: equals
111
103
  values: [completed]
112
104
  ` ``
113
105
 
114
- <AreaChart data={completed_trend} x="created_at" y="total_revenue" title="Completed Order Revenue" />
106
+ <AreaChart data={completed_trend} x="orders.created_at" y="orders.total_revenue" title="Completed Order Revenue" />
115
107
  ```
116
108
 
117
109
  ## Multi-Series Dashboard
@@ -127,39 +119,37 @@ description: Multi-series charts showing revenue breakdown by sales channel
127
119
  # Revenue by Channel
128
120
 
129
121
  ` ``query revenue_by_channel
130
- cube: orders
131
- measures: [total_revenue]
132
- dimensions: [channel]
122
+ measures: [orders.total_revenue]
123
+ dimensions: [orders.channel]
133
124
  timeDimension:
134
- dimension: created_at
125
+ dimension: orders.created_at
135
126
  granularity: month
136
127
  dateRange: [2025-01-01, 2025-12-31]
137
128
  ` ``
138
129
 
139
130
  ## Stacked Bar (default)
140
131
 
141
- <BarChart data={revenue_by_channel} x="created_at" y="total_revenue" series="channel" title="Revenue by Channel" />
132
+ <BarChart data={revenue_by_channel} x="orders.created_at" y="orders.total_revenue" series="orders.channel" title="Revenue by Channel" />
142
133
 
143
134
  ## Grouped Bar
144
135
 
145
- <BarChart data={revenue_by_channel} x="created_at" y="total_revenue" series="channel" type="grouped" title="Revenue by Channel (Grouped)" />
136
+ <BarChart data={revenue_by_channel} x="orders.created_at" y="orders.total_revenue" series="orders.channel" type="grouped" title="Revenue by Channel (Grouped)" />
146
137
 
147
138
  ## Multi-Line
148
139
 
149
140
  ` ``query trend
150
- cube: orders
151
- measures: [total_revenue, count]
141
+ measures: [orders.total_revenue, orders.count]
152
142
  timeDimension:
153
- dimension: created_at
143
+ dimension: orders.created_at
154
144
  granularity: month
155
145
  dateRange: [2025-01-01, 2025-12-31]
156
146
  ` ``
157
147
 
158
- <LineChart data={trend} x="created_at" y="total_revenue,count" title="Revenue vs Orders" />
148
+ <LineChart data={trend} x="orders.created_at" y="orders.total_revenue,orders.count" title="Revenue vs Orders" />
159
149
 
160
150
  ## Stacked Area by Channel
161
151
 
162
- <AreaChart data={revenue_by_channel} x="created_at" y="total_revenue" series="channel" type="stacked" title="Revenue by Channel" />
152
+ <AreaChart data={revenue_by_channel} x="orders.created_at" y="orders.total_revenue" series="orders.channel" type="stacked" title="Revenue by Channel" />
163
153
  ```
164
154
 
165
155
  ## Formatted Dashboard
@@ -175,40 +165,37 @@ description: Formatted revenue metrics and trends
175
165
  # Sales Performance
176
166
 
177
167
  ` ``query totals
178
- cube: orders
179
- measures: [total_revenue, count, avg_order_value]
168
+ measures: [orders.total_revenue, orders.count, orders.avg_order_value]
180
169
  ` ``
181
170
 
182
171
  <Grid cols="3">
183
- <BigValue data={totals} value="total_revenue" title="Revenue" fmt="eur2" />
184
- <BigValue data={totals} value="count" title="Orders" fmt="num0" />
185
- <BigValue data={totals} value="avg_order_value" title="Avg Order" fmt="eur2" />
172
+ <BigValue data={totals} value="orders.total_revenue" title="Revenue" fmt="eur2" />
173
+ <BigValue data={totals} value="orders.count" title="Orders" fmt="num0" />
174
+ <BigValue data={totals} value="orders.avg_order_value" title="Avg Order" fmt="eur2" />
186
175
  </Grid>
187
176
 
188
177
  ## Revenue Trend
189
178
 
190
179
  ` ``query monthly
191
- cube: orders
192
- measures: [total_revenue]
180
+ measures: [orders.total_revenue]
193
181
  timeDimension:
194
- dimension: created_at
182
+ dimension: orders.created_at
195
183
  granularity: month
196
184
  dateRange: [2025-01-01, 2025-12-31]
197
185
  ` ``
198
186
 
199
- <LineChart data={monthly} x="created_at" y="total_revenue" title="Monthly Revenue" yFmt="eur" />
187
+ <LineChart data={monthly} x="orders.created_at" y="orders.total_revenue" title="Monthly Revenue" yFmt="eur" />
200
188
 
201
189
  ## Detail Table
202
190
 
203
191
  ` ``query details
204
- cube: orders
205
- measures: [total_revenue, count]
206
- dimensions: [category]
192
+ measures: [orders.total_revenue, orders.count]
193
+ dimensions: [orders.category]
207
194
  orderBy:
208
- total_revenue: desc
195
+ orders.total_revenue: desc
209
196
  ` ``
210
197
 
211
- <DataTable data={details} fmt="total_revenue:eur2,count:num0" />
198
+ <DataTable data={details} fmt="orders.total_revenue:eur2,orders.count:num0" />
212
199
  ```
213
200
 
214
201
  ## Interactive Dashboard
@@ -224,47 +211,43 @@ description: Sales dashboard with date and channel filters
224
211
  # Interactive Sales
225
212
 
226
213
  <DateRange name="period" default="last-6-months" label="Time Period" />
227
- <Dropdown name="channel" dimension="channel" data={channels} queries="trend,by_city" label="Channel" />
214
+ <Dropdown name="channel" dimension="orders.channel" data={channels} queries="trend,by_city" label="Channel" />
228
215
 
229
216
  ` ``query channels
230
- cube: orders
231
- dimensions: [channel]
217
+ dimensions: [orders.channel]
232
218
  ` ``
233
219
 
234
220
  ` ``query kpis
235
- cube: orders
236
- measures: [total_revenue, count]
221
+ measures: [orders.total_revenue, orders.count]
237
222
  ` ``
238
223
 
239
224
  <Grid cols="2">
240
- <BigValue data={kpis} value="total_revenue" title="Revenue" fmt="eur2" />
241
- <BigValue data={kpis} value="count" title="Orders" fmt="num0" />
225
+ <BigValue data={kpis} value="orders.total_revenue" title="Revenue" fmt="eur2" />
226
+ <BigValue data={kpis} value="orders.count" title="Orders" fmt="num0" />
242
227
  </Grid>
243
228
 
244
229
  ## Revenue Trend
245
230
 
246
231
  ` ``query trend
247
- cube: orders
248
- measures: [total_revenue]
232
+ measures: [orders.total_revenue]
249
233
  timeDimension:
250
- dimension: created_at
234
+ dimension: orders.created_at
251
235
  granularity: month
252
236
  ` ``
253
237
 
254
- <LineChart data={trend} x="created_at" y="total_revenue" title="Monthly Revenue" yFmt="eur" />
238
+ <LineChart data={trend} x="orders.created_at" y="orders.total_revenue" title="Monthly Revenue" yFmt="eur" />
255
239
 
256
240
  ## By City
257
241
 
258
242
  ` ``query by_city
259
- cube: orders
260
- measures: [total_revenue]
261
- dimensions: [city]
243
+ measures: [orders.total_revenue]
244
+ dimensions: [orders.city]
262
245
  orderBy:
263
- total_revenue: desc
246
+ orders.total_revenue: desc
264
247
  limit: 10
265
248
  ` ``
266
249
 
267
- <BarChart data={by_city} x="city" y="total_revenue" title="Top Cities" yFmt="eur" />
250
+ <BarChart data={by_city} x="orders.city" y="orders.total_revenue" title="Top Cities" yFmt="eur" />
268
251
  ```
269
252
 
270
253
  The `<DateRange>` automatically applies to all queries with a `timeDimension` (here: `trend`). The `<Dropdown>` filters `trend` and `by_city` by channel. The `channels` query populates the dropdown and is never filtered by it.
@@ -273,12 +256,12 @@ The `<DateRange>` automatically applies to all queries with a `timeDimension` (h
273
256
 
274
257
  - **Start with KPIs**: Use `BigValue` in a `Grid` at the top for key metrics
275
258
  - **One query per chart**: Each component gets its own query — keep them focused
276
- - **Use views**: Prefer view names over cube names when available for cleaner field names
259
+ - **Use views**: Prefer view names over cube names when available
277
260
  - **Name queries descriptively**: `monthly_revenue` is better than `q1`
278
261
  - **Limit large datasets**: Add `limit` to dimension queries to avoid oversized charts
279
262
  - **Time series**: Always use `timeDimension` with `granularity` for time-based charts
280
- - **Multi-series**: Use `series="column"` to split data by a dimension. For bars, default is stacked; use `type="grouped"` for side-by-side
281
- - **Multiple y columns**: Use comma-separated values like `y="revenue,cases"` to show multiple measures on one chart
263
+ - **Multi-series**: Use `series="cube.column"` to split data by a dimension. For bars, default is stacked; use `type="grouped"` for side-by-side
264
+ - **Multiple y columns**: Use comma-separated values like `y="orders.revenue,orders.cases"` to show multiple measures on one chart
282
265
 
283
266
  ## See Also
284
267
 
@@ -54,7 +54,7 @@ Targeting lets you control which queries a filter affects — useful when some c
54
54
  Renders a dropdown selector populated from a query's dimension values. Adds a filter on the specified dimension to targeted queries.
55
55
 
56
56
  ```markdown
57
- <Dropdown name="channel" dimension="channel" data={channels} queries="main,trend" label="Channel" />
57
+ <Dropdown name="channel" dimension="orders.channel" data={channels} queries="main,trend" label="Channel" />
58
58
  ```
59
59
 
60
60
  ### Props
@@ -86,14 +86,13 @@ title: Revenue Trends
86
86
  <DateRange name="period" default="last-6-months" label="Time Period" />
87
87
 
88
88
  ` ``query monthly_revenue
89
- cube: orders
90
- measures: [total_revenue]
89
+ measures: [orders.total_revenue]
91
90
  timeDimension:
92
- dimension: created_at
91
+ dimension: orders.created_at
93
92
  granularity: month
94
93
  ` ``
95
94
 
96
- <LineChart data={monthly_revenue} x="created_at" y="total_revenue" />
95
+ <LineChart data={monthly_revenue} x="orders.created_at" y="orders.total_revenue" />
97
96
  ```
98
97
 
99
98
  The DateRange automatically applies to `monthly_revenue` because it has a `timeDimension`. No hardcoded `dateRange` needed in the query.
@@ -106,19 +105,17 @@ title: Sales by Channel
106
105
  ---
107
106
 
108
107
  ` ``query channels
109
- cube: orders
110
- dimensions: [channel]
108
+ dimensions: [orders.channel]
111
109
  ` ``
112
110
 
113
- <Dropdown name="ch" dimension="channel" data={channels} queries="main" label="Channel" />
111
+ <Dropdown name="ch" dimension="orders.channel" data={channels} queries="main" label="Channel" />
114
112
 
115
113
  ` ``query main
116
- cube: orders
117
- measures: [total_revenue]
118
- dimensions: [city]
114
+ measures: [orders.total_revenue]
115
+ dimensions: [orders.city]
119
116
  ` ``
120
117
 
121
- <BarChart data={main} x="city" y="total_revenue" />
118
+ <BarChart data={main} x="orders.city" y="orders.total_revenue" />
122
119
  ```
123
120
 
124
121
  ### Combined inputs
@@ -129,30 +126,27 @@ title: Sales Dashboard
129
126
  ---
130
127
 
131
128
  <DateRange name="period" default="last-6-months" label="Time Period" />
132
- <Dropdown name="channel" dimension="channel" data={channels} queries="trend,by_city" label="Channel" />
129
+ <Dropdown name="channel" dimension="orders.channel" data={channels} queries="trend,by_city" label="Channel" />
133
130
 
134
131
  ` ``query channels
135
- cube: orders
136
- dimensions: [channel]
132
+ dimensions: [orders.channel]
137
133
  ` ``
138
134
 
139
135
  ` ``query trend
140
- cube: orders
141
- measures: [total_revenue]
136
+ measures: [orders.total_revenue]
142
137
  timeDimension:
143
- dimension: created_at
138
+ dimension: orders.created_at
144
139
  granularity: month
145
140
  ` ``
146
141
 
147
- <LineChart data={trend} x="created_at" y="total_revenue" />
142
+ <LineChart data={trend} x="orders.created_at" y="orders.total_revenue" />
148
143
 
149
144
  ` ``query by_city
150
- cube: orders
151
- measures: [total_revenue]
152
- dimensions: [city]
145
+ measures: [orders.total_revenue]
146
+ dimensions: [orders.city]
153
147
  ` ``
154
148
 
155
- <BarChart data={by_city} x="city" y="total_revenue" />
149
+ <BarChart data={by_city} x="orders.city" y="orders.total_revenue" />
156
150
  ```
157
151
 
158
152
  Both inputs work together: the DateRange scopes the time window on `trend` (which has a timeDimension), and the Dropdown filters both `trend` and `by_city` by channel.
@@ -27,19 +27,17 @@ description: Key metrics for the orders pipeline
27
27
  # Order Summary
28
28
 
29
29
  ` ``query order_count
30
- cube: orders
31
- measures: [count]
30
+ measures: [orders.count]
32
31
  ` ``
33
32
 
34
- <BigValue data={order_count} value="count" title="Total Orders" />
33
+ <BigValue data={order_count} value="orders.count" title="Total Orders" />
35
34
 
36
35
  ` ``query by_status
37
- cube: orders
38
- measures: [count]
39
- dimensions: [status]
36
+ measures: [orders.count]
37
+ dimensions: [orders.status]
40
38
  ` ``
41
39
 
42
- <BarChart data={by_status} x="status" y="count" />
40
+ <BarChart data={by_status} x="orders.status" y="orders.count" />
43
41
  ```
44
42
 
45
43
  ## Frontmatter
@@ -6,7 +6,7 @@
6
6
 
7
7
  Each query fetches data from your semantic layer and makes it available to chart components. Queries use the same measures and dimensions defined in your cubes and views — field names stay consistent whether you're querying from a dashboard, MCP, or the API.
8
8
 
9
- Query blocks have a unique name and map to a `QueryOptions` shape. Components reference them using `data={query_name}`. Field names are unqualified use `count` not `orders.count` — because the `cube` property provides the context.
9
+ Query blocks have a unique name and map to a `QueryOptions` shape. Components reference them using `data={query_name}`. All field names must be fully qualified with the cube or view name (e.g. `orders.count`, `orders.created_at`).
10
10
 
11
11
  ## Syntax
12
12
 
@@ -14,10 +14,9 @@ Query blocks use fenced code with the `query` language tag followed by a name:
14
14
 
15
15
  ````markdown
16
16
  ```query revenue_trend
17
- cube: orders
18
- measures: [total_revenue]
17
+ measures: [orders.total_revenue]
19
18
  timeDimension:
20
- dimension: created_at
19
+ dimension: orders.created_at
21
20
  granularity: month
22
21
  dateRange: [2025-01-01, 2025-12-31]
23
22
  ```
@@ -27,19 +26,18 @@ timeDimension:
27
26
 
28
27
  | Property | Type | Required | Description |
29
28
  |----------|------|----------|-------------|
30
- | `cube` | string | Yes | The cube or view to query (e.g. `orders`) |
31
- | `measures` | string[] | No | Measures to aggregate (e.g. `[count, total_revenue]`) |
32
- | `dimensions` | string[] | No | Dimensions to group by (e.g. `[status, city]`) |
29
+ | `measures` | string[] | No | Fully qualified measures to aggregate (e.g. `[orders.count, orders.total_revenue]`) |
30
+ | `dimensions` | string[] | No | Fully qualified dimensions to group by (e.g. `[orders.status, orders.city]`) |
33
31
  | `filters` | Filter[] | No | Row-level filters |
34
32
  | `timeDimension` | object | No | Time-based grouping and date range |
35
- | `orderBy` | object | No | Sort specification (e.g. `{total_revenue: desc}`) |
33
+ | `orderBy` | object | No | Sort specification (e.g. `{orders.total_revenue: desc}`) |
36
34
  | `limit` | number | No | Maximum rows to return |
37
35
 
38
36
  ### timeDimension
39
37
 
40
38
  | Property | Type | Required | Description |
41
39
  |----------|------|----------|-------------|
42
- | `dimension` | string | Yes | Time dimension name (e.g. `created_at`) |
40
+ | `dimension` | string | Yes | Fully qualified time dimension name (e.g. `orders.created_at`) |
43
41
  | `granularity` | string | No | `day`, `week`, `month`, `quarter`, or `year` |
44
42
  | `dateRange` | string[] | No | `[start, end]` in `YYYY-MM-DD` format |
45
43
 
@@ -59,8 +57,7 @@ Each filter is an object with:
59
57
 
60
58
  ````markdown
61
59
  ```query total_orders
62
- cube: orders
63
- measures: [count]
60
+ measures: [orders.count]
64
61
  ```
65
62
  ````
66
63
 
@@ -68,11 +65,10 @@ measures: [count]
68
65
 
69
66
  ````markdown
70
67
  ```query revenue_by_city
71
- cube: orders
72
- measures: [total_revenue]
73
- dimensions: [city]
68
+ measures: [orders.total_revenue]
69
+ dimensions: [orders.city]
74
70
  orderBy:
75
- total_revenue: desc
71
+ orders.total_revenue: desc
76
72
  limit: 10
77
73
  ```
78
74
  ````
@@ -81,10 +77,9 @@ limit: 10
81
77
 
82
78
  ````markdown
83
79
  ```query monthly_revenue
84
- cube: orders
85
- measures: [total_revenue]
80
+ measures: [orders.total_revenue]
86
81
  timeDimension:
87
- dimension: created_at
82
+ dimension: orders.created_at
88
83
  granularity: month
89
84
  dateRange: [2025-01-01, 2025-12-31]
90
85
  ```
@@ -94,11 +89,10 @@ timeDimension:
94
89
 
95
90
  ````markdown
96
91
  ```query completed_orders
97
- cube: orders
98
- measures: [count, total_revenue]
99
- dimensions: [category]
92
+ measures: [orders.count, orders.total_revenue]
93
+ dimensions: [orders.category]
100
94
  filters:
101
- - dimension: status
95
+ - dimension: orders.status
102
96
  operator: equals
103
97
  values: [completed]
104
98
  ```
@@ -108,8 +102,7 @@ filters:
108
102
 
109
103
  - Query names must be valid identifiers (letters, numbers, `_`, `$`)
110
104
  - Query names must be unique within a dashboard
111
- - Every query must specify a `cube`
112
- - Field names are unqualified (use `count` not `orders.count`) — the `cube` provides the context
105
+ - All field names must be fully qualified with the cube or view name (e.g. `orders.count`, not `count`)
113
106
  - Components reference queries by name: `data={query_name}`
114
107
 
115
108
  ## See Also
@@ -18,11 +18,10 @@ const bonnard = createClient({
18
18
  });
19
19
 
20
20
  const result = await bonnard.query({
21
- cube: 'orders',
22
- measures: ['revenue', 'count'],
23
- dimensions: ['status'],
21
+ measures: ['orders.revenue', 'orders.count'],
22
+ dimensions: ['orders.status'],
24
23
  timeDimension: {
25
- dimension: 'created_at',
24
+ dimension: 'orders.created_at',
26
25
  granularity: 'month',
27
26
  dateRange: ['2025-01-01', '2025-12-31'],
28
27
  },
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "@bonnard/cli",
3
- "version": "0.2.11",
3
+ "version": "0.2.13",
4
4
  "type": "module",
5
5
  "bin": {
6
6
  "bon": "./dist/bin/bon.mjs"