@bonnard/cli 0.2.10 → 0.2.12

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,87 +1,202 @@
1
- # @bonnard/cli
1
+ <p align="center">
2
+ <a href="https://www.bonnard.dev">
3
+ <picture>
4
+ <source media="(prefers-color-scheme: dark)" srcset="./assets/banner-dark.png" />
5
+ <source media="(prefers-color-scheme: light)" srcset="./assets/banner-light.png" />
6
+ <img alt="Bonnard -the semantic engine for MCP clients, AI agents, and data teams" src="./assets/banner-light.png" width="100%" />
7
+ </picture>
8
+ </a>
9
+ </p>
2
10
 
3
- The Bonnard CLI (`bon`) takes you from zero to a deployed semantic layer in minutes. Define metrics in YAML, validate locally, deploy, and query — from your terminal or AI coding agent.
11
+ <p align="center">
12
+ <strong>The semantic engine for MCP clients. Define metrics once, query from anywhere.</strong>
13
+ </p>
4
14
 
5
- **Open source** — [view source on GitHub](https://github.com/meal-inc/bonnard-cli)
15
+ <p align="center">
16
+ <a href="https://www.npmjs.com/package/@bonnard/cli"><img src="https://img.shields.io/npm/v/@bonnard/cli?style=flat-square&color=0891b2" alt="npm version" /></a>
17
+ <a href="https://github.com/meal-inc/bonnard-cli/blob/main/LICENSE"><img src="https://img.shields.io/github/license/meal-inc/bonnard-cli?style=flat-square" alt="MIT License" /></a>
18
+ <a href="https://discord.com/invite/RQuvjGRz"><img src="https://img.shields.io/badge/Discord-Join%20us-5865F2?style=flat-square&logo=discord&logoColor=white" alt="Discord" /></a>
19
+ </p>
6
20
 
7
- ## Quick start
21
+ <p align="center">
22
+ <a href="https://docs.bonnard.dev/docs/">Docs</a> &middot;
23
+ <a href="https://docs.bonnard.dev/docs/getting-started">Getting Started</a> &middot;
24
+ <a href="https://docs.bonnard.dev/docs/changelog">Changelog</a> &middot;
25
+ <a href="https://discord.com/invite/RQuvjGRz">Discord</a> &middot;
26
+ <a href="https://www.bonnard.dev">Website</a>
27
+ </p>
28
+
29
+ ---
30
+
31
+ Bonnard is an agent-native semantic layer CLI. Deploy an MCP server and governed analytics API in minutes - for AI agents, BI tools, and data teams. Define metrics and dimensions in YAML, validate locally, and ship to production. Works with Snowflake, BigQuery, Databricks, and PostgreSQL. Ships with native integrations for Claude Code, Cursor, and Codex. Built with TypeScript.
32
+
33
+ ## Why Bonnard?
34
+
35
+ Most semantic layers were built for dashboards and retrofitted for AI. Bonnard was built the other way around: agent-native from day one, with Model Context Protocol (MCP) as a core feature, not a plugin. One CLI takes you from an empty directory to a production semantic layer serving AI agents, BI tools, and human analysts through a single governed API.
36
+
37
+ <p align="center">
38
+ <img src="./assets/architecture.png" alt="Bonnard architecture -data sources flow through the semantic layer to AI agents, BI tools, and MCP clients" width="100%" />
39
+ </p>
40
+
41
+ ## Quick Start
42
+
43
+ No install required. Run directly with npx:
8
44
 
9
45
  ```bash
10
- npx @bonnard/cli init # Create project structure + agent templates
11
- bon datasource add --demo # Add demo dataset (no warehouse needed)
12
- bon validate # Check syntax
13
- bon login # Authenticate with Bonnard
14
- bon deploy -m "Initial deploy" # Deploy to Bonnard
46
+ npx @bonnard/cli init
15
47
  ```
16
48
 
17
- No install needed — `npx` runs the CLI directly. Or install globally for shorter commands:
49
+ Or install globally:
18
50
 
19
51
  ```bash
20
52
  npm install -g @bonnard/cli
21
53
  ```
22
54
 
55
+ Then follow the setup flow:
56
+
57
+ ```bash
58
+ bon init # Scaffold project + agent configs
59
+ bon datasource add # Connect your warehouse
60
+ bon validate # Check your models locally
61
+ bon login # Authenticate
62
+ bon deploy # Ship it
63
+ ```
64
+
65
+ No warehouse yet? Start exploring with a full retail demo dataset:
66
+
67
+ ```bash
68
+ bon datasource add --demo
69
+ ```
70
+
23
71
  Requires Node.js 20+.
24
72
 
25
- ## Commands
73
+ ## Agent-Native from Day One
26
74
 
27
- | Command | Description |
28
- |---------|-------------|
29
- | `bon init` | Create project structure and AI agent templates |
30
- | `bon login` | Authenticate with Bonnard |
31
- | `bon logout` | Remove stored credentials |
32
- | `bon whoami` | Show current login status |
33
- | `bon datasource add` | Add a data source (interactive) |
34
- | `bon datasource add --demo` | Add read-only demo dataset |
35
- | `bon datasource add --from-dbt` | Import from dbt profiles |
36
- | `bon datasource list` | List configured data sources |
37
- | `bon datasource remove <name>` | Remove a data source |
38
- | `bon validate` | Validate cube and view YAML |
39
- | `bon deploy -m "message"` | Deploy to Bonnard |
40
- | `bon deployments` | List deployment history |
41
- | `bon diff <id>` | View changes in a deployment |
42
- | `bon annotate <id>` | Add context to deployment changes |
43
- | `bon query '{"measures":["orders.count"]}'` | Query the semantic layer (JSON) |
44
- | `bon query "SELECT ..." --sql` | Query the semantic layer (SQL) |
45
- | `bon mcp` | MCP setup instructions for AI agents |
46
- | `bon mcp test` | Test MCP server connectivity |
47
- | `bon docs [topic]` | Browse modeling documentation |
48
- | `bon docs --search "joins"` | Search documentation |
75
+ When you run `bon init`, Bonnard generates context files so AI coding agents understand your semantic layer from the first prompt:
49
76
 
50
- ## Agent-ready from the start
77
+ ```
78
+ you@work my-project % bon init
79
+
80
+ Initialised Bonnard project
81
+ Core files:
82
+ bon.yaml
83
+ bonnard/cubes/
84
+ bonnard/views/
85
+ Agent support:
86
+ .claude/rules/bonnard.md
87
+ .claude/skills/bonnard-get-started/
88
+ .cursor/rules/bonnard.mdc
89
+ AGENTS.md
90
+ ```
51
91
 
52
- `bon init` generates context files for your AI coding tools:
92
+ | Agent | What gets generated |
93
+ | --- | --- |
94
+ | **Claude Code** | `.claude/rules/bonnard.md` + skill templates in `.claude/skills/` |
95
+ | **Cursor** | `.cursor/rules/bonnard.mdc` with frontmatter configuration |
96
+ | **Codex** | `AGENTS.md` + skills directory |
53
97
 
54
- - **Claude Code** `.claude/rules/` + get-started skill
55
- - **Cursor** — `.cursor/rules/` with auto-apply frontmatter
56
- - **Codex** — `AGENTS.md` + skills folder
98
+ Set up your MCP server so agents can query your semantic layer directly:
57
99
 
58
- Your agent understands Bonnard's modeling language from the first prompt.
100
+ ```bash
101
+ bon mcp setup # Configure MCP server
102
+ bon mcp test # Verify the connection
103
+ ```
59
104
 
60
- ## Project structure
105
+ ## Auto-Detected from Your Project
61
106
 
62
- After `bon init`:
107
+ <p align="center">
108
+ <img src="./assets/datasources.png" alt="Auto-detected warehouses and data tools -Snowflake, BigQuery, PostgreSQL, Databricks, DuckDB, dbt, Dagster, Prefect, Airflow, Looker, Cube, Evidence, SQLMesh, Soda, Great Expectations" width="100%" />
109
+ </p>
110
+
111
+ Bonnard automatically detects your warehouses and data tools. Point it at your project and it discovers schemas, tables, and relationships.
112
+
113
+ - **Snowflake** - full support including Snowpark
114
+ - **Google BigQuery** - native integration
115
+ - **Databricks** - SQL warehouses and Unity Catalog
116
+ - **PostgreSQL** - including cloud-hosted variants (Supabase, Neon, RDS)
117
+ - **DuckDB** - local development and testing
118
+ - **dbt** - model and profile import
119
+ - **Dagster, Prefect, Airflow** - orchestration tools
120
+ - **Looker, Cube, Evidence** - existing BI layers
121
+ - **SQLMesh, Soda, Great Expectations** - data quality and transformation
122
+
123
+ ## Querying
124
+
125
+ Query your semantic layer from the terminal using JSON or SQL syntax:
126
+
127
+ ```bash
128
+ # JSON query
129
+ bon query --measures revenue,order_count --dimensions product_category --time-dimension created_at
130
+
131
+ # SQL query
132
+ bon query --sql "SELECT product_category, MEASURE(revenue) FROM orders GROUP BY 1"
133
+ ```
134
+
135
+ Agents connected via MCP can run the same queries programmatically, with full access to your governed metric definitions.
136
+
137
+ ## Project Structure
63
138
 
64
139
  ```
65
140
  my-project/
66
141
  ├── bon.yaml # Project configuration
67
142
  ├── bonnard/
68
- │ ├── cubes/ # Cube definitions (measures, dimensions, joins)
69
- │ └── views/ # View definitions (curated query interfaces)
70
- └── .bon/ # Local config (gitignored)
71
- └── datasources.yaml # Data source credentials
143
+ │ ├── cubes/ # Metric and dimension definitions
144
+ │ └── views/ # Curated query interfaces
145
+ ├── .bon/ # Local credentials (gitignored)
146
+ ├── .claude/ # Claude Code agent context
147
+ ├── .cursor/ # Cursor agent context
148
+ └── AGENTS.md # Codex agent context
72
149
  ```
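To make the `bonnard/cubes/` directory concrete, here is a minimal sketch of what a cube definition might look like. The measure and dimension names are borrowed from the querying examples above; the keys themselves are illustrative assumptions, not Bonnard's documented schema, so see the Modeling Guide for the actual syntax.

```yaml
# Hypothetical sketch - key names are assumptions, not Bonnard's documented schema.
cubes:
  - name: orders
    sql_table: public.orders        # assumed source table
    measures:
      - name: order_count
        type: count
      - name: revenue
        type: sum
        sql: amount                 # assumed revenue column
    dimensions:
      - name: product_category
        type: string
      - name: created_at
        type: time
```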
73
150
 
74
151
  ## CI/CD
75
152
 
153
+ Deploy from your pipeline with the `--ci` flag for non-interactive mode:
154
+
76
155
  ```bash
77
- bon deploy --ci -m "CI deploy"
156
+ bon deploy --ci
78
157
  ```
79
158
 
80
- Non-interactive mode for pipelines. Datasources are synced automatically.
159
+ The `--ci` flag handles datasource synchronisation automatically and skips interactive prompts. It fits into GitHub Actions, GitLab CI, or any pipeline that runs Node.js.
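As a rough sketch, a GitHub Actions job could install the CLI, validate the models, and deploy. The workflow below is illustrative only; the credential step is deliberately omitted, since CI authentication is covered in the docs rather than here.

```yaml
# Illustrative GitHub Actions job - adapt names and authentication to your setup.
name: deploy-semantic-layer
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm install -g @bonnard/cli
      - run: bon validate
      # Provide Bonnard credentials here (see the docs for CI authentication).
      - run: bon deploy --ci
```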
160
+
161
+ ## Commands
162
+
163
+ | Command | Description |
164
+ | --- | --- |
165
+ | `bon init` | Scaffold a new project with agent configs |
166
+ | `bon datasource add` | Connect a data source (or `--demo` for sample data) |
167
+ | `bon datasource add --from-dbt` | Import from dbt profiles |
168
+ | `bon datasource list` | List connected data sources |
169
+ | `bon validate` | Validate models locally before deploying |
170
+ | `bon deploy` | Deploy semantic layer to production |
171
+ | `bon deployments` | List active deployments |
172
+ | `bon diff` | Preview changes before deploying |
173
+ | `bon annotate` | Add metadata and descriptions to models |
174
+ | `bon query` | Run queries from the terminal (JSON or SQL) |
175
+ | `bon mcp setup` | Configure MCP server for agent access |
176
+ | `bon mcp test` | Test MCP connection |
177
+ | `bon docs` | Browse or search documentation from the CLI |
178
+ | `bon login` / `bon logout` | Manage authentication |
179
+ | `bon whoami` | Check current session |
180
+
181
+ For the full CLI reference, see the [documentation](https://docs.bonnard.dev/docs/cli-reference).
81
182
 
82
183
  ## Documentation
83
184
 
84
- - [Getting Started](https://docs.bonnard.dev/docs/getting-started)
85
- - [CLI Reference](https://docs.bonnard.dev/docs/cli)
86
- - [Modeling Guide](https://docs.bonnard.dev/docs/modeling/cubes)
87
- - [Querying](https://docs.bonnard.dev/docs/querying)
185
+ - [Getting Started](https://docs.bonnard.dev/docs/getting-started) - from zero to deployed in minutes
186
+ - [CLI Reference](https://docs.bonnard.dev/docs/cli-reference) - every command, flag, and option
187
+ - [Modeling Guide](https://docs.bonnard.dev/docs/modeling) - cubes, views, metrics, and dimensions
188
+ - [Querying](https://docs.bonnard.dev/docs/querying) - JSON and SQL query syntax
189
+ - [Changelog](https://docs.bonnard.dev/docs/changelog) - what shipped and when
190
+
191
+ ## Community
192
+
193
+ - [Discord](https://discord.com/invite/RQuvjGRz) - ask questions, share feedback, connect with the team
194
+ - [GitHub Issues](https://github.com/meal-inc/bonnard-cli/issues) - bug reports and feature requests
195
+ - [LinkedIn](https://www.linkedin.com/company/bonnarddev/) - follow for updates
196
+ - [Website](https://www.bonnard.dev) - learn more about Bonnard
197
+
198
+ Contributions are welcome. If you find a bug or have an idea, open an issue or submit a pull request.
199
+
200
+ ## License
201
+
202
+ [MIT](./LICENSE)
@@ -0,0 +1,3 @@
1
+ import { n as get, r as post, t as del } from "./api-DqgY-30K.mjs";
2
+
3
+ export { del, get };
@@ -0,0 +1,75 @@
1
+ import fs from "node:fs";
2
+ import path from "node:path";
3
+ import os from "node:os";
4
+ import pc from "picocolors";
5
+
6
+ //#region src/lib/credentials.ts
7
+ const CREDENTIALS_DIR = path.join(os.homedir(), ".config", "bon");
8
+ const CREDENTIALS_FILE = path.join(CREDENTIALS_DIR, "credentials.json");
9
+ function saveCredentials(credentials) {
10
+ fs.mkdirSync(CREDENTIALS_DIR, {
11
+ recursive: true,
12
+ mode: 448 // 0o700 - directory accessible only by the owner
13
+ });
14
+ fs.writeFileSync(CREDENTIALS_FILE, JSON.stringify(credentials, null, 2), { mode: 384 }); // 0o600 - owner read/write only
15
+ }
16
+ function loadCredentials() {
17
+ try {
18
+ const raw = fs.readFileSync(CREDENTIALS_FILE, "utf-8");
19
+ const parsed = JSON.parse(raw);
20
+ if (parsed.token && parsed.email) return parsed;
21
+ return null;
22
+ } catch {
23
+ return null;
24
+ }
25
+ }
26
+ function clearCredentials() {
27
+ try {
28
+ fs.unlinkSync(CREDENTIALS_FILE);
29
+ } catch {}
30
+ }
31
+
32
+ //#endregion
33
+ //#region src/lib/api.ts
34
+ const APP_URL = process.env.BON_APP_URL || "https://app.bonnard.dev";
35
+ const VERCEL_BYPASS = process.env.VERCEL_AUTOMATION_BYPASS_SECRET;
36
+ function getToken() {
37
+ const creds = loadCredentials();
38
+ if (!creds) {
39
+ console.error(pc.red("Not logged in. Run `bon login` first."));
40
+ process.exit(1);
41
+ }
42
+ return creds.token;
43
+ }
44
+ async function request(method, path, body) {
45
+ const token = getToken();
46
+ const url = `${APP_URL}${path}`;
47
+ const headers = {
48
+ Authorization: `Bearer ${token}`,
49
+ "Content-Type": "application/json"
50
+ };
51
+ if (VERCEL_BYPASS) headers["x-vercel-protection-bypass"] = VERCEL_BYPASS;
52
+ const res = await fetch(url, {
53
+ method,
54
+ headers,
55
+ body: body ? JSON.stringify(body) : void 0
56
+ });
57
+ const data = await res.json();
58
+ if (!res.ok) {
59
+ const message = data.error || res.statusText;
60
+ throw new Error(message);
61
+ }
62
+ return data;
63
+ }
64
+ function get(path) {
65
+ return request("GET", path);
66
+ }
67
+ function post(path, body) {
68
+ return request("POST", path, body);
69
+ }
70
+ function del(path) {
71
+ return request("DELETE", path);
72
+ }
73
+
74
+ //#endregion
75
+ export { loadCredentials as a, clearCredentials as i, get as n, saveCredentials as o, post as r, del as t };