@databrainhq/mcp-server 0.1.0
- package/README.md +283 -0
- package/dist/index.d.ts +2 -0
- package/dist/index.js +5249 -0
- package/dist/index.js.map +1 -0
- package/package.json +44 -0
package/README.md
ADDED
# @databrainhq/mcp-server

MCP server for Databrain embedded analytics. Connect it to any AI assistant and say **"set up a Databrain embed"** — it walks you through the entire flow interactively.

---

## Quick Start

### Prerequisites

- Node.js 18+
- Databrain Cloud account ([usedatabrain.com](https://usedatabrain.com))
- Service token (**Settings > Service Tokens** in the Databrain UI)

### 1. Get Your Service Token

1. Log in to [Databrain](https://usedatabrain.com)
2. Go to **Settings > Service Tokens**
3. Create a new token and copy it

### 2. Connect to Your AI Assistant

Pick your client below. All configs use the same MCP server; only the config file location differs.

---
#### Claude Desktop

File: `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or `%APPDATA%\Claude\claude_desktop_config.json` (Windows)

```json
{
  "mcpServers": {
    "databrain": {
      "command": "npx",
      "args": ["@databrainhq/mcp-server"],
      "env": {
        "DATABRAIN_SERVICE_TOKEN": "<your-service-token>",
        "DATABRAIN_API_URL": "https://api.usedatabrain.com"
      }
    }
  }
}
```
#### Claude Code (CLI)

```bash
claude mcp add databrain -- npx @databrainhq/mcp-server
```

Then set your token in the environment:

```bash
export DATABRAIN_SERVICE_TOKEN="<your-service-token>"
```

Or add it to your project's `.claude/settings.json`:

```json
{
  "env": {
    "DATABRAIN_SERVICE_TOKEN": "<your-service-token>"
  }
}
```
#### Cursor

File: `.cursor/mcp.json` (project-level) or `~/.cursor/mcp.json` (global)

```json
{
  "mcpServers": {
    "databrain": {
      "command": "npx",
      "args": ["@databrainhq/mcp-server"],
      "env": {
        "DATABRAIN_SERVICE_TOKEN": "<your-service-token>"
      }
    }
  }
}
```
#### Windsurf

File: `~/.codeium/windsurf/mcp_config.json`

```json
{
  "mcpServers": {
    "databrain": {
      "command": "npx",
      "args": ["@databrainhq/mcp-server"],
      "env": {
        "DATABRAIN_SERVICE_TOKEN": "<your-service-token>"
      }
    }
  }
}
```
#### ChatGPT (via MCP bridge)

ChatGPT doesn't natively support MCP yet. Use an MCP-to-OpenAI bridge like [mcp-bridge](https://github.com/nicobailon/mcp-bridge) or [mcp-openai](https://github.com/AshDevFr/mcp-openai-proxy):

```bash
# Example with mcp-bridge
npx mcp-bridge --config mcp-config.json
```

`mcp-config.json`:

```json
{
  "mcpServers": {
    "databrain": {
      "command": "npx",
      "args": ["@databrainhq/mcp-server"],
      "env": {
        "DATABRAIN_SERVICE_TOKEN": "<your-service-token>"
      }
    }
  }
}
```
#### Any MCP-Compatible Client

The server uses **stdio transport** by default. Any client that supports MCP over stdio can connect:

```bash
npx @databrainhq/mcp-server
```

Set environment variables before running:

```bash
export DATABRAIN_SERVICE_TOKEN="<your-service-token>"
export DATABRAIN_API_URL="https://api.usedatabrain.com" # optional, this is the default
```
---

### 3. Start Using It

Once connected, just tell your AI assistant:

> **"Set up a Databrain dashboard embed"**

The assistant will use the `setup_embed_interactive` tool to walk you through:
1. Discovering your data apps (it confirms your choice even if there's only one)
2. Checking for or creating an API token
3. Listing embeds and confirming which one to use
4. Validating the pipeline with a test guest token
5. Returning a ready-to-use config block with all IDs and env vars

No need to memorize tool names or API details — the orchestrator handles it.
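The numbered flow maps onto a handful of the tools documented below. As a rough sketch (a hypothetical data structure, not the package's code; the real orchestrator also confirms each choice with you and returns the final config as the last step):

```typescript
// Simplified sketch of the tool calls behind setup_embed_interactive.
// Tool names are taken from this README; the actual call order may differ.
const setupFlow: { step: string; tool: string }[] = [
  { step: "discover data apps", tool: "list_data_apps" },
  { step: "check or create an API token", tool: "create_api_token" },
  { step: "pick an embed", tool: "list_embeds" },
  { step: "validate with a test guest token", tool: "generate_guest_token" },
];

for (const { step, tool } of setupFlow) {
  console.log(`${step} -> ${tool}`);
}
```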
---

## Environment Variables

| Variable | Required | Description |
|----------|----------|-------------|
| `DATABRAIN_SERVICE_TOKEN` | Yes\* | Organization-level token for datasources, data apps, and datamarts |
| `DATABRAIN_API_TOKEN` | No | Data-app-scoped token for embeds, queries, and guest tokens. Can be created via the `create_api_token` tool |
| `DATABRAIN_API_URL` | No | API base URL (default: `https://api.usedatabrain.com`) |

\*At least one of `DATABRAIN_SERVICE_TOKEN` or `DATABRAIN_API_TOKEN` is required.
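As a sketch of the rules in the table (a hypothetical helper, not the package's actual startup code):

```typescript
// Hypothetical config resolution mirroring the table above:
// at least one token is required, and the API URL has a default.
interface DatabrainConfig {
  serviceToken?: string;
  apiToken?: string;
  apiUrl: string;
}

function resolveConfig(env: Record<string, string | undefined>): DatabrainConfig {
  const serviceToken = env.DATABRAIN_SERVICE_TOKEN;
  const apiToken = env.DATABRAIN_API_TOKEN;
  // At least one of the two tokens must be present.
  if (!serviceToken && !apiToken) {
    throw new Error("Set DATABRAIN_SERVICE_TOKEN or DATABRAIN_API_TOKEN");
  }
  return {
    serviceToken,
    apiToken,
    // DATABRAIN_API_URL is optional and falls back to the documented default.
    apiUrl: env.DATABRAIN_API_URL ?? "https://api.usedatabrain.com",
  };
}
```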
---

## What's Inside

### Orchestration
- `setup_embed_interactive` — Interactive step-by-step embed setup. Confirms choices with the user, validates the pipeline, and returns a deterministic config block.

### Tools (28 total)

<details>
<summary>Infrastructure</summary>

- `list_datasources` — List all connected datasources
</details>

<details>
<summary>Data Apps & Tokens</summary>

- `create_data_app` — Create a logical container for embeds
- `list_data_apps` — List data apps
- `create_api_token` — Generate an API token scoped to a data app
- `list_api_tokens` — List API tokens for a data app
- `rotate_api_token` — Rotate an API token (invalidates the old one)
</details>

<details>
<summary>Datamarts & Semantic Layer</summary>

- `list_datamarts` — List datamarts
- `get_semantic_layer_status` — Check semantic layer generation progress
</details>

<details>
<summary>Embed Management</summary>

- `create_embed` — Create an embed config for a dashboard/metric
- `list_embeds` — List embed configurations
- `get_embed_details` — Get full embed details
- `update_embed` — Update embed access settings, permissions, or theme
- `delete_embed` — Delete an embed configuration
- `rename_embed` — Rename an embed
- `apply_embed_preset` — Apply theme/access presets (light/dark/corporate/minimal + view-only/power-user/export-only/ai-enabled)
</details>
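The preset names under `apply_embed_preset` pair one theme with one access level. Typed out (hypothetical shapes inferred from the preset names above, not the tool's actual schema):

```typescript
// Hypothetical argument shape for apply_embed_preset, inferred from the
// preset names listed in this README; the tool's real schema may differ.
const THEME_PRESETS = ["light", "dark", "corporate", "minimal"] as const;
const ACCESS_PRESETS = ["view-only", "power-user", "export-only", "ai-enabled"] as const;

type ThemePreset = (typeof THEME_PRESETS)[number];
type AccessPreset = (typeof ACCESS_PRESETS)[number];

interface ApplyEmbedPresetArgs {
  embedId: string;
  theme?: ThemePreset;
  access?: AccessPreset;
}

// Runtime check that a theme/access pair uses known preset names.
function isKnownPreset(theme: string, access: string): boolean {
  return (
    (THEME_PRESETS as readonly string[]).includes(theme) &&
    (ACCESS_PRESETS as readonly string[]).includes(access)
  );
}
```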
<details>
<summary>Frontend Embedding</summary>

- `generate_guest_token` — Generate a frontend auth token
- `generate_embed_code` — Generate framework-specific code (React, Next.js, Vue, Angular, Svelte, SolidJS, vanilla JS)
</details>

<details>
<summary>Data & Analytics</summary>

- `list_dashboards` — List dashboards
- `list_metrics` — List metrics
- `query_metric_data` — Fetch metric data
- `get_dashboard_data` — Fetch all metrics in a dashboard
- `ask_ai_pilot` — Ask natural-language data questions
- `download_metric_csv` — Export metric data as CSV
- `list_scheduled_reports` — List scheduled email reports
- `export_dashboard` — Export a dashboard as JSON
- `import_dashboard` — Import a dashboard from JSON
</details>
### Resources (9)

- `databrain://getting-started` — Entity model and onboarding guide
- `databrain://api-reference` — API endpoint reference
- `databrain://embedding-guide` — Framework-specific embedding instructions
- `databrain://filter-reference` — Filter types and operators
- `databrain://theme-reference` — Theme customization
- `databrain://web-component-reference` — Web component props
- `databrain://self-serve-reference` — Self-serve analytics settings
- `databrain://semantic-layer-guide` — Semantic layer for AI chat
- `databrain://multi-tenancy-guide` — Multi-tenant row-level security

### Prompts (1)

- `explore-data` — Discover available data and dashboards in your workspace

---
## For AI/LLM Developers

See [`docs/LLM_INSTRUCTIONS.md`](docs/LLM_INSTRUCTIONS.md) for detailed guidance on:
- Tool chaining and the dependency graph
- Common workflows (embed, explore, theme, roles, export/import)
- Error recovery patterns
- Expected output formats
- Security guardrails

---

## Development

```bash
npm install
npm run build   # Build with tsup
npm test        # Run tests with vitest
npm run dev     # Run with tsx (development)
npm run lint    # Type check
```

### Testing with MCP Inspector

```bash
npx @modelcontextprotocol/inspector -- npx tsx src/index.ts
```

---

## License

MIT
package/dist/index.d.ts
ADDED