@einlogic/mcp-fabric-api 2.5.0 → 2.6.0

package/README.md CHANGED
@@ -1,580 +1,662 @@
1
- # mcp-fabric-api
2
-
3
- MCP (Model Context Protocol) server for the Microsoft Fabric REST APIs. Built for data engineers and data analysts who want to use AI assistants beyond Copilot — such as Claude, Claude Code, or any MCP-compatible client — to build and manage their Fabric components. Covers workspaces, lakehouses, warehouses, notebooks, pipelines, semantic models, reports, dataflows, eventhouses, eventstreams, reflexes, GraphQL APIs, SQL endpoints, variable libraries, git integration, deployment pipelines, mirrored databases, KQL databases, ML models, ML experiments, copy jobs, and external data shares.
4
-
5
- > **Safe by default:** This server blocks all destructive operations (create, update, delete) until you explicitly configure the `WRITABLE_WORKSPACES` environment variable. Read operations always work. Set `WRITABLE_WORKSPACES="*"` to allow writes to all workspaces, or use patterns to limit access. See [Workspace Safety Guard](#workspace-safety-guard) for details.
6
-
7
- ## Prerequisites
8
-
9
- - Node.js 18+
10
- - Azure CLI (`az login` for authentication)
11
- - Access to a Microsoft Fabric workspace
12
-
13
- ## Quick Start
14
-
15
- Authenticate with Azure CLI:
16
-
17
- ```bash
18
- az login
19
- ```
20
-
21
- Run directly with npx (no install needed):
22
-
23
- ```bash
24
- npx @einlogic/mcp-fabric-api
25
- ```
26
-
27
- ## Setup
28
-
29
- ### Claude Desktop
30
-
31
- Add to your Claude Desktop config file:
32
-
33
- - **macOS:** `~/Library/Application Support/Claude/claude_desktop_config.json`
34
- - **Windows:** `%APPDATA%\Claude\claude_desktop_config.json`
35
-
36
- ```json
37
- {
38
- "mcpServers": {
39
- "fabric": {
40
- "command": "npx",
41
- "args": ["-y", "@einlogic/mcp-fabric-api"]
42
- }
43
- }
44
- }
45
- ```
46
-
47
- ### Claude Code CLI
48
-
49
- ```bash
50
- claude mcp add fabric -- npx -y @einlogic/mcp-fabric-api
51
- ```
52
-
53
- To verify it was added:
54
-
55
- ```bash
56
- claude mcp list
57
- ```
58
-
59
- ### HTTP Mode (Remote)
60
-
61
- For remote deployments, set environment variables:
62
-
63
- ```bash
64
- export TRANSPORT=http
65
- export PORT=3000
66
- export AZURE_CLIENT_ID=your-client-id
67
- export AZURE_CLIENT_SECRET=your-client-secret
68
- export AZURE_TENANT_ID=your-tenant-id
69
- npx @einlogic/mcp-fabric-api
70
- ```
71
-
72
- The server exposes:
73
- `POST /mcp` — MCP endpoint (StreamableHTTP)
74
- - `GET /mcp` — SSE stream for server notifications
75
- - `DELETE /mcp` — Session cleanup
76
- - `GET /.well-known/oauth-protected-resource` — OAuth metadata
77
-
78
- ### Workspace Safety Guard
79
-
80
- Control which workspaces allow write operations (create, update, delete) via the `WRITABLE_WORKSPACES` environment variable. Only workspaces whose names match the configured patterns permit writes; read operations are never restricted.
81
-
82
- > **Default behavior: When `WRITABLE_WORKSPACES` is not set or empty, all destructive operations are blocked.** You must explicitly configure this variable to enable writes.
83
-
84
- | `WRITABLE_WORKSPACES` value | Behavior |
85
- |------------------------------|----------|
86
- | Not set / empty | **All writes blocked** (safe default) |
87
- | `*` | All workspaces writable |
88
- | `*-Dev,*-Test,Sandbox*` | Only matching workspaces writable |
89
-
90
- Set comma-separated glob patterns:
91
-
92
- ```bash
93
- WRITABLE_WORKSPACES=*-Dev,*-Test,Sandbox*
94
- ```
95
-
96
- **Wildcard examples:**
97
- - `*` matches all workspaces (allow everything)
98
- - `*-Dev` matches "Sales-Dev", "Finance-Dev"
99
- - `Sandbox*` matches "Sandbox-123", "Sandbox-Mike"
100
- - `Exact-Name` matches only "Exact-Name" (case-insensitive)
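
The matching rules above can be sketched with bash's own glob matching. This is a hypothetical illustration only; `is_writable` is not part of the package, and the server performs this check internally.

```shell
# Hypothetical helper mirroring the documented matching rules:
# comma-separated glob patterns, case-insensitive, "*" matches everything.
is_writable() {
  local name lowered pattern
  local -a patterns
  name=$(printf '%s' "$1" | tr '[:upper:]' '[:lower:]')
  lowered=$(printf '%s' "$2" | tr '[:upper:]' '[:lower:]')
  IFS=',' read -r -a patterns <<< "$lowered"
  for pattern in "${patterns[@]}"; do
    case "$name" in
      $pattern) return 0 ;;   # unquoted on purpose: glob match
    esac
  done
  return 1   # no pattern matched (or WRITABLE_WORKSPACES was empty)
}

is_writable "Sales-Dev" "*-Dev,*-Test,Sandbox*" && echo writable   # prints "writable"
is_writable "Production-Analytics" "*-Dev,*-Test" || echo blocked  # prints "blocked"
```

Note how the empty-pattern case falls through to `return 1`, which matches the safe default: with no patterns configured, nothing is writable.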
101
-
102
- **Guarded tools (89 total)** — every tool that creates, updates, or deletes workspace items:
103
-
104
- | Domain | Guarded tools |
105
- |--------|--------------|
106
- | Workspace | `workspace_update`, `workspace_delete` |
107
- | Lakehouse | `lakehouse_create`, `lakehouse_update`, `lakehouse_delete`, `lakehouse_load_table`, `lakehouse_create_shortcut`, `lakehouse_update_definition`, `lakehouse_delete_shortcut` |
108
- | Warehouse | `warehouse_create`, `warehouse_update`, `warehouse_delete`, `warehouse_update_definition` |
109
- | Notebook | `notebook_create`, `notebook_update`, `notebook_delete`, `notebook_update_definition` |
110
- | Pipeline | `pipeline_create`, `pipeline_update`, `pipeline_delete`, `pipeline_create_schedule`, `pipeline_update_schedule`, `pipeline_delete_schedule`, `pipeline_update_definition` |
111
- | Semantic Model | `semantic_model_create_bim`, `semantic_model_create_tmdl`, `semantic_model_update_details`, `semantic_model_delete`, `semantic_model_update_bim`, `semantic_model_update_tmdl`, `semantic_model_take_over` |
112
- | Report | `report_create_definition`, `report_update`, `report_delete`, `report_clone`, `report_update_definition`, `report_rebind` |
113
- | Dataflow | `dataflow_create`, `dataflow_update`, `dataflow_delete` |
114
- | Eventhouse | `eventhouse_create`, `eventhouse_update`, `eventhouse_delete` |
115
- | Eventstream | `eventstream_create`, `eventstream_update`, `eventstream_delete`, `eventstream_update_definition` |
116
- | Reflex | `reflex_create`, `reflex_update`, `reflex_delete`, `reflex_update_definition` |
117
- | GraphQL API | `graphql_api_create`, `graphql_api_update`, `graphql_api_delete` |
118
- | Variable Library | `variable_library_create`, `variable_library_update`, `variable_library_delete`, `variable_library_update_definition` |
119
- | Git Integration | `git_connect`, `git_disconnect`, `git_initialize_connection`, `git_commit_to_git`, `git_update_from_git`, `git_update_credentials` |
120
- | Deployment Pipeline | `deployment_pipeline_assign_workspace`, `deployment_pipeline_unassign_workspace`, `deployment_pipeline_deploy` |
121
- | Mirrored Database | `mirrored_database_create`, `mirrored_database_update`, `mirrored_database_delete`, `mirrored_database_update_definition`, `mirrored_database_start_mirroring`, `mirrored_database_stop_mirroring` |
122
- | KQL Database | `kql_database_create`, `kql_database_update`, `kql_database_delete`, `kql_database_update_definition` |
123
- | ML Model | `ml_model_create`, `ml_model_update`, `ml_model_delete` |
124
- | ML Experiment | `ml_experiment_create`, `ml_experiment_update`, `ml_experiment_delete` |
125
- | Copy Job | `copy_job_create`, `copy_job_update`, `copy_job_delete`, `copy_job_update_definition` |
126
- | External Data Share | `external_data_share_create`, `external_data_share_revoke` |
127
-
128
- **Not guarded:** Read operations (list, get, get_definition, get_bim, get_tmdl), query execution (DAX, KQL, SQL, GraphQL), run/refresh/cancel operations, export operations, and deployment pipeline CRUD (tenant-level, not workspace-scoped).
129
-
130
- **Claude Desktop config with guard:**
131
- ```json
132
- {
133
- "mcpServers": {
134
- "fabric": {
135
- "command": "npx",
136
- "args": ["-y", "@einlogic/mcp-fabric-api"],
137
- "env": {
138
- "WRITABLE_WORKSPACES": "*-Dev,*-Test,Sandbox*"
139
- }
140
- }
141
- }
142
- }
143
- ```
144
-
145
- **Claude Code CLI with guard:**
146
- ```bash
147
- WRITABLE_WORKSPACES="*-Dev,*-Test" claude mcp add fabric -- npx -y @einlogic/mcp-fabric-api
148
- ```
149
-
150
- **Error when not configured:**
151
- ```
152
- WRITABLE_WORKSPACES is not configured. Destructive actions are blocked by default. Set WRITABLE_WORKSPACES to a comma-separated list of workspace name patterns, or "*" to allow all.
153
- ```
154
-
155
- **Error when workspace not in allow list:**
156
- ```
157
- Workspace "Production-Analytics" is not in the writable workspaces list. Allowed patterns: *-Dev, *-Test, Sandbox*
158
- ```
159
-
160
- ### Debug Logging
161
-
162
- Enable verbose debug logging to diagnose API errors, inspect request/response details, and trace long-running operations. All log output goes to `stderr`, so it is visible in Claude Desktop's log files and never interferes with JSON-RPC on stdout.
163
-
164
- Set the `LOG_LEVEL` environment variable to `debug`:
165
-
166
- **Claude Desktop config:**
167
- ```json
168
- {
169
- "mcpServers": {
170
- "fabric": {
171
- "command": "npx",
172
- "args": ["-y", "@einlogic/mcp-fabric-api"],
173
- "env": {
174
- "LOG_LEVEL": "debug"
175
- }
176
- }
177
- }
178
- }
179
- ```
180
-
181
- **Claude Code CLI:**
182
- ```bash
183
- LOG_LEVEL=debug claude mcp add fabric -- npx -y @einlogic/mcp-fabric-api
184
- ```
185
-
186
- **What gets logged at debug level:**
187
-
188
- | Category | Details logged |
189
- |----------|---------------|
190
- | HTTP requests | Method, full URL, request body size in bytes |
191
- | HTTP responses | Status code, duration (ms), `x-ms-request-id` header |
192
- | Definition uploads | Part paths, payload types, payload sizes — never payload content |
193
- | API errors | Full error body including `errorCode`, `details[]`, `innererror`, `relatedResource`, `x-ms-request-id` |
194
- | LRO polling | Operation ID, poll count, elapsed time, final status |
195
- | Pagination | Page count, items per page, total items |
196
- | SQL/KQL queries | Server, database, duration, column/row counts — never query text or result data |
197
- | Rate limiting | Retry-after duration, affected endpoint |
198
-
199
- **Compliance:** Debug logging never captures actual data content — no query text, no query results, no definition payloads, no bearer tokens. Only structural metadata (URLs, sizes, counts, timing, error details) is logged.
200
-
201
- **Viewing logs in Claude Desktop:**
202
-
203
- - **macOS:** `~/Library/Logs/Claude/mcp-server-fabric.log`
204
- - **Windows:** `%APPDATA%\Claude\logs\mcp-server-fabric.log`
205
-
206
- You can also tail the log in real time:
207
- ```bash
208
- # macOS
209
- tail -f ~/Library/Logs/Claude/mcp-server-fabric.log
210
-
211
- # Windows (PowerShell)
212
- Get-Content "$env:APPDATA\Claude\logs\mcp-server-fabric.log" -Wait
213
- ```
214
-
215
- The `x-ms-request-id` value logged with every API error is the key identifier needed when opening a support case with Microsoft for Fabric API issues.
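
Since that identifier is what support will ask for, a small filter can collect the unique `x-ms-request-id` values from a log stream. This is a hypothetical helper: the exact log line layout may differ from what the pattern assumes, so adjust it to your output.

```shell
# Extract unique x-ms-request-id values from log lines on stdin.
# Assumes the id appears verbatim in the debug output, terminated by
# a comma or closing brace; tweak the pattern if your lines differ.
extract_request_ids() {
  grep -o 'x-ms-request-id[^,}]*' | sort -u
}

# Usage against the Claude Desktop log on macOS:
# tail -n 500 ~/Library/Logs/Claude/mcp-server-fabric.log | extract_request_ids
```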
216
-
217
- ### File-Based I/O
218
-
219
- To avoid large payloads overwhelming MCP clients, definition tools use file paths instead of inline content. The server reads files from disk when sending definitions to Fabric, and writes files to disk when retrieving definitions from Fabric.
220
-
221
- **Input tools** — the server reads definition files from the specified path and uploads them to Fabric:
222
-
223
- | Tool | Parameter | Description |
224
- |------|-----------|-------------|
225
- | `semantic_model_create_bim` | `definitionFilePath` | Path to model.bim JSON file |
226
- | `semantic_model_update_bim` | `definitionFilePath` | Path to model.bim JSON file |
227
- | `semantic_model_create_tmdl` | `filesDirectoryPath` | Directory of `.tmdl` and `.pbism` files |
228
- | `semantic_model_update_tmdl` | `filesDirectoryPath` | Directory of `.tmdl` and `.pbism` files |
229
- | `notebook_update_definition` | `definitionDirectoryPath` | Directory containing notebook definition files |
230
- | `eventstream_update_definition` | `definitionDirectoryPath` | Directory containing eventstream definition files |
231
- | `report_create_definition` | `definitionDirectoryPath` | Directory of PBIR report definition files |
232
- | `report_update_definition` | `definitionDirectoryPath` | Directory of PBIR report definition files |
233
- | `variable_library_create` | `definitionDirectoryPath` | Directory of `.json` and `.platform` files |
234
- | `variable_library_update_definition` | `definitionDirectoryPath` | Directory of `.json` and `.platform` files |
235
- | `lakehouse_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
236
- | `warehouse_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
237
- | `pipeline_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
238
- | `reflex_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
239
- | `mirrored_database_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
240
- | `kql_database_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
241
- | `copy_job_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
242
-
243
- **Output tools** — the server retrieves definitions from Fabric and writes them to disk:
244
-
245
- | Tool | Parameter | What gets written |
246
- |------|-----------|-------------------|
247
- | `semantic_model_get_bim` | `outputFilePath` | Single `model.bim` JSON file |
248
- | `semantic_model_get_tmdl` | `outputDirectoryPath` | TMDL files preserving folder structure |
249
- | `notebook_get_definition` | `outputDirectoryPath` | Notebook definition files |
250
- | `lakehouse_get_definition` | `outputDirectoryPath` | Lakehouse definition files |
251
- | `warehouse_get_definition` | `outputDirectoryPath` | Warehouse definition files |
252
- | `pipeline_get_definition` | `outputDirectoryPath` | Pipeline definition files |
253
- | `report_get_definition` | `outputDirectoryPath` | Report definition files (report.json, pages, visuals) |
254
- | `dataflow_get_definition` | `outputDirectoryPath` | Dataflow definition files |
255
- | `eventstream_get_definition` | `outputDirectoryPath` | Eventstream definition files |
256
- | `graphql_api_get_definition` | `outputDirectoryPath` | GraphQL schema definition files |
257
- | `reflex_get_definition` | `outputDirectoryPath` | Reflex definition files |
258
- | `variable_library_get_definition` | `outputDirectoryPath` | Variable library files (variables.json, valueSets/) |
259
- | `mirrored_database_get_definition` | `outputDirectoryPath` | Mirrored database definition files |
260
- | `kql_database_get_definition` | `outputDirectoryPath` | KQL database definition files |
261
- | `copy_job_get_definition` | `outputDirectoryPath` | Copy job definition files |
262
-
263
- **TMDL directory structure example:**
264
- ```
265
- /tmp/my-model/
266
- model.tmdl
267
- definition.pbism
268
- definition/
269
- tables/
270
- Sales.tmdl
271
- Product.tmdl
272
- relationships.tmdl
273
- ```
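
For instance, the layout above can be scaffolded on disk before calling `semantic_model_create_tmdl`. The file contents below are placeholders only, not a complete valid model.

```shell
# Scaffold the TMDL directory structure shown above.
# Placeholder contents; a real model needs full TMDL/.pbism definitions.
MODEL_DIR="$(mktemp -d)/my-model"
mkdir -p "$MODEL_DIR/definition/tables"
printf 'model Model\n'              > "$MODEL_DIR/model.tmdl"
printf '{ "version": "4.0" }\n'     > "$MODEL_DIR/definition.pbism"
printf 'table Sales\n'              > "$MODEL_DIR/definition/tables/Sales.tmdl"
printf 'table Product\n'            > "$MODEL_DIR/definition/tables/Product.tmdl"
printf 'relationship placeholder\n' > "$MODEL_DIR/definition/relationships.tmdl"
ls -R "$MODEL_DIR"
```

Pass the resulting directory as `filesDirectoryPath` when creating or updating the model.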
274
-
275
- ## Development
276
-
277
- ```bash
278
- git clone https://github.com/your-org/mcp-fabric-api.git
279
- cd mcp-fabric-api
280
- npm install
281
- npm run build
282
- npm start
283
- npm run dev # Watch mode
284
- npm run inspect # Launch MCP Inspector
285
- ```
286
-
287
- ## Tools (197 total)
288
-
289
- ### Auth (4 tools)
290
- | Tool | Description |
291
- |------|-------------|
292
- | `auth_get_current_account` | Show current Azure identity, tenant, and token expiry |
293
- | `auth_list_available_accounts` | List subscriptions/tenants from local `az login` state (does not query Entra) |
294
- | `auth_switch_tenant` | Switch to a different Azure tenant (with rollback on failure) |
295
- | `auth_clear_token_cache` | Clear cached tokens to force re-acquisition |
296
-
297
- ### Workspace (6 tools)
298
- | Tool | Description |
299
- |------|-------------|
300
- | `workspace_list` | List all accessible Fabric workspaces |
301
- | `workspace_get` | Get details of a specific workspace |
302
- | `workspace_create` | Create a new workspace |
303
- | `workspace_update` | Update a workspace's name or description |
304
- | `workspace_delete` | Delete a workspace |
305
- | `workspace_list_items` | List all items in a workspace (with optional type filter) |
306
-
307
- ### Lakehouse (14 tools)
308
- | Tool | Description |
309
- |------|-------------|
310
- | `lakehouse_list` | List all lakehouses in a workspace |
311
- | `lakehouse_get` | Get lakehouse details (SQL endpoint, OneLake paths) |
312
- | `lakehouse_create` | Create a new lakehouse (LRO, schemas enabled by default) |
313
- | `lakehouse_update` | Update lakehouse name or description |
314
- | `lakehouse_delete` | Delete a lakehouse |
315
- | `lakehouse_list_tables` | List all tables in a lakehouse (falls back to SQL endpoint for schema-enabled lakehouses) |
316
- | `lakehouse_load_table` | Load data into a table from OneLake (LRO). Not supported for schema-enabled lakehouses |
317
- | `lakehouse_create_shortcut` | Create a OneLake shortcut (file, folder, table, or schema level) with support for multiple target types |
318
- | `lakehouse_get_sql_endpoint` | Get SQL endpoint details |
319
- | `lakehouse_get_definition` | Get lakehouse definition (LRO). Writes files to `outputDirectoryPath` |
320
- | `lakehouse_update_definition` | Update lakehouse definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
321
- | `lakehouse_list_shortcuts` | List all OneLake shortcuts in a lakehouse |
322
- | `lakehouse_get_shortcut` | Get details of a specific OneLake shortcut |
323
- | `lakehouse_delete_shortcut` | Delete a OneLake shortcut |
324
-
325
- ### Warehouse (9 tools)
326
- | Tool | Description |
327
- |------|-------------|
328
- | `warehouse_list` | List all warehouses in a workspace |
329
- | `warehouse_get` | Get warehouse details including connection string and provisioning status |
330
- | `warehouse_create` | Create a new warehouse (LRO) |
331
- | `warehouse_update` | Update warehouse name or description |
332
- | `warehouse_delete` | Delete a warehouse |
333
- | `warehouse_get_sql_endpoint` | Get SQL connection details for a warehouse |
334
- | `warehouse_list_tables` | List all tables in a warehouse |
335
- | `warehouse_get_definition` | Get warehouse definition (LRO). Writes files to `outputDirectoryPath` |
336
- | `warehouse_update_definition` | Update warehouse definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
337
-
338
- ### Notebook (10 tools)
339
- | Tool | Description |
340
- |------|-------------|
341
- | `notebook_list` | List all notebooks in a workspace |
342
- | `notebook_get` | Get notebook details |
343
- | `notebook_create` | Create a new notebook (LRO) |
344
- | `notebook_update` | Update notebook name or description |
345
- | `notebook_delete` | Delete a notebook |
346
- | `notebook_get_definition` | Get notebook definition (LRO). Writes files to `outputDirectoryPath` |
347
- | `notebook_update_definition` | Update notebook definition (LRO). Reads files from `definitionDirectoryPath` |
348
- | `notebook_run` | Run a notebook on demand |
349
- | `notebook_get_run_status` | Get notebook run status |
350
- | `notebook_cancel_run` | Cancel a running notebook |
351
-
352
- ### Pipeline (15 tools)
353
- | Tool | Description |
354
- |------|-------------|
355
- | `pipeline_list` | List all data pipelines |
356
- | `pipeline_get` | Get pipeline details |
357
- | `pipeline_create` | Create a new pipeline |
358
- | `pipeline_update` | Update pipeline name or description |
359
- | `pipeline_delete` | Delete a pipeline |
360
- | `pipeline_run` | Run a pipeline on demand |
361
- | `pipeline_get_run_status` | Get pipeline run status |
362
- | `pipeline_cancel_run` | Cancel a running pipeline |
363
- | `pipeline_list_runs` | List all run instances |
364
- | `pipeline_list_schedules` | List pipeline schedules |
365
- | `pipeline_create_schedule` | Create a pipeline schedule |
366
- | `pipeline_update_schedule` | Update a pipeline schedule |
367
- | `pipeline_delete_schedule` | Delete a pipeline schedule |
368
- | `pipeline_get_definition` | Get pipeline definition (LRO). Writes files to `outputDirectoryPath` |
369
- | `pipeline_update_definition` | Update pipeline definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
370
-
371
- ### Semantic Model (15 tools)
372
- | Tool | Description |
373
- |------|-------------|
374
- | `semantic_model_list` | List all semantic models |
375
- | `semantic_model_get_details` | Get semantic model metadata (name, ID, description) — does not return the definition |
376
- | `semantic_model_create_bim` | Create a semantic model from a BIM/JSON file (LRO). Reads `model.bim` from `definitionFilePath` |
377
- | `semantic_model_create_tmdl` | Create a semantic model from TMDL files (LRO). Reads `.tmdl`/`.pbism` from `filesDirectoryPath` |
378
- | `semantic_model_update_details` | Update semantic model name or description — does not modify the definition |
379
- | `semantic_model_delete` | Delete a semantic model |
380
- | `semantic_model_refresh` | Trigger a model refresh (Power BI API) |
381
- | `semantic_model_execute_dax` | Execute a DAX query (Power BI API) |
382
- | `semantic_model_get_bim` | Get definition in BIM/JSON format (LRO). Writes `model.bim` to `outputFilePath` |
383
- | `semantic_model_get_tmdl` | Get definition in TMDL format (LRO). Writes TMDL files to `outputDirectoryPath` |
384
- | `semantic_model_update_bim` | Update definition from BIM/JSON file (LRO). Reads `model.bim` from `definitionFilePath` |
385
- | `semantic_model_update_tmdl` | Update definition from TMDL files (LRO). Reads `.tmdl`/`.pbism` from `filesDirectoryPath` |
386
- | `semantic_model_get_refresh_history` | Get refresh history (Power BI API) |
387
- | `semantic_model_take_over` | Take over ownership of a semantic model (Power BI API) |
388
- | `semantic_model_get_datasources` | Get data sources of a semantic model (Power BI API) |
389
-
390
- ### Report (13 tools)
391
- | Tool | Description |
392
- |------|-------------|
393
- | `report_list` | List all reports |
394
- | `report_get` | Get report details |
395
- | `report_create_definition` | Create a new report from PBIR definition files (LRO). Reads from `definitionDirectoryPath` |
396
- | `report_update` | Update report name or description |
397
- | `report_delete` | Delete a report |
398
- | `report_clone` | Clone a report (Power BI API) |
399
- | `report_export` | Export a report to a file format (PDF, PPTX, PNG, etc.) via the Power BI API |
400
- | `report_get_export_status` | Check report export status |
401
- | `report_get_definition` | Get report definition (LRO). Writes files to `outputDirectoryPath` |
402
- | `report_update_definition` | Update report definition from PBIR directory (LRO). Reads from `definitionDirectoryPath` |
403
- | `report_rebind` | Rebind a report to a different semantic model/dataset (Power BI API) |
404
- | `report_get_pages` | Get the list of pages in a report (Power BI API) |
405
- | `report_get_datasources` | Get data sources used by a report (Power BI API) |
406
-
407
- ### Dataflow Gen2 (8 tools)
408
- | Tool | Description |
409
- |------|-------------|
410
- | `dataflow_list` | List all Dataflow Gen2 items |
411
- | `dataflow_get` | Get dataflow details |
412
- | `dataflow_create` | Create a new dataflow |
413
- | `dataflow_update` | Update dataflow name or description |
414
- | `dataflow_delete` | Delete a dataflow |
415
- | `dataflow_refresh` | Trigger a dataflow refresh |
416
- | `dataflow_get_refresh_status` | Get refresh job status |
417
- | `dataflow_get_definition` | Get dataflow definition (LRO). Writes files to `outputDirectoryPath` |
418
-
419
- ### Eventhouse (7 tools)
420
- | Tool | Description |
421
- |------|-------------|
422
- | `eventhouse_list` | List all eventhouses |
423
- | `eventhouse_get` | Get eventhouse details |
424
- | `eventhouse_create` | Create a new eventhouse (LRO) |
425
- | `eventhouse_update` | Update eventhouse name or description |
426
- | `eventhouse_delete` | Delete an eventhouse |
427
- | `eventhouse_get_sql_endpoint` | Get query service URI and connection details |
428
- | `eventhouse_execute_kql` | Execute a KQL query against a KQL database |
429
-
430
- ### Eventstream (7 tools)
431
- | Tool | Description |
432
- |------|-------------|
433
- | `eventstream_list` | List all eventstreams |
434
- | `eventstream_get` | Get eventstream details |
435
- | `eventstream_create` | Create a new eventstream (LRO) |
436
- | `eventstream_update` | Update eventstream name or description |
437
- | `eventstream_delete` | Delete an eventstream |
438
- | `eventstream_get_definition` | Get eventstream definition (LRO). Writes files to `outputDirectoryPath` |
439
- | `eventstream_update_definition` | Update eventstream definition (LRO). Reads from `definitionDirectoryPath` |
440
-
441
- ### Reflex / Activator (7 tools)
442
- | Tool | Description |
443
- |------|-------------|
444
- | `reflex_list` | List all Reflex (Activator) items |
445
- | `reflex_get` | Get reflex details |
446
- | `reflex_create` | Create a new reflex |
447
- | `reflex_update` | Update reflex name or description |
448
- | `reflex_delete` | Delete a reflex |
449
- | `reflex_get_definition` | Get reflex definition (LRO). Writes files to `outputDirectoryPath` |
450
- | `reflex_update_definition` | Update reflex definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
451
-
452
- ### GraphQL API (7 tools)
453
- | Tool | Description |
454
- |------|-------------|
455
- | `graphql_api_list` | List all GraphQL API items |
456
- | `graphql_api_get` | Get GraphQL API details |
457
- | `graphql_api_create` | Create a new GraphQL API |
458
- | `graphql_api_update` | Update GraphQL API name or description |
459
- | `graphql_api_delete` | Delete a GraphQL API |
460
- | `graphql_api_get_definition` | Get GraphQL schema definition (LRO). Writes files to `outputDirectoryPath` |
461
- | `graphql_api_execute_query` | Execute a GraphQL query |
462
-
463
- ### SQL Endpoint (4 tools)
464
- | Tool | Description |
465
- |------|-------------|
466
- | `sql_endpoint_list` | List all SQL endpoints |
467
- | `sql_endpoint_get` | Get SQL endpoint details |
468
- | `sql_endpoint_get_connection_string` | Get TDS connection string |
469
- | `sql_endpoint_execute_query` | Execute a T-SQL query against a lakehouse or warehouse SQL endpoint |
470
-
471
- ### Variable Library (7 tools)
472
- | Tool | Description |
473
- |------|-------------|
474
- | `variable_library_list` | List all variable libraries in a workspace |
475
- | `variable_library_get` | Get variable library details including active value set name |
476
- | `variable_library_create` | Create a variable library, optionally with definition files from `definitionDirectoryPath` (LRO) |
477
- | `variable_library_update` | Update name, description, or active value set |
478
- | `variable_library_delete` | Delete a variable library |
479
- | `variable_library_get_definition` | Get definition (LRO). Writes files (variables.json, valueSets/) to `outputDirectoryPath` |
480
- | `variable_library_update_definition` | Update definition from directory of `.json` and `.platform` files (LRO) |
481
-
482
- ### Git Integration (9 tools)
483
- | Tool | Description |
484
- |------|-------------|
485
- | `git_get_connection` | Get Git connection details for a workspace |
486
- | `git_get_status` | Get Git status of items (sync state between workspace and remote) |
487
- | `git_connect` | Connect a workspace to a Git repository (Azure DevOps or GitHub) |
488
- | `git_disconnect` | Disconnect a workspace from its Git repository |
489
- | `git_initialize_connection` | Initialize a Git connection after connecting (LRO) |
490
- | `git_commit_to_git` | Commit workspace changes to the connected Git repository (LRO) |
491
- | `git_update_from_git` | Update workspace from the connected Git repository (LRO) |
492
- | `git_get_credentials` | Get Git credentials configuration for the current user |
493
- | `git_update_credentials` | Update Git credentials configuration for the current user |
494
-
495
- ### Deployment Pipeline (12 tools)
496
- | Tool | Description |
497
- |------|-------------|
498
- | `deployment_pipeline_list` | List all deployment pipelines accessible to the user |
499
- | `deployment_pipeline_get` | Get details of a specific deployment pipeline |
500
- | `deployment_pipeline_create` | Create a new deployment pipeline |
501
- | `deployment_pipeline_update` | Update deployment pipeline name or description |
502
- | `deployment_pipeline_delete` | Delete a deployment pipeline |
503
- | `deployment_pipeline_list_stages` | List all stages in a deployment pipeline |
504
- | `deployment_pipeline_list_stage_items` | List all items in a specific stage |
505
- | `deployment_pipeline_assign_workspace` | Assign a workspace to a pipeline stage |
506
- | `deployment_pipeline_unassign_workspace` | Unassign a workspace from a pipeline stage |
507
- | `deployment_pipeline_deploy` | Deploy items from one stage to another (LRO) |
508
- | `deployment_pipeline_list_operations` | List operations (deployment history) |
509
- | `deployment_pipeline_get_operation` | Get details of a specific deployment operation |
510
-
511
- ### Mirrored Database (11 tools)
512
- | Tool | Description |
513
- |------|-------------|
514
- | `mirrored_database_list` | List all mirrored databases in a workspace |
515
- | `mirrored_database_get` | Get details of a specific mirrored database |
516
- | `mirrored_database_create` | Create a new mirrored database (LRO) |
517
- | `mirrored_database_update` | Update mirrored database name or description |
518
- | `mirrored_database_delete` | Delete a mirrored database |
519
- | `mirrored_database_get_definition` | Get mirrored database definition (LRO). Writes files to `outputDirectoryPath` |
520
- | `mirrored_database_update_definition` | Update definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
521
- | `mirrored_database_start_mirroring` | Start mirroring for a mirrored database |
522
- | `mirrored_database_stop_mirroring` | Stop mirroring for a mirrored database |
523
- | `mirrored_database_get_mirroring_status` | Get the mirroring status |
524
- | `mirrored_database_get_tables_mirroring_status` | Get mirroring status of individual tables |
525
-
526
- ### KQL Database (7 tools)
527
- | Tool | Description |
528
- |------|-------------|
529
- | `kql_database_list` | List all KQL databases in a workspace |
530
- | `kql_database_get` | Get details of a specific KQL database |
531
- | `kql_database_create` | Create a new KQL database (LRO). Requires a parent eventhouse |
532
- | `kql_database_update` | Update KQL database name or description |
533
- | `kql_database_delete` | Delete a KQL database |
534
- | `kql_database_get_definition` | Get KQL database definition (LRO). Writes files to `outputDirectoryPath` |
535
- | `kql_database_update_definition` | Update definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
536
-
537
- ### ML Model (5 tools)
538
- | Tool | Description |
539
- |------|-------------|
540
- | `ml_model_list` | List all ML models in a workspace |
541
- | `ml_model_get` | Get details of a specific ML model |
542
- | `ml_model_create` | Create a new ML model (LRO) |
543
- | `ml_model_update` | Update ML model name or description |
544
- | `ml_model_delete` | Delete an ML model |
545
-
546
- ### ML Experiment (5 tools)
547
- | Tool | Description |
548
- |------|-------------|
549
- | `ml_experiment_list` | List all ML experiments in a workspace |
550
- | `ml_experiment_get` | Get details of a specific ML experiment |
551
- | `ml_experiment_create` | Create a new ML experiment (LRO) |
552
- | `ml_experiment_update` | Update ML experiment name or description |
553
- | `ml_experiment_delete` | Delete an ML experiment |
554
-
555
- ### Copy Job (11 tools)
556
- | Tool | Description |
557
- |------|-------------|
558
- | `copy_job_list` | List all copy jobs in a workspace |
559
- | `copy_job_get` | Get details of a specific copy job |
560
- | `copy_job_create` | Create a new copy job |
561
- | `copy_job_update` | Update copy job name or description |
562
- | `copy_job_delete` | Delete a copy job |
563
- | `copy_job_get_definition` | Get copy job definition (LRO). Writes files to `outputDirectoryPath` |
564
- | `copy_job_update_definition` | Update definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
565
- | `copy_job_run` | Run a copy job on demand |
566
- | `copy_job_get_run_status` | Get copy job run status |
567
- | `copy_job_cancel_run` | Cancel a running copy job |
568
- | `copy_job_list_runs` | List all run instances for a copy job |
569
-
570
- ### External Data Share (4 tools)
571
- | Tool | Description |
572
- |------|-------------|
573
- | `external_data_share_list` | List all external data shares for an item |
574
- | `external_data_share_get` | Get details of a specific external data share |
575
- | `external_data_share_create` | Create a new external data share for an item |
576
- | `external_data_share_revoke` | Revoke an external data share |
577
-
578
- ## License
579
-
580
- AGPL-3.0
1
+ # mcp-fabric-api
2
+
3
+ MCP (Model Context Protocol) server for the Microsoft Fabric REST APIs. Built for data engineers and data analysts who want to use AI assistants beyond Copilot — such as Claude, Claude Code, or any MCP-compatible client — to build and manage their Fabric components. Covers workspaces, lakehouses, warehouses, notebooks, pipelines, semantic models, reports, dataflows, eventhouses, eventstreams, reflexes, GraphQL APIs, SQL endpoints, variable libraries, git integration, deployment pipelines, mirrored databases, KQL databases, ML models, ML experiments, copy jobs, and external data shares.
4
+
5
+ > **Safe by default:** This server blocks all destructive operations (create, update, delete) until you explicitly configure the `WRITABLE_WORKSPACES` environment variable. Read operations always work. Set `WRITABLE_WORKSPACES="*"` to allow writes to all workspaces, or use patterns to limit access. See [Workspace Safety Guard](#workspace-safety-guard) for details.
6
+
7
+ ## Prerequisites
8
+
9
+ - Node.js 18+
10
+ - Access to a Microsoft Fabric workspace
11
+ - One of:
12
+ - Azure CLI (`az login`) — easiest on Windows
13
+ - Azure app registration with device code flow enabled — best for Mac / Claude Desktop
14
+ - Service principal credentials — best for headless / automated scenarios
15
+
16
+ ## Quick Start
17
+
18
+ **Windows (Azure CLI):**
19
+
20
+ ```bash
21
+ az login
22
+ npx @einlogic/mcp-fabric-api
23
+ ```
24
+
25
+ **Mac / Claude Desktop (Device Code):**
26
+
27
+ ```bash
28
+ AUTH_METHOD=device-code AZURE_CLIENT_ID=your-app-id AZURE_TENANT_ID=your-tenant-id npx @einlogic/mcp-fabric-api
29
+ ```
30
+
31
+ On first API call, a sign-in URL and code will appear in the logs. Open the URL in your browser, enter the code, and authenticate.
32
+
33
+ ## Setup
34
+
35
+ ### Claude Desktop
36
+
37
+ Add to your Claude Desktop config file:
38
+
39
+ - **macOS:** `~/Library/Application Support/Claude/claude_desktop_config.json`
40
+ - **Windows:** `%APPDATA%\Claude\claude_desktop_config.json`
41
+
42
+ **Windows (uses Azure CLI credentials):**
43
+
44
+ ```json
45
+ {
46
+ "mcpServers": {
47
+ "fabric": {
48
+ "command": "npx",
49
+ "args": ["-y", "@einlogic/mcp-fabric-api"]
50
+ }
51
+ }
52
+ }
53
+ ```
54
+
55
+ **macOS (uses device code flow):**
56
+
57
+ ```json
58
+ {
59
+ "mcpServers": {
60
+ "fabric": {
61
+ "command": "npx",
62
+ "args": ["-y", "@einlogic/mcp-fabric-api"],
63
+ "env": {
64
+ "AUTH_METHOD": "device-code",
65
+ "AZURE_CLIENT_ID": "your-app-client-id",
66
+ "AZURE_TENANT_ID": "your-tenant-id"
67
+ }
68
+ }
69
+ }
70
+ }
71
+ ```
72
+
73
+ When the server starts, check the Claude Desktop logs for a sign-in prompt:
74
+ - **macOS:** `~/Library/Logs/Claude/mcp-server-fabric.log`
75
+ - **Windows:** `%APPDATA%\Claude\logs\mcp-server-fabric.log`
76
+
77
+ The prompt will say: *"To sign in, use a web browser to open https://microsoft.com/devicelogin and enter the code XXXXXXX"*. Complete the sign-in once and the token is cached for the session.
78
+
79
+ ### Claude Code CLI
80
+
81
+ **Windows:**
82
+ ```bash
83
+ claude mcp add fabric -- npx -y @einlogic/mcp-fabric-api
84
+ ```
85
+
86
+ **macOS:**
87
+ ```bash
88
+ claude mcp add fabric -e AUTH_METHOD=device-code -e AZURE_CLIENT_ID=your-app-id -e AZURE_TENANT_ID=your-tenant-id -- npx -y @einlogic/mcp-fabric-api
89
+ ```
90
+
91
+ To verify it was added:
92
+
93
+ ```bash
94
+ claude mcp list
95
+ ```
96
+
97
+ ### HTTP Mode (Remote)
98
+
99
+ For remote deployments, set environment variables:
100
+
101
+ ```bash
102
+ export TRANSPORT=http
103
+ export PORT=3000
104
+ export AZURE_CLIENT_ID=your-client-id
105
+ export AZURE_CLIENT_SECRET=your-client-secret
106
+ export AZURE_TENANT_ID=your-tenant-id
107
+ npx @einlogic/mcp-fabric-api
108
+ ```
109
+
110
+ The server exposes:
111
+ - `POST /mcp` MCP endpoint (StreamableHTTP)
112
+ - `GET /mcp` SSE stream for server notifications
113
+ - `DELETE /mcp` Session cleanup
114
+ - `GET /.well-known/oauth-protected-resource` OAuth metadata
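+
+ As a quick smoke test you can hand-roll the first StreamableHTTP call. The sketch below (TypeScript; the helper name is ours) only builds the JSON-RPC `initialize` request, which Node 18+'s global `fetch` can then send. The protocol version string and header set are assumptions based on the MCP StreamableHTTP transport, so treat it as illustrative:
+
+ ```typescript
+ // Hypothetical smoke-test helper: builds the JSON-RPC "initialize" call
+ // for the StreamableHTTP endpoint. Port 3000 matches the example above.
+ interface McpRequest {
+   url: string;
+   headers: Record<string, string>;
+   body: string;
+ }
+
+ function buildInitializeRequest(port = 3000): McpRequest {
+   return {
+     url: `http://localhost:${port}/mcp`,
+     headers: {
+       "Content-Type": "application/json",
+       // StreamableHTTP servers expect clients to accept both JSON and SSE.
+       Accept: "application/json, text/event-stream",
+     },
+     body: JSON.stringify({
+       jsonrpc: "2.0",
+       id: 1,
+       method: "initialize",
+       params: {
+         protocolVersion: "2025-03-26",
+         capabilities: {},
+         clientInfo: { name: "smoke-test", version: "0.0.0" },
+       },
+     }),
+   };
+ }
+
+ // Usage (Node 18+):
+ //   const req = buildInitializeRequest();
+ //   await fetch(req.url, { method: "POST", headers: req.headers, body: req.body });
+ ```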
115
+
116
+ ### Authentication Methods
117
+
118
+ The server supports multiple authentication methods via the `AUTH_METHOD` environment variable. Choose the method that fits your platform and scenario:
119
+
120
+ | Method | `AUTH_METHOD` | Required env vars | Best for |
121
+ |--------|--------------|-------------------|----------|
122
+ | Azure CLI (default) | `default` | None | Windows with `az login` |
123
+ | Device Code | `device-code` | `AZURE_CLIENT_ID`, `AZURE_TENANT_ID` | **Mac / Claude Desktop** |
124
+ | Client Secret | `client-secret` | `AZURE_CLIENT_ID`, `AZURE_CLIENT_SECRET`, `AZURE_TENANT_ID` | Headless / automated |
125
+ | Interactive Browser | `interactive-browser` | None (optional: `AZURE_CLIENT_ID`, `AZURE_TENANT_ID`) | Systems with browser access |
126
+
127
+ **Default (Azure CLI):** Uses the `DefaultAzureCredential` chain from the Azure Identity SDK. On a developer machine this picks up credentials from `az login`. No extra configuration needed. This is the original behavior and works best on Windows where Claude Desktop can access the Azure CLI token cache.
128
+
129
+ **Device Code:** On first API call, prints a URL and one-time code to stderr. You open the URL in any browser, enter the code, and sign in with your Azure account. The token is cached in memory for the session. This is the recommended method for **Mac users with Claude Desktop**, because the Claude Desktop process on macOS cannot access the Azure CLI token cache.
130
+
131
+ To use device code flow, you need an Azure app registration with **"Allow public client flows"** enabled:
132
+ 1. Go to [Azure Portal](https://portal.azure.com) > App registrations > New registration
133
+ 2. Name it (e.g., "Fabric MCP") and register
134
+ 3. Under **Authentication** > **Advanced settings**, set **"Allow public client flows"** to **Yes**
135
+ 4. Under **API permissions**, add `https://api.fabric.microsoft.com/Workspace.ReadWrite.All` (or the scopes your tools need)
136
+ 5. Copy the **Application (client) ID** and your **Directory (tenant) ID**
137
+
138
+ **Client Secret:** Uses a service principal with client credentials. Requires an Azure app registration with a client secret. Suitable for CI/CD pipelines, automated scripts, or any headless environment where interactive sign-in is not possible.
139
+
140
+ **Interactive Browser:** Opens a browser window for OAuth sign-in. Works on systems where the server process can launch a browser. Optional `AZURE_CLIENT_ID` and `AZURE_TENANT_ID` can be provided to target a specific app and tenant.
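+
+ The required-variable rules in the table above can be sketched as a small validation step. This is an illustration of the documented behavior, not the server's actual code:
+
+ ```typescript
+ // Sketch of the env-var validation implied by the table above.
+ type AuthMethod = "default" | "device-code" | "client-secret" | "interactive-browser";
+
+ const REQUIRED_VARS: Record<AuthMethod, string[]> = {
+   "default": [],
+   "device-code": ["AZURE_CLIENT_ID", "AZURE_TENANT_ID"],
+   "client-secret": ["AZURE_CLIENT_ID", "AZURE_CLIENT_SECRET", "AZURE_TENANT_ID"],
+   "interactive-browser": [], // AZURE_CLIENT_ID / AZURE_TENANT_ID are optional
+ };
+
+ // Returns the names of required variables that are missing for the
+ // configured AUTH_METHOD (defaulting to "default", i.e. Azure CLI).
+ function missingAuthVars(env: Record<string, string | undefined>): string[] {
+   const method = (env.AUTH_METHOD ?? "default") as AuthMethod;
+   const required = REQUIRED_VARS[method] ?? [];
+   return required.filter((name) => !env[name]);
+ }
+ ```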
141
+
142
+ ### Workspace Safety Guard
143
+
144
+ Control which workspaces allow write operations (create, update, delete) via the `WRITABLE_WORKSPACES` environment variable. Only workspaces whose names match the configured patterns permit these operations; read operations are never restricted.
145
+
146
+ > **Default behavior:** When `WRITABLE_WORKSPACES` is not set or empty, **all destructive operations are blocked.** You must explicitly configure this variable to enable writes.
147
+
148
+ | `WRITABLE_WORKSPACES` value | Behavior |
149
+ |------------------------------|----------|
150
+ | Not set / empty | **All writes blocked** (safe default) |
151
+ | `*` | All workspaces writable |
152
+ | `*-Dev,*-Test,Sandbox*` | Only matching workspaces writable |
153
+
154
+ Set comma-separated glob patterns:
155
+
156
+ ```bash
157
+ WRITABLE_WORKSPACES=*-Dev,*-Test,Sandbox*
158
+ ```
159
+
160
+ **Wildcard examples:**
161
+ - `*` matches all workspaces (allow everything)
162
+ - `*-Dev` matches "Sales-Dev", "Finance-Dev"
163
+ - `Sandbox*` matches "Sandbox-123", "Sandbox-Mike"
164
+ - `Exact-Name` matches only "Exact-Name" (case-insensitive)
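+
+ A minimal sketch of the matching rules described above (comma-separated patterns, `*` wildcard, case-insensitive comparison, block-by-default). It illustrates the documented behavior; the shipped implementation may differ:
+
+ ```typescript
+ // Returns true if the workspace name matches any configured pattern.
+ function isWritable(workspaceName: string, writableWorkspaces: string | undefined): boolean {
+   const patterns = (writableWorkspaces ?? "")
+     .split(",")
+     .map((p) => p.trim())
+     .filter((p) => p.length > 0);
+   if (patterns.length === 0) return false; // safe default: block all writes
+   return patterns.some((pattern) => {
+     // Escape regex metacharacters, then turn "*" into ".*".
+     const regex = new RegExp(
+       "^" + pattern.replace(/[.+?^${}()|[\]\\]/g, "\\$&").replace(/\*/g, ".*") + "$",
+       "i",
+     );
+     return regex.test(workspaceName);
+   });
+ }
+ ```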
165
+
166
+ **Guarded tools (89 total)** — every tool that creates, updates, or deletes workspace items:
167
+
168
+ | Domain | Guarded tools |
169
+ |--------|--------------|
170
+ | Workspace | `workspace_update`, `workspace_delete` |
171
+ | Lakehouse | `lakehouse_create`, `lakehouse_update`, `lakehouse_delete`, `lakehouse_load_table`, `lakehouse_create_shortcut`, `lakehouse_update_definition`, `lakehouse_delete_shortcut` |
172
+ | Warehouse | `warehouse_create`, `warehouse_update`, `warehouse_delete`, `warehouse_update_definition` |
173
+ | Notebook | `notebook_create`, `notebook_update`, `notebook_delete`, `notebook_update_definition` |
174
+ | Pipeline | `pipeline_create`, `pipeline_update`, `pipeline_delete`, `pipeline_create_schedule`, `pipeline_update_schedule`, `pipeline_delete_schedule`, `pipeline_update_definition` |
175
+ | Semantic Model | `semantic_model_create_bim`, `semantic_model_create_tmdl`, `semantic_model_update_details`, `semantic_model_delete`, `semantic_model_update_bim`, `semantic_model_update_tmdl`, `semantic_model_take_over` |
176
+ | Report | `report_create_definition`, `report_update`, `report_delete`, `report_clone`, `report_update_definition`, `report_rebind` |
177
+ | Dataflow | `dataflow_create`, `dataflow_update`, `dataflow_delete` |
178
+ | Eventhouse | `eventhouse_create`, `eventhouse_update`, `eventhouse_delete` |
179
+ | Eventstream | `eventstream_create`, `eventstream_update`, `eventstream_delete`, `eventstream_update_definition` |
180
+ | Reflex | `reflex_create`, `reflex_update`, `reflex_delete`, `reflex_update_definition` |
181
+ | GraphQL API | `graphql_api_create`, `graphql_api_update`, `graphql_api_delete` |
182
+ | Variable Library | `variable_library_create`, `variable_library_update`, `variable_library_delete`, `variable_library_update_definition` |
183
+ | Git Integration | `git_connect`, `git_disconnect`, `git_initialize_connection`, `git_commit_to_git`, `git_update_from_git`, `git_update_credentials` |
184
+ | Deployment Pipeline | `deployment_pipeline_assign_workspace`, `deployment_pipeline_unassign_workspace`, `deployment_pipeline_deploy` |
185
+ | Mirrored Database | `mirrored_database_create`, `mirrored_database_update`, `mirrored_database_delete`, `mirrored_database_update_definition`, `mirrored_database_start_mirroring`, `mirrored_database_stop_mirroring` |
186
+ | KQL Database | `kql_database_create`, `kql_database_update`, `kql_database_delete`, `kql_database_update_definition` |
187
+ | ML Model | `ml_model_create`, `ml_model_update`, `ml_model_delete` |
188
+ | ML Experiment | `ml_experiment_create`, `ml_experiment_update`, `ml_experiment_delete` |
189
+ | Copy Job | `copy_job_create`, `copy_job_update`, `copy_job_delete`, `copy_job_update_definition` |
190
+ | External Data Share | `external_data_share_create`, `external_data_share_revoke` |
191
+
192
+ **Not guarded:** Read operations (list, get, get_definition, get_bim, get_tmdl), query execution (DAX, KQL, SQL, GraphQL), run/refresh/cancel operations, export operations, and deployment pipeline CRUD (tenant-level, not workspace-scoped).
193
+
194
+ **Claude Desktop config with guard (Windows):**
195
+ ```json
196
+ {
197
+ "mcpServers": {
198
+ "fabric": {
199
+ "command": "npx",
200
+ "args": ["-y", "@einlogic/mcp-fabric-api"],
201
+ "env": {
202
+ "WRITABLE_WORKSPACES": "*-Dev,*-Test,Sandbox*"
203
+ }
204
+ }
205
+ }
206
+ }
207
+ ```
208
+
209
+ **Claude Desktop config with guard (macOS):**
210
+ ```json
211
+ {
212
+ "mcpServers": {
213
+ "fabric": {
214
+ "command": "npx",
215
+ "args": ["-y", "@einlogic/mcp-fabric-api"],
216
+ "env": {
217
+ "AUTH_METHOD": "device-code",
218
+ "AZURE_CLIENT_ID": "your-app-client-id",
219
+ "AZURE_TENANT_ID": "your-tenant-id",
220
+ "WRITABLE_WORKSPACES": "*-Dev,*-Test,Sandbox*"
221
+ }
222
+ }
223
+ }
224
+ }
225
+ ```
226
+
227
+ **Claude Code CLI with guard:**
228
+ ```bash
229
+ WRITABLE_WORKSPACES="*-Dev,*-Test" claude mcp add fabric -- npx -y @einlogic/mcp-fabric-api
230
+ ```
231
+
232
+ **Error when not configured:**
233
+ ```
234
+ WRITABLE_WORKSPACES is not configured. Destructive actions are blocked by default. Set WRITABLE_WORKSPACES to a comma-separated list of workspace name patterns, or "*" to allow all.
235
+ ```
236
+
237
+ **Error when workspace not in allow list:**
238
+ ```
239
+ Workspace "Production-Analytics" is not in the writable workspaces list. Allowed patterns: *-Dev, *-Test, Sandbox*
240
+ ```
241
+
242
+ ### Debug Logging
243
+
244
+ Enable verbose debug logging to diagnose API errors, inspect request/response details, and trace long-running operations. All log output goes to `stderr`, so it appears in Claude Desktop's log files and never interferes with the JSON-RPC protocol on stdout.
245
+
246
+ Set the `LOG_LEVEL` environment variable to `debug`:
247
+
248
+ **Claude Desktop config:**
249
+ ```json
250
+ {
251
+ "mcpServers": {
252
+ "fabric": {
253
+ "command": "npx",
254
+ "args": ["-y", "@einlogic/mcp-fabric-api"],
255
+ "env": {
256
+ "LOG_LEVEL": "debug"
257
+ }
258
+ }
259
+ }
260
+ }
261
+ ```
262
+
263
+ **Claude Code CLI:**
264
+ ```bash
265
+ LOG_LEVEL=debug claude mcp add fabric -- npx -y @einlogic/mcp-fabric-api
266
+ ```
267
+
268
+ **What gets logged at debug level:**
269
+
270
+ | Category | Details logged |
271
+ |----------|---------------|
272
+ | HTTP requests | Method, full URL, request body size in bytes |
273
+ | HTTP responses | Status code, duration (ms), `x-ms-request-id` header |
274
+ | Definition uploads | Part paths, payload types, payload sizes — never payload content |
275
+ | API errors | Full error body including `errorCode`, `details[]`, `innererror`, `relatedResource`, `x-ms-request-id` |
276
+ | LRO polling | Operation ID, poll count, elapsed time, final status |
277
+ | Pagination | Page count, items per page, total items |
278
+ | SQL/KQL queries | Server, database, duration, column/row counts — never query text or result data |
279
+ | Rate limiting | Retry-after duration, affected endpoint |
280
+
281
+ **Compliance:** Debug logging never captures actual data content — no query text, no query results, no definition payloads, no bearer tokens. Only structural metadata (URLs, sizes, counts, timing, error details) is logged.
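+
+ The redaction rule amounts to logging a metadata-only record in place of the request itself. A rough sketch of that idea (field and function names are ours, not the server's):
+
+ ```typescript
+ // A log entry records sizes and timing, never the payload itself.
+ interface HttpLogEntry {
+   method: string;
+   url: string;
+   bodyBytes: number;       // size only, content is never stored
+   status?: number;
+   durationMs?: number;
+   requestId?: string;      // x-ms-request-id, useful for Microsoft support cases
+ }
+
+ function redactedEntry(method: string, url: string, body?: string): HttpLogEntry {
+   return {
+     method,
+     url,
+     // Log only the byte size of the body, never its content.
+     bodyBytes: body ? Buffer.byteLength(body, "utf8") : 0,
+   };
+ }
+ ```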
282
+
283
+ **Viewing logs in Claude Desktop:**
284
+
285
+ - **macOS:** `~/Library/Logs/Claude/mcp-server-fabric.log`
286
+ - **Windows:** `%APPDATA%\Claude\logs\mcp-server-fabric.log`
287
+
288
+ You can also tail the log in real time:
289
+ ```bash
290
+ # macOS
291
+ tail -f ~/Library/Logs/Claude/mcp-server-fabric.log
292
+
293
+ # Windows (PowerShell)
294
+ Get-Content "$env:APPDATA\Claude\logs\mcp-server-fabric.log" -Wait
295
+ ```
296
+
297
+ The `x-ms-request-id` value logged with every API error is the key identifier needed when opening a support case with Microsoft for Fabric API issues.
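+
+ The LRO polling that debug logging traces (operation ID, poll count, elapsed time) boils down to a status loop. A simplified version with the status check injected so the retry logic is visible; the `Succeeded`/`Failed` names follow the Fabric operation-state convention, but the interval and poll cap are assumptions:
+
+ ```typescript
+ type LroStatus = "NotStarted" | "Running" | "Succeeded" | "Failed";
+
+ // Poll until the operation settles or the poll budget is exhausted.
+ async function pollOperation(
+   getStatus: () => Promise<LroStatus>,
+   intervalMs = 2000,
+   maxPolls = 150,
+ ): Promise<LroStatus> {
+   for (let poll = 1; poll <= maxPolls; poll++) {
+     const status = await getStatus();
+     if (status === "Succeeded" || status === "Failed") return status;
+     await new Promise((resolve) => setTimeout(resolve, intervalMs));
+   }
+   throw new Error(`Operation still running after ${maxPolls} polls`);
+ }
+ ```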
298
+
299
+ ### File-Based I/O
300
+
301
+ To avoid large payloads overwhelming MCP clients, definition tools use file paths instead of inline content. The server reads files from disk when sending definitions to Fabric, and writes files to disk when retrieving definitions from Fabric.
302
+
303
+ **Input tools.** The server reads definition files from the specified path and uploads them to Fabric:
304
+
305
+ | Tool | Parameter | Description |
306
+ |------|-----------|-------------|
307
+ | `semantic_model_create_bim` | `definitionFilePath` | Path to model.bim JSON file |
308
+ | `semantic_model_update_bim` | `definitionFilePath` | Path to model.bim JSON file |
309
+ | `semantic_model_create_tmdl` | `filesDirectoryPath` | Directory of `.tmdl` and `.pbism` files |
310
+ | `semantic_model_update_tmdl` | `filesDirectoryPath` | Directory of `.tmdl` and `.pbism` files |
311
+ | `notebook_update_definition` | `definitionDirectoryPath` | Directory containing notebook definition files |
312
+ | `eventstream_update_definition` | `definitionDirectoryPath` | Directory containing eventstream definition files |
313
+ | `report_create_definition` | `definitionDirectoryPath` | Directory of PBIR report definition files |
314
+ | `report_update_definition` | `definitionDirectoryPath` | Directory of PBIR report definition files |
315
+ | `variable_library_create` | `definitionDirectoryPath` | Directory of `.json` and `.platform` files |
316
+ | `variable_library_update_definition` | `definitionDirectoryPath` | Directory of `.json` and `.platform` files |
317
+ | `lakehouse_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
318
+ | `warehouse_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
319
+ | `pipeline_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
320
+ | `reflex_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
321
+ | `mirrored_database_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
322
+ | `kql_database_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
323
+ | `copy_job_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
324
+
325
+ **Output tools.** The server retrieves definitions from Fabric and writes them to disk:
326
+
327
+ | Tool | Parameter | What gets written |
328
+ |------|-----------|-------------------|
329
+ | `semantic_model_get_bim` | `outputFilePath` | Single `model.bim` JSON file |
330
+ | `semantic_model_get_tmdl` | `outputDirectoryPath` | TMDL files preserving folder structure |
331
+ | `notebook_get_definition` | `outputDirectoryPath` | Notebook definition files |
332
+ | `lakehouse_get_definition` | `outputDirectoryPath` | Lakehouse definition files |
333
+ | `warehouse_get_definition` | `outputDirectoryPath` | Warehouse definition files |
334
+ | `pipeline_get_definition` | `outputDirectoryPath` | Pipeline definition files |
335
+ | `report_get_definition` | `outputDirectoryPath` | Report definition files (report.json, pages, visuals) |
336
+ | `dataflow_get_definition` | `outputDirectoryPath` | Dataflow definition files |
337
+ | `eventstream_get_definition` | `outputDirectoryPath` | Eventstream definition files |
338
+ | `graphql_api_get_definition` | `outputDirectoryPath` | GraphQL schema definition files |
339
+ | `reflex_get_definition` | `outputDirectoryPath` | Reflex definition files |
340
+ | `variable_library_get_definition` | `outputDirectoryPath` | Variable library files (variables.json, valueSets/) |
341
+ | `mirrored_database_get_definition` | `outputDirectoryPath` | Mirrored database definition files |
342
+ | `kql_database_get_definition` | `outputDirectoryPath` | KQL database definition files |
343
+ | `copy_job_get_definition` | `outputDirectoryPath` | Copy job definition files |
344
+
345
+ **TMDL directory structure example:**
346
+ ```
347
+ /tmp/my-model/
348
+ model.tmdl
349
+ definition.pbism
350
+ definition/
351
+ tables/
352
+ Sales.tmdl
353
+ Product.tmdl
354
+ relationships.tmdl
355
+ ```
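+
+ Under the hood, the `partsDirectoryPath` tools package each file as a base64-encoded definition part (the Fabric item definition format uses a `parts` array of `path`, `payload`, and `payloadType: "InlineBase64"`). A rough sketch of that packaging step, with a helper name of our own:
+
+ ```typescript
+ import * as fs from "node:fs";
+ import * as path from "node:path";
+
+ interface DefinitionPart {
+   path: string;                  // relative path inside the item definition
+   payload: string;               // base64-encoded file content
+   payloadType: "InlineBase64";
+ }
+
+ // Recursively read a directory into Fabric definition parts.
+ function readParts(rootDir: string, subDir = ""): DefinitionPart[] {
+   const parts: DefinitionPart[] = [];
+   for (const entry of fs.readdirSync(path.join(rootDir, subDir), { withFileTypes: true })) {
+     const rel = subDir ? `${subDir}/${entry.name}` : entry.name;
+     if (entry.isDirectory()) {
+       parts.push(...readParts(rootDir, rel));
+     } else {
+       const content = fs.readFileSync(path.join(rootDir, rel));
+       parts.push({ path: rel, payload: content.toString("base64"), payloadType: "InlineBase64" });
+     }
+   }
+   return parts;
+ }
+ ```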
356
+
357
+ ## Development
358
+
359
+ ```bash
360
+ git clone https://github.com/your-org/mcp-fabric-api.git
361
+ cd mcp-fabric-api
362
+ npm install
363
+ npm run build
364
+ npm start
365
+ npm run dev # Watch mode
366
+ npm run inspect # Launch MCP Inspector
367
+ ```
368
+
369
+ ## Tools (197 total)
370
+
371
+ ### Auth (4 tools)
372
+ | Tool | Description |
373
+ |------|-------------|
374
+ | `auth_get_current_account` | Show current Azure identity, tenant, and token expiry |
375
+ | `auth_list_available_accounts` | List subscriptions/tenants from local `az login` state (does not query Entra) |
376
+ | `auth_switch_tenant` | Switch to a different Azure tenant (with rollback on failure) |
377
+ | `auth_clear_token_cache` | Clear cached tokens to force re-acquisition |
378
+
379
+ ### Workspace (6 tools)
380
+ | Tool | Description |
381
+ |------|-------------|
382
+ | `workspace_list` | List all accessible Fabric workspaces |
383
+ | `workspace_get` | Get details of a specific workspace |
384
+ | `workspace_create` | Create a new workspace |
385
+ | `workspace_update` | Update a workspace's name or description |
386
+ | `workspace_delete` | Delete a workspace |
387
+ | `workspace_list_items` | List all items in a workspace (with optional type filter) |
388
+
389
+ ### Lakehouse (14 tools)
390
+ | Tool | Description |
391
+ |------|-------------|
392
+ | `lakehouse_list` | List all lakehouses in a workspace |
393
+ | `lakehouse_get` | Get lakehouse details (SQL endpoint, OneLake paths) |
394
+ | `lakehouse_create` | Create a new lakehouse (LRO, schemas enabled by default) |
395
+ | `lakehouse_update` | Update lakehouse name or description |
396
+ | `lakehouse_delete` | Delete a lakehouse |
397
+ | `lakehouse_list_tables` | List all tables in a lakehouse (falls back to SQL endpoint for schema-enabled lakehouses) |
398
+ | `lakehouse_load_table` | Load data into a table from OneLake (LRO). Not supported for schema-enabled lakehouses |
399
+ | `lakehouse_create_shortcut` | Create a OneLake shortcut (file, folder, table, or schema level) with support for multiple target types |
400
+ | `lakehouse_get_sql_endpoint` | Get SQL endpoint details |
401
+ | `lakehouse_get_definition` | Get lakehouse definition (LRO). Writes files to `outputDirectoryPath` |
402
+ | `lakehouse_update_definition` | Update lakehouse definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
403
+ | `lakehouse_list_shortcuts` | List all OneLake shortcuts in a lakehouse |
404
+ | `lakehouse_get_shortcut` | Get details of a specific OneLake shortcut |
405
+ | `lakehouse_delete_shortcut` | Delete a OneLake shortcut |
406
+
407
+ ### Warehouse (9 tools)
408
+ | Tool | Description |
409
+ |------|-------------|
410
+ | `warehouse_list` | List all warehouses in a workspace |
411
+ | `warehouse_get` | Get warehouse details including connection string and provisioning status |
412
+ | `warehouse_create` | Create a new warehouse (LRO) |
413
+ | `warehouse_update` | Update warehouse name or description |
414
+ | `warehouse_delete` | Delete a warehouse |
415
+ | `warehouse_get_sql_endpoint` | Get SQL connection details for a warehouse |
416
+ | `warehouse_list_tables` | List all tables in a warehouse |
417
+ | `warehouse_get_definition` | Get warehouse definition (LRO). Writes files to `outputDirectoryPath` |
418
+ | `warehouse_update_definition` | Update warehouse definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
419
+
420
+ ### Notebook (10 tools)
421
+ | Tool | Description |
422
+ |------|-------------|
423
+ | `notebook_list` | List all notebooks in a workspace |
424
+ | `notebook_get` | Get notebook details |
425
+ | `notebook_create` | Create a new notebook (LRO) |
426
+ | `notebook_update` | Update notebook name or description |
427
+ | `notebook_delete` | Delete a notebook |
428
+ | `notebook_get_definition` | Get notebook definition (LRO). Writes files to `outputDirectoryPath` |
429
+ | `notebook_update_definition` | Update notebook definition (LRO). Reads files from `definitionDirectoryPath` |
430
+ | `notebook_run` | Run a notebook on demand |
431
+ | `notebook_get_run_status` | Get notebook run status |
432
+ | `notebook_cancel_run` | Cancel a running notebook |
433
+
434
+ ### Pipeline (15 tools)
435
+ | Tool | Description |
436
+ |------|-------------|
437
+ | `pipeline_list` | List all data pipelines |
438
+ | `pipeline_get` | Get pipeline details |
439
+ | `pipeline_create` | Create a new pipeline |
440
+ | `pipeline_update` | Update pipeline name or description |
441
+ | `pipeline_delete` | Delete a pipeline |
442
+ | `pipeline_run` | Run a pipeline on demand |
443
+ | `pipeline_get_run_status` | Get pipeline run status |
444
+ | `pipeline_cancel_run` | Cancel a running pipeline |
445
+ | `pipeline_list_runs` | List all run instances |
446
+ | `pipeline_list_schedules` | List pipeline schedules |
447
+ | `pipeline_create_schedule` | Create a pipeline schedule |
448
+ | `pipeline_update_schedule` | Update a pipeline schedule |
449
+ | `pipeline_delete_schedule` | Delete a pipeline schedule |
450
+ | `pipeline_get_definition` | Get pipeline definition (LRO). Writes files to `outputDirectoryPath` |
451
+ | `pipeline_update_definition` | Update pipeline definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
452
+
453
+ ### Semantic Model (15 tools)
454
+ | Tool | Description |
455
+ |------|-------------|
456
+ | `semantic_model_list` | List all semantic models |
457
+ | `semantic_model_get_details` | Get semantic model metadata (name, ID, description) — does not return the definition |
458
+ | `semantic_model_create_bim` | Create a semantic model from a BIM/JSON file (LRO). Reads `model.bim` from `definitionFilePath` |
459
+ | `semantic_model_create_tmdl` | Create a semantic model from TMDL files (LRO). Reads `.tmdl`/`.pbism` from `filesDirectoryPath` |
460
+ | `semantic_model_update_details` | Update semantic model name or description — does not modify the definition |
461
+ | `semantic_model_delete` | Delete a semantic model |
462
+ | `semantic_model_refresh` | Trigger a model refresh (Power BI API) |
463
+ | `semantic_model_execute_dax` | Execute a DAX query (Power BI API) |
464
+ | `semantic_model_get_bim` | Get definition in BIM/JSON format (LRO). Writes `model.bim` to `outputFilePath` |
465
+ | `semantic_model_get_tmdl` | Get definition in TMDL format (LRO). Writes TMDL files to `outputDirectoryPath` |
466
+ | `semantic_model_update_bim` | Update definition from BIM/JSON file (LRO). Reads `model.bim` from `definitionFilePath` |
467
+ | `semantic_model_update_tmdl` | Update definition from TMDL files (LRO). Reads `.tmdl`/`.pbism` from `filesDirectoryPath` |
468
+ | `semantic_model_get_refresh_history` | Get refresh history (Power BI API) |
469
+ | `semantic_model_take_over` | Take over ownership of a semantic model (Power BI API) |
470
+ | `semantic_model_get_datasources` | Get data sources of a semantic model (Power BI API) |
471
+
472
+ ### Report (13 tools)
473
+ | Tool | Description |
474
+ |------|-------------|
475
+ | `report_list` | List all reports |
476
+ | `report_get` | Get report details |
477
+ | `report_create_definition` | Create a new report from PBIR definition files (LRO). Reads from `definitionDirectoryPath` |
478
+ | `report_update` | Update report name or description |
479
+ | `report_delete` | Delete a report |
480
+ | `report_clone` | Clone a report (Power BI API) |
481
+ | `report_export` | Export report to file format (PDF, PPTX, PNG, etc.) via Power BI API |
482
+ | `report_get_export_status` | Check report export status |
483
+ | `report_get_definition` | Get report definition (LRO). Writes files to `outputDirectoryPath` |
484
+ | `report_update_definition` | Update report definition from PBIR directory (LRO). Reads from `definitionDirectoryPath` |
485
+ | `report_rebind` | Rebind a report to a different semantic model/dataset (Power BI API) |
486
+ | `report_get_pages` | Get the list of pages in a report (Power BI API) |
487
+ | `report_get_datasources` | Get data sources used by a report (Power BI API) |
488
+
489
+ ### Dataflow Gen2 (8 tools)
490
+ | Tool | Description |
491
+ |------|-------------|
492
+ | `dataflow_list` | List all Dataflow Gen2 items |
493
+ | `dataflow_get` | Get dataflow details |
494
+ | `dataflow_create` | Create a new dataflow |
495
+ | `dataflow_update` | Update dataflow name or description |
496
+ | `dataflow_delete` | Delete a dataflow |
497
+ | `dataflow_refresh` | Trigger a dataflow refresh |
498
+ | `dataflow_get_refresh_status` | Get refresh job status |
499
+ | `dataflow_get_definition` | Get dataflow definition (LRO). Writes files to `outputDirectoryPath` |
500
+
501
+ ### Eventhouse (7 tools)
502
+ | Tool | Description |
503
+ |------|-------------|
504
+ | `eventhouse_list` | List all eventhouses |
505
+ | `eventhouse_get` | Get eventhouse details |
506
+ | `eventhouse_create` | Create a new eventhouse (LRO) |
507
+ | `eventhouse_update` | Update eventhouse name or description |
508
+ | `eventhouse_delete` | Delete an eventhouse |
509
+ | `eventhouse_get_sql_endpoint` | Get query service URI and connection details |
510
+ | `eventhouse_execute_kql` | Execute a KQL query against a KQL database |
511
+
512
+ ### Eventstream (7 tools)
513
+ | Tool | Description |
514
+ |------|-------------|
515
+ | `eventstream_list` | List all eventstreams |
516
+ | `eventstream_get` | Get eventstream details |
517
+ | `eventstream_create` | Create a new eventstream (LRO) |
518
+ | `eventstream_update` | Update eventstream name or description |
519
+ | `eventstream_delete` | Delete an eventstream |
520
+ | `eventstream_get_definition` | Get eventstream definition (LRO). Writes files to `outputDirectoryPath` |
521
+ | `eventstream_update_definition` | Update eventstream definition (LRO). Reads from `definitionDirectoryPath` |
522
+
523
+ ### Reflex / Activator (7 tools)
524
+ | Tool | Description |
525
+ |------|-------------|
526
+ | `reflex_list` | List all Reflex (Activator) items |
527
+ | `reflex_get` | Get reflex details |
528
+ | `reflex_create` | Create a new reflex |
529
+ | `reflex_update` | Update reflex name or description |
530
+ | `reflex_delete` | Delete a reflex |
531
+ | `reflex_get_definition` | Get reflex definition (LRO). Writes files to `outputDirectoryPath` |
532
+ | `reflex_update_definition` | Update reflex definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
533
+
534
+ ### GraphQL API (7 tools)
535
+ | Tool | Description |
536
+ |------|-------------|
537
+ | `graphql_api_list` | List all GraphQL API items |
538
+ | `graphql_api_get` | Get GraphQL API details |
539
+ | `graphql_api_create` | Create a new GraphQL API |
540
+ | `graphql_api_update` | Update GraphQL API name or description |
541
+ | `graphql_api_delete` | Delete a GraphQL API |
542
+ | `graphql_api_get_definition` | Get GraphQL schema definition (LRO). Writes files to `outputDirectoryPath` |
543
+ | `graphql_api_execute_query` | Execute a GraphQL query |
544
+
545
+ ### SQL Endpoint (4 tools)
546
+ | Tool | Description |
547
+ |------|-------------|
548
+ | `sql_endpoint_list` | List all SQL endpoints |
549
+ | `sql_endpoint_get` | Get SQL endpoint details |
550
+ | `sql_endpoint_get_connection_string` | Get TDS connection string |
551
+ | `sql_endpoint_execute_query` | Execute a T-SQL query against a lakehouse or warehouse SQL endpoint |
552
+
+ ### Variable Library (7 tools)
+ | Tool | Description |
+ |------|-------------|
+ | `variable_library_list` | List all variable libraries in a workspace |
+ | `variable_library_get` | Get variable library details including active value set name |
+ | `variable_library_create` | Create a variable library, optionally with definition files from `definitionDirectoryPath` (LRO) |
+ | `variable_library_update` | Update name, description, or active value set |
+ | `variable_library_delete` | Delete a variable library |
+ | `variable_library_get_definition` | Get definition (LRO). Writes files (`variables.json`, `valueSets/`) to `outputDirectoryPath` |
+ | `variable_library_update_definition` | Update definition from a directory of `.json` and `.platform` files (LRO) |
+
+ ### Git Integration (9 tools)
+ | Tool | Description |
+ |------|-------------|
+ | `git_get_connection` | Get Git connection details for a workspace |
+ | `git_get_status` | Get Git status of items (sync state between workspace and remote) |
+ | `git_connect` | Connect a workspace to a Git repository (Azure DevOps or GitHub) |
+ | `git_disconnect` | Disconnect a workspace from its Git repository |
+ | `git_initialize_connection` | Initialize a Git connection after connecting (LRO) |
+ | `git_commit_to_git` | Commit workspace changes to the connected Git repository (LRO) |
+ | `git_update_from_git` | Update workspace from the connected Git repository (LRO) |
+ | `git_get_credentials` | Get Git credentials configuration for the current user |
+ | `git_update_credentials` | Update Git credentials configuration for the current user |
+
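+ As a sketch of what `git_connect` needs, the underlying Fabric REST `git/connect` request describes the provider in a `gitProviderDetails` object roughly like the one below (shown for Azure DevOps; the tool may expose these fields flattened or under different names, and all values here are placeholders):

```json
{
  "gitProviderDetails": {
    "gitProviderType": "AzureDevOps",
    "organizationName": "my-org",
    "projectName": "my-project",
    "repositoryName": "fabric-items",
    "branchName": "main",
    "directoryName": "/"
  }
}
```

After connecting, run `git_initialize_connection` once before committing or updating.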
+ ### Deployment Pipeline (12 tools)
+ | Tool | Description |
+ |------|-------------|
+ | `deployment_pipeline_list` | List all deployment pipelines accessible to the user |
+ | `deployment_pipeline_get` | Get details of a specific deployment pipeline |
+ | `deployment_pipeline_create` | Create a new deployment pipeline |
+ | `deployment_pipeline_update` | Update deployment pipeline name or description |
+ | `deployment_pipeline_delete` | Delete a deployment pipeline |
+ | `deployment_pipeline_list_stages` | List all stages in a deployment pipeline |
+ | `deployment_pipeline_list_stage_items` | List all items in a specific stage |
+ | `deployment_pipeline_assign_workspace` | Assign a workspace to a pipeline stage |
+ | `deployment_pipeline_unassign_workspace` | Unassign a workspace from a pipeline stage |
+ | `deployment_pipeline_deploy` | Deploy items from one stage to another (LRO) |
+ | `deployment_pipeline_list_operations` | List operations (deployment history) |
+ | `deployment_pipeline_get_operation` | Get details of a specific deployment operation |
+
+ ### Mirrored Database (11 tools)
+ | Tool | Description |
+ |------|-------------|
+ | `mirrored_database_list` | List all mirrored databases in a workspace |
+ | `mirrored_database_get` | Get details of a specific mirrored database |
+ | `mirrored_database_create` | Create a new mirrored database (LRO) |
+ | `mirrored_database_update` | Update mirrored database name or description |
+ | `mirrored_database_delete` | Delete a mirrored database |
+ | `mirrored_database_get_definition` | Get mirrored database definition (LRO). Writes files to `outputDirectoryPath` |
+ | `mirrored_database_update_definition` | Update definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
+ | `mirrored_database_start_mirroring` | Start mirroring for a mirrored database |
+ | `mirrored_database_stop_mirroring` | Stop mirroring for a mirrored database |
+ | `mirrored_database_get_mirroring_status` | Get the mirroring status |
+ | `mirrored_database_get_tables_mirroring_status` | Get mirroring status of individual tables |
+
+ ### KQL Database (7 tools)
+ | Tool | Description |
+ |------|-------------|
+ | `kql_database_list` | List all KQL databases in a workspace |
+ | `kql_database_get` | Get details of a specific KQL database |
+ | `kql_database_create` | Create a new KQL database (LRO). Requires a parent eventhouse |
+ | `kql_database_update` | Update KQL database name or description |
+ | `kql_database_delete` | Delete a KQL database |
+ | `kql_database_get_definition` | Get KQL database definition (LRO). Writes files to `outputDirectoryPath` |
+ | `kql_database_update_definition` | Update definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
+
+ ### ML Model (5 tools)
+ | Tool | Description |
+ |------|-------------|
+ | `ml_model_list` | List all ML models in a workspace |
+ | `ml_model_get` | Get details of a specific ML model |
+ | `ml_model_create` | Create a new ML model (LRO) |
+ | `ml_model_update` | Update ML model name or description |
+ | `ml_model_delete` | Delete an ML model |
+
+ ### ML Experiment (5 tools)
+ | Tool | Description |
+ |------|-------------|
+ | `ml_experiment_list` | List all ML experiments in a workspace |
+ | `ml_experiment_get` | Get details of a specific ML experiment |
+ | `ml_experiment_create` | Create a new ML experiment (LRO) |
+ | `ml_experiment_update` | Update ML experiment name or description |
+ | `ml_experiment_delete` | Delete an ML experiment |
+
+ ### Copy Job (11 tools)
+ | Tool | Description |
+ |------|-------------|
+ | `copy_job_list` | List all copy jobs in a workspace |
+ | `copy_job_get` | Get details of a specific copy job |
+ | `copy_job_create` | Create a new copy job |
+ | `copy_job_update` | Update copy job name or description |
+ | `copy_job_delete` | Delete a copy job |
+ | `copy_job_get_definition` | Get copy job definition (LRO). Writes files to `outputDirectoryPath` |
+ | `copy_job_update_definition` | Update definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
+ | `copy_job_run` | Run a copy job on demand |
+ | `copy_job_get_run_status` | Get copy job run status |
+ | `copy_job_cancel_run` | Cancel a running copy job |
+ | `copy_job_list_runs` | List all run instances for a copy job |
+
+ ### External Data Share (4 tools)
+ | Tool | Description |
+ |------|-------------|
+ | `external_data_share_list` | List all external data shares for an item |
+ | `external_data_share_get` | Get details of a specific external data share |
+ | `external_data_share_create` | Create a new external data share for an item |
+ | `external_data_share_revoke` | Revoke an external data share |
+
+ ## License
+
+ AGPL-3.0