@einlogic/mcp-fabric-api 2.5.0 → 2.6.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +662 -580
- package/build/auth/token-manager.d.ts +22 -1
- package/build/auth/token-manager.d.ts.map +1 -1
- package/build/auth/token-manager.js +119 -9
- package/build/auth/token-manager.js.map +1 -1
- package/build/core/errors.d.ts +10 -1
- package/build/core/errors.d.ts.map +1 -1
- package/build/core/errors.js +19 -0
- package/build/core/errors.js.map +1 -1
- package/build/index.js +11 -2
- package/build/index.js.map +1 -1
- package/build/server.d.ts +2 -1
- package/build/server.d.ts.map +1 -1
- package/build/server.js +9 -2
- package/build/server.js.map +1 -1
- package/package.json +64 -64
- package/build/utils/file-reader.d.ts +0 -11
- package/build/utils/file-reader.d.ts.map +0 -1
- package/build/utils/file-reader.js +0 -84
- package/build/utils/file-reader.js.map +0 -1
package/README.md
CHANGED
@@ -1,580 +1,662 @@
# mcp-fabric-api

MCP (Model Context Protocol) server for the Microsoft Fabric REST APIs. Built for data engineers and data analysts who want to use AI assistants beyond Copilot — such as Claude, Claude Code, or any MCP-compatible client — to build and manage their Fabric components. Covers workspaces, lakehouses, warehouses, notebooks, pipelines, semantic models, reports, dataflows, eventhouses, eventstreams, reflexes, GraphQL APIs, SQL endpoints, variable libraries, git integration, deployment pipelines, mirrored databases, KQL databases, ML models, ML experiments, copy jobs, and external data shares.

> **Safe by default:** This server blocks all destructive operations (create, update, delete) until you explicitly configure the `WRITABLE_WORKSPACES` environment variable. Read operations always work. Set `WRITABLE_WORKSPACES="*"` to allow writes to all workspaces, or use patterns to limit access. See [Workspace Safety Guard](#workspace-safety-guard) for details.

## Prerequisites

- Node.js 18+
- Access to a Microsoft Fabric workspace
- One of:
  - Azure CLI (`az login`) — easiest on Windows
  - Azure app registration with device code flow enabled — best for Mac / Claude Desktop
  - Service principal credentials — best for headless / automated scenarios
## Quick Start

**Windows (Azure CLI):**

```bash
az login
npx @einlogic/mcp-fabric-api
```

**Mac / Claude Desktop (Device Code):**

```bash
AUTH_METHOD=device-code AZURE_CLIENT_ID=your-app-id AZURE_TENANT_ID=your-tenant-id npx @einlogic/mcp-fabric-api
```

On first API call, a sign-in URL and code will appear in the logs. Open the URL in your browser, enter the code, and authenticate.
## Setup

### Claude Desktop

Add to your Claude Desktop config file:

- **macOS:** `~/Library/Application Support/Claude/claude_desktop_config.json`
- **Windows:** `%APPDATA%\Claude\claude_desktop_config.json`

**Windows (uses Azure CLI credentials):**

```json
{
  "mcpServers": {
    "fabric": {
      "command": "npx",
      "args": ["-y", "@einlogic/mcp-fabric-api"]
    }
  }
}
```

**macOS (uses device code flow):**

```json
{
  "mcpServers": {
    "fabric": {
      "command": "npx",
      "args": ["-y", "@einlogic/mcp-fabric-api"],
      "env": {
        "AUTH_METHOD": "device-code",
        "AZURE_CLIENT_ID": "your-app-client-id",
        "AZURE_TENANT_ID": "your-tenant-id"
      }
    }
  }
}
```

When the server starts, check the Claude Desktop logs for a sign-in prompt:

- **macOS:** `~/Library/Logs/Claude/mcp-server-fabric.log`
- **Windows:** `%APPDATA%\Claude\logs\mcp-server-fabric.log`

The prompt will say: *"To sign in, use a web browser to open https://microsoft.com/devicelogin and enter the code XXXXXXX"*. Complete the sign-in once and the token is cached for the session.
### Claude Code CLI

**Windows:**

```bash
claude mcp add fabric -- npx -y @einlogic/mcp-fabric-api
```

**macOS:**

```bash
claude mcp add fabric -e AUTH_METHOD=device-code -e AZURE_CLIENT_ID=your-app-id -e AZURE_TENANT_ID=your-tenant-id -- npx -y @einlogic/mcp-fabric-api
```

To verify it was added:

```bash
claude mcp list
```
### HTTP Mode (Remote)

For remote deployments, set environment variables:

```bash
export TRANSPORT=http
export PORT=3000
export AZURE_CLIENT_ID=your-client-id
export AZURE_CLIENT_SECRET=your-client-secret
export AZURE_TENANT_ID=your-tenant-id
npx @einlogic/mcp-fabric-api
```

The server exposes:

- `POST /mcp` — MCP endpoint (StreamableHTTP)
- `GET /mcp` — SSE stream for server notifications
- `DELETE /mcp` — Session cleanup
- `GET /.well-known/oauth-protected-resource` — OAuth metadata
### Authentication Methods

The server supports multiple authentication methods via the `AUTH_METHOD` environment variable. Choose the method that fits your platform and scenario:

| Method | `AUTH_METHOD` | Required env vars | Best for |
|--------|--------------|-------------------|----------|
| Azure CLI (default) | `default` | None | Windows with `az login` |
| Device Code | `device-code` | `AZURE_CLIENT_ID`, `AZURE_TENANT_ID` | **Mac / Claude Desktop** |
| Client Secret | `client-secret` | `AZURE_CLIENT_ID`, `AZURE_CLIENT_SECRET`, `AZURE_TENANT_ID` | Headless / automated |
| Interactive Browser | `interactive-browser` | None (optional: `AZURE_CLIENT_ID`, `AZURE_TENANT_ID`) | Systems with browser access |

**Default (Azure CLI):** Uses the `DefaultAzureCredential` chain from the Azure Identity SDK. On a developer machine this picks up credentials from `az login`. No extra configuration needed. This is the original behavior and works best on Windows where Claude Desktop can access the Azure CLI token cache.

**Device Code:** On first API call, prints a URL and one-time code to stderr. You open the URL in any browser, enter the code, and sign in with your Azure account. The token is cached in memory for the session. This is the recommended method for **Mac users with Claude Desktop**, because the Claude Desktop process on macOS cannot access the Azure CLI token cache.

To use device code flow, you need an Azure app registration with **"Allow public client flows"** enabled:

1. Go to [Azure Portal](https://portal.azure.com) > App registrations > New registration
2. Name it (e.g., "Fabric MCP") and register
3. Under **Authentication** > **Advanced settings**, set **"Allow public client flows"** to **Yes**
4. Under **API permissions**, add `https://api.fabric.microsoft.com/Workspace.ReadWrite.All` (or the scopes your tools need)
5. Copy the **Application (client) ID** and your **Directory (tenant) ID**

**Client Secret:** Uses a service principal with client credentials. Requires an Azure app registration with a client secret. Suitable for CI/CD pipelines, automated scripts, or any headless environment where interactive sign-in is not possible.

**Interactive Browser:** Opens a browser window for OAuth sign-in. Works on systems where the server process can launch a browser. Optional `AZURE_CLIENT_ID` and `AZURE_TENANT_ID` can be provided to target a specific app and tenant.
### Workspace Safety Guard

Control which workspaces allow write operations (create, update, delete) via the `WRITABLE_WORKSPACES` environment variable. Only workspaces matching the configured name patterns will permit CUD (Create, Update, Delete) operations. Read operations are never restricted.

> **Default behavior: When `WRITABLE_WORKSPACES` is not set or empty, all destructive operations are blocked.** You must explicitly configure this variable to enable writes.

| `WRITABLE_WORKSPACES` value | Behavior |
|------------------------------|----------|
| Not set / empty | **All writes blocked** (safe default) |
| `*` | All workspaces writable |
| `*-Dev,*-Test,Sandbox*` | Only matching workspaces writable |

Set comma-separated glob patterns:

```bash
WRITABLE_WORKSPACES=*-Dev,*-Test,Sandbox*
```

**Wildcard examples:**

- `*` matches all workspaces (allow everything)
- `*-Dev` matches "Sales-Dev", "Finance-Dev"
- `Sandbox*` matches "Sandbox-123", "Sandbox-Mike"
- `Exact-Name` matches only "Exact-Name" (case-insensitive)

**Guarded tools (89 total)** — every tool that creates, updates, or deletes workspace items:

| Domain | Guarded tools |
|--------|--------------|
| Workspace | `workspace_update`, `workspace_delete` |
| Lakehouse | `lakehouse_create`, `lakehouse_update`, `lakehouse_delete`, `lakehouse_load_table`, `lakehouse_create_shortcut`, `lakehouse_update_definition`, `lakehouse_delete_shortcut` |
| Warehouse | `warehouse_create`, `warehouse_update`, `warehouse_delete`, `warehouse_update_definition` |
| Notebook | `notebook_create`, `notebook_update`, `notebook_delete`, `notebook_update_definition` |
| Pipeline | `pipeline_create`, `pipeline_update`, `pipeline_delete`, `pipeline_create_schedule`, `pipeline_update_schedule`, `pipeline_delete_schedule`, `pipeline_update_definition` |
| Semantic Model | `semantic_model_create_bim`, `semantic_model_create_tmdl`, `semantic_model_update_details`, `semantic_model_delete`, `semantic_model_update_bim`, `semantic_model_update_tmdl`, `semantic_model_take_over` |
| Report | `report_create_definition`, `report_update`, `report_delete`, `report_clone`, `report_update_definition`, `report_rebind` |
| Dataflow | `dataflow_create`, `dataflow_update`, `dataflow_delete` |
| Eventhouse | `eventhouse_create`, `eventhouse_update`, `eventhouse_delete` |
| Eventstream | `eventstream_create`, `eventstream_update`, `eventstream_delete`, `eventstream_update_definition` |
| Reflex | `reflex_create`, `reflex_update`, `reflex_delete`, `reflex_update_definition` |
| GraphQL API | `graphql_api_create`, `graphql_api_update`, `graphql_api_delete` |
| Variable Library | `variable_library_create`, `variable_library_update`, `variable_library_delete`, `variable_library_update_definition` |
| Git Integration | `git_connect`, `git_disconnect`, `git_initialize_connection`, `git_commit_to_git`, `git_update_from_git`, `git_update_credentials` |
| Deployment Pipeline | `deployment_pipeline_assign_workspace`, `deployment_pipeline_unassign_workspace`, `deployment_pipeline_deploy` |
| Mirrored Database | `mirrored_database_create`, `mirrored_database_update`, `mirrored_database_delete`, `mirrored_database_update_definition`, `mirrored_database_start_mirroring`, `mirrored_database_stop_mirroring` |
| KQL Database | `kql_database_create`, `kql_database_update`, `kql_database_delete`, `kql_database_update_definition` |
| ML Model | `ml_model_create`, `ml_model_update`, `ml_model_delete` |
| ML Experiment | `ml_experiment_create`, `ml_experiment_update`, `ml_experiment_delete` |
| Copy Job | `copy_job_create`, `copy_job_update`, `copy_job_delete`, `copy_job_update_definition` |
| External Data Share | `external_data_share_create`, `external_data_share_revoke` |

**Not guarded:** Read operations (list, get, get_definition, get_bim, get_tmdl), query execution (DAX, KQL, SQL, GraphQL), run/refresh/cancel operations, export operations, and deployment pipeline CRUD (tenant-level, not workspace-scoped).

**Claude Desktop config with guard (Windows):**

```json
{
  "mcpServers": {
    "fabric": {
      "command": "npx",
      "args": ["-y", "@einlogic/mcp-fabric-api"],
      "env": {
        "WRITABLE_WORKSPACES": "*-Dev,*-Test,Sandbox*"
      }
    }
  }
}
```

**Claude Desktop config with guard (macOS):**

```json
{
  "mcpServers": {
    "fabric": {
      "command": "npx",
      "args": ["-y", "@einlogic/mcp-fabric-api"],
      "env": {
        "AUTH_METHOD": "device-code",
        "AZURE_CLIENT_ID": "your-app-client-id",
        "AZURE_TENANT_ID": "your-tenant-id",
        "WRITABLE_WORKSPACES": "*-Dev,*-Test,Sandbox*"
      }
    }
  }
}
```

**Claude Code CLI with guard:**

```bash
WRITABLE_WORKSPACES="*-Dev,*-Test" claude mcp add fabric -- npx -y @einlogic/mcp-fabric-api
```

**Error when not configured:**

```
WRITABLE_WORKSPACES is not configured. Destructive actions are blocked by default. Set WRITABLE_WORKSPACES to a comma-separated list of workspace name patterns, or "*" to allow all.
```

**Error when workspace not in allow list:**

```
Workspace "Production-Analytics" is not in the writable workspaces list. Allowed patterns: *-Dev, *-Test, Sandbox*
```
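The comma-separated glob matching described above can be sketched in a few lines. This is an illustration only, not the package's actual matcher: the function names are hypothetical, and the real implementation may differ in details such as whitespace handling.

```javascript
// Illustrative sketch of case-insensitive glob matching for
// WRITABLE_WORKSPACES-style patterns. Not the package's actual code.

// Escape regex metacharacters (except "*"), then turn each "*" into ".*".
function globToRegExp(pattern) {
  const escaped = pattern
    .trim()
    .replace(/[.+?^${}()|[\]\\]/g, "\\$&")
    .replace(/\*/g, ".*");
  return new RegExp(`^${escaped}$`, "i"); // anchored, case-insensitive
}

// True if the workspace name matches any comma-separated pattern.
function isWritable(workspaceName, writableWorkspaces) {
  if (!writableWorkspaces || writableWorkspaces.trim() === "") {
    return false; // safe default: writes blocked when unset/empty
  }
  return writableWorkspaces
    .split(",")
    .some((p) => globToRegExp(p).test(workspaceName));
}
```

Under this sketch, `isWritable("Sales-Dev", "*-Dev,*-Test")` is true while `isWritable("Production-Analytics", "*-Dev,*-Test")` is false, matching the error behavior shown above.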
### Debug Logging

Enable verbose debug logging to diagnose API errors, inspect request/response details, and trace long-running operations. All log output goes to `stderr` (visible in Claude Desktop's log files, never interferes with JSON-RPC on stdout).

Set the `LOG_LEVEL` environment variable to `debug`:

**Claude Desktop config:**

```json
{
  "mcpServers": {
    "fabric": {
      "command": "npx",
      "args": ["-y", "@einlogic/mcp-fabric-api"],
      "env": {
        "LOG_LEVEL": "debug"
      }
    }
  }
}
```

**Claude Code CLI:**

```bash
LOG_LEVEL=debug claude mcp add fabric -- npx -y @einlogic/mcp-fabric-api
```

**What gets logged at debug level:**

| Category | Details logged |
|----------|---------------|
| HTTP requests | Method, full URL, request body size in bytes |
| HTTP responses | Status code, duration (ms), `x-ms-request-id` header |
| Definition uploads | Part paths, payload types, payload sizes — never payload content |
| API errors | Full error body including `errorCode`, `details[]`, `innererror`, `relatedResource`, `x-ms-request-id` |
| LRO polling | Operation ID, poll count, elapsed time, final status |
| Pagination | Page count, items per page, total items |
| SQL/KQL queries | Server, database, duration, column/row counts — never query text or result data |
| Rate limiting | Retry-after duration, affected endpoint |

**Compliance:** Debug logging never captures actual data content — no query text, no query results, no definition payloads, no bearer tokens. Only structural metadata (URLs, sizes, counts, timing, error details) is logged.

**Viewing logs in Claude Desktop:**

- **macOS:** `~/Library/Logs/Claude/mcp-server-fabric.log`
- **Windows:** `%APPDATA%\Claude\logs\mcp-server-fabric.log`

You can also tail the log in real time:

```bash
# macOS
tail -f ~/Library/Logs/Claude/mcp-server-fabric.log

# Windows (PowerShell)
Get-Content "$env:APPDATA\Claude\logs\mcp-server-fabric.log" -Wait
```

The `x-ms-request-id` value logged with every API error is the key identifier needed when opening a support case with Microsoft for Fabric API issues.
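The stdout/stderr split described above matters because an stdio MCP server speaks JSON-RPC on stdout, so any stray print there corrupts the protocol stream. A minimal sketch of a level-gated stderr logger, purely illustrative (the function and level names are hypothetical, not the package's internals):

```javascript
// Illustrative sketch: level-gated logging that writes only to stderr,
// keeping stdout free for JSON-RPC. Not the package's actual logger.
const LEVELS = { error: 0, warn: 1, info: 2, debug: 3 };

function makeLogger(logLevel = process.env.LOG_LEVEL || "info") {
  const threshold = LEVELS[logLevel] ?? LEVELS.info;
  return function log(level, message) {
    if (LEVELS[level] > threshold) return null; // below threshold: dropped
    const line = `[${level}] ${message}`;
    process.stderr.write(line + "\n"); // stderr only, never stdout
    return line;
  };
}
```

With `LOG_LEVEL=debug`, a call like `log("debug", "LRO poll #3")` is emitted; at the default `info` level the same call is silently dropped.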
### File-Based I/O

To avoid large payloads overwhelming MCP clients, definition tools use file paths instead of inline content. The server reads files from disk when sending definitions to Fabric, and writes files to disk when retrieving definitions from Fabric.

**Input tools** — the server reads definition files from the specified path and uploads to Fabric:

| Tool | Parameter | Description |
|------|-----------|-------------|
| `semantic_model_create_bim` | `definitionFilePath` | Path to model.bim JSON file |
| `semantic_model_update_bim` | `definitionFilePath` | Path to model.bim JSON file |
| `semantic_model_create_tmdl` | `filesDirectoryPath` | Directory of `.tmdl` and `.pbism` files |
| `semantic_model_update_tmdl` | `filesDirectoryPath` | Directory of `.tmdl` and `.pbism` files |
| `notebook_update_definition` | `definitionDirectoryPath` | Directory containing notebook definition files |
| `eventstream_update_definition` | `definitionDirectoryPath` | Directory containing eventstream definition files |
| `report_create_definition` | `definitionDirectoryPath` | Directory of PBIR report definition files |
| `report_update_definition` | `definitionDirectoryPath` | Directory of PBIR report definition files |
| `variable_library_create` | `definitionDirectoryPath` | Directory of `.json` and `.platform` files |
| `variable_library_update_definition` | `definitionDirectoryPath` | Directory of `.json` and `.platform` files |
| `lakehouse_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
| `warehouse_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
| `pipeline_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
| `reflex_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
| `mirrored_database_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
| `kql_database_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |
| `copy_job_update_definition` | `partsDirectoryPath` | Directory of definition files (or inline `parts`) |

**Output tools** — the server retrieves definitions from Fabric and writes them to disk:

| Tool | Parameter | What gets written |
|------|-----------|-------------------|
| `semantic_model_get_bim` | `outputFilePath` | Single `model.bim` JSON file |
| `semantic_model_get_tmdl` | `outputDirectoryPath` | TMDL files preserving folder structure |
| `notebook_get_definition` | `outputDirectoryPath` | Notebook definition files |
| `lakehouse_get_definition` | `outputDirectoryPath` | Lakehouse definition files |
| `warehouse_get_definition` | `outputDirectoryPath` | Warehouse definition files |
| `pipeline_get_definition` | `outputDirectoryPath` | Pipeline definition files |
| `report_get_definition` | `outputDirectoryPath` | Report definition files (report.json, pages, visuals) |
| `dataflow_get_definition` | `outputDirectoryPath` | Dataflow definition files |
| `eventstream_get_definition` | `outputDirectoryPath` | Eventstream definition files |
| `graphql_api_get_definition` | `outputDirectoryPath` | GraphQL schema definition files |
| `reflex_get_definition` | `outputDirectoryPath` | Reflex definition files |
| `variable_library_get_definition` | `outputDirectoryPath` | Variable library files (variables.json, valueSets/) |
| `mirrored_database_get_definition` | `outputDirectoryPath` | Mirrored database definition files |
| `kql_database_get_definition` | `outputDirectoryPath` | KQL database definition files |
| `copy_job_get_definition` | `outputDirectoryPath` | Copy job definition files |

**TMDL directory structure example:**

```
/tmp/my-model/
  model.tmdl
  definition.pbism
  definition/
    tables/
      Sales.tmdl
      Product.tmdl
    relationships.tmdl
```
## Development

```bash
git clone https://github.com/your-org/mcp-fabric-api.git
cd mcp-fabric-api
npm install
npm run build
npm start
npm run dev      # Watch mode
npm run inspect  # Launch MCP Inspector
```
369
|
+
## Tools (197 total)
|
|
370
|
+
|
|
371
|
+
### Auth (4 tools)
|
|
372
|
+
| Tool | Description |
|
|
373
|
+
|------|-------------|
|
|
374
|
+
| `auth_get_current_account` | Show current Azure identity, tenant, and token expiry |
|
|
375
|
+
| `auth_list_available_accounts` | List subscriptions/tenants from local `az login` state (does not query Entra) |
|
|
376
|
+
| `auth_switch_tenant` | Switch to a different Azure tenant (with rollback on failure) |
|
|
377
|
+
| `auth_clear_token_cache` | Clear cached tokens to force re-acquisition |
|
|
378
|
+
|
|
379
|
+

### Workspace (6 tools)
| Tool | Description |
|------|-------------|
| `workspace_list` | List all accessible Fabric workspaces |
| `workspace_get` | Get details of a specific workspace |
| `workspace_create` | Create a new workspace |
| `workspace_update` | Update a workspace's name or description |
| `workspace_delete` | Delete a workspace |
| `workspace_list_items` | List all items in a workspace (with optional type filter) |

### Lakehouse (14 tools)
| Tool | Description |
|------|-------------|
| `lakehouse_list` | List all lakehouses in a workspace |
| `lakehouse_get` | Get lakehouse details (SQL endpoint, OneLake paths) |
| `lakehouse_create` | Create a new lakehouse (LRO, schemas enabled by default) |
| `lakehouse_update` | Update lakehouse name or description |
| `lakehouse_delete` | Delete a lakehouse |
| `lakehouse_list_tables` | List all tables in a lakehouse (falls back to SQL endpoint for schema-enabled lakehouses) |
| `lakehouse_load_table` | Load data into a table from OneLake (LRO). Not supported for schema-enabled lakehouses |
| `lakehouse_create_shortcut` | Create a OneLake shortcut (file, folder, table, or schema level) with support for multiple target types |
| `lakehouse_get_sql_endpoint` | Get SQL endpoint details |
| `lakehouse_get_definition` | Get lakehouse definition (LRO). Writes files to `outputDirectoryPath` |
| `lakehouse_update_definition` | Update lakehouse definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
| `lakehouse_list_shortcuts` | List all OneLake shortcuts in a lakehouse |
| `lakehouse_get_shortcut` | Get details of a specific OneLake shortcut |
| `lakehouse_delete_shortcut` | Delete a OneLake shortcut |

### Warehouse (9 tools)
| Tool | Description |
|------|-------------|
| `warehouse_list` | List all warehouses in a workspace |
| `warehouse_get` | Get warehouse details including connection string and provisioning status |
| `warehouse_create` | Create a new warehouse (LRO) |
| `warehouse_update` | Update warehouse name or description |
| `warehouse_delete` | Delete a warehouse |
| `warehouse_get_sql_endpoint` | Get SQL connection details for a warehouse |
| `warehouse_list_tables` | List all tables in a warehouse |
| `warehouse_get_definition` | Get warehouse definition (LRO). Writes files to `outputDirectoryPath` |
| `warehouse_update_definition` | Update warehouse definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |

### Notebook (10 tools)
| Tool | Description |
|------|-------------|
| `notebook_list` | List all notebooks in a workspace |
| `notebook_get` | Get notebook details |
| `notebook_create` | Create a new notebook (LRO) |
| `notebook_update` | Update notebook name or description |
| `notebook_delete` | Delete a notebook |
| `notebook_get_definition` | Get notebook definition (LRO). Writes files to `outputDirectoryPath` |
| `notebook_update_definition` | Update notebook definition (LRO). Reads files from `definitionDirectoryPath` |
| `notebook_run` | Run a notebook on demand |
| `notebook_get_run_status` | Get notebook run status |
| `notebook_cancel_run` | Cancel a running notebook |

### Pipeline (15 tools)
| Tool | Description |
|------|-------------|
| `pipeline_list` | List all data pipelines |
| `pipeline_get` | Get pipeline details |
| `pipeline_create` | Create a new pipeline |
| `pipeline_update` | Update pipeline name or description |
| `pipeline_delete` | Delete a pipeline |
| `pipeline_run` | Run a pipeline on demand |
| `pipeline_get_run_status` | Get pipeline run status |
| `pipeline_cancel_run` | Cancel a running pipeline |
| `pipeline_list_runs` | List all run instances |
| `pipeline_list_schedules` | List pipeline schedules |
| `pipeline_create_schedule` | Create a pipeline schedule |
| `pipeline_update_schedule` | Update a pipeline schedule |
| `pipeline_delete_schedule` | Delete a pipeline schedule |
| `pipeline_get_definition` | Get pipeline definition (LRO). Writes files to `outputDirectoryPath` |
| `pipeline_update_definition` | Update pipeline definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |

### Semantic Model (15 tools)
| Tool | Description |
|------|-------------|
| `semantic_model_list` | List all semantic models |
| `semantic_model_get_details` | Get semantic model metadata (name, ID, description) — does not return the definition |
| `semantic_model_create_bim` | Create a semantic model from a BIM/JSON file (LRO). Reads `model.bim` from `definitionFilePath` |
| `semantic_model_create_tmdl` | Create a semantic model from TMDL files (LRO). Reads `.tmdl`/`.pbism` from `filesDirectoryPath` |
| `semantic_model_update_details` | Update semantic model name or description — does not modify the definition |
| `semantic_model_delete` | Delete a semantic model |
| `semantic_model_refresh` | Trigger a model refresh (Power BI API) |
| `semantic_model_execute_dax` | Execute a DAX query (Power BI API) |
| `semantic_model_get_bim` | Get definition in BIM/JSON format (LRO). Writes `model.bim` to `outputFilePath` |
| `semantic_model_get_tmdl` | Get definition in TMDL format (LRO). Writes TMDL files to `outputDirectoryPath` |
| `semantic_model_update_bim` | Update definition from BIM/JSON file (LRO). Reads `model.bim` from `definitionFilePath` |
| `semantic_model_update_tmdl` | Update definition from TMDL files (LRO). Reads `.tmdl`/`.pbism` from `filesDirectoryPath` |
| `semantic_model_get_refresh_history` | Get refresh history (Power BI API) |
| `semantic_model_take_over` | Take over ownership of a semantic model (Power BI API) |
| `semantic_model_get_datasources` | Get data sources of a semantic model (Power BI API) |

### Report (13 tools)
| Tool | Description |
|------|-------------|
| `report_list` | List all reports |
| `report_get` | Get report details |
| `report_create_definition` | Create a new report from PBIR definition files (LRO). Reads from `definitionDirectoryPath` |
| `report_update` | Update report name or description |
| `report_delete` | Delete a report |
| `report_clone` | Clone a report (Power BI API) |
| `report_export` | Export report to file format (PDF, PPTX, PNG, etc.) via Power BI API |
| `report_get_export_status` | Check report export status |
| `report_get_definition` | Get report definition (LRO). Writes files to `outputDirectoryPath` |
| `report_update_definition` | Update report definition from PBIR directory (LRO). Reads from `definitionDirectoryPath` |
| `report_rebind` | Rebind a report to a different semantic model/dataset (Power BI API) |
| `report_get_pages` | Get the list of pages in a report (Power BI API) |
| `report_get_datasources` | Get data sources used by a report (Power BI API) |

### Dataflow Gen2 (8 tools)
| Tool | Description |
|------|-------------|
| `dataflow_list` | List all Dataflow Gen2 items |
| `dataflow_get` | Get dataflow details |
| `dataflow_create` | Create a new dataflow |
| `dataflow_update` | Update dataflow name or description |
| `dataflow_delete` | Delete a dataflow |
| `dataflow_refresh` | Trigger a dataflow refresh |
| `dataflow_get_refresh_status` | Get refresh job status |
| `dataflow_get_definition` | Get dataflow definition (LRO). Writes files to `outputDirectoryPath` |

### Eventhouse (7 tools)
| Tool | Description |
|------|-------------|
| `eventhouse_list` | List all eventhouses |
| `eventhouse_get` | Get eventhouse details |
| `eventhouse_create` | Create a new eventhouse (LRO) |
| `eventhouse_update` | Update eventhouse name or description |
| `eventhouse_delete` | Delete an eventhouse |
| `eventhouse_get_sql_endpoint` | Get query service URI and connection details |
| `eventhouse_execute_kql` | Execute a KQL query against a KQL database |

### Eventstream (7 tools)
| Tool | Description |
|------|-------------|
| `eventstream_list` | List all eventstreams |
| `eventstream_get` | Get eventstream details |
| `eventstream_create` | Create a new eventstream (LRO) |
| `eventstream_update` | Update eventstream name or description |
| `eventstream_delete` | Delete an eventstream |
| `eventstream_get_definition` | Get eventstream definition (LRO). Writes files to `outputDirectoryPath` |
| `eventstream_update_definition` | Update eventstream definition (LRO). Reads from `definitionDirectoryPath` |

### Reflex / Activator (7 tools)
| Tool | Description |
|------|-------------|
| `reflex_list` | List all Reflex (Activator) items |
| `reflex_get` | Get reflex details |
| `reflex_create` | Create a new reflex |
| `reflex_update` | Update reflex name or description |
| `reflex_delete` | Delete a reflex |
| `reflex_get_definition` | Get reflex definition (LRO). Writes files to `outputDirectoryPath` |
| `reflex_update_definition` | Update reflex definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |

### GraphQL API (7 tools)
| Tool | Description |
|------|-------------|
| `graphql_api_list` | List all GraphQL API items |
| `graphql_api_get` | Get GraphQL API details |
| `graphql_api_create` | Create a new GraphQL API |
| `graphql_api_update` | Update GraphQL API name or description |
| `graphql_api_delete` | Delete a GraphQL API |
| `graphql_api_get_definition` | Get GraphQL schema definition (LRO). Writes files to `outputDirectoryPath` |
| `graphql_api_execute_query` | Execute a GraphQL query |

### SQL Endpoint (4 tools)
| Tool | Description |
|------|-------------|
| `sql_endpoint_list` | List all SQL endpoints |
| `sql_endpoint_get` | Get SQL endpoint details |
| `sql_endpoint_get_connection_string` | Get TDS connection string |
| `sql_endpoint_execute_query` | Execute a T-SQL query against a lakehouse or warehouse SQL endpoint |
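
For query tools, the arguments carry the query text itself. A `sql_endpoint_execute_query` request might be shaped like this sketch (the `workspaceId`, `itemId`, and `query` argument names, and the sample table, are illustrative assumptions):

```json
{
  "method": "tools/call",
  "params": {
    "name": "sql_endpoint_execute_query",
    "arguments": {
      "workspaceId": "<workspace-guid>",
      "itemId": "<lakehouse-or-warehouse-guid>",
      "query": "SELECT TOP 10 * FROM Sales"
    }
  }
}
```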

### Variable Library (7 tools)
| Tool | Description |
|------|-------------|
| `variable_library_list` | List all variable libraries in a workspace |
| `variable_library_get` | Get variable library details including active value set name |
| `variable_library_create` | Create a variable library, optionally with definition files from `definitionDirectoryPath` (LRO) |
| `variable_library_update` | Update name, description, or active value set |
| `variable_library_delete` | Delete a variable library |
| `variable_library_get_definition` | Get definition (LRO). Writes files (variables.json, valueSets/) to `outputDirectoryPath` |
| `variable_library_update_definition` | Update definition from directory of `.json` and `.platform` files (LRO) |

### Git Integration (9 tools)
| Tool | Description |
|------|-------------|
| `git_get_connection` | Get Git connection details for a workspace |
| `git_get_status` | Get Git status of items (sync state between workspace and remote) |
| `git_connect` | Connect a workspace to a Git repository (Azure DevOps or GitHub) |
| `git_disconnect` | Disconnect a workspace from its Git repository |
| `git_initialize_connection` | Initialize a Git connection after connecting (LRO) |
| `git_commit_to_git` | Commit workspace changes to the connected Git repository (LRO) |
| `git_update_from_git` | Update workspace from the connected Git repository (LRO) |
| `git_get_credentials` | Get Git credentials configuration for the current user |
| `git_update_credentials` | Update Git credentials configuration for the current user |

### Deployment Pipeline (12 tools)
| Tool | Description |
|------|-------------|
| `deployment_pipeline_list` | List all deployment pipelines accessible to the user |
| `deployment_pipeline_get` | Get details of a specific deployment pipeline |
| `deployment_pipeline_create` | Create a new deployment pipeline |
| `deployment_pipeline_update` | Update deployment pipeline name or description |
| `deployment_pipeline_delete` | Delete a deployment pipeline |
| `deployment_pipeline_list_stages` | List all stages in a deployment pipeline |
| `deployment_pipeline_list_stage_items` | List all items in a specific stage |
| `deployment_pipeline_assign_workspace` | Assign a workspace to a pipeline stage |
| `deployment_pipeline_unassign_workspace` | Unassign a workspace from a pipeline stage |
| `deployment_pipeline_deploy` | Deploy items from one stage to another (LRO) |
| `deployment_pipeline_list_operations` | List operations (deployment history) |
| `deployment_pipeline_get_operation` | Get details of a specific deployment operation |

### Mirrored Database (11 tools)
| Tool | Description |
|------|-------------|
| `mirrored_database_list` | List all mirrored databases in a workspace |
| `mirrored_database_get` | Get details of a specific mirrored database |
| `mirrored_database_create` | Create a new mirrored database (LRO) |
| `mirrored_database_update` | Update mirrored database name or description |
| `mirrored_database_delete` | Delete a mirrored database |
| `mirrored_database_get_definition` | Get mirrored database definition (LRO). Writes files to `outputDirectoryPath` |
| `mirrored_database_update_definition` | Update definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
| `mirrored_database_start_mirroring` | Start mirroring for a mirrored database |
| `mirrored_database_stop_mirroring` | Stop mirroring for a mirrored database |
| `mirrored_database_get_mirroring_status` | Get the mirroring status |
| `mirrored_database_get_tables_mirroring_status` | Get mirroring status of individual tables |

### KQL Database (7 tools)
| Tool | Description |
|------|-------------|
| `kql_database_list` | List all KQL databases in a workspace |
| `kql_database_get` | Get details of a specific KQL database |
| `kql_database_create` | Create a new KQL database (LRO). Requires a parent eventhouse |
| `kql_database_update` | Update KQL database name or description |
| `kql_database_delete` | Delete a KQL database |
| `kql_database_get_definition` | Get KQL database definition (LRO). Writes files to `outputDirectoryPath` |
| `kql_database_update_definition` | Update definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |

### ML Model (5 tools)
| Tool | Description |
|------|-------------|
| `ml_model_list` | List all ML models in a workspace |
| `ml_model_get` | Get details of a specific ML model |
| `ml_model_create` | Create a new ML model (LRO) |
| `ml_model_update` | Update ML model name or description |
| `ml_model_delete` | Delete an ML model |

### ML Experiment (5 tools)
| Tool | Description |
|------|-------------|
| `ml_experiment_list` | List all ML experiments in a workspace |
| `ml_experiment_get` | Get details of a specific ML experiment |
| `ml_experiment_create` | Create a new ML experiment (LRO) |
| `ml_experiment_update` | Update ML experiment name or description |
| `ml_experiment_delete` | Delete an ML experiment |

### Copy Job (11 tools)
| Tool | Description |
|------|-------------|
| `copy_job_list` | List all copy jobs in a workspace |
| `copy_job_get` | Get details of a specific copy job |
| `copy_job_create` | Create a new copy job |
| `copy_job_update` | Update copy job name or description |
| `copy_job_delete` | Delete a copy job |
| `copy_job_get_definition` | Get copy job definition (LRO). Writes files to `outputDirectoryPath` |
| `copy_job_update_definition` | Update definition (LRO). Reads from `partsDirectoryPath` or inline `parts` |
| `copy_job_run` | Run a copy job on demand |
| `copy_job_get_run_status` | Get copy job run status |
| `copy_job_cancel_run` | Cancel a running copy job |
| `copy_job_list_runs` | List all run instances for a copy job |

### External Data Share (4 tools)
| Tool | Description |
|------|-------------|
| `external_data_share_list` | List all external data shares for an item |
| `external_data_share_get` | Get details of a specific external data share |
| `external_data_share_create` | Create a new external data share for an item |
| `external_data_share_revoke` | Revoke an external data share |

## License

AGPL-3.0