sceneview-mcp 3.4.10 → 3.4.11

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,53 +1,34 @@
- # sceneview-mcp — SceneView MCP Server
+ # sceneview-mcp
+
+ **Give any AI assistant expert-level knowledge of 3D and AR development.**
 
  [![npm version](https://img.shields.io/npm/v/sceneview-mcp?color=6c35aa)](https://www.npmjs.com/package/sceneview-mcp)
  [![npm downloads](https://img.shields.io/npm/dm/sceneview-mcp?color=blue)](https://www.npmjs.com/package/sceneview-mcp)
+ [![Tests](https://img.shields.io/badge/tests-612%20passing-brightgreen)](#quality)
  [![MCP](https://img.shields.io/badge/MCP-v1.12-blue)](https://modelcontextprotocol.io/)
+ [![Registry](https://img.shields.io/badge/MCP%20Registry-listed-blueviolet)](https://registry.modelcontextprotocol.io)
  [![License](https://img.shields.io/badge/License-MIT-green)](./LICENSE)
  [![Node](https://img.shields.io/badge/Node-%3E%3D18-brightgreen)](https://nodejs.org/)
 
- > **Disclaimer:** This tool generates code suggestions for the SceneView SDK. Generated code is provided "as is" without warranty. Always review generated code before use in production. This is not a substitute for professional software engineering review. See [TERMS.md](./TERMS.md) and [PRIVACY.md](./PRIVACY.md).
-
- The official [Model Context Protocol](https://modelcontextprotocol.io/) server for **SceneView** — giving AI assistants deep knowledge of the SceneView 3D/AR SDK so they generate correct, compilable Kotlin code.
-
- ---
-
- ## What It Does
+ The official [Model Context Protocol](https://modelcontextprotocol.io/) server for **[SceneView](https://sceneview.github.io)** -- the cross-platform 3D & AR SDK for Android (Jetpack Compose + Filament), iOS/macOS/visionOS (SwiftUI + RealityKit), and Web (Filament.js + WebXR).
 
- When connected to an AI assistant (Claude, Cursor, Windsurf, etc.), this MCP server provides **10 tools** and **2 resources** that give the assistant expert-level knowledge of the SceneView SDK:
+ Connect it to Claude, Cursor, Windsurf, or any MCP client. The assistant gets 15 specialized tools, 33 compilable code samples, a full API reference, and a code validator -- so it writes correct, working 3D/AR code on the first try.
 
- ### Tools
+ > **Disclaimer:** Generated code is provided "as is" without warranty. Always review before production use. See [TERMS.md](./TERMS.md) and [PRIVACY.md](./PRIVACY.md).
 
- | Tool | Description |
- |------|-------------|
- | `get_node_reference` | Complete API reference for any SceneView node type (26+ types) |
- | `list_node_types` | List all available node composables |
- | `validate_code` | Check SceneView code for 15+ common mistakes before presenting it |
- | `get_sample` | Get complete, compilable sample code for any of 14 scenarios |
- | `list_samples` | Browse all sample applications, optionally filtered by tag |
- | `get_setup` | Gradle + manifest setup for 3D or AR projects |
- | `get_migration_guide` | Full SceneView 2.x to 3.0 migration instructions |
- | `get_platform_roadmap` | Multi-platform roadmap (Android, iOS, KMP, Web) |
- | `get_best_practices` | Performance, architecture, memory, and threading best practices |
- | `get_ar_setup` | Detailed AR setup: manifest, permissions, session config, patterns |
+ ---
 
- ### Resources
+ ## Quick start
 
- | Resource | Description |
- |----------|-------------|
- | `sceneview://api` | Complete SceneView 3.3.0 API reference (llms.txt) |
- | `sceneview://known-issues` | Live open issues from GitHub (cached 10 min) |
+ **One command -- no install required:**
 
- ---
-
- ## Installation
+ ```bash
+ npx sceneview-mcp
+ ```
 
  ### Claude Desktop
 
- Add to your Claude Desktop configuration file:
-
- - **macOS:** `~/Library/Application Support/Claude/claude_desktop_config.json`
- - **Windows:** `%APPDATA%\Claude\claude_desktop_config.json`
+ Add to `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or `%APPDATA%\Claude\claude_desktop_config.json` (Windows):
 
  ```json
  {
@@ -60,38 +41,17 @@ Add to your Claude Desktop configuration file:
  }
  ```
 
- Restart Claude Desktop after saving the file.
+ Restart Claude Desktop after saving.
 
  ### Claude Code
 
- Run from your terminal:
-
  ```bash
  claude mcp add sceneview -- npx -y sceneview-mcp
  ```
 
- Or add to your `.claude/settings.json`:
-
- ```json
- {
- "mcpServers": {
- "sceneview": {
- "command": "npx",
- "args": ["-y", "sceneview-mcp"]
- }
- }
- }
- ```
-
  ### Cursor
 
- Open **Settings > MCP** and add a new server:
-
- **Name:** `sceneview`
- **Type:** `command`
- **Command:** `npx -y sceneview-mcp`
-
- Or add to your `.cursor/mcp.json`:
+ Open **Settings > MCP**, add a new server named `sceneview` with command `npx -y sceneview-mcp`. Or add to `.cursor/mcp.json`:
 
  ```json
  {
@@ -104,102 +64,104 @@ Or add to your `.cursor/mcp.json`:
  }
  ```
 
- ### Windsurf
+ ### Windsurf / Other MCP clients
 
- Open **Settings > MCP** and add:
-
- ```json
- {
- "mcpServers": {
- "sceneview": {
- "command": "npx",
- "args": ["-y", "sceneview-mcp"]
- }
- }
- }
- ```
-
- ### Other MCP Clients
-
- The server communicates via **stdio** using the MCP protocol. Start it directly:
-
- ```bash
- npx sceneview-mcp
- ```
+ Same JSON config as above. The server communicates via **stdio** using the standard MCP protocol.
 
  ---
 
- ## Verify Installation
-
- Once connected, ask your AI assistant:
-
- > "List all SceneView node types"
-
- It should return the full list of 26+ composable nodes. If it does, the MCP server is working.
+ ## What you get
+
+ ### 15 tools
+
+ | Tool | What it does |
+ |---|---|
+ | `get_sample` | Returns a complete, compilable code sample for any of 33 scenarios (Kotlin or Swift) |
+ | `list_samples` | Browse all samples, filter by tag (`ar`, `3d`, `ios`, `animation`, `geometry`, ...) |
+ | `validate_code` | Checks generated code against 15+ rules before presenting it to the user |
+ | `get_node_reference` | Full API reference for any of 26+ node types -- exact signatures, defaults, examples |
+ | `list_node_types` | List every composable node type available in SceneView |
+ | `get_setup` | Gradle + manifest setup for Android 3D or AR projects |
+ | `get_ios_setup` | SPM dependency, Info.plist, and SwiftUI integration for iOS/macOS/visionOS |
+ | `get_web_setup` | Kotlin/JS + Filament.js (WASM) setup for browser-based 3D |
+ | `get_ar_setup` | Detailed AR config: permissions, session options, plane detection, image tracking |
+ | `get_best_practices` | Performance, architecture, memory, and threading guidance |
+ | `get_migration_guide` | Every breaking change from SceneView 2.x to 3.0 with before/after code |
+ | `get_troubleshooting` | Common crashes, build failures, AR issues, and their fixes |
+ | `get_platform_roadmap` | Multi-platform status and timeline (Android, iOS, KMP, Web, Desktop) |
+ | `render_3d_preview` | Generates an interactive 3D preview link the user can open in their browser |
+ | `create_3d_artifact` | Generates self-contained HTML artifacts (model viewer, 3D charts, product 360) |
+
+ ### 2 resources
+
+ | Resource URI | What it provides |
+ |---|---|
+ | `sceneview://api` | Complete SceneView 3.3.0 API reference (the full `llms.txt`) |
+ | `sceneview://known-issues` | Live open issues from GitHub (cached 10 min) |
 
  ---
 
- ## Tool Examples
-
- ### Get a sample project
-
- > "Show me an AR tap-to-place sample with SceneView"
-
- The assistant will call `get_sample("ar-model-viewer")` and return a complete, compilable Kotlin composable with all imports and dependencies.
-
- ### Validate generated code
+ ## Examples
 
- > "Create a 3D model viewer and validate the code"
+ ### "Build me an AR app"
 
- The assistant will generate the code, then call `validate_code` to check it against 15+ rules (threading, null safety, API correctness, lifecycle) before presenting it.
+ The assistant calls `get_ar_setup` + `get_sample("ar-model-viewer")` and returns a complete, compilable Kotlin composable with all imports, Gradle dependencies, and manifest entries. Ready to paste into Android Studio.
 
- ### Look up a node's API
+ ### "Create a 3D model viewer for iOS"
 
- > "What parameters does LightNode accept?"
+ The assistant calls `get_ios_setup("3d")` + `get_sample("ios-model-viewer")` and returns Swift code with the SPM dependency, Info.plist entries, and a working SwiftUI view.
 
- The assistant will call `get_node_reference("LightNode")` and return the exact function signature, parameter types, defaults, and a usage example.
+ ### "What parameters does LightNode accept?"
 
- ### Get setup instructions
+ The assistant calls `get_node_reference("LightNode")` and returns the exact function signature, parameter types, defaults, and a usage example -- including the critical detail that `apply` is a named parameter, not a trailing lambda.
 
- > "How do I set up ARCore with SceneView in my project?"
+ ### "Validate this code before I use it"
 
- The assistant will call `get_ar_setup` and return the complete Gradle dependency, AndroidManifest.xml changes, session configuration options, and working AR starter templates.
+ The assistant calls `validate_code` with the generated snippet and checks it against 15+ rules: threading violations, null safety, API correctness, lifecycle issues, deprecated APIs. Problems are flagged with explanations before the code reaches the user.
 
- ### Get best practices
+ ### "Show me the model in 3D"
 
- > "What are the performance best practices for SceneView?"
+ The assistant calls `render_3d_preview` and returns an interactive link to a browser-based 3D viewer with orbit controls and optional AR mode.
 
- The assistant will call `get_best_practices("performance")` and return guidance on model optimization, runtime performance, environment/lighting setup, and common anti-patterns.
-
- ### Check the roadmap
+ ---
 
- > "Does SceneView support iOS?"
+ ## Why this exists
 
- The assistant will call `get_platform_roadmap` and return the current multi-platform status and future plans.
+ **Without** this MCP server, AI assistants regularly:
+ - Recommend deprecated **Sceneform** (abandoned 2021) instead of SceneView
+ - Generate imperative **View-based** code instead of Jetpack Compose
+ - Use **wrong API signatures** or outdated parameter names
+ - Miss the `LightNode` named-parameter gotcha (`apply =` not trailing lambda)
+ - Forget null-checks on `rememberModelInstance` (it returns `null` while loading)
+ - Have no knowledge of SceneView's iOS/Swift API at all
 
- ### Migrate from v2 to v3
+ **With** this MCP server, AI assistants:
+ - Always use the current SceneView 3.3.0 API surface
+ - Generate correct **Compose-native** 3D/AR code for Android
+ - Generate correct **SwiftUI-native** code for iOS/macOS/visionOS
+ - Know about all 26+ node types and their exact parameters
+ - Validate code against 15+ rules before presenting it
+ - Provide working, tested sample code for 33 scenarios
 
- > "I'm upgrading from SceneView 2.x. What changed?"
+ ---
 
- The assistant will call `get_migration_guide` and return every breaking change with before/after code examples.
+ ## Quality
 
- ---
+ The MCP server is tested with **612 unit tests** across 14 test suites covering:
 
- ## Why Use This?
+ - Every tool response (correct output, error handling, edge cases)
+ - All 33 code samples (compilable structure, correct imports, no deprecated APIs)
+ - Code validator rules (true positives and false-positive resistance)
+ - Node reference parsing (all 26+ types extracted correctly from `llms.txt`)
+ - Resource responses (API reference, GitHub issues integration)
 
- **Without** this MCP server, AI assistants may:
- - Recommend deprecated Sceneform instead of SceneView
- - Generate imperative View-based code instead of Compose
- - Use wrong API signatures or outdated versions
- - Miss ARCore integration patterns
- - Forget null-checks on `rememberModelInstance`
+ ```
+ Test Files 14 passed (14)
+ Tests 612 passed (612)
+ Duration 491ms
+ ```
 
- **With** this MCP server, AI assistants:
- - Always use the latest SceneView 3.3.0 API
- - Generate correct Compose-native 3D/AR code
- - Know about all 26+ node types and their exact parameters
- - Validate code against 15+ rules before presenting it
- - Provide working sample code for any scenario
+ All tools work **fully offline** except `sceneview://known-issues` (GitHub API, cached 10 min).
 
  ---
 
@@ -208,7 +170,7 @@ The assistant will call `get_migration_guide` and return every breaking change w
  ### "MCP server not found" or connection errors
 
  1. Ensure Node.js 18+ is installed: `node --version`
- 2. Test the server manually: `npx sceneview-mcp` it should start without errors and wait for input
+ 2. Test manually: `npx sceneview-mcp` -- should start without errors
  3. Restart your AI client after changing the MCP configuration
 
  ### "npx command not found"
@@ -217,21 +179,13 @@ Install Node.js from [nodejs.org](https://nodejs.org/) (LTS recommended). npm an
 
  ### Server starts but tools are not available
 
- - In Claude Desktop, check the MCP icon in the input bar. It should show "sceneview" as connected.
- - In Cursor, check **Settings > MCP** and verify the server shows a green status.
- - Try restarting the MCP server by restarting your AI client.
-
- ### Stale data from `sceneview://known-issues`
-
- GitHub issues are cached for 10 minutes. Wait for the cache to expire or restart the server.
-
- ### Validation false positives
-
- The `validate_code` tool uses pattern matching and may flag valid code in some edge cases. If a validation warning seems incorrect, review the rule explanation in the output — it includes the rule ID and a detailed explanation.
+ - **Claude Desktop:** check the MCP icon in the input bar -- it should show "sceneview" as connected
+ - **Cursor:** check **Settings > MCP** for green status
+ - Restart the AI client to force a reconnect
 
  ### Firewall or proxy issues
 
- The only network call is to the GitHub API (for `sceneview://known-issues`). All other tools work fully offline. If you are behind a corporate proxy, set the `HTTPS_PROXY` environment variable:
+ The only network call is to the GitHub API (for known issues). All other tools work offline. For corporate proxies:
 
  ```json
  {
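The hunk above is cut off at the opening brace of the README's proxy config block, so the full JSON does not appear in this diff. For orientation only, a hypothetical complete entry wiring in `HTTPS_PROXY` could look like the following; the `env` field placement and the proxy URL are illustrative assumptions, not taken from the package:

```json
{
  "mcpServers": {
    "sceneview": {
      "command": "npx",
      "args": ["-y", "sceneview-mcp"],
      "env": { "HTTPS_PROXY": "http://proxy.example.com:8080" }
    }
  }
}
```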
@@ -255,24 +209,39 @@ The only network call is to the GitHub API (for `sceneview://known-issues`). All
  cd mcp
  npm install
  npm run prepare # Copy llms.txt + build TypeScript
- npm test # Run unit tests
+ npm test # 612 tests
  npm run dev # Start with tsx (hot reload)
  ```
 
- ## Publishing
+ ### Project structure
 
- Published to npm on each SceneView release:
-
- ```bash
- npm publish --access public
+ ```
+ mcp/
+ src/
+ index.ts # MCP server entry point (15 tools, 2 resources)
+ samples.ts # 33 compilable code samples (Kotlin + Swift)
+ validator.ts # Code validator (15+ rules, Kotlin + Swift)
+ node-reference.ts # Node type parser (extracts from llms.txt)
+ guides.ts # Best practices, AR setup, roadmap, troubleshooting
+ migration.ts # v2 -> v3 migration guide
+ preview.ts # 3D preview URL generator
+ artifact.ts # HTML artifact generator (model-viewer, charts, product 360)
+ issues.ts # GitHub issues fetcher (cached)
+ llms.txt # Bundled API reference (copied from repo root)
  ```
 
- ## Legal
+ ## Contributing
 
- - [LICENSE](./LICENSE) MIT License
- - [TERMS.md](./TERMS.md) Terms of Service
- - [PRIVACY.md](./PRIVACY.md) Privacy Policy (no data collected)
+ 1. Fork the repository
+ 2. Create a feature branch
+ 3. Add tests for new tools or rules
+ 4. Run `npm test` -- all 612+ tests must pass
+ 5. Submit a pull request
 
- ## License
+ See [CONTRIBUTING.md](../CONTRIBUTING.md) for the full guide.
+
+ ## Legal
 
- MIT — see [LICENSE](./LICENSE).
+ - [LICENSE](./LICENSE) -- MIT License
+ - [TERMS.md](./TERMS.md) -- Terms of Service
+ - [PRIVACY.md](./PRIVACY.md) -- Privacy Policy (no data collected)
package/dist/artifact.js CHANGED
@@ -43,7 +43,7 @@ function escapeHtml(s) {
  }
  // ─── Validation ──────────────────────────────────────────────────────────────
  export function validateArtifactInput(input) {
- const validTypes = ["model-viewer", "chart-3d", "scene", "product-360"];
+ const validTypes = ["model-viewer", "chart-3d", "scene", "product-360", "geometry"];
  if (!validTypes.includes(input.type)) {
  return `Invalid type "${input.type}". Must be one of: ${validTypes.join(", ")}`;
  }
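The change above is a one-line allow-list extension: `"geometry"` joins the accepted artifact types. Restated as a self-contained sketch (not the shipped module; the return-`null`-when-valid convention is an assumption based on the error-string returns visible in the diff):

```javascript
// Sketch of the extended validTypes guard from this release.
// The real validateArtifactInput continues with further field checks.
const validTypes = ["model-viewer", "chart-3d", "scene", "product-360", "geometry"];

function checkArtifactType(input) {
  if (!validTypes.includes(input.type)) {
    return `Invalid type "${input.type}". Must be one of: ${validTypes.join(", ")}`;
  }
  return null; // null = "no error" (assumed convention)
}
```

Anything outside the five listed strings is rejected with the same message shape the validator uses, which makes the error self-documenting for the calling assistant.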
@@ -57,6 +57,26 @@ export function validateArtifactInput(input) {
  }
  }
  }
+ if (input.type === "geometry") {
+ if (!input.shapes || !Array.isArray(input.shapes) || input.shapes.length === 0) {
+ return 'Type "geometry" requires a non-empty `shapes` array.';
+ }
+ const validShapeTypes = ["cube", "sphere", "cylinder", "plane", "line"];
+ for (const s of input.shapes) {
+ if (!validShapeTypes.includes(s.type)) {
+ return `Invalid shape type "${s.type}". Must be one of: ${validShapeTypes.join(", ")}`;
+ }
+ if (s.position && (!Array.isArray(s.position) || s.position.length !== 3)) {
+ return "Shape position must be an array of 3 numbers [x, y, z].";
+ }
+ if (s.scale && (!Array.isArray(s.scale) || s.scale.length !== 3)) {
+ return "Shape scale must be an array of 3 numbers [x, y, z].";
+ }
+ if (s.color && (!Array.isArray(s.color) || s.color.length !== 3)) {
+ return "Shape color must be an array of 3 numbers [r, g, b] in 0-1 range.";
+ }
+ }
+ }
  if (input.modelUrl) {
  if (!input.modelUrl.startsWith("https://") && !input.modelUrl.startsWith("http://")) {
  return "modelUrl must be an HTTP(S) URL.";
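The new per-shape rules above can be exercised in isolation. A compact re-statement as a sketch: the shipped validator spells out `position`, `scale`, and `color` checks individually with field-specific messages, while this version loops over the keys and uses a simplified message:

```javascript
// Sketch of the "geometry" shape validation added in 3.4.11 (simplified).
const validShapeTypes = ["cube", "sphere", "cylinder", "plane", "line"];

function checkShapes(shapes) {
  if (!Array.isArray(shapes) || shapes.length === 0) {
    return 'Type "geometry" requires a non-empty `shapes` array.';
  }
  for (const s of shapes) {
    if (!validShapeTypes.includes(s.type)) {
      return `Invalid shape type "${s.type}". Must be one of: ${validShapeTypes.join(", ")}`;
    }
    // position/scale/color are optional, but when present must be 3-element arrays
    for (const key of ["position", "scale", "color"]) {
      if (s[key] && (!Array.isArray(s[key]) || s[key].length !== 3)) {
        return `Shape ${key} must be an array of 3 numbers.`;
      }
    }
  }
  return null; // null = valid (assumed convention)
}
```

Note the truthiness guard (`s[key] &&`) mirrors the diff: omitted fields pass and get defaults later in `generateGeometry`.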
@@ -75,6 +95,8 @@ export function generateArtifact(input) {
  return generateScene(input);
  case "product-360":
  return generateProduct360(input);
+ case "geometry":
+ return generateGeometry(input);
  }
  }
  // ─── Filament.js renderer core ──────────────────────────────────────────────
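The `generateGeometry` function added in the following hunk opens by sizing the orbit camera from the scene bounds: a shape's extent is its distance from the origin plus its largest scale axis, and the camera orbits at 1.8 times the farthest extent, clamped to at least 4 units. Pulled out as a standalone sketch (`Math.hypot` substituted for the diff's explicit `Math.sqrt` sum):

```javascript
// Camera-distance heuristic from the new geometry renderer, re-stated in isolation.
function cameraDistance(shapes) {
  let maxDist = 2; // floor before scaling, as in the diff
  for (const s of shapes) {
    const p = s.position || [0, 0, 0];
    const sc = s.scale || [1, 1, 1];
    // Extent = distance from origin + largest scale axis
    const d = Math.hypot(p[0], p[1], p[2]) + Math.max(sc[0], sc[1], sc[2]);
    if (d > maxDist) maxDist = d;
  }
  return Math.max(4, maxDist * 1.8);
}
```

The 1.8 multiplier leaves margin around the bounding extent so shapes stay in frame while orbiting; the clamp keeps tiny or empty scenes from placing the camera inside the grid.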
@@ -339,6 +361,269 @@ ${filamentRendererScript({ modelUrl: model, bgColor: hexToBgColor(bg), autoRotat
  type: "product-360",
  };
  }
+ // ── geometry ──────────────────────────────────────────────────────────────────
+ //
+ // Procedural 3D geometry renderer — zero dependencies, pure WebGL2 PBR.
+ // Claude can "draw" in 3D: cubes, spheres, cylinders, planes, lines.
+ function generateGeometry(input) {
+ const title = input.title || "3D Geometry";
+ const opts = input.options || {};
+ const bg = opts.backgroundColor || "#0d1117";
+ const autoRotate = opts.autoRotate !== false;
+ const shapes = input.shapes || [];
+ // Compute camera distance from scene bounds
+ let maxDist = 2;
+ for (const s of shapes) {
+ const p = s.position || [0, 0, 0];
+ const sc = s.scale || [1, 1, 1];
+ const d = Math.sqrt(p[0] * p[0] + p[1] * p[1] + p[2] * p[2]) + Math.max(sc[0], sc[1], sc[2]);
+ if (d > maxDist)
+ maxDist = d;
+ }
+ const camDist = Math.max(4, maxDist * 1.8);
+ // Compute center-of-mass for the look-at target
+ let cx = 0, cy = 0, cz = 0;
+ for (const s of shapes) {
+ const p = s.position || [0, 0, 0];
+ cx += p[0];
+ cy += p[1];
+ cz += p[2];
+ }
+ if (shapes.length > 0) {
+ cx /= shapes.length;
+ cy /= shapes.length;
+ cz /= shapes.length;
+ }
+ const shapesJson = JSON.stringify(shapes.map(s => ({
+ type: s.type,
+ position: s.position || [0, 0, 0],
+ scale: s.scale || [1, 1, 1],
+ color: s.color || [0.8, 0.8, 0.8],
+ metallic: s.metallic ?? 0.0,
+ roughness: s.roughness ?? 0.5,
+ })));
+ const bgRgb = hexToRgb(bg);
+ const bgR = (bgRgb.r / 255).toFixed(3);
+ const bgG = (bgRgb.g / 255).toFixed(3);
+ const bgB = (bgRgb.b / 255).toFixed(3);
+ const body = `
+ <style>
+ canvas{width:100%;height:100%;display:block;cursor:grab}
+ canvas:active{cursor:grabbing}
+ .geo-title{position:absolute;top:16px;left:16px;font-size:18px;font-weight:700;color:#fff;text-shadow:0 2px 8px rgba(0,0,0,0.5);z-index:10}
+ .geo-info{position:absolute;top:42px;left:16px;font-size:12px;color:#888;z-index:10}
+ .controls-hint{position:absolute;bottom:16px;left:16px;font-size:11px;color:#666;z-index:10}
+ </style>
+ <div class="geo-title">${escapeHtml(title)}</div>
+ <div class="geo-info">${shapes.length} shape${shapes.length !== 1 ? "s" : ""} &bull; Procedural geometry</div>
+ <canvas id="c"></canvas>
+ <div class="controls-hint">Drag to orbit &bull; Scroll to zoom</div>
+ ${BRANDING}
+ <script>
+ (function(){
+ var canvas=document.getElementById('c');
+ var gl=canvas.getContext('webgl2');
+ if(!gl){document.body.innerHTML='<p style="color:#fff;padding:40px">WebGL2 not supported</p>';return;}
+
+ var VS=\`#version 300 es
+ precision highp float;
+ in vec3 aPos;
+ in vec3 aNormal;
+ uniform mat4 uMVP;
+ uniform mat4 uModel;
+ uniform mat3 uNormalMatrix;
+ out vec3 vNormal;
+ out vec3 vWorldPos;
+ void main(){
+ vec4 wp=uModel*vec4(aPos,1.0);
+ vWorldPos=wp.xyz;
+ vNormal=normalize(uNormalMatrix*aNormal);
+ gl_Position=uMVP*vec4(aPos,1.0);
+ }\`;
+
+ var FS=\`#version 300 es
+ precision highp float;
+ in vec3 vNormal;
+ in vec3 vWorldPos;
+ uniform vec3 uCamPos;
+ uniform vec3 uLightDir;
+ uniform vec3 uLightColor;
+ uniform vec3 uBaseColor;
+ uniform float uMetallic;
+ uniform float uRoughness;
+ out vec4 fragColor;
+ const float PI=3.14159265;
+ float D_GGX(float NdotH,float r){float a=r*r;float a2=a*a;float d=NdotH*NdotH*(a2-1.0)+1.0;return a2/(PI*d*d);}
+ vec3 F_Schlick(float ct,vec3 F0){return F0+(1.0-F0)*pow(1.0-ct,5.0);}
+ float G_Smith(float NdotV,float NdotL,float r){float k=((r+1.0)*(r+1.0))/8.0;return(NdotV/(NdotV*(1.0-k)+k))*(NdotL/(NdotL*(1.0-k)+k));}
+ void main(){
+ vec3 N=normalize(vNormal);vec3 V=normalize(uCamPos-vWorldPos);vec3 L=normalize(uLightDir);vec3 H=normalize(V+L);
+ float NdotL=max(dot(N,L),0.0);float NdotV=max(dot(N,V),0.001);float NdotH=max(dot(N,H),0.0);float HdotV=max(dot(H,V),0.0);
+ vec3 F0=mix(vec3(0.04),uBaseColor,uMetallic);vec3 F=F_Schlick(HdotV,F0);
+ float D=D_GGX(NdotH,uRoughness);float G=G_Smith(NdotV,NdotL,uRoughness);
+ vec3 spec=(D*F*G)/(4.0*NdotV*NdotL+0.001);
+ vec3 kD=(1.0-F)*(1.0-uMetallic);vec3 diff=kD*uBaseColor/PI;
+ vec3 color=(diff+spec)*uLightColor*NdotL;
+ float hem=dot(N,vec3(0,1,0))*0.5+0.5;
+ color+=mix(vec3(0.05,0.04,0.03),vec3(0.15,0.2,0.35),hem)*uBaseColor*0.4;
+ color+=vec3(0.1,0.15,0.3)*pow(1.0-NdotV,3.0)*0.5;
+ color=color*(2.51*color+0.03)/(color*(2.43*color+0.59)+0.14);
+ color=pow(color,vec3(1.0/2.2));
+ fragColor=vec4(color,1.0);
+ }\`;
+
+ var GVS=\`#version 300 es
+ precision highp float;
+ in vec3 aPos;
+ uniform mat4 uMVP;
+ void main(){gl_Position=uMVP*vec4(aPos,1.0);}\`;
+ var GFS=\`#version 300 es
+ precision highp float;
+ out vec4 fragColor;
+ void main(){fragColor=vec4(1.0,1.0,1.0,0.06);}\`;
+
+ function cShader(t,s){var sh=gl.createShader(t);gl.shaderSource(sh,s);gl.compileShader(sh);return sh;}
+ function cProg(v,f){var p=gl.createProgram();gl.attachShader(p,v);gl.attachShader(p,f);gl.linkProgram(p);return p;}
+
+ var prog=cProg(cShader(gl.VERTEX_SHADER,VS),cShader(gl.FRAGMENT_SHADER,FS));
+ var L={aPos:gl.getAttribLocation(prog,'aPos'),aNormal:gl.getAttribLocation(prog,'aNormal'),
+ uMVP:gl.getUniformLocation(prog,'uMVP'),uModel:gl.getUniformLocation(prog,'uModel'),
+ uNormalMatrix:gl.getUniformLocation(prog,'uNormalMatrix'),uCamPos:gl.getUniformLocation(prog,'uCamPos'),
+ uLightDir:gl.getUniformLocation(prog,'uLightDir'),uLightColor:gl.getUniformLocation(prog,'uLightColor'),
+ uBaseColor:gl.getUniformLocation(prog,'uBaseColor'),uMetallic:gl.getUniformLocation(prog,'uMetallic'),
+ uRoughness:gl.getUniformLocation(prog,'uRoughness')};
+
+ var gProg=cProg(cShader(gl.VERTEX_SHADER,GVS),cShader(gl.FRAGMENT_SHADER,GFS));
+ var gL={aPos:gl.getAttribLocation(gProg,'aPos'),uMVP:gl.getUniformLocation(gProg,'uMVP')};
+
+ function genSphere(r,ws,hs){var p=[],n=[],ix=[];
+ for(var y=0;y<=hs;y++)for(var x=0;x<=ws;x++){var u=x/ws,v=y/hs,th=u*Math.PI*2,ph=v*Math.PI;
+ var sp=Math.sin(ph),cp=Math.cos(ph),st=Math.sin(th),ct=Math.cos(th);
+ var nx=sp*ct,ny=cp,nz=sp*st;p.push(r*nx,r*ny,r*nz);n.push(nx,ny,nz);}
+ for(var y=0;y<hs;y++)for(var x=0;x<ws;x++){var a=y*(ws+1)+x,b=a+ws+1;ix.push(a,b,a+1,b,b+1,a+1);}
+ return{positions:new Float32Array(p),normals:new Float32Array(n),indices:new Uint16Array(ix)};}
+
+ function genCube(sz){var s=sz/2;var faces=[
+ {n:[0,0,1],v:[[-s,-s,s],[s,-s,s],[s,s,s],[-s,s,s]]},{n:[0,0,-1],v:[[s,-s,-s],[-s,-s,-s],[-s,s,-s],[s,s,-s]]},
+ {n:[0,1,0],v:[[-s,s,s],[s,s,s],[s,s,-s],[-s,s,-s]]},{n:[0,-1,0],v:[[-s,-s,-s],[s,-s,-s],[s,-s,s],[-s,-s,s]]},
+ {n:[1,0,0],v:[[s,-s,s],[s,-s,-s],[s,s,-s],[s,s,s]]},{n:[-1,0,0],v:[[-s,-s,-s],[-s,-s,s],[-s,s,s],[-s,s,-s]]}];
+ var p=[],n=[],ix=[];faces.forEach(function(f,i){f.v.forEach(function(vt){p.push(vt[0],vt[1],vt[2]);n.push(f.n[0],f.n[1],f.n[2]);});
+ var o=i*4;ix.push(o,o+1,o+2,o,o+2,o+3);});
+ return{positions:new Float32Array(p),normals:new Float32Array(n),indices:new Uint16Array(ix)};}
+
+ function genCyl(rT,rB,h,seg){var p=[],n=[],ix=[],hH=h/2;
+ for(var i=0;i<=seg;i++){var u=i/seg,a=u*Math.PI*2,c=Math.cos(a),s=Math.sin(a),sl=(rB-rT)/h,nl=Math.sqrt(1+sl*sl);
+ p.push(rT*c,hH,rT*s);n.push(c/nl,sl/nl,s/nl);p.push(rB*c,-hH,rB*s);n.push(c/nl,sl/nl,s/nl);}
+ for(var i=0;i<seg;i++){var a=i*2;ix.push(a,a+1,a+2,a+1,a+3,a+2);}
+ function addCap(y,r,ny){var ct=p.length/3;p.push(0,y,0);n.push(0,ny,0);
+ for(var i=0;i<=seg;i++){var a=(i/seg)*Math.PI*2;p.push(r*Math.cos(a),y,r*Math.sin(a));n.push(0,ny,0);}
+ for(var i=0;i<seg;i++){if(ny>0)ix.push(ct,ct+i+1,ct+i+2);else ix.push(ct,ct+i+2,ct+i+1);}}
+ addCap(hH,rT,1);addCap(-hH,rB,-1);
+ return{positions:new Float32Array(p),normals:new Float32Array(n),indices:new Uint16Array(ix)};}
+
+ function genPlane(w,d){var hw=w/2,hd=d/2;
+ return{positions:new Float32Array([-hw,0,-hd,hw,0,-hd,hw,0,hd,-hw,0,hd]),
+ normals:new Float32Array([0,1,0,0,1,0,0,1,0,0,1,0]),indices:new Uint16Array([0,1,2,0,2,3])};}
+
+ function genGrid(sz,div){var p=[],n=[],ix=[],st=sz/div,h=sz/2,vi=0;
+ for(var i=0;i<=div;i++){var t=i*st-h;p.push(-h,0,t,h,0,t);n.push(0,1,0,0,1,0);ix.push(vi,vi+1);vi+=2;
+ p.push(t,0,-h,t,0,h);n.push(0,1,0,0,1,0);ix.push(vi,vi+1);vi+=2;}
+ return{positions:new Float32Array(p),normals:new Float32Array(n),indices:new Uint16Array(ix)};}
+
+ function mkMesh(g){var vao=gl.createVertexArray();gl.bindVertexArray(vao);
+ var pb=gl.createBuffer();gl.bindBuffer(gl.ARRAY_BUFFER,pb);gl.bufferData(gl.ARRAY_BUFFER,g.positions,gl.STATIC_DRAW);
+ gl.enableVertexAttribArray(L.aPos);gl.vertexAttribPointer(L.aPos,3,gl.FLOAT,false,0,0);
+ var nb=gl.createBuffer();gl.bindBuffer(gl.ARRAY_BUFFER,nb);gl.bufferData(gl.ARRAY_BUFFER,g.normals,gl.STATIC_DRAW);
+ gl.enableVertexAttribArray(L.aNormal);gl.vertexAttribPointer(L.aNormal,3,gl.FLOAT,false,0,0);
+ var ib=gl.createBuffer();gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER,ib);gl.bufferData(gl.ELEMENT_ARRAY_BUFFER,g.indices,gl.STATIC_DRAW);
+ gl.bindVertexArray(null);return{vao:vao,count:g.indices.length};}
+
+ function mkGridMesh(g){var vao=gl.createVertexArray();gl.bindVertexArray(vao);
+ var pb=gl.createBuffer();gl.bindBuffer(gl.ARRAY_BUFFER,pb);gl.bufferData(gl.ARRAY_BUFFER,g.positions,gl.STATIC_DRAW);
+ gl.enableVertexAttribArray(gL.aPos);gl.vertexAttribPointer(gL.aPos,3,gl.FLOAT,false,0,0);
+ var ib=gl.createBuffer();gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER,ib);gl.bufferData(gl.ELEMENT_ARRAY_BUFFER,g.indices,gl.STATIC_DRAW);
+ gl.bindVertexArray(null);return{vao:vao,count:g.indices.length};}
+
+ var protos={cube:mkMesh(genCube(1)),sphere:mkMesh(genSphere(0.5,32,24)),
+ cylinder:mkMesh(genCyl(0.5,0.5,1,32)),plane:mkMesh(genPlane(1,1)),line:mkMesh(genCyl(0.02,0.02,1,8))};
+ var grid=mkGridMesh(genGrid(10,20));
+
+ var shapes=${shapesJson};
+
+ function m4Persp(fov,asp,n,f){var t=1/Math.tan(fov/2),nf=1/(n-f);return new Float32Array([t/asp,0,0,0,0,t,0,0,0,0,(f+n)*nf,-1,0,0,2*f*n*nf,0]);}
+ function vNorm(v){var l=Math.sqrt(v[0]*v[0]+v[1]*v[1]+v[2]*v[2]);return l>0?[v[0]/l,v[1]/l,v[2]/l]:[0,0,0];}
+ function vSub(a,b){return[a[0]-b[0],a[1]-b[1],a[2]-b[2]];}
+ function vCross(a,b){return[a[1]*b[2]-a[2]*b[1],a[2]*b[0]-a[0]*b[2],a[0]*b[1]-a[1]*b[0]];}
+ function vDot(a,b){return a[0]*b[0]+a[1]*b[1]+a[2]*b[2];}
+ function m4LookAt(e,c,u){var z=vNorm(vSub(e,c)),x=vNorm(vCross(u,z)),y=vCross(z,x);
+ return new Float32Array([x[0],y[0],z[0],0,x[1],y[1],z[1],0,x[2],y[2],z[2],0,-vDot(x,e),-vDot(y,e),-vDot(z,e),1]);}
+ function m4Mul(a,b){var r=new Float32Array(16);for(var i=0;i<4;i++)for(var j=0;j<4;j++){var s=0;for(var k=0;k<4;k++)s+=a[k*4+j]*b[i*4+k];r[i*4+j]=s;}return r;}
+ function m4Trans(x,y,z){return new Float32Array([1,0,0,0,0,1,0,0,0,0,1,0,x,y,z,1]);}
+ function m4Scale(x,y,z){return new Float32Array([x,0,0,0,0,y,0,0,0,0,z,0,0,0,0,1]);}
+ function m3Normal(m){return new Float32Array([m[0],m[1],m[2],m[4],m[5],m[6],m[8],m[9],m[10]]);}
+
+ var oTheta=0.4,oPhi=0.7,oDist=${camDist.toFixed(1)};
+ var autoR=${autoRotate},isDrag=false,lx=0,ly=0;
+ canvas.addEventListener('pointerdown',function(e){isDrag=true;lx=e.clientX;ly=e.clientY;autoR=false;canvas.setPointerCapture(e.pointerId);});
+ canvas.addEventListener('pointermove',function(e){if(!isDrag)return;oTheta-=(e.clientX-lx)*0.008;oPhi=Math.max(0.1,Math.min(Math.PI-0.1,oPhi-(e.clientY-ly)*0.008));lx=e.clientX;ly=e.clientY;});
+ canvas.addEventListener('pointerup',function(){isDrag=false;});
+ canvas.addEventListener('wheel',function(e){oDist=Math.max(2,Math.min(30,oDist+e.deltaY*0.005));e.preventDefault();},{passive:false});
+ var lpd=0;
+ canvas.addEventListener('touchstart',function(e){if(e.touches.length===2){var dx=e.touches[0].clientX-e.touches[1].clientX,dy=e.touches[0].clientY-e.touches[1].clientY;lpd=Math.sqrt(dx*dx+dy*dy);}},{passive:true});
+ canvas.addEventListener('touchmove',function(e){if(e.touches.length===2){var dx=e.touches[0].clientX-e.touches[1].clientX,dy=e.touches[0].clientY-e.touches[1].clientY;var d=Math.sqrt(dx*dx+dy*dy);oDist=Math.max(2,Math.min(30,oDist-(d-lpd)*0.02));lpd=d;}},{passive:true});
+
+ function resize(){var dpr=Math.min(window.devicePixelRatio||1,2);canvas.width=canvas.clientWidth*dpr;canvas.height=canvas.clientHeight*dpr;gl.viewport(0,0,canvas.width,canvas.height);}
+ resize();window.addEventListener('resize',resize);
+
+ function render(){
+ requestAnimationFrame(render);
+ if(autoR)oTheta+=0.004;
+ var cx=oDist*Math.sin(oPhi)*Math.cos(oTheta),cy=oDist*Math.cos(oPhi),cz=oDist*Math.sin(oPhi)*Math.sin(oTheta);
+ var camPos=[cx,cy,cz];
+ var proj=m4Persp(Math.PI/4,canvas.width/canvas.height,0.1,100);
+ var view=m4LookAt(camPos,[${cx.toFixed(2)},${cy.toFixed(2)},${cz.toFixed(2)}],[0,1,0]);
+ var vp=m4Mul(proj,view);
+
+ gl.clearColor(${bgR},${bgG},${bgB},1.0);
+ gl.clear(gl.COLOR_BUFFER_BIT|gl.DEPTH_BUFFER_BIT);
+ gl.enable(gl.DEPTH_TEST);
+ gl.enable(gl.BLEND);
+ gl.blendFunc(gl.SRC_ALPHA,gl.ONE_MINUS_SRC_ALPHA);
+
+ gl.useProgram(gProg);
+ gl.uniformMatrix4fv(gL.uMVP,false,vp);
+ gl.bindVertexArray(grid.vao);
+ gl.drawElements(gl.LINES,grid.count,gl.UNSIGNED_SHORT,0);
+
+ gl.useProgram(prog);
+ gl.uniform3fv(L.uCamPos,camPos);
+ gl.uniform3f(L.uLightDir,0.5,0.8,0.3);
+ gl.uniform3f(L.uLightColor,3.0,2.9,2.7);
+
+ for(var i=0;i<shapes.length;i++){
604
+ var sh=shapes[i],mesh=protos[sh.type];if(!mesh)continue;
605
+ var p=sh.position,s=sh.scale;
606
+ var model=m4Mul(m4Trans(p[0],p[1],p[2]),m4Scale(s[0],s[1],s[2]));
607
+ var mvp=m4Mul(vp,model),nm=m3Normal(model);
608
+ gl.uniformMatrix4fv(L.uMVP,false,mvp);
609
+ gl.uniformMatrix4fv(L.uModel,false,model);
610
+ gl.uniformMatrix3fv(L.uNormalMatrix,false,nm);
611
+ gl.uniform3fv(L.uBaseColor,sh.color);
612
+ gl.uniform1f(L.uMetallic,sh.metallic);
613
+ gl.uniform1f(L.uRoughness,sh.roughness);
614
+ gl.bindVertexArray(mesh.vao);
615
+ gl.drawElements(gl.TRIANGLES,mesh.count,gl.UNSIGNED_SHORT,0);
616
+ }
617
+ }
618
+ requestAnimationFrame(render);
619
+ })();
620
+ <\/script>`;
621
+ return {
622
+ html: htmlShell(title, body),
623
+ title,
624
+ type: "geometry",
625
+ };
626
+ }
342
627
  // ─── Utilities ───────────────────────────────────────────────────────────────
343
628
  function formatNumber(n) {
344
629
  if (n >= 1_000_000)
@@ -373,7 +658,7 @@ export function formatArtifactResponse(result) {
  const lines = [
  `## ${result.title}`,
  ``,
- `Here is your interactive 3D ${result.type === "chart-3d" ? "chart" : "content"} powered by Filament.js (same engine as SceneView Android):`,
+ `Here is your interactive 3D ${result.type === "chart-3d" ? "chart" : result.type === "geometry" ? "scene" : "content"} powered by ${result.type === "geometry" ? "WebGL PBR" : "Filament.js"} (SceneView engine):`,
  ``,
  `\`\`\`html`,
  result.html,
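The added geometry renderer builds each shape's model matrix with column-major `Float32Array` helpers (the WebGL convention, element index `col*4 + row`). As an illustrative sketch only, here are the same `m4Trans`/`m4Scale`/`m4Mul` helpers from the minified code above in readable form, showing how the per-shape model matrix composes:

```javascript
// Column-major 4x4 matrix helpers, mirroring the minified renderer above.
function m4Trans(x, y, z) {
  return new Float32Array([1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, x, y, z, 1]);
}

function m4Scale(x, y, z) {
  return new Float32Array([x, 0, 0, 0, 0, y, 0, 0, 0, 0, z, 0, 0, 0, 0, 1]);
}

// r = a * b: entry (row j, col i) is the dot of a's row j with b's column i.
function m4Mul(a, b) {
  const r = new Float32Array(16);
  for (let i = 0; i < 4; i++)
    for (let j = 0; j < 4; j++) {
      let s = 0;
      for (let k = 0; k < 4; k++) s += a[k * 4 + j] * b[i * 4 + k];
      r[i * 4 + j] = s;
    }
  return r;
}

// Per-shape model matrix as in the render loop: translate, then scale.
const model = m4Mul(m4Trans(2, 0, 0), m4Scale(0.5, 0.5, 0.5));
console.log(model[12], model[0]); // 2 0.5 (translation kept, scale applied)
```

Translating first and scaling second keeps the shape's world position independent of its size, which matches how the render loop applies `sh.position` and `sh.scale` separately.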
package/dist/index.js CHANGED
@@ -33,7 +33,7 @@ catch {
  API_DOCS = "SceneView API docs not found. Run `npm run prepare` to bundle llms.txt.";
  }
  const NODE_SECTIONS = parseNodeSections(API_DOCS);
- const server = new Server({ name: "@sceneview/mcp", version: "3.4.9" }, { capabilities: { resources: {}, tools: {} } });
+ const server = new Server({ name: "@sceneview/mcp", version: "3.4.11" }, { capabilities: { resources: {}, tools: {} } });
  // ─── Resources ───────────────────────────────────────────────────────────────
  server.setRequestHandler(ListResourcesRequestSchema, async () => ({
  resources: [
@@ -254,8 +254,8 @@ server.setRequestHandler(ListToolsRequestSchema, async () => ({
  properties: {
  type: {
  type: "string",
- enum: ["model-viewer", "chart-3d", "scene", "product-360"],
- description: '"model-viewer": interactive 3D model viewer with orbit + AR. "chart-3d": 3D bar chart for data visualization (revenue, analytics). "scene": rich 3D scene with lighting and environment. "product-360": product turntable with hotspot annotations + AR.',
+ enum: ["model-viewer", "chart-3d", "scene", "product-360", "geometry"],
+ description: '"model-viewer": interactive 3D model viewer with orbit + AR. "chart-3d": 3D bar chart for data visualization. "scene": rich 3D scene with lighting. "product-360": product turntable with hotspot annotations. "geometry": procedural 3D shapes (cubes, spheres, cylinders, planes, lines) — Claude can DRAW in 3D! Use this when the user asks to draw, build, or visualize 3D shapes.',
  },
  modelUrl: {
  type: "string",
@@ -302,6 +302,44 @@ server.setRequestHandler(ListToolsRequestSchema, async () => ({
  },
  description: "Annotation hotspots for product-360 type. Each has position, normal, label, and optional description.",
  },
+ shapes: {
+ type: "array",
+ items: {
+ type: "object",
+ properties: {
+ type: {
+ type: "string",
+ enum: ["cube", "sphere", "cylinder", "plane", "line"],
+ description: 'Shape type: "cube", "sphere", "cylinder", "plane" (flat surface), or "line" (thin cylinder connecting points).',
+ },
+ position: {
+ type: "array",
+ items: { type: "number" },
+ description: "Position [x, y, z] in world space. Y is up. Default: [0, 0, 0].",
+ },
+ scale: {
+ type: "array",
+ items: { type: "number" },
+ description: "Scale [x, y, z]. For cube: edge sizes. For sphere: diameters. For line: [length, thickness, thickness]. Default: [1, 1, 1].",
+ },
+ color: {
+ type: "array",
+ items: { type: "number" },
+ description: "Color [r, g, b] in 0-1 range. E.g. [1, 0, 0] for red, [0.2, 0.5, 1] for sky blue. Default: [0.8, 0.8, 0.8].",
+ },
+ metallic: {
+ type: "number",
+ description: "Metallic factor 0-1. 0 = plastic/matte, 1 = metal. Default: 0.",
+ },
+ roughness: {
+ type: "number",
+ description: "Roughness factor 0-1. 0 = mirror/glossy, 1 = rough/matte. Default: 0.5.",
+ },
+ },
+ required: ["type"],
+ },
+ description: 'Array of procedural 3D shapes for "geometry" type. Each shape has type, position, scale, color, metallic, roughness. Required for "geometry" type. Example: [{type:"cube",position:[0,0.5,0],scale:[1,1,1],color:[1,0,0]},{type:"sphere",position:[0,1.8,0],scale:[0.6,0.6,0.6],color:[0,0,1]}]',
+ },
  },
  required: ["type"],
  },
@@ -823,6 +861,7 @@ server.setRequestHandler(CallToolRequestSchema, async (request) => {
  data: request.params.arguments?.data,
  options: request.params.arguments?.options,
  hotspots: request.params.arguments?.hotspots,
+ shapes: request.params.arguments?.shapes,
  };
  const validationError = validateArtifactInput(artifactInput);
  if (validationError) {
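With the `shapes` parameter added to the tool schema above, a `geometry` artifact request can be assembled as a plain arguments object. A hypothetical payload sketch (the MCP client wiring around it is assumed, not part of this package), built from the example given in the schema's own description:

```javascript
// Hypothetical arguments for the artifact tool's new "geometry" type:
// a red cube with a blue sphere resting on top, per the schema's example.
const artifactArgs = {
  type: "geometry",
  shapes: [
    { type: "cube", position: [0, 0.5, 0], scale: [1, 1, 1], color: [1, 0, 0] },
    {
      type: "sphere",
      position: [0, 1.8, 0],
      scale: [0.6, 0.6, 0.6],
      color: [0, 0, 1],
      metallic: 0,    // plastic/matte, the schema default
      roughness: 0.5, // the schema default
    },
  ],
};

// Only "type" is required per shape; position, scale, and color
// fall back to the defaults stated in the schema descriptions.
console.log(artifactArgs.shapes.map((s) => s.type).join(",")); // cube,sphere
```

Note that colors are linear 0-1 RGB triples and Y is the up axis, so stacking shapes means increasing the second position component, as in the example above.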
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "sceneview-mcp",
- "version": "3.4.10",
+ "version": "3.4.11",
  "mcpName": "io.github.sceneview/mcp",
  "description": "MCP server for SceneView — cross-platform 3D & AR SDK for Android and iOS. Give Claude the full SceneView SDK so it writes correct, compilable code.",
  "keywords": [