@createlex/figgen 1.4.6 → 1.4.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +587 -88
  2. package/package.json +1 -1
package/README.md CHANGED
@@ -2,54 +2,75 @@

  A local MCP (Model Context Protocol) runtime that connects your Figma plugin session to Claude Code, Cursor, Windsurf, or any MCP-compatible AI tool — and writes production-ready SwiftUI directly into your Xcode project.

- ## Generation tiers
+ ---
+
+ ## Table of contents
+
+ 1. [Architecture](#architecture)
+ 2. [Install & start](#install--start)
+ 3. [MCP client configuration](#mcp-client-configuration)
+ 4. [Generation tiers](#generation-tiers)
+ 5. [Recommended workflow](#recommended-workflow)
+ 6. [MCP tools reference](#mcp-tools-reference)
+ 7. [Generation modes](#generation-modes)
+ 8. [Output file structure](#output-file-structure)
+ 9. [Building from source](#building-from-source)
+ 10. [code.ts internals](#codets-internals)
+ 11. [Key data types](#key-data-types)
+ 12. [Bridge protocol](#bridge-protocol)
+ 13. [Environment variables](#environment-variables)
+ 14. [Key source files](#key-source-files)
+ 15. [How to extend the codebase](#how-to-extend-the-codebase)

- ### Tier 1 — AI-native (zero extra API cost)
- Your AI IDE (Claude Code, Cursor, Windsurf, etc.) calls the MCP tools itself:
+ ---

- 1. `get_swiftui_generation_prompt` → returns a ready-to-use system prompt + user message
- 2. The AI generates SwiftUI using **its own model** (your existing subscription)
- 3. `write_generated_swiftui_to_xcode` → writes the file(s) to Xcode
+ ## Architecture

- No CreateLex tokens consumed. No extra API keys required.
+ Three components run together whenever you use the plugin:

- ### Tier 2 — BYOK (Bring Your Own Key)
- Set one of these env vars in your MCP client config and `generate_swiftui` runs locally:
+ ```
+ Figma desktop app
+  └── Plugin sandbox (code.ts → code.js)
+         │  WebSocket ws://localhost:7765/bridge
+         ▼
+ companion/bridge-server.cjs    HTTP + WebSocket on :7765
+         │  WebSocket
+         ▼
+ companion/mcp-server.mjs       MCP stdio server
+         │  MCP protocol
+         ▼
+ AI IDE (Claude Code / Cursor / Windsurf / …)
+ ```

- | Env var | Provider | Notes |
+ | Layer | File | Role |
  |---|---|---|
- | `ANTHROPIC_API_KEY` | Claude (recommended) | Override model with `ANTHROPIC_MODEL` |
- | `HF_API_TOKEN` | Hugging Face | Default model: `Qwen/Qwen2.5-Coder-32B-Instruct`, override with `HF_MODEL` |
- | `OPENAI_API_KEY` | OpenAI or compatible | Override model with `OPENAI_MODEL` (default `gpt-4o`) |
- | `OPENAI_BASE_URL` | Ollama / LM Studio | Set alongside `OPENAI_API_KEY=ollama` for fully local generation |
+ | Figma plugin | `code.ts` → `code.js` | Reads Figma node tree, exports PNG/SVG assets, generates SwiftUI code |
+ | Bridge | `companion/bridge-server.cjs` | Routes messages between plugin and MCP agents over WebSocket; exposes REST endpoints |
+ | MCP server | `companion/mcp-server.mjs` | Wraps bridge in MCP protocol; exposes 23 tools; orchestrates BYOK generation |
+ | Xcode writer | `companion/xcode-writer.cjs` | Writes Swift files + asset catalog entries to the Xcode project on disk |
+ | Local LLM | `companion/local-llm-generator.cjs` | Calls Anthropic / OpenAI / HuggingFace APIs for Tier 2 generation |
+ | Auth | `companion/createlex-auth.cjs` | JWT validation and refresh for Tier 3 CreateLex subscription |

- ### Tier 3 — CreateLex hosted (subscription fallback)
- If no BYOK keys are set, falls back to the CreateLex hosted pattern matcher. Requires an active [CreateLex subscription](https://createlex.com/figma-swiftui).
+ **Port 7765** is the single shared port. The bridge server listens for HTTP (REST) and WebSocket (`/bridge` path) connections simultaneously.

  ---

- ## Requirements
-
- - Node.js 18 or later
- - Figma desktop app with the Figma to SwiftUI plugin open
- - A local Xcode project (optional for Tier 1 prompt-copy workflow)
-
- ## Install
+ ## Install & start

  ```bash
+ # One-shot (no install)
  npx @createlex/figgen start --project /path/to/MyApp/MyApp
- ```

- Or install globally:
-
- ```bash
+ # Or install globally
  npm install -g @createlex/figgen
  figgen start --project /path/to/MyApp/MyApp
  ```

- ## MCP client configuration
+ **Requirements:** Node.js 18+, Figma desktop app with the plugin open.
+
+ ---

- Add to your Claude Code / Cursor / Windsurf MCP config:
+ ## MCP client configuration

  ```json
  {
@@ -66,97 +87,575 @@ Add to your Claude Code / Cursor / Windsurf MCP config:
  }
  ```

- For Hugging Face BYOK:
+ For Hugging Face:
+ ```json
+ { "env": { "HF_API_TOKEN": "hf_...", "HF_MODEL": "Qwen/Qwen2.5-Coder-32B-Instruct" } }
+ ```

+ For Ollama (fully local):
  ```json
- {
-   "env": {
-     "FIGMA_SWIFTUI_PROJECT_PATH": "/path/to/MyApp/MyApp",
-     "HF_API_TOKEN": "hf_...",
-     "HF_MODEL": "Qwen/Qwen2.5-Coder-32B-Instruct"
-   }
- }
+ { "env": { "OPENAI_API_KEY": "ollama", "OPENAI_BASE_URL": "http://localhost:11434/v1", "OPENAI_MODEL": "llama3" } }
  ```

- For fully local generation with Ollama:
+ ---

- ```json
- {
-   "env": {
-     "FIGMA_SWIFTUI_PROJECT_PATH": "/path/to/MyApp/MyApp",
-     "OPENAI_API_KEY": "ollama",
-     "OPENAI_BASE_URL": "http://localhost:11434/v1",
-     "OPENAI_MODEL": "llama3"
-   }
- }
+ ## Generation tiers
+
+ | Tier | How | Cost |
+ |---|---|---|
+ | **1 — AI-native** | AI IDE calls `write_selection_to_xcode`; plugin generates code; no extra key needed | Your existing AI subscription |
+ | **2 — BYOK** | Set `ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, or `HF_API_TOKEN`; MCP server calls the model | Your API key |
+ | **3 — CreateLex hosted** | No keys set; falls back to hosted pattern matcher | CreateLex subscription |
+
+ ### Login (Tier 3 only)
+
+ ```bash
+ npx @createlex/figgen login
+ # Saves session to ~/.createlex/auth.json
  ```
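The tier fallback order in the table above can be sketched as a small resolver. This is a hypothetical illustration, not the actual `mcp-server.mjs` code; only the env-var names are taken from the README.

```typescript
// Hypothetical sketch of the tier fallback order described above — not the
// actual mcp-server.mjs implementation. Only the env-var names are documented.
type Tier =
  | { tier: 2; provider: 'anthropic' | 'openai' | 'huggingface' }
  | { tier: 3; provider: 'createlex-hosted' };

function resolveGenerationTier(env: Record<string, string | undefined>): Tier {
  if (env.ANTHROPIC_API_KEY) return { tier: 2, provider: 'anthropic' };
  if (env.OPENAI_API_KEY) return { tier: 2, provider: 'openai' };
  if (env.HF_API_TOKEN) return { tier: 2, provider: 'huggingface' };
  return { tier: 3, provider: 'createlex-hosted' }; // subscription fallback
}
```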

- ## MCP tools
+ ---
+
+ ## Recommended workflow
+
+ ```
+ 1. Select a frame in Figma
+ 2. Ask your AI: "Generate SwiftUI for my selected Figma frame"
+ 3. AI calls write_selection_to_xcode
+ 4. Plugin generates code + exports PNG/SVG assets
+ 5. Files appear in Xcode immediately
+ ```
+
+ **Critical:** `write_selection_to_xcode` **overwrites the Swift file** on every call. Apply all manual refinements (adaptive layout, hidden node removal, real buttons, etc.) **after** the last MCP call, not before.
+
+ To hide a node from export: set it invisible in Figma before running. Hidden nodes (`visible: false`) are skipped in both the asset export plan and code generation.
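The hidden-node rule can be sketched as a tree walk that prunes invisible subtrees. This is a minimal hypothetical helper (not the plugin's actual `flattenSceneNodes`), using a reduced node shape for illustration.

```typescript
// Minimal sketch of the hidden-node rule above (hypothetical helper): a node
// with visible === false is skipped together with its entire subtree.
interface SceneNodeLike {
  name: string;
  visible: boolean;
  children?: SceneNodeLike[];
}

function collectVisibleNodes(node: SceneNodeLike, out: SceneNodeLike[] = []): SceneNodeLike[] {
  if (!node.visible) return out; // skipped by export plan and code generation
  out.push(node);
  for (const child of node.children ?? []) collectVisibleNodes(child, out);
  return out;
}
```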
+
+ ---
+
+ ## MCP tools reference
+
+ ### Bridge & status

  | Tool | Description |
  |---|---|
- | `get_design_context` | Get the full Figma node tree, colors, typography, and layout for a selection |
- | `get_swiftui_generation_prompt` | Returns a ready-to-use system prompt + user message — feed to your AI model to generate SwiftUI |
- | `generate_swiftui` | Generate SwiftUI via BYOK key or CreateLex hosted (Tier 2/3) |
- | `write_generated_swiftui_to_xcode` | Write a generated Swift file (plus optional DesignTokens.swift and component files) to Xcode |
- | `get_project_path` / `set_project_path` | Read or update the target Xcode source directory |
- | `bridge_status` | Check if the Figma plugin bridge is connected |
+ | `bridge_status` | WebSocket health, protocol version, connected agent count |
+ | `auth_status` | CreateLex subscription status |
+ | `get_document_summary` | File name, pages, current page |
+ | `get_viewport_context` | Viewport center, zoom, current page |

- ### Multi-file output
+ ### Inspection

- `write_generated_swiftui_to_xcode` accepts an `additionalFiles` array for AI-generated companion files:
+ | Tool | Parameters | Description |
+ |---|---|---|
+ | `get_metadata` | `scope`: `'selection'`\|`'page'`\|`'node'` | Node metadata, fills, effects, layout |
+ | `get_node_snapshot` | `nodeId`, `maxDepth` 0–8 (default 3) | Rich structural snapshot for a specific node ID |
+ | `get_selection_snapshot` | `maxDepth` (default 3) | Snapshot of current selection |
+ | `get_page_snapshot` | `maxDepth` (default 2) | Full page hierarchy |
+ | `find_nodes` | `name` (substring), `type` (optional), `limit` ≤200 | Search nodes by name |
+ | `dump_tree` | `nodeIds` (optional) | Readable indented text tree of selection or specific nodes |

- ```json
- {
-   "structName": "LoginScreen",
-   "code": "import SwiftUI\n...",
-   "additionalFiles": [
-     { "name": "DesignTokens.swift", "code": "...", "dir": "shared" },
-     { "name": "PrimaryButton.swift", "code": "...", "dir": "components" }
-   ]
- }
+ ### Design context & generation
+
+ | Tool | Parameters | Description |
+ |---|---|---|
+ | `get_design_context` | `excludeScreenshot` (bool) | Full node tree + asset export plan + generation hints. Also available via `GET http://localhost:7765/design-context` |
+ | `get_swiftui_generation_prompt` | — | Returns `systemPrompt` + `userMessage` to feed to any LLM manually |
+ | `generate_swiftui` | `generationMode`, `scope` | Runs plugin generator; returns code without writing to disk |
+ | `analyze_generation` | `generationMode` | Code + per-node diagnostics + refinement hints |
+
+ ### Asset export
+
+ | Tool | Parameters | Description |
+ |---|---|---|
+ | `get_asset_export_plan` | — | Lists all icon/vector/image/raster candidates with format and blend mode |
+ | `get_screenshot` | `nodeId` (optional), `includeImages` (bool) | PNG screenshot of a node |
+ | `export_svg` | `nodeId` | Exact SVG for a vector-friendly node |
+
+ ### Xcode integration
+
+ | Tool | Parameters | Description |
+ |---|---|---|
+ | `get_project_path` | — | Reads saved Xcode source folder |
+ | `set_project_path` | `path` | Persists Xcode project path to config |
+ | `write_selection_to_xcode` | `generationMode`, `includeOverflow`, `projectPath`, `nodeIds` | **Primary tool.** Generates from Figma selection + exports PNG/SVG assets + writes Swift + asset catalog in one call |
+ | `write_generated_swiftui_to_xcode` | `code`, `structName`, `images[]`, `additionalFiles[]`, `projectPath` | Writes pre-generated Swift code + manually provided image payloads |
+ | `write_svg_to_xcode` | `nodeId`, `assetName` | Exports a single SVG directly to Assets.xcassets |
+ | `figma_to_swiftui` | `scope`, `generationMode` | One-shot: generates code + assets + tokens + components in a single response |
+
+ ### Component analysis
+
+ | Tool | Parameters | Description |
+ |---|---|---|
+ | `extract_reusable_components` | `scope`: `'selection'`\|`'page'` | Identifies repeated structures for SwiftUI component extraction |
+
+ ### `write_selection_to_xcode` vs `write_generated_swiftui_to_xcode`
+
+ - **`write_selection_to_xcode`** — the plugin generates code AND exports assets automatically. One call does everything. This is what the AI should use in the standard workflow.
+ - **`write_generated_swiftui_to_xcode`** — you supply the Swift code string and base64-encoded image payloads. Use it when an external LLM generated the code and you need to write it plus manually gathered assets to disk.
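A `write_generated_swiftui_to_xcode` call can be assembled as a typed payload. The field names follow the parameter list above; the builder itself is a hypothetical client-side helper, not part of the package.

```typescript
// Sketch of assembling a write_generated_swiftui_to_xcode payload from
// externally generated code. Field names follow the tool table above; the
// helper itself is hypothetical, not shipped with the package.
interface GeneratedFile {
  name: string;  // e.g. 'PrimaryButton.swift'
  code: string;
  dir?: string;  // e.g. 'components'
}

interface ImagePayload {
  name: string;
  base64: string;
  width: number;
  height: number;
}

interface WritePayload {
  structName: string;
  code: string;
  images: ImagePayload[];
  additionalFiles: GeneratedFile[];
}

function buildWritePayload(structName: string, code: string, extras: GeneratedFile[] = []): WritePayload {
  return { structName, code, images: [], additionalFiles: extras };
}
```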
+
+ ---
+
+ ## Generation modes
+
+ ### Editable (default)
+
+ Generates native SwiftUI: `Text`, `VStack`, `HStack`, `ZStack`, `Image("name")`, `TextField`, `Button`. Each asset becomes `Image("assetName").resizable().blendMode(...)`. Best for responsive, interactive screens.
+
+ ### Fidelity
+
+ Rasterizes entire frames to a single PNG. Rasterization is applied automatically when `shouldRasterizeAbsoluteFrame()` returns true (only in fidelity mode): 6+ complex children, layered image+text combos, or unsupported styles. Preserves exact visual appearance at the cost of editability.
+
+ ### When the plugin chooses each mode automatically
+
+ 1. **Interactive scaffold** — `maybeGenerateInteractiveScaffold()` runs first on every node. If it recognises a semantic screen pattern (welcome/auth, calendar, task list, generic form, timeline), it generates a fully wired `@State` scaffold and skips the node-by-node path entirely.
+ 2. **Node-by-node** — if no scaffold matches, `nodeToSwiftUI()` is called recursively.
+
+ ---
+
+ ## Output file structure
+
+ ```
+ YourApp/                            ← Xcode source folder (FIGMA_SWIFTUI_PROJECT_PATH)
+ ├── FigmaGenerated/
+ │   ├── Screens/
+ │   │   ├── Welcome.swift
+ │   │   └── GetStartedRegister.swift
+ │   ├── Components/
+ │   │   └── PrimaryButton.swift
+ │   ├── DesignTokens.swift          ← if design tokens extracted
+ │   └── Manifest/
+ │       ├── index.json              ← catalog of all generated screens
+ │       ├── Welcome.json
+ │       └── GetStartedRegister.json
+ └── Assets.xcassets/
+     ├── Welcome_BackgroundArt.imageset/
+     │   ├── Welcome_BackgroundArt.png
+     │   └── Contents.json           ← { scale: "2x" }
+     ├── Welcome_LogoArt.imageset/
+     │   ├── Welcome_LogoArt.svg
+     │   └── Contents.json           ← { preserves-vector-representation: true }
+     └── …
+ ```
+
+ Asset naming: `{StructName}_{AssetName}` — e.g. `Welcome_HeroArt`, `GetStartedRegister_BgTexture`.
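The `{StructName}_{AssetName}` convention can be illustrated with a small helper. This is a hypothetical sketch of the naming shape only — `sanitizeName` in code.ts is the real implementation, and the exact sanitization rules (here: drop non-alphanumerics, capitalize words) are an assumption.

```typescript
// Hypothetical illustration of the {StructName}_{AssetName} convention above.
// The word-splitting/capitalization rules are assumptions for illustration;
// the package's real sanitizeName may differ.
function pascalize(raw: string): string {
  return raw
    .split(/[^A-Za-z0-9]+/)  // drop non-alphanumeric separators
    .filter(Boolean)
    .map(w => w[0].toUpperCase() + w.slice(1))
    .join('');
}

function catalogAssetName(structName: string, layerName: string): string {
  return `${pascalize(structName)}_${pascalize(layerName)}`;
}
```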
+
+ ---
+
+ ## Building from source
+
+ ```bash
+ npm install
+
+ # Compile code.ts → code.js (required after any change to code.ts)
+ npm run build
+
+ # Watch mode
+ npm run watch
+
+ # Smoke tests
+ npm run test:bridge-smoke
+ npm run test:high-fidelity-smoke
+ ```
+
+ After editing `code.ts`: run `npm run build`, then reload the plugin in Figma (right-click the plugin → Reload plugin).
+
+ Figma manifest (`manifest.json`): plugin ID `1620328279208269762`, entry point `code.js`, UI `ui.html`.
+
+ ---
+
+ ## code.ts internals
+
+ `code.ts` is ~7400 lines, organized into logical sections:
+
+ ### Section map
+
+ | Lines | Section | What it contains |
+ |---|---|---|
+ | 1–50 | Bootstrap | `figma.showUI`, global state declaration, message handler wiring |
+ | 51–560 | Serialization | `serializeNodeForBridge`, `serializeNodeLayout`, `serializeNodeStyle`, `serializeNodeText`, `serializePaint`, `serializeEffects` — convert Figma objects to plain JSON for bridge transmission |
+ | 561–730 | Asset export plan | `flattenSceneNodes`, `isSvgFriendlyNode`, `isPrimitiveVectorNode`, `buildAssetExportPlan` |
+ | 731–1075 | Reusable components | `buildReusableComponentPlan`, `buildStructureSignature`, `buildTextPropCandidates` |
+ | 1076–1160 | Refinement hints | `buildManualRefinementHints` — produces the `manualRefinementHints` array in design context |
+ | 1161–1530 | Bridge handler | `handleBridgeRequest`, `postBridgeResponse`, `postBridgeEvent`, `resolveBridgeTargetNodes` |
+ | 1525–1830 | Interfaces & registry | All TypeScript interfaces; `imageRegistry`, `generationDiagnostics`, `imageNames` globals; `uniqueImageName`, `recordGenerationDiagnostic` |
+ | 1831–2190 | Asset utilities | `exportNodeAsImage`, `exportNodeAsPng`, `exportNodeAsSvg`, `extractImageFillBytes`, `hasDropShadow`, `collectRasterizationReasons`, `isRasterHeavyNode`, `shouldRasterizeAbsoluteFrame` |
+ | 2187–2291 | Generation orchestration | `buildDesignContext`, `generateSwiftUIResult`, `generateSwiftUI` |
+ | 2292–3260 | Interactive scaffolds (task/calendar) | `maybeGenerateInteractiveScaffold`, `detectInteractiveAddTaskScreen`, `buildInteractiveAddTaskScreenCode`, `detectInteractiveCalendarScreen`, `buildInteractiveCalendarScreenCode` |
+ | 3260–3980 | Interactive scaffold helpers | Tab detection, heading/subtitle extraction, bottom bar inference, primary button detection |
+ | 3981–5250 | Calendar & timeline | `detectInteractiveCalendarPickerScreen`, `detectInteractiveTimelineCalendarScreen` and their builders |
+ | 5258–6158 | Generic list & form | `detectInteractiveGenericListScreen`, `detectInteractiveGenericFormScreen` and their builders |
+ | 6159–6400 | Core node dispatch | `nodeToSwiftUI` (switch on node type), `exportAsImageNode`, `frameToSwiftUI` |
+ | 6401–6720 | Node type renderers | `groupToSwiftUI`, `textToSwiftUI`, `rectToSwiftUI`, `ellipseToSwiftUI`, `absoluteRasterChildToSwiftUI`, `exportMaskedNodesToSwiftUI` |
+ | 6720–7060 | Layout utilities | `groupAbsoluteChildren`, `mergeButtonPairs`, `absoluteChildOrigin`, `visibleBoundsWithinParent`, `inferContainer` |
+ | 7060–7390 | SwiftUI modifier builders | `buildFrameModifiers`, `strokeToSwiftUI`, `shadowToSwiftUI`, `blendModeToSwiftUI`, `fillToSwiftUIColor`, `fontWeightToSwiftUI`, `sanitizeName`, `generatePreview` |
+
+ ### Global state (mutable across a generation run)
+
+ ```typescript
+ let generationModeOption: GenerationMode = 'editable';    // set at start of generateSwiftUIResult
+ let includeOverflowOption = false;                        // whether to ignore Figma clip regions
+ const imageRegistry: ImageExport[] = [];                  // accumulates every PNG/SVG export
+ const generationDiagnostics: GenerationDiagnostic[] = []; // per-node strategy log
+ const imageNames = new Set<string>();                     // prevents duplicate asset names
+ let imageNameNamespace = '';                              // prefix for uniqueImageName()
+ ```
+
+ All of these are reset at the top of `generateSwiftUIResult()` before each run.
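The `imageNames` set above exists to keep asset names unique within a run. A minimal sketch of how a `uniqueImageName`-style helper can use such a set — the numeric-suffix scheme here is an assumption for illustration; the real helper lives in code.ts:

```typescript
// Sketch of a uniqueImageName-style helper backed by a Set, as described
// above. The "Name2", "Name3" suffix scheme is an assumption for illustration.
const usedNames = new Set<string>();

function uniqueName(base: string): string {
  let candidate = base;
  let counter = 2;
  while (usedNames.has(candidate)) {
    candidate = `${base}${counter}`;
    counter += 1;
  }
  usedNames.add(candidate);
  return candidate;
}
```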
+
+ ### Generation call graph
+
+ ```
+ write_selection_to_xcode (MCP tool)
+ └── handleBridgeRequest('generate-swiftui-write', …)                  [code.ts:1307]
+     └── generateSwiftUIResult(nodes, includeOverflow, generationMode) [code.ts:2228]
+         ├── imageRegistry.length = 0 (reset global state)
+         ├── for each node:
+         │   ├── maybeGenerateInteractiveScaffold(node, structName)    [tries semantic patterns first]
+         │   │   ├── detectInteractiveWelcomeScreen     → buildInteractiveWelcomeScreenCode
+         │   │   ├── detectInteractiveCalendarScreen    → buildInteractiveCalendarScreenCode
+         │   │   ├── detectInteractiveGenericListScreen → buildInteractiveGenericListScreenCode
+         │   │   ├── detectInteractiveGenericFormScreen → buildInteractiveGenericFormScreenCode
+         │   │   └── (returns null if no pattern matches)
+         │   └── nodeToSwiftUI(node, indent)                           [if no scaffold]
+         │       ├── FRAME/COMPONENT/INSTANCE → frameToSwiftUI
+         │       │   ├── shouldRasterizeAbsoluteFrame? → exportAsImageNode  [fidelity only]
+         │       │   ├── hasImageFill? → image + children ZStack
+         │       │   ├── hasFill + isPill + allTextChildren? → styled Button
+         │       │   ├── layoutMode VERTICAL   → VStack { children }
+         │       │   ├── layoutMode HORIZONTAL → HStack { children }
+         │       │   └── layoutMode NONE → ZStack { absoluteRasterChildToSwiftUI | nodeToSwiftUI }
+         │       │       ├── auto-layout child? → nodeToSwiftUI (recurse)
+         │       │       └── other child?       → absoluteRasterChildToSwiftUI (rasterize)
+         │       ├── GROUP → groupToSwiftUI → exportAsImageNode (if raster-heavy)
+         │       ├── TEXT → textToSwiftUI
+         │       ├── RECTANGLE → rectToSwiftUI
+         │       ├── ELLIPSE → ellipseToSwiftUI
+         │       └── VECTOR/STAR/POLYGON/LINE/BOOLEAN_OPERATION → exportAsImageNode
+         └── returns SwiftUIResult { code, images: imageRegistry, diagnostics }
+             └── xcode-writer writes Swift file + asset catalog entries
  ```

- Output layout inside your Xcode source folder:
+ ### nodeToSwiftUI decision tree

  ```
- FigmaGenerated/
-   Screens/            ← primary view files
-   Components/         ← reusable component files
-   DesignTokens.swift  ← shared tokens (dir: "shared")
-   Manifest/           ← generation metadata
+ nodeToSwiftUI(node)
+ │
+ ├── visible === false → return ''
+ │
+ ├── FRAME / COMPONENT / INSTANCE → frameToSwiftUI
+ │   │
+ │   ├── [fidelity mode only] shouldRasterizeAbsoluteFrame?
+ │   │     → exportAsImageNode (whole frame becomes one PNG)
+ │   │
+ │   ├── hasImageFill?
+ │   │     → export image fill + render children on top
+ │   │
+ │   ├── hasFill AND cornerRadius ≥ 40% AND all children are TEXT?
+ │   │     → styled Text button (pill shape)
+ │   │
+ │   ├── layoutMode VERTICAL   → VStack(alignment:, spacing:) { each child via nodeToSwiftUI }
+ │   ├── layoutMode HORIZONTAL → HStack(alignment:, spacing:) { each child via nodeToSwiftUI }
+ │   │
+ │   └── layoutMode NONE (absolute) → ZStack(alignment: .topLeading)
+ │         groupAbsoluteChildren() groups children into:
+ │           'button' pair → buttonFromPair() (shape + text → styled Button)
+ │           'mask' group  → exportMaskedNodesToSwiftUI()
+ │           single node →
+ │             TEXT              → nodeToSwiftUI
+ │             auto-layout FRAME → nodeToSwiftUI  ← recurse (preserves native SwiftUI)
+ │             everything else   → absoluteRasterChildToSwiftUI (export as PNG)
+ │
+ ├── GROUP → groupToSwiftUI
+ │   └── raster-heavy or has blend children → exportAsImageNode
+ │       otherwise → ZStack of children via nodeToSwiftUI
+ │
+ ├── TEXT → textToSwiftUI → Text("…").font(…).foregroundStyle(…)…
+ ├── RECTANGLE → rectToSwiftUI → Color / Image / RoundedRectangle
+ ├── ELLIPSE → ellipseToSwiftUI → Circle / Ellipse / Image
+ └── VECTOR / STAR / POLYGON / LINE / BOOLEAN_OPERATION → exportAsImageNode (always rasterized)
  ```

- ## AI-native workflow (Tier 1 example)
+ ### Rasterization decision chain
+
+ A node is rasterized (exported as PNG) when any of the following is true:
+
+ 1. Its type is `VECTOR`, `STAR`, `POLYGON`, `LINE`, or `BOOLEAN_OPERATION` — always rasterized.
+ 2. It is a `GROUP` with raster-heavy children or blend-mode children.
+ 3. It has `rotation != 0` (SwiftUI `.rotationEffect` can approximate, but rasterizing is exact).
+ 4. Its blend mode is not `NORMAL` or `PASS_THROUGH`.
+ 5. It has a drop shadow (`hasDropShadow` returns true).
+ 6. It has an image fill and is not a simple `RECTANGLE` or `FRAME`.
+ 7. [Fidelity mode only] `shouldRasterizeAbsoluteFrame` returns true: the frame has 6+ complex children, OR 2+ image fills under 1+ text child, OR is a complex absolute ZStack.
+
+ `collectRasterizationReasons(node)` returns an array of string reason codes (e.g. `'rotation'`, `'blend-mode'`, `'image-fill'`, `'mask-descendants'`) that appear in `GenerationDiagnostic.reasons`.
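The decision chain above can be sketched as a reason collector over a reduced node shape. This is illustrative only: the node interface, reason strings like `'vector-type'`, and the exact predicates are assumptions; the real `collectRasterizationReasons` in code.ts covers more cases (groups, masks, fidelity-mode frames).

```typescript
// Illustrative sketch of the rasterization decision chain above. Node shape
// and most reason codes follow the docs; 'vector-type' and the reduced
// interface are assumptions — the real implementation lives in code.ts.
interface NodeInfo {
  type: string;          // e.g. 'FRAME', 'RECTANGLE', 'VECTOR'
  rotation: number;
  blendMode: string;     // e.g. 'NORMAL', 'MULTIPLY', 'PASS_THROUGH'
  hasDropShadow: boolean;
  hasImageFill: boolean;
}

const ALWAYS_RASTER = new Set(['VECTOR', 'STAR', 'POLYGON', 'LINE', 'BOOLEAN_OPERATION']);

function rasterizationReasons(node: NodeInfo): string[] {
  const reasons: string[] = [];
  if (ALWAYS_RASTER.has(node.type)) reasons.push('vector-type');
  if (node.rotation !== 0) reasons.push('rotation');
  if (node.blendMode !== 'NORMAL' && node.blendMode !== 'PASS_THROUGH') reasons.push('blend-mode');
  if (node.hasDropShadow) reasons.push('drop-shadow');
  if (node.hasImageFill && node.type !== 'RECTANGLE' && node.type !== 'FRAME') reasons.push('image-fill');
  return reasons; // empty array → node stays native SwiftUI
}
```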

- In Claude Code or any MCP-connected AI:
+ ### Asset export plan classification
+
+ `buildAssetExportPlan` iterates all descendant nodes and classifies each:

  ```
- 1. Select a frame in Figma
- 2. Ask: "Generate SwiftUI for my selected Figma frame"
- 3. Claude calls get_swiftui_generation_prompt → generates code → calls write_generated_swiftui_to_xcode
- 4. File appears in Xcode immediately
+ for each node (skip if hidden, skip if already seen):
+   isSvgFriendlyNode(node)?
+     → kind: 'icon' (≤160×160) or 'vector' (larger), format: 'svg'
+   hasImageFill(node)?
+     → kind: 'image', format: 'png'
+   isRasterHeavyNode(node) AND size > 0?
+     → kind: 'raster', format: 'png'
+
+ Sort: SVG candidates first, then alphabetically. Cap at 50 candidates.
  ```

- Zero CreateLex tokens consumed.
+ `isSvgFriendlyNode` requirements:
+ - No image fills
+ - No drop shadows
+ - No unsupported effects
+ - Either: it is a primitive vector node (VECTOR, BOOLEAN_OPERATION, STAR, POLYGON, LINE, ELLIPSE, RECTANGLE)
+ - Or: it has ≤24 descendants, all of which are primitive vectors or groups, and none are TEXT
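The icon/vector split and format choice above reduce to two small functions. A hypothetical sketch (the real classifier is `buildAssetExportPlan` in code.ts); only the 160×160 threshold and the kind/format pairs come from the docs:

```typescript
// Sketch of the classification above: SVG-friendly nodes at or under 160×160
// are icons, larger ones are vectors; icons/vectors export as SVG, the rest
// as PNG. Hypothetical helpers, not the package's real classifier.
type AssetKind = 'icon' | 'vector' | 'image' | 'raster';

function classifySvgFriendly(width: number, height: number): AssetKind {
  return width <= 160 && height <= 160 ? 'icon' : 'vector';
}

function suggestedFormat(kind: AssetKind): 'svg' | 'png' {
  return kind === 'icon' || kind === 'vector' ? 'svg' : 'png';
}
```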

- ## Plugin UI
+ ### Interactive scaffold detection

- The plugin's **Smart Generate** button supports two modes via the dropdown:
- - **Smart AI** — uses your AI IDE or BYOK key for true AI generation
- - **Classic** — uses the hosted pattern matcher (requires subscription)
+ Before node-by-node generation runs, `maybeGenerateInteractiveScaffold` tests the top-level node against a series of semantic detectors:

- After generation, the **Copy AI Prompt** button copies a ready-to-use prompt you can paste into Claude.ai, ChatGPT, or any AI chat. Requires the MCP server to be running.
+ | Detector | Pattern it recognises | Scaffold it generates |
+ |---|---|---|
+ | `detectInteractiveWelcomeScreen` | Large hero image + 1–3 pill buttons at bottom | `@State var selectedAction` + `Button` closures |
+ | `detectInteractiveAddTaskScreen` | Input fields + tag list + subtask list | Full task-creation form with `@State` |
+ | `detectInteractiveCalendarScreen` | Calendar grid + event rows | `@State var selectedDate` + event list |
+ | `detectInteractiveCalendarPickerScreen` | Month grid with day cells | Date picker with month navigation |
+ | `detectInteractiveTimelineCalendarScreen` | Day columns + time slots | Timeline scroll view |
+ | `detectInteractiveGenericListScreen` | Repeated row structures | `List` with `ForEach` and row view |
+ | `detectInteractiveGenericFormScreen` | Label + value field pairs | Form with `@State` per field |

- ## Login (CreateLex subscription)
+ If none match, the node is rendered node-by-node via `nodeToSwiftUI`.

- Only required for Tier 3 hosted generation:
+ ---

- ```bash
- npx @createlex/figgen login
+ ## Key data types
+
+ ```typescript
+ // What generateSwiftUIResult returns
+ interface SwiftUIResult {
+   code: string;                         // complete Swift source (may be multiple structs joined by separator)
+   images: ImageExport[];                // all PNG/SVG assets collected during generation
+   diagnostics: GenerationDiagnostic[];  // per-node strategy log
+   selection: { ids: string[]; names: string[] };
+ }
+
+ // One exported asset (PNG or SVG)
+ interface ImageExport {
+   name: string;    // asset catalog name, used in Image("name") in SwiftUI
+   base64: string;  // PNG bytes as base64 string (SVGs are stored as text in this field)
+   width: number;
+   height: number;
+ }
+
+ // Per-node record of what strategy was used and why
+ interface GenerationDiagnostic {
+   strategy: string;    // e.g. 'absolute-raster-child', 'interactive-welcome-screen'
+   nodeIds: string[];
+   nodeNames: string[];
+   nodeTypes: string[];
+   reasons: string[];   // e.g. ['mask-descendants', 'absolute-layout-export']
+   assetName?: string;  // set when the node was exported as an asset
+   component?: Record<string, unknown> | null;
+ }
+
+ // One candidate in the asset export plan
+ interface AssetExportPlanCandidate {
+   nodeId: string;
+   nodeName: string;
+   nodeType: string;
+   kind: 'icon' | 'vector' | 'image' | 'raster';
+   suggestedAssetName: string;
+   suggestedFormat: 'svg' | 'png';
+   dimensions: { width: number | null; height: number | null };
+   reasons: string[];
+   blendMode: string | null;         // Figma blend mode string
+   blendModeSwiftUI: string | null;  // ready-to-paste SwiftUI value e.g. '.multiply'
+ }
+
+ // Identified reusable component candidate
+ interface ReusableComponentCandidate {
+   source: 'figma-component' | 'repeated-structure';
+   suggestedComponentName: string;
+   occurrenceCount: number;
+   signature: string;  // structural hash used to detect repetition
+   nodeIds: string[];
+   nodeNames: string[];
+   parentName: string | null;
+   figmaComponent: Record<string, unknown> | null;
+   propCandidates: ReusableComponentPropCandidate[];
+ }
+ ```
+
+ ---
+
+ ## Bridge protocol
+
+ The plugin communicates with the bridge server via WebSocket. All messages are JSON.
+
+ ### Plugin → bridge (outbound events)
+
+ ```jsonc
+ // Selection changed
+ { "type": "bridge-event", "event": "selection-changed", "data": { "count": 1, "names": ["Welcome"] } }
+
+ // Response to a bridge request
+ { "type": "bridge-response", "requestId": "uuid", "action": "get-design-context", "ok": true, "payload": { … } }
+
+ // Error response
+ { "type": "bridge-response", "requestId": "uuid", "action": "generate-swiftui", "ok": false, "error": "No selection" }
+ ```
+
+ ### Bridge → plugin (inbound requests)
+
+ ```jsonc
+ {
+   "type": "bridge-request",
+   "requestId": "uuid",
+   "action": "generate-swiftui-write",  // action name maps to handleBridgeRequest() switch cases
+   "params": {
+     "generationMode": "editable",
+     "includeOverflow": false,
+     "nodeIds": []                      // optional: override current selection
+   },
+   "protocolVersion": 1,
+   "timestamp": 1712345678000
+ }
  ```
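The inbound envelope shown above can be built with a small typed helper. The field names come from the protocol section; `buildBridgeRequest` itself, and its `req-…` request-id scheme, are hypothetical client-side conveniences, not part of the package.

```typescript
// Sketch of building the bridge-request envelope shown above. Field names
// follow the protocol docs; the helper and its request-id scheme are
// hypothetical (the docs only say requestId is a "uuid"-style string).
interface BridgeRequest {
  type: 'bridge-request';
  requestId: string;
  action: string;
  params: Record<string, unknown>;
  protocolVersion: 1;
  timestamp: number;
}

function buildBridgeRequest(action: string, params: Record<string, unknown> = {}): BridgeRequest {
  return {
    type: 'bridge-request',
    requestId: `req-${Date.now().toString(36)}-${Math.random().toString(36).slice(2, 10)}`,
    action,
    params,
    protocolVersion: 1,
    timestamp: Date.now(),
  };
}
```

A client would serialize the result with `JSON.stringify` and send it over the `/bridge` WebSocket, then match the eventual `bridge-response` by `requestId`.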
158
526
 
159
- Saves your session to `~/.createlex/auth.json`.
527
+ ### Action strings (used in `handleBridgeRequest`)
528
+
529
+ | Action | What it does |
530
+ |---|---|
531
+ | `get-design-context` | Calls `buildDesignContext()`, returns full node tree + hints |
532
+ | `generate-swiftui-write` | Calls `generateSwiftUIResult()`, returns `SwiftUIResult` |
533
+ | `get-selection-snapshot` | Calls `buildSelectionSnapshot()` |
534
+ | `get-page-snapshot` | Calls `buildPageSnapshot()` |
535
+ | `get-node-snapshot` | Calls `serializeNodeForBridge()` on a specific node |
536
+ | `get-metadata` | Calls `buildMetadataSnapshot()` |
537
+ | `dump-tree` | Calls `buildNodeTreeDump()` |
538
+ | `find-nodes` | Calls `findBridgeNodes()` |
539
+ | `get-document-summary` | Calls `buildDocumentSummary()` |
540
+ | `get-viewport-context` | Calls `buildViewportContext()` |
541
+ | `get-asset-export-plan` | Calls `buildAssetExportPlan()` |
542
+ | `get-screenshot` | Calls `buildNodeScreenshot()` |
543
+ | `export-svg` | Calls `buildSvgExport()` |
544
+ | `get-swiftui-generation-prompt` | Returns `_generationPrompt` from design context |
545
+ | `extract-reusable-components` | Calls `buildReusableComponentPlan()` |
546
+
547
+ ### HTTP endpoints (bridge-server.cjs)
548
+
549
+ | Method | Path | Description |
550
+ |---|---|---|
551
+ | GET | `/ping` | Health check |
552
+ | GET | `/bridge/info` | Connected agents, plugin status, protocol version |
553
+ | GET | `/design-context` | Live design context JSON (same as MCP `get_design_context`) |
554
+ | POST | `/set-project` | `{ path }` — sets Xcode project path |
555
+ | POST | `/write` | `{ code, structName, images[], … }` — writes files to Xcode (50 MB limit) |
556
+ | POST | `/shutdown` | Graceful bridge shutdown |
557
+
558
+ ---
559
+
560
+ ## Environment variables
+
+ ### Bridge server
+
+ | Variable | Default | Description |
+ |---|---|---|
+ | `FIGMA_SWIFTUI_BRIDGE_PORT` | `7765` | HTTP/WebSocket port |
+ | `FIGMA_SWIFTUI_BRIDGE_HOST` | `localhost` | Listen hostname |
+
+ ### MCP server
+
+ | Variable | Default | Description |
+ |---|---|---|
+ | `FIGMA_SWIFTUI_BRIDGE_HTTP_URL` | `http://localhost:7765` | Bridge HTTP base URL |
+ | `FIGMA_SWIFTUI_BRIDGE_WS_URL` | `ws://localhost:7765/bridge` | Bridge WebSocket URL |
+ | `FIGMA_SWIFTUI_BRIDGE_TIMEOUT_MS` | `30000` | Bridge request timeout |
+ | `FIGMA_SWIFTUI_RESPONSE_SIZE_CAP` | `102400` | Max response payload (bytes) |
+
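For example, moving the bridge off the default port means updating the port and both URL variables together, since the URLs embed the port explicitly. A sketch (7865 is an arbitrary example port):

```shell
# Example: run the bridge on port 7865 and point the MCP server at it.
export FIGMA_SWIFTUI_BRIDGE_PORT=7865
export FIGMA_SWIFTUI_BRIDGE_HTTP_URL="http://localhost:7865"
export FIGMA_SWIFTUI_BRIDGE_WS_URL="ws://localhost:7865/bridge"
```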
+ ### BYOK generation
+
+ | Variable | Description |
+ |---|---|
+ | `ANTHROPIC_API_KEY` | Claude API key (Tier 2, recommended) |
+ | `ANTHROPIC_MODEL` | Claude model (default: `claude-sonnet-4-6`) |
+ | `OPENAI_API_KEY` | OpenAI key or `"ollama"` for local |
+ | `OPENAI_MODEL` | Model name (default: `gpt-4o`) |
+ | `OPENAI_BASE_URL` | Custom base URL (Ollama, LM Studio) |
+ | `HF_API_TOKEN` | Hugging Face inference token |
+ | `HF_MODEL` | HF model (default: `Qwen/Qwen2.5-Coder-32B-Instruct`) |
+
+ ### Xcode
+
+ | Variable | Description |
+ |---|---|
+ | `FIGMA_SWIFTUI_PROJECT_PATH` | Explicit Xcode source folder path |
+
+ ### CreateLex auth
+
+ | Variable | Description |
+ |---|---|
+ | `FIGMA_SWIFTUI_ACCESS_TOKEN` | Explicit CreateLex JWT |
+ | `FIGMA_SWIFTUI_BYPASS_AUTH` | `"true"` to skip subscription check (testing) |
+ | `CREATELEX_CONFIG_DIR` | Auth config directory (default: `~/.createlex`) |
+ | `CREATELEX_AUTH_FILE` | Auth file path (default: `~/.createlex/auth.json`) |
+ | `CREATELEX_API_BASE_URL` | CreateLex API base (default: `https://api.createlex.com/api`) |
+
+ ---
+
+ ## Key source files
+
+ | File | Lines | Role |
+ |---|---|---|
+ | `code.ts` | ~7400 | Figma plugin: all node serialization, SwiftUI generation, asset export |
+ | `companion/mcp-server.mjs` | ~1000 | 23 MCP tools, BYOK orchestration, auth |
+ | `companion/bridge-server.cjs` | ~400 | HTTP + WebSocket bridge on :7765 |
+ | `companion/xcode-writer.cjs` | ~400 | Writes Swift files + asset catalog to disk |
+ | `companion/local-llm-generator.cjs` | ~300 | Anthropic / OpenAI / Hugging Face API calls |
+ | `companion/createlex-auth.cjs` | ~200 | JWT validation and refresh |
+ | `bin/figgen.js` | — | CLI entry point (`start`, `login`) |
+ | `ui.html` | — | Figma plugin panel UI |
+ | `manifest.json` | — | Figma plugin manifest (ID, permissions, entry) |
+ | `companion-app/` | — | Optional native macOS app for project path selection |
+
+ Config persistence path: `~/Library/Application Support/FigmaSwiftUICompanion/config.json` (macOS).
+
+ ---
+
+
+ ## How to extend the codebase
+
+ ### Add a new MCP tool
+
+ 1. Open `companion/mcp-server.mjs`.
+ 2. Register a new tool with `server.tool(name, description, zodSchema, handler)`.
+ 3. In the handler, call `bridgeRequest(ws, action, params)` with a new action string.
+ 4. In `code.ts`, add a `case 'your-new-action':` block in `handleBridgeRequest()` (around line 1307).
+ 5. Implement the logic and call `postBridgeResponse(requestId, action, true, { data: result })`.
+ 6. Run `npm run build` and reload the plugin.
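The registration in steps 2–3 can be sketched as follows. The tool name, action string, and helpers below are invented for illustration: `registerTool` and `bridgeRequest` are self-contained stand-ins for the real `server.tool(...)` and `bridgeRequest(ws, ...)` in `mcp-server.mjs`.

```typescript
// Stand-in registry mirroring the shape of server.tool(name, description, handler).
type ToolHandler = (args: Record<string, unknown>) => Promise<unknown>;

const registeredTools = new Map<string, ToolHandler>();

function registerTool(name: string, _description: string, handler: ToolHandler): void {
  // The real SDK call also records the description and a zod schema.
  registeredTools.set(name, handler);
}

async function bridgeRequest(action: string, params: Record<string, unknown>): Promise<unknown> {
  // The real helper round-trips the request over the plugin WebSocket.
  return { action, params };
}

// Hypothetical new tool: forwards a new "count-nodes" action to the plugin.
registerTool("get_node_count", "Counts nodes under a given root", async (args) =>
  bridgeRequest("count-nodes", { nodeId: args.nodeId })
);
```

The matching `case 'count-nodes':` block in `handleBridgeRequest()` (step 4) would then compute and return the count.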
+
+ ### Add a new interactive scaffold detector
+
+ 1. Define a new `interface InteractiveXxxBlueprint` near line 1596 in `code.ts`.
+ 2. Write `detectInteractiveXxxScreen(node)` returning the blueprint or `null`.
+ 3. Write `buildInteractiveXxxScreenCode(structName, node, blueprint)` returning the Swift string.
+ 4. Add a call to both in `maybeGenerateInteractiveScaffold()` (around line 2250).
+ 5. Build and reload.
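Steps 1–3 can be sketched for a hypothetical "rating" scaffold. The node shape, detection heuristic, and emitted SwiftUI below are illustrative assumptions, not the real `code.ts` types:

```typescript
// Step 1: hypothetical blueprint describing what the detector found.
interface InteractiveRatingBlueprint {
  starCount: number;
}

// Simplified stand-in for a serialized Figma node.
interface NodeLike {
  name: string;
  children?: NodeLike[];
}

// Step 2: return a blueprint when the screen looks like a star-rating UI.
function detectInteractiveRatingScreen(node: NodeLike): InteractiveRatingBlueprint | null {
  const stars = (node.children ?? []).filter((c) => /star/i.test(c.name));
  return stars.length >= 3 ? { starCount: stars.length } : null;
}

// Step 3: turn the blueprint into a SwiftUI source string.
function buildInteractiveRatingScreenCode(
  structName: string,
  blueprint: InteractiveRatingBlueprint
): string {
  return [
    `struct ${structName}: View {`,
    `    @State private var rating = 0`,
    `    var body: some View {`,
    `        HStack {`,
    `            ForEach(0..<${blueprint.starCount}, id: \\.self) { index in`,
    `                Image(systemName: index < rating ? "star.fill" : "star")`,
    `                    .onTapGesture { rating = index + 1 }`,
    `            }`,
    `        }`,
    `    }`,
    `}`,
  ].join("\n");
}
```

The detector returning `null` for non-matching screens is what lets `maybeGenerateInteractiveScaffold()` try each detector in turn and fall through to generic generation.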
+
+ ### Change rasterization behavior
+
+ - To prevent a node type from rasterizing: modify `isRasterHeavyNode()` (~line 2104) or `collectRasterizationReasons()` (~line 2067).
+ - To prevent absolute-position children from rasterizing: modify the `isAutoLayoutChild` check in `frameToSwiftUI()` (~line 6368).
+ - To add SVG export support for new node patterns: modify `isSvgFriendlyNode()` (~line 629).
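These functions are predicates over serialized nodes. A minimal sketch of the kind of check `isRasterHeavyNode()` performs, with an assumed node shape and assumed rules (the real rules in `code.ts` may differ):

```typescript
interface EffectLike {
  type: string;
}

// Simplified stand-in for a serialized Figma scene node.
interface SceneNodeLike {
  type: string;
  effects?: EffectLike[];
}

function isRasterHeavy(node: SceneNodeLike): boolean {
  // Boolean vector operations rarely translate to clean SwiftUI shapes.
  if (node.type === "BOOLEAN_OPERATION") return true;
  // Blur effects usually look wrong when rebuilt as native views.
  return (node.effects ?? []).some((e) => e.type.endsWith("_BLUR"));
}
```

Adding or removing a branch like these is how you tune which subtrees get exported as images versus rebuilt as native SwiftUI.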
+
+ ### Add a new SwiftUI modifier
+
+ All modifier builder functions live in the 7060–7390 range of `code.ts`:
+ - `buildFrameModifiers` — padding, background, corner radius, border
+ - `strokeToSwiftUI`, `shadowToSwiftUI`, `blendModeToSwiftUI`, `opacityToSwiftUI`, `rotationToSwiftUI`
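A new builder follows the same pattern: take a serialized property, return either an empty string or a `.modifier(...)` fragment to append to the emitted view. The function below is a hypothetical example in that style; the exact signatures and indentation used in `code.ts` are assumptions.

```typescript
// Hypothetical builder in the style of opacityToSwiftUI / rotationToSwiftUI:
// returns "" when the property is absent so callers can concatenate blindly.
function blurToSwiftUI(radius: number | undefined): string {
  if (!radius || radius <= 0) return ""; // no-op: emit no modifier
  const value = Number(radius.toFixed(2)); // trim float noise from Figma values
  return `\n        .blur(radius: ${value})`;
}
```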
+
+ ---
 
  ## Support
 
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@createlex/figgen",
- "version": "1.4.6",
+ "version": "1.4.7",
  "description": "CreateLex MCP runtime for Figma-to-SwiftUI generation and Xcode export",
  "bin": {
  "figgen": "bin/figgen.js"